[Bug] Local Ollama models cannot be called #2284
-
💻 Operating System: macOS
📦 Environment: Docker
🌐 Browser: Chrome
🐛 Bug Description: After deploying locally with Docker on a Mac, the local Ollama models cannot be called.
🚦 Expected Behavior: No response
📷 Recurrence Steps: Reproduces every time under normal use
📝 Additional Information
Replies: 5 comments 1 reply
-
👀 @jerlinn Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
-
Can you check the console for errors?
-
Got the same error:

17930-c6532f96900818ef.js:1 Route: [ollama] InvalidOllamaArgs:
{error: undefined, errorType: 'InvalidOllamaArgs'}
-
Okay, I figured it out.
Check the Interface proxy address and make sure it start with 'http://'