[Bug] Ollama service is unavailable #2337
Comments
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Me too. I have a machine on the local network running Ollama; other applications such as Open WebUI and Dify can call it, but LobeChat cannot. The error message is the same as the original poster's, even with OLLAMA_ORIGINS configured.
I have the same problem: Ollama on the LAN is reachable from Open WebUI, but not from LobeChat.
I ran into the same problem on Windows, connecting to Ollama over the LAN through LobeChat (Open WebUI connects fine). Checking Ollama's connection log, the entry below appeared on LAN connections. After adding OLLAMA_ORIGINS=* to the system environment variables and restarting Ollama, it connects normally.
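The fix described above has a platform-specific form. A sketch of the common ways to set it (assuming default Ollama installs; restart Ollama after each change so it picks up the variable):

```shell
# Current shell session only (lost when the shell exits):
export OLLAMA_ORIGINS="*"

# Windows (PowerShell, persists for the user):
#   [Environment]::SetEnvironmentVariable("OLLAMA_ORIGINS", "*", "User")

# macOS (applies to GUI-launched apps, as one commenter did):
#   launchctl setenv OLLAMA_ORIGINS "*"

# Linux (Ollama running as a systemd service):
#   systemctl edit ollama.service
#   # then add under [Service]:  Environment="OLLAMA_ORIGINS=*"
```

Note that `*` allows any origin; on a shared LAN you may prefer to list only the origins you actually use (e.g. `http://localhost:3210`).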
I have already set OLLAMA_ORIGINS=* in the environment variables; Immersive Translate can call Ollama normally, but LobeChat cannot.
No reply yet. If you just want a UI to debug LLMs conveniently, you can try this project, which is simple to deploy: chatbot-ollama
For a locally deployed LobeChat Docker image, if you haven't changed the port, just visit localhost:3210.
Have you tried a direct conversation?
Same issue:
{
  "host": "http://127.0.0.1:11434",
  "message": "please check whether your ollama service is available or set the CORS rules",
  "provider": "ollama"
}
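When this error appears, it helps to check what the browser actually sees, since LobeChat calls Ollama from the page itself. A diagnostic sketch (assumes Ollama on its default port 11434 and LobeChat served from port 3210):

```shell
# 1. Is the service up at all? Ollama's root endpoint normally replies
#    with a short plain-text status message.
curl -s http://127.0.0.1:11434

# 2. Does it allow the LobeChat origin? Cross-origin browser requests only
#    succeed if the response carries Access-Control-* headers for the
#    page's origin.
curl -si http://127.0.0.1:11434/api/tags \
  -H "Origin: http://localhost:3210" | grep -i "access-control"
# If step 1 works but step 2 prints nothing, OLLAMA_ORIGINS has not
# taken effect (e.g. Ollama was not restarted after setting it).
```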
💻 Operating System
macOS
📦 Environment
Docker
🌐 Browser
Chrome
🐛 Bug Description
Question 1: The documentation never says which URL to open for chat after `docker run`. Is it https://chat-preview.lobehub.com/chat?session=inbox&agent= ?
Question 2: If so, why does it open a remote page rather than localhost?
Question 3: What is port 3210 used for?
Question 4: After starting Ollama locally, I also ran `ollama run mistral`; it works fine in the terminal, and `curl localhost:11434` succeeds. I also executed `launchctl setenv OLLAMA_ORIGINS "*"`. But in the "Language Model" settings, LobeChat still cannot connect to Ollama; the error is:
response.OllamaServiceUnavailable
Show Details
```json
{
  "host": "http://127.0.0.1:11434",
  "message": "please check whether your ollama service is available or set the CORS rules",
  "provider": "ollama"
}
```
What could the problem be?
🚦 Expected Behavior
I expect to use lobe-chat to access the mistral model served by my local Ollama.
📷 Recurrence Steps
After starting Ollama locally, I also ran `ollama run mistral`; it works fine in the terminal, and `curl localhost:11434` succeeds. I also executed `launchctl setenv OLLAMA_ORIGINS "*"`. Started LobeChat with: `docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 lobehub/lobe-chat`
In the "Language Model" settings, LobeChat still cannot connect to Ollama; the error is:
response.OllamaServiceUnavailable
Show Details
```json
{
  "host": "http://127.0.0.1:11434",
  "message": "please check whether your ollama service is available or set the CORS rules",
  "provider": "ollama"
}
```
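Two separate network paths are involved in the setup above: the LobeChat container reaches Ollama via `host.docker.internal` (the `OLLAMA_PROXY_URL` in the `docker run` command), while the error shows the browser-side client targeting `http://127.0.0.1:11434` directly, which is the path gated by CORS. A sketch for checking the container-side path (assumes Docker Desktop; `curlimages/curl` is just a convenient image carrying curl):

```shell
# Verify a container can reach the host's Ollama. Docker Desktop on
# macOS/Windows maps the host to host.docker.internal automatically.
docker run --rm curlimages/curl -s http://host.docker.internal:11434

# On plain Linux Docker that name does not exist by default; add the
# mapping explicitly when starting LobeChat:
#   docker run -d -p 3210:3210 \
#     --add-host=host.docker.internal:host-gateway \
#     -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 \
#     lobehub/lobe-chat
```

If the container-side check succeeds but the browser still fails with the CORS message, the remaining issue is the `OLLAMA_ORIGINS` setting on the host running Ollama, not the Docker networking.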
📝 Additional Information
No response