
[Bug] Ollama service is unavailable #2337

Open
yanliang84 opened this issue May 1, 2024 · 14 comments
Labels
🐛 Bug Something isn't working | 缺陷

Comments

@yanliang84

💻 Operating System

macOS

📦 Environment

Docker

🌐 Browser

Chrome

🐛 Bug Description

Question 1: The docs never say which URL to open to access the chat after docker run. Is it this one: https://chat-preview.lobehub.com/chat?session=inbox&agent=
Question 2: If so, why am I visiting a remote page instead of localhost?
Question 3: What is port 3210 used for?
Question 4: After starting Ollama locally I also ran ollama run mistral; it works fine in the terminal, and curl localhost:11434 works too. I also ran launchctl setenv OLLAMA_ORIGINS "*". But in the "Language Model" settings it still cannot connect to Ollama, and the error is:
response.OllamaServiceUnavailable
{
"host": "http://127.0.0.1:11434",
"message": "please check whether your ollama service is available or set the CORS rules",
"provider": "ollama"
}
What might the problem be?
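
For reference, both conditions in that error message can be checked from the terminal. This is a minimal sketch assuming Ollama listens on the default 127.0.0.1:11434 and the LobeChat page is served from localhost:3210; the second command mimics the CORS preflight the browser sends before calling the Ollama API:

# should answer "Ollama is running" if the service itself is reachable
curl http://127.0.0.1:11434
# mimic the browser's CORS preflight for the model list
curl -i -X OPTIONS http://127.0.0.1:11434/api/tags -H "Origin: http://localhost:3210" -H "Access-Control-Request-Method: GET"

A 403 on the OPTIONS request means the page origin is not allowed by OLLAMA_ORIGINS; any non-403 response suggests CORS is not what is failing.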

🚦 Expected Behavior

I expect to use lobe-chat to access the local mistral model served by Ollama.

📷 Recurrence Steps

After starting Ollama locally, I also ran ollama run mistral; it works fine in the terminal, and curl localhost:11434 works too. I also executed launchctl setenv OLLAMA_ORIGINS "*". Start command: docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 lobehub/lobe-chat

In the "Language Model" settings it still cannot connect to Ollama; the error is:
response.OllamaServiceUnavailable
{
"host": "http://127.0.0.1:11434",
"message": "please check whether your ollama service is available or set the CORS rules",
"provider": "ollama"
}
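
One assumption worth double-checking in the steps above: on macOS, launchctl setenv only applies to processes launched after it runs, so an Ollama app that was already running keeps the old environment. A minimal sketch of an order that picks the variable up:

launchctl setenv OLLAMA_ORIGINS "*"
# quit the Ollama menu-bar app completely, then start it again,
# or run the server in a terminal so it inherits the new value
ollama serve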

📝 Additional Information

No response

@yanliang84 yanliang84 added the 🐛 Bug Something isn't working | 缺陷 label May 1, 2024
@lobehubbot
Member

👀 @yanliang84

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.

@kampchen

kampchen commented May 3, 2024

Same here. I have a machine on the LAN running Ollama; other applications such as OpenWebUI and Dify can call Ollama, but LobeChat cannot. The error message is the same as the OP's, and configuring "ollama origins" did not help either.

@Mrered

Mrered commented May 4, 2024

I have the same problem. Ollama on the LAN can be accessed through Open WebUI, but not through LobeChat.

@ninjadogz

I ran into the same problem on Windows, connecting to Ollama over the LAN through lobechat. (Open WebUI connects fine.)

I checked Ollama's connection log; the following entry appeared when connecting over the LAN:
[GIN] 2024/05/04 - 18:35:30 | 403 | 0s | 10.168.10.1 | OPTIONS "/api/tags"
The request for "http://(ollama):11434/api/tags" from origin "http://10.168.10.1:3210" was probably blocked by Ollama's CORS policy.

After adding OLLAMA_ORIGINS=* to the system environment variables and restarting Ollama, it connects normally.
set OLLAMA_HOST=0.0.0.0; set OLLAMA_ORIGINS=*; ollama serve
On Linux, export or editing .bashrc should achieve the same effect.
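
Where Ollama runs as a systemd service on Linux, a sketch of the equivalent persistent setup (assuming the service is named ollama, as the official install script sets it up):

sudo systemctl edit ollama.service
# in the override file that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
#   Environment="OLLAMA_ORIGINS=*"
sudo systemctl daemon-reload
sudo systemctl restart ollama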

@Mrered

Mrered commented May 5, 2024

(Quoting @ninjadogz above:) I ran into the same problem on Windows, connecting to Ollama over the LAN through lobechat. (Open WebUI connects fine.)

I checked Ollama's connection log; the following entry appeared when connecting over the LAN: [GIN] 2024/05/04 - 18:35:30 | 403 | 0s | 10.168.10.1 | OPTIONS "/api/tags" The request for "http://(ollama):11434/api/tags" from origin "http://10.168.10.1:3210" was probably blocked by Ollama's CORS policy.

After adding OLLAMA_ORIGINS=* to the system environment variables and restarting Ollama, it connects normally. set OLLAMA_HOST=0.0.0.0; set OLLAMA_ORIGINS=*; ollama serve On Linux, export or editing .bashrc should achieve the same effect.

I have already set OLLAMA_ORIGINS=* in my environment variables. Immersive Translate can call Ollama normally, but LobeChat still cannot.
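
A possible explanation, judging only from the host field in the error earlier in this thread: the request to Ollama is issued by the browser (hence the CORS preflight in the log above), so http://127.0.0.1:11434 points at the machine the browser runs on, not at the LAN machine hosting Ollama. In that case the Ollama address configured in LobeChat would need to be the LAN address, and that address has to pass both a plain request and the preflight from the browser's machine. A sketch, with 192.168.1.10 as a placeholder for the Ollama host's IP and 3210 as the LobeChat page port:

curl http://192.168.1.10:11434
curl -i -X OPTIONS http://192.168.1.10:11434/api/tags -H "Origin: http://192.168.1.10:3210" -H "Access-Control-Request-Method: GET"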

@yanliang84
Author

yanliang84 commented May 5, 2024

No reply for now. If you just want a UI that makes it easy to debug LLMs, you can try this project, which is simple to deploy: chatbot-ollama

@arvinxx
Contributor

arvinxx commented May 6, 2024

Question 1: The docs never say which URL to open to access the chat after docker run

For a locally deployed LobeChat docker image, if you haven't changed the port, just visit localhost:3210.

In the "Language Model" settings it still cannot connect to Ollama; the error is:

Have you tried starting a conversation directly?
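
A minimal sketch of that local deployment; the container name and the open command are illustrative, and the Ollama-related environment variables from the reproduction steps are omitted here since this only shows the port mapping and the URL to open:

docker run -d --name lobe-chat -p 3210:3210 lobehub/lobe-chat
# the UI is then served locally rather than from chat-preview.lobehub.com
open http://localhost:3210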

@holycrypto

Same issue

{
  "host": "http://127.0.0.1:11434",
  "message": "please check whether your ollama service is available or set the CORS rules",
  "provider": "ollama"
}
