[Bug] Ollama service is unavailable - macOS - Vercel #2327
Comments
👀 @RVciTo Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Same problem here; it is caused by the browser. Switching browsers fixes it: Safari sends a problematic request header.
I ran it in Arc Browser and it has never worked properly.
Works in the Brave browser for me; at least I am not getting the CORS error message, and the connectivity check in the Language Model settings passes.
@Mrered How about switching to Chrome and trying again?
At the moment I only have Arc and Safari installed, and I don't really want to install Chrome just for this, so I'll test that later. For now, OpenCat and Open WebUI can access Ollama normally.
💻 Operating System
macOS
📦 Environment
Vercel / Zeabur / Sealos
🌐 Browser
Safari
🐛 Bug Description
I can't use Ollama as a language model.
Ollama service is unavailable. Please check if Ollama is running properly or if the cross-origin configuration of Ollama is set correctly.
Show Details
```json
{
  "host": "http://localhost:11434",
  "message": "please check whether your ollama service is available or set the CORS rules",
  "provider": "ollama"
}
```
Also tried with the IP address instead of localhost. I did confirm that Ollama is running at http://localhost:11434.
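For reference, the connectivity check and the cross-origin setup mentioned in the error message can be reproduced from a terminal. Ollama's allowed origins are controlled by the `OLLAMA_ORIGINS` environment variable; the commands below are a minimal sketch assuming the default macOS setup (the wildcard origin is permissive and only suitable for testing):

```shell
# Confirm the Ollama server is reachable locally;
# a healthy server replies with "Ollama is running"
curl http://localhost:11434

# macOS app: set the allowed origins for launchd, then
# quit and reopen the Ollama app so it picks up the value
launchctl setenv OLLAMA_ORIGINS "*"

# Alternatively, if running the server manually from a shell:
OLLAMA_ORIGINS="*" ollama serve
```

For a deployment on Vercel, the wildcard could be narrowed to the deployed LobeChat origin instead of `*`.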
🚦 Expected Behavior
Being able to use Ollama as a language model
📷 Recurrence Steps
Run LobeChat through Vercel
Run Ollama locally
Try to use Ollama with LobeChat through Vercel
📝 Additional Information
I followed these steps: https://lobehub.com/docs/usage/providers/ollama