Replies: 29 comments 25 replies
-
Read the Ollama instructions for setting environment variables on Linux, and then change your
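For reference, on Linux those instructions boil down to a systemd override. This is a sketch of the setup described later in this thread (it assumes Ollama was installed as the standard systemd service):

```shell
# Sketch: make Ollama listen on all interfaces so containers can reach it.
# Assumes Ollama runs as the standard "ollama" systemd service on Linux.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
EOF
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

After the restart, Ollama should answer on the machine's external addresses, not just on localhost.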
-
I still have the problem. I did
after that:
-
Set the environment variable for the Ollama host in your docker run command:

docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://host.docker.internal:11434/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

You may also need
-
I have the same result. What information would help you figure out where the problem is?
-
Try adding this into the
And be sure you're removing the previous containers you've launched. If it's still not working, we should make sure your Ollama is actually reachable:

curl http://your_ip_address_not_localhost:11434/api

The result should say "OK". If it doesn't, then Ollama is not properly using the environment variable you set.
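To test several candidate endpoints in one go, a throwaway diagnostic script like this can help (hypothetical, not part of any tool here; the URLs are the ones discussed in this thread):

```shell
#!/bin/sh
# Probe candidate Ollama endpoints and report which ones answer.
# curl -fsS exits non-zero on HTTP errors or connection failures.
for url in http://127.0.0.1:11434 http://172.17.0.1:11434 http://host.docker.internal:11434; do
  if curl -fsS --max-time 3 "$url/api/version" >/dev/null 2>&1; then
    echo "reachable:   $url"
  else
    echo "unreachable: $url"
  fi
done
```

Run it both on the host and inside the container (docker exec) to see where the connectivity breaks.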
-
I used the docker run command you provided, and I set the env vars in /etc/systemd/system/ollama.service.d/override.conf. The env vars I set are OLLAMA_HOST=0.0.0.0 and OLLAMA_ORIGINS=*. Ollama runs on the same device as Open WebUI, and it works: for example, curl http://localhost:11434/api/version returns {"version":"0.1.25"}. I always remove the previous container, as otherwise I can't create a new one. I used:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e OLLAMA_API_BASE_URL=http://host.docker.internal:11434/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

and it still doesn't work.
-
Let's try an alternative approach from here:

docker run -d --network=host -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

You'll now access the WebUI from http://localhost:8080 instead.
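Which base URL the container should use depends on how it is networked. This tiny helper (hypothetical names, assuming Ollama on host port 11434) summarizes the two cases used in this thread:

```shell
# Map a Docker network mode to the Ollama URL the container should use.
# host:   the container shares the host's loopback, so 127.0.0.1 works.
# bridge: host.docker.internal needs --add-host=host.docker.internal:host-gateway
#         on plain Docker (Docker Desktop provides it automatically).
ollama_url_for() {
  case "$1" in
    host)   echo "http://127.0.0.1:11434" ;;
    bridge) echo "http://host.docker.internal:11434" ;;
    *)      echo "unknown network mode: $1" >&2; return 1 ;;
  esac
}

ollama_url_for host
ollama_url_for bridge
```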
-
When I do that, I get "Unable to connect". I think it's because:
-
When we use --network=host, the container binds directly to the host's ports. Is it possible you have something else that already claimed port 8080?
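One way to check is to list listeners on that port (this assumes iproute2's ss is available; netstat works similarly):

```shell
# List TCP listeners on port 8080; no output means the port is free.
ss -ltn 'sport = :8080'
# or, if ss is unavailable:
# netstat -ltn | grep ':8080'
```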
-
I don't think so, and I don't find 8080 when I do
-
Well, now I'm unsure. Maybe this will need fresh eyes to take a look. |
-
Could you try the following commands:
and visit http://127.0.0.1:8080/. Keep us updated!
-
When I do that, I get "Unable to connect".
-
I remove the container every time. I can't install a new OS. I tried installing it on my host, not in Docker, and it works, but I would prefer to find the problem so I can use it in Docker. What can I do? I have no clue what the problem is. It seems to be a network problem, since the container doesn't have access to Ollama, and when I use --network=host I can't reach the WebUI either, but I can't find where the problem is.
-
First, get the Docker host IP. host.docker.internal is for Docker Desktop; I just use plain Docker, so we need to find the Docker host address on the container's network and use that instead.
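On plain Docker, the host is usually reachable from bridge-networked containers at the docker0 gateway, which can be read from Docker itself (assuming the default bridge network):

```shell
# The gateway of the default bridge network is the host's address as seen
# from inside bridge-networked containers (often 172.17.0.1).
docker network inspect bridge --format '{{ (index .IPAM.Config 0).Gateway }}'
```

That address can then replace host.docker.internal in the OLLAMA_API_BASE_URL value.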
-
When I do that, I then get "Connection Issue or Update Needed", because it doesn't have access to Ollama. My docker0 is 172.17.0.1 too.
-
On the host:
In the container:
-
I don't think i have a firewall enabled |
-
I have the same problem |
-
Try clearing your cache; that worked for me.
-
I had the same issue; it seems that Docker could not access the API from outside. Overall, I solved it by using the bridge network and changing the default port.
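A sketch of that approach, combining commands from earlier in the thread (the host port 3001 is an arbitrary choice, and OLLAMA_BASE_URL without the /api suffix is the variable newer Open WebUI versions expect):

```shell
# Hypothetical variant: keep the bridge network but publish the WebUI
# on a different host port (3001 here is arbitrary).
docker run -d -p 3001:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```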
-
I have the same problem as osafaimal. Ollama is running on the local machine; I can connect from the CLI and chat, and http://0.0.0.0:11434/api/version returns the version as the response. I tried all the solutions proposed above and so far no luck, always the same results. If I launch the container with --network=host (it doesn't matter whether I add -e PORT=8080 or not), the container says it's available on 0.0.0.0:8080, but it's not: not at localhost, not at the local IP, and port 8080 is NOT opened. I've really tried everything I could, and read and re-read all the docs and comments here. No luck. Please note that in the terminal, with ollama run llama2:13b-chat, I can chat fine and everything works really well...
-
Sorry, my mistake; I already tried that, as it's the first suggestion on the README page...
-
Not sure if this is useful to anybody. Part of the Docker logs:
-
OK, my problem is solved.
(where xxx is my local LAN IP address)
Once logged in, I changed the connection string to
From there on it works like magic!!!
-
Well, an update: it doesn't work. It worked once, and that was it. It wasn't logical to start with; I'm not sure how it ever managed to work.
-
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main

works for me; the Ollama Base URL becomes http://127.0.0.1:11434.
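Note that the two variable names seen in this thread differ in more than spelling: OLLAMA_API_BASE_URL took a URL ending in /api, while OLLAMA_BASE_URL takes the bare host URL. A throwaway helper (hypothetical, not part of Open WebUI) shows the conversion:

```shell
# Convert an old-style OLLAMA_API_BASE_URL value to the new
# OLLAMA_BASE_URL form by stripping a trailing slash and /api suffix.
normalize_base_url() {
  url="${1%/}"      # drop a trailing slash, if any
  url="${url%/api}" # drop the /api suffix, if present
  printf '%s\n' "$url"
}

normalize_base_url "http://host.docker.internal:11434/api"
normalize_base_url "http://127.0.0.1:11434"
```

Mixing the two forms (for example, passing a /api URL to OLLAMA_BASE_URL) is an easy way to end up with the "Connection Issue" banner even when Ollama itself is reachable.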
-
Log
Environment
OS: Sonoma 14.3
Docker command
ollama serve
Url:
Docker logs
-
Change it in the settings to: http://host.docker.internal:11434
-
Bug Report
Description
Bug Summary:
Open WebUI doesn't detect Ollama.
Steps to Reproduce:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
Expected Behavior:
To be able to use it.
Actual Behavior:
On the webpage I have:
Connection Issue or Update Needed
Oops! It seems like your Ollama needs a little attention.
We've detected either a connection hiccup or observed that you're using an older version. Ensure you're on the latest Ollama version
(version 0.1.16 or higher) or check your connection.
Trouble accessing Ollama? Click here for help.
Environment
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
Installation Method
With Docker, I tried:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

and:

docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

(same result), and with:

docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main

I have on the webpage http://localhost:3000:

Open WebUI Backend Required
Oops! You're using an unsupported method (frontend only). Please serve the WebUI from the backend.
Additional Information
[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!