Is it possible to use the container without an internet connection? #820
-
I have an unreliable internet connection. Today, I tried to use the Open WebUI container from localhost but was unable to connect to the container because it requires an internet connection to start. Is there a way for me to run it totally offline? I've been using Ollama as an alternative to ChatGPT when my internet connection is down.
Replies: 3 comments 1 reply
-
Can't speak for the container, but I'm working through an offline install myself, or rather one without Hugging Face access. The build is OK, but even though I've copied the all-MiniLM-L6-v2 model into place, it still seems to want to connect to Hugging Face, so it will need a bit of troubleshooting.
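In case it helps with the troubleshooting: the "wants to connect even though the model is in place" behavior is usually the Hugging Face client checking for updates at load time. A minimal sketch, assuming the standard huggingface_hub / sentence-transformers environment variables (the `/path/to/models` location is a placeholder for wherever you copied all-MiniLM-L6-v2):

```shell
# Hypothetical offline setup -- adjust the path to where you copied the model.
export HF_HUB_OFFLINE=1                              # tell huggingface_hub to never go online
export SENTENCE_TRANSFORMERS_HOME=/path/to/models    # look up all-MiniLM-L6-v2 locally
```

With `HF_HUB_OFFLINE=1` set, a model that is fully present in the local cache should load without any network access; if it still fails, the local copy is probably incomplete rather than the connection being the problem.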
-
Yes, you can run Open WebUI locally without an internet connection. Of course, you need a connection once to pull the image.
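Roughly, the flow looks like this. A minimal sketch, assuming the official image name and the commonly documented environment variables (`HF_HUB_OFFLINE` is the standard huggingface_hub offline switch; the port and volume are just typical choices, not requirements):

```shell
# While you still have a connection, pull the image once:
docker pull ghcr.io/open-webui/open-webui:main

# Later, start it fully offline. HF_HUB_OFFLINE=1 stops the embedded
# RAG code from trying to reach Hugging Face at startup.
docker run -d \
  -p 3000:8080 \
  -e HF_HUB_OFFLINE=1 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

After that, the UI is reachable on localhost:3000 with no network required, as long as Ollama is also running locally.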
-
I just created a podman quadlet, and it seems to have been fixed for a while now, so I'm closing the discussion.
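For anyone landing here later, a quadlet for this is a small unit file. This is only a sketch under assumptions (image tag, port, volume name, and the offline environment variables are my choices, not necessarily what the original poster used):

```ini
; ~/.config/containers/systemd/open-webui.container
; Hypothetical quadlet sketch -- adjust image, ports, and volumes to taste.
[Unit]
Description=Open WebUI

[Container]
Image=ghcr.io/open-webui/open-webui:main
PublishPort=3000:8080
Volume=open-webui:/app/backend/data
; Assumed: keep the container from reaching Hugging Face at startup
Environment=HF_HUB_OFFLINE=1

[Service]
Restart=always

[Install]
WantedBy=default.target
```

Then `systemctl --user daemon-reload` and `systemctl --user start open-webui` should bring it up, offline included.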