Struggle to make it work - ollama url not saved #56
Unanswered
HyperUpscale asked this question in Q&A
Replies: 1 comment
Please read the troubleshooting guide. Also, based on the directory in the stack trace, you are using Windows. Please use a Linux host if possible.
Running the docker image...
In my compose file I put:
extra_hosts:
But the Ollama Endpoint URL is still "http://localhost:11434".
When I manually edit it to "http://host.docker.internal:11434" it finds the models and seems to start working, but it doesn't keep that setting.
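For reference, a minimal docker-compose sketch of the `extra_hosts` mapping that makes `host.docker.internal` resolve to the host gateway on Linux as well (the service name, image, and port here are assumptions for illustration, not the project's actual compose file):

```yaml
services:
  local-rag:            # hypothetical service name
    image: local-rag    # hypothetical image name
    ports:
      - "8501:8501"     # Streamlit's default port
    extra_hosts:
      # Lets the container reach an Ollama server running on the host
      # via http://host.docker.internal:11434
      - "host.docker.internal:host-gateway"
```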
🗂️ GitHub Repo
Select a GitHub.com repo
Processing...
✔️ LLM Initialized
✔️ Embedding Model Created
FileNotFoundError: [Errno 2] No such file or directory: '/home/appuser/data'
Traceback:
File "/home/appuser/utils/rag_pipeline.py", line 119, in rag_pipeline
    documents = llama_index.load_documents(save_dir)
File "/home/appuser/utils/llama_index.py", line 99, in load_documents
    for file in os.scandir(data_dir):
I can't make it work.
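The `FileNotFoundError` above means `/home/appuser/data` does not exist inside the container (for example, no volume is mounted there yet), so `os.scandir` fails. A minimal sketch of the kind of guard that avoids it (`load_documents_safely` is a hypothetical helper for illustration, not the project's actual `utils/llama_index.py` code):

```python
import os

def load_documents_safely(data_dir: str) -> list[str]:
    # Create the directory if it is missing, so os.scandir does not
    # raise FileNotFoundError on a fresh container with no data yet.
    os.makedirs(data_dir, exist_ok=True)
    # Return the names of regular files found in the directory.
    return sorted(entry.name for entry in os.scandir(data_dir) if entry.is_file())
```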