[Question] How can I connect the UI to text-generation-webui #1091
-
🧐 Problem Description | Proposed Solution
I can see in the readme that it connects to the OpenAI API, and text-generation-webui exposes an API at 127.0.0.1:5000/v1. Thanks for the help!

📝 Additional Information
No response
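For context, an endpoint like that can usually be exercised with any OpenAI-style client. A minimal sketch, assuming the `openai` Python package (v1+) is installed; the model name is a placeholder, since local servers typically ignore or remap it:

```python
# Minimal sketch: point an OpenAI-compatible client at the local
# text-generation-webui endpoint. "local-model" is a placeholder name.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:5000/v1",  # text-generation-webui's API
    api_key="sk-not-needed",              # local servers usually ignore the key
)

response = client.chat.completions.create(
    model="local-model",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```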
Replies: 6 comments
-
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
-
The current custom service domain setting does not appear to support a locally deployed model URL: chat requests are routed through our cloud servers, and those servers cannot reach a model service running on your local machine.
-
So running my own llama.cpp server is not supported? Only the official OpenAI API?
-
Currently, we only support online services with an OpenAI-style interface. The request code can be found here: the request is initiated by a cloud server, which cannot reach services on your local network or intranet.
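To illustrate the failure mode: 127.0.0.1 always refers to the machine making the request, so when the cloud backend dials an address like 127.0.0.1:5000 it connects to its own loopback interface, not your machine. A small sketch of the check involved:

```python
# Sketch of why a cloud-initiated request to a "local" URL fails.
# On your machine this prints True while text-generation-webui runs;
# on the cloud server it prints False, because 127.0.0.1 there is
# the cloud server itself, where nothing is listening on port 5000.
import socket

def is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(is_reachable("127.0.0.1", 5000))
```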
-
As far as I understand, most UIs like text-generation-webui provide an OpenAI-compatible API with completion / chat / v1 endpoints, so is it not compatible with this project?
-
@iChristGit if you run the app in Docker locally and then set the proxy URL here, I think it will work; a sketch of the idea follows below.
Yes, we plan to support offline models; please follow #151.
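A minimal sketch of the idea, assuming the self-hosted app reads a base-URL override from an environment variable (the name `OPENAI_PROXY_URL` is an assumption for illustration, not confirmed from this thread):

```python
# Sketch: when the app runs locally (e.g., in Docker on the same host),
# its backend can reach the local model server directly, so the cloud
# limitation above no longer applies. The env var name OPENAI_PROXY_URL
# is an assumption; host.docker.internal resolves to the Docker host on
# Docker Desktop (on Linux it needs --add-host=host.docker.internal:host-gateway).
import os
from openai import OpenAI

base_url = os.environ.get(
    "OPENAI_PROXY_URL",
    "http://host.docker.internal:5000/v1",  # local text-generation-webui
)

client = OpenAI(base_url=base_url, api_key="sk-not-needed")
reply = client.chat.completions.create(
    model="local-model",  # placeholder; local servers usually ignore it
    messages=[{"role": "user", "content": "ping"}],
)
print(reply.choices[0].message.content)
```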