LiteLLM Integration in the Backend #604
Replies: 3 comments 3 replies
-
I've actually been thinking about incorporating LiteLLM directly into the webui as well; this will be helpful for a lot of people!
-
Ollama now has OpenAI API compatibility: https://github.com/ollama/ollama/releases/tag/v0.1.24

I've grown even more convinced that the project could significantly simplify its codebase by exclusively using one API middleware library, which would allow multiple endpoints to be aggregated efficiently within the interface. Since the vast majority of inference backends are converging on the OpenAI API specification, it seems feasible to make the UI backend-agnostic by supporting only the OpenAI API. This would greatly reduce complexity and increase the system's flexibility.
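For reference, here is a minimal sketch of what that compatibility looks like from the client side: the official `openai` Python client pointed at a local Ollama server. The model name `llama2` and the default port are assumptions; any locally pulled model works.

```python
from openai import OpenAI

# Ollama >= 0.1.24 exposes an OpenAI-compatible endpoint under /v1.
# The api_key is required by the client library but ignored by Ollama.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama2",  # any model tag pulled into the local Ollama instance
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response.choices[0].message.content)
```

If most backends expose this same surface, the UI only ever needs to speak one protocol and swap out `base_url`.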
-
+1. Has there been any progress in this direction?
-
Hi everyone,
I'd like to propose integrating LiteLLM directly into the Ollama WebUI project. This would give us greater flexibility in which OpenAI-compatible external APIs we can use, and more control over how those APIs are used and configured.
There are a few different ways we could go about integrating LiteLLM, for example:
- embedding it as a Python library in the backend and calling `litellm.completion()` directly (a sketch follows below), or
- running the LiteLLM proxy as a standalone service and pointing the WebUI's existing OpenAI-compatible client at its endpoint.
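To make the library route concrete, here is a minimal sketch, assuming `litellm` is installed in the backend and a local Ollama instance is serving `llama2` (both the model name and the prompt are placeholders):

```python
import litellm

# One call signature for every backend; the "ollama/" prefix tells
# LiteLLM which provider adapter to use. Other providers work the same
# way, e.g. "gpt-3.5-turbo" for OpenAI or "mistral/mistral-tiny".
response = litellm.completion(
    model="ollama/llama2",
    api_base="http://localhost:11434",  # where the local Ollama server listens
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response.choices[0].message.content)
```

Because LiteLLM normalizes every provider's response into the OpenAI shape, the rest of the backend wouldn't need to care which endpoint actually served the request.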
Regardless of which approach we take, I believe that integrating LiteLLM into Ollama WebUI would provide significant benefits in terms of flexibility and control. I'd love to hear everyone's thoughts on this proposal!