Incorporate Gemini & Gemini Vision support #716
Hey @supermomo668! First of all, agreed with your feature request, I want that too haha. Implementation-wise, we've been thinking that the best approach for supporting additional models will be to set up instructions for using a proxy server like LiteLLM and to allow you to set a custom OpenAI base URL in the chat configuration. If we do take that up as a recommendation, we should certainly improve our documentation as well.
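A minimal sketch of the proxy approach described above, assuming an OpenAI-compatible LiteLLM proxy running locally. The helper name `make_chat_config`, the config keys, and the base URL are illustrative assumptions, not Khoj's actual settings API:

```python
# Hypothetical sketch: route chat requests to Gemini through a LiteLLM
# proxy by overriding the OpenAI-compatible base URL in the chat config.
# make_chat_config and its keys are illustrative, not Khoj's real API.

def make_chat_config(model: str, base_url: str, api_key: str) -> dict:
    """Build an OpenAI-compatible chat config pointing at a proxy server."""
    return {
        "model": model,        # e.g. a Gemini model exposed via LiteLLM
        "base_url": base_url,  # proxy endpoint instead of api.openai.com
        "api_key": api_key,    # proxy key; the real Gemini key lives proxy-side
    }

config = make_chat_config(
    model="gemini/gemini-pro",         # LiteLLM-style model name (assumed)
    base_url="http://localhost:4000",  # LiteLLM proxy's default port (assumed)
    api_key="sk-anything",
)
```

Any OpenAI-compatible client could then be pointed at `config["base_url"]`, letting the proxy translate requests to Gemini without changes to the chat client itself.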
Provide Chat Model support for Gemini
With the recent advances in Gemini and its long-context-window features, I would like to add Gemini as a chat model that administrators can offer as a choice of chat model.
I would like this issue to be assigned to me, since I'd like to use the feature as well.