Can I use other private or open-source LLM models? #94
Comments
Yes, of course, but it needs a few modifications. What do you need?
I was going to open a separate issue, but I'll hop onto this one. I'd like to run against LLMs hosted with either vLLM or llama.cpp, with models like Llama 3.1 70B, or LLaVA for multi-modal.
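For reference, both vLLM (`vllm serve`) and llama.cpp's `llama-server` expose OpenAI-compatible HTTP endpoints, so wiring in such a backend could look roughly like the sketch below. This is a minimal illustration, not this project's actual configuration: the base URL, port, placeholder API key, and model name are all assumptions.

```python
# Minimal sketch: point the standard OpenAI Python client at a local,
# OpenAI-compatible server. vLLM serves on :8000 by default; llama.cpp's
# llama-server typically uses :8080. Values below are examples.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local vLLM / llama-server endpoint
    api_key="not-needed",                 # local servers usually ignore the key
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-70B-Instruct",  # example model id
    messages=[{"role": "user", "content": "Hello from a local LLM."}],
)
print(response.choices[0].message.content)
```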
+1, it would be nice to have an option to use a local Ollama server, with models like Llama Vision and LLaVA.
+1 for Ollama, and maybe LM Studio. A sketch of what that could look like follows below.
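As a rough illustration of the Ollama option: a local Ollama server can be reached through its native REST API, and it also exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1` (LM Studio serves a similar one at `http://localhost:1234/v1` by default). The model names and prompt below are examples, not anything this project ships.

```python
# Minimal sketch: call a local Ollama server via its native /api/chat endpoint.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",  # Ollama's default local address
    json={
        "model": "llama3.2-vision",  # or "llava" for multi-modal; examples only
        "messages": [{"role": "user", "content": "Describe local inference."}],
        "stream": False,  # return a single JSON response instead of a stream
    },
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```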
Ollama now supports …