
Can I use other privatized and open-source LLM models? #94

Open

JAVA-LW opened this issue Oct 31, 2024 · 5 comments

Comments

JAVA-LW commented Oct 31, 2024

No description provided.

@StanGirard (Contributor)

Yes, of course, but it needs a few modifications. What do you need?

@tblattner

I was going to open a separate issue, but I'll hop onto this one.

I'd like to use LLMs hosted with either vLLM or llama.cpp, with models like Llama 3.1 70B, or LLaVA for multi-modal support.

heltonteixeira commented Dec 5, 2024

+1. It would be nice to have an option to use a local Ollama server, with models like Llama Vision and LLaVA.

@mzeidhassan

+1 for Ollama, and maybe LM Studio.

@netandreus

Ollama now supports the llama3.2-vision model (https://ollama.com/library/llama3.2-vision). It would be very nice to support it.
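For anyone experimenting in the meantime: Ollama exposes an OpenAI-compatible chat endpoint, so a minimal sketch of the request body looks like the following. This is an assumption-laden example, not project code: the endpoint URL, model name, and helper function are all illustrative, and the actual integration would depend on how this project wires up its LLM backend.

```python
import json

# Assumed default for a local Ollama server's OpenAI-compatible endpoint;
# adjust host/port to your setup. (Hypothetical, not project configuration.)
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"


def build_chat_payload(model: str, prompt: str) -> str:
    """Serialize a minimal chat-completion request body for a local server."""
    body = {
        "model": model,                                   # e.g. "llama3.2-vision"
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,                                  # single response, no streaming
    }
    return json.dumps(body)


payload = build_chat_payload("llama3.2-vision", "Describe this document.")
print(json.loads(payload)["model"])  # → llama3.2-vision
```

The payload would then be POSTed to `OLLAMA_URL` with a `Content-Type: application/json` header; since the endpoint mimics the OpenAI API shape, an OpenAI-style client pointed at the local base URL should also work.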


6 participants