
Allow developers to specify local LLMs (via Ollama APIs) in llm_providers #312

Open
salmanap opened this issue Nov 27, 2024 · 0 comments
Comments

@salmanap (Contributor)
Today, archgw lets developers configure LLMs (via the llm_provider primitive) that are API-key based. We need to add support for LLMs that may be running locally via Ollama. Support for vLLM isn't available on macOS, so we will add that as a separate roadmap item in the near future.
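For reference, a minimal sketch of the integration surface this would target: Ollama exposes an OpenAI-compatible chat completions endpoint (assumed here at `http://localhost:11434/v1`), so an `llm_providers` entry pointing at a local instance would ultimately route requests like the one below. The model name `llama3` and the placeholder key are assumptions for illustration, not part of archgw's schema.

```python
# Sketch: calling a local Ollama instance through its OpenAI-compatible API.
# Assumes Ollama is running locally and a model (e.g. "llama3") has been
# pulled; an archgw llm_providers entry for a local LLM would route to the
# same endpoint instead of a hosted, API-key-based provider.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # placeholder; a local Ollama server does not validate the key
)

response = client.chat.completions.create(
    model="llama3",  # any model already pulled locally, e.g. `ollama pull llama3`
    messages=[{"role": "user", "content": "Hello from a local LLM"}],
)
print(response.choices[0].message.content)
```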

salmanap converted this from a draft issue Nov 27, 2024
salmanap modified the milestone: release 0.2.0 Nov 27, 2024
salmanap changed the title from "Add support for LLMs configured in llm_providers to point to local instances (via Ollama)" to "Allow developers to specify local LLMs (via Ollama APIs) in llm_providers" Nov 27, 2024
Labels: None yet
Projects: Status: Todo
Development: No branches or pull requests
Participants: 2