Add groq API #1407

Open
iamnvt opened this issue Apr 21, 2024 · 16 comments
@iamnvt commented Apr 21, 2024

This is a game changer for speed with Llama 3.

@spikecodes

+1 for this ^

@yangcheng

It could be as simple as having Cursor respect an 'OPENAI_API_BASE_URL' env variable.
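A minimal sketch of that idea (not Cursor's actual internals), assuming the `openai` v1.x Python SDK and a `GROQ_API_KEY` environment variable:

```python
# Sketch: an OpenAI-compatible client that honors a base-URL override from
# the environment, so the same code can talk to Groq's OpenAI-compatible
# endpoint instead of api.openai.com.
import os
from openai import OpenAI

client = OpenAI(
    base_url=os.environ.get("OPENAI_API_BASE_URL", "https://api.openai.com/v1"),
    api_key=os.environ["GROQ_API_KEY"],  # assumed env var for this sketch
)

# With OPENAI_API_BASE_URL=https://api.groq.com/openai/v1 this request goes
# to Groq; the model name below is a Groq-hosted Llama 3 variant.
resp = client.chat.completions.create(
    model="llama3-70b-8192",
    messages=[{"role": "user", "content": "Hello from Cursor"}],
)
print(resp.choices[0].message.content)
```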

@iamnvt (Author) commented Apr 23, 2024 via email

@CalamariDude

This would be awesome. Has anyone figured out a workaround for now? It would be a game changer, probably 10x my dev speed.

@yangcheng commented Apr 24, 2024

> This would be awesome. Has anyone figured out a workaround for now? It would be a game changer, probably 10x my dev speed.

Yes! Just override the OpenAI base URL to Groq's: https://api.groq.com/openai/v1

[Screenshot (2024-04-24): Cursor settings with the OpenAI base URL overridden]
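A quick way to confirm the override is working is to list the models the endpoint serves. A small sketch, again assuming the `openai` v1.x Python SDK and a `GROQ_API_KEY` env variable:

```python
# Sketch: point the client at Groq's OpenAI-compatible endpoint and list
# the models it exposes (e.g. the hosted Llama 3 variants).
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)

for model in client.models.list():
    print(model.id)
```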

@CalamariDude

@yangcheng Thank you! For some reason the speed seems about the same as the current GPT-4 in Cursor. Is this your experience?

@yangcheng

For me the chat is noticeably faster. Command+K is about the same; maybe the bottleneck is somewhere else. Maybe you can try the 8B models to be sure?

@CalamariDude

> For me the chat is noticeably faster. Command+K is about the same; maybe the bottleneck is somewhere else. Maybe you can try the 8B models to be sure?

It's about the same for small models on Groq :/

@kcolemangt

Today I made llm-router, enabling the use of ⌘K or ⌘L, followed by ⌘/, to toggle between OpenAI and Groq. If there's interest, I can open source it and cut a release.

@spikecodes

@yangcheng Thanks! But how do we pass our API key? https://console.groq.com/docs/api-keys says:

> API keys are required for accessing the APIs.

@kcolemangt commented May 2, 2024

> @yangcheng Thanks! But how do we pass our API key? https://console.groq.com/docs/api-keys says:
>
> > API keys are required for accessing the APIs.

@spikecodes Set the base URL and API key to Groq. However, you will lose access to OpenAI models until you remove the base URL and replace your Groq key with your OpenAI key.

The llm-router option I mentioned above allows you to provide the Groq API key via an environment variable, enabling you to use OpenAI models and others, like Ollama, simultaneously.
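Purely as an illustration of the general idea (this is not the actual llm-router code, and FastAPI/httpx and the `groq/` prefix are assumptions for the sketch): a minimal local proxy can expose one OpenAI-compatible endpoint and forward each request to Groq or OpenAI depending on a model-name prefix, with the Groq key taken from the environment.

```python
# Hypothetical routing-proxy sketch, NOT the real kcolemangt/llm-router.
# Requires: pip install fastapi uvicorn httpx, plus GROQ_API_KEY and
# OPENAI_API_KEY in the environment. Streaming responses are not handled.
import os
import httpx
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()

# prefix -> (OpenAI-compatible base URL, API key); "" is the default backend.
BACKENDS = {
    "groq/": ("https://api.groq.com/openai/v1", os.environ["GROQ_API_KEY"]),
    "": ("https://api.openai.com/v1", os.environ["OPENAI_API_KEY"]),
}

@app.post("/v1/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()
    model = body.get("model", "")
    # First matching prefix wins; strip it before forwarding upstream.
    prefix, (base_url, key) = next(
        (p, cfg) for p, cfg in BACKENDS.items() if model.startswith(p)
    )
    body["model"] = model[len(prefix):]
    async with httpx.AsyncClient(timeout=120) as client:
        upstream = await client.post(
            f"{base_url}/chat/completions",
            headers={"Authorization": f"Bearer {key}"},
            json=body,
        )
    return JSONResponse(status_code=upstream.status_code, content=upstream.json())
```

Run it with something like `uvicorn router_sketch:app --port 8000`, point the editor's OpenAI base URL at `http://localhost:8000/v1`, and switch backends by model name (e.g. `groq/llama3-70b-8192` vs. plain `gpt-4`).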

@spikecodes

Thanks @kcolemangt! I didn't realize I could pass the Groq API key the same way as the OpenAI API key.

Regarding llm-router, I don't think it would help me much, as I don't really need to switch back and forth between models. I just want a good, fast option.

@CalamariDude commented May 7, 2024

Does anyone know why Cursor is still slow even when using Groq? I would imagine the completion should be slightly faster if the LLM part is done in 1/8th of the time... Is this a limitation of extensions built with VSCodium? Increasing Cursor's speed seems like the next important milestone to make this the best AI editor. @truell20 @Sanger2000 Do you know who would be the best person to ask about this?

@yangcheng

> @spikecodes Set the base URL and API key to Groq. However, you will lose access to OpenAI models until you remove the base URL and replace your Groq key with your OpenAI key.
>
> The llm-router option I mentioned above allows you to provide the Groq API key via an environment variable, enabling you to use OpenAI models and others, like Ollama, simultaneously.

Saw your tweet, very cool. How can we use it?

@krishnapraveen7

> Today I made llm-router, enabling the use of ⌘K or ⌘L, followed by ⌘/, to toggle between OpenAI and Groq. If there's interest, I can open source it and cut a release.

Yes please, God I need this.

@kcolemangt

https://github.com/kcolemangt/llm-router
