One of the challenges when getting started with these LLM LSPs is getting a feel for what is reasonable to run on different hardware.
Would it be within scope to have a dedicated documentation section with a few tables recommending which model/config to use for different hardware? For example, if I only have a CPU, what is a reasonable LLM to run? If I have a 4080 GPU, what is a good choice? Otherwise I wonder if it feels a bit like a jungle for newcomers (at least that is my feeling right now :) ).
This is a really good point. This project has an education problem: to use it effectively you must not only have a decent understanding of LLMs, but also a good understanding of the LSP.
This is something I have been thinking about, and I have a few ideas. The main one is an auto-configurator website where you answer a series of questions, tick a few boxes, and are given a configuration.
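To make the auto-configurator idea concrete, here is a minimal sketch of the core mapping it might perform: hardware answers in, a suggested config out. The tiers, thresholds, and model-size labels below are purely illustrative assumptions, not recommendations from this project.

```python
# Hypothetical sketch of an auto-configurator's core logic.
# All thresholds and model-size labels are illustrative assumptions.

def recommend(gpu_vram_gb=None):
    """Return a rough model/config suggestion for the given hardware.

    gpu_vram_gb: GPU VRAM in GB, or None for CPU-only machines.
    """
    if gpu_vram_gb is None:
        # CPU only: favor a small quantized model for tolerable latency.
        return {"model": "small (~1-3B) quantized", "backend": "cpu"}
    if gpu_vram_gb < 12:
        # Modest GPU: a quantized ~7B model usually fits.
        return {"model": "~7B quantized", "backend": "gpu"}
    # Larger cards (e.g. a 4080 with 16 GB) can fit a mid-size quantized model.
    return {"model": "~13B quantized", "backend": "gpu"}

print(recommend())    # CPU-only suggestion
print(recommend(16))  # 4080-class suggestion
```

A real configurator would of course also ask about latency tolerance, RAM, and which editor/LSP client is in use, but the documentation tables proposed above could be generated from exactly this kind of mapping.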
Thanks for an amazing project!