It appears local LLMs have reached the point where they're easy enough for the average user to run, and I just got an M3 Pro MacBook Pro for testing.
The servers packaged with apps like LM Studio follow OpenAI's API spec, which means they should work with simpleaichat, aside from a few warnings about unsupported config parameters.
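For context, here's a minimal sketch of what "OpenAI-compatible" means in practice: LM Studio's local server exposes a `/v1/chat/completions` endpoint that accepts the same request body OpenAI's API does. This uses plain `requests` rather than simpleaichat itself, just to show the request shape the library would be sending; the localhost port (1234 is LM Studio's usual default) and the model identifier are assumptions that depend on the local setup.

```python
import requests

# LM Studio's local server exposes an OpenAI-compatible endpoint.
# Port 1234 is LM Studio's usual default; adjust to whatever the app reports.
BASE_URL = "http://localhost:1234/v1"

payload = {
    # Model name is whatever identifier the local server assigns to the loaded model.
    "model": "local-model",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello in five words."},
    ],
    "temperature": 0.7,
}

# No real API key is needed for a local server, but keeping the header
# in the OpenAI shape means the same code works against api.openai.com.
resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": "Bearer not-needed"},
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Since the endpoint and payload match OpenAI's, pointing simpleaichat at the local base URL should mostly just work; parameters the local backend doesn't support (e.g. some sampling options) would be the main source of warnings.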