Working Within Context Limits #230
Replies: 3 comments 1 reply
-
Same here. I'm struggling to use models other than those in the default Agent Zero files, but I keep running into issues. I recently decided to uninstall and start from scratch, and now I can't make it work even with the default models. It doesn't find my Groq key, although it is correctly set in the .env file.
-
Hello, how should I configure it to use free or local models with Ollama? Thank you.
-
Language models have a limited context window. To avoid exceeding it, summarize previous conversation regularly, break larger tasks into smaller steps, and ask narrow, specific questions rather than broad ones. If you need to process a large list, run the command outside the chat and feed the results back in small sections.
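The "feed the results back in small sections" advice can be sketched in code. The chunk size and the rough 4-characters-per-token estimate below are assumptions for illustration, not Agent Zero internals:

```python
# Sketch: split a large command output into chunks so that no single
# message blows past the model's context window. Assumes a rough
# heuristic of ~4 characters per token; tune max_tokens for your model.

def chunk_text(text: str, max_tokens: int = 500) -> list[str]:
    """Split text on line boundaries into pieces of at most ~max_tokens."""
    max_chars = max_tokens * 4
    chunks: list[str] = []
    current = ""
    for line in text.splitlines(keepends=True):
        # Flush the current chunk before it would exceed the budget.
        if current and len(current) + len(line) > max_chars:
            chunks.append(current)
            current = ""
        current += line
    if current:
        chunks.append(current)
    return chunks

if __name__ == "__main__":
    # Simulate a long package listing that would otherwise consume
    # the whole context in one message.
    big_output = "\n".join(f"package-{i} 1.0.{i}" for i in range(1000))
    for i, chunk in enumerate(chunk_text(big_output)):
        # In practice, feed each chunk to the agent separately and keep
        # only a short summary of earlier chunks in the conversation.
        print(f"chunk {i}: {len(chunk)} chars")
```

Each chunk stays under the character budget (as long as no single line exceeds it), and concatenating the chunks reproduces the original output, so nothing is lost between messages.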
-
I'm curious how others are managing to work within the context limits of the models they use. I've only just started using the project, but I very quickly ran into those limits. In one example, the agent ran a command that listed the NIM packages installed in its current code execution environment, and the output consumed the entire context window. I haven't dug too far behind the scenes, but in the UI at least, it's not immediately clear whether there is a way to delete a message from the current chat window.