-
An error occurs if you insert a lot of text. It would be better to read the full text from the input window rather than skip such a request. Traceback (most recent call last):
Replies: 2 comments
-
Hi @seoeaa, can you please share the code snippet that caused this error?
-
Hello @seoeaa, we appreciate your feedback on autollm 🙏. autollm supports documents of unlimited length, so your issue is not related to inserting a lot of text. It will be fixed once you select a lower `chunk_size`, such as 512 or 1024. 👍

Check this comment for more context: https://github.com/safevideo/autollm/issues/77#issuecomment-1792349351

Quoted comment:

> I changed it, otherwise it didn't work:
> `service_context_params = {"chunk_size": 4096, "embed_model": "local"}`

@seoeaa, you should be using smaller chunk sizes such as 1024 or 512. `chunk_size` is a different parameter than `context_size`. The context size is arranged automatically by autollm for the target LLM; you don't have to set it manually.
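To illustrate why a smaller `chunk_size` helps, here is a minimal sketch in plain Python (a toy stand-in, not autollm's actual splitter) of cutting a long document into fixed-size chunks. The smaller each chunk, the more room is left in the target LLM's context window for the prompt and the response:

```python
def chunk_text(text: str, chunk_size: int = 512) -> list[str]:
    """Naive fixed-size chunking (by characters, not tokens).

    A real splitter works on tokens and may overlap chunks, but the
    principle is the same: each chunk must fit inside the LLM's
    context window alongside the rest of the prompt.
    """
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]


# A 5000-character string stands in for a very long pasted document.
long_text = "x" * 5000

chunks = chunk_text(long_text, chunk_size=512)
print(len(chunks))                   # number of chunks produced
print(max(len(c) for c in chunks))   # no chunk exceeds chunk_size
```

With `chunk_size=4096`, as in the quoted snippet, a single chunk plus the prompt can overflow the context window of smaller models, which is consistent with the fix suggested above.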