This repository has been archived by the owner on Sep 12, 2024. It is now read-only.

error with very large chunk_size #98

Answered by fcakyon
seoeaa asked this question in Q&A

Hello @seoeaa, we appreciate your feedback on autollm 🙏

autollm supports documents of arbitrary length, so your issue is not caused by ingesting a lot of text.

Your issue will be fixed once you select a lower chunk_size, such as 512 or 1024. 👍

Check this comment for more context: https://github.com/safevideo/autollm/issues/77#issuecomment-1792349351

Comment:

I changed it, otherwise it didn't work: `service_context_params = {"chunk_size": 4096, "embed_model": "local"}`

@seoeaa, you should be using smaller chunk sizes, such as 1024 or 512. chunk_size is a different parameter than context_size.

The context size is automatically arranged by autollm for the target LLM; you don't have to set it manually.
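
For anyone hitting the same error, here is a minimal sketch of passing a smaller chunk_size; `AutoQueryEngine.from_defaults` and `read_files_as_documents` are taken from the project README, and their exact signatures may differ across autollm versions:

```python
from autollm import AutoQueryEngine, read_files_as_documents

# Load documents from a local folder (the path is a placeholder).
documents = read_files_as_documents(input_dir="path/to/your/documents")

# Pass a smaller chunk_size through service_context_params;
# 512 or 1024 both work, while 4096 is too large and triggers the error.
query_engine = AutoQueryEngine.from_defaults(
    documents=documents,
    service_context_params={
        "chunk_size": 1024,
        "embed_model": "local",
    },
)

response = query_engine.query("your question here")
print(response)
```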

This discussion was converted from issue #85 on November 03, 2023 12:49.