Feat/llama index version update #142
base: main
Conversation
…Zipstack/unstract-sdk into feat/llama-index-version-update
Signed-off-by: Praveen Kumar <[email protected]>
LGTM overall, left some comments / suggestions around the version and the use of MistralError instead of its SDKError, which might confuse us.
Co-authored-by: Chandrasekharan M <[email protected]> Signed-off-by: Praveen Kumar <[email protected]>
"llama-index-vector-stores-qdrant==0.4.2", | ||
"llama-index-llms-openai==0.3.12", | ||
"llama-index-llms-palm==0.3.0", | ||
"llama-index-llms-mistralai==0.3.1", | ||
#Todo: Remove the version locking for mistralai below | ||
#once the the following error is resolved from mistralai side | ||
#Unable to import llm adapters : No module named 'mistralai.models.chat_completion' |
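For context (not part of the PR), a minimal sketch of how one could check whether the import failure named in the TODO still applies; it assumes only that some version of the mistralai package is installed locally:

```python
# Sketch: probe for the legacy module whose removal triggered
# "No module named 'mistralai.models.chat_completion'".
import importlib.util

spec = importlib.util.find_spec("mistralai.models.chat_completion")
if spec is None:
    print("Legacy module missing: the TODO's error would still occur on this version.")
else:
    print("Legacy module present: the version lock could be relaxed.")
```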
@pk-zipstack Hope this mistral error is not occurring with 1.2.5. Earlier we had hard-pinned to 0.4.2 because of this error, as mentioned in the TODO.
@gaya3-zipstack This error is not occurring with 1.2.5. Can I go ahead and remove the comments?
```diff
@@ -35,7 +35,7 @@ def parse_llm_err(e: Exception, llm_adapter: LLMAdapter) -> LLMError:
         err = OpenAILLM.parse_llm_err(e)
     elif isinstance(e, AnthropicAPIError):
         err = AnthropicLLM.parse_llm_err(e)
-    elif isinstance(e, MistralException):
+    elif isinstance(e, MistralError):
```
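For orientation, a self-contained sketch of the pattern the diff suggests — importing mistralai's SDKError under the alias MistralError (the alias import is an assumption based on the review comments, and LLMError here is a stand-in for the SDK's own class):

```python
# Sketch only: assumes mistralai 1.x, where SDKError (in mistralai.models)
# replaces the removed MistralException. The alias mirrors the diff above.
from mistralai.models import SDKError as MistralError


class LLMError(Exception):
    """Stand-in for unstract-sdk's LLMError, just for this sketch."""


def parse_llm_err(e: Exception) -> LLMError:
    # Map a provider-specific exception to the SDK's error type.
    if isinstance(e, MistralError):
        # mistralai 1.x raises SDKError for API failures; surface its message.
        return LLMError(str(e))
    # Other providers (OpenAI, Anthropic, ...) are handled in the real code.
    return LLMError(f"Unhandled LLM error: {e}")
```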
Hope we did a static check on mistralai.exceptions and mistralai.models to see that MistralError now includes MistralException.
@gaya3-zipstack I checked the latest release notes and ran a static analysis on mistralai.exceptions and mistralai.models: SDKError (MistralError) now replaces MistralException and handles the same errors, ensuring compatibility.
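A quick static check along those lines (a sketch, not from the PR; it assumes mistralai==1.2.5 is installed and, per the comments here, that SDKError lives in mistralai.models):

```python
# Sketch: print the ancestry of every exception class mistralai.models
# exports, to confirm where SDKError sits, and show that the old
# MistralException import path no longer resolves.
import inspect

import mistralai.models as models

for name, obj in inspect.getmembers(models, inspect.isclass):
    if issubclass(obj, Exception):
        mro = " -> ".join(cls.__name__ for cls in obj.__mro__)
        print(f"{name}: {mro}")

try:
    from mistralai.exceptions import MistralException  # noqa: F401
except ImportError as err:
    print(f"MistralException no longer importable: {err}")
```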
llama-index is also directly used as a dependency in OSS.
Hope there will be a corresponding OSS PR to take care of these other repos where we use it.
Ideally, we will need to remove all this. We need to pick it up sometime.
Please add it as technical debt by creating a JIRA for the same @pk-zipstack
What
- Replaced `MistralException` with `SDKError` for the updated `mistralai 1.2.5` dependency.

Why
- The llama-index version update pulls in `mistralai 1.2.5`, which no longer provides `MistralException`.

How
- Replaced `MistralException` with `SDKError`.

Related Issues or PRs

Dependencies Versions / Env Variables

Notes on Testing
- Tested `single-pass extraction`, `summarizer`, `LLMChallenger`, etc.

Screenshots

Checklist
- I have read and understood the Contribution Guidelines.