Feat/llama index version update #142

Open · wants to merge 12 commits into main

Conversation

pk-zipstack (Contributor):

What

  • Updated LlamaIndex libraries to their latest versions.
  • Replaced deprecated MistralException with SDKError for the updated mistralai 1.2.5 dependency.

Why

  • The LlamaIndex library updates bring performance improvements and bug fixes.

How

  • Bumped LlamaIndex dependencies to their latest versions in pyproject.toml.
  • Replaced all instances of MistralException with SDKError (see the sketch below).
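For illustration, a minimal sketch of what this swap looks like at a call site, assuming mistralai 1.2.x exposes SDKError under mistralai.models and the 1.x client exposes chat.complete. The function and names here are illustrative, not the SDK's actual adapter code:

# mistralai >= 1.0 removed MistralException; SDKError is its closest replacement.
# Old import (pre-1.0): from mistralai.exceptions import MistralException
from mistralai import Mistral
from mistralai.models import SDKError


def complete_chat(client: Mistral, model: str, prompt: str) -> str:
    try:
        response = client.chat.complete(
            model=model,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content
    except SDKError as e:  # previously: except MistralException as e:
        # SDKError carries the HTTP status code and response details.
        raise RuntimeError(f"Mistral call failed ({e.status_code}): {e.message}") from e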

Related Issues or PRs

Dependencies Versions / Env Variables

  • LlamaIndex: updated to the latest stable releases (pinned in pyproject.toml); mistralai: 1.2.5.

Notes on Testing

  • Tested the LLMs in prompt-studio.
  • Tested plugins such as single-pass extraction, summarizer, LLMChallenger, etc.

Screenshots

  • N/A

Checklist

I have read and understood the Contribution Guidelines.

chandrasekharan-zipstack (Contributor) left a comment:

LGTM overall; left some comments/suggestions around the version and the use of MistralError instead of its subclass SDKError, which might confuse us.

Resolved review threads:
  • src/unstract/sdk/__init__.py
  • src/unstract/sdk/adapters/llm/exceptions.py (outdated, two threads)
  • src/unstract/sdk/adapters/llm/mistral/src/mistral.py (outdated, two threads)
pk-zipstack and others added 4 commits January 8, 2025 00:22
Co-authored-by: Chandrasekharan M <[email protected]>
Signed-off-by: Praveen Kumar <[email protected]>
"llama-index-vector-stores-qdrant==0.4.2",
"llama-index-llms-openai==0.3.12",
"llama-index-llms-palm==0.3.0",
"llama-index-llms-mistralai==0.3.1",
# TODO: Remove the version locking for mistralai below
# once the following error is resolved on the mistralai side:
# Unable to import llm adapters : No module named 'mistralai.models.chat_completion'
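As background, a hedged sketch of why that pin existed: mistralai 1.x restructured its modules, so the pre-1.0 import path quoted in the error above no longer resolves. The 1.x names below are illustrative equivalents, not code from this PR:

# Pre-1.0 layout (raises the ModuleNotFoundError quoted above on mistralai >= 1.0):
#   from mistralai.models.chat_completion import ChatMessage
# 1.x layout flattens message types directly under mistralai.models:
from mistralai.models import UserMessage

message = UserMessage(content="Hello")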
gaya3-zipstack (Contributor):
@pk-zipstack Hope this mistral error is not occurring with 1.2.5. Earlier we had hard-pinned to 0.4.2 because of this error, as mentioned in the TODO.

pk-zipstack (Contributor, Author):

@gaya3-zipstack This error is not occurring with 1.2.5. Can I go ahead and remove the comments?

@@ -35,7 +35,7 @@ def parse_llm_err(e: Exception, llm_adapter: LLMAdapter) -> LLMError:
         err = OpenAILLM.parse_llm_err(e)
     elif isinstance(e, AnthropicAPIError):
         err = AnthropicLLM.parse_llm_err(e)
-    elif isinstance(e, MistralException):
+    elif isinstance(e, MistralError):
gaya3-zipstack (Contributor):

Hope we did a static check on mistralai.exceptions and mistralai.models to confirm that MistralError now covers everything MistralException did.

pk-zipstack (Contributor, Author):

@gaya3-zipstack I checked the latest release notes and ran a static analysis on mistralai.exceptions and mistralai.models. SDKError (which subclasses MistralError) now replaces MistralException and handles the same errors, ensuring compatibility.
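For reference, a quick way to do that static check by hand, assuming only that SDKError imports from mistralai 1.2.x as shown (this snippet is illustrative, not part of the PR):

# Print SDKError's method resolution order to see what it subclasses
# (e.g., whether MistralError appears among its bases).
import inspect

from mistralai.models import SDKError

for cls in inspect.getmro(SDKError):
    print(cls.__name__)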

gaya3-zipstack (Contributor) left a comment:

llama-index is also directly used as a dependency in OSS.

[screenshot: llama-index listed as a direct dependency in OSS]

Hope there will be a corresponding OSS PR to take care of the other repos where we use it. Ideally, we will need to remove all this; we need to pick it up sometime. Please add it as technical debt by creating a JIRA for the same, @pk-zipstack.
