
[Fix] Unified LLM Whisperer adapters #144

Open
wants to merge 1 commit into base: main

Conversation

jagadeeswaran-zipstack
Contributor

What

  • Remove the LLMWhisperer v1 adapter and add it as an option in the V2 adapter. Versions v1 and v2 should appear in a dropdown, with v2 as the default.
    ...
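A dropdown like this is usually declared in the adapter's settings schema. Below is a minimal sketch of what that could look like; the field name `version`, the schema shape, and the endpoint path are illustrative assumptions, not the merged code:

```python
# Hypothetical JSON-schema fragment exposing the LLMWhisperer version as a
# dropdown with v2 pre-selected. Field names here are assumptions.
VERSION_SCHEMA = {
    "type": "object",
    "properties": {
        "version": {
            "type": "string",
            "title": "LLMWhisperer version",
            "enum": ["v2", "v1"],  # rendered as a dropdown in the UI
            "default": "v2",       # v2 is the default per the PR description
        }
    },
    "required": ["version"],
}

def base_url(settings: dict) -> str:
    """Branch on the selected version at runtime (path is a placeholder)."""
    version = settings.get("version", "v2")
    return f"/api/{version}/whisper"
```

With this shape, an unconfigured adapter falls back to v2, so existing users see no behavior change unless they explicitly pick v1.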

Why

  • For a better UI/UX experience
    ...

How

...

Relevant Docs

Related Issues or PRs

Dependencies Versions / Env Variables

Notes on Testing

...

Screenshots

image

...

Checklist

I have read and understood the Contribution Guidelines.

@jagadeeswaran-zipstack jagadeeswaran-zipstack changed the title from "[Fix] Unifyied LLM Whispeper adapters" to "[Fix] Unified LLM Whispeper adapters" on Jan 9, 2025
Contributor

@chandrasekharan-zipstack chandrasekharan-zipstack left a comment


@jagadeeswaran-zipstack

  1. Please consult with @gaya3-zipstack and update the version in __init__.py
  2. Please help address the ask of separating the envs
  3. What about removing the other v2 sub-package?

Comment on lines 56 to 57
POLL_INTERVAL = "ADAPTER_LLMW_POLL_INTERVAL"
MAX_POLLS = "ADAPTER_LLMW_MAX_POLLS"
Contributor


@jagadeeswaran-zipstack please add and distinguish these envs for the v1 and v2 LLMW versions
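One way to address this ask is to keep the existing env names for v1 and introduce version-qualified names for v2. This is only a sketch: the `V2_*` env names, the `poll_settings` helper, and the default of 30 are assumptions, not the actual change in this PR.

```python
# Hypothetical sketch of per-version polling envs, as requested in review.
import os

class WhispererEnv:
    # v1: existing names kept for backward compatibility
    V1_POLL_INTERVAL = "ADAPTER_LLMW_POLL_INTERVAL"
    V1_MAX_POLLS = "ADAPTER_LLMW_MAX_POLLS"
    # v2: version-qualified names (assumed, not from the PR)
    V2_POLL_INTERVAL = "ADAPTER_LLMW_V2_POLL_INTERVAL"
    V2_MAX_POLLS = "ADAPTER_LLMW_V2_MAX_POLLS"

def poll_settings(version: str) -> tuple[int, int]:
    """Resolve (poll_interval, max_polls) for the selected adapter version.

    Falls back to 30 for both values when the env vars are unset
    (the default of 30 is an illustrative assumption).
    """
    if version == "v1":
        interval_key, max_key = WhispererEnv.V1_POLL_INTERVAL, WhispererEnv.V1_MAX_POLLS
    else:
        interval_key, max_key = WhispererEnv.V2_POLL_INTERVAL, WhispererEnv.V2_MAX_POLLS
    return int(os.getenv(interval_key, "30")), int(os.getenv(max_key, "30"))
```

Keeping the unqualified names mapped to v1 means existing deployments need no env changes, while v2 gets its own independently tunable polling behavior.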
