
How to incorporate vLLM in Lightning for LLM inference? #19829

Open
YuWang916 opened this issue Apr 30, 2024 · 0 comments
Labels: feature (Is an improvement or enhancement), needs triage (Waiting to be triaged by maintainers)

YuWang916 commented Apr 30, 2024

Description & Motivation

vLLM is one of the most popular and effective tools for fast, large-scale LLM inference. Are there any existing examples of incorporating vLLM in Lightning? I have not found any so far.

Pitch

Add support for running inference via vLLM under the Lightning framework.
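
As far as I know there is no official integration, but a minimal sketch of what this could look like is below: wrap vLLM's offline `LLM` engine in a `LightningModule` and run generation in `predict_step`. The model name, prompts, and dataloader here are placeholders for illustration; vLLM manages its own GPU memory and parallelism, so the Trainer only drives the prediction loop.

```python
# Minimal sketch (not an official Lightning integration): wrap vLLM's
# offline engine in a LightningModule and generate in predict_step.
import lightning as L
from torch.utils.data import DataLoader
from vllm import LLM, SamplingParams


class VLLMInferenceModule(L.LightningModule):
    def __init__(self, model_name: str = "facebook/opt-125m"):
        super().__init__()
        # vLLM allocates its own GPU memory; the engine is held as a plain
        # attribute, not registered as an nn.Module.
        self.llm = LLM(model=model_name)
        self.sampling_params = SamplingParams(temperature=0.8, max_tokens=64)

    def predict_step(self, batch, batch_idx):
        # `batch` is assumed to be a list of prompt strings.
        outputs = self.llm.generate(batch, self.sampling_params)
        return [out.outputs[0].text for out in outputs]


if __name__ == "__main__":
    prompts = ["Hello, my name is", "The capital of France is"]
    loader = DataLoader(prompts, batch_size=2, collate_fn=list)
    module = VLLMInferenceModule()
    # devices=1 here: vLLM handles multi-GPU internally (tensor_parallel_size),
    # so the Trainer only orchestrates the prediction loop.
    trainer = L.Trainer(accelerator="gpu", devices=1)
    predictions = trainer.predict(module, dataloaders=loader)
    print(predictions)
```
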

Alternatives

No response

Additional context

No response

cc @Borda

YuWang916 added the feature and needs triage labels on Apr 30, 2024
YuWang916 changed the title from "Are there any existing examples incorporating vLLM in Lightning?" to "How to incorporate vLLM in Lightning for LLM inference?" on Apr 30, 2024