
[Question] pretrain_mm_mlp_adapter of llava v1.6 7B not found #1490

Open
YQYI opened this issue May 7, 2024 · 1 comment

Comments


YQYI commented May 7, 2024

Question

I am trying to train LLaVA v1.6 7B in LoRA mode, but I cannot find the pretrain_mm_mlp_adapter file. Where can I find it?


Davidup1 commented May 7, 2024

@YQYI I guess you can just use the mlp_adapter of LLaVA v1.5 7B, as the blog says.
[screenshot attached]
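For reference, a minimal sketch of how the v1.5 projector checkpoint could be plugged into a LoRA fine-tuning run, adapted from the LLaVA repo's `scripts/v1_5/finetune_lora.sh`. The model name, checkpoint path, and DeepSpeed config path here are assumptions for illustration; the key point is passing the v1.5 pretraining-stage `mm_projector.bin` via `--pretrain_mm_mlp_adapter`:

```shell
# Sketch of a LoRA fine-tune invocation, assuming the standard LLaVA repo layout.
# The projector weights (mm_projector.bin) come from the v1.5 pretraining stage,
# as suggested above; paths and model names below are illustrative.
deepspeed llava/train/train_mem.py \
    --lora_enable True \
    --deepspeed ./scripts/zero3.json \
    --model_name_or_path liuhaotian/llava-v1.6-vicuna-7b \
    --version v1 \
    --pretrain_mm_mlp_adapter ./checkpoints/llava-v1.5-7b-pretrain/mm_projector.bin
    # ...remaining data, vision-tower, and output arguments as in the
    # original finetune_lora.sh script.
```

This is a command/config fragment, not a runnable standalone script; the full argument list in the repo's script should be used as the authoritative template.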
