
Error "input lengths of input ids is 0" #21

Open
NuiMrme opened this issue Feb 19, 2024 · 2 comments

NuiMrme commented Feb 19, 2024

Hello,
The only thing that differs from the installation instructions is that I used Vicuna v1.5, which downloads the weights locally from HF. In bliva_vicuna7b.yaml I set llm_model: to the locally downloaded weights folder; I'm not sure if this is the source of the problem. Otherwise, I used an image of the same size (224). Here is my traceback:
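For reference, the edited entry in bliva_vicuna7b.yaml would look roughly like this (the exact key nesting follows LAVIS-style configs, and the path is a hypothetical placeholder for wherever the weights were downloaded):

```yaml
model:
  # llm_model normally points at a HF repo id; here it is a local folder
  # (hypothetical path) containing the downloaded Vicuna weights.
  llm_model: "/home/user/weights/vicuna-7b-v1.5"
```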

```
File "/home/user/BLIVA/evaluate.py", line 93, in <module>
    main(args)
File "/home/user/BLIVA/evaluate.py", line 85, in main
    eval_one(image, question, model)
File "/home/user/BLIVA/evaluate.py", line 46, in eval_one
    outputs = model.generate({"image": image, "prompt": question})
File "/home/user/conda_env/bliva/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
File "/home/user/BLIVA/bliva/models/bliva_vicuna7b.py", line 382, in generate
    outputs = self.llm_model.generate(
File "/home/user/conda_env/bliva/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
File "/home/user/conda_env/bliva/lib/python3.9/site-packages/transformers/generation/utils.py", line 1447, in generate
    self._validate_generated_length(generation_config, input_ids_length, has_default_max_length)
File "/home/user/conda_env/bliva/lib/python3.9/site-packages/transformers/generation/utils.py", line 1166, in _validate_generated_length
    raise ValueError(
ValueError: Input length of input_ids is 0, but `max_length` is set to -39. This can lead to unexpected behavior. You should consider increasing `max_length` or, better yet, setting `max_new_tokens`.
```

Thanks in advance

@lendrick

The latest transformers seems to be the cause of this.

pip install transformers==4.28.0

Note: Using Vicuna 1.5 doesn't work, but isn't the cause of this error. You'll have to follow their instructions for applying the delta.
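For context, here is a simplified, illustrative sketch of the kind of length check that raises this error (the function name is modeled on transformers' `_validate_generated_length`, but this is not the library's actual code). The point it demonstrates: `max_new_tokens` budgets only the newly generated tokens, so it is independent of the prompt length, whereas a `max_length` derived relative to the prompt can go negative when the prompt ends up empty.

```python
def validate_generation_length(input_ids_len, max_length=None, max_new_tokens=None):
    """Illustrative stand-in for transformers' generation-length validation.

    Returns the effective total token budget, or raises if max_length
    leaves no room for the prompt itself.
    """
    if max_new_tokens is not None:
        # max_new_tokens counts only newly generated tokens, so the total
        # budget is always >= the prompt length and can never go negative.
        return input_ids_len + max_new_tokens
    if max_length is not None and input_ids_len >= max_length:
        raise ValueError(
            f"Input length of input_ids is {input_ids_len}, "
            f"but `max_length` is set to {max_length}. "
            "This can lead to unexpected behavior."
        )
    return max_length
```

With the values from the traceback (prompt length 0, derived `max_length` of -39), the check raises exactly as shown; a `max_new_tokens` budget would have sidestepped it.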

@chuanshen-chen

> The latest transformers seems to be the cause of this.
>
> pip install transformers==4.28.0
>
> Note: Using Vicuna 1.5 doesn't work, but isn't the cause of this error. You'll have to follow their instructions for applying the delta.

thanks!!!!
