AdapterConfig's leave_out not work well in EncoderDecoderModel #472
Comments
Also, it seems
Hey @ZeguanXiao, I see why this is unexpected behavior. Unfortunately it is not as easy as changing the
@hSterz My current workaround is setting
This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.
Environment info
adapter-transformers version: 3.1.0
Information
Model I am using (Bert, XLNet ...): EncoderDecoderModel
Language I am using the model on (English, Chinese ...): English
Adapter setup I am using (if any): AdapterConfig
The problem arises when using:
The tasks I am working on is:
To reproduce
When no layers are left out, adapters are added as expected.
When trying to leave out all encoder layers, no adapters are added at all.
When leaving out only the first 6 layers of the encoder, adapters are added only to the encoder, leaving the decoder without any.
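A plausible source of this behavior (a plain-Python sketch for illustration only, not the actual adapter-transformers internals; the function name `layers_with_adapters` is hypothetical) is that a single `leave_out` list gets matched against raw layer indices in both sub-models of the EncoderDecoderModel, so indices intended for encoder layers also hit the same-numbered decoder layers:

```python
# Hypothetical sketch: leave_out applied per sub-model by raw layer index,
# so encoder and decoder layers with the same index are skipped together.

def layers_with_adapters(num_encoder_layers, num_decoder_layers, leave_out):
    """Return (encoder, decoder) layer indices that would receive adapters,
    assuming leave_out is compared against each sub-model's own indices."""
    skip = set(leave_out)
    encoder = [i for i in range(num_encoder_layers) if i not in skip]
    decoder = [i for i in range(num_decoder_layers) if i not in skip]
    return encoder, decoder

# "Leave out all 12 encoder layers": indices 0-11 also match every
# decoder layer, so no adapter is added anywhere.
print(layers_with_adapters(12, 12, leave_out=list(range(12))))
# -> ([], [])

# "Leave out the first 6 encoder layers": the decoder also loses
# its first 6 layers, not just the encoder.
print(layers_with_adapters(12, 12, leave_out=list(range(6))))
# -> ([6, 7, 8, 9, 10, 11], [6, 7, 8, 9, 10, 11])
```

Under this reading, distinguishing encoder from decoder layers would require separate index spaces (or separate configs) for the two sub-models rather than one shared `leave_out` list.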
Expected behavior
The EncoderDecoderModel class should behave like BART-like models.