🚀 Feature request
Adapter support for the Longformer models
Motivation
For question answering over long documents, the answers are often long and exceed the 384- or 512-token maximum sequence length supported by other LMs such as BERT and RoBERTa, so users cannot obtain the correct answers. It therefore seems necessary to use models that accept longer input sequences, such as Longformer or BigBird; however, longer sequences lead to an increase in fine-tuning computation. I read the adapter-transformers paper and found this solution elegant: it would largely mitigate the cost problem of long-answer question answering.
Unfortunately, the adapter library does not yet support the Longformer architecture. I would like to ask whether you are planning to add support for the Longformer models?
Thanks :)