Bigbird fusion #425
base: legacy
Conversation
Hey @Soham2000,
thanks a lot for your efforts in adding adapter support to the BigBird model architecture. I've done an initial review and left some comments below. To complete the model integration, please make sure to follow the remaining implementation steps of our contribution guide, especially in the testing and documentation sections.
Let us know if you need any further help!
@@ -146,7 +146,7 @@
     "models.bert_generation": ["BertGenerationConfig"],
     "models.bert_japanese": ["BertJapaneseTokenizer", "CharacterTokenizer", "MecabTokenizer"],
     "models.bertweet": ["BertweetTokenizer"],
-    "models.big_bird": ["BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP", "BigBirdConfig"],
+    "models.big_bird": ["BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP", "BigBirdConfig", "BigBirdTokenizer"],
Suggested change:
-    "models.big_bird": ["BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP", "BigBirdConfig", "BigBirdTokenizer"],
+    "models.big_bird": ["BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP", "BigBirdConfig"],
@@ -2985,7 +2986,7 @@
     from .models.bert_generation import BertGenerationConfig
     from .models.bert_japanese import BertJapaneseTokenizer, CharacterTokenizer, MecabTokenizer
     from .models.bertweet import BertweetTokenizer
-    from .models.big_bird import BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP, BigBirdConfig
+    from .models.big_bird import BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP, BigBirdConfig, BigBirdTokenizer
Suggested change:
-    from .models.big_bird import BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP, BigBirdConfig, BigBirdTokenizer
+    from .models.big_bird import BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP, BigBirdConfig
@@ -0,0 +1,240 @@
+# flake8: noqa
Please remove this file
# print("#=================================================================================================")
# # print(self)  # RobertaModelWithHeads / BigBirdModelWithHeads
# print("#====================================================================================================")
# print(self.base_model)  # RobertaModel / BigBirdModel
# print("#======================================================================================================")
Please remove
@@ -0,0 +1,74 @@
+import warnings
Please remove
        self.add_prediction_head(head, overwrite_ok=overwrite_ok)


class BigBirdModelWithHeads(BigBirdAdapterModel):
Since the `...ModelWithHeads` classes are deprecated, we don't want to add them for new model architectures anymore. Thus, please remove this class, thanks.
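For illustration of why the class is redundant: the deprecated `...ModelWithHeads` classes in adapter-transformers are thin aliases of the corresponding `...AdapterModel` classes that only emit a deprecation warning on construction. A self-contained sketch of that alias pattern, using stub classes rather than the real library types:

```python
import warnings


class BigBirdAdapterModelStub:
    """Stand-in for the flex-head BigBirdAdapterModel (illustration only)."""

    def __init__(self):
        self.heads = {}  # registry for prediction heads


class BigBirdModelWithHeadsStub(BigBirdAdapterModelStub):
    """Deprecated-alias pattern: identical behaviour plus a FutureWarning."""

    def __init__(self):
        warnings.warn(
            "This class has been renamed to `BigBirdAdapterModel`. "
            "Please use the new class instead.",
            FutureWarning,
        )
        super().__init__()
```

Because the alias adds no behaviour of its own, new model integrations only need the `...AdapterModel` class.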
@@ -30,6 +30,7 @@
Please remove changes in this file
@@ -917,7 +921,7 @@ def check_model_type(self, supported_models: Union[List[str], dict]):
         else:
             supported_models_names.append(model.__name__)
         supported_models = supported_models_names
-        for item in ADAPTER_MODEL_MAPPING.values():
+        for item in MODEL_WITH_HEADS_MAPPING.values():
Please revert this change
Hey, thanks again for your efforts in contributing new model architectures to adapter-transformers, and sorry for the silence on our side. In the last few weeks, we've been working on a large refactoring of our project, which will ultimately result in the release of Adapters, the next-generation adapters library; see #584. As a consequence, we plan to merge any new model integrations directly into the new codebase, which currently can be found on this branch. Unfortunately, this necessitates some changes in the model integration code (detailed here; see already integrated models such as BERT, BART etc. for reference).

If you'd be willing to update your model integration to target the new library yourself, we'd be super happy to help you with this. Otherwise, we might look into upgrading and merging some of the open model integration PRs ourselves in the future. For more details, again see #584.
Adding Adapter support for BigBird Transformer Architecture