Adapters 0.1.0
Blog post: https://adapterhub.ml/blog/2023/11/introducing-adapters/
With the new Adapters library, we fundamentally refactored the adapter-transformers library and added support for new models and adapter methods.
This version is compatible with Hugging Face Transformers version 4.35.2.
For a guide on how to migrate from adapter-transformers to Adapters, have a look at https://docs.adapterhub.ml/transitioning.md.
Changes are given compared to the latest adapter-transformers release, v3.2.1.
New Models & Adapter Methods
- Add LLaMA model integration (@hSterz) (see the usage sketch after this list)
- Add X-MOD model integration (@calpt via #581)
- Add Electra model integration (@hSterz via #583, based on work of @amitkumarj441 and @pauli31 in #400)
- Add adapter output & parameter averaging (@calpt) (see the averaging sketch after this list)
- Add Prompt Tuning (@lenglaender and @calpt via #595) (see the prompt tuning sketch after this list)
- Add Composition Support to LoRA and (IA)³ (@calpt via #598) (see the composition sketch after this list)
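The newly supported models plug into the same workflow as all other models. A minimal usage sketch, assuming a LLaMA checkpoint name (any compatible checkpoint works the same way):

```python
from transformers import LlamaForCausalLM

import adapters

# Load a plain Transformers model; the checkpoint name is an assumption.
model = LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
adapters.init(model)  # attach adapter support to the model in place

model.add_adapter("bottleneck", config="seq_bn")  # sequential bottleneck adapter
model.train_adapter("bottleneck")  # freeze base weights, train only the adapter
```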
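Averaging comes in two flavors: parameter averaging merges the weights of several adapters into a new adapter, while output averaging combines adapter outputs at runtime through a composition block. A sketch continuing with the `model` from above (adapter names and weights are illustrative):

```python
import adapters.composition as ac

# Two adapters that would normally be trained before averaging.
model.add_adapter("adapter_a", config="seq_bn")
model.add_adapter("adapter_b", config="seq_bn")

# Parameter averaging: create "avg" as a weighted average of both adapters.
model.average_adapter("avg", ["adapter_a", "adapter_b"], weights=[0.6, 0.4])

# Output averaging: average the outputs of both adapters in the forward pass.
model.set_active_adapters(ac.Average("adapter_a", "adapter_b"))
```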
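Prompt tuning prepends trainable soft prompt tokens to the model input. A sketch with an arbitrary prompt length:

```python
from adapters import PromptTuningConfig

# Add and activate 10 trainable soft prompt tokens.
model.add_adapter("prompt", config=PromptTuningConfig(prompt_length=10))
model.train_adapter("prompt")
```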
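LoRA and (IA)³ modules can now take part in composition blocks. A sketch routing parts of a batch through two LoRA adapters with `BatchSplit` (names, rank and batch sizes are illustrative):

```python
from adapters import LoRAConfig

import adapters.composition as ac

model.add_adapter("lora_a", config=LoRAConfig(r=8))
model.add_adapter("lora_b", config=LoRAConfig(r=8))

# The first two samples of each batch pass through lora_a, the next two
# through lora_b; the batch sizes must sum to the actual batch size.
model.set_active_adapters(ac.BatchSplit("lora_a", "lora_b", batch_sizes=[2, 2]))
```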
Breaking Changes
- Renamed bottleneck adapter configs and config strings. The new names can be found here: https://docs.adapterhub.ml/overview.html (@calpt) (see the migration sketch after this list)
- Removed the XModelWithHeads classes, which had been deprecated since adapter-transformers version 3.0.0 (@lenglaender)
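To illustrate both changes, a migration sketch: the former "pfeiffer" config string is now called "seq_bn" (see the linked overview page for the full mapping), and the XModelWithHeads classes are replaced by the XAdapterModel classes:

```python
from adapters import BertAdapterModel

# BertAdapterModel replaces the removed BertModelWithHeads.
model = BertAdapterModel.from_pretrained("bert-base-uncased")

# adapter-transformers: model.add_adapter("a", config="pfeiffer")
# Adapters: the same bottleneck configuration is now named "seq_bn".
model.add_adapter("a", config="seq_bn")
```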
Changes Due to the Refactoring
- Refactored the implementations of all previously supported models (@calpt, @lenglaender, @hSterz, @TimoImhof)
- Separated the model config (`PretrainedConfig`) from the adapters config (`ModelAdaptersConfig`) (@calpt)
- Updated the whole documentation, Jupyter notebooks and example scripts (@hSterz, @lenglaender, @TimoImhof, @calpt)
- Introduced the `load_model` function to load models containing adapters. This replaces the Hugging Face `from_pretrained` function used in the `adapter-transformers` library (@lenglaender) (see the loading sketch after this list)
- Shared more logic for adapter composition between different composition blocks (@calpt via #591)
- Added backwards compatibility tests that check whether changes to the codebase, such as refactorings, impair the functionality of the library (@TimoImhof via #596)
- Refactored the `EncoderDecoderModel` by introducing a new mixin (`ModelUsingSubmodelsAdaptersMixin`) for models that contain other models (@lenglaender)
- Renamed the class `AdapterConfigBase` to `AdapterConfig` (@hSterz via #603)
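A minimal loading sketch; the checkpoint path is a placeholder for a directory containing a model saved together with its adapters:

```python
from transformers import BertModel

from adapters import load_model

# Restores the model and any adapters stored in the checkpoint directory.
model = load_model("./checkpoint-with-adapters", BertModel)
```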
Fixes and Minor Improvements
- Fixed the `generate` function of EncoderDecoderModel (@lenglaender)
- Fixed deletion of invertible adapters (@TimoImhof)
- Automatically convert heads when loading with XAdapterModel (@calpt via #594)
- Fixed training T5 adapter models with `Trainer` (@calpt via #599)
- Ensured output embeddings are frozen during adapter training (@calpt via #537)