AdapterHub - Adapter-Transformers v3 - Unifying Efficient Fine-Tuning
With the release of version 3.0 of adapter-transformers today, we're taking the first steps toward integrating the growing and diversifying landscape of efficient fine-tuning methods. Version 3.0 adds support for a first batch of recently proposed methods, including Prefix Tuning, Parallel adapters, Mix-and-Match adapters, and Compacters. It also introduces improvements and changes to various aspects of the library.
https://adapterhub.ml/blog/2022/03/adapter-transformers-v3-unifying-efficient-fine-tuning/
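For reference, a minimal sketch of how the newly supported methods can be set up with the v3 API, assuming the `transformers.adapters` module exposes `PrefixTuningConfig` and `MAMConfig` as described in the release; parameter values here are illustrative:

```python
# Minimal sketch (assumption: adapter-transformers v3 API as described in the release).
from transformers import AutoAdapterModel
from transformers.adapters import PrefixTuningConfig, MAMConfig

model = AutoAdapterModel.from_pretrained("roberta-base")

# Add a Prefix Tuning module and activate it for training
# (freezes the base model, trains only the prefix parameters).
model.add_adapter("prefix", config=PrefixTuningConfig(prefix_length=30))
model.train_adapter("prefix")

# Alternatively, a Mix-and-Match (MAM) adapter combines prefix tuning
# with a parallel bottleneck adapter.
model.add_adapter("mam", config=MAMConfig())
```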