
v3.0.1

Released by @calpt on 18 May 08:48

Based on transformers v4.17.0

New

  • Support float reduction factors in bottleneck adapter configs (@calpt via #339; see the sketch below)
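
A minimal sketch of the new option, assuming the adapter-transformers v3 Python API (AutoAdapterModel, AdapterConfig, add_adapter); the import paths, model checkpoint, and adapter name are illustrative and not taken from these notes:

```python
# Minimal sketch, not from the release notes: import paths and parameter names
# are assumed from the adapter-transformers v3 API.
from transformers.adapters import AutoAdapterModel, AdapterConfig

model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# reduction_factor is hidden_size / bottleneck_size. It previously had to be an
# integer; float values such as 0.5 (a bottleneck twice the hidden size) are
# now accepted as well.
config = AdapterConfig(
    mh_adapter=True,
    output_adapter=True,
    reduction_factor=0.5,
    non_linearity="relu",
)
model.add_adapter("bottleneck_float", config=config)
model.set_active_adapters("bottleneck_float")
```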

Fixed

  • [AdapterTrainer] Add missing preprocess_logits_for_metrics argument (@stefan-it via #317)
  • Fix save_all_adapters such that with_head is not ignored (@hSterz via #325; see the sketch after this list)
  • Fix inferring batch size for prefix tuning (@calpt via #335)
  • Fix bug when using compacters with AdapterSetup context (@calpt via #328)
  • [Trainer] Fix issue with AdapterFusion and load_best_model_at_end (@calpt via #341)
  • Fix generation with GPT-2, T5 and Prefix Tuning (@calpt via #343)
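
For the save_all_adapters fix above, a minimal usage sketch; the checkpoint, adapter, and head names are placeholders, and the surrounding calls are assumed from the adapter-transformers v3 API rather than stated in these notes:

```python
# Minimal sketch, assuming the adapter-transformers v3 API; names are placeholders.
from transformers.adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("task_a")
model.add_classification_head("task_a", num_labels=2)

# with_head=True saves each adapter's matching prediction head alongside it;
# with this release the flag is no longer ignored.
model.save_all_adapters("./saved_adapters", with_head=True)
```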