
Commit 9940ec6

add check for distributed optimizer which is unsupported for PEFT (#8323)

Signed-off-by: Chen Cui <[email protected]>
cuichenx authored Feb 6, 2024
1 parent c2ea202 commit 9940ec6
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions nemo/collections/nlp/parts/mixins/nlp_adapter_mixins.py
@@ -175,6 +175,8 @@ def add_adapter(self, peft_cfgs: Union[PEFTConfig, List[PEFTConfig]]):

         if self.cfg.get('virtual_pipeline_model_parallel_size', None):
             raise ValueError('Virtual pipeline model parallel is not supported when using PEFT')
+        if self.cfg.optim.name == "distributed_fused_adam":
+            raise ValueError('distributed_fused_adam is not supported for PEFT. Please use fused_adam')

         if not isinstance(peft_cfgs, List):
             peft_cfgs = [peft_cfgs]
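The two added lines fail fast when the model config selects the distributed (sharded-state) optimizer, which this PEFT path does not support. A minimal standalone sketch of that guard is below; the dataclasses and the `check_peft_optimizer` helper are stand-ins for illustration (the real check reads `self.cfg`, an OmegaConf config, inside `NLPAdapterModelMixin.add_adapter`).

```python
from dataclasses import dataclass


@dataclass
class OptimConfig:
    # Stand-in for the `optim` section of a NeMo model config.
    name: str


@dataclass
class ModelConfig:
    # Stand-in for the model config object the mixin holds as `self.cfg`.
    optim: OptimConfig


def check_peft_optimizer(cfg: ModelConfig) -> None:
    """Reject optimizer choices that are incompatible with PEFT.

    Mirrors the guard added in this commit: the distributed optimizer
    shards optimizer state across ranks, which the PEFT code path does
    not handle, so the non-distributed fused_adam must be used instead.
    """
    if cfg.optim.name == "distributed_fused_adam":
        raise ValueError(
            "distributed_fused_adam is not supported for PEFT. "
            "Please use fused_adam"
        )


# Passes silently with a supported optimizer:
check_peft_optimizer(ModelConfig(OptimConfig("fused_adam")))
```

Raising at `add_adapter` time, before any training starts, turns what would otherwise be a confusing mid-run failure into an immediate, actionable error message.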
