Commit

add check for distributed optimizer which is unsupported for PEFT (NVIDIA#8323)

Signed-off-by: Chen Cui <[email protected]>
Signed-off-by: Sasha Meister <[email protected]>
cuichenx authored and sashameister committed Feb 15, 2024
1 parent 3284fb0 commit 379a768
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions nemo/collections/nlp/parts/mixins/nlp_adapter_mixins.py
@@ -175,6 +175,8 @@ def add_adapter(self, peft_cfgs: Union[PEFTConfig, List[PEFTConfig]]):

         if self.cfg.get('virtual_pipeline_model_parallel_size', None):
             raise ValueError('Virtual pipeline model parallel is not supported when using PEFT')
+        if self.cfg.optim.name == "distributed_fused_adam":
+            raise ValueError('distributed_fused_adam is not supported for PEFT. Please use fused_adam')

         if not isinstance(peft_cfgs, List):
             peft_cfgs = [peft_cfgs]
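The added lines fail fast at adapter-setup time instead of surfacing an obscure error deep in training. A minimal sketch of the same validation pattern, using a stand-in config object rather than NeMo's actual model config (the `check_peft_compatible` helper and the `SimpleNamespace` config are illustrative assumptions, not NeMo code):

```python
# Sketch of the guard pattern added in this commit; names are illustrative.
from types import SimpleNamespace


def check_peft_compatible(cfg) -> None:
    """Reject settings known to be unsupported with PEFT, mirroring the commit's checks."""
    if getattr(cfg, "virtual_pipeline_model_parallel_size", None):
        raise ValueError("Virtual pipeline model parallel is not supported when using PEFT")
    if cfg.optim.name == "distributed_fused_adam":
        raise ValueError("distributed_fused_adam is not supported for PEFT. Please use fused_adam")


# A config using the distributed optimizer is rejected up front:
cfg = SimpleNamespace(optim=SimpleNamespace(name="distributed_fused_adam"))
try:
    check_peft_compatible(cfg)
except ValueError as e:
    print(e)  # prints the fused_adam suggestion
```

Raising at `add_adapter` time means the misconfiguration is reported before any checkpoint loading or training begins, with a message that names the supported alternative.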
