
Commit

[LLM] add assertion for enable_stage1_overlap in lora mode (PaddlePad…
SylarTiaNII authored May 13, 2024
1 parent 17fb497 commit 53ad2da
Showing 1 changed file with 4 additions and 0 deletions.
llm/finetune_generation.py: 4 additions & 0 deletions
@@ -462,6 +462,10 @@ def neft_post_hook(module, input, output):
         model.print_trainable_parameters()
 
     if model_args.lora:
+        if training_args.sharding_parallel_degree > 1:
+            assert (
+                "enable_stage1_overlap" not in training_args.sharding_parallel_config
+            ), "Currently, enabling sharding_stage1_overlap in LoRA mode is not supported."
         if model_args.lora_path is None:
             target_modules = get_lora_target_modules(model)
             lora_config = LoRAConfig(
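The added guard can be sketched in isolation. This is a minimal reproduction of the check's logic, using hypothetical `SimpleNamespace` stand-ins for `model_args` and `training_args` (not PaddleNLP's actual argument classes):

```python
from types import SimpleNamespace

def check_lora_sharding_config(model_args, training_args):
    # Mirrors the added guard: sharding stage1 overlap is rejected in LoRA mode.
    if model_args.lora and training_args.sharding_parallel_degree > 1:
        assert (
            "enable_stage1_overlap" not in training_args.sharding_parallel_config
        ), "Currently, enabling sharding_stage1_overlap in LoRA mode is not supported."

model_args = SimpleNamespace(lora=True)

# Allowed: LoRA with sharding, but without stage1 overlap.
args_ok = SimpleNamespace(sharding_parallel_degree=2, sharding_parallel_config=[])
check_lora_sharding_config(model_args, args_ok)

# Rejected: LoRA plus enable_stage1_overlap trips the assertion.
args_bad = SimpleNamespace(
    sharding_parallel_degree=2,
    sharding_parallel_config=["enable_stage1_overlap"],
)
try:
    check_lora_sharding_config(model_args, args_bad)
except AssertionError as e:
    print("rejected:", e)
```

Failing fast with an `assert` at configuration time surfaces the unsupported combination immediately, rather than letting training proceed and fail (or silently misbehave) later.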
