
HfArgumentParser error when using LoraConfig dataclass #34834

Open
2 of 4 tasks
JoseSantosAMD opened this issue Nov 20, 2024 · 3 comments
@JoseSantosAMD

System Info

  • transformers version: 4.46.3
  • Platform: Linux-5.14.0-162.18.1.el9_1.x86_64-x86_64-with-glibc2.35
  • Python version: 3.10.15
  • Huggingface_hub version: 0.26.2
  • Safetensors version: 0.4.5
  • Accelerate version: 1.1.1
  • Accelerate config: not found
  • PyTorch version (GPU?): 2.5.0.dev20240802+rocm6.0 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using distributed or parallel set-up in script?:
  • Using GPU in script?:
  • GPU type: AMD Instinct MI250X/MI250
  • peft version: 0.13.2

Who can help?

@SunMarc @MekkCyber

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Looks like HfArgumentParser throws an error when loading the LoraConfig dataclass from peft, which I load from a YAML config.

Some of my code:

from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    BitsAndBytesConfig,
    HfArgumentParser,
    TrainingArguments,
)

...

parser = HfArgumentParser([LoraConfig])
(parsed_dataclass,) = parser.parse_dict(flat_config, allow_extra_keys=True)

...

Leads to the following Traceback:

Traceback (most recent call last):
  File "/work/scripts/omnihub.py", line 12, in <module>
    main()
  File "/work/scripts/omnihub.py", line 8, in main
    o.run()
  File "/work/scripts/omnihub/__init__.py", line 177, in run
    self.func(self.extra_args, self.config)
  File "/work/applications/hf-finetune/finetune.py", line 225, in run
    FineTuner(*args, **kwargs).run()
  File "/work/applications/hf-finetune/finetune.py", line 186, in __init__
    self.parse_args(supported_dataclass, custom_args, config)
  File "/work/applications/hf-finetune/finetune.py", line 150, in parse_args
    parser = HfArgumentParser([dataclass])
  File "/opt/conda/lib/python3.10/site-packages/transformers/hf_argparser.py", line 137, in __init__
    self._add_dataclass_arguments(dtype)
  File "/opt/conda/lib/python3.10/site-packages/transformers/hf_argparser.py", line 277, in _add_dataclass_arguments
    self._parse_dataclass_field(parser, field)
  File "/opt/conda/lib/python3.10/site-packages/transformers/hf_argparser.py", line 167, in _parse_dataclass_field
    raise ValueError(
ValueError: Only Union[X, NoneType] (i.e., Optional[X]) is allowed for Union because the argument parser only supports one type per argument. Problem encountered in field 'init_lora_weights'.

Expected behavior

I expect the dataclass to be created successfully when using the LoraConfig dataclass imported from peft.

@SunMarc
Member

SunMarc commented Nov 21, 2024

cc @BenjaminBossan

@BenjaminBossan
Member

I checked and could reproduce the error with this snippet:

from peft import LoraConfig
from transformers import HfArgumentParser
parser = HfArgumentParser([LoraConfig])

The problem is that HfArgumentParser cannot deal with two different types in a type annotation (apart from NoneType), but PEFT has such annotations, e.g.:

https://github.com/huggingface/peft/blob/029faf6eeaae4f6392b7792e63308caaf6eece70/src/peft/tuners/lora/config.py#L263
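The failure can be reproduced without peft by mimicking the shape of that annotation in a plain dataclass (a minimal sketch; `Demo` and `init_weights` are made-up names for illustration):

```python
from dataclasses import dataclass, field
from typing import Literal, Union

from transformers import HfArgumentParser


@dataclass
class Demo:
    # Same shape as LoraConfig.init_lora_weights: a bool or a string literal.
    init_weights: Union[bool, Literal["gaussian"]] = field(default=True)


# HfArgumentParser only accepts Union[X, None]; any other Union raises
# a ValueError naming the offending field.
try:
    HfArgumentParser([Demo])
except ValueError as err:
    print(err)
```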

IMO, this is not something we should fix in PEFT, as having two different types is perfectly normal. Either HfArgumentParser needs to be updated to handle multiple types (though I don't know how hard that would be), or you cannot use HfArgumentParser for LoraConfig and need another means of argument parsing, such as argparse.ArgumentParser.
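As a sketch of the argparse route: since argparse supports only one converter per argument, a custom `type` callable can fold both types into a single parsing function (the helper name `bool_or_str` is hypothetical, not part of any library):

```python
import argparse


def bool_or_str(value: str):
    # Accept "true"/"false" (case-insensitive) as booleans and pass anything
    # else (e.g. "gaussian") through unchanged as a string, mimicking a
    # Union[bool, str]-style field.
    if value.lower() in ("true", "false"):
        return value.lower() == "true"
    return value


parser = argparse.ArgumentParser()
parser.add_argument("--init-lora-weights", type=bool_or_str, default=True)

args = parser.parse_args(["--init-lora-weights", "gaussian"])
print(args.init_lora_weights)
```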

@leobianco

I used the following (ugly) workaround while this remains unfixed: override the problematic fields with single-type annotations.

from dataclasses import dataclass, field
from typing import Optional

from peft import LoraConfig


@dataclass
class CustomLoraConfig(LoraConfig):
    init_lora_weights: bool = field(default=True)
    layers_to_transform: Optional[int] = field(default=None)
    loftq_config: dict = field(default_factory=dict)
