
master: Fixed hyperparameter collisions #806

Merged: 33 commits, Jul 2, 2024
Conversation

MatteoWohlrapp
Collaborator

Introduced a unique ID for every object, which is appended to the hyperparameter name.
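The idea can be sketched as follows (an illustrative mixin, not DomainLab's actual API — class and method names here are hypothetical):

```python
import itertools

class UniqueIdMixin:
    """Sketch of the scheme above: every object draws a unique running ID,
    which is appended to each hyperparameter name so that two instances
    (e.g. two trainers that both have gamma_reg) never collide."""
    _counter = itertools.count()

    def __init__(self):
        self._id = next(UniqueIdMixin._counter)

    def name_param(self, base_name):
        # "gamma_reg" -> "gamma_reg_0", "gamma_reg_1", ...
        return f"{base_name}_{self._id}"

trainer_a, trainer_b = UniqueIdMixin(), UniqueIdMixin()
print(trainer_a.name_param("gamma_reg"))  # gamma_reg_0
print(trainer_b.name_param("gamma_reg"))  # gamma_reg_1
```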

@MatteoWohlrapp MatteoWohlrapp linked an issue Apr 15, 2024 that may be closed by this pull request
@smilesun
Collaborator

model JiGen is still missing?

@smilesun
Collaborator

Trainers like mldg and dial also have gamma; can that also be handled with the ID?

@MatteoWohlrapp
Collaborator Author

From what I can see, jigen has no hyper_init and hyper_update functions, so I assumed its hyperparameters are not updated.

On the other question: in your initial issue you split it into two parts. The first part, the gamma collision when combining trainers, is not addressed by this PR. The second part, where the naming of model parameters collides, is. I will need to think about the first part, as it won't work with IDs.

@smilesun
Collaborator

> From what I can see, jigen has no hyper_init and hyper_update functions, so I assumed its hyperparameters are not updated.
>
> On the other question: in your initial issue you split it into two parts. The first part, the gamma collision when combining trainers, is not addressed by this PR. The second part, where the naming of model parameters collides, is. I will need to think about the first part, as it won't work with IDs.

You are right, I forgot, sorry; jigen is a subclass of dann.

@smilesun
Collaborator

I have read the PR several times since it was first opened. The key point I want to make sure of is that this design stays consistent under extensions, for example a hyperscheduler_dial_mldg trainer training a diva_hduva model.

@MatteoWohlrapp MatteoWohlrapp marked this pull request as ready for review April 22, 2024 09:36
@MatteoWohlrapp MatteoWohlrapp changed the title Fixed second part of the issue where hyperparameters are colliding Fixed hyperparameter collisions Apr 22, 2024
@MatteoWohlrapp
Collaborator Author

MatteoWohlrapp commented Apr 22, 2024

Solved the gamma_reg naming collision by introducing functionality to pass different gamma values to different trainers and models based on the class name. This comes with a slight change in how the command-line or YAML arguments are passed; the previous format is still supported. Also added parameter printing for models and trainers.

@codecov-commenter

codecov-commenter commented Apr 22, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 95.23%. Comparing base (faf7589) to head (0d62919).

Current head 0d62919 differs from pull request most recent head bdf84c2

Please upload reports for the commit bdf84c2 to get more accurate results.

@@            Coverage Diff             @@
##           master     #806      +/-   ##
==========================================
+ Coverage   95.17%   95.23%   +0.05%     
==========================================
  Files         128      129       +1     
  Lines        5080     5138      +58     
==========================================
+ Hits         4835     4893      +58     
  Misses        245      245              
Flag Coverage Δ
unittests 95.23% <100.00%> (+0.05%) ⬆️


@smilesun
Collaborator

@MatteoWohlrapp , could you fix the yaml according to my review?

@MatteoWohlrapp
Collaborator Author

I can't see a review.

@smilesun
Collaborator

See the screenshot; do you not see the same on your side?

@smilesun
Collaborator

smilesun commented May 3, 2024

@MatteoWohlrapp, the test coverage must be above 95%

Currently:

Attention: Patch coverage is 88.05970% with 8 lines in your changes are missing coverage. Please review.

@smilesun
Collaborator

smilesun commented May 3, 2024

Codecov Report

Attention: Patch coverage is 88.05970% with 8 lines in your changes are missing coverage. Please review.

Project coverage is 95.00%. Comparing base (af8a238) to head (cc650ad).
Report is 4 commits behind head on master.

Files Patch % Lines
domainlab/models/a_model.py 78.57% 3 Missing ⚠️
domainlab/algos/trainers/a_trainer.py 33.33% 2 Missing ⚠️
domainlab/arg_parser.py 83.33% 2 Missing ⚠️
domainlab/utils/hyperparameter_retrieval.py 88.88% 1 Missing ⚠️

@MatteoWohlrapp see detailed coverage report in quotes.

@smilesun smilesun added the priority Further information is requested label May 7, 2024

@smilesun smilesun left a comment


So Fbopt can run with changes from this branch? If that is the case I will merge this branch into master

@smilesun
Collaborator

@MatteoWohlrapp

Matteo Wohlrapp and others added 2 commits June 11, 2024 11:25
@smilesun smilesun self-assigned this Jul 2, 2024
@smilesun smilesun changed the title Fixed hyperparameter collisions master: Fixed hyperparameter collisions Jul 2, 2024

- **Command Line Usage:**
- For a single value: `python script.py --gamma_reg=0.1`
- For multiple values: `python script.py --gamma_reg='default=0.1,dann=0.05,diva=0.2'`
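A minimal sketch of how such an argument string could be parsed into either a scalar or a per-class mapping (a hypothetical helper for illustration, not DomainLab's actual parser):

```python
def parse_gamma_reg(value):
    """Parse a --gamma_reg argument string: either a bare float ("0.1")
    or a comma-separated per-class mapping ("default=0.1,dann=0.05")."""
    if "=" not in value:
        # legacy single-value format, applies to all trainers/models
        return float(value)
    # split "key=num" pairs into a dict of per-class values
    return {key: float(num)
            for key, num in (pair.split("=") for pair in value.split(","))}

print(parse_gamma_reg("0.1"))                            # 0.1
print(parse_gamma_reg("default=0.1,dann=0.05,diva=0.2"))
# {'default': 0.1, 'dann': 0.05, 'diva': 0.2}
```

A helper like this could be wired into argparse via the `type=` parameter so both formats remain accepted, matching the backward compatibility described above.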
Collaborator


Actually, due to the specific implementation of DIVA: instead of a loss of the form $$\ell(\cdot) + \mu R(\cdot)$$, DIVA uses $$\mu \ell(\cdot) + R(\cdot)$$, where $\mu$ is gamma_reg in the code.

Collaborator Author


But where is this specified for model diva? I can only find this initialization of diva, which does not have a regularization:

model = mk_diva(list_str_y=task.list_str_y)(
    node,
    zd_dim=args.zd_dim,
    zy_dim=args.zy_dim,
    zx_dim=args.zx_dim,
    list_d_tr=task.list_domain_tr,
    gamma_d=args.gamma_d,
    gamma_y=args.gamma_y,
    beta_x=args.beta_x,
    beta_y=args.beta_y,
    beta_d=args.beta_d,
)
It's here

Collaborator


It is specified here:

def multiplier4task_loss(self):

gamma_reg: 1.0  # hyperparameter of DANN
gamma_reg:
  default: 1.0
  dann: 1.5
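Retrieval from such a mapping could then fall back to the default entry when a class has no explicit value — an illustrative sketch of the lookup, not the actual code in domainlab/utils/hyperparameter_retrieval.py:

```python
def get_gamma_reg(conf, class_name):
    """Return the gamma_reg for class_name from a per-class mapping,
    falling back to the 'default' entry; a plain scalar applies to all."""
    if isinstance(conf, dict):
        return conf.get(class_name, conf["default"])
    return conf  # legacy scalar format

conf = {"default": 1.0, "dann": 1.5}
print(get_gamma_reg(conf, "dann"))  # 1.5
print(get_gamma_reg(conf, "mldg"))  # 1.0 (falls back to default)
print(get_gamma_reg(0.5, "diva"))   # 0.5
```

Under this scheme, mldg would indeed receive the default value 1.0 in the example configuration above.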
Collaborator


@MatteoWohlrapp , how about mldg?

Collaborator Author


It will get the default value assigned.

Collaborator


Is this behavior tested somewhere, and in which test? For instance, the gamma_reg value for mldg should now be 1.0. @MatteoWohlrapp

Collaborator


I made a pull request to test this, still a work in progress @MatteoWohlrapp

#847

@smilesun
Collaborator

smilesun commented Jul 2, 2024

@MatteoWohlrapp

Run poetry run python examples/api/jigen_dann_transformer.py
Traceback (most recent call last):
  File "/home/runner/work/DomainLab/DomainLab/examples/api/jigen_dann_transformer.py", line 9, in <module>
    from domainlab.mk_exp import mk_exp
  File "/home/runner/work/DomainLab/DomainLab/domainlab/mk_exp.py", line 4, in <module>
    from domainlab.arg_parser import mk_parser_main
  File "/home/runner/work/DomainLab/DomainLab/domainlab/arg_parser.py", line 33
    my_dict = {}        if "=" in values:
                                        ^
SyntaxError: invalid syntax
Error: Process completed with exit code 1.

Looks like an error introduced by my commit.

You corrected that already in the last two commits.

@smilesun smilesun merged commit ca493e5 into master Jul 2, 2024
1 of 2 checks passed
@smilesun smilesun deleted the gamma_reg_collision branch July 2, 2024 14:53