
Potential inconsistency between paper and implemented code #18

Open
KBlansit opened this issue Sep 26, 2023 · 0 comments
Hi. I was looking at your implementation of CORN loss for ordinal regression. When looking at your paper on arXiv (the June 1, 2023 version), I noticed that the CORN loss PyTorch definition in supplemental material S1 does not appear to match what is implemented in this repository.

More specifically, while the arXiv paper appears to normalize the loss as `return losses/num_classes`, the repository normalizes the loss as `return losses/num_examples`.
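To make the difference concrete, here is a minimal pure-Python sketch of a CORN-style loss (the function name, toy data, and structure are my own for illustration; this is not the library's actual code). It shows that the two normalizations differ only by a constant factor of `num_examples / num_classes`:

```python
import math

def corn_loss_sketch(logits, labels, num_classes, denom="num_examples"):
    # Hypothetical sketch of a CORN-style loss, not the library's exact code.
    # logits: per example, a list of (num_classes - 1) raw scores.
    # labels: integer class labels in [0, num_classes - 1].
    # denom:  "num_classes" (paper supplement) or "num_examples" (repository).
    losses = 0.0
    for k in range(num_classes - 1):
        # CORN's conditional training subset for task k: examples with y >= k,
        # with binary target "is y > k".
        subset = [(z[k], 1.0 if y > k else 0.0)
                  for z, y in zip(logits, labels) if y >= k]
        # Binary cross-entropy with logits, summed over the subset.
        losses += sum(
            math.log1p(math.exp(-s)) if t else math.log1p(math.exp(s))
            for s, t in subset
        )
    if denom == "num_classes":
        return losses / num_classes   # normalization in the arXiv supplement
    return losses / len(labels)       # normalization in the repository
```

Since both variants divide the same summed loss, they agree up to a constant scale, which would affect the effective learning rate but not which parameters minimize the loss.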

I was wondering about the source of the discrepancy between the two. Thank you for your time and your contributions.

KB
