Hi. I was looking at your implementation of the CORN loss for ordinal regression. Looking at your paper on arXiv (the June 1, 2023 version), I noticed that the CORN loss PyTorch definition in supplementary material S1 does not appear to match what is implemented in this repository.
More specifically, the arXiv paper normalizes the loss as `return losses/num_classes`, while the repository normalizes it as `return losses/num_examples`.
I was wondering about the source of the discrepancy between the two. Thank you for your time and your contributions.
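To make the difference concrete, here is a toy, pure-Python sketch of the CORN conditional-training-set loss with both normalizations side by side. This is only an illustration under my reading of the loss, not the actual `coral_pytorch` code; the function name `corn_loss_sketch` and the `normalize_by` switch are hypothetical. Note that the two variants differ only by a constant scale factor (total conditional examples vs. number of classes), so they should affect the effective learning rate but not the direction of the gradient.

```python
import math

def log_sigmoid(z):
    # Numerically stable log(sigmoid(z)).
    return -math.log1p(math.exp(-z)) if z >= 0 else z - math.log1p(math.exp(z))

def corn_loss_sketch(logits, labels, num_classes, normalize_by="num_examples"):
    """Toy re-implementation of the CORN loss for illustration only.

    `logits` is a list of per-example lists with num_classes - 1 task logits;
    `labels` holds integer class labels in [0, num_classes - 1].
    `normalize_by` selects between the two variants in question:
    "num_examples" divides by the total number of (example, task) pairs
    across the conditional subsets; "num_classes" divides by num_classes.
    """
    losses = 0.0
    num_examples = 0
    for task in range(num_classes - 1):
        # Conditional subset: examples whose label is at least `task`.
        subset = [(logits[i][task], 1 if labels[i] > task else 0)
                  for i in range(len(labels)) if labels[i] > task - 1]
        if not subset:
            continue
        num_examples += len(subset)
        for z, t in subset:
            # Binary cross-entropy via log-sigmoid identities:
            # log(1 - sigmoid(z)) == log_sigmoid(z) - z.
            losses += -(log_sigmoid(z) * t + (log_sigmoid(z) - z) * (1 - t))
    if normalize_by == "num_examples":
        return losses / num_examples
    return losses / num_classes
```

For example, with three classes and labels `[0, 1, 2]`, task 0 uses all 3 examples and task 1 uses the 2 examples with label > 0, so the two variants differ by a factor of 5/3 on that batch.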
KB