About Focal loss #109

Open
kirin-from-16 opened this issue Aug 9, 2024 · 0 comments

kirin-from-16 commented Aug 9, 2024

I'm dealing with a heavily imbalanced dataset of 3 classes, where the unchanged class accounts for about 90% of the training images but up to 99% of the total pixels. For this reason, I chose Focal loss.

However, in your implementation, the alpha parameter for each class is calculated in a way that makes it larger than 1. This goes against the original paper, where the alpha parameters are less than 1.
[screenshot of the alpha calculation in the Focal loss implementation]

Could you please explain the reason behind this choice?
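For reference, this is a minimal sketch of the convention I had in mind (the function name and the pixel counts below are hypothetical, not taken from your code): inverse class frequencies normalized to sum to 1, so every alpha stays below 1 as in the original focal loss paper.

```python
import torch

def inverse_frequency_alpha(label_counts):
    """Per-class alpha weights from pixel counts.

    Normalizing the inverse frequencies so they sum to 1 keeps every
    alpha below 1, matching the convention in the focal loss paper.
    """
    counts = torch.as_tensor(label_counts, dtype=torch.float32)
    inv_freq = 1.0 / counts            # rarer classes get larger raw weights
    alpha = inv_freq / inv_freq.sum()  # normalize so each alpha is in (0, 1)
    return alpha

# Example with the imbalance described above (unchanged class ~99% of pixels)
print(inverse_frequency_alpha([9_900_000, 60_000, 40_000]))
# roughly tensor([0.0024, 0.3990, 0.5985])
```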

Also, the dataloader passed to get_alpha is the training set, which applies random transforms on each run, so the alpha values differ from run to run. This seems to be a problem.
[screenshot of get_alpha being called on the training dataloader]
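A possible workaround, just a sketch assuming the ground-truth masks are stored as single-channel class-index images (`label_paths` is a hypothetical list of mask file paths, not something from your repo): count class pixels once from the raw masks instead of from the augmented dataloader, then derive alpha from those fixed counts.

```python
import numpy as np
from PIL import Image

def class_pixel_counts(label_paths, num_classes=3):
    """Count pixels per class directly from the raw ground-truth masks,
    skipping the training transforms, so the counts (and the alpha values
    derived from them) are identical on every run."""
    counts = np.zeros(num_classes, dtype=np.int64)
    for path in label_paths:
        mask = np.asarray(Image.open(path))  # assumes masks store class indices
        counts += np.bincount(mask.ravel(), minlength=num_classes)[:num_classes]
    return counts
```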

The datasets in your paper are also heavily imbalanced. Can you explain why you chose CrossEntropy instead? Did this cause difficulties in training the model?
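For context on that question, I was also wondering whether you considered class-weighted CrossEntropy. A minimal sketch of what I mean (the weights below are made up, not from your code):

```python
import torch
import torch.nn as nn

# CrossEntropy with per-class weights is one common way to handle this
# kind of imbalance without switching to Focal loss.
class_weights = torch.tensor([0.05, 0.45, 0.50])
criterion = nn.CrossEntropyLoss(weight=class_weights)

logits = torch.randn(2, 3, 64, 64)          # (batch, classes, H, W)
target = torch.randint(0, 3, (2, 64, 64))   # per-pixel class indices
loss = criterion(logits, target)
```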
