
Why is there no intermediate fine-tuning for one-class ImageNet-30? #6

Open
wangherr opened this issue Oct 23, 2023 · 1 comment
Comments

@wangherr

In my view, there is no difference between one-class ImageNet-30 and one-class CIFAR-10.

Why is intermediate fine-tuning acceptable for one-class CIFAR-10 but not for one-class ImageNet-30?

@JulietLJY
Member

Hi! The reason is that the classes of ImageNet-30 overlap with those of ImageNet-1k, the pre-training dataset. Consequently, if we conducted intermediate fine-tuning on ImageNet-30, the model would already have seen the held-out classes during pre-training, and the problem would collapse into a straightforward binary-classification task rather than a genuine out-of-distribution (OOD) detection task. CIFAR-10 does not share this overlap, so fine-tuning there is fine.
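To make the overlap argument concrete, here is a minimal sketch that checks whether a one-class "in-distribution" label set intersects the pre-training label set. The class names below are illustrative placeholders, not the actual ImageNet-30 or ImageNet-1k lists:

```python
# Hypothetical class lists; the real ImageNet-30 classes are drawn
# directly from ImageNet-1k, so the intersection is the full set.
imagenet30_classes = {"airliner", "ambulance", "banjo", "barn", "snowmobile"}
imagenet1k_classes = imagenet30_classes | {"tabby", "goldfish"}

# If the overlap is non-empty, the pre-trained model has already seen
# the supposedly "novel" classes, so OOD detection degenerates into
# ordinary classification over known classes.
overlap = imagenet30_classes & imagenet1k_classes
print(f"overlapping classes: {len(overlap)} of {len(imagenet30_classes)}")
```

With CIFAR-10 labels in place of `imagenet30_classes`, the intersection with the pre-training set is empty (at this label granularity), which is why intermediate fine-tuning remains a legitimate OOD setup there.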
