
Low accuracy on Jester dataset with pre-trained model #40

Open · stefanobini opened this issue May 21, 2021 · 1 comment

stefanobini commented May 21, 2021

Hi, I downloaded your repository and the Jester dataset, then followed the instructions in the README.md to preprocess the dataset and obtain the files required by the framework. I then ran the system in test mode on both the validation and test sets with the pre-trained ResNeXt-101 (jester_resnext_101_RGB_16_best.pth) and MobileNetV2 1.0x (jester_mobilenetv2_1.0x_RGB_16_best.pth) models. The performance is far from the reported results: around 3% accuracy on the validation set and about 10% on the test set. Could you share how to reproduce your results?
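
One thing I want to rule out on my side before blaming the data is the checkpoint loading itself: if the saved keys carry an `nn.DataParallel` `module.` prefix, or the classifier size doesn't match, `load_state_dict(..., strict=False)` will silently skip weights and the network stays essentially random. This is the quick inspection I have in mind (just standard PyTorch, nothing repo-specific; the file name is the released checkpoint):

```python
import torch

# Inspect the released checkpoint: where the weights live, whether keys carry a
# 'module.' (nn.DataParallel) prefix, and how many output classes the final
# linear layer has (Jester should give 27).
ckpt = torch.load('jester_resnext_101_RGB_16_best.pth', map_location='cpu')
state = ckpt.get('state_dict', ckpt) if isinstance(ckpt, dict) else ckpt

print('first weight keys:', list(state.keys())[:5])   # look for a 'module.' prefix

# If the keys are prefixed, strip it before calling load_state_dict; otherwise
# strict=False will silently skip everything and leave the net untrained.
state = {k.replace('module.', '', 1): v for k, v in state.items()}

# Print the 2-D weights: the classifier should have shape (27, feature_dim).
for k, v in state.items():
    if v.ndim == 2:
        print(k, tuple(v.shape))
```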

pestrstr commented

Hi Stefano, have you solved the issue?
I'm running into the same problem and cannot reproduce the results of the paper.
I tested the pre-trained weights on test videos and, looking closely at the output of the softmax layer, the model seems to always predict class 0. I'm wondering whether I'm testing with the wrong transforms / normalization.
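
In case it helps, this is roughly how I'm preprocessing a 16-frame clip at test time; the 112-pixel crop and the per-channel mean values below are my assumptions, not something I've confirmed against the training options, so any of them could be the mismatch:

```python
import torch
from PIL import Image
import torchvision.transforms as T

# Assumed test-time preprocessing for a 16-frame RGB clip.
# NOTE: the crop size and the mean values are guesses and must match whatever
# the checkpoint was trained with (check the repo's options / training log).
CLIP_LEN = 16
CROP_SIZE = 112
MEAN = [114.7748, 107.7354, 99.4750]   # assumed 0-255-range channel means

frame_tf = T.Compose([
    T.Resize(CROP_SIZE),       # shorter side -> 112
    T.CenterCrop(CROP_SIZE),
    T.ToTensor(),              # PIL -> float tensor in [0, 1], shape C x H x W
])

def load_clip(frame_paths):
    """Stack CLIP_LEN frames into a 1 x C x T x H x W tensor and normalize."""
    frames = [frame_tf(Image.open(p).convert('RGB')) for p in frame_paths[:CLIP_LEN]]
    clip = torch.stack(frames, dim=1)    # C x T x H x W
    clip = clip * 255.0                  # back to 0-255 if the model expects raw pixel range
    mean = torch.tensor(MEAN).view(3, 1, 1, 1)
    clip = clip - mean                   # per-channel mean subtraction only (no std division)
    return clip.unsqueeze(0)             # add batch dimension
```

Comparing a few validation clips with and without the mean subtraction (and with/without the rescaling to 0-255) should show quickly whether the constant class-0 output comes from the normalization.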
