Number of Epochs to replicate the acc results #2
Comments
Also, it'd be great if you could mention the batch sizes you used for each of the three datasets and how long training took. Thanks
Hi, sorry, I don't remember the epoch number accurately. You can set your epoch number by observing the log printout. As for the batch size, I remember it's 1: the number of iterations differs for each question, so we set it to 1. Thanks!
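The two suggestions above (batch size 1 because each question triggers a different number of iterations, and picking the epoch count by watching the log) can be sketched roughly as below. This is a toy illustration, not the repo's code: `run_epoch`, the per-question `hops` field, and the simulated validation-accuracy curve are all made up for the example.

```python
import random

random.seed(0)

def run_epoch(questions):
    # batch_size = 1: process one example at a time, since each question's
    # variable-length iteration loop can't be stacked into a fixed batch.
    for q in questions:
        for _ in range(q["hops"]):  # per-question iteration count varies
            pass  # a real forward/backward step would go here

def train(questions, max_epochs=600, patience=5):
    """Stop when validation accuracy plateaus, i.e. 'set your epoch
    number by observing the log printout'."""
    best_acc, best_epoch, stale = 0.0, 0, 0
    for epoch in range(1, max_epochs + 1):
        run_epoch(questions)
        # Toy stand-in for the accuracy reported in the training log.
        val_acc = min(1.0, 0.5 + 0.05 * epoch + random.uniform(-0.01, 0.01))
        if val_acc > best_acc + 1e-4:
            best_acc, best_epoch, stale = val_acc, epoch, 0
        else:
            stale += 1
            if stale >= patience:  # accuracy saturated -> stop early
                break
    return best_epoch, best_acc

questions = [{"hops": 1}, {"hops": 2}, {"hops": 3}]
epoch, acc = train(questions)
```

The same plateau-watching logic works whether you read accuracy off the printed log by hand or automate it as an early-stopping check.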
Hey, thanks for the clarification. I used a batch size of 1. I am able to get the accuracy mentioned in the paper on the MovieQA and WC datasets, but on PathQuestions it saturates at roughly 87%. I trained for 600 epochs with batch size 1 and the default parameters. Here's the run in question: https://wandb.ai/afzal/kbqa-proj/runs/2p0cu4tg/overview Could you kindly make a guess as to what I might be doing wrong?
Did you solve the issue? I tried to find my previous log file, but I can't, as I accidentally deleted some folders a while back. Can you share your log file with me?
Hi, I have looked at your plots. Can you slightly increase the dropout ratio? Tune it a little; I feel the model might be overfitting a bit.
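For reference, raising the dropout ratio means zeroing a larger fraction of activations at train time. A minimal sketch of (inverted) dropout is below; the function and the example ratios (0.2 vs. 0.3) are illustrative, not taken from the repo's configuration.

```python
import random

random.seed(0)

def dropout(x, p):
    """Inverted dropout: zero each unit with probability p and rescale
    survivors by 1/(1-p) so the expected activation is unchanged."""
    if not 0.0 <= p < 1.0:
        raise ValueError("dropout ratio must be in [0, 1)")
    return [0.0 if random.random() < p else v / (1.0 - p) for v in x]

x = [1.0] * 1000
# "Slightly increase the dropout ratio": e.g. 0.2 -> 0.3 drops more units,
# a common knob when validation accuracy saturates below training accuracy.
mild = dropout(x, 0.2)
stronger = dropout(x, 0.3)
```

Small increments are the usual advice: too large a jump can underfit instead.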
Hi,
I am currently training your model to replicate the accuracy results. Could you kindly let me know how many epochs you trained for on each of the three datasets (MetaQA, PathQuestions, WC2014) to obtain the Top-1 scores in Table 2 of your paper? I know that WC2014 saturates to ~100% within the first few epochs, but I don't know about the remaining two datasets.
Thanks,