Hello @RudyChin ,
Thank you for your brilliant work.
I am facing some problems when using the CIFAR100 dataset. For CIFAR10 it works well: I reproduced the results you report in the paper. But for CIFAR100, when I set the pruning-away ratio to 0.9 and train for 60 epochs, or even 1000, pruning stops at about 80% of the original FLOPs, whereas the target should be 26.58M FLOPs for a 90% prune. I have tried changing the hyper-parameters in several ways, but it does not help.
Could you please give me some suggestions on how to solve this problem?
Here are the hyper-parameters I used, for reference:
Namespace(datapath='./data', dataset='torchvision.datasets.CIFAR100', epoch=1000, name='prune_90', model='./mbnetv2c100-best.pth', batch_size=128, lr=0.01, lbda=3e-09, prune_away=0.9, constraint='flops', large_input=False, no_grow=False, pruner='FilterPrunerMBNetV2')
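For context, the expected budget follows from a simple proportion: with `prune_away=0.9`, the target is 10% of the original FLOPs. A minimal sketch of that arithmetic (the 265.8M original-FLOPs figure is inferred here from the 26.58M target stated above, not taken from the repository; `target_flops` is a hypothetical helper, not part of the codebase):

```python
def target_flops(original_flops: float, prune_away: float) -> float:
    """Return the FLOPs budget remaining after pruning away the given fraction."""
    return original_flops * (1.0 - prune_away)

# Assumed original FLOPs for MobileNetV2 on CIFAR-100, inferred from
# the 26.58M target quoted above for a 90% prune.
original = 265.8e6
print(target_flops(original, 0.9) / 1e6)  # ~26.58 (MFLOPs)
```

Pruning stopping near 80% of the original FLOPs therefore means the run is falling well short of this 26.58M budget, rather than the budget itself being computed differently.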
Thank you; I am eagerly awaiting your reply.