#30
Comments
Thank you for your interest in our work. Mask-modeling-based pre-training is commonly used to enhance performance: the pre-trained weights serve as the starting point for fine-tuning on downstream tasks. We use
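The pre-train-then-fine-tune idea described above can be sketched as follows. This is a minimal, hypothetical illustration (not the actual PointMamba loading code, and the weight names are made up): parameters whose names match the pre-trained checkpoint are initialized from it, while the new task head keeps its fresh initialization.

```python
# Hypothetical sketch: pre-training produces encoder weights; fine-tuning
# reuses them as initialization and adds a freshly initialized task head.

def load_pretrained(finetune_model: dict, pretrained: dict) -> dict:
    """Copy every weight whose name exists in both models; keep the rest
    (e.g. the new classification head) at its fresh initialization."""
    return {
        name: pretrained.get(name, weight)
        for name, weight in finetune_model.items()
    }

# Weights learned during masked-point pre-training (encoder + decoder).
pretrained = {"encoder.layer0": [0.9, 0.1], "decoder.layer0": [0.5, 0.5]}

# Fine-tune model: same encoder names plus a new classification head;
# no decoder, since reconstruction is only needed during pre-training.
finetune = {"encoder.layer0": [0.0, 0.0], "cls_head": [0.0, 0.0]}

init = load_pretrained(finetune, pretrained)
# The encoder starts from pre-trained values; cls_head stays freshly initialized.
```

The point of the sketch: pre-training does not change what the fine-tuned model computes; it only provides a better starting point for its shared parameters.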
Could you please reply to this message? I'm really eager to learn about the related work.
I am currently facing a few issues:
(1) What is the purpose of pretraining?
(2) In the model section of PointMamba/cfgs/finetune_modelnet.yaml, the parameter NAME: PointMamba indicates that the PointMamba module from point_mamba.py is being used. However, it seems that no other modules, such as MaskMamba, are used. Does this mean that when classifying the ModelNet40 dataset there is no masking-and-reconstruction process?
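Assuming the config's NAME field selects a model class through a registry (the real builder in the repo may differ; the registry, class names, and `uses_masking` flag below are illustrative assumptions), the behavior in question (2) could be sketched like this: only the class named in the YAML is instantiated, so a masking/reconstruction model is never built during ModelNet40 fine-tuning.

```python
# Hypothetical sketch of config-driven model selection keyed by NAME.
MODEL_REGISTRY = {}

def register(cls):
    """Register a model class under its class name."""
    MODEL_REGISTRY[cls.__name__] = cls
    return cls

@register
class PointMamba:   # fine-tuning / classification model: no masking
    uses_masking = False

@register
class MaskMamba:    # pre-training model: masks and reconstructs patches
    uses_masking = True

def build_model(cfg: dict):
    """Instantiate only the class named in the config."""
    return MODEL_REGISTRY[cfg["NAME"]]()

# finetune_modelnet.yaml sets NAME: PointMamba, so only PointMamba is built
# and no masking or reconstruction happens during fine-tuning.
model = build_model({"NAME": "PointMamba"})
```

Under this reading, masking and reconstruction exist only in the pre-training config; fine-tuning on ModelNet40 is a plain classification pass.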