About the loss #8
Hi, you may check whether you have added the DiffKD module to your optimizer.
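A minimal way to check this could look like the sketch below; `student`, `diffkd`, and the optimizer settings are placeholders for illustration, not code from this repository.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the real student network and DiffKD module.
student = nn.Linear(16, 10)
diffkd = nn.Linear(16, 16)

# Build the optimizer over both the student and the DiffKD parameters.
optimizer = torch.optim.SGD(
    list(student.parameters()) + list(diffkd.parameters()),
    lr=0.1,
    momentum=0.9,
)

# Sanity check: every DiffKD parameter should appear in some param group.
opt_params = {id(p) for group in optimizer.param_groups for p in group["params"]}
assert all(id(p) in opt_params for p in diffkd.parameters()), \
    "DiffKD parameters are missing from the optimizer"
```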
Hello, I checked the code, and it seems I did not add the DiffKD module to the optimizer. However, the train.py you provided does not seem to do this either. Have I missed something? Could you suggest how to solve it? A million thanks! The code I created is below:

```python
import torch.nn as nn

class DiffKD(nn.Module):
    ...

class DiffPro(nn.Module):
    ...
```
I added the module into the student via the following code:

Sorry about this, I'll consider a better way to achieve it.
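For illustration, one way a module like this could be registered inside the student, assuming DiffKD is an ordinary `nn.Module` (the attribute name `diffkd` and the dummy classes below are made up, not the repository's actual code), is to attach it as a submodule so that `student.parameters()` already includes it:

```python
import torch
import torch.nn as nn

class DummyDiffKD(nn.Module):
    """Placeholder for the real DiffKD module."""
    def __init__(self, channels=16):
        super().__init__()
        self.proj = nn.Linear(channels, channels)

    def forward(self, feat):
        return self.proj(feat)

student = nn.Linear(16, 10)  # placeholder for the student network
# Assigning the module as an attribute registers it as a submodule, so an
# optimizer built from student.parameters() also updates it.
student.diffkd = DummyDiffKD(16)

optimizer = torch.optim.SGD(student.parameters(), lr=0.1, momentum=0.9)
print(sum(p.numel() for g in optimizer.param_groups for p in g["params"]))
```

The trade-off of this approach is that the DiffKD parameters then also end up in the student's state dict, which may or may not be what the training script expects.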
Understood. I will try to fix it. Thanks again!
Dear Hunto,
I tried your method on my task. I found that while the original loss (between the student's output and the ground truth) decreased, the other losses, such as the autoencoder loss and the diffusion loss, did not seem to change. What do you think could be the reason?
Best!
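A quick diagnostic for this situation (a hedged sketch with placeholder modules and losses, not the repository's actual training loop) is to run one backward pass and check whether the diffusion/autoencoder parameters sit in the optimizer and actually receive gradients:

```python
import torch
import torch.nn as nn

# Placeholder modules standing in for the student and the DiffKD/autoencoder parts.
student = nn.Linear(16, 10)
diffkd = nn.Linear(16, 16)
optimizer = torch.optim.SGD(
    list(student.parameters()) + list(diffkd.parameters()), lr=0.1
)

x = torch.randn(4, 16)
# Stand-ins for the task loss and the auxiliary (diffusion/autoencoder) loss.
loss = student(x).mean() + diffkd(x).pow(2).mean()
loss.backward()

# Parameters missing from the optimizer, or with zero/None gradients, would keep
# their associated losses flat during training.
opt_params = {id(p) for g in optimizer.param_groups for p in g["params"]}
for name, p in diffkd.named_parameters():
    grad_norm = 0.0 if p.grad is None else float(p.grad.norm())
    print(f"{name}: in_optimizer={id(p) in opt_params}, grad_norm={grad_norm:.4f}")
```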