
Swap Loss not Decreasing in Contrastive Learning #11

Open
P1terQ opened this issue Aug 10, 2024 · 1 comment

Comments

P1terQ commented Aug 10, 2024

In the paper, the authors highlight that using contrastive learning lets the representation carry more precise environmental information, and the t-SNE visualization indicates the stronger representation ability of this method.

However, when I attempted to run the open-source code, the swap loss didn't decrease, whether I trained for a short or a long number of iterations.

I am wondering if this is expected behavior for contrastive learning algorithms, or if there might be an issue with my implementation.
(screenshots of the training loss curves attached)
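For readers unfamiliar with the term: "swap loss" here presumably refers to a SwAV-style swapped-prediction objective, where each view of a state must predict the cluster assignment ("code") of the other view. A minimal NumPy sketch of that idea (this is not the repository's code; all names are illustrative, and the real SwAV recipe additionally uses Sinkhorn-Knopp and a stop-gradient on the codes):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def swap_loss(z1, z2, prototypes, temperature=0.1):
    """SwAV-style swapped-prediction loss (simplified sketch).

    z1, z2     : (N, D) L2-normalised embeddings of two views of the same states.
    prototypes : (K, D) L2-normalised prototype vectors.
    """
    # Assignment distributions of each view over the prototypes
    p1 = softmax(z1 @ prototypes.T / temperature)
    p2 = softmax(z2 @ prototypes.T / temperature)
    # Target codes. Real SwAV computes these with the Sinkhorn-Knopp
    # equipartition step and blocks gradients through them; a plain
    # softmax stands in here for brevity.
    q1, q2 = p1, p2
    # Each view predicts the *other* view's code (the "swap")
    ce12 = -(q2 * np.log(p1 + 1e-8)).sum(axis=-1)
    ce21 = -(q1 * np.log(p2 + 1e-8)).sum(axis=-1)
    return 0.5 * (ce12 + ce21).mean()
```

One generic caveat: without the equipartition/stop-gradient machinery, an objective like this can sit nearly flat or collapse to a trivial constant, which is one possible reason a swap-loss curve barely moves. The repository's actual implementation may differ from this sketch.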

kc-ustc commented Oct 25, 2024

Thanks for sharing! I have met the same problem with the swap loss. I also tried an ablation study without contrastive learning by setting the swap loss to 0, and found that the linear velocity tracking reward during training seems to make no difference compared with the original version.
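For context, the ablation described above amounts to zeroing the weight on the auxiliary contrastive term in the combined training objective. A minimal sketch of that setup (the names `ppo_loss` and `swap_loss_weight` are illustrative, not taken from this repository):

```python
def total_loss(ppo_loss, swap_loss, swap_loss_weight=1.0):
    """Combine the main RL objective with an auxiliary contrastive term.

    swap_loss_weight = 0.0 reproduces the ablation: the contrastive
    term contributes nothing and only the RL loss is optimised.
    """
    return ppo_loss + swap_loss_weight * swap_loss
```

If tracking rewards are unchanged with the weight at 0, the contrastive term may not be contributing useful gradient signal in this setup, which would be consistent with the flat swap-loss curves reported above.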
