The influence of the hyperparameter encoder_size #6

Open
garrywrj opened this issue Nov 25, 2020 · 1 comment

Comments

@garrywrj

In your paper "Acoustic echo cancellation with the dual-signal transformation LSTM network", it is mentioned that the size of the learned feature representation is 512. Does this mean the encoder_size is 512? In your DNS-Challenge paper, the encoder_size is 256. I would like to know why you changed the encoder size.
Will increasing encoder_size from 256 to 512 influence the model size, the number of parameters, the objective and subjective metrics, and the execution time?
Thanks a lot!

@breizhn
Owner

breizhn commented Feb 12, 2021

Hi!

I just wanted to squeeze a tiny bit more performance out of the model for the challenge; that is the reason for the higher encoder size.
In general, a higher encoder size will slightly increase the performance, but the gain is often not significant. It will definitely increase the model size and the required computation, but it will not influence the execution time by a significant amount.
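As a rough illustration of how encoder_size affects the parameter count, here is a minimal sketch (not the repository code) of a learned feature encoder built as a 1x1 Conv1D over framed time-domain blocks, in the style of DTLN models. The names block_len and build_encoder, the block length of 512 samples, and the choice of use_bias=False are illustrative assumptions, not the exact configuration of this repository.

```python
import tensorflow as tf

def build_encoder(block_len=512, encoder_size=256):
    # (batch, frames, samples per frame); the 1x1 conv maps each frame
    # of block_len samples to encoder_size learned features
    frames = tf.keras.Input(shape=(None, block_len))
    feats = tf.keras.layers.Conv1D(encoder_size, 1, use_bias=False)(frames)
    return tf.keras.Model(frames, feats)

for size in (256, 512):
    model = build_encoder(encoder_size=size)
    # encoder weights = block_len * encoder_size
    print(size, model.count_params())
# 256 -> 131072 weights, 512 -> 262144 weights: doubling encoder_size
# doubles the encoder weights and the multiply-adds per frame.
```

In a sketch like this, the encoder is only a small part of the whole network, so doubling it adds relatively little to the overall per-frame cost, which would be consistent with the observation that execution time is barely affected.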
