This repository has been archived by the owner on Jan 1, 2025. It is now read-only.

Transfer Learning/ Improving performance of the base model #89

Open
qalabeabbas49 opened this issue Apr 25, 2023 · 0 comments

Comments


Hi, I have an idea to build an svoice model for specific domains, e.g. medicine-related conversations, financial conversations, or perhaps a different language.
I am training the base model on LibriMix and it works pretty well, but when I test with datasets in a different language the results are not that great.

I trained the base model on LibriMix for 100 epochs.
I then continued training the same model on a small dataset in a different language for another 20–30 epochs, hoping it would improve performance on the new language while keeping at least the original performance. Unfortunately, it did not work like that: it is as if the model reset all its weights when I changed the training dataset.

My question is: how should I go about transfer learning so that the base model does not lose its original performance but still improves on the custom data provided?
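For context, the kind of setup I have been considering is the common fine-tuning recipe of freezing early layers, using a much smaller learning rate, and rehearsing some of the original data. This is only a minimal PyTorch sketch with a toy stand-in model; the layer split, learning rate, and replay scheme are assumptions, not svoice's actual architecture or training code:

```python
# Hypothetical fine-tuning sketch to limit catastrophic forgetting.
# The model below is a toy stand-in; svoice's real model differs.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(16, 32),  # "early" layers pretrained on LibriMix
    nn.ReLU(),
    nn.Linear(32, 16),  # "head" to adapt to the new language
)
# In practice you would first load the pretrained checkpoint here,
# e.g. model.load_state_dict(torch.load("base.pt")).

# 1) Freeze the early layers so their pretrained weights are preserved.
for param in model[0].parameters():
    param.requires_grad = False

# 2) Optimize only the unfrozen parameters, with a learning rate much
#    smaller than the one used for base training (value is illustrative).
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

# 3) Rehearsal: mix some original LibriMix batches in with the new-language
#    batches so the model keeps seeing the old distribution (toy tensors here).
old_batch = torch.randn(8, 16)
new_batch = torch.randn(8, 16)
for batch in (new_batch, old_batch):  # e.g. alternate new and old data
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(batch), batch)  # placeholder loss
    loss.backward()
    optimizer.step()
```

Is something along these lines the right direction for svoice, or is there a recommended way to do it with this codebase?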

Thank you
