
Freezing layers during training #26

Open
birdmw opened this issue Jan 6, 2020 · 0 comments
birdmw commented Jan 6, 2020

When training I see progress followed by degradation. This is likely because the model is overfitting, given the limited corpus of only 8k samples: during the fine-tuning task we are overwriting the pre-trained weights. What we would like to do is freeze the original layers so only the new task-specific layers are updated. We need to figure out how to do this.
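One common way to do this (a sketch, assuming a PyTorch-style model; the actual class and layer names in this repo are unknown, so `FineTuner`, `encoder`, and `head` below are hypothetical) is to set `requires_grad = False` on the pre-trained parameters and give the optimizer only the remaining trainable ones:

```python
import torch
import torch.nn as nn

# Hypothetical fine-tuning model: a pre-trained encoder plus a new task head.
class FineTuner(nn.Module):
    def __init__(self, encoder: nn.Module, hidden_dim: int, num_classes: int):
        super().__init__()
        self.encoder = encoder  # pre-trained weights we want to preserve
        self.head = nn.Linear(hidden_dim, num_classes)  # new trainable layer

    def forward(self, x):
        return self.head(self.encoder(x))

def freeze_encoder(model: FineTuner) -> None:
    """Stop gradients from updating the pre-trained encoder weights."""
    for param in model.encoder.parameters():
        param.requires_grad = False

# Stand-in encoder; in practice this would be the loaded pre-trained model.
encoder = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
model = FineTuner(encoder, hidden_dim=32, num_classes=4)
freeze_encoder(model)

# Only pass the still-trainable parameters to the optimizer, so the frozen
# encoder is never updated even if its grads were somehow populated.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```

Whether the whole encoder or only its lower layers should be frozen is a judgment call; partially unfreezing the top encoder layers is a common middle ground when the corpus is small.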
