Transformers-for-Language-Modeling-and-Sentiment-Analysis

About the project: A pre-training procedure for a BERT-style transformer model is devised from scratch; after fine-tuning for sentiment analysis, the model achieves 92% test accuracy.
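The exact pre-training procedure lives in bert.ipynb; as a hedged illustration of the kind of BERT-style masked-token corruption such a procedure typically involves, the sketch below (all names such as `mask_tokens` are placeholders, not the repo's actual code) applies the standard 80/10/10 rule: roughly 15% of non-special tokens become prediction targets, of which 80% are replaced by a mask token, 10% by a random token, and 10% are left unchanged.

```python
import torch

def mask_tokens(input_ids, mask_token_id, vocab_size, special_ids, mask_prob=0.15):
    """BERT-style masking sketch: pick ~mask_prob of non-special tokens as
    targets; of those, 80% -> mask token, 10% -> random token, 10% unchanged."""
    labels = input_ids.clone()

    # Candidate positions: everything except special tokens ([CLS], [SEP], [PAD], ...)
    special = torch.zeros_like(input_ids, dtype=torch.bool)
    for sid in special_ids:
        special |= input_ids == sid
    prob = torch.full(input_ids.shape, mask_prob)
    prob.masked_fill_(special, 0.0)
    masked = torch.bernoulli(prob).bool()
    labels[~masked] = -100  # positions ignored by nn.CrossEntropyLoss

    corrupted = input_ids.clone()
    # 80% of selected positions -> mask token
    replace = torch.bernoulli(torch.full(input_ids.shape, 0.8)).bool() & masked
    corrupted[replace] = mask_token_id
    # half of the remaining 20% -> random vocabulary token
    random_tok = torch.bernoulli(torch.full(input_ids.shape, 0.5)).bool() & masked & ~replace
    corrupted[random_tok] = torch.randint(vocab_size, input_ids.shape)[random_tok]
    # the final 10% keep their original token but are still predicted
    return corrupted, labels
```

The corrupted ids are fed to the encoder and the loss is computed only where `labels != -100`, which is why unselected positions are filled with the ignore index.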


Python Libraries Used: PyTorch, NumPy, math (Python standard library), Transformers (Hugging Face)


Jupyter notebook for the project: bert.ipynb.
Note that the notebook is self-contained and can be run directly on Google Colab.
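For the fine-tuning step, a common pattern is to pool the first ([CLS]) position of the pre-trained encoder's output and pass it through a small classification head. The sketch below is a generic illustration under assumed names (`SentimentHead`, a `hidden_size` of 256, binary labels), not the notebook's actual classes.

```python
import torch
import torch.nn as nn

class SentimentHead(nn.Module):
    """Illustrative fine-tuning head: takes any encoder module that maps
    (batch, seq_len) token ids to (batch, seq_len, hidden) states, pools the
    [CLS] position, and produces sentiment logits."""
    def __init__(self, encoder, hidden_size=256, num_labels=2, dropout=0.1):
        super().__init__()
        self.encoder = encoder
        self.dropout = nn.Dropout(dropout)
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids):
        hidden = self.encoder(input_ids)           # (B, T, H)
        cls = hidden[:, 0]                         # representation at the [CLS] position
        return self.classifier(self.dropout(cls))  # (B, num_labels)
```

During fine-tuning the whole stack (encoder plus head) is usually trained end-to-end with cross-entropy loss on the sentiment labels.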
