
Transformers-for-Language-Modeling-and-Sentiment-Analysis

About the project: Devised a pre-training procedure for a BERT-style transformer model from scratch and achieved 92% test accuracy after fine-tuning it for sentiment analysis.
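The README does not show the pre-training code itself, but BERT-style pre-training centers on masked language modeling: roughly 15% of tokens are chosen as prediction targets, and of those, 80% are replaced with a `[MASK]` token, 10% with a random token, and 10% left unchanged. The sketch below illustrates that masking scheme with NumPy (the function name, token IDs, and RNG handling are illustrative assumptions, not taken from the notebook):

```python
import numpy as np

def mask_tokens(input_ids, mask_token_id, vocab_size, rng, mlm_prob=0.15):
    """BERT-style masking sketch: pick ~15% of positions as targets,
    then apply the 80/10/10 mask/random/keep corruption rule."""
    labels = input_ids.copy()
    masked = rng.random(input_ids.shape) < mlm_prob
    labels[~masked] = -100  # convention: ignored by cross-entropy loss

    corrupted = input_ids.copy()
    r = rng.random(input_ids.shape)
    # 80% of masked positions -> [MASK]
    corrupted[masked & (r < 0.8)] = mask_token_id
    # 10% of masked positions -> random vocabulary token
    rand_pos = masked & (r >= 0.8) & (r < 0.9)
    corrupted[rand_pos] = rng.integers(vocab_size, size=input_ids.shape)[rand_pos]
    # remaining 10% stay unchanged; labels still mark them as targets
    return corrupted, labels

rng = np.random.default_rng(0)
ids = np.arange(1, 101)  # toy token IDs
corrupted, labels = mask_tokens(ids, mask_token_id=103, vocab_size=30522, rng=rng)
```

During pre-training, the model receives `corrupted` as input and is trained to predict the original token at every position where `labels != -100`.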


Python Libraries Used: PyTorch, NumPy, math, Transformers


Jupyter notebook for the project: bert.ipynb.
The notebook is self-contained and can be run directly on Colab.