roberta-base-ne

Train roberta-base from scratch for the Nepali language using the Nepali subset of the CC-100 corpus.
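
As a rough illustration of where the Nepali text can come from, the snippet below streams the Nepali (ne) portion of CC-100 with the Hugging Face datasets library. This is a sketch only; the actual data loading inside train.py is governed by config/default.yaml and may differ, and the exact load_dataset arguments depend on your datasets version.

# Illustrative sketch: stream the Nepali (ne) portion of CC-100.
from datasets import load_dataset

dataset = load_dataset("cc100", lang="ne", split="train", streaming=True)

# Peek at a few raw sentences.
for example in dataset.take(3):
    print(example["text"].strip())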

Training

To start the training, run:

python train.py

The default configuration is stored at config/default.yaml. You can view all available configuration options with the --help flag:

python train.py --help
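
The file itself is not reproduced here; as a purely illustrative sketch of its shape (only the dataset.portion and model.epochs keys are implied by the override example below, the values are placeholders), it might look like:

# config/default.yaml -- illustrative sketch, not the actual file
dataset:
  portion: null   # number of sentences to use; null means the full subset
model:
  epochs: 3       # number of training epochs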

You can override any configuration value from the CLI using Hydra's override syntax. For example, to train on only 100 sentences for 1 epoch, run:

python train.py dataset.portion=100 model.epochs=1
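
These dotted overrides work because Hydra composes config/default.yaml with the CLI arguments before handing a single config object to the training code. A minimal sketch of such an entry point, assuming Hydra >= 1.2 and not necessarily matching the real train.py:

# Minimal Hydra entry point sketch; the actual train.py may differ.
import hydra
from omegaconf import DictConfig, OmegaConf

@hydra.main(config_path="config", config_name="default", version_base=None)
def main(cfg: DictConfig) -> None:
    # cfg is default.yaml merged with any CLI overrides,
    # e.g. dataset.portion=100 model.epochs=1
    print(OmegaConf.to_yaml(cfg))
    # ... build the tokenizer, RoBERTa model, and trainer from cfg ...

if __name__ == "__main__":
    main()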

Citations

Our model has been featured in the following papers:
