Bert-Base-Uncased-Go-Emotion

Model description:

bert-base-uncased fine-tuned on the GoEmotions dataset for multi-label emotion classification.

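A minimal usage sketch with the Hugging Face transformers pipeline. The pipeline call and `top_k=None` behaviour are standard transformers API; the example text is arbitrary, and label names should be checked against the model's config:

```python
from transformers import pipeline

# Load the fine-tuned model from the Hugging Face Hub
classifier = pipeline(
    "text-classification",
    model="bhadresh-savani/bert-base-go-emotion",
    top_k=None,  # return scores for every emotion label, not just the top one
)

scores = classifier("I am so happy you came to visit!")
# scores[0] is a list of {"label": ..., "score": ...} dicts,
# one per GoEmotions label, sorted by score.
print(scores[0][:3])
```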
Training Parameters:

Num examples = 169208
Num Epochs = 3
Instantaneous batch size per device = 16
Total train batch size (w. parallel, distributed & accumulation) = 16
Gradient Accumulation steps = 1
Total optimization steps = 31728
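The reported step count is consistent with the other parameters: with a per-device batch size of 16 and no gradient accumulation, one epoch is ceil(169208 / 16) = 10576 optimizer steps, and 3 epochs give 31728. A quick check:

```python
import math

num_examples = 169208
batch_size = 16
epochs = 3

steps_per_epoch = math.ceil(num_examples / batch_size)  # 10576
total_steps = steps_per_epoch * epochs
print(total_steps)  # 31728
```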

TrainOutput:

'train_loss': 0.12085497042373672

Evaluation Output:

'eval_accuracy_thresh': 0.9614765048027039,
'eval_loss': 0.1164659634232521
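`eval_accuracy_thresh` is not standard single-label accuracy; for a multi-label task like GoEmotions it is typically elementwise accuracy after thresholding the sigmoid outputs. The sketch below shows that metric under that assumption (the function name and the 0.5 threshold are illustrative, not taken from the training script):

```python
import numpy as np

def accuracy_thresh(logits, labels, thresh=0.5):
    """Elementwise accuracy after thresholding sigmoid probabilities."""
    probs = 1.0 / (1.0 + np.exp(-logits))  # sigmoid over raw logits
    preds = probs > thresh                 # per-label binary predictions
    return (preds == labels.astype(bool)).mean()

# Toy example: 2 samples, 3 labels
logits = np.array([[2.0, -1.0, 0.5], [-2.0, 3.0, -0.5]])
labels = np.array([[1, 0, 1], [0, 1, 0]])
print(accuracy_thresh(logits, labels))  # 1.0 — every elementwise prediction matches
```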

Colab Notebook:

Notebook

Dataset used to train bhadresh-savani/bert-base-go-emotion