esuriddick committed on
Commit ab20ce4 · 1 Parent(s): 7da8bbf

Update README.md


Added a link to the Kaggle notebook where the model is fine-tuned.

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -38,6 +38,8 @@ It achieves the following results on the evaluation set:
 - Accuracy: 0.9375
 - F1: 0.9379
 
+The notebook used to fine-tune this model can be found [HERE](https://www.kaggle.com/marcoloureno/distilbert-base-uncased-finetuned-emotion).
+
 ## Model description
 
 DistilBERT is a transformers model, smaller and faster than BERT, which was pretrained on the same corpus in a