jhan21 commited on
Commit
389ae0a
·
verified ·
1 Parent(s): 2d69098

End of training

Browse files
README.md ADDED
@@ -0,0 +1,98 @@
+ ---
+ library_name: transformers
+ license: apache-2.0
+ base_model: distilbert-base-uncased
+ tags:
+ - generated_from_trainer
+ metrics:
+ - accuracy
+ - precision
+ - recall
+ - f1
+ model-index:
+ - name: amazon-reviews-sentiment-distilbert-base-uncased
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # amazon-reviews-sentiment-distilbert-base-uncased
+
+ This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an Amazon product reviews sentiment dataset (inferred from the model name; the card does not identify the exact dataset or split).
+ It achieves the following results on the evaluation set:
+ - Loss: 0.5171
+ - Accuracy: 0.7862
+ - Precision: 0.7876
+ - Recall: 0.7860
+ - F1: 0.7867
+
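For reference, a minimal inference sketch for this checkpoint. The repo id below is an assumption pieced together from the commit author and model name, not something the card states; point it at the actual hosted location before running.

```python
# Minimal inference sketch. MODEL_ID is an assumption based on the commit
# author and model name; adjust it to the actual hosted checkpoint.
MODEL_ID = "jhan21/amazon-reviews-sentiment-distilbert-base-uncased"


def top_label(scores):
    """Return the highest-scoring label from a list of {label, score} dicts."""
    return max(scores, key=lambda s: s["score"])["label"]


if __name__ == "__main__":
    # Import inside the guard so top_label stays usable without transformers.
    from transformers import pipeline

    clf = pipeline("text-classification", model=MODEL_ID, top_k=None)
    # With top_k=None the pipeline returns all class scores per input.
    for scores in clf(["Great product, fast shipping!", "Broke after two days."]):
        print(top_label(scores))
```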
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 0
+ - optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
+ - lr_scheduler_type: linear
+ - num_epochs: 1
+
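The hyperparameters above map directly onto `transformers.TrainingArguments` fields. A sketch of the same configuration collected as a plain dict (field names follow the Trainer API; `output_dir` and other run-specific settings are omitted), which could be splatted into `TrainingArguments(**hparams, output_dir=...)` when reproducing the run:

```python
# The card's hyperparameters, keyed by their transformers.TrainingArguments
# field names. Values are taken verbatim from the model card above.
hparams = {
    "learning_rate": 5e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 0,
    "optim": "adamw_torch",
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 1,
}
```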
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
+ |:-------------:|:------:|:-----:|:---------------:|:--------:|:---------:|:------:|:------:|
+ | 0.7283 | 0.0299 | 500 | 0.6867 | 0.7073 | 0.7038 | 0.7071 | 0.7030 |
+ | 0.6718 | 0.0598 | 1000 | 0.6067 | 0.7340 | 0.7478 | 0.7340 | 0.7377 |
+ | 0.6473 | 0.0898 | 1500 | 0.6154 | 0.7390 | 0.7508 | 0.7390 | 0.7416 |
+ | 0.616 | 0.1197 | 2000 | 0.6448 | 0.7423 | 0.7373 | 0.7420 | 0.7377 |
+ | 0.6123 | 0.1496 | 2500 | 0.6286 | 0.7241 | 0.7677 | 0.7243 | 0.7284 |
+ | 0.5874 | 0.1795 | 3000 | 0.5774 | 0.7516 | 0.7539 | 0.7515 | 0.7523 |
+ | 0.5746 | 0.2095 | 3500 | 0.5708 | 0.7564 | 0.7636 | 0.7563 | 0.7582 |
+ | 0.5917 | 0.2394 | 4000 | 0.5839 | 0.7596 | 0.7602 | 0.7595 | 0.7598 |
+ | 0.5774 | 0.2693 | 4500 | 0.6225 | 0.7526 | 0.7482 | 0.7524 | 0.7492 |
+ | 0.594 | 0.2992 | 5000 | 0.5531 | 0.7662 | 0.7694 | 0.7661 | 0.7673 |
+ | 0.5591 | 0.3292 | 5500 | 0.5770 | 0.7665 | 0.7645 | 0.7663 | 0.7645 |
+ | 0.5548 | 0.3591 | 6000 | 0.5805 | 0.7613 | 0.7579 | 0.7611 | 0.7584 |
+ | 0.5742 | 0.3890 | 6500 | 0.5592 | 0.7639 | 0.7665 | 0.7638 | 0.7636 |
+ | 0.5374 | 0.4189 | 7000 | 0.5548 | 0.7712 | 0.7776 | 0.7711 | 0.7735 |
+ | 0.5488 | 0.4489 | 7500 | 0.5622 | 0.7747 | 0.7747 | 0.7745 | 0.7746 |
+ | 0.5557 | 0.4788 | 8000 | 0.5698 | 0.7642 | 0.7822 | 0.7643 | 0.7670 |
+ | 0.556 | 0.5087 | 8500 | 0.5380 | 0.7754 | 0.7777 | 0.7753 | 0.7764 |
+ | 0.5325 | 0.5386 | 9000 | 0.5791 | 0.7754 | 0.7746 | 0.7751 | 0.7736 |
+ | 0.5301 | 0.5686 | 9500 | 0.5569 | 0.7753 | 0.7738 | 0.7751 | 0.7744 |
+ | 0.5232 | 0.5985 | 10000 | 0.5391 | 0.7782 | 0.7806 | 0.7780 | 0.7789 |
+ | 0.5462 | 0.6284 | 10500 | 0.5499 | 0.7729 | 0.7698 | 0.7726 | 0.7683 |
+ | 0.5614 | 0.6583 | 11000 | 0.5243 | 0.7803 | 0.7818 | 0.7801 | 0.7808 |
+ | 0.5376 | 0.6883 | 11500 | 0.5406 | 0.7795 | 0.7772 | 0.7794 | 0.7780 |
+ | 0.5287 | 0.7182 | 12000 | 0.5227 | 0.7797 | 0.7852 | 0.7796 | 0.7806 |
+ | 0.5149 | 0.7481 | 12500 | 0.5423 | 0.7803 | 0.7788 | 0.7801 | 0.7792 |
+ | 0.5312 | 0.7780 | 13000 | 0.5338 | 0.7771 | 0.7860 | 0.7771 | 0.7781 |
+ | 0.5204 | 0.8079 | 13500 | 0.5183 | 0.7843 | 0.7857 | 0.7841 | 0.7849 |
+ | 0.5412 | 0.8379 | 14000 | 0.5192 | 0.7844 | 0.7893 | 0.7843 | 0.7860 |
+ | 0.515 | 0.8678 | 14500 | 0.5135 | 0.7845 | 0.7858 | 0.7843 | 0.7850 |
+ | 0.5033 | 0.8977 | 15000 | 0.5254 | 0.7862 | 0.7882 | 0.7860 | 0.7870 |
+ | 0.5023 | 0.9276 | 15500 | 0.5251 | 0.7863 | 0.7853 | 0.7861 | 0.7856 |
+ | 0.5042 | 0.9576 | 16000 | 0.5215 | 0.7865 | 0.7864 | 0.7864 | 0.7864 |
+ | 0.5237 | 0.9875 | 16500 | 0.5171 | 0.7862 | 0.7876 | 0.7860 | 0.7867 |
+
+
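The card does not say how the precision/recall/F1 columns were aggregated. A sketch of a metrics function consistent with the reported numbers, assuming weighted averaging over classes (that averaging mode, and taking plain prediction/label lists rather than the Trainer's `EvalPrediction`, are both assumptions):

```python
# Sketch of a compute_metrics helper, assuming weighted-average
# precision/recall/F1 over classes. The real run may have used
# sklearn or the `evaluate` library instead.
from collections import Counter


def per_class_prf(preds, labels, cls):
    """Precision, recall, F1 for a single class."""
    tp = sum(1 for p, l in zip(preds, labels) if p == cls and l == cls)
    fp = sum(1 for p, l in zip(preds, labels) if p == cls and l != cls)
    fn = sum(1 for p, l in zip(preds, labels) if p != cls and l == cls)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return prec, rec, f1


def compute_metrics(preds, labels):
    """Accuracy plus class-frequency-weighted precision/recall/F1."""
    n = len(labels)
    counts = Counter(labels)
    acc = sum(p == l for p, l in zip(preds, labels)) / n
    prec = rec = f1 = 0.0
    for cls in sorted(counts):
        p, r, f = per_class_prf(preds, labels, cls)
        w = counts[cls] / n  # weight each class by its support
        prec += w * p
        rec += w * r
        f1 += w * f
    return {"accuracy": acc, "precision": prec, "recall": rec, "f1": f1}
```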
+ ### Framework versions
+
+ - Transformers 4.50.3
+ - Pytorch 2.6.0+cu124
+ - Tokenizers 0.21.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c0628ad797c061d1fb978774a1450f57702cc605eb1e2309241968da97b999b1
+ oid sha256:7591df3d50649bacf198cb638851c18b5f87780d552367a6b0d840611baf41d0
  size 267835644
runs/Apr04_03-32-49_792637296ff0/events.out.tfevents.1743737621.792637296ff0.1795.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0e2cd719002cf8bea6e444ab58040c2bf075971130c39f0a43936ecf805b5858
- size 27789
+ oid sha256:0fca1ddc5979a200d0b213650f4eb166c1ebe3367dc95b81851df90375655ec3
+ size 28149