Commit 35d2c81 (verified) · yangwooko committed · Parent(s): 74e4839

Model save
README.md ADDED
@@ -0,0 +1,161 @@
+ ---
+ library_name: transformers
+ base_model: PowerInfer/SmallThinker-3B-Preview
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: smartmind-cyberone-20250410_x10
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # smartmind-cyberone-20250410_x10
+
+ This model is a fine-tuned version of [PowerInfer/SmallThinker-3B-Preview](https://huggingface.co/PowerInfer/SmallThinker-3B-Preview) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.0078
+
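+ A minimal usage sketch with the `transformers` library. The repo id below is an assumption inferred from the committer and model name; replace it with the actual Hub path. It assumes the model is a causal LM, like its base PowerInfer/SmallThinker-3B-Preview.
+
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ # Hypothetical repo id -- replace with the actual Hub path of this model.
+ model_id = "yangwooko/smartmind-cyberone-20250410_x10"
+
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForCausalLM.from_pretrained(model_id)
+
+ # Encode a sample prompt and generate a short completion.
+ inputs = tokenizer("Hello, how are you?", return_tensors="pt")
+ outputs = model.generate(**inputs, max_new_tokens=64)
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
+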
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (a minimal `TrainingArguments` sketch follows the list):
+ - learning_rate: 1e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - distributed_type: multi-GPU
+ - gradient_accumulation_steps: 8
+ - total_train_batch_size: 64
+ - optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
+ - lr_scheduler_type: cosine_with_restarts
+ - lr_scheduler_warmup_ratio: 0.1
+ - num_epochs: 5
+ - mixed_precision_training: Native AMP
+
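+ These settings map onto `transformers.TrainingArguments` roughly as follows; a hedged sketch, where `output_dir` and anything not listed above are assumptions rather than values recovered from this run:
+
+ ```python
+ from transformers import TrainingArguments
+
+ args = TrainingArguments(
+     output_dir="smartmind-cyberone-20250410_x10",  # assumed, not from the card
+     learning_rate=1e-5,
+     per_device_train_batch_size=8,
+     per_device_eval_batch_size=8,
+     seed=42,
+     gradient_accumulation_steps=8,  # 8 per device x 8 steps -> total batch 64
+     optim="adamw_torch",
+     adam_beta1=0.9,
+     adam_beta2=0.999,
+     adam_epsilon=1e-8,
+     lr_scheduler_type="cosine_with_restarts",
+     warmup_ratio=0.1,
+     num_train_epochs=5,
+     fp16=True,  # "Native AMP"; bf16 is equally plausible on recent GPUs
+ )
+ ```
+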
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:------:|:-----:|:---------------:|
+ | 0.5867 | 0.0499 | 310 | 0.1835 |
+ | 0.2091 | 0.0998 | 620 | 0.1088 |
+ | 0.1618 | 0.1497 | 930 | 0.0802 |
+ | 0.1325 | 0.1996 | 1240 | 0.0467 |
+ | 0.1496 | 0.2495 | 1550 | 0.0908 |
+ | 0.1206 | 0.2994 | 1860 | 0.0129 |
+ | 0.0787 | 0.3493 | 2170 | 0.0497 |
+ | 0.1031 | 0.3992 | 2480 | 0.0679 |
+ | 0.1326 | 0.4491 | 2790 | 0.1064 |
+ | 0.0932 | 0.4990 | 3100 | 0.0284 |
+ | 0.0869 | 0.5488 | 3410 | 0.0149 |
+ | 0.0765 | 0.5987 | 3720 | 0.0170 |
+ | 0.074 | 0.6486 | 4030 | 0.0338 |
+ | 0.073 | 0.6985 | 4340 | 0.0443 |
+ | 0.0862 | 0.7484 | 4650 | 0.0349 |
+ | 0.0961 | 0.7983 | 4960 | 0.0203 |
+ | 0.1037 | 0.8482 | 5270 | 0.0373 |
+ | 0.0705 | 0.8981 | 5580 | 0.0240 |
+ | 0.0695 | 0.9480 | 5890 | 0.0704 |
+ | 0.0686 | 0.9979 | 6200 | 0.0189 |
+ | 0.061 | 1.0478 | 6510 | 0.0178 |
+ | 0.0562 | 1.0977 | 6820 | 0.0262 |
+ | 0.0707 | 1.1476 | 7130 | 0.0189 |
+ | 0.0538 | 1.1975 | 7440 | 0.0137 |
+ | 0.0498 | 1.2474 | 7750 | 0.0146 |
+ | 0.0419 | 1.2973 | 8060 | 0.0193 |
+ | 0.0373 | 1.3472 | 8370 | 0.0120 |
+ | 0.0305 | 1.3971 | 8680 | 0.0126 |
+ | 0.0276 | 1.4470 | 8990 | 0.0098 |
+ | 0.0257 | 1.4969 | 9300 | 0.0125 |
+ | 0.0288 | 1.5468 | 9610 | 0.0128 |
+ | 0.0281 | 1.5967 | 9920 | 0.0072 |
+ | 0.0273 | 1.6465 | 10230 | 0.0085 |
+ | 0.0238 | 1.6964 | 10540 | 0.0157 |
+ | 0.0237 | 1.7463 | 10850 | 0.0088 |
+ | 0.0227 | 1.7962 | 11160 | 0.0125 |
+ | 0.0237 | 1.8461 | 11470 | 0.0107 |
+ | 0.0244 | 1.8960 | 11780 | 0.0063 |
+ | 0.0201 | 1.9459 | 12090 | 0.0047 |
+ | 0.023 | 1.9958 | 12400 | 0.0049 |
+ | 0.0211 | 2.0457 | 12710 | 0.0038 |
+ | 0.0171 | 2.0956 | 13020 | 0.0057 |
+ | 0.0229 | 2.1455 | 13330 | 0.0097 |
+ | 0.018 | 2.1954 | 13640 | 0.0060 |
+ | 0.0162 | 2.2453 | 13950 | 0.0089 |
+ | 0.0202 | 2.2952 | 14260 | 0.0098 |
+ | 0.0171 | 2.3451 | 14570 | 0.0072 |
+ | 0.0195 | 2.3950 | 14880 | 0.0044 |
+ | 0.0195 | 2.4449 | 15190 | 0.0043 |
+ | 0.0173 | 2.4948 | 15500 | 0.0046 |
+ | 0.015 | 2.5447 | 15810 | 0.0039 |
+ | 0.0149 | 2.5946 | 16120 | 0.0041 |
+ | 0.0204 | 2.6445 | 16430 | 0.0041 |
+ | 0.0173 | 2.6944 | 16740 | 0.0041 |
+ | 0.0181 | 2.7442 | 17050 | 0.0041 |
+ | 0.0165 | 2.7941 | 17360 | 0.0067 |
+ | 0.0326 | 2.8440 | 17670 | 0.0464 |
+ | 0.0732 | 2.8939 | 17980 | 0.0393 |
+ | 0.0367 | 2.9438 | 18290 | 0.0190 |
+ | 0.0515 | 2.9937 | 18600 | 0.0347 |
+ | 0.0348 | 3.0436 | 18910 | 0.0107 |
+ | 0.0288 | 3.0935 | 19220 | 0.0103 |
+ | 0.0363 | 3.1434 | 19530 | 0.0140 |
+ | 0.0409 | 3.1933 | 19840 | 0.0131 |
+ | 0.0211 | 3.2432 | 20150 | 0.0091 |
+ | 0.0279 | 3.2931 | 20460 | 0.0164 |
+ | 0.0286 | 3.3430 | 20770 | 0.0212 |
+ | 0.0244 | 3.3929 | 21080 | 0.0140 |
+ | 0.0301 | 3.4428 | 21390 | 0.0317 |
+ | 0.0274 | 3.4927 | 21700 | 0.0140 |
+ | 0.0245 | 3.5426 | 22010 | 0.0175 |
+ | 0.0216 | 3.5925 | 22320 | 0.0160 |
+ | 0.0209 | 3.6424 | 22630 | 0.0150 |
+ | 0.0243 | 3.6923 | 22940 | 0.0137 |
+ | 0.0255 | 3.7422 | 23250 | 0.0192 |
+ | 0.0233 | 3.7920 | 23560 | 0.0168 |
+ | 0.021 | 3.8419 | 23870 | 0.0210 |
+ | 0.021 | 3.8918 | 24180 | 0.0104 |
+ | 0.0174 | 3.9417 | 24490 | 0.0121 |
+ | 0.0195 | 3.9916 | 24800 | 0.0090 |
+ | 0.0168 | 4.0415 | 25110 | 0.0100 |
+ | 0.0198 | 4.0914 | 25420 | 0.0093 |
+ | 0.0208 | 4.1413 | 25730 | 0.0103 |
+ | 0.0197 | 4.1912 | 26040 | 0.0103 |
+ | 0.0204 | 4.2411 | 26350 | 0.0097 |
+ | 0.0156 | 4.2910 | 26660 | 0.0101 |
+ | 0.0163 | 4.3409 | 26970 | 0.0120 |
+ | 0.0168 | 4.3908 | 27280 | 0.0104 |
+ | 0.0192 | 4.4407 | 27590 | 0.0095 |
+ | 0.0175 | 4.4906 | 27900 | 0.0089 |
+ | 0.0185 | 4.5405 | 28210 | 0.0089 |
+ | 0.0163 | 4.5904 | 28520 | 0.0077 |
+ | 0.0135 | 4.6403 | 28830 | 0.0074 |
+ | 0.0136 | 4.6902 | 29140 | 0.0078 |
+ | 0.0138 | 4.7401 | 29450 | 0.0077 |
+ | 0.016 | 4.7900 | 29760 | 0.0076 |
+ | 0.0136 | 4.8399 | 30070 | 0.0078 |
+ | 0.0199 | 4.8897 | 30380 | 0.0078 |
+ | 0.0155 | 4.9396 | 30690 | 0.0078 |
+ | 0.0136 | 4.9895 | 31000 | 0.0078 |
+
+
+ ### Framework versions
+
+ - Transformers 4.51.3
+ - Pytorch 2.5.1+cu124
+ - Datasets 3.5.0
+ - Tokenizers 0.21.1
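+
+ A quick way to check that a local environment matches these pins (a sketch; each of these packages exposes `__version__`):
+
+ ```python
+ import datasets
+ import tokenizers
+ import torch
+ import transformers
+
+ # Expected: 4.51.3 / 2.5.1+cu124 / 3.5.0 / 0.21.1 per the card above.
+ for mod in (transformers, torch, datasets, tokenizers):
+     print(mod.__name__, mod.__version__)
+ ```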
model-00001-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:d7e88aec398ab9f79d8870031b6a9a41f61737ee8b5a7aeabc448f38c73fdbb8
+ oid sha256:fdfe3ae51f6c902d0f0e969cd4ea82b30e411e0461e029ba2b81d6a3e17aa60f
  size 4957559960
model-00002-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:dc13b33b24d07d547f1feb7ed830e784ff78ab58fe1ab7bb88d8d33f16878abf
+ oid sha256:4c420697b8714c1abcef031e08f179f8a3180bae46dc6ac4c7e291ed405d4e2e
  size 1214374880
runs/Apr15_08-05-39_927160a6de2e/events.out.tfevents.1744704397.927160a6de2e.30706.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:bac704143885d13c23743ca2f2a76421886ba004a3c2f598674d29bb35e6acbf
- size 54832
+ oid sha256:c40d1e65f75504d364314b49ec192674ec6d5c7259f02982a5bdcd8eb09b8cc6
+ size 55192
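
Each pointer above records only the blob's sha256 (`oid`) and byte size. A minimal sketch for verifying a downloaded shard against its pointer; the expected values are copied from the updated pointer for model-00001-of-00002.safetensors, and the local path is assumed:

```python
import hashlib
import os

# Values from the updated LFS pointer above.
EXPECTED_OID = "fdfe3ae51f6c902d0f0e969cd4ea82b30e411e0461e029ba2b81d6a3e17aa60f"
EXPECTED_SIZE = 4957559960

path = "model-00001-of-00002.safetensors"  # assumed local filename

# Stream in 1 MiB chunks so the ~5 GB shard never sits in memory.
digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)

assert os.path.getsize(path) == EXPECTED_SIZE, "size mismatch"
assert digest.hexdigest() == EXPECTED_OID, "sha256 mismatch"
print("pointer matches:", path)
```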