---
license: cc-by-nc-4.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: videomae-base-ipm_all_videos_gb
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# videomae-base-ipm_all_videos_gb

This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2748
- Accuracy: 0.6870
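
A minimal inference sketch is shown below. The repository id is a placeholder (the card does not state the full Hub path), and the random frames only illustrate the expected input shape; `VideoMAEImageProcessor` and `VideoMAEForVideoClassification` are the standard Transformers classes for this architecture.

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

# Placeholder repo id: substitute the actual Hub path of this checkpoint.
checkpoint = "videomae-base-ipm_all_videos_gb"

processor = VideoMAEImageProcessor.from_pretrained(checkpoint)
model = VideoMAEForVideoClassification.from_pretrained(checkpoint)

# videomae-base expects a clip of 16 frames; random uint8 frames stand in for a real video here.
video = list(np.random.randint(0, 255, (16, 224, 224, 3), dtype=np.uint8))

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```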

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 4800
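
These settings map onto the Transformers `TrainingArguments` API; the sketch below is a hedged reconstruction rather than the original training script. The output directory name is an assumption, the Adam betas/epsilon match the Trainer defaults, and the evaluation cadence (`eval_steps=60`) is inferred from the results table, not stated on the card.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above (not the original script).
training_args = TrainingArguments(
    output_dir="videomae-base-ipm_all_videos_gb",  # assumption: named after the model
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=4800,
    evaluation_strategy="steps",  # assumption: the results table shows one eval every 60 steps
    eval_steps=60,
    # optimizer: the Trainer's default AdamW already uses betas=(0.9, 0.999) and epsilon=1e-08
)
```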

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.5051 | 0.01 | 60 | 2.5234 | 0.0870 |
| 2.4957 | 1.01 | 120 | 2.5401 | 0.1217 |
| 2.5475 | 2.01 | 180 | 2.5675 | 0.0870 |
| 2.4659 | 3.01 | 240 | 2.5836 | 0.0957 |
| 2.2644 | 4.01 | 300 | 2.5035 | 0.0696 |
| 2.3548 | 5.01 | 360 | 2.2569 | 0.1217 |
| 2.0341 | 6.01 | 420 | 2.3958 | 0.1565 |
| 2.2919 | 7.01 | 480 | 2.6096 | 0.0696 |
| 2.0857 | 8.01 | 540 | 2.3223 | 0.1217 |
| 1.7473 | 9.01 | 600 | 2.5414 | 0.1652 |
| 1.885 | 10.01 | 660 | 1.7822 | 0.3043 |
| 1.9496 | 11.01 | 720 | 1.8052 | 0.3130 |
| 1.2315 | 12.01 | 780 | 2.1955 | 0.2435 |
| 1.3549 | 13.01 | 840 | 2.1262 | 0.3130 |
| 1.5121 | 14.01 | 900 | 2.0316 | 0.2783 |
| 1.4504 | 15.01 | 960 | 1.7596 | 0.2957 |
| 1.2991 | 16.01 | 1020 | 1.6413 | 0.3652 |
| 1.2299 | 17.01 | 1080 | 1.5417 | 0.4087 |
| 1.2965 | 18.01 | 1140 | 1.7243 | 0.3739 |
| 1.2431 | 19.01 | 1200 | 1.7556 | 0.3478 |
| 1.3807 | 20.01 | 1260 | 1.4580 | 0.4435 |
| 1.3961 | 21.01 | 1320 | 1.6514 | 0.4 |
| 1.0119 | 22.01 | 1380 | 1.5449 | 0.3391 |
| 1.3799 | 23.01 | 1440 | 1.5126 | 0.3304 |
| 1.6871 | 24.01 | 1500 | 2.0675 | 0.2783 |
| 1.2707 | 25.01 | 1560 | 1.7128 | 0.3739 |
| 1.1495 | 26.01 | 1620 | 1.6387 | 0.3217 |
| 1.6151 | 27.01 | 1680 | 1.6192 | 0.3913 |
| 1.0587 | 28.01 | 1740 | 1.6008 | 0.4522 |
| 1.2169 | 29.01 | 1800 | 1.6739 | 0.4348 |
| 1.1116 | 30.01 | 1860 | 1.7693 | 0.3913 |
| 1.0939 | 31.01 | 1920 | 1.6540 | 0.3913 |
| 0.9307 | 32.01 | 1980 | 1.5583 | 0.4957 |
| 0.9539 | 33.01 | 2040 | 1.8836 | 0.4174 |
| 0.9804 | 34.01 | 2100 | 1.5656 | 0.4522 |
| 1.334 | 35.01 | 2160 | 1.5375 | 0.4609 |
| 1.0897 | 36.01 | 2220 | 1.4327 | 0.4087 |
| 0.864 | 37.01 | 2280 | 1.6372 | 0.3913 |
| 0.9678 | 38.01 | 2340 | 1.4537 | 0.4609 |
| 1.3184 | 39.01 | 2400 | 1.3085 | 0.4783 |
| 1.1462 | 40.01 | 2460 | 1.4954 | 0.4696 |
| 0.7875 | 41.01 | 2520 | 1.4692 | 0.4870 |
| 0.9552 | 42.01 | 2580 | 1.3797 | 0.4174 |
| 0.8053 | 43.01 | 2640 | 1.3216 | 0.5043 |
| 0.9231 | 44.01 | 2700 | 1.2134 | 0.5739 |
| 0.734 | 45.01 | 2760 | 1.1676 | 0.5304 |
| 0.5427 | 46.01 | 2820 | 1.2179 | 0.4783 |
| 0.7171 | 47.01 | 2880 | 1.2749 | 0.5304 |
| 0.6977 | 48.01 | 2940 | 1.3707 | 0.5304 |
| 0.6911 | 49.01 | 3000 | 1.2520 | 0.5478 |
| 0.6166 | 50.01 | 3060 | 1.3687 | 0.5304 |
| 0.4025 | 51.01 | 3120 | 1.4041 | 0.5652 |
| 0.6147 | 52.01 | 3180 | 1.3030 | 0.6435 |
| 0.5787 | 53.01 | 3240 | 1.4109 | 0.5913 |
| 0.7157 | 54.01 | 3300 | 1.3183 | 0.6 |
| 0.3391 | 55.01 | 3360 | 1.4333 | 0.5913 |
| 0.7482 | 56.01 | 3420 | 1.4549 | 0.5826 |
| 0.5182 | 57.01 | 3480 | 1.4193 | 0.5652 |
| 0.7383 | 58.01 | 3540 | 1.4043 | 0.5565 |
| 0.8862 | 59.01 | 3600 | 1.4041 | 0.6 |
| 0.3481 | 60.01 | 3660 | 1.3164 | 0.6435 |
| 0.763 | 61.01 | 3720 | 1.2947 | 0.5913 |
| 0.7397 | 62.01 | 3780 | 1.2785 | 0.6696 |
| 0.514 | 63.01 | 3840 | 1.3180 | 0.6522 |
| 0.6582 | 64.01 | 3900 | 1.3520 | 0.6696 |
| 0.3929 | 65.01 | 3960 | 1.3391 | 0.6609 |
| 0.7623 | 66.01 | 4020 | 1.4349 | 0.6348 |
| 0.6235 | 67.01 | 4080 | 1.2897 | 0.6522 |
| 0.449 | 68.01 | 4140 | 1.3150 | 0.6696 |
| 0.639 | 69.01 | 4200 | 1.4241 | 0.6087 |
| 0.473 | 70.01 | 4260 | 1.2578 | 0.6609 |
| 0.5478 | 71.01 | 4320 | 1.2770 | 0.6522 |
| 0.4732 | 72.01 | 4380 | 1.2893 | 0.6783 |
| 0.5489 | 73.01 | 4440 | 1.2195 | 0.7043 |
| 0.3907 | 74.01 | 4500 | 1.2523 | 0.6957 |
| 0.2572 | 75.01 | 4560 | 1.2149 | 0.7043 |
| 0.5022 | 76.01 | 4620 | 1.2934 | 0.6696 |
| 0.2958 | 77.01 | 4680 | 1.2726 | 0.6783 |
| 0.7009 | 78.01 | 4740 | 1.2779 | 0.6957 |
| 0.49 | 79.01 | 4800 | 1.2748 | 0.6870 |
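
The per-step accuracy values above are typically produced by a `compute_metrics` callback passed to the Trainer; the sketch below, using the `evaluate` library, is an assumption, since the card does not include the training code.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # The Trainer passes (logits, labels); take the argmax over the class logits.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```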

### Framework versions

- Transformers 4.29.1
- PyTorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3