lyleokoth committed
Commit 8c0c078 · verified · 1 Parent(s): 34b69d8

lyleokoth/paligemma-code-extraction-v2
Files changed (1):
  1. README.md +28 -2
README.md CHANGED
@@ -17,6 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
  # code-extraction
  
  This model is a fine-tuned version of [google/paligemma-3b-pt-224](https://huggingface.co/google/paligemma-3b-pt-224) on the imagefolder dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.2910
  
  ## Model description
  
@@ -37,12 +39,36 @@ More information needed
  The following hyperparameters were used during training:
  - learning_rate: 2e-05
  - train_batch_size: 1
- - eval_batch_size: 8
+ - eval_batch_size: 1
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 2
- - num_epochs: 3
+ - num_epochs: 2
+ 
+ ### Training results
+ 
+ | Training Loss | Epoch  | Step | Validation Loss |
+ |:-------------:|:------:|:----:|:---------------:|
+ | 1.1102        | 0.1064 | 10   | 0.9836          |
+ | 0.9563        | 0.2128 | 20   | 0.8361          |
+ | 0.8725        | 0.3191 | 30   | 0.7021          |
+ | 0.8441        | 0.4255 | 40   | 0.5871          |
+ | 0.6958        | 0.5319 | 50   | 0.5101          |
+ | 0.6931        | 0.6383 | 60   | 0.4598          |
+ | 0.5352        | 0.7447 | 70   | 0.4224          |
+ | 0.4966        | 0.8511 | 80   | 0.3931          |
+ | 0.6237        | 0.9574 | 90   | 0.3646          |
+ | 0.4289        | 1.0638 | 100  | 0.3423          |
+ | 0.5224        | 1.1702 | 110  | 0.3226          |
+ | 0.5532        | 1.2766 | 120  | 0.3140          |
+ | 0.3561        | 1.3830 | 130  | 0.3053          |
+ | 0.3985        | 1.4894 | 140  | 0.3027          |
+ | 0.39          | 1.5957 | 150  | 0.2992          |
+ | 0.3741        | 1.7021 | 160  | 0.2943          |
+ | 0.2028        | 1.8085 | 170  | 0.2898          |
+ | 0.3935        | 1.9149 | 180  | 0.2910          |
+ 
  
  ### Framework versions
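The learning-rate schedule implied by the hyperparameters above (`lr_scheduler_type: linear`, `lr_scheduler_warmup_steps: 2`, `learning_rate: 2e-05`) can be sketched in plain Python, mirroring the shape of transformers' `get_linear_schedule_with_warmup`. Note that `total_steps=188` is an assumption inferred from the training-results table (step 180 falls at epoch ≈1.91, i.e. ~94 optimizer steps per epoch over 2 epochs), not a logged value.

```python
def linear_warmup_lr(step, base_lr=2e-05, warmup_steps=2, total_steps=188):
    """Learning rate at a given optimizer step for a linear
    warmup-then-linear-decay schedule (lr_scheduler_type: linear).

    total_steps=188 is an assumption inferred from the training table
    (~94 steps/epoch * 2 epochs), not a value logged in this card.
    """
    if step < warmup_steps:
        # Ramp linearly from 0 up to base_lr over the warmup steps.
        return base_lr * step / max(1, warmup_steps)
    # Then decay linearly from base_lr down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

With only 2 warmup steps, the run is almost entirely in the decay phase, so the effective learning rate at the midpoint (step ≈95) is roughly half the peak value.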