berel_finetuned_on_shuffled_HB_3_epochs_general

This model is a fine-tuned version of dicta-il/BEREL. It achieves the following results on the evaluation set:

  • Loss: 2.2321

Model description

More information needed

Intended uses & limitations

More information needed
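Since BEREL is a BERT-style masked language model, the fine-tuned checkpoint can presumably be used for fill-mask inference. A minimal sketch (not an official usage recipe); the repo id is taken from the model tree below, and downloading the weights requires network access to the Hugging Face Hub:

```python
# Repo id as listed in the model tree on this card.
MODEL_ID = "martijn75/berel_finetuned_on_shuffled_HB_3_epochs_general"

def load_fill_mask(model_id: str = MODEL_ID):
    """Build a fill-mask pipeline for this checkpoint (downloads weights on first use)."""
    from transformers import pipeline  # Transformers 4.47.1 per this card
    return pipeline("fill-mask", model=model_id)

# Example (network required; input should be Hebrew text with a [MASK] token):
#   fill_mask = load_fill_mask()
#   predictions = fill_mask("... [MASK] ...")
```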

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 3
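For reference, a linear scheduler decays the learning rate from 1e-4 toward 0 over the course of training. A small sketch of that decay, assuming no warmup (the card lists none); the total step count of ~26,400 is an illustrative estimate read off the 3-epoch training log below:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 1e-4) -> float:
    """Linearly decay the learning rate from base_lr to 0 over total_steps (no warmup assumed)."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps

# linear_lr(0, 26400)     -> 1e-4 (start of training)
# linear_lr(13200, 26400) -> 5e-5 (halfway)
# linear_lr(26400, 26400) -> 0.0  (end of training)
```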

Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 3.3021        | 0.0568 | 500   | 2.9186          |
| 3.372         | 0.1136 | 1000  | 2.9741          |
| 3.2939        | 0.1703 | 1500  | 2.9741          |
| 3.3757        | 0.2271 | 2000  | 2.9197          |
| 3.2192        | 0.2839 | 2500  | 2.9396          |
| 3.2966        | 0.3407 | 3000  | 2.9476          |
| 3.2634        | 0.3975 | 3500  | 2.8903          |
| 3.1503        | 0.4542 | 4000  | 2.8886          |
| 3.2281        | 0.5110 | 4500  | nan             |
| 3.2165        | 0.5678 | 5000  | nan             |
| 3.1273        | 0.6246 | 5500  | 2.8244          |
| 3.1743        | 0.6814 | 6000  | 2.8228          |
| 3.054         | 0.7381 | 6500  | nan             |
| 3.0531        | 0.7949 | 7000  | 2.8213          |
| 3.0738        | 0.8517 | 7500  | 2.7571          |
| 3.0572        | 0.9085 | 8000  | 2.7764          |
| 3.0247        | 0.9653 | 8500  | 2.8006          |
| 2.9202        | 1.0220 | 9000  | 2.7288          |
| 2.8448        | 1.0788 | 9500  | 2.7300          |
| 2.866         | 1.1356 | 10000 | 2.6930          |
| 2.7476        | 1.1924 | 10500 | 2.6738          |
| 2.7801        | 1.2491 | 11000 | 2.6572          |
| 2.8685        | 1.3059 | 11500 | 2.6302          |
| 2.7265        | 1.3627 | 12000 | 2.6264          |
| 2.8173        | 1.4195 | 12500 | 2.6205          |
| 2.7561        | 1.4763 | 13000 | 2.6139          |
| 2.717         | 1.5330 | 13500 | 2.5742          |
| 2.5455        | 1.5898 | 14000 | 2.5571          |
| 2.7207        | 1.6466 | 14500 | 2.5352          |
| 2.7243        | 1.7034 | 15000 | 2.5022          |
| 2.5593        | 1.7602 | 15500 | 2.4617          |
| 2.5833        | 1.8169 | 16000 | 2.4913          |
| 2.5604        | 1.8737 | 16500 | nan             |
| 2.6395        | 1.9305 | 17000 | 2.4648          |
| 2.5545        | 1.9873 | 17500 | 2.4603          |
| 2.5088        | 2.0441 | 18000 | 2.4281          |
| 2.4837        | 2.1008 | 18500 | 2.4210          |
| 2.3461        | 2.1576 | 19000 | 2.4174          |
| 2.3152        | 2.2144 | 19500 | 2.3661          |
| 2.4186        | 2.2712 | 20000 | 2.3611          |
| 2.416         | 2.3280 | 20500 | 2.3640          |
| 2.3894        | 2.3847 | 21000 | 2.3449          |
| 2.3077        | 2.4415 | 21500 | 2.3038          |
| 2.3199        | 2.4983 | 22000 | 2.3256          |
| 2.2644        | 2.5551 | 22500 | 2.2911          |
| 2.2888        | 2.6119 | 23000 | 2.2873          |
| 2.3648        | 2.6686 | 23500 | 2.2728          |
| 2.3545        | 2.7254 | 24000 | 2.2645          |
| 2.2642        | 2.7822 | 24500 | 2.2473          |
| 2.2299        | 2.8390 | 25000 | nan             |
| 2.2673        | 2.8958 | 25500 | 2.2445          |
| 2.2081        | 2.9525 | 26000 | 2.2321          |

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu118
  • Datasets 3.2.0
  • Tokenizers 0.21.0

Model size

  • 184M params (Safetensors, tensor type F32)
Model tree for martijn75/berel_finetuned_on_shuffled_HB_3_epochs_general

  • Base model: dicta-il/BEREL