
mdeberta-v3-base_ordinal_5_seed420_EN-NL

This model is a fine-tuned version of microsoft/mdeberta-v3-base on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

  • Loss: 2.3573
  • MSE: 2.7856
  • RMSE: 1.6690
  • MAE: 0.9730
  • R2: 0.2132
  • F1: 0.7591
  • Precision: 0.7626
  • Recall: 0.7648
  • Accuracy: 0.7648
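
Since the card does not document the classification head, the following is a minimal loading sketch, assuming a standard sequence-classification head over ordinal labels; the argmax decoding is an assumption, not something this card confirms.

```python
# Minimal usage sketch. The head type (AutoModelForSequenceClassification)
# and argmax decoding over ordinal labels are assumptions, since the card
# does not document the evaluation setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "Amala3/mdeberta-v3-base_ordinal_5_seed420_EN-NL"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("Example sentence to score.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Under the ordinal-label assumption, the argmax gives the predicted rank.
predicted_rank = logits.argmax(dim=-1).item()
print(predicted_rank)
```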

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • num_epochs: 10
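
These settings map directly onto the Transformers TrainingArguments API. Below is a hypothetical reconstruction; the output_dir name, the per-device interpretation of the batch sizes, and anything not in the list above are assumptions.

```python
# Hypothetical reconstruction of the Trainer configuration from the
# hyperparameter list above; settings not listed there are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mdeberta-v3-base_ordinal_5_seed420_EN-NL",  # assumed name
    learning_rate=5e-6,
    per_device_train_batch_size=16,  # card reports train_batch_size: 16
    per_device_eval_batch_size=16,   # card reports eval_batch_size: 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=10,
    # Adam betas and epsilon match the optimizer line above (also the
    # Trainer defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```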

Training results

| Training Loss | Epoch  | Step | Validation Loss | MSE    | RMSE   | MAE    | R2      | F1     | Precision | Recall | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:-------:|:------:|:---------:|:------:|:--------:|
| 3.5638        | 0.2141 | 100  | 3.5430          | 3.6731 | 1.9165 | 1.8782 | -0.0652 | 0.4761 | 0.3859    | 0.6212 | 0.6212   |
| 3.2282        | 0.4283 | 200  | 3.1915          | 3.7250 | 1.9300 | 1.6743 | -0.0802 | 0.4761 | 0.3859    | 0.6212 | 0.6212   |
| 3.0493        | 0.6424 | 300  | 3.0534          | 3.1846 | 1.7845 | 1.6116 | 0.0765  | 0.4761 | 0.3859    | 0.6212 | 0.6212   |
| 2.9037        | 0.8565 | 400  | 2.9377          | 3.3112 | 1.8197 | 1.4246 | 0.0398  | 0.4761 | 0.3859    | 0.6212 | 0.6212   |
| 2.7728        | 1.0707 | 500  | 2.7732          | 3.0639 | 1.7504 | 1.3655 | 0.1115  | 0.6107 | 0.6498    | 0.6586 | 0.6586   |
| 2.6147        | 1.2848 | 600  | 2.6980          | 3.0434 | 1.7445 | 1.3281 | 0.1174  | 0.6937 | 0.6983    | 0.6912 | 0.6912   |
| 2.5372        | 1.4989 | 700  | 2.5946          | 3.0024 | 1.7327 | 1.2220 | 0.1293  | 0.6905 | 0.6908    | 0.6984 | 0.6984   |
| 2.4555        | 1.7131 | 800  | 2.4696          | 2.7563 | 1.6602 | 1.1809 | 0.2007  | 0.7178 | 0.7173    | 0.7226 | 0.7226   |
| 2.4496        | 1.9272 | 900  | 2.4942          | 2.6828 | 1.6379 | 1.2232 | 0.2220  | 0.7327 | 0.7355    | 0.7310 | 0.7310   |
| 2.298         | 2.1413 | 1000 | 2.5051          | 3.0072 | 1.7341 | 1.1086 | 0.1279  | 0.7176 | 0.7205    | 0.7262 | 0.7262   |
| 2.2482        | 2.3555 | 1100 | 2.4543          | 2.8938 | 1.7011 | 1.0748 | 0.1608  | 0.7215 | 0.7344    | 0.7358 | 0.7358   |
| 2.0678        | 2.5696 | 1200 | 2.3826          | 2.8914 | 1.7004 | 1.0338 | 0.1615  | 0.7434 | 0.7460    | 0.7419 | 0.7419   |
| 2.0865        | 2.7837 | 1300 | 2.3957          | 2.8504 | 1.6883 | 1.0145 | 0.1734  | 0.7383 | 0.7397    | 0.7443 | 0.7443   |
| 2.1771        | 2.9979 | 1400 | 2.3659          | 2.7370 | 1.6544 | 1.0579 | 0.2063  | 0.7438 | 0.7434    | 0.7443 | 0.7443   |
| 2.0164        | 3.2120 | 1500 | 2.3783          | 2.9071 | 1.7050 | 1.0398 | 0.1570  | 0.7462 | 0.7497    | 0.7443 | 0.7443   |
| 1.9577        | 3.4261 | 1600 | 2.4072          | 2.8902 | 1.7001 | 1.0543 | 0.1619  | 0.7471 | 0.7549    | 0.7443 | 0.7443   |
| 1.8874        | 3.6403 | 1700 | 2.3547          | 2.7913 | 1.6707 | 0.9795 | 0.1905  | 0.7523 | 0.7537    | 0.7575 | 0.7575   |
| 1.8746        | 3.8544 | 1800 | 2.3244          | 2.6767 | 1.6361 | 1.0024 | 0.2238  | 0.7507 | 0.7499    | 0.7527 | 0.7527   |
| 1.9356        | 4.0685 | 1900 | 2.3361          | 2.8166 | 1.6783 | 1.0217 | 0.1832  | 0.7568 | 0.7573    | 0.7563 | 0.7563   |
| 1.7507        | 4.2827 | 2000 | 2.3419          | 2.7575 | 1.6606 | 0.9867 | 0.2003  | 0.7506 | 0.7512    | 0.7551 | 0.7551   |
| 1.7485        | 4.4968 | 2100 | 2.3295          | 2.8323 | 1.6830 | 1.0109 | 0.1786  | 0.7573 | 0.7587    | 0.7563 | 0.7563   |
| 1.7336        | 4.7109 | 2200 | 2.3332          | 2.7262 | 1.6511 | 0.9578 | 0.2094  | 0.7565 | 0.7590    | 0.7624 | 0.7624   |
| 1.739         | 4.9251 | 2300 | 2.4113          | 2.9928 | 1.7300 | 1.0314 | 0.1321  | 0.7415 | 0.7454    | 0.7394 | 0.7394   |
| 1.6066        | 5.1392 | 2400 | 2.4276          | 3.0145 | 1.7362 | 1.0265 | 0.1258  | 0.7613 | 0.7686    | 0.7587 | 0.7587   |
| 1.6271        | 5.3533 | 2500 | 2.3737          | 2.9373 | 1.7138 | 0.9783 | 0.1482  | 0.7597 | 0.7595    | 0.7600 | 0.7600   |
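
The table mixes regression-style and classification-style metrics over the same predictions. The actual evaluation code is not published with this card; the following is a plausible compute_metrics sketch assuming argmax decoding and weighted averaging (note that weighted recall equals accuracy, matching the identical Recall and Accuracy columns above).

```python
# Hypothetical compute_metrics producing the columns reported above.
# Argmax decoding and weighted averaging are assumptions.
import numpy as np
from sklearn.metrics import (
    accuracy_score, f1_score, mean_absolute_error,
    mean_squared_error, precision_score, r2_score, recall_score,
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    mse = mean_squared_error(labels, preds)
    return {
        "mse": mse,
        "rmse": float(np.sqrt(mse)),
        "mae": mean_absolute_error(labels, preds),
        "r2": r2_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),
        "precision": precision_score(labels, preds, average="weighted"),
        "recall": recall_score(labels, preds, average="weighted"),
        "accuracy": accuracy_score(labels, preds),
    }
```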

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.1.2
  • Datasets 2.18.0
  • Tokenizers 0.19.1
Model size

  • 279M parameters (F32, Safetensors)