chemberta-clintox-tunned-1

This model is a fine-tuned version of DeepChem/ChemBERTa-77M-MLM on an unspecified dataset (the model name suggests ClinTox). It achieves the following results on the evaluation set:

  • Loss: 0.2171
  • AUC: 0.7873

Model description

More information needed

Intended uses & limitations

More information needed
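Usage details are not documented in this card. As a hedged sketch, the checkpoint can presumably be loaded for SMILES classification with the Transformers text-classification pipeline; the label names and exact classification head are assumptions, since the card does not state them:

```python
# Hedged sketch: scoring a SMILES string with this checkpoint via the
# Hugging Face text-classification pipeline. Label mapping is not
# documented in the card, so inspect the returned labels yourself.
MODEL_ID = "HassanCS/chemberta-clintox-tunned-1"

def classify_smiles(smiles: str):
    """Return classification scores for a SMILES string."""
    from transformers import pipeline  # Transformers 4.47.1 per this card
    clf = pipeline("text-classification", model=MODEL_ID)
    return clf(smiles)

# Example (downloads the checkpoint, so it needs network access):
# classify_smiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin
```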

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 50
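Assuming the standard Hugging Face Trainer API (the card does not include the training script), the hyperparameters above roughly correspond to a TrainingArguments configuration like this sketch:

```python
# Sketch reconstructing the training setup from the hyperparameters above.
# The output_dir name is an assumption; dataset loading and the AUC metric
# function are not given by the card and are omitted here.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="chemberta-clintox-tunned-1",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",          # betas=(0.9, 0.999), eps=1e-8 are defaults
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```

Passing `optim="adamw_torch"` selects `torch.optim.AdamW`, whose default betas and epsilon match the values listed above.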

Training results

| Training Loss | Epoch | Step | Validation Loss | AUC    |
|---------------|-------|------|-----------------|--------|
| 0.4703        | 1.0   | 74   | 0.4073          | 0.4833 |
| 0.227         | 2.0   | 148  | 0.2215          | 0.5420 |
| 0.2968        | 3.0   | 222  | 0.1996          | 0.6373 |
| 0.2641        | 4.0   | 296  | 0.1979          | 0.6768 |
| 0.2579        | 5.0   | 370  | 0.1940          | 0.6626 |
| 0.1752        | 6.0   | 444  | 0.1894          | 0.6748 |
| 0.339         | 7.0   | 518  | 0.1903          | 0.6879 |
| 0.1903        | 8.0   | 592  | 0.1897          | 0.7082 |
| 0.2189        | 9.0   | 666  | 0.1874          | 0.7285 |
| 0.2125        | 10.0  | 740  | 0.1840          | 0.7568 |
| 0.226         | 11.0  | 814  | 0.1851          | 0.7781 |
| 0.2072        | 12.0  | 888  | 0.1916          | 0.7801 |
| 0.236         | 13.0  | 962  | 0.1900          | 0.7923 |
| 0.188         | 14.0  | 1036 | 0.1897          | 0.7984 |
| 0.2019        | 15.0  | 1110 | 0.1807          | 0.8055 |
| 0.217         | 16.0  | 1184 | 0.1956          | 0.8105 |
| 0.1673        | 17.0  | 1258 | 0.1989          | 0.8176 |
| 0.2159        | 18.0  | 1332 | 0.1784          | 0.8298 |
| 0.1879        | 19.0  | 1406 | 0.1943          | 0.8237 |
| 0.1627        | 20.0  | 1480 | 0.1993          | 0.8227 |
| 0.1617        | 21.0  | 1554 | 0.1808          | 0.8369 |
| 0.1151        | 22.0  | 1628 | 0.1700          | 0.8440 |
| 0.1272        | 23.0  | 1702 | 0.1960          | 0.8430 |
| 0.1553        | 24.0  | 1776 | 0.1914          | 0.8389 |
| 0.1288        | 25.0  | 1850 | 0.2069          | 0.8328 |
| 0.1726        | 26.0  | 1924 | 0.1907          | 0.8419 |
| 0.1407        | 27.0  | 1998 | 0.2305          | 0.8298 |
| 0.1589        | 28.0  | 2072 | 0.2238          | 0.8328 |
| 0.0726        | 29.0  | 2146 | 0.2218          | 0.8369 |
| 0.1347        | 30.0  | 2220 | 0.2073          | 0.8399 |
| 0.1131        | 31.0  | 2294 | 0.2275          | 0.8399 |
| 0.1416        | 32.0  | 2368 | 0.2386          | 0.8389 |
| 0.1882        | 33.0  | 2442 | 0.2342          | 0.8379 |
| 0.2006        | 34.0  | 2516 | 0.2283          | 0.8359 |
| 0.1176        | 35.0  | 2590 | 0.2544          | 0.8308 |
| 0.0942        | 36.0  | 2664 | 0.2247          | 0.8369 |
| 0.1113        | 37.0  | 2738 | 0.2486          | 0.8379 |
| 0.119         | 38.0  | 2812 | 0.2303          | 0.8409 |
| 0.0865        | 39.0  | 2886 | 0.2623          | 0.8379 |
| 0.1399        | 40.0  | 2960 | 0.2463          | 0.8379 |
| 0.1771        | 41.0  | 3034 | 0.2440          | 0.8399 |
| 0.1106        | 42.0  | 3108 | 0.2550          | 0.8369 |
| 0.128         | 43.0  | 3182 | 0.2553          | 0.8359 |
| 0.1155        | 44.0  | 3256 | 0.2500          | 0.8409 |
| 0.1184        | 45.0  | 3330 | 0.2531          | 0.8409 |
| 0.1274        | 46.0  | 3404 | 0.2517          | 0.8409 |
| 0.0638        | 47.0  | 3478 | 0.2561          | 0.8409 |
| 0.1279        | 48.0  | 3552 | 0.2554          | 0.8409 |
| 0.1023        | 49.0  | 3626 | 0.2589          | 0.8399 |
| 0.1278        | 50.0  | 3700 | 0.2594          | 0.8399 |
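Validation loss bottoms out at epoch 22 (0.1700), which also gives the best AUC (0.8440); in later epochs the loss creeps back up while AUC plateaus, suggesting overfitting. A minimal sketch of picking the best checkpoint from such logs, using a few rows from the table above:

```python
# Select the best epoch from (epoch, val_loss, auc) log rows.
# The rows below are a subset of the training-results table above.
rows = [
    (20, 0.1993, 0.8227),
    (21, 0.1808, 0.8369),
    (22, 0.1700, 0.8440),
    (23, 0.1960, 0.8430),
    (50, 0.2594, 0.8399),
]

best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
best_by_auc = max(rows, key=lambda r: r[2])   # highest AUC

print(best_by_loss)  # (22, 0.17, 0.844)
print(best_by_auc)   # (22, 0.17, 0.844)
```

With the Trainer API, setting `load_best_model_at_end=True` together with `metric_for_best_model` automates this selection during training.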

Framework versions

  • Transformers 4.47.1
  • PyTorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0
Model size

  • 3.43M parameters
  • Tensor type: F32 (Safetensors)

Model tree for HassanCS/chemberta-clintox-tunned-1

  • Base model: DeepChem/ChemBERTa-77M-MLM