---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - precision
  - f1
model-index:
  - name: wav2vec2_ASV_deepfake_audio_detection
    results: []
---

# wav2vec2_ASV_deepfake_audio_detection

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unknown dataset. It achieves the following results on the evaluation set (a minimal inference and sanity-check sketch follows the list):

- Loss: 0.5628
- Accuracy: 0.8999
- Precision: 0.9057
- F1: 0.8612
- Tp: 181
- Tn: 16068
- Fn: 1800
- Fp: 8
- Auc Roc: 0.9372
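
For a quick smoke test, the checkpoint can be loaded with the standard `audio-classification` pipeline. This is a minimal sketch, not code from the training run: the Hub id is assumed from this card's title (adjust it to the actual repo path), and `sample.wav` is a placeholder file. The final lines re-derive the headline accuracy from the Tp/Tn/Fn/Fp counts above.

```python
from transformers import pipeline

# Hub id assumed from this card's title; adjust if the repo lives elsewhere.
MODEL_ID = "Bisher/wav2vec2_ASV_deepfake_audio_detection"

# wav2vec2-base expects 16 kHz mono audio; the pipeline resamples file inputs.
classifier = pipeline("audio-classification", model=MODEL_ID)

# "sample.wav" is a placeholder path to a local audio file.
print(classifier("sample.wav"))

# Sanity check: accuracy recomputed from the reported confusion-matrix counts.
tp, tn, fn, fp = 181, 16068, 1800, 8
print((tp + tn) / (tp + tn + fn + fp))  # ~0.8999, matching the card
```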

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (mirrored in the `TrainingArguments` sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 100
- eval_batch_size: 100
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 400
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
- mixed_precision_training: Native AMP
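
For reference, these settings map one-to-one onto `transformers.TrainingArguments`. A minimal sketch, assuming single-GPU training (so `per_device_train_batch_size=100` times `gradient_accumulation_steps=4` gives the total train batch size of 400) and Native AMP via `fp16=True`; the output directory is a placeholder, and the Adam betas/epsilon listed above are the optimizer defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2_ASV_deepfake_audio_detection",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=100,  # x4 accumulation steps = 400 effective
    per_device_eval_batch_size=100,
    gradient_accumulation_steps=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
    fp16=True,  # Native AMP mixed-precision training
)
```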

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy | Precision | F1     | Tp  | Tn    | Fn   | Fp  | Auc Roc |
|:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:---:|:-----:|:----:|:---:|:-------:|
| 0.693         | 0.1143 | 10   | 0.6628          | 0.8854   | 0.8117    | 0.8385 | 23  | 15964 | 1958 | 112 | 0.5001  |
| 0.6589        | 0.2286 | 20   | 0.4915          | 0.8903   | 0.7926    | 0.8386 | 0   | 16076 | 1981 | 0   | 0.5030  |
| 0.5546        | 0.3429 | 30   | 0.3825          | 0.8865   | 0.8231    | 0.8406 | 39  | 15969 | 1942 | 107 | 0.5748  |
| 0.3566        | 0.4571 | 40   | 0.3403          | 0.8909   | 0.8620    | 0.8419 | 28  | 16059 | 1953 | 17  | 0.6201  |
| 0.2115        | 0.5714 | 50   | 0.3617          | 0.8923   | 0.8908    | 0.8442 | 43  | 16070 | 1938 | 6   | 0.7028  |
| 0.1636        | 0.6857 | 60   | 0.3428          | 0.8958   | 0.8756    | 0.8586 | 182 | 15993 | 1799 | 83  | 0.7968  |
| 0.1415        | 0.8    | 70   | 0.3899          | 0.8925   | 0.9015    | 0.8440 | 41  | 16075 | 1940 | 1   | 0.6722  |
| 0.11          | 0.9143 | 80   | 0.3756          | 0.8930   | 0.9024    | 0.8452 | 50  | 16075 | 1931 | 1   | 0.7490  |
| 0.1041        | 1.0286 | 90   | 0.3885          | 0.8960   | 0.9006    | 0.8526 | 110 | 16069 | 1871 | 7   | 0.6362  |
| 0.0888        | 1.1429 | 100  | 0.3484          | 0.8995   | 0.8936    | 0.8630 | 207 | 16036 | 1774 | 40  | 0.8231  |
| 0.0669        | 1.2571 | 110  | 0.3386          | 0.9049   | 0.9040    | 0.8734 | 299 | 16041 | 1682 | 35  | 0.8354  |
| 0.0552        | 1.3714 | 120  | 0.4530          | 0.8942   | 0.9055    | 0.8480 | 71  | 16076 | 1910 | 0   | 0.8554  |
| 0.071         | 1.4857 | 130  | 0.4327          | 0.8963   | 0.8937    | 0.8545 | 128 | 16057 | 1853 | 19  | 0.8543  |
| 0.0665        | 1.6    | 140  | 0.4547          | 0.8947   | 0.9045    | 0.8491 | 80  | 16075 | 1901 | 1   | 0.8065  |
| 0.054         | 1.7143 | 150  | 0.3210          | 0.9148   | 0.9064    | 0.8970 | 592 | 15926 | 1389 | 150 | 0.8851  |
| 0.0575        | 1.8286 | 160  | 0.4901          | 0.8934   | 0.9012    | 0.8462 | 58  | 16074 | 1923 | 2   | 0.7591  |
| 0.0437        | 1.9429 | 170  | 0.4849          | 0.8979   | 0.9036    | 0.8568 | 144 | 16069 | 1837 | 7   | 0.6435  |
| 0.0471        | 2.0571 | 180  | 0.3822          | 0.9071   | 0.9103    | 0.8767 | 324 | 16056 | 1657 | 20  | 0.9277  |
| 0.0377        | 2.1714 | 190  | 0.5301          | 0.8928   | 0.8962    | 0.8450 | 49  | 16072 | 1932 | 4   | 0.9112  |
| 0.0327        | 2.2857 | 200  | 0.5534          | 0.8920   | 0.9036    | 0.8426 | 30  | 16076 | 1951 | 0   | 0.8755  |
| 0.0522        | 2.4    | 210  | 0.2332          | 0.9260   | 0.9192    | 0.9162 | 865 | 15856 | 1116 | 220 | 0.9448  |
| 0.0449        | 2.5143 | 220  | 0.3034          | 0.9102   | 0.9104    | 0.8835 | 397 | 16038 | 1584 | 38  | 0.9453  |
| 0.0338        | 2.6286 | 230  | 0.4001          | 0.9018   | 0.9072    | 0.8654 | 218 | 16066 | 1763 | 10  | 0.9153  |
| 0.0337        | 2.7429 | 240  | 0.4761          | 0.8973   | 0.9056    | 0.8552 | 130 | 16073 | 1851 | 3   | 0.8789  |
| 0.0347        | 2.8571 | 250  | 0.5613          | 0.8921   | 0.9037    | 0.8429 | 32  | 16076 | 1949 | 0   | 0.9068  |
| 0.0301        | 2.9714 | 260  | 0.4896          | 0.8967   | 0.9025    | 0.8540 | 121 | 16070 | 1860 | 6   | 0.9480  |
| 0.0208        | 3.0857 | 270  | 0.5223          | 0.8983   | 0.9053    | 0.8575 | 149 | 16071 | 1832 | 5   | 0.9471  |
| 0.0197        | 3.2    | 280  | 0.5003          | 0.9024   | 0.9068    | 0.8669 | 232 | 16063 | 1749 | 13  | 0.9445  |
| 0.0167        | 3.3143 | 290  | 0.4328          | 0.9087   | 0.9123    | 0.8796 | 351 | 16057 | 1630 | 19  | 0.9561  |
| 0.0235        | 3.4286 | 300  | 0.3612          | 0.9097   | 0.9115    | 0.8821 | 380 | 16047 | 1601 | 29  | 0.9596  |
| 0.0207        | 3.5429 | 310  | 0.3538          | 0.9158   | 0.9169    | 0.8934 | 498 | 16038 | 1483 | 38  | 0.9591  |
| 0.0192        | 3.6571 | 320  | 0.4185          | 0.9145   | 0.9171    | 0.8907 | 465 | 16049 | 1516 | 27  | 0.9404  |
| 0.0176        | 3.7714 | 330  | 0.6594          | 0.8926   | 0.9017    | 0.8443 | 43  | 16075 | 1938 | 1   | 0.8734  |
| 0.0174        | 3.8857 | 340  | 0.5727          | 0.8995   | 0.9073    | 0.8600 | 170 | 16072 | 1811 | 4   | 0.9276  |
| 0.021         | 4.0    | 350  | 0.5943          | 0.8937   | 0.8988    | 0.8471 | 65  | 16072 | 1916 | 4   | 0.9460  |
| 0.02          | 4.1143 | 360  | 0.5183          | 0.8982   | 0.9040    | 0.8574 | 149 | 16069 | 1832 | 7   | 0.9507  |
| 0.015         | 4.2286 | 370  | 0.5329          | 0.8980   | 0.9037    | 0.8570 | 146 | 16069 | 1835 | 7   | 0.9477  |
| 0.0139        | 4.3429 | 380  | 0.5545          | 0.8967   | 0.9017    | 0.8541 | 122 | 16069 | 1859 | 7   | 0.9438  |
| 0.0103        | 4.4571 | 390  | 0.5638          | 0.8969   | 0.9021    | 0.8546 | 126 | 16069 | 1855 | 7   | 0.9403  |
| 0.0099        | 4.5714 | 400  | 0.5094          | 0.9030   | 0.9078    | 0.8679 | 241 | 16064 | 1740 | 12  | 0.9419  |
| 0.0121        | 4.6857 | 410  | 0.5066          | 0.9049   | 0.9099    | 0.8717 | 275 | 16064 | 1706 | 12  | 0.9406  |
| 0.0122        | 4.8    | 420  | 0.5700          | 0.8992   | 0.9047    | 0.8596 | 168 | 16068 | 1813 | 8   | 0.9326  |
| 0.0155        | 4.9143 | 430  | 0.5628          | 0.8999   | 0.9057    | 0.8612 | 181 | 16068 | 1800 | 8   | 0.9372  |
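
The columns in this table can be reproduced from raw predictions with scikit-learn. The sketch below is illustrative, not the trainer's actual `compute_metrics` (which the card does not include); `y_true`, `y_pred`, and `y_score` are placeholder arrays. Weighted averaging appears to be what was used for precision and F1: recomputing from the final row's counts (Tp=181, Tn=16068, Fn=1800, Fp=8) with support-weighted per-class scores gives precision ≈ 0.9057 and F1 ≈ 0.8612, matching the reported values.

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score, confusion_matrix, f1_score, precision_score, roc_auc_score,
)

# Placeholder data: y_true/y_pred are 0/1 labels, y_score is the
# positive-class probability used for AUC-ROC.
y_true = np.array([0, 0, 1, 1, 0, 1])
y_pred = np.array([0, 0, 1, 0, 0, 1])
y_score = np.array([0.1, 0.3, 0.9, 0.4, 0.2, 0.8])

# confusion_matrix for binary labels unpacks as tn, fp, fn, tp.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

metrics = {
    "accuracy": accuracy_score(y_true, y_pred),
    # average="weighted" is an assumption, chosen because it reproduces the
    # card's reported precision/F1 from the final row's Tp/Tn/Fn/Fp counts.
    "precision": precision_score(y_true, y_pred, average="weighted"),
    "f1": f1_score(y_true, y_pred, average="weighted"),
    "tp": tp, "tn": tn, "fn": fn, "fp": fp,
    "auc_roc": roc_auc_score(y_true, y_score),
}
print(metrics)
```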

### Framework versions

- Transformers 4.44.1
- Pytorch 2.2.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1