# ft-ja4-2.25sec
This model is a fine-tuned version of [pyannote/segmentation-3.0](https://huggingface.co/pyannote/segmentation-3.0) on the objects76/synthetic-ja4-speaker-overlap-6400 dataset. It achieves the following results on the evaluation set (a hedged usage sketch follows the metrics):

- Loss: 0.3024
- DER: 0.0934
- False Alarm: 0.0422
- Missed Detection: 0.0401
- Confusion: 0.0111
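
The card does not include usage instructions; the following is a minimal sketch of how such a fine-tuned segmentation checkpoint could be plugged into a diarization pipeline, assuming it was trained with the Hugging Face `diarizers` library (which the Transformers-based training setup suggests) and published as `objects76/ft-ja4-2.25sec-250513_1443`. The pipeline name, access token, and audio path are placeholders.

```python
# Minimal sketch (assumptions noted inline): load the fine-tuned segmentation
# checkpoint, convert it back to pyannote format, and swap it into a pretrained
# speaker-diarization pipeline.
import torch
from diarizers import SegmentationModel      # assumes the checkpoint was produced with `diarizers`
from pyannote.audio import Pipeline

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load the fine-tuned weights and convert them to a pyannote segmentation model.
segmentation = SegmentationModel.from_pretrained("objects76/ft-ja4-2.25sec-250513_1443")
segmentation = segmentation.to_pyannote_model()

# Reuse the off-the-shelf diarization pipeline, replacing only its segmentation model.
pipeline = Pipeline.from_pretrained(
    "pyannote/speaker-diarization-3.1",
    use_auth_token="HF_TOKEN",  # gated pipeline: a Hugging Face access token is required
)
pipeline._segmentation.model = segmentation
pipeline.to(device)

# Run diarization on an audio file (placeholder path).
diarization = pipeline("audio.wav")
for turn, _, speaker in diarization.itertracks(yield_label=True):
    print(f"{turn.start:.2f}s - {turn.end:.2f}s: {speaker}")
```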
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a rough `TrainingArguments` reconstruction is sketched after the list):
- learning_rate: 0.001
- train_batch_size: 2048
- eval_batch_size: 2048
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 200
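
A rough reconstruction of how these settings map onto `transformers.TrainingArguments` is shown below; the `output_dir` and evaluation strategy are assumptions not recorded in this card, and `train_batch_size` is taken to be the per-device batch size.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the listed hyperparameters; only the numeric
# values come from the card, the rest are assumptions.
training_args = TrainingArguments(
    output_dir="ft-ja4-2.25sec",      # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=2048,
    per_device_eval_batch_size=2048,
    seed=42,
    optim="adamw_torch",              # AdamW (torch) with the default betas/epsilon listed above
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=200,
    eval_strategy="epoch",            # assumed: the results table reports per-epoch eval metrics
)
```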
### Training results
| Training Loss | Epoch | Step | Validation Loss | DER | False Alarm | Missed Detection | Confusion |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-----------:|:----------------:|:---------:|
| No log | 1.0 | 7 | 0.9642 | 0.2939 | 0.1347 | 0.0982 | 0.0611 |
| No log | 2.0 | 14 | 0.8482 | 0.2589 | 0.0527 | 0.1291 | 0.0771 |
| No log | 3.0 | 21 | 0.8020 | 0.2499 | 0.0303 | 0.1358 | 0.0839 |
| 0.8969 | 4.0 | 28 | 0.7508 | 0.2162 | 0.0686 | 0.0813 | 0.0664 |
| 0.8969 | 5.0 | 35 | 0.6863 | 0.2029 | 0.0680 | 0.0695 | 0.0654 |
| 0.8969 | 6.0 | 42 | 0.6313 | 0.1919 | 0.0725 | 0.0580 | 0.0615 |
| 0.8969 | 7.0 | 49 | 0.5859 | 0.1808 | 0.0752 | 0.0522 | 0.0535 |
| 0.6817 | 8.0 | 56 | 0.5505 | 0.1698 | 0.0708 | 0.0522 | 0.0468 |
| 0.6817 | 9.0 | 63 | 0.5233 | 0.1619 | 0.0694 | 0.0510 | 0.0415 |
| 0.6817 | 10.0 | 70 | 0.4992 | 0.1550 | 0.0688 | 0.0496 | 0.0366 |
| 0.5316 | 11.0 | 77 | 0.4725 | 0.1459 | 0.0634 | 0.0509 | 0.0316 |
| 0.5316 | 12.0 | 84 | 0.4490 | 0.1389 | 0.0608 | 0.0510 | 0.0271 |
| 0.5316 | 13.0 | 91 | 0.4286 | 0.1322 | 0.0556 | 0.0517 | 0.0250 |
| 0.5316 | 14.0 | 98 | 0.4123 | 0.1263 | 0.0544 | 0.0499 | 0.0220 |
| 0.4403 | 15.0 | 105 | 0.3977 | 0.1213 | 0.0523 | 0.0487 | 0.0203 |
| 0.4403 | 16.0 | 112 | 0.3904 | 0.1190 | 0.0536 | 0.0462 | 0.0191 |
| 0.4403 | 17.0 | 119 | 0.3784 | 0.1161 | 0.0484 | 0.0489 | 0.0188 |
| 0.3873 | 18.0 | 126 | 0.3725 | 0.1144 | 0.0520 | 0.0453 | 0.0171 |
| 0.3873 | 19.0 | 133 | 0.3659 | 0.1128 | 0.0458 | 0.0487 | 0.0183 |
| 0.3873 | 20.0 | 140 | 0.3636 | 0.1129 | 0.0516 | 0.0437 | 0.0177 |
| 0.3873 | 21.0 | 147 | 0.3569 | 0.1099 | 0.0470 | 0.0458 | 0.0171 |
| 0.3577 | 22.0 | 154 | 0.3541 | 0.1097 | 0.0451 | 0.0468 | 0.0178 |
| 0.3577 | 23.0 | 161 | 0.3487 | 0.1076 | 0.0495 | 0.0417 | 0.0164 |
| 0.3577 | 24.0 | 168 | 0.3436 | 0.1066 | 0.0459 | 0.0443 | 0.0165 |
| 0.3417 | 25.0 | 175 | 0.3385 | 0.1059 | 0.0446 | 0.0447 | 0.0166 |
| 0.3417 | 26.0 | 182 | 0.3379 | 0.1052 | 0.0487 | 0.0411 | 0.0155 |
| 0.3417 | 27.0 | 189 | 0.3352 | 0.1043 | 0.0443 | 0.0445 | 0.0155 |
| 0.3417 | 28.0 | 196 | 0.3352 | 0.1050 | 0.0464 | 0.0429 | 0.0157 |
| 0.3257 | 29.0 | 203 | 0.3327 | 0.1040 | 0.0474 | 0.0415 | 0.0151 |
| 0.3257 | 30.0 | 210 | 0.3270 | 0.1014 | 0.0419 | 0.0445 | 0.0150 |
| 0.3257 | 31.0 | 217 | 0.3240 | 0.1000 | 0.0451 | 0.0417 | 0.0133 |
| 0.3257 | 32.0 | 224 | 0.3204 | 0.0989 | 0.0425 | 0.0431 | 0.0133 |
| 0.3111 | 33.0 | 231 | 0.3191 | 0.0995 | 0.0439 | 0.0424 | 0.0133 |
| 0.3111 | 34.0 | 238 | 0.3159 | 0.0989 | 0.0431 | 0.0424 | 0.0134 |
| 0.3111 | 35.0 | 245 | 0.3153 | 0.0980 | 0.0433 | 0.0417 | 0.0130 |
| 0.3077 | 36.0 | 252 | 0.3133 | 0.0967 | 0.0409 | 0.0429 | 0.0128 |
| 0.3077 | 37.0 | 259 | 0.3145 | 0.0975 | 0.0418 | 0.0430 | 0.0127 |
| 0.3077 | 38.0 | 266 | 0.3164 | 0.0981 | 0.0417 | 0.0436 | 0.0128 |
| 0.3077 | 39.0 | 273 | 0.3145 | 0.0978 | 0.0402 | 0.0446 | 0.0130 |
| 0.3047 | 40.0 | 280 | 0.3138 | 0.0974 | 0.0441 | 0.0406 | 0.0128 |
| 0.3047 | 41.0 | 287 | 0.3182 | 0.0979 | 0.0398 | 0.0450 | 0.0131 |
| 0.3047 | 42.0 | 294 | 0.3117 | 0.0955 | 0.0393 | 0.0434 | 0.0128 |
| 0.3023 | 43.0 | 301 | 0.3078 | 0.0953 | 0.0408 | 0.0421 | 0.0124 |
| 0.3023 | 44.0 | 308 | 0.3103 | 0.0969 | 0.0409 | 0.0435 | 0.0126 |
| 0.3023 | 45.0 | 315 | 0.3067 | 0.0949 | 0.0379 | 0.0447 | 0.0124 |
| 0.3023 | 46.0 | 322 | 0.3076 | 0.0950 | 0.0434 | 0.0395 | 0.0121 |
| 0.295 | 47.0 | 329 | 0.3035 | 0.0937 | 0.0384 | 0.0430 | 0.0123 |
| 0.295 | 48.0 | 336 | 0.3054 | 0.0954 | 0.0411 | 0.0419 | 0.0124 |
| 0.295 | 49.0 | 343 | 0.3038 | 0.0937 | 0.0391 | 0.0428 | 0.0118 |
| 0.2845 | 50.0 | 350 | 0.3048 | 0.0942 | 0.0409 | 0.0415 | 0.0118 |
| 0.2845 | 51.0 | 357 | 0.3039 | 0.0942 | 0.0388 | 0.0435 | 0.0119 |
| 0.2845 | 52.0 | 364 | 0.3029 | 0.0935 | 0.0403 | 0.0416 | 0.0116 |
| 0.2845 | 53.0 | 371 | 0.2978 | 0.0922 | 0.0389 | 0.0418 | 0.0115 |
| 0.2796 | 54.0 | 378 | 0.2995 | 0.0933 | 0.0422 | 0.0399 | 0.0111 |
| 0.2796 | 55.0 | 385 | 0.2984 | 0.0922 | 0.0358 | 0.0448 | 0.0115 |
| 0.2796 | 56.0 | 392 | 0.3003 | 0.0933 | 0.0423 | 0.0390 | 0.0120 |
| 0.2796 | 57.0 | 399 | 0.3000 | 0.0933 | 0.0393 | 0.0417 | 0.0123 |
| 0.2768 | 58.0 | 406 | 0.3037 | 0.0942 | 0.0428 | 0.0395 | 0.0120 |
| 0.2768 | 59.0 | 413 | 0.2987 | 0.0921 | 0.0366 | 0.0438 | 0.0117 |
| 0.2768 | 60.0 | 420 | 0.3036 | 0.0946 | 0.0472 | 0.0362 | 0.0113 |
| 0.2774 | 61.0 | 427 | 0.3010 | 0.0929 | 0.0384 | 0.0431 | 0.0114 |
| 0.2774 | 62.0 | 434 | 0.3014 | 0.0924 | 0.0401 | 0.0409 | 0.0114 |
| 0.2774 | 63.0 | 441 | 0.3024 | 0.0934 | 0.0422 | 0.0401 | 0.0111 |
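
For reference, DER is the sum of the three component error rates (false alarm, missed detection, and speaker confusion), so the final-epoch row reproduces the summary metrics at the top of this card; the quick sanity check below illustrates the decomposition.

```python
# DER decomposes into false alarm + missed detection + speaker confusion.
# Values are taken from the final-epoch row above (all expressed as rates).
false_alarm = 0.0422
missed_detection = 0.0401
confusion = 0.0111

der = false_alarm + missed_detection + confusion
print(f"DER = {der:.4f}")        # 0.0934, matching the reported DER
assert abs(der - 0.0934) < 5e-4
```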
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
## Model tree for objects76/ft-ja4-2.25sec-250513_1443

- Base model: [pyannote/segmentation-3.0](https://huggingface.co/pyannote/segmentation-3.0)