# synthetic-speaker-all2
This model is a fine-tuned version of pyannote/segmentation-3.0 on the objects76/synthetic-all2 dataset. It achieves the following results on the evaluation set:
- Loss: 0.1979
- DER (diarization error rate): 0.0669
- False Alarm: 0.0186
- Missed Detection: 0.0336
- Confusion: 0.0147
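
These DER components are additive: false alarm + missed detection + confusion = 0.0186 + 0.0336 + 0.0147 = 0.0669. Since the card does not yet document usage, below is a minimal inference sketch, assuming pyannote.audio 3.x and that the checkpoint is loadable under the repository id shown; `audio.wav` is a placeholder path.

```python
# Minimal sketch, assuming pyannote.audio >= 3.x and that the checkpoint
# is hosted under this repository id; "audio.wav" is a placeholder.
from pyannote.audio import Model, Inference

model = Model.from_pretrained("objects76/synthetic-all-2.25sec-250416_1557")

# Sliding-window inference returns per-chunk, per-frame speaker
# activation scores as a pyannote.core.SlidingWindowFeature.
inference = Inference(model)
activations = inference("audio.wav")
print(activations.data.shape)
```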
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reconstruction sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 2048
- eval_batch_size: 2048
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 80
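
These settings map directly onto `transformers.TrainingArguments`. Below is a hedged reconstruction sketch, not the published training script; `output_dir` is a placeholder, the batch sizes assume single-device training, and the AdamW betas/epsilon above are the library defaults, so they need no explicit arguments.

```python
# Reconstruction sketch of the hyperparameters above; output_dir is a
# placeholder, and betas=(0.9, 0.999) / epsilon=1e-08 are AdamW defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="synthetic-speaker-all2",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=2048,  # assumes a single device
    per_device_eval_batch_size=2048,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="cosine",
    num_train_epochs=80,
)
```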
### Training results
Training Loss | Epoch | Step | Validation Loss | DER | False Alarm | Missed Detection | Confusion |
---|---|---|---|---|---|---|---|
No log | 1.0 | 4 | 0.3853 | 0.1303 | 0.0358 | 0.0547 | 0.0397 |
No log | 2.0 | 8 | 0.3610 | 0.1214 | 0.0390 | 0.0497 | 0.0327 |
No log | 3.0 | 12 | 0.3485 | 0.1163 | 0.0350 | 0.0501 | 0.0312 |
No log | 4.0 | 16 | 0.3372 | 0.1129 | 0.0317 | 0.0505 | 0.0307 |
No log | 5.0 | 20 | 0.3276 | 0.1102 | 0.0322 | 0.0479 | 0.0300 |
No log | 6.0 | 24 | 0.3166 | 0.1065 | 0.0315 | 0.0460 | 0.0290 |
0.3199 | 7.0 | 28 | 0.3054 | 0.1020 | 0.0299 | 0.0458 | 0.0263 |
0.3199 | 8.0 | 32 | 0.3013 | 0.1001 | 0.0289 | 0.0451 | 0.0261 |
0.3199 | 9.0 | 36 | 0.3018 | 0.1001 | 0.0273 | 0.0454 | 0.0274 |
0.3199 | 10.0 | 40 | 0.2993 | 0.0984 | 0.0273 | 0.0444 | 0.0268 |
0.3199 | 11.0 | 44 | 0.2946 | 0.0976 | 0.0272 | 0.0434 | 0.0269 |
0.3199 | 12.0 | 48 | 0.2914 | 0.0955 | 0.0273 | 0.0422 | 0.0260 |
0.259 | 13.0 | 52 | 0.2825 | 0.0932 | 0.0259 | 0.0428 | 0.0245 |
0.259 | 14.0 | 56 | 0.2771 | 0.0904 | 0.0235 | 0.0438 | 0.0231 |
0.259 | 15.0 | 60 | 0.2708 | 0.0889 | 0.0243 | 0.0419 | 0.0227 |
0.259 | 16.0 | 64 | 0.2660 | 0.0877 | 0.0262 | 0.0396 | 0.0219 |
0.259 | 17.0 | 68 | 0.2631 | 0.0873 | 0.0258 | 0.0396 | 0.0219 |
0.259 | 18.0 | 72 | 0.2605 | 0.0859 | 0.0221 | 0.0425 | 0.0213 |
0.2314 | 19.0 | 76 | 0.2603 | 0.0856 | 0.0209 | 0.0443 | 0.0204 |
0.2314 | 20.0 | 80 | 0.2593 | 0.0850 | 0.0238 | 0.0402 | 0.0211 |
0.2314 | 21.0 | 84 | 0.2532 | 0.0840 | 0.0232 | 0.0391 | 0.0217 |
0.2314 | 22.0 | 88 | 0.2496 | 0.0835 | 0.0216 | 0.0396 | 0.0224 |
0.2314 | 23.0 | 92 | 0.2434 | 0.0816 | 0.0211 | 0.0396 | 0.0209 |
0.2314 | 24.0 | 96 | 0.2419 | 0.0816 | 0.0213 | 0.0385 | 0.0217 |
0.2087 | 25.0 | 100 | 0.2368 | 0.0801 | 0.0211 | 0.0375 | 0.0216 |
0.2087 | 26.0 | 104 | 0.2315 | 0.0784 | 0.0214 | 0.0362 | 0.0209 |
0.2087 | 27.0 | 108 | 0.2270 | 0.0776 | 0.0209 | 0.0363 | 0.0204 |
0.2087 | 28.0 | 112 | 0.2263 | 0.0772 | 0.0199 | 0.0372 | 0.0201 |
0.2087 | 29.0 | 116 | 0.2229 | 0.0761 | 0.0204 | 0.0362 | 0.0195 |
0.2087 | 30.0 | 120 | 0.2201 | 0.0752 | 0.0206 | 0.0356 | 0.0190 |
0.2087 | 31.0 | 124 | 0.2207 | 0.0751 | 0.0205 | 0.0355 | 0.0191 |
0.194 | 32.0 | 128 | 0.2191 | 0.0743 | 0.0197 | 0.0358 | 0.0189 |
0.194 | 33.0 | 132 | 0.2162 | 0.0738 | 0.0196 | 0.0355 | 0.0186 |
0.194 | 34.0 | 136 | 0.2127 | 0.0729 | 0.0197 | 0.0351 | 0.0182 |
0.194 | 35.0 | 140 | 0.2105 | 0.0725 | 0.0198 | 0.0348 | 0.0179 |
0.194 | 36.0 | 144 | 0.2097 | 0.0722 | 0.0194 | 0.0348 | 0.0181 |
0.194 | 37.0 | 148 | 0.2092 | 0.0720 | 0.0197 | 0.0348 | 0.0175 |
0.1824 | 38.0 | 152 | 0.2094 | 0.0723 | 0.0195 | 0.0350 | 0.0179 |
0.1824 | 39.0 | 156 | 0.2079 | 0.0720 | 0.0192 | 0.0348 | 0.0180 |
0.1824 | 40.0 | 160 | 0.2069 | 0.0715 | 0.0192 | 0.0346 | 0.0177 |
0.1824 | 41.0 | 164 | 0.2053 | 0.0711 | 0.0196 | 0.0343 | 0.0171 |
0.1824 | 42.0 | 168 | 0.2033 | 0.0708 | 0.0187 | 0.0352 | 0.0169 |
0.1824 | 43.0 | 172 | 0.2045 | 0.0702 | 0.0184 | 0.0353 | 0.0164 |
0.1726 | 44.0 | 176 | 0.2052 | 0.0705 | 0.0190 | 0.0349 | 0.0166 |
0.1726 | 45.0 | 180 | 0.2051 | 0.0701 | 0.0191 | 0.0349 | 0.0162 |
0.1726 | 46.0 | 184 | 0.2048 | 0.0697 | 0.0188 | 0.0352 | 0.0156 |
0.1726 | 47.0 | 188 | 0.2044 | 0.0697 | 0.0189 | 0.0353 | 0.0155 |
0.1726 | 48.0 | 192 | 0.2040 | 0.0693 | 0.0190 | 0.0347 | 0.0156 |
0.1726 | 49.0 | 196 | 0.2028 | 0.0691 | 0.0194 | 0.0342 | 0.0154 |
0.1682 | 50.0 | 200 | 0.2026 | 0.0689 | 0.0195 | 0.0340 | 0.0155 |
0.1682 | 51.0 | 204 | 0.2018 | 0.0692 | 0.0193 | 0.0340 | 0.0158 |
0.1682 | 52.0 | 208 | 0.2008 | 0.0691 | 0.0189 | 0.0343 | 0.0160 |
0.1682 | 53.0 | 212 | 0.2004 | 0.0685 | 0.0186 | 0.0343 | 0.0157 |
0.1682 | 54.0 | 216 | 0.2002 | 0.0685 | 0.0188 | 0.0340 | 0.0157 |
0.1682 | 55.0 | 220 | 0.2002 | 0.0683 | 0.0189 | 0.0337 | 0.0156 |
0.1682 | 56.0 | 224 | 0.2004 | 0.0679 | 0.0191 | 0.0333 | 0.0155 |
0.1636 | 57.0 | 228 | 0.1999 | 0.0679 | 0.0190 | 0.0334 | 0.0155 |
0.1636 | 58.0 | 232 | 0.1996 | 0.0677 | 0.0185 | 0.0337 | 0.0155 |
0.1636 | 59.0 | 236 | 0.1993 | 0.0678 | 0.0185 | 0.0338 | 0.0156 |
0.1636 | 60.0 | 240 | 0.1996 | 0.0677 | 0.0186 | 0.0336 | 0.0155 |
0.1636 | 61.0 | 244 | 0.1998 | 0.0679 | 0.0188 | 0.0335 | 0.0156 |
0.1636 | 62.0 | 248 | 0.2006 | 0.0678 | 0.0190 | 0.0334 | 0.0154 |
0.1612 | 63.0 | 252 | 0.1998 | 0.0674 | 0.0189 | 0.0335 | 0.0151 |
0.1612 | 64.0 | 256 | 0.1992 | 0.0670 | 0.0187 | 0.0335 | 0.0147 |
0.1612 | 65.0 | 260 | 0.1990 | 0.0670 | 0.0187 | 0.0336 | 0.0147 |
0.1612 | 66.0 | 264 | 0.1987 | 0.0671 | 0.0187 | 0.0337 | 0.0147 |
0.1612 | 67.0 | 268 | 0.1986 | 0.0671 | 0.0186 | 0.0337 | 0.0147 |
0.1612 | 68.0 | 272 | 0.1984 | 0.0670 | 0.0187 | 0.0336 | 0.0147 |
0.1584 | 69.0 | 276 | 0.1983 | 0.0670 | 0.0187 | 0.0337 | 0.0147 |
0.1584 | 70.0 | 280 | 0.1982 | 0.0670 | 0.0186 | 0.0337 | 0.0147 |
0.1584 | 71.0 | 284 | 0.1981 | 0.0669 | 0.0186 | 0.0336 | 0.0146 |
0.1584 | 72.0 | 288 | 0.1980 | 0.0669 | 0.0187 | 0.0336 | 0.0146 |
0.1584 | 73.0 | 292 | 0.1980 | 0.0669 | 0.0186 | 0.0336 | 0.0146 |
0.1584 | 74.0 | 296 | 0.1979 | 0.0669 | 0.0186 | 0.0337 | 0.0146 |
0.1588 | 75.0 | 300 | 0.1979 | 0.0669 | 0.0186 | 0.0336 | 0.0146 |
0.1588 | 76.0 | 304 | 0.1979 | 0.0669 | 0.0186 | 0.0336 | 0.0146 |
0.1588 | 77.0 | 308 | 0.1979 | 0.0669 | 0.0186 | 0.0336 | 0.0147 |
0.1588 | 78.0 | 312 | 0.1979 | 0.0669 | 0.0186 | 0.0337 | 0.0147 |
0.1588 | 79.0 | 316 | 0.1979 | 0.0669 | 0.0186 | 0.0336 | 0.0147 |
0.1588 | 80.0 | 320 | 0.1979 | 0.0669 | 0.0186 | 0.0336 | 0.0147 |
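
The DER, false alarm, missed detection, and confusion columns are the standard diarization quantities from `pyannote.metrics`. A minimal sketch of how they are computed, using invented toy annotations (the segments and labels below are for illustration only):

```python
# Toy example; the reference/hypothesis segments are invented.
from pyannote.core import Annotation, Segment
from pyannote.metrics.diarization import DiarizationErrorRate

reference = Annotation()
reference[Segment(0.0, 2.0)] = "speaker_A"
reference[Segment(2.0, 4.0)] = "speaker_B"

hypothesis = Annotation()
hypothesis[Segment(0.0, 2.2)] = "spk1"
hypothesis[Segment(2.2, 4.0)] = "spk2"

metric = DiarizationErrorRate()
components = metric(reference, hypothesis, detailed=True)

# Component values are durations in seconds; dividing by the total
# reference speech duration yields rates like those in the table.
total = components["total"]
print("DER:", components["diarization error rate"])
print("false alarm rate:", components["false alarm"] / total)
print("missed detection rate:", components["missed detection"] / total)
print("confusion rate:", components["confusion"] / total)
```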
## Framework versions
- Transformers 4.51.0
- PyTorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1