chickens-repro-2

This model is a fine-tuned version of facebook/detr-resnet-50 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2380

Model description

More information needed

Intended uses & limitations

More information needed
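Absent further detail from the author, a minimal inference sketch using the standard transformers object-detection API is shown below. The repository id matches this card; the input image and the 0.5 score threshold are placeholder assumptions, not values from the card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, DetrForObjectDetection

# Load the fine-tuned checkpoint from this repository.
processor = AutoImageProcessor.from_pretrained("joe611/chickens-repro-2")
model = DetrForObjectDetection.from_pretrained("joe611/chickens-repro-2")
model.eval()

# Placeholder input: substitute a real image, e.g. Image.open("your_image.jpg").
image = Image.new("RGB", (640, 480))

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to (score, label, box) detections above a
# threshold; target_sizes is (height, width) of the original image.
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=torch.tensor([image.size[::-1]])
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```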

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 300
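With lr_scheduler_type set to cosine, the learning rate decays from 1e-05 toward zero over the full run. A pure-Python re-derivation of that schedule is sketched below, assuming no warmup steps (the card does not list any) and using the total step count implied by the results table (300 epochs × 227 steps/epoch = 68,100 steps).

```python
import math

BASE_LR = 1e-5           # learning_rate from the hyperparameters above
TOTAL_STEPS = 300 * 227  # num_epochs x steps/epoch = 68,100, the final Step in the table

def cosine_lr(step: int) -> float:
    """Cosine decay from BASE_LR to 0 over TOTAL_STEPS (no warmup assumed)."""
    progress = step / TOTAL_STEPS
    return BASE_LR * 0.5 * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0))                # full learning rate at the start
print(cosine_lr(TOTAL_STEPS // 2)) # roughly half the base rate at epoch 150
print(cosine_lr(TOTAL_STEPS))      # decayed to ~0 by the end of epoch 300
```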

Training results

Training Loss Epoch Step Validation Loss
2.8642 1.0 227 2.7786
2.3721 2.0 454 2.2758
2.3709 3.0 681 2.1522
1.992 4.0 908 1.8385
2.1679 5.0 1135 1.9064
1.758 6.0 1362 1.6010
1.5898 7.0 1589 1.4712
1.5394 8.0 1816 1.3826
1.587 9.0 2043 1.4157
1.4699 10.0 2270 1.2894
1.3848 11.0 2497 1.2382
1.3815 12.0 2724 1.2361
1.3613 13.0 2951 1.3203
1.4138 14.0 3178 1.2496
1.3376 15.0 3405 1.6375
1.2191 16.0 3632 1.1517
1.2034 17.0 3859 1.1106
1.2332 18.0 4086 1.0974
1.2246 19.0 4313 1.0629
1.2412 20.0 4540 1.1839
1.1197 21.0 4767 1.2765
1.2092 22.0 4994 0.9938
1.1339 23.0 5221 0.9137
1.0476 24.0 5448 0.9967
1.1219 25.0 5675 1.0872
1.032 26.0 5902 1.0375
1.0126 27.0 6129 0.8812
0.994 28.0 6356 0.8283
0.9812 29.0 6583 0.8342
0.8992 30.0 6810 0.8800
1.0294 31.0 7037 0.8117
0.9669 32.0 7264 0.8216
0.956 33.0 7491 0.8803
0.9274 34.0 7718 0.8257
0.9307 35.0 7945 0.8582
0.8248 36.0 8172 0.9441
0.8924 37.0 8399 0.7222
0.8552 38.0 8626 0.7476
0.8801 39.0 8853 0.8598
0.8812 40.0 9080 0.8358
0.8214 41.0 9307 0.7525
0.8671 42.0 9534 0.7123
0.8931 43.0 9761 0.7301
0.7545 44.0 9988 0.6438
0.8072 45.0 10215 0.5928
0.7908 46.0 10442 0.6737
0.8407 47.0 10669 0.5894
0.7697 48.0 10896 0.6289
0.7428 49.0 11123 0.8339
0.7736 50.0 11350 0.6534
0.7625 51.0 11577 0.6981
0.735 52.0 11804 0.6412
0.7195 53.0 12031 0.6030
0.693 54.0 12258 0.5586
0.7213 55.0 12485 0.5616
0.77 56.0 12712 0.5286
0.7317 57.0 12939 0.5960
0.6903 58.0 13166 0.5512
0.6516 59.0 13393 0.5363
0.6507 60.0 13620 0.5430
0.6778 61.0 13847 0.5447
0.6809 62.0 14074 0.5097
0.6508 63.0 14301 0.5164
0.6251 64.0 14528 0.5230
0.6374 65.0 14755 0.5548
0.6111 66.0 14982 0.4891
0.6163 67.0 15209 0.5317
0.5782 68.0 15436 0.5339
0.5976 69.0 15663 0.5699
0.5988 70.0 15890 0.4710
0.5744 71.0 16117 0.4559
0.6212 72.0 16344 0.5214
0.6087 73.0 16571 0.4709
0.6014 74.0 16798 0.4691
0.5496 75.0 17025 0.4391
0.5636 76.0 17252 0.4778
0.535 77.0 17479 0.4294
0.5198 78.0 17706 0.4557
0.5691 79.0 17933 0.5150
0.5253 80.0 18160 0.4082
0.5217 81.0 18387 0.4121
0.524 82.0 18614 0.4222
0.5445 83.0 18841 0.5020
0.5398 84.0 19068 0.3842
0.5395 85.0 19295 0.4417
0.5229 86.0 19522 0.3844
0.5029 87.0 19749 0.4135
0.5317 88.0 19976 0.4352
0.4964 89.0 20203 0.4285
0.5111 90.0 20430 0.4970
0.4912 91.0 20657 0.4222
0.5329 92.0 20884 0.3773
0.5028 93.0 21111 0.3613
0.4981 94.0 21338 0.3819
0.483 95.0 21565 0.4335
0.501 96.0 21792 0.4052
0.5115 97.0 22019 0.4089
0.4829 98.0 22246 0.3903
0.4995 99.0 22473 0.3389
0.4905 100.0 22700 0.3421
0.5061 101.0 22927 0.3613
0.4718 102.0 23154 0.3889
0.4949 103.0 23381 0.3943
0.4822 104.0 23608 0.3837
0.4707 105.0 23835 0.3587
0.4895 106.0 24062 0.3607
0.4346 107.0 24289 0.3733
0.4656 108.0 24516 0.3329
0.4466 109.0 24743 0.3684
0.4727 110.0 24970 0.3618
0.4426 111.0 25197 0.3540
0.454 112.0 25424 0.3167
0.439 113.0 25651 0.3334
0.4624 114.0 25878 0.3527
0.4308 115.0 26105 0.3511
0.4407 116.0 26332 0.3275
0.4517 117.0 26559 0.3365
0.4601 118.0 26786 0.3527
0.4454 119.0 27013 0.3842
0.4281 120.0 27240 0.3746
0.4453 121.0 27467 0.3427
0.436 122.0 27694 0.3067
0.4403 123.0 27921 0.3150
0.4143 124.0 28148 0.3164
0.4048 125.0 28375 0.3616
0.428 126.0 28602 0.3720
0.4188 127.0 28829 0.3234
0.4172 128.0 29056 0.3124
0.4494 129.0 29283 0.2888
0.4267 130.0 29510 0.2965
0.4243 131.0 29737 0.3622
0.4136 132.0 29964 0.3205
0.4333 133.0 30191 0.3315
0.4185 134.0 30418 0.3479
0.395 135.0 30645 0.3305
0.4046 136.0 30872 0.3213
0.4118 137.0 31099 0.3258
0.4209 138.0 31326 0.2947
0.4129 139.0 31553 0.3013
0.4286 140.0 31780 0.2957
0.4043 141.0 32007 0.3560
0.4028 142.0 32234 0.2829
0.3998 143.0 32461 0.3024
0.3926 144.0 32688 0.3162
0.4089 145.0 32915 0.2693
0.4075 146.0 33142 0.3138
0.4059 147.0 33369 0.3018
0.393 148.0 33596 0.2809
0.3981 149.0 33823 0.3162
0.4073 150.0 34050 0.3090
0.398 151.0 34277 0.2970
0.3778 152.0 34504 0.2866
0.3808 153.0 34731 0.3098
0.3883 154.0 34958 0.2856
0.3817 155.0 35185 0.2766
0.3783 156.0 35412 0.3010
0.3826 157.0 35639 0.3383
0.4134 158.0 35866 0.3586
0.3847 159.0 36093 0.2574
0.3839 160.0 36320 0.2916
0.3876 161.0 36547 0.2743
0.3923 162.0 36774 0.2743
0.3557 163.0 37001 0.2901
0.362 164.0 37228 0.2476
0.3867 165.0 37455 0.3223
0.3745 166.0 37682 0.2784
0.3714 167.0 37909 0.3123
0.364 168.0 38136 0.2759
0.3873 169.0 38363 0.2787
0.3835 170.0 38590 0.3225
0.3797 171.0 38817 0.2866
0.376 172.0 39044 0.2592
0.3576 173.0 39271 0.2962
0.3627 174.0 39498 0.2839
0.3932 175.0 39725 0.2616
0.3605 176.0 39952 0.2960
0.3729 177.0 40179 0.2642
0.3613 178.0 40406 0.2763
0.3744 179.0 40633 0.2748
0.3574 180.0 40860 0.2835
0.3471 181.0 41087 0.2893
0.3631 182.0 41314 0.2618
0.3517 183.0 41541 0.2641
0.3526 184.0 41768 0.2523
0.3705 185.0 41995 0.2873
0.3736 186.0 42222 0.2636
0.3635 187.0 42449 0.2791
0.3607 188.0 42676 0.2664
0.3435 189.0 42903 0.2487
0.3729 190.0 43130 0.2833
0.3611 191.0 43357 0.2557
0.3624 192.0 43584 0.2985
0.3615 193.0 43811 0.2616
0.346 194.0 44038 0.2669
0.3478 195.0 44265 0.2753
0.3392 196.0 44492 0.2974
0.3537 197.0 44719 0.2647
0.3665 198.0 44946 0.2623
0.3256 199.0 45173 0.2381
0.3508 200.0 45400 0.2741
0.335 201.0 45627 0.2800
0.3535 202.0 45854 0.2947
0.3414 203.0 46081 0.2443
0.3334 204.0 46308 0.2493
0.3399 205.0 46535 0.2597
0.3317 206.0 46762 0.2701
0.328 207.0 46989 0.2622
0.3589 208.0 47216 0.2319
0.3305 209.0 47443 0.2423
0.3578 210.0 47670 0.2300
0.3427 211.0 47897 0.2728
0.3278 212.0 48124 0.2541
0.3624 213.0 48351 0.2530
0.3292 214.0 48578 0.2817
0.3349 215.0 48805 0.2558
0.3345 216.0 49032 0.2464
0.3476 217.0 49259 0.2799
0.3276 218.0 49486 0.2618
0.3495 219.0 49713 0.2551
0.3451 220.0 49940 0.2640
0.3319 221.0 50167 0.2483
0.3348 222.0 50394 0.2460
0.3399 223.0 50621 0.2517
0.3619 224.0 50848 0.2425
0.3416 225.0 51075 0.2626
0.3156 226.0 51302 0.2513
0.3398 227.0 51529 0.2561
0.3424 228.0 51756 0.2573
0.3386 229.0 51983 0.2471
0.3382 230.0 52210 0.2462
0.331 231.0 52437 0.2557
0.3329 232.0 52664 0.2513
0.3319 233.0 52891 0.2750
0.3314 234.0 53118 0.2377
0.3437 235.0 53345 0.2416
0.3278 236.0 53572 0.2448
0.3206 237.0 53799 0.2478
0.3305 238.0 54026 0.2441
0.3269 239.0 54253 0.2497
0.3343 240.0 54480 0.2314
0.3235 241.0 54707 0.2492
0.3034 242.0 54934 0.2438
0.3124 243.0 55161 0.2455
0.356 244.0 55388 0.2515
0.3276 245.0 55615 0.2452
0.3383 246.0 55842 0.2466
0.3168 247.0 56069 0.2527
0.3451 248.0 56296 0.2529
0.3263 249.0 56523 0.2477
0.3332 250.0 56750 0.2479
0.3281 251.0 56977 0.2359
0.3352 252.0 57204 0.2411
0.3271 253.0 57431 0.2384
0.3373 254.0 57658 0.2499
0.3366 255.0 57885 0.2498
0.3276 256.0 58112 0.2550
0.3368 257.0 58339 0.2485
0.3371 258.0 58566 0.2395
0.3187 259.0 58793 0.2413
0.3208 260.0 59020 0.2375
0.3296 261.0 59247 0.2437
0.3191 262.0 59474 0.2478
0.3097 263.0 59701 0.2350
0.3125 264.0 59928 0.2378
0.3146 265.0 60155 0.2415
0.3163 266.0 60382 0.2437
0.3106 267.0 60609 0.2406
0.3135 268.0 60836 0.2405
0.3219 269.0 61063 0.2410
0.3287 270.0 61290 0.2350
0.3264 271.0 61517 0.2412
0.3196 272.0 61744 0.2355
0.3216 273.0 61971 0.2403
0.3125 274.0 62198 0.2394
0.3356 275.0 62425 0.2365
0.3116 276.0 62652 0.2407
0.3201 277.0 62879 0.2397
0.3188 278.0 63106 0.2407
0.3056 279.0 63333 0.2380
0.3173 280.0 63560 0.2395
0.3349 281.0 63787 0.2387
0.3206 282.0 64014 0.2354
0.3172 283.0 64241 0.2363
0.3099 284.0 64468 0.2439
0.339 285.0 64695 0.2364
0.32 286.0 64922 0.2350
0.3251 287.0 65149 0.2366
0.3271 288.0 65376 0.2390
0.3033 289.0 65603 0.2381
0.3226 290.0 65830 0.2385
0.3104 291.0 66057 0.2388
0.3118 292.0 66284 0.2397
0.3188 293.0 66511 0.2385
0.3103 294.0 66738 0.2385
0.319 295.0 66965 0.2387
0.3145 296.0 67192 0.2380
0.3197 297.0 67419 0.2380
0.3132 298.0 67646 0.2380
0.3348 299.0 67873 0.2380
0.3086 300.0 68100 0.2380
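The Step column above also gives a rough sanity check on the training-set size: each epoch is 227 optimizer steps at train_batch_size 2, which (assuming no gradient accumulation or dropped batches) implies about 454 training images.

```python
# Values taken from the hyperparameters and the results table above.
steps_per_epoch = 227   # Step column advances by 227 per epoch (227, 454, 681, ...)
train_batch_size = 2
num_epochs = 300

# Approximate training-set size, assuming no gradient accumulation.
approx_train_images = steps_per_epoch * train_batch_size
# Total optimizer steps; should match the final Step entry in the table.
total_steps = steps_per_epoch * num_epochs

print(approx_train_images)  # → 454
print(total_steps)          # → 68100
```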

Framework versions

  • Transformers 4.45.1
  • Pytorch 2.4.1+cu121
  • Datasets 2.19.2
  • Tokenizers 0.20.0