---
library_name: transformers
license: apache-2.0
base_model: SenseTime/deformable-detr
tags:
- generated_from_trainer
datasets:
- generator
model-index:
- name: fisheye8k_SenseTime_deformable-detr
  results: []
---

# fisheye8k_SenseTime_deformable-detr

This model is a fine-tuned version of [SenseTime/deformable-detr](https://huggingface.co/SenseTime/deformable-detr) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2359

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
- mixed_precision_training: Native AMP
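
The exact training script is not included in this card. The sketch below only maps the listed hyperparameters onto `transformers.TrainingArguments`; the output path, single-device batch-size mapping, and dataset wiring are illustrative assumptions, not the authors' actual code.

```python
from transformers import (
    DeformableDetrForObjectDetection,
    Trainer,
    TrainingArguments,
)

# Hyperparameters copied from the list above; everything else is illustrative.
training_args = TrainingArguments(
    output_dir="fisheye8k_SenseTime_deformable-detr",  # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,  # assumes single-device training
    per_device_eval_batch_size=8,
    seed=0,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)

model = DeformableDetrForObjectDetection.from_pretrained("SenseTime/deformable-detr")

# The train/eval datasets ("generator") are not described in the card:
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=..., eval_dataset=...)
# trainer.train()
```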