Columns: model_id — string (9 to 102 chars); model_card — string (4 to 343k chars); model_labels — list (2 to 50.8k items).
shilpid/candy-finetuned
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # candy-finetuned This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 600 ### Training results ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "black_star", "cat", "grey_star", "insect", "moon", "owl", "unicorn_head", "unicorn_whole" ]
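The hyperparameters above specify `lr_scheduler_type: linear` with no warmup mentioned. As a minimal sketch (assuming zero warmup, which the card does not state), the multiplicative factor such a schedule applies to the base learning rate can be written as:

```python
def linear_lr_factor(step: int, total_steps: int, warmup_steps: int = 0) -> float:
    """Multiplicative factor applied to the base learning rate at `step`.

    Warmup ramps linearly from 0 to 1, then the factor decays linearly
    to 0 over the remaining steps. Zero warmup is an assumption here;
    the cards above do not state one.
    """
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    # Linear decay from 1.0 at the end of warmup down to 0.0 at the end.
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Example: base lr 1e-05 (as in the card), decayed over a training run.
base_lr = 1e-05
total = 1000
lrs = [base_lr * linear_lr_factor(s, total) for s in (0, 500, 1000)]
```

With these numbers the learning rate starts at 1e-05, is halved at the midpoint, and reaches 0 on the final step.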
LiviaQi/trained_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # trained_model This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 500 ### Training results ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "black_star", "cat", "grey_star", "insect", "moon", "owl", "unicorn_head", "unicorn_whole" ]
ZilongLiu/Zilong_Candy_counter
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Zilong_Candy_counter This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 300 ### Training results ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "black_star", "cat", "grey_star", "insect", "moon", "owl", "unicorn_head", "unicorn_whole" ]
daloopa/tatr-dataset-1000-500epochs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # tatr-dataset-1000-500epochs This model is a fine-tuned version of [microsoft/table-transformer-structure-recognition](https://huggingface.co/microsoft/table-transformer-structure-recognition) on the None dataset. It achieves the following results on the evaluation set: - eval_loss: 0.7819 - eval_runtime: 10.4713 - eval_samples_per_second: 13.943 - eval_steps_per_second: 1.814 - epoch: 243.23 - step: 6324 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 32 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 500 ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "table", "table column", "table row", "table column header", "table projected row header", "table spanning cell" ]
krystaleahr/detr-resnet-50_finetuned_candy
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_candy This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 75 ### Training results ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "black_star", "cat", "grey_star", "insect", "moon", "owl", "unicorn_head", "unicorn_whole" ]
AladarMezga/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
GwenGawon/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
pratik33/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
amyeroberts/detr-resnet-50-base-coco
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50-base-coco This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the detection-datasets/coco dataset. It achieves the following results on the evaluation set: - Loss: 5.2641 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 1.0 | 4 | 5.5331 | | No log | 2.0 | 8 | 5.5277 | | 5.4377 | 3.0 | 12 | 5.4450 | | 5.4377 | 4.0 | 16 | 5.3960 | | 5.1582 | 5.0 | 20 | 5.3349 | | 5.1582 | 6.0 | 24 | 5.3144 | | 5.1582 | 7.0 | 28 | 5.2738 | | 5.0556 | 8.0 | 32 | 5.2641 | | 5.0556 | 9.0 | 36 | 5.2848 | | 4.9784 | 10.0 | 40 | 5.2792 | ### Framework versions - Transformers 4.32.0.dev0 - Pytorch 2.0.0+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "person", "bicycle", "fire hydrant", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "car", "elephant", "bear", "zebra", "giraffe", "backpack", "umbrella", "handbag", "tie", "suitcase", "frisbee", "motorcycle", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "airplane", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "bus", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "couch", "potted plant", "bed", "train", "dining table", "toilet", "tv", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "truck", "toaster", "sink", "refrigerator", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush", "boat", "traffic light" ]
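A DETR checkpoint like the one above emits, per query, class logits (with an extra "no object" slot) and a box in normalized (cx, cy, w, h). The usual post-processing — softmax, drop the no-object slot, threshold, convert to pixel-space corners — can be sketched as follows; this mirrors the standard DETR recipe as a sketch, not the transformers API itself:

```python
import numpy as np

def postprocess_detr(logits, boxes, img_w, img_h, threshold=0.5):
    """Turn raw DETR query outputs into thresholded, pixel-space boxes.

    `logits`: (num_queries, num_classes + 1), last slot = "no object".
    `boxes`:  (num_queries, 4) in normalized (cx, cy, w, h).
    Returns kept scores, labels, and (x0, y0, x1, y1) boxes in pixels.
    """
    # Stable softmax over the class dimension.
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs = e / e.sum(axis=-1, keepdims=True)
    scores = probs[:, :-1].max(axis=-1)    # best real-class probability
    labels = probs[:, :-1].argmax(axis=-1)
    keep = scores > threshold
    cx, cy, w, h = boxes[keep].T
    xyxy = np.stack([(cx - w / 2) * img_w, (cy - h / 2) * img_h,
                     (cx + w / 2) * img_w, (cy + h / 2) * img_h], axis=-1)
    return scores[keep], labels[keep], xyxy
```

A query whose mass sits on the no-object slot scores low on every real class and is filtered out by the threshold.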
TheNobody-12/my_awesome_model
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. 
## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. 
(2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6", "label_7", "label_8" ]
grays-ai/table-transformer-structure-recognition
# Table Transformer (fine-tuned for Table Structure Recognition) Table Transformer (DETR) model trained on PubTables1M. It was introduced in the paper [PubTables-1M: Towards Comprehensive Table Extraction From Unstructured Documents](https://arxiv.org/abs/2110.00061) by Smock et al. and first released in [this repository](https://github.com/microsoft/table-transformer). Disclaimer: The team releasing Table Transformer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The Table Transformer is equivalent to [DETR](https://huggingface.co/docs/transformers/model_doc/detr), a Transformer-based object detection model. Note that the authors decided to use the "normalize before" setting of DETR, which means that layernorm is applied before self- and cross-attention. ## Usage You can use the raw model for detecting the structure (like rows, columns) in tables. See the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/table-transformer) for more info.
[ "table", "table column", "table row", "table column header", "table projected row header", "table spanning cell" ]
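Given the labels above, one common downstream step is intersecting the detected "table row" and "table column" boxes to recover a cell grid. A minimal sketch of that intersection (downstream convenience code, not part of the model or the transformers API):

```python
def cells_from_rows_and_columns(rows, cols):
    """Intersect detected row and column boxes into a cell grid.

    `rows` and `cols` are lists of (x0, y0, x1, y1) boxes as a structure
    recognizer might emit after post-processing. Each cell takes its
    vertical span from a row and its horizontal span from a column.
    Returns (row_index, col_index, cell_box) tuples.
    """
    cells = []
    for ri, (rx0, ry0, rx1, ry1) in enumerate(sorted(rows, key=lambda b: b[1])):
        for ci, (cx0, cy0, cx1, cy1) in enumerate(sorted(cols, key=lambda b: b[0])):
            # Cell = column's horizontal extent x row's vertical extent.
            cells.append((ri, ci, (cx0, ry0, cx1, ry1)))
    return cells
```

Spanning cells ("table spanning cell" detections) would then be merged over this grid, which this sketch deliberately leaves out.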
grays-ai/table-detection
# Table Transformer (fine-tuned for Table Detection) Table Transformer (DETR) model trained on PubTables1M. It was introduced in the paper [PubTables-1M: Towards Comprehensive Table Extraction From Unstructured Documents](https://arxiv.org/abs/2110.00061) by Smock et al. and first released in [this repository](https://github.com/microsoft/table-transformer). Disclaimer: The team releasing Table Transformer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The Table Transformer is equivalent to [DETR](https://huggingface.co/docs/transformers/model_doc/detr), a Transformer-based object detection model. Note that the authors decided to use the "normalize before" setting of DETR, which means that layernorm is applied before self- and cross-attention. ## Usage You can use the raw model for detecting tables in documents. See the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/table-transformer) for more info.
[ "table", "table rotated" ]
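A detection from this model is typically cropped out and handed to a structure-recognition model. A minimal sketch of that crop with a small clamped padding margin (downstream convenience code, not part of the model card):

```python
import numpy as np

def crop_detected_table(image, box, pad=2):
    """Crop a detected table region from an image array.

    `box` is a pixel-space (x0, y0, x1, y1) detection; `pad` adds a small
    margin on each side, clamped to the image bounds so slicing is safe.
    """
    h, w = image.shape[:2]
    x0, y0, x1, y1 = box
    x0 = max(0, int(x0) - pad)
    y0 = max(0, int(y0) - pad)
    x1 = min(w, int(x1) + pad)
    y1 = min(h, int(y1) + pad)
    return image[y0:y1, x0:x1]
```

The "table rotated" label signals that the crop may additionally need a 90-degree rotation before structure recognition.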
hsanchez/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.1 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
DunnBC22/yolos-tiny-NFL_Object_Detection
# *** This model is not completely trained! *** <hr/> ## This model requires more training than the resources I have can offer. # yolos-tiny-NFL_Object_Detection This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on the nfl-object-detection dataset. ## Model description For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/tree/main/Computer%20Vision/Object%20Detection/Trained%2C%20But%20to%20Standard/NFL%20Object%20Detection/Successful%20Attempt * Fine-tuning and evaluation of this model are in separate files. * If you plan to fine-tune an object detection model on the NFL helmet detection dataset, I recommend using (at least) the yolos-small checkpoint. ## Intended uses & limitations This model is intended to demonstrate my ability to solve a complex problem using technology. ## Training and evaluation data Dataset Source: https://huggingface.co/datasets/keremberke/nfl-object-detection ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 18 ### Training results | Metric Name | IoU | Area | maxDets | Metric Value | |:-----:|:-----:|:-----:|:-----:|:-----:| | Average Precision (AP) | IoU=0.50:0.95 | area= all | maxDets=100 | 0.003 | | Average Precision (AP) | IoU=0.50 | area= all | maxDets=100 | 0.010 | | Average Precision (AP) | IoU=0.75 | area= all | maxDets=100 | 0.000 | | Average Precision (AP) | IoU=0.50:0.95 | area= small | maxDets=100 | 0.002 | | Average Precision (AP) | IoU=0.50:0.95 | area=medium | maxDets=100 | 0.014 | | Average Precision (AP) | IoU=0.50:0.95 | area= large | maxDets=100 | 0.000 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets= 1 | 0.002 | | 
Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets= 10 | 0.014 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets=100 | 0.029 | | Average Recall (AR) | IoU=0.50:0.95 | area= small | maxDets=100 | 0.026 | | Average Recall (AR) | IoU=0.50:0.95 | area=medium | maxDets=100 | 0.105 | | Average Recall (AR) | IoU=0.50:0.95 | area= large | maxDets=100 | 0.000 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.1 - Tokenizers 0.13.3
[ "helmet", "helmet-blurred", "helmet-difficult", "helmet-partial", "helmet-sideline" ]
DunnBC22/yolos-small-Abdomen_MRI
# yolos-small-Abdomen_MRI This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small). ## Model description https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Object%20Detection/Abdomen%20MRIs%20Object%20Detection/Abdomen_MRI_Object_Detection_YOLOS.ipynb ## Intended uses & limitations This model is intended to demonstrate my ability to solve a complex problem using technology. ## Training and evaluation data Dataset Source: https://huggingface.co/datasets/Francesco/abdomen-mri ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 15 ### Training results | Metric Name | IoU | Area | maxDets | Value | |:-----:|:-----:|:-----:|:-----:|:-----:| | Average Precision (AP) | 0.50:0.95 | all | 100 | 0.453 | | Average Precision (AP) | 0.50 | all | 100 | 0.928 | | Average Precision (AP) | 0.75 | all | 100 | 0.319 | | Average Precision (AP) | 0.50:0.95 | small | 100 | -1.000 | | Average Precision (AP) | 0.50:0.95 | medium | 100 | 0.426 | | Average Precision (AP) | 0.50:0.95 | large | 100 | 0.457 | | Average Recall (AR) | 0.50:0.95 | all | 1 | 0.518 | | Average Recall (AR) | 0.50:0.95 | all | 10 | 0.645 | | Average Recall (AR) | 0.50:0.95 | all | 100 | 0.715 | | Average Recall (AR) | 0.50:0.95 | small | 100 | -1.000 | | Average Recall (AR) | 0.50:0.95 | medium | 100 | 0.633 | | Average Recall (AR) | 0.50:0.95 | large | 100 | 0.716 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.1 - Tokenizers 0.13.3
[ "abdomen-mri", "0" ]
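The AP and AR rows in the card above are all parameterized by an IoU threshold (0.50, 0.75, or the 0.50:0.95 average). As a sketch of the standard definition behind those thresholds — not the COCO evaluator itself:

```python
def box_iou(a, b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes.

    This is the overlap measure behind the IoU=0.50 / 0.75 / 0.50:0.95
    columns in the metric tables: a prediction counts as a match only
    when its IoU with a ground-truth box clears the threshold.
    """
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0
```

The -1.000 entries in the tables simply mean no ground-truth objects of that size bucket existed, so the metric is undefined there.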
decene/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "input", "label", "button", "p", "html", "h2" ]
DunnBC22/yolos-small-Wall_Damage
# yolos-small-Wall_Damage This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small). ## Model description For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Object%20Detection/Trained%2C%20But%20to%20Standard/Wall%20Damage%20Object%20Detection/Wall_Damage_Object_Detection_YOLOS.ipynb ## Intended uses & limitations This model is intended to demonstrate my ability to solve a complex problem using technology. ## Training and evaluation data Dataset Source: https://huggingface.co/datasets/Francesco/wall-damage ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 40 ### Training results | Metric Name | IoU | Area | maxDets | Metric Value | |:-----:|:-----:|:-----:|:-----:|:-----:| | Average Precision (AP) | IoU=0.50:0.95 | area= all | maxDets=100 | 0.241 | | Average Precision (AP) | IoU=0.50 | area= all | maxDets=100 | 0.400 | | Average Precision (AP) | IoU=0.75 | area= all | maxDets=100 | 0.231 | | Average Precision (AP) | IoU=0.50:0.95 | area= small | maxDets=100 | -1.000 | | Average Precision (AP) | IoU=0.50:0.95 | area=medium | maxDets=100 | -1.000 | | Average Precision (AP) | IoU=0.50:0.95 | area= large | maxDets=100 | 0.241 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets= 1 | 0.488 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets= 10 | 0.579 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets=100 | 0.621 | | Average Recall (AR) | IoU=0.50:0.95 | area= small | maxDets=100 | -1.000 | | Average Recall (AR) | IoU=0.50:0.95 | area=medium | maxDets=100 | -1.000 | | Average Recall (AR) | IoU=0.50:0.95 | area= large | maxDets=100 | 0.621 | ### 
Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.2 - Tokenizers 0.13.3
[ "wall-damage", "minorrotation", "moderaterotation", "severerotation" ]
DunnBC22/yolos-tiny-Brain_Tumor_Detection
# yolos-tiny-Brain_Tumor_Detection This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny). ## Model description For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Object%20Detection/Brain%20Tumors/Brain_Tumor_m2pbp_Object_Detection_YOLOS.ipynb **If you intend to try this project yourself, I highly recommend using (at least) the yolos-small checkpoint.** ## Intended uses & limitations This model is intended to demonstrate my ability to solve a complex problem using technology. ## Training and evaluation data Dataset Source: https://huggingface.co/datasets/Francesco/brain-tumor-m2pbp **Example** ![Example Image](https://raw.githubusercontent.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/main/Computer%20Vision/Object%20Detection/Brain%20Tumors/Images/Example.png) ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 ### Training results | Metric Name | IoU | Area | maxDets | Metric Value | |:-----:|:-----:|:-----:|:-----:|:-----:| | Average Precision (AP) | IoU=0.50:0.95 | area= all | maxDets=100 | 0.185 | | Average Precision (AP) | IoU=0.50 | area= all | maxDets=100 | 0.448 | | Average Precision (AP) | IoU=0.75 | area= all | maxDets=100 | 0.126 | | Average Precision (AP) | IoU=0.50:0.95 | area= small | maxDets=100 | 0.001 | | Average Precision (AP) | IoU=0.50:0.95 | area=medium | maxDets=100 | 0.080 | | Average Precision (AP) | IoU=0.50:0.95 | area= large | maxDets=100 | 0.296 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets= 1 | 0.254 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets= 10 | 0.353 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets=100 | 0.407 | | Average Recall (AR) | IoU=0.50:0.95 | area= small | maxDets=100 | 0.036 | | Average Recall (AR) | IoU=0.50:0.95 | area=medium | maxDets=100 | 0.312 | | Average Recall (AR) | IoU=0.50:0.95 | area= large | maxDets=100 | 0.565 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.2 - Tokenizers 0.13.3
[ "brain-tumor", "label0", "label1", "label2" ]
DunnBC22/yolos-small-Liver_Disease
# yolos-small-Liver_Disease This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small). ## Model description https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Object%20Detection/Liver%20Disease%20Object%20Detection/Liver_Disease_Detection_YOLOS.ipynb ## Intended uses & limitations This model is intended to demonstrate my ability to solve a complex problem using technology. ## Training and evaluation data Dataset Source: https://huggingface.co/datasets/Francesco/liver-disease **Example Image** ![Example Image](https://raw.githubusercontent.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/main/Computer%20Vision/Object%20Detection/Liver%20Disease%20Object%20Detection/Images/Example.png) ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results |Metric Name | IoU | Area | maxDets | Metric Value | |:-----:|:-----:|:-----:|:-----:|:-----:| | Average Precision (AP) | IoU=0.50:0.95 | area= all | maxDets=100 | 0.254 | | Average Precision (AP) | IoU=0.50 | area= all | maxDets=100 | 0.399 | | Average Precision (AP) | IoU=0.75 | area= all | maxDets=100 | 0.291 | | Average Precision (AP) | IoU=0.50:0.95 | area= small | maxDets=100 | 0.000 | | Average Precision (AP) | IoU=0.50:0.95 | area=medium | maxDets=100 | 0.154 | | Average Precision (AP) | IoU=0.50:0.95 | area= large | maxDets=100 | 0.283 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets= 1 | 0.147 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets= 10 | 0.451 | | Average Recall (AR) | IoU=0.50:0.95 | area= all | maxDets=100 | 0.552 | | Average Recall (AR) | IoU=0.50:0.95 | area= small | maxDets=100 | 0.000 | | Average Recall (AR) | IoU=0.50:0.95 | area=medium | 
maxDets=100 | 0.444 | | Average Recall (AR) | IoU=0.50:0.95 | area= large | maxDets=100 | 0.572 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.2 - Tokenizers 0.13.3
[ "diseases", "ballooning", "fibrosis", "inflammation", "steatosis" ]
rice-rice/detr-resnet-50_finetuned_dataset
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_dataset This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the pothole-segmentation dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.3 - Tokenizers 0.13.3
[ "potholes", "object", "pothole" ]
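Label arrays like the one above become the model's class mapping when fine-tuning DETR on a custom dataset. A sketch of building that mapping from the three labels above (the `id2label`/`label2id` names mirror the `transformers` config fields; passing them to `from_pretrained` alongside `ignore_mismatched_sizes=True` is the typical recipe, but this snippet is illustrative, not taken from the card):

```python
labels = ["potholes", "object", "pothole"]

# DETR's config expects the mapping in both directions.
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in id2label.items()}

print(id2label)  # {0: 'potholes', 1: 'object', 2: 'pothole'}
```

Note the class index is purely positional, so the order of the label array must match the annotations' category IDs.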
DunnBC22/yolos-tiny-Hard_Hat_Detection
# yolos-tiny-Hard_Hat_Detection

This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on the hard-hat-detection dataset.

## Model description

For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Object%20Detection/Hard%20Hat%20Detection/Hard_Hat_Object_Detection_YOLOS.ipynb

## Intended uses & limitations

This model is intended to demonstrate my ability to solve a complex problem using technology.

## Training and evaluation data

Dataset Source: https://huggingface.co/datasets/keremberke/hard-hat-detection

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8

### Training results

| Metric Name | IoU | Area | maxDets | Metric Value |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| Average Precision (AP) | IoU=0.50:0.95 | all | maxDets=100 | 0.346 |
| Average Precision (AP) | IoU=0.50 | all | maxDets=100 | 0.747 |
| Average Precision (AP) | IoU=0.75 | all | maxDets=100 | 0.275 |
| Average Precision (AP) | IoU=0.50:0.95 | small | maxDets=100 | 0.128 |
| Average Precision (AP) | IoU=0.50:0.95 | medium | maxDets=100 | 0.343 |
| Average Precision (AP) | IoU=0.50:0.95 | large | maxDets=100 | 0.521 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=1 | 0.188 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=10 | 0.484 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=100 | 0.558 |
| Average Recall (AR) | IoU=0.50:0.95 | small | maxDets=100 | 0.320 |
| Average Recall (AR) | IoU=0.50:0.95 | medium | maxDets=100 | 0.538 |
| Average Recall (AR) | IoU=0.50:0.95 | large | maxDets=100 | 0.743 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.3
- Tokenizers 0.13.3
[ "hardhat", "no-hardhat" ]
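`lr_scheduler_type: linear` in these cards means the learning rate ramps up over any warmup steps and then decays linearly from its initial value to zero by the end of training (zero warmup is the Trainer default and is assumed here; the step counts below are illustrative, not from the card). A minimal sketch of that schedule, matching the behavior of `transformers`' `get_linear_schedule_with_warmup`:

```python
def linear_lr(step, total_steps, base_lr=5e-5, warmup_steps=0):
    """Linear warmup (optional) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

total = 8 * 1000  # e.g. 8 epochs x 1000 optimizer steps per epoch (illustrative)
print(linear_lr(0, total))           # full base rate at the start
print(linear_lr(total // 2, total))  # half the base rate midway
print(linear_lr(total, total))       # 0.0 at the end
```

This is why `num_epochs` matters beyond wall-clock time: it fixes `total_steps`, and therefore how fast the rate anneals.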
DunnBC22/yolos-small-Stomata_Cells
# yolos-small-Stomata_Cells

This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small).

## Model description

For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Object%20Detection/Stomata%20Cells/Stomata_Cells_Object_Detection_YOLOS.ipynb

## Intended uses & limitations

This model is intended to demonstrate my ability to solve a complex problem using technology.

## Training and evaluation data

Dataset Source: https://huggingface.co/datasets/Francesco/stomata-cells

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 12

### Training results

| Metric Name | IoU | Area | maxDets | Metric Value |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| Average Precision (AP) | IoU=0.50:0.95 | all | maxDets=100 | 0.340 |
| Average Precision (AP) | IoU=0.50 | all | maxDets=100 | 0.571 |
| Average Precision (AP) | IoU=0.75 | all | maxDets=100 | 0.361 |
| Average Precision (AP) | IoU=0.50:0.95 | small | maxDets=100 | 0.155 |
| Average Precision (AP) | IoU=0.50:0.95 | medium | maxDets=100 | 0.220 |
| Average Precision (AP) | IoU=0.50:0.95 | large | maxDets=100 | 0.498 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=1 | 0.146 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=10 | 0.423 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=100 | 0.547 |
| Average Recall (AR) | IoU=0.50:0.95 | small | maxDets=100 | 0.275 |
| Average Recall (AR) | IoU=0.50:0.95 | medium | maxDets=100 | 0.439 |
| Average Recall (AR) | IoU=0.50:0.95 | large | maxDets=100 | 0.764 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.3
- Tokenizers 0.13.3
[ "stomata-cells", "close", "open" ]
DunnBC22/yolos-small-Forklift_Object_Detection
# yolos-small-Forklift_Object_Detection

This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small) on the forklift-object-detection dataset.

## Model description

For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/tree/main/Computer%20Vision/Object%20Detection/Forklift%20Object%20Detection

## Intended uses & limitations

This model is intended to demonstrate my ability to solve a complex problem using technology.

## Training and evaluation data

Dataset Source: https://huggingface.co/datasets/keremberke/forklift-object-detection

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25

### Training results

| Metric Name | IoU | Area Category | maxDets | Metric Value |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| Average Precision (AP) | IoU=0.50:0.95 | all | maxDets=100 | 0.136 |
| Average Precision (AP) | IoU=0.50 | all | maxDets=100 | 0.400 |
| Average Precision (AP) | IoU=0.75 | all | maxDets=100 | 0.054 |
| Average Precision (AP) | IoU=0.50:0.95 | small | maxDets=100 | 0.001 |
| Average Precision (AP) | IoU=0.50:0.95 | medium | maxDets=100 | 0.051 |
| Average Precision (AP) | IoU=0.50:0.95 | large | maxDets=100 | 0.177 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=1 | 0.178 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=10 | 0.294 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=100 | 0.340 |
| Average Recall (AR) | IoU=0.50:0.95 | small | maxDets=100 | 0.075 |
| Average Recall (AR) | IoU=0.50:0.95 | medium | maxDets=100 | 0.299 |
| Average Recall (AR) | IoU=0.50:0.95 | large | maxDets=100 | 0.373 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.3
- Tokenizers 0.13.3
[ "forklift", "person" ]
DunnBC22/yolos-small-Axial_MRIs
# yolos-small-Axial_MRIs

This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small).

## Model description

For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Object%20Detection/Axial%20MRIs/Axial_MRIs_Object_Detection_YOLOS.ipynb

## Intended uses & limitations

This model is intended to demonstrate my ability to solve a complex problem using technology.

## Training and evaluation data

Dataset Source: https://huggingface.co/datasets/Francesco/axial-mri

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25

### Training results

| Metric Name | IoU | Area | maxDets | Metric Value |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| Average Precision (AP) | IoU=0.50:0.95 | all | maxDets=100 | 0.284 |
| Average Precision (AP) | IoU=0.50 | all | maxDets=100 | 0.451 |
| Average Precision (AP) | IoU=0.75 | all | maxDets=100 | 0.351 |
| Average Precision (AP) | IoU=0.50:0.95 | small | maxDets=100 | 0.000 |
| Average Precision (AP) | IoU=0.50:0.95 | medium | maxDets=100 | 0.182 |
| Average Precision (AP) | IoU=0.50:0.95 | large | maxDets=100 | 0.663 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=1 | 0.388 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=10 | 0.524 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=100 | 0.566 |
| Average Recall (AR) | IoU=0.50:0.95 | small | maxDets=100 | 0.000 |
| Average Recall (AR) | IoU=0.50:0.95 | medium | maxDets=100 | 0.502 |
| Average Recall (AR) | IoU=0.50:0.95 | large | maxDets=100 | 0.791 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.3
- Tokenizers 0.13.3
[ "axial-mri", "negative", "positive" ]
DunnBC22/yolos-small-Blood_Cell_Object_Detection
# yolos-small-Blood_Cell_Object_Detection

This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small) on the blood-cell-object-detection dataset.

## Model description

For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Object%20Detection/Blood%20Cell%20Object%20Detection/Blood_Cell_Object_Detection_YOLOS.ipynb

## Intended uses & limitations

This model is intended to demonstrate my ability to solve a complex problem using technology.

## Training and evaluation data

Dataset Source: https://huggingface.co/datasets/keremberke/blood-cell-object-detection

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25

### Training results

| Metric Name | IoU | Area | maxDets | Metric Value |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| Average Precision (AP) | IoU=0.50:0.95 | all | maxDets=100 | 0.344 |
| Average Precision (AP) | IoU=0.50 | all | maxDets=100 | 0.579 |
| Average Precision (AP) | IoU=0.75 | all | maxDets=100 | 0.374 |
| Average Precision (AP) | IoU=0.50:0.95 | small | maxDets=100 | 0.097 |
| Average Precision (AP) | IoU=0.50:0.95 | medium | maxDets=100 | 0.258 |
| Average Precision (AP) | IoU=0.50:0.95 | large | maxDets=100 | 0.224 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=1 | 0.210 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=10 | 0.376 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=100 | 0.448 |
| Average Recall (AR) | IoU=0.50:0.95 | small | maxDets=100 | 0.108 |
| Average Recall (AR) | IoU=0.50:0.95 | medium | maxDets=100 | 0.375 |
| Average Recall (AR) | IoU=0.50:0.95 | large | maxDets=100 | 0.448 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.3
- Tokenizers 0.13.3
[ "platelets", "rbc", "wbc" ]
decene/detr-resnet-50_finetuned_ht1
# detr-resnet-50_finetuned_ht1

This model is a fine-tuned version of [decene/detr-resnet-50_finetuned_ht1](https://huggingface.co/decene/detr-resnet-50_finetuned_ht1) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200

### Training results

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "input", "label", "button", "p", "html", "h2" ]
machinelearningzuu/detr-resnet-50_finetuned-normal-vs-disabled
# detr-resnet-50_finetuned-normal-vs-disabled

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.31.0
- Pytorch 1.13.1
- Datasets 2.12.0
- Tokenizers 0.13.3
[ "normal", "diabled" ]
priynka/detr-resnet-50_finetuned_cppe5
# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
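Every card in this batch lists `Adam with betas=(0.9,0.999) and epsilon=1e-08`, the `transformers` Trainer defaults. As a sketch of what those three numbers control, here is a single Adam update for one scalar parameter in plain Python (illustrative, not the `torch.optim.Adam` implementation itself):

```python
def adam_step(param, grad, m, v, t, lr=1e-5, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter; returns (param, m, v)."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) EMA
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (variance) EMA
    m_hat = m / (1 - beta1 ** t)              # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    param -= lr * m_hat / (v_hat ** 0.5 + eps)
    return param, m, v

p, m, v = adam_step(0.5, grad=0.2, m=0.0, v=0.0, t=1)
print(p)  # moved by ~lr: the very first step is ~lr * sign(grad) regardless of scale
```

The betas set how quickly the two moment averages forget old gradients, and epsilon only matters when the second moment is near zero.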
distill-io/detr-amzss3-v2
# detr-amzss3-v2

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set:
- Loss: 0.3494

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| No log | 0.54 | 1000 | 0.4810 |
| 0.5325 | 1.08 | 2000 | 0.4812 |
| 0.5325 | 1.62 | 3000 | 0.4739 |
| 0.5322 | 2.16 | 4000 | 0.4759 |
| 0.5322 | 2.7 | 5000 | 0.4818 |
| 0.5259 | 3.24 | 6000 | 0.4522 |
| 0.5259 | 3.78 | 7000 | 0.4632 |
| 0.5167 | 4.32 | 8000 | 0.4628 |
| 0.5167 | 4.86 | 9000 | 0.4345 |
| 0.5076 | 5.4 | 10000 | 0.4563 |
| 0.5076 | 5.94 | 11000 | 0.4326 |
| 0.494 | 6.48 | 12000 | 0.4424 |
| 0.4906 | 7.02 | 13000 | 0.4272 |
| 0.4906 | 7.56 | 14000 | 0.4164 |
| 0.4801 | 8.1 | 15000 | 0.4213 |
| 0.4801 | 8.64 | 16000 | 0.4320 |
| 0.4699 | 9.18 | 17000 | 0.4100 |
| 0.4699 | 9.72 | 18000 | 0.4127 |
| 0.4613 | 10.26 | 19000 | 0.4035 |
| 0.4613 | 10.8 | 20000 | 0.4039 |
| 0.4556 | 11.34 | 21000 | 0.4149 |
| 0.4556 | 11.88 | 22000 | 0.4092 |
| 0.4475 | 12.42 | 23000 | 0.3965 |
| 0.4475 | 12.96 | 24000 | 0.3973 |
| 0.4389 | 13.5 | 25000 | 0.4013 |
| 0.4349 | 14.04 | 26000 | 0.3797 |
| 0.4349 | 14.58 | 27000 | 0.3728 |
| 0.4288 | 15.12 | 28000 | 0.3834 |
| 0.4288 | 15.66 | 29000 | 0.3885 |
| 0.4222 | 16.2 | 30000 | 0.3820 |
| 0.4222 | 16.74 | 31000 | 0.3755 |
| 0.4152 | 17.28 | 32000 | 0.3693 |
| 0.4152 | 17.82 | 33000 | 0.3679 |
| 0.4122 | 18.36 | 34000 | 0.3605 |
| 0.4122 | 18.9 | 35000 | 0.3625 |
| 0.4077 | 19.44 | 36000 | 0.3631 |
| 0.4077 | 19.98 | 37000 | 0.3607 |
| 0.4 | 20.52 | 38000 | 0.3615 |
| 0.3972 | 21.06 | 39000 | 0.3561 |
| 0.3972 | 21.6 | 40000 | 0.3594 |
| 0.3953 | 22.14 | 41000 | 0.3554 |
| 0.3953 | 22.68 | 42000 | 0.3515 |
| 0.3903 | 23.22 | 43000 | 0.3539 |
| 0.3903 | 23.76 | 44000 | 0.3500 |
| 0.3878 | 24.3 | 45000 | 0.3489 |
| 0.3878 | 24.84 | 46000 | 0.3494 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
[ "block", "footer", "header" ]
akar49/detr-resnet-50_machinery-I
# detr-resnet-50_machinery-I

This model is a fine-tuned version of [akar49/detr-resnet-50_machinery-I](https://huggingface.co/akar49/detr-resnet-50_machinery-I) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.31.0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.13.3
[ "excavators", "dump truck", "wheel loader" ]
machinelearningzuu/detr-resnet-50_finetuned-room-objects
# detr-resnet-50_finetuned-room-objects

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.30.2
- Pytorch 1.13.0
- Datasets 2.11.0
- Tokenizers 0.13.0
[ "person", "bicycle", "car", "motorcycle", "airplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "backpack", "umbrella", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "couch", "potted plant", "bed", "dining table", "toilet", "tv", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
chanelcolgate/detr-resnet-50_finetuned_yenthienviet
# detr-resnet-50_finetuned_yenthienviet

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the yenthienviet dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "hop_dln", "hop_jn", "hop_vtg", "hop_ytv", "lo_kids", "lo_ytv", "loc_dln", "loc_jn", "loc_kids", "loc_ytv" ]
DunnBC22/yolos-small-Cell_Tower_Detection
# yolos-small-Cell_Tower_Detection

This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small).

## Model description

For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Object%20Detection/Cell%20Tower%20Object%20Detection/Cell%20Tower%20Detection%20YOLOS.ipynb

## Intended uses & limitations

This model is intended to demonstrate my ability to solve a complex problem using technology.

## Training and evaluation data

Dataset Source: https://huggingface.co/datasets/Francesco/cell-towers

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30

### Training results

| Metric Name | IoU | Area | maxDets | Metric Value |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| Average Precision (AP) | IoU=0.50:0.95 | all | maxDets=100 | 0.287 |
| Average Precision (AP) | IoU=0.50 | all | maxDets=100 | 0.636 |
| Average Precision (AP) | IoU=0.75 | all | maxDets=100 | 0.239 |
| Average Precision (AP) | IoU=0.50:0.95 | small | maxDets=100 | 0.069 |
| Average Precision (AP) | IoU=0.50:0.95 | medium | maxDets=100 | 0.289 |
| Average Precision (AP) | IoU=0.50:0.95 | large | maxDets=100 | 0.556 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=1 | 0.192 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=10 | 0.460 |
| Average Recall (AR) | IoU=0.50:0.95 | all | maxDets=100 | 0.492 |
| Average Recall (AR) | IoU=0.50:0.95 | small | maxDets=100 | 0.151 |
| Average Recall (AR) | IoU=0.50:0.95 | medium | maxDets=100 | 0.488 |
| Average Recall (AR) | IoU=0.50:0.95 | large | maxDets=100 | 0.760 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "pieces", "joint", "side" ]
SIA86/detr-resnet-50_finetuned_WFCR
# detr-resnet-50_finetuned_WFCR

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the water_flow_counters_recognition dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "value_a", "value_b", "serial" ]
SIA86/detr-resnet-50_finetuned_WFCR3
# detr-resnet-50_finetuned_WFCR3

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the water_flow_counters_recognition dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

### Training results

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "value_a", "value_b", "serial" ]
akar49/deform_detr-crack-I
# deform_detr-crack-I

This model is a fine-tuned version of [facebook/deformable-detr-box-supervised](https://huggingface.co/facebook/deformable-detr-box-supervised) on the crack_detection-merged dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "crack", "mold", "peeling_paint", "stairstep_crack", "water_seepage" ]
akar49/detr-crack-II
# detr-crack-II

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the crack_detection-merged-ii dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "crack", "fissures" ]
govindrai/detr-resnet-50_finetuned_cppe5
# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.33.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
distill-io/detr-v8
# detr-V8

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set:
- Loss: 0.2139

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| No log | 0.48 | 1000 | 0.3770 |
| No log | 0.96 | 2000 | 0.3967 |
| 0.4391 | 1.43 | 3000 | 0.3822 |
| 0.4391 | 1.91 | 4000 | 0.4163 |
| 0.4434 | 2.39 | 5000 | 0.3888 |
| 0.4434 | 2.87 | 6000 | 0.3867 |
| 0.4509 | 3.35 | 7000 | 0.4205 |
| 0.4509 | 3.83 | 8000 | 0.4014 |
| 0.455 | 4.3 | 9000 | 0.4117 |
| 0.455 | 4.78 | 10000 | 0.3964 |
| 0.4476 | 5.26 | 11000 | 0.3915 |
| 0.4476 | 5.74 | 12000 | 0.3919 |
| 0.444 | 6.22 | 13000 | 0.4026 |
| 0.444 | 6.7 | 14000 | 0.3832 |
| 0.443 | 7.17 | 15000 | 0.4057 |
| 0.443 | 7.65 | 16000 | 0.3677 |
| 0.4232 | 8.13 | 17000 | 0.3746 |
| 0.4232 | 8.61 | 18000 | 0.3672 |
| 0.4202 | 9.09 | 19000 | 0.3629 |
| 0.4202 | 9.56 | 20000 | 0.3739 |
| 0.4131 | 10.04 | 21000 | 0.3712 |
| 0.4131 | 10.52 | 22000 | 0.3470 |
| 0.4131 | 11.0 | 23000 | 0.3632 |
| 0.4024 | 11.48 | 24000 | 0.3561 |
| 0.4024 | 11.96 | 25000 | 0.3562 |
| 0.4013 | 12.43 | 26000 | 0.3253 |
| 0.4013 | 12.91 | 27000 | 0.3390 |
| 0.3925 | 13.39 | 28000 | 0.3398 |
| 0.3925 | 13.87 | 29000 | 0.3460 |
| 0.3804 | 14.35 | 30000 | 0.3338 |
| 0.3804 | 14.83 | 31000 | 0.3201 |
| 0.3757 | 15.3 | 32000 | 0.3119 |
| 0.3757 | 15.78 | 33000 | 0.3106 |
| 0.3663 | 16.26 | 34000 | 0.3164 |
| 0.3663 | 16.74 | 35000 | 0.3190 |
| 0.3588 | 17.22 | 36000 | 0.3141 |
| 0.3588 | 17.69 | 37000 | 0.3262 |
| 0.3515 | 18.17 | 38000 | 0.3027 |
| 0.3515 | 18.65 | 39000 | 0.3178 |
| 0.3557 | 19.13 | 40000 | 0.3053 |
| 0.3557 | 19.61 | 41000 | 0.3032 |
| 0.3478 | 20.09 | 42000 | 0.3147 |
| 0.3478 | 20.56 | 43000 | 0.3069 |
| 0.3451 | 21.04 | 44000 | 0.3070 |
| 0.3451 | 21.52 | 45000 | 0.3055 |
| 0.3451 | 22.0 | 46000 | 0.2883 |
| 0.3367 | 22.48 | 47000 | 0.3090 |
| 0.3367 | 22.96 | 48000 | 0.2906 |
| 0.3348 | 23.43 | 49000 | 0.2805 |
| 0.3348 | 23.91 | 50000 | 0.2920 |
| 0.3298 | 24.39 | 51000 | 0.2854 |
| 0.3298 | 24.87 | 52000 | 0.2841 |
| 0.3254 | 25.35 | 53000 | 0.2822 |
| 0.3254 | 25.82 | 54000 | 0.2716 |
| 0.3169 | 26.3 | 55000 | 0.2825 |
| 0.3169 | 26.78 | 56000 | 0.2700 |
| 0.314 | 27.26 | 57000 | 0.2640 |
| 0.314 | 27.74 | 58000 | 0.2728 |
| 0.3047 | 28.22 | 59000 | 0.2654 |
| 0.3047 | 28.69 | 60000 | 0.2691 |
| 0.2999 | 29.17 | 61000 | 0.2601 |
| 0.2999 | 29.65 | 62000 | 0.2607 |
| 0.297 | 30.13 | 63000 | 0.2581 |
| 0.297 | 30.61 | 64000 | 0.2511 |
| 0.2946 | 31.09 | 65000 | 0.2557 |
| 0.2946 | 31.56 | 66000 | 0.2568 |
| 0.2912 | 32.04 | 67000 | 0.2569 |
| 0.2912 | 32.52 | 68000 | 0.2594 |
| 0.2912 | 33.0 | 69000 | 0.2553 |
| 0.2906 | 33.48 | 70000 | 0.2425 |
| 0.2906 | 33.96 | 71000 | 0.2475 |
| 0.2833 | 34.43 | 72000 | 0.2394 |
| 0.2833 | 34.91 | 73000 | 0.2422 |
| 0.278 | 35.39 | 74000 | 0.2403 |
| 0.278 | 35.87 | 75000 | 0.2349 |
| 0.2738 | 36.35 | 76000 | 0.2300 |
| 0.2738 | 36.82 | 77000 | 0.2332 |
| 0.2701 | 37.3 | 78000 | 0.2309 |
| 0.2701 | 37.78 | 79000 | 0.2298 |
| 0.2659 | 38.26 | 80000 | 0.2343 |
| 0.2659 | 38.74 | 81000 | 0.2265 |
| 0.2626 | 39.22 | 82000 | 0.2310 |
| 0.2626 | 39.69 | 83000 | 0.2255 |
| 0.259 | 40.17 | 84000 | 0.2263 |
| 0.259 | 40.65 | 85000 | 0.2282 |
| 0.2563 | 41.13 | 86000 | 0.2309 |
| 0.2563 | 41.61 | 87000 | 0.2270 |
| 0.2548 | 42.09 | 88000 | 0.2237 |
| 0.2548 | 42.56 | 89000 | 0.2203 |
| 0.254 | 43.04 | 90000 | 0.2204 |
| 0.254 | 43.52 | 91000 | 0.2218 |
| 0.254 | 44.0 | 92000 | 0.2207 |
| 0.2484 | 44.48 | 93000 | 0.2144 |
| 0.2484 | 44.95 | 94000 | 0.2194 |
| 0.2475 | 45.43 | 95000 | 0.2165 |
| 0.2475 | 45.91 | 96000 | 0.2162 |
| 0.2453 | 46.39 | 97000 | 0.2136 |
| 0.2453 | 46.87 | 98000 | 0.2152 |
| 0.2441 | 47.35 | 99000 | 0.2162 |
| 0.2441 | 47.82 | 100000 | 0.2171 |
| 0.2408 | 48.3 | 101000 | 0.2119 |
| 0.2408 | 48.78 | 102000 | 0.2131 |
| 0.2389 | 49.26 | 103000 | 0.2109 |
| 0.2389 | 49.74 | 104000 | 0.2139 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
[ "block", "footer", "header" ]
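When a card reports a long training-results table like the one above, the usual question is which checkpoint had the lowest validation loss. A minimal pure-Python sketch; the `(step, validation_loss)` pairs are a hand-copied subset of the table, just to illustrate the scan:

```python
# Pick the checkpoint with the lowest validation loss from a results table.
# These (step, validation_loss) pairs are a hand-copied subset of the
# training-results table above, used only to illustrate the scan.
rows = [
    (46000, 0.2883),
    (70000, 0.2425),
    (93000, 0.2144),
    (103000, 0.2109),
    (104000, 0.2139),
]

best_step, best_loss = min(rows, key=lambda r: r[1])
print(best_step, best_loss)  # step 103000 has the lowest validation loss here
```

In the full table above, the validation loss keeps improving through the end of training, so the best checkpoint is near the final steps rather than the last one.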
mmoltisanti/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results



### Framework versions

- Transformers 4.29.2
- Pytorch 1.13.1
- Datasets 2.14.4
- Tokenizers 0.13.2
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
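The class list above is what these detection heads predict over; in `transformers`-style configs it is usually stored as the `id2label` / `label2id` mappings. A minimal sketch of building those mappings from the card's class list:

```python
# Build the id2label / label2id mappings that object-detection configs
# typically carry, from the class list of this model card.
labels = ["coverall", "face_shield", "gloves", "goggles", "mask"]

id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in id2label.items()}

print(id2label[0])       # coverall
print(label2id["mask"])  # 4
```

The index order matters: predicted class ids from the model are only meaningful relative to this exact ordering.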
Yorai/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results



### Framework versions

- Transformers 4.32.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
Yorai/yolos-tiny_finetuned_cppe-5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# yolos-tiny_finetuned_cppe-5

This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results



### Framework versions

- Transformers 4.32.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
Yorai/yolos-tiny_finetuned_dataset
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# yolos-tiny_finetuned_dataset

This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results



### Framework versions

- Transformers 4.32.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "metals_and_plastic", "other", "non_recyclable", "glass", "paper", "bio", "unknown" ]
machinelearningzuu/detr-resnet-50_finetuned-weed-detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned-weed-detection

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results



### Framework versions

- Transformers 4.31.0
- Pytorch 1.13.1
- Datasets 2.12.0
- Tokenizers 0.13.3
[ "weed1", "weed2", "weed3", "weed4" ]
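These cards all report `lr_scheduler_type: linear`, meaning the learning rate decays linearly from the stated `learning_rate` to 0 over training. A pure-Python sketch of that schedule, assuming zero warmup steps (the cards do not report any):

```python
def linear_lr(step, total_steps, base_lr=1e-05, warmup_steps=0):
    """Linear schedule: optional warmup up to base_lr, then linear decay to 0.

    Assumes warmup_steps=0 by default, since these cards report none.
    """
    if warmup_steps and step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total = 1000
print(linear_lr(0, total))     # 1e-05 at the start
print(linear_lr(500, total))   # 5e-06 halfway through
print(linear_lr(1000, total))  # 0.0 at the end
```

With no warmup, training starts at the full learning rate, which is part of why these cards pair a linear schedule with a small base rate like 1e-05.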
Garell/detr-resnet-50_finetuned
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

### Training results



### Framework versions

- Transformers 4.32.1
- Pytorch 2.0.1
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "teeth", "caries" ]
machinelearningzuu/deformable-detr-box-finetuned-weed-detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# deformable-detr-box-finetuned-weed-detection

This model is a fine-tuned version of [facebook/deformable-detr-box-supervised](https://huggingface.co/facebook/deformable-detr-box-supervised) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1

### Training results



### Framework versions

- Transformers 4.31.0
- Pytorch 1.13.1
- Datasets 2.12.0
- Tokenizers 0.13.3
[ "weed1", "weed2", "weed3", "weed4" ]
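Every card in this batch trains with `Adam with betas=(0.9,0.999) and epsilon=1e-08`. To make those numbers concrete, here is a minimal pure-Python single-parameter Adam step (an illustrative sketch, not the framework's implementation): `beta1` and `beta2` control the decay of the first- and second-moment estimates, and `eps` guards the division:

```python
def adam_step(param, grad, m, v, t, lr=1e-05, beta1=0.9, beta2=0.999, eps=1e-08):
    """One Adam update for a scalar parameter, using the betas/epsilon
    reported in these model cards. t is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (v_hat ** 0.5 + eps)
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adam_step(p, grad=2.0, m=m, v=v, t=1)
print(p)  # first step moves the parameter by roughly lr, regardless of grad scale
```

Because of the bias-corrected second moment, the very first update has magnitude close to `lr` whatever the gradient's scale, which is what makes the tiny 1e-05 rate in these cards behave predictably.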
Rozminzamha/detr-resnet-50_finetuned-normal-vs-disabled
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned-normal-vs-disabled

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

### Training results



### Framework versions

- Transformers 4.32.1
- Pytorch 1.13.1+cpu
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "normal", "diabled" ]
andrei-saceleanu/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results



### Framework versions

- Transformers 4.32.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
IT20255756/deformable-detr-box-finetuned-weed-detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# deformable-detr-box-finetuned-weed-detection

This model is a fine-tuned version of [facebook/deformable-detr-box-supervised](https://huggingface.co/facebook/deformable-detr-box-supervised) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results



### Framework versions

- Transformers 4.32.1
- Pytorch 1.13.1+cpu
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "weed1", "weed2", "weed3", "weed4" ]
IT20429546/detr-resnet-50_finetuned-weed-detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned-weed-detection

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results



### Framework versions

- Transformers 4.32.1
- Pytorch 1.13.1+cpu
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "weed1", "weed2", "weed3", "weed4" ]
thirosh0520/detr-resnet-50_finetuned-room-objects
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned-room-objects

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1

### Framework versions

- Transformers 4.32.1
- Pytorch 2.0.1+cpu
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "n/a", "person", "bicycle", "car", "motorcycle", "airplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "street sign", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "hat", "backpack", "umbrella", "shoe", "eye glasses", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "plate", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "couch", "potted plant", "bed", "mirror", "dining table", "window", "desk", "toilet", "door", "tv", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "blender", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
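The mAP/mAR metrics these detection cards report (e.g. "Map 50", "Map 75") are all thresholded on intersection-over-union between predicted and ground-truth boxes. A minimal pure-Python sketch of IoU for `(x_min, y_min, x_max, y_max)` boxes:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x_min, y_min, x_max, y_max) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)  # overlap area, 0 if disjoint
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # overlap 1, union 7 -> 1/7
print(iou((0, 0, 1, 1), (2, 2, 3, 3)))  # disjoint boxes -> 0.0
```

"Map 50" counts a detection as correct when IoU with a ground-truth box of the same class is at least 0.5; "Map 75" raises that threshold to 0.75, which is why it is always the lower number.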
Yorai/detr-resnet-50_finetuned_detect-waste
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_detect-waste

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the [detect-waste](https://huggingface.co/datasets/Yorai/detect-waste) dataset. It achieves the following results on the evaluation set:
- Loss: 1.0141
- Map: 0.2906
- Map 50: 0.4268
- Map 75: 0.3141
- Map Small: 0.2906
- Map Medium: -1.0
- Map Large: -1.0
- Mar 1: 0.2422
- Mar 10: 0.3643
- Mar 100: 0.3643
- Mar Small: 0.3643
- Mar Medium: -1.0
- Mar Large: -1.0
- Map Per Class: [0.34276220202445984, 0.27865204215049744, 0.1607096940279007, 0.26159897446632385, 0.3790518343448639, 0.43935513496398926, 0.17204682528972626]
- Mar 100 Per Class: [0.4175185561180115, 0.34998106956481934, 0.24253687262535095, 0.3344777226448059, 0.4658925533294678, 0.48632490634918213, 0.25357064604759216]
- Classes: [0, 1, 2, 3, 4, 5, 6]

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Per Class | Mar 100 Per Class | Classes |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------------------------------------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------:|:---------------------:| | 4.1468 | 0.78 | 500 | 4.0411 | 0.0087 | 0.0192 | 0.0071 | 0.0087 | -1.0 | -1.0 | 0.0167 | 0.0324 | 0.0324 | 0.0324 | -1.0 | -1.0 | [0.04623645916581154, 0.0, 0.0008125691674649715, 0.0, 0.012627051211893559, 0.0, 0.0012919838773086667] | [0.1765443980693817, 0.0, 0.010891089215874672, 0.0, 0.01826483942568302, 0.0, 0.021276595070958138] | [0, 1, 2, 3, 4, 5, 6] | | 4.194 | 1.56 | 1000 | 4.0133 | 0.0095 | 0.0202 | 0.0078 | 0.0095 | -1.0 | -1.0 | 0.0185 | 0.0342 | 0.0342 | 0.0342 | -1.0 | -1.0 | [0.048249680548906326, 0.0, 0.0006883515743538737, 0.0, 0.016567163169384003, 0.0, 0.0011674906127154827] | [0.17282818257808685, 0.0, 0.009735973551869392, 0.0, 0.032420091331005096, 0.0, 0.024113474413752556] | [0, 1, 2, 3, 4, 5, 6] | | 3.9605 | 2.34 | 1500 | 3.7443 | 0.0113 | 0.0240 | 0.0092 | 0.0113 | -1.0 | -1.0 | 0.0207 | 0.0389 | 0.0389 | 0.0389 | -1.0 | -1.0 | [0.05232662707567215, 0.0, 0.008277339860796928, 0.0, 0.016159038990736008, 0.0, 0.0020039111841470003] | [0.18117760121822357, 0.0, 0.01171617116779089, 0.0, 0.04779299721121788, 0.0, 0.031610943377017975] | [0, 1, 2, 3, 4, 5, 6] | | 3.7367 | 3.12 | 2000 | 3.7225 | 0.0127 | 0.0271 | 0.0102 | 0.0127 | -1.0 | -1.0 | 0.0222 | 0.0429 | 0.0429 | 0.0429 | -1.0 | -1.0 | [0.060713302344083786, 0.0, 0.008382072672247887, 0.0, 0.016689885407686234, 0.0001313720567850396, 0.0029707648791372776] | [0.1877654492855072, 0.0, 0.013902639970183372, 0.0, 0.05388128012418747, 0.005785123910754919, 0.03893110528588295] | 
[0, 1, 2, 3, 4, 5, 6] | | 3.6339 | 3.91 | 2500 | 3.6414 | 0.0151 | 0.0320 | 0.0126 | 0.0151 | -1.0 | -1.0 | 0.0241 | 0.0467 | 0.0467 | 0.0467 | -1.0 | -1.0 | [0.06824539601802826, 0.0009900990407913923, 0.008812529034912586, 0.0, 0.022641275078058243, 9.549226524541155e-05, 0.004885945934802294] | [0.19152510166168213, 0.00024242424115072936, 0.01871287077665329, 0.0, 0.06410958617925644, 0.006446280982345343, 0.04555217921733856] | [0, 1, 2, 3, 4, 5, 6] | | 3.5368 | 4.69 | 3000 | 3.4403 | 0.0170 | 0.0349 | 0.0140 | 0.0170 | -1.0 | -1.0 | 0.0263 | 0.0502 | 0.0502 | 0.0502 | -1.0 | -1.0 | [0.07444033771753311, 0.0013285944005474448, 0.0036969168577343225, 0.0, 0.02982231415808201, 0.0008005613344721496, 0.008707632310688496] | [0.19416023790836334, 0.0031313130166381598, 0.018261825665831566, 0.0, 0.06887366622686386, 0.009641873650252819, 0.05749746784567833] | [0, 1, 2, 3, 4, 5, 6] | | 3.3781 | 5.47 | 3500 | 3.2647 | 0.0196 | 0.0403 | 0.0162 | 0.0196 | -1.0 | -1.0 | 0.0283 | 0.0542 | 0.0542 | 0.0542 | -1.0 | -1.0 | [0.07894252985715866, 0.003304258920252323, 0.005874663591384888, 0.00534653477370739, 0.032837092876434326, 0.0010820407187566161, 0.009714609943330288] | [0.20048262178897858, 0.004588744603097439, 0.019801979884505272, 0.0014829460997134447, 0.07756034284830093, 0.012514757923781872, 0.0631350427865982] | [0, 1, 2, 3, 4, 5, 6] | | 3.3963 | 6.25 | 4000 | 3.3014 | 0.0198 | 0.0406 | 0.0165 | 0.0198 | -1.0 | -1.0 | 0.0310 | 0.0586 | 0.0586 | 0.0586 | -1.0 | -1.0 | [0.07857409864664078, 0.002693906659260392, 0.010952512733638287, 0.0014566973550245166, 0.03381899371743202, 0.0017245921771973372, 0.009445087984204292] | [0.20394545793533325, 0.006590908858925104, 0.023514851927757263, 0.0028114186134189367, 0.08607305586338043, 0.020454544574022293, 0.06689462810754776] | [0, 1, 2, 3, 4, 5, 6] | | 3.3532 | 7.03 | 4500 | 3.2767 | 0.0217 | 0.0446 | 0.0186 | 0.0217 | -1.0 | -1.0 | 0.0352 | 0.0648 | 0.0648 | 0.0648 | -1.0 | -1.0 | [0.08428213000297546, 
0.0030477724503725767, 0.012319584377110004, 0.0017159844283014536, 0.03654558211565018, 0.003238566452637315, 0.01057812012732029] | [0.207185760140419, 0.007003366947174072, 0.028456179425120354, 0.005613225512206554, 0.10152207314968109, 0.03379246965050697, 0.07021276652812958] | [0, 1, 2, 3, 4, 5, 6] | | 3.1127 | 7.81 | 5000 | 3.2209 | 0.0245 | 0.0498 | 0.0211 | 0.0245 | -1.0 | -1.0 | 0.0380 | 0.0696 | 0.0696 | 0.0696 | -1.0 | -1.0 | [0.09061206132173538, 0.0029931419994682074, 0.012835826724767685, 0.003690356155857444, 0.03761321306228638, 0.012010062113404274, 0.011985357850790024] | [0.20946910977363586, 0.008363636210560799, 0.030462047085165977, 0.008373702876269817, 0.11018265038728714, 0.04512396827340126, 0.07545085996389389] | [0, 1, 2, 3, 4, 5, 6] | | 3.1775 | 8.59 | 5500 | 3.1662 | 0.0269 | 0.0539 | 0.0243 | 0.0269 | -1.0 | -1.0 | 0.0411 | 0.0743 | 0.0743 | 0.0743 | -1.0 | -1.0 | [0.09162214398384094, 0.0012531784595921636, 0.014446384273469448, 0.008817446418106556, 0.04345640912652016, 0.015094032511115074, 0.013941464945673943] | [0.21342577040195465, 0.00909090880304575, 0.03360335901379585, 0.013306071050465107, 0.11743462085723877, 0.056423742324113846, 0.07680758833885193] | [0, 1, 2, 3, 4, 5, 6] | | 3.1419 | 9.38 | 6000 | 3.0182 | 0.0297 | 0.0584 | 0.0270 | 0.0297 | -1.0 | -1.0 | 0.0437 | 0.0782 | 0.0782 | 0.0782 | -1.0 | -1.0 | [0.09514405578374863, 0.0025450591929256916, 0.016270553693175316, 0.0081990547478199, 0.046881288290023804, 0.020359132438898087, 0.01845024712383747] | [0.21765604615211487, 0.011666666716337204, 0.03633113205432892, 0.014359861612319946, 0.11845509707927704, 0.06893939524888992, 0.07989699393510818] | [0, 1, 2, 3, 4, 5, 6] | | 3.0921 | 10.16 | 6500 | 3.0030 | 0.0302 | 0.0610 | 0.0256 | 0.0302 | -1.0 | -1.0 | 0.0463 | 0.0827 | 0.0827 | 0.0827 | -1.0 | -1.0 | [0.09758389741182327, 0.006373511627316475, 0.01742025651037693, 0.008713486604392529, 0.048744086176157, 0.019423358142375946, 0.013462764210999012] | 
[0.21828779578208923, 0.01557109598070383, 0.04075907543301582, 0.01892467401921749, 0.12395504117012024, 0.08010171353816986, 0.08163821697235107] | [0, 1, 2, 3, 4, 5, 6] | | 2.964 | 10.94 | 7000 | 2.9448 | 0.0331 | 0.0657 | 0.0288 | 0.0331 | -1.0 | -1.0 | 0.0499 | 0.0877 | 0.0877 | 0.0877 | -1.0 | -1.0 | [0.10320653021335602, 0.007576418109238148, 0.018034495413303375, 0.011375959031283855, 0.052645713090896606, 0.02358655259013176, 0.014970344491302967] | [0.22245587408542633, 0.019740259274840355, 0.042126353830099106, 0.02360355854034424, 0.1306588351726532, 0.0916765034198761, 0.08358662575483322] | [0, 1, 2, 3, 4, 5, 6] | | 2.9554 | 11.72 | 7500 | 2.8727 | 0.0357 | 0.0710 | 0.0314 | 0.0357 | -1.0 | -1.0 | 0.0529 | 0.0920 | 0.0920 | 0.0920 | -1.0 | -1.0 | [0.1065044030547142, 0.008357521146535873, 0.01859544776380062, 0.012739308178424835, 0.05574037879705429, 0.029986992478370667, 0.017998481169342995] | [0.22563062608242035, 0.023838384076952934, 0.045775577425956726, 0.02758938819169998, 0.1338508427143097, 0.10088153928518295, 0.08644376695156097] | [0, 1, 2, 3, 4, 5, 6] | | 2.6755 | 12.5 | 8000 | 2.7571 | 0.0381 | 0.0766 | 0.0329 | 0.0381 | -1.0 | -1.0 | 0.0570 | 0.0973 | 0.0973 | 0.0973 | -1.0 | -1.0 | [0.11142787337303162, 0.007701957132667303, 0.0196710005402565, 0.016465453431010246, 0.05648297443985939, 0.035306066274642944, 0.019693374633789062] | [0.22848697006702423, 0.026174241676926613, 0.0493193082511425, 0.03451557084918022, 0.13941210508346558, 0.11265496164560318, 0.09084979444742203] | [0, 1, 2, 3, 4, 5, 6] | | 2.7857 | 13.28 | 8500 | 2.7355 | 0.0411 | 0.0814 | 0.0362 | 0.0411 | -1.0 | -1.0 | 0.0601 | 0.1016 | 0.1016 | 0.1016 | -1.0 | -1.0 | [0.11556719243526459, 0.010197699069976807, 0.02036297507584095, 0.018473459407687187, 0.05874854698777199, 0.042199619114398956, 0.022376172244548798] | [0.2314615100622177, 0.030053475871682167, 0.05247524753212929, 0.03916141018271446, 0.14496374130249023, 0.11988332867622375, 0.09348590672016144] | 
[0, 1, 2, 3, 4, 5, 6] | | 2.4419 | 14.06 | 9000 | 2.5641 | 0.0439 | 0.0874 | 0.0385 | 0.0439 | -1.0 | -1.0 | 0.0640 | 0.1069 | 0.1069 | 0.1069 | -1.0 | -1.0 | [0.11788569390773773, 0.012564579956233501, 0.02252984419465065, 0.01985805667936802, 0.06335312873125076, 0.04805980622768402, 0.02310897968709469] | [0.23441119492053986, 0.03670033812522888, 0.057132378220558167, 0.045463282614946365, 0.14961948990821838, 0.12920109927654266, 0.09600923210382462] | [0, 1, 2, 3, 4, 5, 6] | | 2.5962 | 14.84 | 9500 | 2.6140 | 0.0463 | 0.0923 | 0.0405 | 0.0463 | -1.0 | -1.0 | 0.0672 | 0.1116 | 0.1116 | 0.1116 | -1.0 | -1.0 | [0.12200454622507095, 0.011156002059578896, 0.023570792749524117, 0.02225014939904213, 0.06721360236406326, 0.05404386296868324, 0.02417328953742981] | [0.2358311265707016, 0.04146730527281761, 0.059996526688337326, 0.05144782364368439, 0.1566450446844101, 0.13732057809829712, 0.09836826473474503] | [0, 1, 2, 3, 4, 5, 6] | | 2.5992 | 15.62 | 10000 | 2.6493 | 0.0478 | 0.0964 | 0.0416 | 0.0478 | -1.0 | -1.0 | 0.0701 | 0.1156 | 0.1156 | 0.1156 | -1.0 | -1.0 | [0.11906087398529053, 0.013203187845647335, 0.023804809898138046, 0.023666758090257645, 0.07104818522930145, 0.060628872364759445, 0.02351595088839531] | [0.2391698807477951, 0.045818183571100235, 0.06427393108606339, 0.055276814848184586, 0.1593150645494461, 0.1476859450340271, 0.09775582700967789] | [0, 1, 2, 3, 4, 5, 6] | | 2.6462 | 16.41 | 10500 | 2.5972 | 0.0497 | 0.0992 | 0.0443 | 0.0497 | -1.0 | -1.0 | 0.0725 | 0.1193 | 0.1193 | 0.1193 | -1.0 | -1.0 | [0.125301331281662, 0.015778768807649612, 0.02504820004105568, 0.025865115225315094, 0.0740809291601181, 0.058498892933130264, 0.02366984821856022] | [0.2426043450832367, 0.05085137113928795, 0.06543297320604324, 0.058691710233688354, 0.16227440536022186, 0.1550177037715912, 0.09991797804832458] | [0, 1, 2, 3, 4, 5, 6] | | 2.4706 | 17.19 | 11000 | 2.4914 | 0.0524 | 0.1040 | 0.0465 | 0.0524 | -1.0 | -1.0 | 0.0760 | 0.1238 | 0.1238 | 0.1238 | -1.0 | 
-1.0 | [0.12998996675014496, 0.015761325135827065, 0.02641410008072853, 0.028352826833724976, 0.07700449973344803, 0.06418990343809128, 0.02509688399732113] | [0.245647594332695, 0.055867768824100494, 0.06862436234951019, 0.06473734229803085, 0.16654212772846222, 0.16367392241954803, 0.10171778500080109] | [0, 1, 2, 3, 4, 5, 6] | | 2.4022 | 17.97 | 11500 | 2.6025 | 0.0538 | 0.1067 | 0.0488 | 0.0538 | -1.0 | -1.0 | 0.0779 | 0.1267 | 0.1267 | 0.1267 | -1.0 | -1.0 | [0.13130246102809906, 0.01669127121567726, 0.026978671550750732, 0.030373232439160347, 0.07948648184537888, 0.06546752899885178, 0.026632554829120636] | [0.24881651997566223, 0.05733860284090042, 0.0698235034942627, 0.06813599914312363, 0.16934683918952942, 0.1705354005098343, 0.10290295630693436] | [0, 1, 2, 3, 4, 5, 6] | | 2.3806 | 18.75 | 12000 | 2.4377 | 0.0565 | 0.1116 | 0.0510 | 0.0565 | -1.0 | -1.0 | 0.0809 | 0.1307 | 0.1307 | 0.1307 | -1.0 | -1.0 | [0.13539382815361023, 0.019166018813848495, 0.028394022956490517, 0.03170689195394516, 0.08289874345064163, 0.0700925663113594, 0.028070559725165367] | [0.2516007125377655, 0.06212121248245239, 0.07227034866809845, 0.07164071500301361, 0.17606544494628906, 0.1762741059064865, 0.1052769348025322] | [0, 1, 2, 3, 4, 5, 6] | | 2.3381 | 19.53 | 12500 | 2.3298 | 0.0595 | 0.1170 | 0.0539 | 0.0595 | -1.0 | -1.0 | 0.0841 | 0.1354 | 0.1354 | 0.1354 | -1.0 | -1.0 | [0.14168289303779602, 0.02123761549592018, 0.029218005016446114, 0.03483952209353447, 0.08439919352531433, 0.07521592825651169, 0.029624762013554573] | [0.254447877407074, 0.06652121245861053, 0.07486468553543091, 0.07665052264928818, 0.18379908800125122, 0.1837686002254486, 0.10760283470153809] | [0, 1, 2, 3, 4, 5, 6] | | 2.1834 | 20.31 | 13000 | 2.2925 | 0.0619 | 0.1218 | 0.0563 | 0.0619 | -1.0 | -1.0 | 0.0868 | 0.1392 | 0.1392 | 0.1392 | -1.0 | -1.0 | [0.14434954524040222, 0.023596107959747314, 0.029413597658276558, 0.03724164515733719, 0.08747584372758865, 0.07988733053207397, 0.031240329146385193] | 
[0.2572765052318573, 0.07151515036821365, 0.07733561843633652, 0.08106201887130737, 0.18765367567539215, 0.19011442363262177, 0.10954329371452332] | [0, 1, 2, 3, 4, 5, 6] | | 2.3694 | 21.09 | 13500 | 2.2980 | 0.0643 | 0.1250 | 0.0596 | 0.0643 | -1.0 | -1.0 | 0.0896 | 0.1431 | 0.1431 | 0.1431 | -1.0 | -1.0 | [0.14629203081130981, 0.02630077488720417, 0.03064255230128765, 0.03969859704375267, 0.08952046930789948, 0.08544820547103882, 0.032189808785915375] | [0.26002073287963867, 0.07739618420600891, 0.07984354346990585, 0.08463411778211594, 0.19238965213298798, 0.19605141878128052, 0.11159518361091614] | [0, 1, 2, 3, 4, 5, 6] | | 2.1876 | 21.88 | 14000 | 2.2552 | 0.0664 | 0.1299 | 0.0615 | 0.0664 | -1.0 | -1.0 | 0.0920 | 0.1466 | 0.1466 | 0.1466 | -1.0 | -1.0 | [0.14918148517608643, 0.029802532866597176, 0.030558602884411812, 0.04181119427084923, 0.0926012173295021, 0.088821642100811, 0.03230477496981621] | [0.2630963921546936, 0.08175324648618698, 0.08174210041761398, 0.08892733603715897, 0.19709719717502594, 0.20020660758018494, 0.11349327117204666] | [0, 1, 2, 3, 4, 5, 6] | | 2.22 | 22.66 | 14500 | 2.1956 | 0.0696 | 0.1349 | 0.0645 | 0.0696 | -1.0 | -1.0 | 0.0949 | 0.1506 | 0.1506 | 0.1506 | -1.0 | -1.0 | [0.1518896371126175, 0.033730532974004745, 0.03175903856754303, 0.044416192919015884, 0.09609058499336243, 0.09530041366815567, 0.03383677080273628] | [0.2656303942203522, 0.08649947494268417, 0.08476158231496811, 0.09331821650266647, 0.20168477296829224, 0.2064690738916397, 0.11557488888502121] | [0, 1, 2, 3, 4, 5, 6] | | 2.2025 | 23.44 | 15000 | 2.2126 | 0.0726 | 0.1407 | 0.0675 | 0.0726 | -1.0 | -1.0 | 0.0976 | 0.1543 | 0.1543 | 0.1543 | -1.0 | -1.0 | [0.15515004098415375, 0.03552235662937164, 0.03365291655063629, 0.0472525879740715, 0.09961769729852676, 0.09976397454738617, 0.037467099726200104] | [0.2683526277542114, 0.09155555814504623, 0.08699669688940048, 0.09794694185256958, 0.2053728997707367, 0.21250689029693604, 0.11715974658727646] | [0, 1, 2, 3, 4, 
5, 6] |
| 2.1936 | 24.22 | 15500 | 2.1662 | 0.0751 | 0.1462 | 0.0694 | 0.0751 | -1.0 | -1.0 | 0.1005 | 0.1580 | 0.1580 | 0.1580 | -1.0 | -1.0 | [0.15799090266227722, 0.03928203508257866, 0.034626252949237823, 0.048913635313510895, 0.1035238727927208, 0.10347861051559448, 0.03808673098683357] | [0.27103936672210693, 0.09616813063621521, 0.08940700441598892, 0.10285746306180954, 0.21057593822479248, 0.21770194172859192, 0.11854757368564606] | [0, 1, 2, 3, 4, 5, 6] |
| 2.2018 | 25.0 | 16000 | 2.2816 | 0.0772 | 0.1496 | 0.0717 | 0.0772 | -1.0 | -1.0 | 0.1024 | 0.1605 | 0.1605 | 0.1605 | -1.0 | -1.0 | [0.15951977670192719, 0.040796443819999695, 0.035473428666591644, 0.05021883174777031, 0.10709941387176514, 0.1077590137720108, 0.039515502750873566] | [0.2728523313999176, 0.09820076078176498, 0.09194513410329819, 0.10459558665752411, 0.21406963467597961, 0.22213326394557953, 0.11955104023218155] | [0, 1, 2, 3, 4, 5, 6] |
| 2.069 | 25.78 | 16500 | 2.2788 | 0.0799 | 0.1544 | 0.0744 | 0.0799 | -1.0 | -1.0 | 0.1048 | 0.1637 | 0.1637 | 0.1637 | -1.0 | -1.0 | [0.1624949425458908, 0.043188855051994324, 0.03846662491559982, 0.052888572216033936, 0.11016106605529785, 0.11159937083721161, 0.04036788269877434] | [0.27505558729171753, 0.10233241319656372, 0.095039501786232, 0.10908042639493942, 0.2170056700706482, 0.22714750468730927, 0.12026649713516235] | [0, 1, 2, 3, 4, 5, 6] |
| 2.2388 | 26.56 | 17000 | 2.1393 | 0.0822 | 0.1584 | 0.0767 | 0.0822 | -1.0 | -1.0 | 0.1072 | 0.1670 | 0.1670 | 0.1670 | -1.0 | -1.0 | [0.16449908912181854, 0.04439976066350937, 0.03944051265716553, 0.054946914315223694, 0.11398687958717346, 0.11732934415340424, 0.04113543778657913] | [0.27652451395988464, 0.10696969926357269, 0.09715589135885239, 0.11262975633144379, 0.22107172012329102, 0.23228001594543457, 0.12206925451755524] | [0, 1, 2, 3, 4, 5, 6] |
| 2.002 | 27.34 | 17500 | 2.1429 | 0.0848 | 0.1634 | 0.0789 | 0.0848 | -1.0 | -1.0 | 0.1097 | 0.1701 | 0.1701 | 0.1701 | -1.0 | -1.0 | [0.16622968018054962, 0.04853104427456856, 0.04015907645225525, 0.05710913985967636, 0.11814910173416138, 0.11969507485628128, 0.043414726853370667] | [0.2779178023338318, 0.11163636296987534, 0.09929750114679337, 0.11634206771850586, 0.22536203265190125, 0.2366705983877182, 0.12360689043998718] | [0, 1, 2, 3, 4, 5, 6] |
| 1.8305 | 28.12 | 18000 | 2.0416 | 0.0876 | 0.1682 | 0.0819 | 0.0876 | -1.0 | -1.0 | 0.1123 | 0.1736 | 0.1736 | 0.1736 | -1.0 | -1.0 | [0.1689627468585968, 0.05172034725546837, 0.04114536941051483, 0.06016615405678749, 0.12208609282970428, 0.12335427850484848, 0.04596796631813049] | [0.28000858426094055, 0.11762626469135284, 0.10105427354574203, 0.12081891298294067, 0.22976915538311005, 0.2406795173883438, 0.12553754448890686] | [0, 1, 2, 3, 4, 5, 6] |
| 1.9246 | 28.91 | 18500 | 2.0548 | 0.0902 | 0.1726 | 0.0845 | 0.0902 | -1.0 | -1.0 | 0.1147 | 0.1769 | 0.1769 | 0.1769 | -1.0 | -1.0 | [0.17139744758605957, 0.05572446435689926, 0.04115204140543938, 0.06324263662099838, 0.1263333410024643, 0.12660689651966095, 0.04671153053641319] | [0.28168630599975586, 0.12232596427202225, 0.10400499403476715, 0.12460488080978394, 0.23404911160469055, 0.24453875422477722, 0.12732824683189392] | [0, 1, 2, 3, 4, 5, 6] |
| 2.0712 | 29.69 | 19000 | 2.0686 | 0.0927 | 0.1770 | 0.0870 | 0.0927 | -1.0 | -1.0 | 0.1169 | 0.1801 | 0.1801 | 0.1801 | -1.0 | -1.0 | [0.1739092618227005, 0.0565057098865509, 0.04212590306997299, 0.06619900465011597, 0.13020069897174835, 0.13175153732299805, 0.048211514949798584] | [0.28365930914878845, 0.1268102079629898, 0.1058841422200203, 0.12817338109016418, 0.2378995418548584, 0.24880382418632507, 0.12912601232528687] | [0, 1, 2, 3, 4, 5, 6] |
| 1.8114 | 30.47 | 19500 | 2.0090 | 0.0957 | 0.1815 | 0.0893 | 0.0957 | -1.0 | -1.0 | 0.1192 | 0.1832 | 0.1832 | 0.1832 | -1.0 | -1.0 | [0.17668937146663666, 0.060196880251169205, 0.04316006973385811, 0.06869190186262131, 0.134142205119133, 0.13747332990169525, 0.049689896404743195] | [0.2853875756263733, 0.13062937557697296, 0.10824659466743469, 0.13195812702178955, 0.24280528724193573, 0.25331637263298035, 0.13033278286457062] | [0, 1, 2, 3, 4, 5, 6] |
| 1.8545 | 31.25 | 20000 | 1.8640 | 0.1000 | 0.1882 | 0.0943 | 0.1000 | -1.0 | -1.0 | 0.1220 | 0.1871 | 0.1871 | 0.1871 | -1.0 | -1.0 | [0.17970405519008636, 0.06414728611707687, 0.04569493606686592, 0.07412267476320267, 0.13957497477531433, 0.1450406312942505, 0.05184954032301903] | [0.28745174407958984, 0.13636364042758942, 0.11039604246616364, 0.13654844462871552, 0.2473972588777542, 0.25890496373176575, 0.13267983496189117] | [0, 1, 2, 3, 4, 5, 6] |
| 1.8498 | 32.03 | 20500 | 1.8956 | 0.1032 | 0.1936 | 0.0969 | 0.1032 | -1.0 | -1.0 | 0.1244 | 0.1906 | 0.1906 | 0.1906 | -1.0 | -1.0 | [0.18263494968414307, 0.06708105653524399, 0.046513475477695465, 0.07773694396018982, 0.14296144247055054, 0.15190638601779938, 0.053776174783706665] | [0.2898060083389282, 0.14060606062412262, 0.11237623542547226, 0.14104144275188446, 0.25213274359703064, 0.26371699571609497, 0.13437862694263458] | [0, 1, 2, 3, 4, 5, 6] |
| 1.7807 | 32.81 | 21000 | 1.8849 | 0.1058 | 0.1978 | 0.0996 | 0.1058 | -1.0 | -1.0 | 0.1268 | 0.1937 | 0.1937 | 0.1937 | -1.0 | -1.0 | [0.18508057296276093, 0.07147838175296783, 0.04737703502178192, 0.07961855083703995, 0.1454845517873764, 0.15771792829036713, 0.05387093871831894] | [0.29202979803085327, 0.14513708651065826, 0.11443108320236206, 0.14490030705928802, 0.25490322709083557, 0.26835891604423523, 0.13622570037841797] | [0, 1, 2, 3, 4, 5, 6] |
| 1.7693 | 33.59 | 21500 | 1.8167 | 0.1082 | 0.2027 | 0.1019 | 0.1082 | -1.0 | -1.0 | 0.1290 | 0.1967 | 0.1967 | 0.1967 | -1.0 | -1.0 | [0.18606284260749817, 0.07484107464551926, 0.04835912957787514, 0.08157777786254883, 0.14935535192489624, 0.16259758174419403, 0.05479986220598221] | [0.2938022017478943, 0.14931641519069672, 0.11614859104156494, 0.148426815867424, 0.2591802179813385, 0.27255430817604065, 0.13770410418510437] | [0, 1, 2, 3, 4, 5, 6] |
| 1.7453 | 34.38 | 22000 | 1.8090 | 0.1113 | 0.2077 | 0.1052 | 0.1113 | -1.0 | -1.0 | 0.1313 | 0.1999 | 0.1999 | 0.1999 | -1.0 | -1.0 | [0.18891556560993195, 0.0787239521741867, 0.04937181994318962, 0.08449568599462509, 0.15322794020175934, 0.1689060628414154, 0.05579029396176338] | [0.2957901954650879, 0.15378788113594055, 0.11824557185173035, 0.15204466879367828, 0.2632420063018799, 0.27685949206352234, 0.1396472305059433] | [0, 1, 2, 3, 4, 5, 6] |
| 1.8001 | 35.16 | 22500 | 1.7838 | 0.1146 | 0.2127 | 0.1088 | 0.1146 | -1.0 | -1.0 | 0.1335 | 0.2030 | 0.2030 | 0.2030 | -1.0 | -1.0 | [0.19174334406852722, 0.08240403980016708, 0.05114790424704552, 0.08605141937732697, 0.15808245539665222, 0.1754416525363922, 0.057301394641399384] | [0.2980480492115021, 0.1578047126531601, 0.120700404047966, 0.1550634354352951, 0.2666260898113251, 0.28123047947883606, 0.14118653535842896] | [0, 1, 2, 3, 4, 5, 6] |
| 1.7384 | 35.94 | 23000 | 1.8036 | 0.1174 | 0.2165 | 0.1119 | 0.1174 | -1.0 | -1.0 | 0.1356 | 0.2059 | 0.2059 | 0.2059 | -1.0 | -1.0 | [0.1949423998594284, 0.08474200963973999, 0.05296053737401962, 0.088404580950737, 0.1613713502883911, 0.18075096607208252, 0.058822985738515854] | [0.300113320350647, 0.1617918312549591, 0.12262161076068878, 0.15814653038978577, 0.2703295648097992, 0.28532159328460693, 0.14286154508590698] | [0, 1, 2, 3, 4, 5, 6] |
| 1.7308 | 36.72 | 23500 | 1.7081 | 0.1206 | 0.2215 | 0.1155 | 0.1206 | -1.0 | -1.0 | 0.1379 | 0.2090 | 0.2090 | 0.2090 | -1.0 | -1.0 | [0.19688330590724945, 0.0874597355723381, 0.05458848550915718, 0.0911809653043747, 0.16641899943351746, 0.1867956668138504, 0.06093086674809456] | [0.30188122391700745, 0.16584138572216034, 0.12504038214683533, 0.16180519759655, 0.27433207631111145, 0.2896430492401123, 0.14437904953956604] | [0, 1, 2, 3, 4, 5, 6] |
| 1.6386 | 37.5 | 24000 | 1.7598 | 0.1241 | 0.2253 | 0.1199 | 0.1241 | -1.0 | -1.0 | 0.1401 | 0.2120 | 0.2120 | 0.2120 | -1.0 | -1.0 | [0.19938696920871735, 0.09302081167697906, 0.05561578646302223, 0.09631506353616714, 0.17022699117660522, 0.19142889976501465, 0.0624491423368454] | [0.3034467399120331, 0.17077019810676575, 0.12714865803718567, 0.16521769762039185, 0.27807268500328064, 0.2932679057121277, 0.1461668312549591] | [0, 1, 2, 3, 4, 5, 6] |
| 1.7212 | 38.28 | 24500 | 1.7361 | 0.1267 | 0.2303 | 0.1225 | 0.1267 | -1.0 | -1.0 | 0.1420 | 0.2149 | 0.2149 | 0.2149 | -1.0 | -1.0 | [0.20129698514938354, 0.09607404470443726, 0.05763624981045723, 0.09819900244474411, 0.17313453555107117, 0.19651460647583008, 0.06392311304807663] | [0.3053010106086731, 0.17469388246536255, 0.12949080765247345, 0.16820846498012543, 0.28130650520324707, 0.29748693108558655, 0.14794781804084778] | [0, 1, 2, 3, 4, 5, 6] |
| 1.5007 | 39.06 | 25000 | 1.7068 | 0.1294 | 0.2337 | 0.1254 | 0.1294 | -1.0 | -1.0 | 0.1439 | 0.2176 | 0.2176 | 0.2176 | -1.0 | -1.0 | [0.20381218194961548, 0.09927998483181, 0.05934193357825279, 0.10101280361413956, 0.17670287191867828, 0.20033632218837738, 0.06550956517457962] | [0.30728572607040405, 0.17846061289310455, 0.13128052651882172, 0.17155016958713531, 0.2846940755844116, 0.3004628121852875, 0.14973454177379608] | [0, 1, 2, 3, 4, 5, 6] |
| 1.6147 | 39.84 | 25500 | 1.7262 | 0.1321 | 0.2381 | 0.1289 | 0.1321 | -1.0 | -1.0 | 0.1460 | 0.2205 | 0.2205 | 0.2205 | -1.0 | -1.0 | [0.20637443661689758, 0.10149955004453659, 0.06139121204614639, 0.101739302277565, 0.18070530891418457, 0.2046823352575302, 0.06848781555891037] | [0.30909985303878784, 0.18267379701137543, 0.13362130522727966, 0.17453694343566895, 0.28829798102378845, 0.3038405478000641, 0.15139558911323547] | [0, 1, 2, 3, 4, 5, 6] |
| 1.5141 | 40.62 | 26000 | 1.7278 | 0.1348 | 0.2422 | 0.1314 | 0.1348 | -1.0 | -1.0 | 0.1479 | 0.2231 | 0.2231 | 0.2231 | -1.0 | -1.0 | [0.2083674520254135, 0.10427972674369812, 0.06303983926773071, 0.10368069261312485, 0.18451228737831116, 0.21048979461193085, 0.06922110170125961] | [0.31071242690086365, 0.1856759935617447, 0.13551345467567444, 0.1775086522102356, 0.2913241982460022, 0.3082008957862854, 0.15268296003341675] | [0, 1, 2, 3, 4, 5, 6] |
| 1.4385 | 41.41 | 26500 | 1.6737 | 0.1371 | 0.2454 | 0.1348 | 0.1371 | -1.0 | -1.0 | 0.1497 | 0.2256 | 0.2256 | 0.2256 | -1.0 | -1.0 | [0.21084575355052948, 0.10635006427764893, 0.06430502980947495, 0.10656581819057465, 0.18613353371620178, 0.21510617434978485, 0.07017965614795685] | [0.3124080300331116, 0.18867924809455872, 0.13702596724033356, 0.18037474155426025, 0.2946583926677704, 0.3123031258583069, 0.15406128764152527] | [0, 1, 2, 3, 4, 5, 6] |
| 1.4519 | 42.19 | 27000 | 1.6628 | 0.1396 | 0.2499 | 0.1371 | 0.1396 | -1.0 | -1.0 | 0.1514 | 0.2281 | 0.2281 | 0.2281 | -1.0 | -1.0 | [0.21279378235340118, 0.10987628996372223, 0.06571341305971146, 0.1084621325135231, 0.1897042989730835, 0.21943020820617676, 0.07122423499822617] | [0.3142070770263672, 0.1921885460615158, 0.13871164619922638, 0.18352556228637695, 0.29725182056427, 0.3155188262462616, 0.15546737611293793] | [0, 1, 2, 3, 4, 5, 6] |
| 1.538 | 42.97 | 27500 | 1.6760 | 0.1427 | 0.2541 | 0.1399 | 0.1427 | -1.0 | -1.0 | 0.1532 | 0.2308 | 0.2308 | 0.2308 | -1.0 | -1.0 | [0.21556733548641205, 0.11342202872037888, 0.06732655316591263, 0.11160027235746384, 0.19317805767059326, 0.22488607466220856, 0.07287413626909256] | [0.3162986934185028, 0.19509641826152802, 0.1405220478773117, 0.1862032115459442, 0.30072230100631714, 0.31980466842651367, 0.15693101286888123] | [0, 1, 2, 3, 4, 5, 6] |
| 1.6061 | 43.75 | 28000 | 1.6461 | 0.1455 | 0.2585 | 0.1434 | 0.1455 | -1.0 | -1.0 | 0.1551 | 0.2334 | 0.2334 | 0.2334 | -1.0 | -1.0 | [0.2179284244775772, 0.11804981529712677, 0.0686318427324295, 0.11424344778060913, 0.196621835231781, 0.22896119952201843, 0.07397373765707016] | [0.3179588317871094, 0.19867965579032898, 0.14226190745830536, 0.18921156227588654, 0.30400359630584717, 0.32350945472717285, 0.1582772433757782] | [0, 1, 2, 3, 4, 5, 6] |
| 1.5999 | 44.53 | 28500 | 1.6499 | 0.1482 | 0.2614 | 0.1461 | 0.1482 | -1.0 | -1.0 | 0.1567 | 0.2357 | 0.2357 | 0.2357 | -1.0 | -1.0 | [0.2203698754310608, 0.12020596116781235, 0.07041482627391815, 0.11792736500501633, 0.19977207481861115, 0.2334306240081787, 0.07521776854991913] | [0.31949469447135925, 0.2019670456647873, 0.14413757622241974, 0.19179263710975647, 0.30652087926864624, 0.3264607787132263, 0.15956202149391174] | [0, 1, 2, 3, 4, 5, 6] |
| 1.4135 | 45.31 | 29000 | 1.5887 | 0.1509 | 0.2652 | 0.1491 | 0.1509 | -1.0 | -1.0 | 0.1585 | 0.2383 | 0.2383 | 0.2383 | -1.0 | -1.0 | [0.22293353080749512, 0.12379045784473419, 0.07232733815908432, 0.12009650468826294, 0.20234964787960052, 0.23802727460861206, 0.07673442363739014] | [0.32130709290504456, 0.20496343076229095, 0.1461704820394516, 0.194487527012825, 0.3094630837440491, 0.33020803332328796, 0.16118156909942627] | [0, 1, 2, 3, 4, 5, 6] |
| 1.3849 | 46.09 | 29500 | 1.5587 | 0.1534 | 0.2696 | 0.1529 | 0.1534 | -1.0 | -1.0 | 0.1603 | 0.2409 | 0.2409 | 0.2409 | -1.0 | -1.0 | [0.22550000250339508, 0.126255065202713, 0.07389067113399506, 0.12276104837656021, 0.20572707056999207, 0.24077364802360535, 0.07866828143596649] | [0.323210209608078, 0.20810477435588837, 0.14789114892482758, 0.19780071079730988, 0.313141405582428, 0.3335341215133667, 0.16273246705532074] | [0, 1, 2, 3, 4, 5, 6] |
| 1.3678 | 46.88 | 30000 | 1.6096 | 0.1558 | 0.2722 | 0.1555 | 0.1558 | -1.0 | -1.0 | 0.1620 | 0.2433 | 0.2433 | 0.2433 | -1.0 | -1.0 | [0.22747229039669037, 0.12818685173988342, 0.07569180428981781, 0.12502236664295197, 0.20901745557785034, 0.24479806423187256, 0.08022068440914154] | [0.3249678313732147, 0.21116161346435547, 0.14949944615364075, 0.20055362582206726, 0.31560882925987244, 0.33657023310661316, 0.16441743075847626] | [0, 1, 2, 3, 4, 5, 6] |
| 1.5416 | 47.66 | 30500 | 1.5705 | 0.1582 | 0.2763 | 0.1582 | 0.1582 | -1.0 | -1.0 | 0.1636 | 0.2456 | 0.2456 | 0.2456 | -1.0 | -1.0 | [0.22910533845424652, 0.13224217295646667, 0.07711625099182129, 0.1279756724834442, 0.2114134281873703, 0.24824558198451996, 0.08149348199367523] | [0.32661086320877075, 0.21458519995212555, 0.1511010080575943, 0.20336377620697021, 0.31837713718414307, 0.33922234177589417, 0.1660820096731186] | [0, 1, 2, 3, 4, 5, 6] |
| 1.5698 | 48.44 | 31000 | 1.5384 | 0.1607 | 0.2796 | 0.1615 | 0.1607 | -1.0 | -1.0 | 0.1652 | 0.2480 | 0.2480 | 0.2480 | -1.0 | -1.0 | [0.23130907118320465, 0.1348022222518921, 0.07854016125202179, 0.13033372163772583, 0.21463152766227722, 0.2524957060813904, 0.08308678865432739] | [0.3283239006996155, 0.2169794738292694, 0.15278132259845734, 0.20617814362049103, 0.32176315784454346, 0.34209543466567993, 0.1676422506570816] | [0, 1, 2, 3, 4, 5, 6] |
| 1.4281 | 49.22 | 31500 | 1.5423 | 0.1635 | 0.2828 | 0.1651 | 0.1635 | -1.0 | -1.0 | 0.1669 | 0.2503 | 0.2503 | 0.2503 | -1.0 | -1.0 | [0.2337646633386612, 0.13741059601306915, 0.0802868977189064, 0.1331375390291214, 0.2190386801958084, 0.2564108967781067, 0.0843500867486] | [0.33013421297073364, 0.2201346755027771, 0.15410970151424408, 0.2087274193763733, 0.32478800415992737, 0.3452184200286865, 0.1692124605178833] | [0, 1, 2, 3, 4, 5, 6] |
| 1.5232 | 50.0 | 32000 | 1.5577 | 0.1656 | 0.2860 | 0.1669 | 0.1656 | -1.0 | -1.0 | 0.1684 | 0.2525 | 0.2525 | 0.2525 | -1.0 | -1.0 | [0.23572662472724915, 0.1402345597743988, 0.08165154606103897, 0.1347123384475708, 0.2218606323003769, 0.25896695256233215, 0.08578291535377502] | [0.3317522406578064, 0.22330492734909058, 0.15557962656021118, 0.21121864020824432, 0.32721176743507385, 0.3473915159702301, 0.1707620918750763] | [0, 1, 2, 3, 4, 5, 6] |
| 1.4718 | 50.78 | 32500 | 1.5329 | 0.1679 | 0.2890 | 0.1697 | 0.1679 | -1.0 | -1.0 | 0.1700 | 0.2547 | 0.2547 | 0.2547 | -1.0 | -1.0 | [0.23852567374706268, 0.1429818868637085, 0.08330734819173813, 0.13681721687316895, 0.22413747012615204, 0.26250723004341125, 0.08725880831480026] | [0.3335283696651459, 0.22599533200263977, 0.15709824860095978, 0.2133883386850357, 0.3300948441028595, 0.3500826358795166, 0.1724136918783188] | [0, 1, 2, 3, 4, 5, 6] |
| 1.5368 | 51.56 | 33000 | 1.5226 | 0.1703 | 0.2917 | 0.1724 | 0.1703 | -1.0 | -1.0 | 0.1714 | 0.2567 | 0.2567 | 0.2567 | -1.0 | -1.0 | [0.24014446139335632, 0.14406023919582367, 0.08496519178152084, 0.13978202641010284, 0.22792623937129974, 0.2665834128856659, 0.08841584622859955] | [0.3352448344230652, 0.22808080911636353, 0.15874338150024414, 0.21567054092884064, 0.3328075408935547, 0.352955162525177, 0.17364373803138733] | [0, 1, 2, 3, 4, 5, 6] |
| 1.3778 | 52.34 | 33500 | 1.4644 | 0.1729 | 0.2959 | 0.1754 | 0.1729 | -1.0 | -1.0 | 0.1730 | 0.2591 | 0.2591 | 0.2591 | -1.0 | -1.0 | [0.2424972951412201, 0.14849114418029785, 0.08599013835191727, 0.14239133894443512, 0.23136775195598602, 0.26967743039131165, 0.08995284140110016] | [0.33695903420448303, 0.2317141592502594, 0.1604699343442917, 0.21840624511241913, 0.3351666331291199, 0.3555569350719452, 0.1751621812582016] | [0, 1, 2, 3, 4, 5, 6] |
| 1.2753 | 53.12 | 34000 | 1.4789 | 0.1753 | 0.2984 | 0.1786 | 0.1753 | -1.0 | -1.0 | 0.1746 | 0.2614 | 0.2614 | 0.2614 | -1.0 | -1.0 | [0.2448333203792572, 0.15089264512062073, 0.08769001066684723, 0.1446457952260971, 0.2337132841348648, 0.27418655157089233, 0.0914132297039032] | [0.33877894282341003, 0.23455436527729034, 0.16229130327701569, 0.22096478939056396, 0.3380137085914612, 0.35878705978393555, 0.17660170793533325] | [0, 1, 2, 3, 4, 5, 6] |
| 1.3929 | 53.91 | 34500 | 1.4726 | 0.1776 | 0.3009 | 0.1813 | 0.1776 | -1.0 | -1.0 | 0.1761 | 0.2635 | 0.2635 | 0.2635 | -1.0 | -1.0 | [0.24712836742401123, 0.15434853732585907, 0.08864349126815796, 0.14744579792022705, 0.23680062592029572, 0.27693623304367065, 0.09224053472280502] | [0.340232789516449, 0.23757575452327728, 0.1638350784778595, 0.22334888577461243, 0.34065911173820496, 0.3610851466655731, 0.17764268815517426] | [0, 1, 2, 3, 4, 5, 6] |
| 1.3218 | 54.69 | 35000 | 1.4250 | 0.1802 | 0.3055 | 0.1838 | 0.1802 | -1.0 | -1.0 | 0.1775 | 0.2656 | 0.2656 | 0.2656 | -1.0 | -1.0 | [0.2493417114019394, 0.15740056335926056, 0.08998408913612366, 0.15052294731140137, 0.2398378998041153, 0.2808479070663452, 0.09363849461078644] | [0.3417774438858032, 0.24019047617912292, 0.16550683975219727, 0.22571922838687897, 0.34335291385650635, 0.36390790343284607, 0.17897234857082367] | [0, 1, 2, 3, 4, 5, 6] |
| 1.3176 | 55.47 | 35500 | 1.4067 | 0.1827 | 0.3080 | 0.1869 | 0.1827 | -1.0 | -1.0 | 0.1790 | 0.2678 | 0.2678 | 0.2678 | -1.0 | -1.0 | [0.2508467435836792, 0.16055212914943695, 0.09181094914674759, 0.15233220160007477, 0.24301283061504364, 0.285194456577301, 0.09526713937520981] | [0.34326499700546265, 0.24291080236434937, 0.1671036183834076, 0.22774988412857056, 0.3466396629810333, 0.36645326018333435, 0.1804557889699936] | [0, 1, 2, 3, 4, 5, 6] |
| 1.22 | 56.25 | 36000 | 1.4432 | 0.1848 | 0.3107 | 0.1893 | 0.1848 | -1.0 | -1.0 | 0.1805 | 0.2699 | 0.2699 | 0.2699 | -1.0 | -1.0 | [0.25353720784187317, 0.16321063041687012, 0.09297779947519302, 0.15274809300899506, 0.2463236004114151, 0.2882259488105774, 0.09666171669960022] | [0.3447769284248352, 0.24560606479644775, 0.16864915192127228, 0.2300509363412857, 0.3490169942378998, 0.3692837357521057, 0.1819247454404831] | [0, 1, 2, 3, 4, 5, 6] |
| 1.2624 | 57.03 | 36500 | 1.4534 | 0.1872 | 0.3143 | 0.1927 | 0.1872 | -1.0 | -1.0 | 0.1820 | 0.2720 | 0.2720 | 0.2720 | -1.0 | -1.0 | [0.2553740441799164, 0.16707704961299896, 0.09462162852287292, 0.1561412215232849, 0.24870790541172028, 0.2907980680465698, 0.09786700457334518] | [0.34625932574272156, 0.24872560799121857, 0.1702970266342163, 0.2324216663837433, 0.3513542115688324, 0.37167438864707947, 0.1831466555595398] | [0, 1, 2, 3, 4, 5, 6] |
| 1.2334 | 57.81 | 37000 | 1.4128 | 0.1897 | 0.3171 | 0.1957 | 0.1897 | -1.0 | -1.0 | 0.1833 | 0.2740 | 0.2740 | 0.2740 | -1.0 | -1.0 | [0.25749534368515015, 0.17012159526348114, 0.0956001728773117, 0.15746240317821503, 0.25159570574760437, 0.2954738736152649, 0.1000288873910904] | [0.34803688526153564, 0.2508763372898102, 0.1717308908700943, 0.23444309830665588, 0.3537949025630951, 0.37444716691970825, 0.18453818559646606] | [0, 1, 2, 3, 4, 5, 6] |
| 1.1761 | 58.59 | 37500 | 1.3650 | 0.1918 | 0.3191 | 0.1981 | 0.1918 | -1.0 | -1.0 | 0.1847 | 0.2761 | 0.2761 | 0.2761 | -1.0 | -1.0 | [0.25979435443878174, 0.17184627056121826, 0.0965074971318245, 0.15925957262516022, 0.25439971685409546, 0.29917991161346436, 0.10163913667201996] | [0.3496769666671753, 0.2532282769680023, 0.1731221079826355, 0.23651672899723053, 0.35657837986946106, 0.3775647282600403, 0.18600067496299744] | [0, 1, 2, 3, 4, 5, 6] |
| 1.1746 | 59.38 | 38000 | 1.3084 | 0.1940 | 0.3216 | 0.2005 | 0.1940 | -1.0 | -1.0 | 0.1861 | 0.2782 | 0.2782 | 0.2782 | -1.0 | -1.0 | [0.2614623010158539, 0.1746634691953659, 0.0982770100235939, 0.16152290999889374, 0.2570221424102783, 0.30207931995391846, 0.10326719284057617] | [0.3511837124824524, 0.25599682331085205, 0.1749001145362854, 0.2385540008544922, 0.3590543270111084, 0.3800782859325409, 0.1875486522912979] | [0, 1, 2, 3, 4, 5, 6] |
| 1.124 | 60.16 | 38500 | 1.3802 | 0.1959 | 0.3251 | 0.2027 | 0.1959 | -1.0 | -1.0 | 0.1873 | 0.2801 | 0.2801 | 0.2801 | -1.0 | -1.0 | [0.2630487084388733, 0.17768719792366028, 0.09916898608207703, 0.16377225518226624, 0.25905221700668335, 0.30371031165122986, 0.10464335232973099] | [0.3525422513484955, 0.2583392262458801, 0.1763812154531479, 0.2407900094985962, 0.3612228035926819, 0.38215091824531555, 0.18898142874240875] | [0, 1, 2, 3, 4, 5, 6] |
| 1.2122 | 60.94 | 39000 | 1.3704 | 0.1978 | 0.3277 | 0.2051 | 0.1978 | -1.0 | -1.0 | 0.1886 | 0.2819 | 0.2819 | 0.2819 | -1.0 | -1.0 | [0.2640724182128906, 0.17990341782569885, 0.10043416917324066, 0.16539953649044037, 0.2615806758403778, 0.30746400356292725, 0.10585416853427887] | [0.3538845181465149, 0.2604351341724396, 0.17792586982250214, 0.24284890294075012, 0.36360496282577515, 0.3844670355319977, 0.1903436928987503] | [0, 1, 2, 3, 4, 5, 6] |
| 1.1796 | 61.72 | 39500 | 1.3533 | 0.1995 | 0.3304 | 0.2074 | 0.1995 | -1.0 | -1.0 | 0.1898 | 0.2837 | 0.2837 | 0.2837 | -1.0 | -1.0 | [0.2663557529449463, 0.18150398135185242, 0.10159678012132645, 0.1672067791223526, 0.26377254724502563, 0.3099142014980316, 0.106254942715168] | [0.3554444909095764, 0.26238587498664856, 0.17931236326694489, 0.24486443400382996, 0.3654124140739441, 0.38674548268318176, 0.1915804147720337] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0856 | 62.5 | 40000 | 1.3118 | 0.2016 | 0.3322 | 0.2100 | 0.2016 | -1.0 | -1.0 | 0.1910 | 0.2856 | 0.2856 | 0.2856 | -1.0 | -1.0 | [0.26789969205856323, 0.18293805420398712, 0.1026647612452507, 0.16959702968597412, 0.26639893651008606, 0.31403395533561707, 0.10793528705835342] | [0.3571368157863617, 0.2641363739967346, 0.1811014860868454, 0.2470199018716812, 0.36790525913238525, 0.389245867729187, 0.19290399551391602] | [0, 1, 2, 3, 4, 5, 6] |
| 1.1677 | 63.28 | 40500 | 1.3136 | 0.2039 | 0.3354 | 0.2132 | 0.2039 | -1.0 | -1.0 | 0.1924 | 0.2876 | 0.2876 | 0.2876 | -1.0 | -1.0 | [0.2693575620651245, 0.18515463173389435, 0.10445524752140045, 0.1724829226732254, 0.26850834488868713, 0.31805768609046936, 0.1095009371638298] | [0.3586777150630951, 0.26634493470191956, 0.18248583376407623, 0.24928018450737, 0.37000393867492676, 0.391990602016449, 0.19435375928878784] | [0, 1, 2, 3, 4, 5, 6] |
| 1.155 | 64.06 | 41000 | 1.2842 | 0.2064 | 0.3380 | 0.2153 | 0.2064 | -1.0 | -1.0 | 0.1938 | 0.2897 | 0.2897 | 0.2897 | -1.0 | -1.0 | [0.27208176255226135, 0.18823444843292236, 0.10572110861539841, 0.17518718540668488, 0.2712228298187256, 0.32101067900657654, 0.11111129820346832] | [0.3603164255619049, 0.26862528920173645, 0.18374185264110565, 0.2518904507160187, 0.37252476811408997, 0.39505138993263245, 0.19568908214569092] | [0, 1, 2, 3, 4, 5, 6] |
| 1.1489 | 64.84 | 41500 | 1.2779 | 0.2084 | 0.3405 | 0.2177 | 0.2084 | -1.0 | -1.0 | 0.1952 | 0.2917 | 0.2917 | 0.2917 | -1.0 | -1.0 | [0.27354657649993896, 0.19084295630455017, 0.10690724849700928, 0.1768673211336136, 0.27369409799575806, 0.3242957592010498, 0.11255022883415222] | [0.361765593290329, 0.27128878235816956, 0.18515248596668243, 0.25417518615722656, 0.3749188482761383, 0.39749079942703247, 0.19700443744659424] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0019 | 65.62 | 42000 | 1.2343 | 0.2106 | 0.3430 | 0.2213 | 0.2106 | -1.0 | -1.0 | 0.1965 | 0.2937 | 0.2937 | 0.2937 | -1.0 | -1.0 | [0.27533525228500366, 0.19294942915439606, 0.1081010177731514, 0.1796436905860901, 0.27713286876678467, 0.3272176682949066, 0.1141667366027832] | [0.3632113039493561, 0.2739033102989197, 0.18647846579551697, 0.2564755380153656, 0.37761470675468445, 0.39968517422676086, 0.19862498342990875] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0393 | 66.41 | 42500 | 1.2497 | 0.2127 | 0.3459 | 0.2233 | 0.2127 | -1.0 | -1.0 | 0.1977 | 0.2956 | 0.2956 | 0.2956 | -1.0 | -1.0 | [0.2769603431224823, 0.19543510675430298, 0.10956516116857529, 0.1815720647573471, 0.27938321232795715, 0.33071368932724, 0.11541370302438736] | [0.36470019817352295, 0.2760855555534363, 0.18784119188785553, 0.2584652900695801, 0.3799194097518921, 0.4020807147026062, 0.19989511370658875] | [0, 1, 2, 3, 4, 5, 6] |
| 1.1978 | 67.19 | 43000 | 1.2584 | 0.2149 | 0.3485 | 0.2259 | 0.2149 | -1.0 | -1.0 | 0.1990 | 0.2975 | 0.2975 | 0.2975 | -1.0 | -1.0 | [0.2787444293498993, 0.1985068917274475, 0.11107343435287476, 0.18374863266944885, 0.28199175000190735, 0.3330363929271698, 0.11708976328372955] | [0.36612531542778015, 0.2782452404499054, 0.18934492766857147, 0.26068639755249023, 0.3821493089199066, 0.4043148159980774, 0.2014266848564148] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0198 | 67.97 | 43500 | 1.1981 | 0.2169 | 0.3505 | 0.2282 | 0.2169 | -1.0 | -1.0 | 0.2003 | 0.2994 | 0.2994 | 0.2994 | -1.0 | -1.0 | [0.2804291248321533, 0.20049406588077545, 0.11303915083408356, 0.1850598007440567, 0.28519004583358765, 0.3359755575656891, 0.1184569001197815] | [0.36760973930358887, 0.2804458439350128, 0.19099237024784088, 0.2626854479312897, 0.3847373127937317, 0.4067065715789795, 0.20275768637657166] | [0, 1, 2, 3, 4, 5, 6] |
| 1.1246 | 68.75 | 44000 | 1.2557 | 0.2190 | 0.3533 | 0.2312 | 0.2190 | -1.0 | -1.0 | 0.2015 | 0.3013 | 0.3013 | 0.3013 | -1.0 | -1.0 | [0.2819073498249054, 0.2019958347082138, 0.1145104244351387, 0.18783636391162872, 0.28763484954833984, 0.33935362100601196, 0.11983713507652283] | [0.3688904047012329, 0.28254133462905884, 0.19248612225055695, 0.2649693191051483, 0.38717830181121826, 0.40896880626678467, 0.20419429242610931] | [0, 1, 2, 3, 4, 5, 6] |
| 1.1038 | 69.53 | 44500 | 1.2243 | 0.2209 | 0.3559 | 0.2329 | 0.2209 | -1.0 | -1.0 | 0.2027 | 0.3030 | 0.3030 | 0.3030 | -1.0 | -1.0 | [0.283906489610672, 0.20419622957706451, 0.11568263173103333, 0.18973909318447113, 0.2897682785987854, 0.34185564517974854, 0.12127756327390671] | [0.3701639771461487, 0.2847599685192108, 0.1938461810350418, 0.26674312353134155, 0.3892001509666443, 0.41108739376068115, 0.2054164856672287] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0437 | 70.31 | 45000 | 1.2413 | 0.2229 | 0.3578 | 0.2348 | 0.2229 | -1.0 | -1.0 | 0.2038 | 0.3049 | 0.3049 | 0.3049 | -1.0 | -1.0 | [0.285703182220459, 0.20658300817012787, 0.11733826994895935, 0.19167673587799072, 0.2921169698238373, 0.3445843458175659, 0.12240643799304962] | [0.3715958893299103, 0.28679460287094116, 0.19535203278064728, 0.26880815625190735, 0.391522079706192, 0.41324150562286377, 0.2066948115825653] | [0, 1, 2, 3, 4, 5, 6] |
| 0.9239 | 71.09 | 45500 | 1.2163 | 0.2252 | 0.3599 | 0.2373 | 0.2252 | -1.0 | -1.0 | 0.2050 | 0.3067 | 0.3067 | 0.3067 | -1.0 | -1.0 | [0.2875484824180603, 0.20982173085212708, 0.11878927052021027, 0.194691464304924, 0.2946434020996094, 0.3480072617530823, 0.12267839908599854] | [0.3730217516422272, 0.28887778520584106, 0.19683022797107697, 0.27080878615379333, 0.3938581943511963, 0.4157206416130066, 0.20803745090961456] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0799 | 71.88 | 46000 | 1.2077 | 0.2275 | 0.3624 | 0.2407 | 0.2275 | -1.0 | -1.0 | 0.2063 | 0.3086 | 0.3086 | 0.3086 | -1.0 | -1.0 | [0.28902295231819153, 0.2118113934993744, 0.12033383548259735, 0.19715572893619537, 0.2974853217601776, 0.3516872823238373, 0.12515637278556824] | [0.37449219822883606, 0.2907114624977112, 0.19832651317119598, 0.27295395731925964, 0.3961236774921417, 0.41836148500442505, 0.20927491784095764] | [0, 1, 2, 3, 4, 5, 6] |
| 0.9485 | 72.66 | 46500 | 1.1767 | 0.2294 | 0.3644 | 0.2431 | 0.2294 | -1.0 | -1.0 | 0.2075 | 0.3105 | 0.3105 | 0.3105 | -1.0 | -1.0 | [0.2905147075653076, 0.21348299086093903, 0.12186509370803833, 0.19925419986248016, 0.29991215467453003, 0.35510337352752686, 0.12601090967655182] | [0.37585005164146423, 0.2928510904312134, 0.1997089982032776, 0.2750307023525238, 0.39854666590690613, 0.42074114084243774, 0.21064919233322144] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0266 | 73.44 | 47000 | 1.1805 | 0.2315 | 0.3668 | 0.2449 | 0.2315 | -1.0 | -1.0 | 0.2086 | 0.3122 | 0.3122 | 0.3122 | -1.0 | -1.0 | [0.2922998070716858, 0.2159629464149475, 0.1230894923210144, 0.2011932134628296, 0.3026053309440613, 0.35820436477661133, 0.12747688591480255] | [0.3772324025630951, 0.29456478357315063, 0.2011498510837555, 0.2769012749195099, 0.4006654918193817, 0.4231053292751312, 0.21204380691051483] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8902 | 74.22 | 47500 | 1.1975 | 0.2334 | 0.3692 | 0.2469 | 0.2334 | -1.0 | -1.0 | 0.2097 | 0.3139 | 0.3139 | 0.3139 | -1.0 | -1.0 | [0.29417213797569275, 0.21815602481365204, 0.12440662086009979, 0.20238643884658813, 0.3047296702861786, 0.36108291149139404, 0.12876363098621368] | [0.37840884923934937, 0.2964656949043274, 0.20244571566581726, 0.27881261706352234, 0.4026099443435669, 0.4252718687057495, 0.21332800388336182] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0085 | 75.0 | 48000 | 1.1693 | 0.2355 | 0.3711 | 0.2500 | 0.2355 | -1.0 | -1.0 | 0.2109 | 0.3157 | 0.3157 | 0.3157 | -1.0 | -1.0 | [0.2964259088039398, 0.2206290364265442, 0.12558795511722565, 0.20491278171539307, 0.3071436583995819, 0.3635878562927246, 0.1304273009300232] | [0.3797307312488556, 0.29837751388549805, 0.2038503885269165, 0.28071293234825134, 0.4047755002975464, 0.42745351791381836, 0.21473214030265808] | [0, 1, 2, 3, 4, 5, 6] |
| 0.9439 | 75.78 | 48500 | 1.1271 | 0.2375 | 0.3738 | 0.2525 | 0.2375 | -1.0 | -1.0 | 0.2120 | 0.3175 | 0.3175 | 0.3175 | -1.0 | -1.0 | [0.2981511652469635, 0.22302350401878357, 0.12688370048999786, 0.2064957469701767, 0.30954036116600037, 0.36666110157966614, 0.13198280334472656] | [0.3811796009540558, 0.30058732628822327, 0.20540301501750946, 0.2825491428375244, 0.40685874223709106, 0.42970946431159973, 0.2161606103181839] | [0, 1, 2, 3, 4, 5, 6] |
| 0.9932 | 76.56 | 49000 | 1.1668 | 0.2395 | 0.3760 | 0.2543 | 0.2395 | -1.0 | -1.0 | 0.2131 | 0.3192 | 0.3192 | 0.3192 | -1.0 | -1.0 | [0.2997910976409912, 0.22443556785583496, 0.12914197146892548, 0.20867681503295898, 0.31173640489578247, 0.36961501836776733, 0.13330094516277313] | [0.382452130317688, 0.3023933172225952, 0.2068178802728653, 0.2844855487346649, 0.40896469354629517, 0.43175914883613586, 0.21751338243484497] | [0, 1, 2, 3, 4, 5, 6] |
| 1.045 | 77.34 | 49500 | 1.1756 | 0.2413 | 0.3781 | 0.2560 | 0.2413 | -1.0 | -1.0 | 0.2142 | 0.3208 | 0.3208 | 0.3208 | -1.0 | -1.0 | [0.30133965611457825, 0.22616147994995117, 0.13028943538665771, 0.21039654314517975, 0.3142479956150055, 0.3722107708454132, 0.13469184935092926] | [0.38375064730644226, 0.3039608299732208, 0.20813080668449402, 0.2864702343940735, 0.4109819531440735, 0.43382585048675537, 0.21866895258426666] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8991 | 78.12 | 50000 | 1.1265 | 0.2433 | 0.3798 | 0.2594 | 0.2433 | -1.0 | -1.0 | 0.2154 | 0.3226 | 0.3226 | 0.3226 | -1.0 | -1.0 | [0.3027842044830322, 0.22833791375160217, 0.13169842958450317, 0.21230222284793854, 0.31705981492996216, 0.37462857365608215, 0.13647869229316711] | [0.38506948947906494, 0.3060303032398224, 0.20944884419441223, 0.28853288292884827, 0.413251131772995, 0.43552064895629883, 0.22004255652427673] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0419 | 78.91 | 50500 | 1.1598 | 0.2453 | 0.3820 | 0.2617 | 0.2453 | -1.0 | -1.0 | 0.2165 | 0.3242 | 0.3242 | 0.3242 | -1.0 | -1.0 | [0.30399709939956665, 0.23175884783267975, 0.13290618360042572, 0.2146383672952652, 0.3194376826286316, 0.3767666518688202, 0.1376808136701584] | [0.3862399160861969, 0.30805879831314087, 0.21072116494178772, 0.2904176115989685, 0.41539400815963745, 0.4374273717403412, 0.22135885059833527] | [0, 1, 2, 3, 4, 5, 6] |
| 0.928 | 79.69 | 51000 | 1.1520 | 0.2468 | 0.3839 | 0.2631 | 0.2468 | -1.0 | -1.0 | 0.2175 | 0.3258 | 0.3258 | 0.3258 | -1.0 | -1.0 | [0.30559828877449036, 0.23321305215358734, 0.1334191858768463, 0.2149236500263214, 0.321972131729126, 0.38008633255958557, 0.13867558538913727] | [0.38757285475730896, 0.309685081243515, 0.21205267310142517, 0.29206526279449463, 0.41759780049324036, 0.4394749701023102, 0.22249439358711243] | [0, 1, 2, 3, 4, 5, 6] |
| 0.9376 | 80.47 | 51500 | 1.1385 | 0.2488 | 0.3863 | 0.2648 | 0.2488 | -1.0 | -1.0 | 0.2185 | 0.3275 | 0.3275 | 0.3275 | -1.0 | -1.0 | [0.3067678213119507, 0.23459766805171967, 0.13415294885635376, 0.2181853950023651, 0.3255745470523834, 0.38228902220726013, 0.14015766978263855] | [0.3887871503829956, 0.3114210069179535, 0.21324457228183746, 0.2940303087234497, 0.4196878969669342, 0.44157907366752625, 0.223750501871109] | [0, 1, 2, 3, 4, 5, 6] |
| 0.9427 | 81.25 | 52000 | 1.1222 | 0.2508 | 0.3877 | 0.2677 | 0.2508 | -1.0 | -1.0 | 0.2196 | 0.3291 | 0.3291 | 0.3291 | -1.0 | -1.0 | [0.30901777744293213, 0.23716850578784943, 0.13516782224178314, 0.21972113847732544, 0.3280062675476074, 0.3846227526664734, 0.14155663549900055] | [0.39011359214782715, 0.3132750689983368, 0.2145436704158783, 0.29575127363204956, 0.421540230512619, 0.4435870945453644, 0.22506624460220337] | [0, 1, 2, 3, 4, 5, 6] |
| 1.0334 | 82.03 | 52500 | 1.1334 | 0.2524 | 0.3896 | 0.2699 | 0.2524 | -1.0 | -1.0 | 0.2206 | 0.3307 | 0.3307 | 0.3307 | -1.0 | -1.0 | [0.31127798557281494, 0.23829974234104156, 0.135979562997818, 0.22239266335964203, 0.32956355810165405, 0.3864337205886841, 0.1428307592868805] | [0.3913862705230713, 0.31477054953575134, 0.2157975733280182, 0.29752182960510254, 0.42348337173461914, 0.445541113615036, 0.22630095481872559] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8902 | 82.81 | 53000 | 1.1508 | 0.2542 | 0.3915 | 0.2720 | 0.2542 | -1.0 | -1.0 | 0.2216 | 0.3323 | 0.3323 | 0.3323 | -1.0 | -1.0 | [0.31267109513282776, 0.23989807069301605, 0.1374475210905075, 0.22427316009998322, 0.33089154958724976, 0.38992753624916077, 0.14410248398780823] | [0.39266136288642883, 0.3162893056869507, 0.21704185009002686, 0.29923614859580994, 0.4253079891204834, 0.44769999384880066, 0.2275276780128479] | [0, 1, 2, 3, 4, 5, 6] |
| 0.9281 | 83.59 | 53500 | 1.1239 | 0.2561 | 0.3934 | 0.2737 | 0.2561 | -1.0 | -1.0 | 0.2226 | 0.3338 | 0.3338 | 0.3338 | -1.0 | -1.0 | [0.31389114260673523, 0.24228313565254211, 0.1386796087026596, 0.22620290517807007, 0.3334823548793793, 0.39271605014801025, 0.14563240110874176] | [0.3938855528831482, 0.31805720925331116, 0.2183230072259903, 0.3009539842605591, 0.42726925015449524, 0.44945546984672546, 0.2288157194852829] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8653 | 84.38 | 54000 | 1.0924 | 0.2578 | 0.3952 | 0.2754 | 0.2578 | -1.0 | -1.0 | 0.2237 | 0.3354 | 0.3354 | 0.3354 | -1.0 | -1.0 | [0.315106600522995, 0.24380050599575043, 0.13995368778705597, 0.22797514498233795, 0.3359430730342865, 0.39488911628723145, 0.14695100486278534] | [0.39509955048561096, 0.31968575716018677, 0.21961098909378052, 0.3027489483356476, 0.42926180362701416, 0.45143863558769226, 0.23008930683135986] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8345 | 85.16 | 54500 | 1.1562 | 0.2594 | 0.3977 | 0.2778 | 0.2594 | -1.0 | -1.0 | 0.2246 | 0.3369 | 0.3369 | 0.3369 | -1.0 | -1.0 | [0.3165804445743561, 0.24580344557762146, 0.14057014882564545, 0.2296101450920105, 0.33843737840652466, 0.39689549803733826, 0.14809773862361908] | [0.3962390720844269, 0.3212288022041321, 0.22074666619300842, 0.3044189214706421, 0.43118008971214294, 0.453309565782547, 0.23127539455890656] | [0, 1, 2, 3, 4, 5, 6] |
| 0.935 | 85.94 | 55000 | 1.0993 | 0.2610 | 0.3991 | 0.2798 | 0.2610 | -1.0 | -1.0 | 0.2256 | 0.3385 | 0.3385 | 0.3385 | -1.0 | -1.0 | [0.31773683428764343, 0.24742969870567322, 0.1413785070180893, 0.23119401931762695, 0.340358704328537, 0.39966636896133423, 0.14942407608032227] | [0.39745262265205383, 0.32293662428855896, 0.22188419103622437, 0.30613085627555847, 0.43311747908592224, 0.45521411299705505, 0.23248779773712158] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8335 | 86.72 | 55500 | 1.0813 | 0.2626 | 0.4006 | 0.2814 | 0.2626 | -1.0 | -1.0 | 0.2266 | 0.3400 | 0.3400 | 0.3400 | -1.0 | -1.0 | [0.3188647925853729, 0.24917103350162506, 0.14266985654830933, 0.23321840167045593, 0.3422880470752716, 0.40139272809028625, 0.15093566477298737] | [0.3986016809940338, 0.32461369037628174, 0.2230844646692276, 0.3078930079936981, 0.43495413661003113, 0.45695778727531433, 0.23369023203849792] | [0, 1, 2, 3, 4, 5, 6] |
| 0.9013 | 87.5 | 56000 | 1.0935 | 0.2643 | 0.4024 | 0.2840 | 0.2643 | -1.0 | -1.0 | 0.2275 | 0.3414 | 0.3414 | 0.3414 | -1.0 | -1.0 | [0.3207457959651947, 0.2506600618362427, 0.14368991553783417, 0.23426273465156555, 0.3447193503379822, 0.40385517477989197, 0.1522432267665863] | [0.399725079536438, 0.32615259289741516, 0.22432667016983032, 0.3094167113304138, 0.43683138489723206, 0.4587588608264923, 0.23490555584430695] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8493 | 88.28 | 56500 | 1.1127 | 0.2662 | 0.4037 | 0.2856 | 0.2662 | -1.0 | -1.0 | 0.2285 | 0.3430 | 0.3430 | 0.3430 | -1.0 | -1.0 | [0.3219901919364929, 0.2527844309806824, 0.1446000337600708, 0.23651152849197388, 0.34774136543273926, 0.40631386637687683, 0.15377303957939148] | [0.4008473753929138, 0.3279699683189392, 0.22549724578857422, 0.311124712228775, 0.43878045678138733, 0.4605865478515625, 0.2361944168806076] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8236 | 89.06 | 57000 | 1.0746 | 0.2684 | 0.4058 | 0.2881 | 0.2684 | -1.0 | -1.0 | 0.2295 | 0.3445 | 0.3445 | 0.3445 | -1.0 | -1.0 | [0.32365232706069946, 0.25473299622535706, 0.14600147306919098, 0.23888789117336273, 0.35029536485671997, 0.4098081886768341, 0.15511035919189453] | [0.4018661379814148, 0.3295002579689026, 0.22673267126083374, 0.31283313035964966, 0.4406152367591858, 0.4624691903591156, 0.2374046891927719] | [0, 1, 2, 3, 4, 5, 6] |
| 0.7476 | 89.84 | 57500 | 1.0491 | 0.2701 | 0.4077 | 0.2904 | 0.2701 | -1.0 | -1.0 | 0.2305 | 0.3461 | 0.3461 | 0.3461 | -1.0 | -1.0 | [0.32512611150741577, 0.25677385926246643, 0.14725041389465332, 0.24099712073802948, 0.35212111473083496, 0.4122871160507202, 0.15622466802597046] | [0.4030149281024933, 0.3311725854873657, 0.22796671092510223, 0.31457799673080444, 0.44262856245040894, 0.4643837511539459, 0.2386176884174347] | [0, 1, 2, 3, 4, 5, 6] |
| 0.845 | 90.62 | 58000 | 1.0391 | 0.2717 | 0.4101 | 0.2921 | 0.2717 | -1.0 | -1.0 | 0.2314 | 0.3475 | 0.3475 | 0.3475 | -1.0 | -1.0 | [0.3263866603374481, 0.2585732936859131, 0.1484149843454361, 0.2425975203514099, 0.3543750047683716, 0.4138467609882355, 0.15736864507198334] | [0.40416887402534485, 0.33269593119621277, 0.22913536429405212, 0.31621822714805603, 0.444607138633728, 0.46590909361839294, 0.23980452120304108] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8801 | 91.41 | 58500 | 1.0362 | 0.2731 | 0.4113 | 0.2935 | 0.2731 | -1.0 | -1.0 | 0.2324 | 0.3489 | 0.3489 | 0.3489 | -1.0 | -1.0 | [0.32764753699302673, 0.260421484708786, 0.1494179368019104, 0.24401891231536865, 0.3558345139026642, 0.41541850566864014, 0.15869934856891632] | [0.40527504682540894, 0.3343641459941864, 0.23031367361545563, 0.31765297055244446, 0.44645434617996216, 0.4674578011035919, 0.24107326567173004] | [0, 1, 2, 3, 4, 5, 6] |
| 0.8057 | 92.19 | 59000 | 1.0802 | 0.2747 | 0.4131 | 0.2953 | 0.2747 | -1.0 | -1.0 | 0.2333 | 0.3504 | 0.3504 | 0.3504 | -1.0 | -1.0 | [0.3290484547615051, 0.261924684047699, 0.15058311820030212, 0.2450467050075531, 0.3583703339099884, 0.41783881187438965, 0.15978896617889404] |
[0.4063403904438019, 0.335880845785141, 0.23150277137756348, 0.3192393481731415, 0.44818899035453796, 0.4691273272037506, 0.2423076331615448] | [0, 1, 2, 3, 4, 5, 6] | | 0.8364 | 92.97 | 59500 | 1.0644 | 0.2764 | 0.4142 | 0.2979 | 0.2764 | -1.0 | -1.0 | 0.2342 | 0.3518 | 0.3518 | 0.3518 | -1.0 | -1.0 | [0.33064794540405273, 0.26310494542121887, 0.15160790085792542, 0.24726790189743042, 0.3609471619129181, 0.42027997970581055, 0.16086260974407196] | [0.4074283838272095, 0.33729055523872375, 0.2326926738023758, 0.32071182131767273, 0.4500134289264679, 0.4709007441997528, 0.24341225624084473] | [0, 1, 2, 3, 4, 5, 6] | | 0.8684 | 93.75 | 60000 | 0.9989 | 0.2781 | 0.4156 | 0.2997 | 0.2781 | -1.0 | -1.0 | 0.2351 | 0.3532 | 0.3532 | 0.3532 | -1.0 | -1.0 | [0.33202818036079407, 0.26524290442466736, 0.152442067861557, 0.24876125156879425, 0.36294373869895935, 0.4227156341075897, 0.16251923143863678] | [0.4085730314254761, 0.3388686776161194, 0.23382288217544556, 0.3220645785331726, 0.45181506872177124, 0.47255510091781616, 0.24458375573158264] | [0, 1, 2, 3, 4, 5, 6] | | 0.8243 | 94.53 | 60500 | 1.0308 | 0.2798 | 0.4167 | 0.3017 | 0.2798 | -1.0 | -1.0 | 0.2361 | 0.3546 | 0.3546 | 0.3546 | -1.0 | -1.0 | [0.33315059542655945, 0.2673637270927429, 0.15366412699222565, 0.25071096420288086, 0.36503541469573975, 0.4245595633983612, 0.16381298005580902] | [0.4096628725528717, 0.34035563468933105, 0.2349071204662323, 0.3238125145435333, 0.45360955595970154, 0.474386990070343, 0.24572500586509705] | [0, 1, 2, 3, 4, 5, 6] | | 0.7726 | 95.31 | 61000 | 1.0308 | 0.2811 | 0.4188 | 0.3032 | 0.2811 | -1.0 | -1.0 | 0.2370 | 0.3561 | 0.3561 | 0.3561 | -1.0 | -1.0 | [0.334644615650177, 0.2687603533267975, 0.15480013191699982, 0.25236111879348755, 0.36578890681266785, 0.42645925283432007, 0.1650378257036209] | [0.41087964177131653, 0.3419274687767029, 0.23602905869483948, 0.32540416717529297, 0.45544201135635376, 0.4760398268699646, 0.24688491225242615] | [0, 1, 2, 3, 4, 5, 6] | | 0.8224 | 
96.09 | 61500 | 1.0665 | 0.2825 | 0.4205 | 0.3046 | 0.2825 | -1.0 | -1.0 | 0.2379 | 0.3575 | 0.3575 | 0.3575 | -1.0 | -1.0 | [0.3356887400150299, 0.270687073469162, 0.15587256848812103, 0.2533589005470276, 0.3679884076118469, 0.427993506193161, 0.16625407338142395] | [0.41197147965431213, 0.34355753660202026, 0.23712871968746185, 0.32689398527145386, 0.4571593105792999, 0.47775986790657043, 0.24796830117702484] | [0, 1, 2, 3, 4, 5, 6] | | 0.782 | 96.88 | 62000 | 1.0066 | 0.2845 | 0.4227 | 0.3077 | 0.2845 | -1.0 | -1.0 | 0.2388 | 0.3589 | 0.3589 | 0.3589 | -1.0 | -1.0 | [0.3373223841190338, 0.27250176668167114, 0.15714958310127258, 0.25480565428733826, 0.37089982628822327, 0.43106621503829956, 0.1675104945898056] | [0.41309627890586853, 0.3448973596096039, 0.2383091151714325, 0.32838764786720276, 0.4589262008666992, 0.4794788062572479, 0.24908243119716644] | [0, 1, 2, 3, 4, 5, 6] | | 0.7531 | 97.66 | 62500 | 1.0201 | 0.2861 | 0.4239 | 0.3092 | 0.2861 | -1.0 | -1.0 | 0.2397 | 0.3603 | 0.3603 | 0.3603 | -1.0 | -1.0 | [0.3393549919128418, 0.2739318013191223, 0.1581399291753769, 0.2567703127861023, 0.3725869655609131, 0.4331088066101074, 0.16851994395256042] | [0.41427335143089294, 0.34615272283554077, 0.23941782116889954, 0.33001798391342163, 0.46086210012435913, 0.4812760353088379, 0.2502192556858063] | [0, 1, 2, 3, 4, 5, 6] | | 0.7591 | 98.44 | 63000 | 1.0092 | 0.2879 | 0.4249 | 0.3107 | 0.2879 | -1.0 | -1.0 | 0.2405 | 0.3617 | 0.3617 | 0.3617 | -1.0 | -1.0 | [0.34043723344802856, 0.2754877507686615, 0.1592513918876648, 0.2589435875415802, 0.3758857548236847, 0.43556466698646545, 0.169664204120636] | [0.41535133123397827, 0.3475950062274933, 0.24051809310913086, 0.33159223198890686, 0.46269115805625916, 0.48302504420280457, 0.2513315975666046] | [0, 1, 2, 3, 4, 5, 6] | | 0.8396 | 99.22 | 63500 | 1.0428 | 0.2892 | 0.4259 | 0.3124 | 0.2892 | -1.0 | -1.0 | 0.2414 | 0.3630 | 0.3630 | 0.3630 | -1.0 | -1.0 | [0.3414204716682434, 0.2765406370162964, 0.15995648503303528, 
0.2607283294200897, 0.37773287296295166, 0.43706876039505005, 0.17061668634414673] | [0.4164183735847473, 0.3485850691795349, 0.24155816435813904, 0.33308175206184387, 0.46446266770362854, 0.4846424162387848, 0.25240010023117065] | [0, 1, 2, 3, 4, 5, 6] | | 0.7759 | 100.0 | 64000 | 1.0141 | 0.2906 | 0.4268 | 0.3141 | 0.2906 | -1.0 | -1.0 | 0.2422 | 0.3643 | 0.3643 | 0.3643 | -1.0 | -1.0 | [0.34276220202445984, 0.27865204215049744, 0.1607096940279007, 0.26159897446632385, 0.3790518343448639, 0.43935513496398926, 0.17204682528972626] | [0.4175185561180115, 0.34998106956481934, 0.24253687262535095, 0.3344777226448059, 0.4658925533294678, 0.48632490634918213, 0.25357064604759216] | [0, 1, 2, 3, 4, 5, 6] | ### Framework versions - Transformers 4.33.0.dev0 - Pytorch 2.0.1+cu117 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "metals_and_plastic", "other", "non_recyclable", "glass", "paper", "bio", "unknown" ]
deepdoctection/tatr_tab_struct_v2
Microsoft Table Transformer Table Structure Recognition trained on Pubtables and Fintabnet

If you do not have the deepdoctection profile of the model, please add:

```python
import deepdoctection as dd

dd.ModelCatalog.register("deepdoctection/tatr_tab_struct_v2/pytorch_model.bin", dd.ModelProfile(
    name="deepdoctection/tatr_tab_struct_v2/pytorch_model.bin",
    description="Table Transformer (DETR) model trained on PubTables1M. It was introduced in the paper "
                "Aligning benchmark datasets for table structure recognition by Smock et al. "
                "This model is devoted to table structure recognition and assumes it receives a slightly "
                "cropped table as input. It will predict rows, columns and spanning cells. Use a padding "
                "of around 5 pixels.",
    size=[115511753],
    tp_model=False,
    config="deepdoctection/tatr_tab_struct_v2/config.json",
    preprocessor_config="deepdoctection/tatr_tab_struct_v2/preprocessor_config.json",
    hf_repo_id="deepdoctection/tatr_tab_struct_v2",
    hf_model_name="pytorch_model.bin",
    hf_config_file=["config.json", "preprocessor_config.json"],
    categories={
        "1": dd.LayoutType.table,
        "2": dd.LayoutType.column,
        "3": dd.LayoutType.row,
        "4": dd.CellType.column_header,
        "5": dd.CellType.projected_row_header,
        "6": dd.CellType.spanning,
    },
    dl_library="PT",
    model_wrapper="HFDetrDerivedDetector",
))
```

When running the model within the deepdoctection analyzer, adjust the segmentation parameters in order to get better predictions:

```python
import deepdoctection as dd

analyzer = dd.get_dd_analyzer(
    reset_config_file=True,
    config_overwrite=[
        "PT.ITEM.WEIGHTS=deepdoctection/tatr_tab_struct_v2/pytorch_model.bin",
        "PT.ITEM.FILTER=['table']",
        "PT.ITEM.PAD.TOP=5",
        "PT.ITEM.PAD.RIGHT=5",
        "PT.ITEM.PAD.BOTTOM=5",
        "PT.ITEM.PAD.LEFT=5",
        "SEGMENTATION.THRESHOLD_ROWS=0.9",
        "SEGMENTATION.THRESHOLD_COLS=0.9",
        "SEGMENTATION.REMOVE_IOU_THRESHOLD_ROWS=0.3",
        "SEGMENTATION.REMOVE_IOU_THRESHOLD_COLS=0.3",
        "WORD_MATCHING.MAX_PARENT_ONLY=True",
    ],
)
```
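The `SEGMENTATION.REMOVE_IOU_THRESHOLD_ROWS/COLS` settings above drop duplicate row or column detections that overlap a higher-scoring one too strongly. A minimal pure-Python sketch of that idea (illustrative only, not deepdoctection's actual implementation):

```python
def iou(a, b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def remove_overlapping(detections, iou_threshold=0.3):
    """Keep highest-scoring boxes; drop any box whose IoU with a kept box exceeds the threshold."""
    kept = []
    for box, score in sorted(detections, key=lambda d: -d[1]):
        if all(iou(box, k) <= iou_threshold for k, _ in kept):
            kept.append((box, score))
    return [b for b, _ in kept]
```

With a threshold of 0.3, two nearly coincident row boxes collapse to the single higher-scoring one, while well-separated rows are all kept.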
[ "table", "table column", "table row", "table column header", "table projected row header", "table spanning cell" ]
shubhamWi91/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the coco_hf dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 12 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.32.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "pet_bottle", "hm_ldpe", "pp_w", "ldpe_wrapper", "hdpe_bottle", "paper", "pp", "aluminium_foil", "multilayer_plastic", "ps", "cardboard", "blister_pack", "aluminium_can", "tetrapack", "others" ]
chanelcolgate/yolos_finetuned_yenthienviet
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # yolos_finetuned_yenthienviet This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small) on the yenthienviet dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.0 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "hop_dln", "hop_jn", "hop_vtg", "hop_ytv", "lo_kids", "lo_ytv", "loc_dln", "loc_jn", "loc_kids", "loc_ytv" ]
JaHyeok/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results ### Framework versions - Transformers 4.33.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
t1m0/detr-resnet-50_finetuned_cppe5_t1m0
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5_t1m0 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.32.1 - Pytorch 2.0.1 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
shubhamWi91/detr-resnet-50_finetuned_wi
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_wi This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the coco_hf dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 12 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 30 ### Training results ### Framework versions - Transformers 4.32.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "pet_bottle", "hm_ldpe", "pp_w", "ldpe_wrapper", "hdpe_bottle", "paper", "pp", "aluminium_foil", "multilayer_plastic", "ps", "cardboard", "blister_pack", "aluminium_can", "tetrapack", "others" ]
mmenendezg/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.33.1 - Pytorch 2.0.1 - Datasets 2.14.5 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
techtank/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
ismailmo1/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
TuningAI/DETR-BASE_Marine
# DETR-BASE_Marine ## Overview + Model Name: DETR-BASE_Marine + Model Architecture: DETR (End-to-End Object Detection) with ResNet-50 backbone. + Model Type: Object Detection + Framework: PyTorch + Dataset: Aerial Maritime Image Dataset + License: MIT License (for the dataset) ## Model Description The DETR-BASE_Marine Aerial Maritime Detector is a deep learning model based on the DETR architecture with a ResNet-50 backbone. It has been fine-tuned on the "Aerial Maritime Image Dataset," which comprises 74 aerial photographs captured via a Mavic Air 2 drone. The model is designed for object detection tasks in maritime environments and can identify and locate various objects such as docks, boats, lifts, jetskis, and cars in aerial images. ## Key Features: + Multi-class object detection. + Object classes: Docks, Boats, Lifts, Jetskis, Cars. + Robust performance in aerial and maritime scenarios. ## Use Cases + **Boat Counting**: Count the number of boats on water bodies, such as lakes, using drone imagery. + **Boat Lift Detection**: Identify the presence of boat lifts on the waterfront via aerial surveillance. + **Car Detection**: Detect and locate cars within maritime regions using UAV drones. + **Habitability Assessment**: Determine the level of inhabitation around lakes and water bodies based on detected objects. + **Property Monitoring**: Identify if visitors or activities are present at lake houses or properties using drone surveillance. + **Proof of Concept**: Showcase the potential of UAV imagery for maritime projects and object detection tasks. ## Dataset + **Dataset Name**: Aerial Maritime Image Dataset + **Number of Images**: 74 + **Number of Bounding Boxes**: 1,151 + **Collection Method**: Captured via Mavic Air 2 drone at 400 ft altitude. 
## Usage

```python
from transformers import DetrImageProcessor, DetrForObjectDetection
import torch
from PIL import Image

img_path = ""
image = Image.open(img_path)

processor = DetrImageProcessor.from_pretrained("TuningAI/DETR-BASE_Marine")
model = DetrForObjectDetection.from_pretrained("TuningAI/DETR-BASE_Marine")

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)

# convert outputs (bounding boxes and class logits) to COCO API
# let's only keep detections with score > 0.9
target_sizes = torch.tensor([image.size[::-1]])
results = processor.post_process_object_detection(outputs, target_sizes=target_sizes, threshold=0.9)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    box = [round(i, 2) for i in box.tolist()]
    print(
        f"Detected {model.config.id2label[label.item()]} with confidence "
        f"{round(score.item(), 3)} at location {box}"
    )
```

## License

This model is provided under the MIT License. The Aerial Maritime Image Dataset used for fine-tuning is also under the MIT License.
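The `post_process_object_detection` call in the usage example converts per-query class logits and normalized center-format boxes into thresholded, absolute-pixel boxes. A simplified pure-Python sketch of that step (illustrative only; the real implementation operates on torch tensors):

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def post_process(logits, boxes_cxcywh, width, height, threshold=0.9):
    """Turn DETR-style outputs into thresholded absolute (x1, y1, x2, y2) boxes.

    logits: per-query class scores, with the last index being the "no object" class.
    boxes_cxcywh: per-query normalized (cx, cy, w, h) boxes.
    """
    results = []
    for lg, (cx, cy, w, h) in zip(logits, boxes_cxcywh):
        probs = softmax(lg)[:-1]          # drop the "no object" class
        score = max(probs)
        label = probs.index(score)
        if score < threshold:
            continue
        box = [
            (cx - w / 2) * width,  (cy - h / 2) * height,
            (cx + w / 2) * width,  (cy + h / 2) * height,
        ]
        results.append({"score": score, "label": label, "box": box})
    return results
```

Queries whose best real-class probability falls below the threshold are discarded, which is why lowering `threshold` surfaces more (and noisier) detections.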
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5" ]
shubhamWi91/train83
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # train83 This model is a fine-tuned version of [jozhang97/deta-swin-large-o365](https://huggingface.co/jozhang97/deta-swin-large-o365) on the dataloader_hf dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 20 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.32.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "pet_bottle", "hm_ldpe", "pp_w", "ldpe_wrapper", "hdpe_bottle", "paper", "pp", "aluminium_foil", "multilayer_plastic", "ps", "cardboard", "blister_pack", "aluminium_can", "tetrapack", "others" ]
shubhamWi91/train84
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # train84 This model is a fine-tuned version of [jozhang97/deta-swin-large-o365](https://huggingface.co/jozhang97/deta-swin-large-o365) on the dataloader_hf dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 20 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.32.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "pet_bottle", "hm_ldpe", "pp_w", "ldpe_wrapper", "hdpe_bottle", "paper", "pp", "aluminium_foil", "multilayer_plastic", "ps", "cardboard", "blister_pack", "aluminium_can", "tetrapack", "others" ]
NajlaSS/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
hefeng0/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.33.0 - Pytorch 1.13.1+cu117 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
intelli-zen/detr_cppe5_object_detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr_cppe5_object_detection This model is a fine-tuned version of [qgyd2021/detr_cppe5_object_detection](https://huggingface.co/qgyd2021/detr_cppe5_object_detection) on the cppe5 dataset. It achieves the following results on the evaluation set: - Loss: 1.0644 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 2 - total_train_batch_size: 16 - total_eval_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 200 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.8107 | 3.17 | 200 | 1.0516 | | 0.8031 | 6.35 | 400 | 1.1292 | | 0.7474 | 9.52 | 600 | 1.1179 | | 0.7315 | 12.7 | 800 | 1.0198 | | 0.7605 | 15.87 | 1000 | 1.0427 | | 0.7611 | 19.05 | 1200 | 1.0867 | | 0.7377 | 22.22 | 1400 | 1.1264 | | 0.7303 | 25.4 | 1600 | 1.1137 | | 0.6692 | 28.57 | 1800 | 1.0644 | ### Framework versions - Transformers 4.33.0 - Pytorch 2.0.0 - Datasets 2.1.0 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
ice025/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.33.3 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
cd-daniel/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.33.3 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
cour4ge/detr-resnet-50_finetuned_cppe51
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe51 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.33.3 - Pytorch 2.0.1+cu117 - Datasets 2.14.5 - Tokenizers 0.13.3
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
guyda/yolov8-ships-2308
### Supported Labels ``` ['person', 'helmet', 'head'] ```
[ "person", "helmet", "head" ]
sanali209/DT_face_head_char
# DT_face_head_char generated from custom dataset Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb). Report any issues with the demo at the [github repo](https://github.com/nateraw/huggingpics).
[ "face", "character", "head" ]
huggingEars/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.34.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.14.0
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
ahirtonlopes/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results ### Framework versions - Transformers 4.34.1 - Pytorch 2.1.0+cu118 - Datasets 2.14.5 - Tokenizers 0.14.1
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
ckandemir/detr-resnet-50_finetuned_cppe5
# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
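`train_batch_size: 8` and `num_epochs: 10` fix the optimizer-step count once the train-split size is known. A sketch of that arithmetic, with a 1000-image train split as a purely illustrative assumption (the cards do not state the split size), and assuming a single device with no gradient accumulation:

```python
import math

def total_training_steps(num_examples, batch_size, num_epochs):
    """Optimizer steps for one device with no gradient accumulation
    (both assumptions of ours; the card does not say)."""
    steps_per_epoch = math.ceil(num_examples / batch_size)
    return steps_per_epoch * num_epochs

# Illustrative only: 1000 train images is an assumed dataset size.
print(total_training_steps(1000, batch_size=8, num_epochs=10))  # 1250
```

This total is also what a linear scheduler needs as its `total_steps`, so the two hyperparameters are coupled in practice.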
ckandemir/detr-resnet-50_finetuned_furniture
# detr-resnet-50_finetuned_furniture

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]
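Every card above lists `Adam with betas=(0.9,0.999) and epsilon=1e-08`. A minimal scalar sketch of the bias-corrected Adam update those constants parameterize (plain Adam, no weight decay assumed, since none is listed):

```python
def adam_step(param, grad, m, v, t, lr=1e-05,
              beta1=0.9, beta2=0.999, eps=1e-08):
    """One bias-corrected Adam update for a scalar parameter.
    Plain Adam (no weight decay), matching the constants in the cards."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction,
    v_hat = v / (1 - beta2 ** t)                # t is the 1-based step
    param = param - lr * m_hat / (v_hat ** 0.5 + eps)
    return param, m, v

p, m, v = adam_step(1.0, grad=0.5, m=0.0, v=0.0, t=1)
print(p)  # on step 1 the parameter moves by ~lr regardless of grad scale
```

The first-step behavior illustrates why Adam is often described as taking steps of roughly `lr` in magnitude early in training: bias correction makes `m_hat / sqrt(v_hat)` close to the sign of the gradient.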
distill-io/detr-v9
# detr-9

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9548

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1000

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:------:|:---------------:|
| No log | 0.68 | 100 | 3.2412 |
| 3.9658 | 1.36 | 200 | 3.0423 |
| 3.063 | 2.04 | 300 | 2.8348 |
| 3.063 | 2.72 | 400 | 2.8693 |
| 2.8954 | 3.4 | 500 | 2.6293 |
| 2.7743 | 4.08 | 600 | 2.6178 |
| 2.7743 | 4.76 | 700 | 2.5513 |
| 2.6323 | 5.44 | 800 | 2.5795 |
| 2.6396 | 6.12 | 900 | 2.3751 |
| 2.6396 | 6.8 | 1000 | 2.3357 |
| 2.4932 | 7.48 | 1100 | 2.3184 |
| 2.4299 | 8.16 | 1200 | 2.2754 |
| 2.4299 | 8.84 | 1300 | 2.2419 |
| 2.3508 | 9.52 | 1400 | 2.1568 |
| 2.2593 | 10.2 | 1500 | 2.1253 |
| 2.2593 | 10.88 | 1600 | 2.1364 |
| 2.2376 | 11.56 | 1700 | 2.1320 |
| 2.1749 | 12.24 | 1800 | 2.0464 |
| 2.1749 | 12.93 | 1900 | 2.0201 |
| 2.0878 | 13.61 | 2000 | 2.1603 |
| 2.0701 | 14.29 | 2100 | 1.9910 |
| 2.0701 | 14.97 | 2200 | 1.9665 |
| 2.007 | 15.65 | 2300 | 1.8944 |
| 1.9432 | 16.33 | 2400 | 1.8717 |
| 1.9552 | 17.01 | 2500 | 1.9168 |
| 1.9552 | 17.69 | 2600 | 1.8580 |
| 1.905 | 18.37 | 2700 | 1.8306 |
| 1.8821 | 19.05 | 2800 | 1.8386 |
| 1.8821 | 19.73 | 2900 | 1.8215 |
| 1.8569 | 20.41 | 3000 | 1.7825 |
| 
1.8083 | 21.09 | 3100 | 1.7335 | | 1.8083 | 21.77 | 3200 | 1.7117 | | 1.7617 | 22.45 | 3300 | 1.7170 | | 1.7304 | 23.13 | 3400 | 1.7235 | | 1.7304 | 23.81 | 3500 | 1.6907 | | 1.7165 | 24.49 | 3600 | 1.6281 | | 1.6793 | 25.17 | 3700 | 1.5950 | | 1.6793 | 25.85 | 3800 | 1.5856 | | 1.656 | 26.53 | 3900 | 1.6097 | | 1.6229 | 27.21 | 4000 | 1.5895 | | 1.6229 | 27.89 | 4100 | 1.6217 | | 1.6292 | 28.57 | 4200 | 1.5542 | | 1.5697 | 29.25 | 4300 | 1.6426 | | 1.5697 | 29.93 | 4400 | 1.6292 | | 1.6189 | 30.61 | 4500 | 1.5757 | | 1.573 | 31.29 | 4600 | 1.5476 | | 1.573 | 31.97 | 4700 | 1.5437 | | 1.5273 | 32.65 | 4800 | 1.5518 | | 1.5054 | 33.33 | 4900 | 1.4695 | | 1.4757 | 34.01 | 5000 | 1.5011 | | 1.4757 | 34.69 | 5100 | 1.4744 | | 1.4538 | 35.37 | 5200 | 1.4481 | | 1.4319 | 36.05 | 5300 | 1.4195 | | 1.4319 | 36.73 | 5400 | 1.5335 | | 1.3979 | 37.41 | 5500 | 1.3619 | | 1.3775 | 38.1 | 5600 | 1.4305 | | 1.3775 | 38.78 | 5700 | 1.3961 | | 1.3914 | 39.46 | 5800 | 1.3965 | | 1.3373 | 40.14 | 5900 | 1.3740 | | 1.3373 | 40.82 | 6000 | 1.4195 | | 1.3315 | 41.5 | 6100 | 1.4008 | | 1.3288 | 42.18 | 6200 | 1.3568 | | 1.3288 | 42.86 | 6300 | 1.3631 | | 1.2927 | 43.54 | 6400 | 1.3045 | | 1.2694 | 44.22 | 6500 | 1.3606 | | 1.2694 | 44.9 | 6600 | 1.3045 | | 1.2487 | 45.58 | 6700 | 1.3688 | | 1.2544 | 46.26 | 6800 | 1.2831 | | 1.2544 | 46.94 | 6900 | 1.2673 | | 1.2359 | 47.62 | 7000 | 1.2622 | | 1.2207 | 48.3 | 7100 | 1.2208 | | 1.2207 | 48.98 | 7200 | 1.2399 | | 1.2011 | 49.66 | 7300 | 1.2899 | | 1.1915 | 50.34 | 7400 | 1.2599 | | 1.1742 | 51.02 | 7500 | 1.1938 | | 1.1742 | 51.7 | 7600 | 1.2566 | | 1.1587 | 52.38 | 7700 | 1.1858 | | 1.1431 | 53.06 | 7800 | 1.2510 | | 1.1431 | 53.74 | 7900 | 1.1766 | | 1.1473 | 54.42 | 8000 | 1.1920 | | 1.1406 | 55.1 | 8100 | 1.3181 | | 1.1406 | 55.78 | 8200 | 1.2287 | | 1.1288 | 56.46 | 8300 | 1.1828 | | 1.1058 | 57.14 | 8400 | 1.3380 | | 1.1058 | 57.82 | 8500 | 1.3039 | | 1.1011 | 58.5 | 8600 | 1.1949 | | 1.0685 | 59.18 | 8700 | 1.1771 | | 1.0685 | 59.86 
| 8800 | 1.1391 | | 1.077 | 60.54 | 8900 | 1.1271 | | 1.0787 | 61.22 | 9000 | 1.1005 | | 1.0787 | 61.9 | 9100 | 1.1096 | | 1.0493 | 62.59 | 9200 | 1.1689 | | 1.0428 | 63.27 | 9300 | 1.1353 | | 1.0428 | 63.95 | 9400 | 1.1348 | | 1.1068 | 64.63 | 9500 | 1.1882 | | 1.0131 | 65.31 | 9600 | 1.2055 | | 1.0131 | 65.99 | 9700 | 1.0887 | | 1.0127 | 66.67 | 9800 | 1.1398 | | 1.0163 | 67.35 | 9900 | 1.0899 | | 1.0039 | 68.03 | 10000 | 1.0990 | | 1.0039 | 68.71 | 10100 | 1.1135 | | 1.0104 | 69.39 | 10200 | 1.1319 | | 1.0014 | 70.07 | 10300 | 1.1386 | | 1.0014 | 70.75 | 10400 | 1.1442 | | 0.9976 | 71.43 | 10500 | 1.2050 | | 0.9616 | 72.11 | 10600 | 1.0659 | | 0.9616 | 72.79 | 10700 | 1.1428 | | 0.9801 | 73.47 | 10800 | 1.1244 | | 0.9548 | 74.15 | 10900 | 1.1127 | | 0.9548 | 74.83 | 11000 | 1.1491 | | 0.9669 | 75.51 | 11100 | 1.0919 | | 0.9556 | 76.19 | 11200 | 1.1382 | | 0.9556 | 76.87 | 11300 | 1.1156 | | 0.919 | 77.55 | 11400 | 1.0326 | | 0.9121 | 78.23 | 11500 | 1.1168 | | 0.9121 | 78.91 | 11600 | 1.1301 | | 0.9038 | 79.59 | 11700 | 1.1149 | | 0.8933 | 80.27 | 11800 | 1.0959 | | 0.8933 | 80.95 | 11900 | 1.1232 | | 0.8999 | 81.63 | 12000 | 1.0805 | | 0.8931 | 82.31 | 12100 | 1.1335 | | 0.8931 | 82.99 | 12200 | 1.1315 | | 0.8815 | 83.67 | 12300 | 1.0665 | | 0.8694 | 84.35 | 12400 | 1.0750 | | 0.8793 | 85.03 | 12500 | 1.0751 | | 0.8793 | 85.71 | 12600 | 1.0839 | | 0.9073 | 86.39 | 12700 | 1.1007 | | 0.8811 | 87.07 | 12800 | 1.0817 | | 0.8811 | 87.76 | 12900 | 1.0797 | | 0.8407 | 88.44 | 13000 | 1.1029 | | 0.8772 | 89.12 | 13100 | 1.0542 | | 0.8772 | 89.8 | 13200 | 1.0271 | | 0.8447 | 90.48 | 13300 | 1.0275 | | 0.8392 | 91.16 | 13400 | 0.9989 | | 0.8392 | 91.84 | 13500 | 1.0119 | | 0.8329 | 92.52 | 13600 | 1.0015 | | 0.8392 | 93.2 | 13700 | 1.0249 | | 0.8392 | 93.88 | 13800 | 1.0294 | | 0.8175 | 94.56 | 13900 | 1.0980 | | 0.8401 | 95.24 | 14000 | 1.0724 | | 0.8401 | 95.92 | 14100 | 1.0085 | | 0.8262 | 96.6 | 14200 | 1.0564 | | 0.8007 | 97.28 | 14300 | 1.0666 | | 0.8007 | 97.96 | 
14400 | 1.0119 | | 0.8013 | 98.64 | 14500 | 1.1449 | | 0.7966 | 99.32 | 14600 | 1.0698 | | 0.7966 | 100.0 | 14700 | 1.0514 | | 0.7963 | 100.68 | 14800 | 0.9480 | | 0.7939 | 101.36 | 14900 | 0.9131 | | 0.7782 | 102.04 | 15000 | 0.9641 | | 0.7782 | 102.72 | 15100 | 0.9714 | | 0.7767 | 103.4 | 15200 | 1.0656 | | 0.7762 | 104.08 | 15300 | 1.0194 | | 0.7762 | 104.76 | 15400 | 1.0062 | | 0.7929 | 105.44 | 15500 | 1.0862 | | 0.7757 | 106.12 | 15600 | 1.0567 | | 0.7757 | 106.8 | 15700 | 0.9659 | | 0.7799 | 107.48 | 15800 | 0.9637 | | 0.7736 | 108.16 | 15900 | 0.9711 | | 0.7736 | 108.84 | 16000 | 1.0166 | | 0.7483 | 109.52 | 16100 | 1.0213 | | 0.7381 | 110.2 | 16200 | 0.9550 | | 0.7381 | 110.88 | 16300 | 0.9763 | | 0.7287 | 111.56 | 16400 | 0.9390 | | 0.7327 | 112.24 | 16500 | 1.0193 | | 0.7327 | 112.93 | 16600 | 0.9088 | | 0.7377 | 113.61 | 16700 | 0.9728 | | 0.7109 | 114.29 | 16800 | 1.0400 | | 0.7109 | 114.97 | 16900 | 1.0058 | | 0.717 | 115.65 | 17000 | 0.9745 | | 0.7187 | 116.33 | 17100 | 1.0387 | | 0.7097 | 117.01 | 17200 | 0.9599 | | 0.7097 | 117.69 | 17300 | 1.0639 | | 0.7072 | 118.37 | 17400 | 1.0272 | | 0.7124 | 119.05 | 17500 | 0.9891 | | 0.7124 | 119.73 | 17600 | 0.9851 | | 0.6856 | 120.41 | 17700 | 0.9980 | | 0.6781 | 121.09 | 17800 | 1.0234 | | 0.6781 | 121.77 | 17900 | 1.0307 | | 0.6827 | 122.45 | 18000 | 0.9978 | | 0.6793 | 123.13 | 18100 | 0.9692 | | 0.6793 | 123.81 | 18200 | 0.9417 | | 0.6867 | 124.49 | 18300 | 0.9869 | | 0.6744 | 125.17 | 18400 | 0.9923 | | 0.6744 | 125.85 | 18500 | 0.9756 | | 0.6593 | 126.53 | 18600 | 0.9938 | | 0.6488 | 127.21 | 18700 | 0.9382 | | 0.6488 | 127.89 | 18800 | 0.9534 | | 0.644 | 128.57 | 18900 | 0.9072 | | 0.6725 | 129.25 | 19000 | 1.0747 | | 0.6725 | 129.93 | 19100 | 0.9569 | | 0.656 | 130.61 | 19200 | 0.9673 | | 0.6653 | 131.29 | 19300 | 0.9582 | | 0.6653 | 131.97 | 19400 | 0.9470 | | 0.6719 | 132.65 | 19500 | 0.9331 | | 0.6665 | 133.33 | 19600 | 0.9860 | | 0.6533 | 134.01 | 19700 | 1.0467 | | 0.6533 | 134.69 | 19800 | 
1.0140 | | 0.6489 | 135.37 | 19900 | 0.9366 | | 0.6546 | 136.05 | 20000 | 0.9923 | | 0.6546 | 136.73 | 20100 | 1.1226 | | 0.6501 | 137.41 | 20200 | 0.9184 | | 0.6487 | 138.1 | 20300 | 1.0354 | | 0.6487 | 138.78 | 20400 | 1.0149 | | 0.6462 | 139.46 | 20500 | 0.9540 | | 0.6413 | 140.14 | 20600 | 1.0019 | | 0.6413 | 140.82 | 20700 | 0.9481 | | 0.6563 | 141.5 | 20800 | 0.9663 | | 0.6485 | 142.18 | 20900 | 0.9496 | | 0.6485 | 142.86 | 21000 | 0.9743 | | 0.6489 | 143.54 | 21100 | 1.0144 | | 0.6493 | 144.22 | 21200 | 0.9667 | | 0.6493 | 144.9 | 21300 | 0.9665 | | 0.6385 | 145.58 | 21400 | 1.0027 | | 0.6337 | 146.26 | 21500 | 0.9546 | | 0.6337 | 146.94 | 21600 | 1.0924 | | 0.6199 | 147.62 | 21700 | 0.9781 | | 0.6389 | 148.3 | 21800 | 1.0117 | | 0.6389 | 148.98 | 21900 | 0.9892 | | 0.638 | 149.66 | 22000 | 0.9263 | | 0.615 | 150.34 | 22100 | 0.9498 | | 0.6052 | 151.02 | 22200 | 0.9727 | | 0.6052 | 151.7 | 22300 | 0.9810 | | 0.6144 | 152.38 | 22400 | 0.9167 | | 0.6024 | 153.06 | 22500 | 0.9862 | | 0.6024 | 153.74 | 22600 | 1.0106 | | 0.6015 | 154.42 | 22700 | 1.0130 | | 0.5847 | 155.1 | 22800 | 1.0303 | | 0.5847 | 155.78 | 22900 | 0.9814 | | 0.6149 | 156.46 | 23000 | 0.8867 | | 0.5985 | 157.14 | 23100 | 0.9578 | | 0.5985 | 157.82 | 23200 | 1.0177 | | 0.6023 | 158.5 | 23300 | 0.9790 | | 0.5924 | 159.18 | 23400 | 0.9915 | | 0.5924 | 159.86 | 23500 | 0.9732 | | 0.5974 | 160.54 | 23600 | 0.9765 | | 0.6002 | 161.22 | 23700 | 0.9913 | | 0.6002 | 161.9 | 23800 | 1.0328 | | 0.5858 | 162.59 | 23900 | 0.9185 | | 0.5894 | 163.27 | 24000 | 0.9617 | | 0.5894 | 163.95 | 24100 | 0.9610 | | 0.5677 | 164.63 | 24200 | 0.9228 | | 0.5782 | 165.31 | 24300 | 0.9632 | | 0.5782 | 165.99 | 24400 | 0.9346 | | 0.5772 | 166.67 | 24500 | 1.0165 | | 0.5823 | 167.35 | 24600 | 1.0094 | | 0.5719 | 168.03 | 24700 | 0.9632 | | 0.5719 | 168.71 | 24800 | 0.9426 | | 0.5629 | 169.39 | 24900 | 0.9430 | | 0.5665 | 170.07 | 25000 | 0.9907 | | 0.5665 | 170.75 | 25100 | 0.9612 | | 0.5634 | 171.43 | 25200 | 1.0117 | | 
0.5662 | 172.11 | 25300 | 1.0252 | | 0.5662 | 172.79 | 25400 | 0.9665 | | 0.5645 | 173.47 | 25500 | 0.9646 | | 0.5567 | 174.15 | 25600 | 0.9745 | | 0.5567 | 174.83 | 25700 | 0.9662 | | 0.5676 | 175.51 | 25800 | 0.9624 | | 0.5614 | 176.19 | 25900 | 0.9740 | | 0.5614 | 176.87 | 26000 | 0.9564 | | 0.5498 | 177.55 | 26100 | 0.9050 | | 0.5664 | 178.23 | 26200 | 0.9700 | | 0.5664 | 178.91 | 26300 | 1.0037 | | 0.5471 | 179.59 | 26400 | 0.9914 | | 0.5366 | 180.27 | 26500 | 1.0204 | | 0.5366 | 180.95 | 26600 | 0.9942 | | 0.5436 | 181.63 | 26700 | 0.9809 | | 0.5703 | 182.31 | 26800 | 1.0165 | | 0.5703 | 182.99 | 26900 | 0.9786 | | 0.549 | 183.67 | 27000 | 1.0115 | | 0.5397 | 184.35 | 27100 | 1.0087 | | 0.5344 | 185.03 | 27200 | 0.9985 | | 0.5344 | 185.71 | 27300 | 0.9601 | | 0.5346 | 186.39 | 27400 | 0.9388 | | 0.5548 | 187.07 | 27500 | 0.9791 | | 0.5548 | 187.76 | 27600 | 0.9298 | | 0.5437 | 188.44 | 27700 | 1.0127 | | 0.5551 | 189.12 | 27800 | 0.9693 | | 0.5551 | 189.8 | 27900 | 0.9636 | | 0.5438 | 190.48 | 28000 | 0.9502 | | 0.5263 | 191.16 | 28100 | 0.9204 | | 0.5263 | 191.84 | 28200 | 0.9547 | | 0.5232 | 192.52 | 28300 | 0.9199 | | 0.525 | 193.2 | 28400 | 1.0316 | | 0.525 | 193.88 | 28500 | 0.9328 | | 0.5372 | 194.56 | 28600 | 0.9614 | | 0.5478 | 195.24 | 28700 | 0.9657 | | 0.5478 | 195.92 | 28800 | 0.9648 | | 0.5401 | 196.6 | 28900 | 0.9427 | | 0.5338 | 197.28 | 29000 | 0.9627 | | 0.5338 | 197.96 | 29100 | 0.9876 | | 0.5131 | 198.64 | 29200 | 0.9777 | | 0.522 | 199.32 | 29300 | 1.0747 | | 0.522 | 200.0 | 29400 | 1.0181 | | 0.5275 | 200.68 | 29500 | 0.9527 | | 0.5342 | 201.36 | 29600 | 1.0019 | | 0.5297 | 202.04 | 29700 | 0.9576 | | 0.5297 | 202.72 | 29800 | 0.9968 | | 0.5367 | 203.4 | 29900 | 0.9542 | | 0.5148 | 204.08 | 30000 | 0.9250 | | 0.5148 | 204.76 | 30100 | 1.0072 | | 0.5176 | 205.44 | 30200 | 0.9485 | | 0.5125 | 206.12 | 30300 | 0.9220 | | 0.5125 | 206.8 | 30400 | 0.9326 | | 0.5075 | 207.48 | 30500 | 0.9153 | | 0.5084 | 208.16 | 30600 | 0.9837 | | 0.5084 | 
208.84 | 30700 | 0.9482 | | 0.503 | 209.52 | 30800 | 0.9677 | | 0.5001 | 210.2 | 30900 | 0.9626 | | 0.5001 | 210.88 | 31000 | 0.9106 | | 0.5115 | 211.56 | 31100 | 1.0392 | | 0.5012 | 212.24 | 31200 | 0.9873 | | 0.5012 | 212.93 | 31300 | 0.9727 | | 0.5122 | 213.61 | 31400 | 1.0177 | | 0.4997 | 214.29 | 31500 | 0.9833 | | 0.4997 | 214.97 | 31600 | 0.9190 | | 0.5147 | 215.65 | 31700 | 0.9619 | | 0.5122 | 216.33 | 31800 | 0.8989 | | 0.4964 | 217.01 | 31900 | 0.8954 | | 0.4964 | 217.69 | 32000 | 0.9823 | | 0.4953 | 218.37 | 32100 | 1.0035 | | 0.4951 | 219.05 | 32200 | 0.9277 | | 0.4951 | 219.73 | 32300 | 0.9064 | | 0.5088 | 220.41 | 32400 | 0.9687 | | 0.5003 | 221.09 | 32500 | 1.0024 | | 0.5003 | 221.77 | 32600 | 0.9359 | | 0.5013 | 222.45 | 32700 | 0.8833 | | 0.5002 | 223.13 | 32800 | 0.8583 | | 0.5002 | 223.81 | 32900 | 0.8660 | | 0.4936 | 224.49 | 33000 | 0.8381 | | 0.4919 | 225.17 | 33100 | 0.8624 | | 0.4919 | 225.85 | 33200 | 0.8423 | | 0.5002 | 226.53 | 33300 | 0.8991 | | 0.4781 | 227.21 | 33400 | 0.9186 | | 0.4781 | 227.89 | 33500 | 0.8910 | | 0.4823 | 228.57 | 33600 | 0.9290 | | 0.4899 | 229.25 | 33700 | 0.9599 | | 0.4899 | 229.93 | 33800 | 0.8219 | | 0.4986 | 230.61 | 33900 | 0.8769 | | 0.4837 | 231.29 | 34000 | 0.9619 | | 0.4837 | 231.97 | 34100 | 0.9140 | | 0.4838 | 232.65 | 34200 | 0.9978 | | 0.491 | 233.33 | 34300 | 0.9176 | | 0.4786 | 234.01 | 34400 | 0.9227 | | 0.4786 | 234.69 | 34500 | 0.9498 | | 0.4754 | 235.37 | 34600 | 0.9387 | | 0.476 | 236.05 | 34700 | 0.9002 | | 0.476 | 236.73 | 34800 | 0.9502 | | 0.4869 | 237.41 | 34900 | 0.9350 | | 0.4638 | 238.1 | 35000 | 0.9066 | | 0.4638 | 238.78 | 35100 | 0.8994 | | 0.4748 | 239.46 | 35200 | 0.9009 | | 0.4617 | 240.14 | 35300 | 0.9449 | | 0.4617 | 240.82 | 35400 | 0.9188 | | 0.47 | 241.5 | 35500 | 0.9288 | | 0.4572 | 242.18 | 35600 | 0.9002 | | 0.4572 | 242.86 | 35700 | 0.9040 | | 0.4687 | 243.54 | 35800 | 0.9652 | | 0.4808 | 244.22 | 35900 | 0.9639 | | 0.4808 | 244.9 | 36000 | 0.8987 | | 0.4647 | 245.58 | 
36100 | 0.8977 | | 0.4728 | 246.26 | 36200 | 0.9150 | | 0.4728 | 246.94 | 36300 | 0.8753 | | 0.464 | 247.62 | 36400 | 0.9486 | | 0.4628 | 248.3 | 36500 | 0.8833 | | 0.4628 | 248.98 | 36600 | 0.9540 | | 0.4692 | 249.66 | 36700 | 0.8930 | | 0.4732 | 250.34 | 36800 | 0.9098 | | 0.4552 | 251.02 | 36900 | 0.9363 | | 0.4552 | 251.7 | 37000 | 0.9720 | | 0.458 | 252.38 | 37100 | 0.8646 | | 0.4576 | 253.06 | 37200 | 0.9070 | | 0.4576 | 253.74 | 37300 | 0.9384 | | 0.4575 | 254.42 | 37400 | 0.8082 | | 0.4673 | 255.1 | 37500 | 0.9216 | | 0.4673 | 255.78 | 37600 | 0.8547 | | 0.4685 | 256.46 | 37700 | 0.9245 | | 0.4593 | 257.14 | 37800 | 0.9047 | | 0.4593 | 257.82 | 37900 | 0.8846 | | 0.4549 | 258.5 | 38000 | 0.9293 | | 0.4573 | 259.18 | 38100 | 0.8907 | | 0.4573 | 259.86 | 38200 | 0.9024 | | 0.463 | 260.54 | 38300 | 0.9144 | | 0.4549 | 261.22 | 38400 | 0.9190 | | 0.4549 | 261.9 | 38500 | 0.8713 | | 0.4459 | 262.59 | 38600 | 0.8938 | | 0.4625 | 263.27 | 38700 | 0.8699 | | 0.4625 | 263.95 | 38800 | 0.8854 | | 0.4379 | 264.63 | 38900 | 0.8578 | | 0.4458 | 265.31 | 39000 | 0.9256 | | 0.4458 | 265.99 | 39100 | 0.9711 | | 0.4438 | 266.67 | 39200 | 0.9254 | | 0.4515 | 267.35 | 39300 | 0.9599 | | 0.4565 | 268.03 | 39400 | 0.9208 | | 0.4565 | 268.71 | 39500 | 0.9153 | | 0.4586 | 269.39 | 39600 | 0.8639 | | 0.4368 | 270.07 | 39700 | 0.8932 | | 0.4368 | 270.75 | 39800 | 0.9732 | | 0.4458 | 271.43 | 39900 | 1.0161 | | 0.4452 | 272.11 | 40000 | 0.9847 | | 0.4452 | 272.79 | 40100 | 0.9129 | | 0.4499 | 273.47 | 40200 | 0.9575 | | 0.4308 | 274.15 | 40300 | 0.9167 | | 0.4308 | 274.83 | 40400 | 0.9678 | | 0.4399 | 275.51 | 40500 | 0.9841 | | 0.4355 | 276.19 | 40600 | 0.9262 | | 0.4355 | 276.87 | 40700 | 0.9440 | | 0.4312 | 277.55 | 40800 | 0.9780 | | 0.4259 | 278.23 | 40900 | 0.9153 | | 0.4259 | 278.91 | 41000 | 0.9735 | | 0.4354 | 279.59 | 41100 | 0.9483 | | 0.4318 | 280.27 | 41200 | 0.9608 | | 0.4318 | 280.95 | 41300 | 0.9413 | | 0.438 | 281.63 | 41400 | 0.9569 | | 0.4388 | 282.31 | 41500 | 
0.9049 | | 0.4388 | 282.99 | 41600 | 0.8986 | | 0.4438 | 283.67 | 41700 | 0.9700 | | 0.4368 | 284.35 | 41800 | 0.9049 | | 0.4371 | 285.03 | 41900 | 0.8275 | | 0.4371 | 285.71 | 42000 | 1.0013 | | 0.4497 | 286.39 | 42100 | 0.9242 | | 0.4601 | 287.07 | 42200 | 0.9197 | | 0.4601 | 287.76 | 42300 | 0.8905 | | 0.4428 | 288.44 | 42400 | 0.8584 | | 0.4369 | 289.12 | 42500 | 0.8881 | | 0.4369 | 289.8 | 42600 | 0.9121 | | 0.4325 | 290.48 | 42700 | 0.8598 | | 0.4266 | 291.16 | 42800 | 0.9031 | | 0.4266 | 291.84 | 42900 | 0.8444 | | 0.4218 | 292.52 | 43000 | 0.8966 | | 0.4252 | 293.2 | 43100 | 0.9224 | | 0.4252 | 293.88 | 43200 | 1.0000 | | 0.4231 | 294.56 | 43300 | 0.9438 | | 0.4204 | 295.24 | 43400 | 0.8706 | | 0.4204 | 295.92 | 43500 | 0.8563 | | 0.4207 | 296.6 | 43600 | 0.9680 | | 0.4247 | 297.28 | 43700 | 0.8682 | | 0.4247 | 297.96 | 43800 | 0.9071 | | 0.4247 | 298.64 | 43900 | 0.8642 | | 0.4183 | 299.32 | 44000 | 0.8874 | | 0.4183 | 300.0 | 44100 | 0.9027 | | 0.4352 | 300.68 | 44200 | 0.8447 | | 0.4241 | 301.36 | 44300 | 0.9028 | | 0.4266 | 302.04 | 44400 | 0.9055 | | 0.4266 | 302.72 | 44500 | 0.9251 | | 0.4183 | 303.4 | 44600 | 0.9440 | | 0.4148 | 304.08 | 44700 | 0.9566 | | 0.4148 | 304.76 | 44800 | 0.8994 | | 0.4217 | 305.44 | 44900 | 1.0046 | | 0.4165 | 306.12 | 45000 | 0.8346 | | 0.4165 | 306.8 | 45100 | 0.8727 | | 0.4129 | 307.48 | 45200 | 0.9284 | | 0.408 | 308.16 | 45300 | 0.9695 | | 0.408 | 308.84 | 45400 | 0.9798 | | 0.3986 | 309.52 | 45500 | 0.9456 | | 0.4219 | 310.2 | 45600 | 0.9017 | | 0.4219 | 310.88 | 45700 | 0.9370 | | 0.422 | 311.56 | 45800 | 0.8430 | | 0.415 | 312.24 | 45900 | 0.9242 | | 0.415 | 312.93 | 46000 | 0.9381 | | 0.4173 | 313.61 | 46100 | 0.8775 | | 0.4204 | 314.29 | 46200 | 0.9259 | | 0.4204 | 314.97 | 46300 | 0.9272 | | 0.4073 | 315.65 | 46400 | 0.8997 | | 0.4137 | 316.33 | 46500 | 0.9177 | | 0.4043 | 317.01 | 46600 | 0.9592 | | 0.4043 | 317.69 | 46700 | 0.9665 | | 0.4224 | 318.37 | 46800 | 0.8610 | | 0.415 | 319.05 | 46900 | 0.8602 | | 
0.415 | 319.73 | 47000 | 0.9231 | | 0.4103 | 320.41 | 47100 | 0.9351 | | 0.4162 | 321.09 | 47200 | 0.9975 | | 0.4162 | 321.77 | 47300 | 0.9037 | | 0.4083 | 322.45 | 47400 | 0.8951 | | 0.4173 | 323.13 | 47500 | 0.9530 | | 0.4173 | 323.81 | 47600 | 0.8620 | | 0.4124 | 324.49 | 47700 | 0.9234 | | 0.413 | 325.17 | 47800 | 0.9347 | | 0.413 | 325.85 | 47900 | 0.9841 | | 0.4117 | 326.53 | 48000 | 0.9996 | | 0.4127 | 327.21 | 48100 | 0.9128 | | 0.4127 | 327.89 | 48200 | 0.8949 | | 0.3967 | 328.57 | 48300 | 0.9390 | | 0.4068 | 329.25 | 48400 | 0.9034 | | 0.4068 | 329.93 | 48500 | 0.9314 | | 0.4086 | 330.61 | 48600 | 0.9609 | | 0.4143 | 331.29 | 48700 | 0.9333 | | 0.4143 | 331.97 | 48800 | 0.9294 | | 0.4144 | 332.65 | 48900 | 0.8984 | | 0.4077 | 333.33 | 49000 | 1.0073 | | 0.3953 | 334.01 | 49100 | 0.9610 | | 0.3953 | 334.69 | 49200 | 0.9907 | | 0.3961 | 335.37 | 49300 | 0.9689 | | 0.4122 | 336.05 | 49400 | 0.9386 | | 0.4122 | 336.73 | 49500 | 0.9186 | | 0.3946 | 337.41 | 49600 | 0.9927 | | 0.4021 | 338.1 | 49700 | 1.0131 | | 0.4021 | 338.78 | 49800 | 1.0783 | | 0.4039 | 339.46 | 49900 | 0.9340 | | 0.3989 | 340.14 | 50000 | 0.9544 | | 0.3989 | 340.82 | 50100 | 0.9124 | | 0.4054 | 341.5 | 50200 | 0.9842 | | 0.4059 | 342.18 | 50300 | 0.9947 | | 0.4059 | 342.86 | 50400 | 0.9687 | | 0.403 | 343.54 | 50500 | 0.9740 | | 0.3969 | 344.22 | 50600 | 0.9313 | | 0.3969 | 344.9 | 50700 | 0.9186 | | 0.4034 | 345.58 | 50800 | 0.9666 | | 0.4101 | 346.26 | 50900 | 0.8962 | | 0.4101 | 346.94 | 51000 | 0.9590 | | 0.4141 | 347.62 | 51100 | 0.9583 | | 0.4029 | 348.3 | 51200 | 0.9203 | | 0.4029 | 348.98 | 51300 | 0.9875 | | 0.4141 | 349.66 | 51400 | 0.9645 | | 0.3974 | 350.34 | 51500 | 0.9310 | | 0.3881 | 351.02 | 51600 | 0.9739 | | 0.3881 | 351.7 | 51700 | 0.9420 | | 0.384 | 352.38 | 51800 | 0.9549 | | 0.3902 | 353.06 | 51900 | 0.9647 | | 0.3902 | 353.74 | 52000 | 0.9604 | | 0.3846 | 354.42 | 52100 | 0.9756 | | 0.3906 | 355.1 | 52200 | 0.9419 | | 0.3906 | 355.78 | 52300 | 0.9461 | | 0.3712 | 
356.46 | 52400 | 0.9420 | | 0.3769 | 357.14 | 52500 | 0.9315 | | 0.3769 | 357.82 | 52600 | 0.9119 | | 0.3896 | 358.5 | 52700 | 0.9798 | | 0.3787 | 359.18 | 52800 | 0.9941 | | 0.3787 | 359.86 | 52900 | 0.9364 | | 0.3903 | 360.54 | 53000 | 0.9152 | | 0.3823 | 361.22 | 53100 | 0.9817 | | 0.3823 | 361.9 | 53200 | 0.9087 | | 0.378 | 362.59 | 53300 | 0.9299 | | 0.3882 | 363.27 | 53400 | 0.9989 | | 0.3882 | 363.95 | 53500 | 0.9168 | | 0.3787 | 364.63 | 53600 | 0.9464 | | 0.3835 | 365.31 | 53700 | 0.9010 | | 0.3835 | 365.99 | 53800 | 0.8880 | | 0.3751 | 366.67 | 53900 | 0.9004 | | 0.3745 | 367.35 | 54000 | 0.9491 | | 0.3776 | 368.03 | 54100 | 1.0176 | | 0.3776 | 368.71 | 54200 | 0.9734 | | 0.3744 | 369.39 | 54300 | 0.9464 | | 0.3659 | 370.07 | 54400 | 0.9967 | | 0.3659 | 370.75 | 54500 | 0.9905 | | 0.3671 | 371.43 | 54600 | 0.9456 | | 0.3747 | 372.11 | 54700 | 0.9371 | | 0.3747 | 372.79 | 54800 | 0.8921 | | 0.3728 | 373.47 | 54900 | 0.8826 | | 0.3776 | 374.15 | 55000 | 0.9630 | | 0.3776 | 374.83 | 55100 | 0.9353 | | 0.3694 | 375.51 | 55200 | 0.9479 | | 0.3768 | 376.19 | 55300 | 0.9303 | | 0.3768 | 376.87 | 55400 | 0.9540 | | 0.3747 | 377.55 | 55500 | 0.9383 | | 0.3737 | 378.23 | 55600 | 0.9170 | | 0.3737 | 378.91 | 55700 | 0.8026 | | 0.3757 | 379.59 | 55800 | 0.8989 | | 0.3678 | 380.27 | 55900 | 0.9248 | | 0.3678 | 380.95 | 56000 | 0.8190 | | 0.3738 | 381.63 | 56100 | 0.9005 | | 0.376 | 382.31 | 56200 | 0.8561 | | 0.376 | 382.99 | 56300 | 0.9408 | | 0.3714 | 383.67 | 56400 | 0.9226 | | 0.364 | 384.35 | 56500 | 0.9577 | | 0.3465 | 385.03 | 56600 | 0.9440 | | 0.3465 | 385.71 | 56700 | 0.9178 | | 0.374 | 386.39 | 56800 | 0.9044 | | 0.3727 | 387.07 | 56900 | 0.8633 | | 0.3727 | 387.76 | 57000 | 0.9078 | | 0.3735 | 388.44 | 57100 | 0.9021 | | 0.3655 | 389.12 | 57200 | 0.9499 | | 0.3655 | 389.8 | 57300 | 0.9290 | | 0.3615 | 390.48 | 57400 | 0.8906 | | 0.3588 | 391.16 | 57500 | 0.8692 | | 0.3588 | 391.84 | 57600 | 0.8857 | | 0.3639 | 392.52 | 57700 | 0.9569 | | 0.3557 | 393.2 | 
57800 | 0.9146 | | 0.3557 | 393.88 | 57900 | 0.9878 | | 0.3532 | 394.56 | 58000 | 0.8703 | | 0.3745 | 395.24 | 58100 | 0.8679 | | 0.3745 | 395.92 | 58200 | 0.8823 | | 0.3566 | 396.6 | 58300 | 0.9611 | | 0.3642 | 397.28 | 58400 | 0.9327 | | 0.3642 | 397.96 | 58500 | 0.8587 | | 0.3623 | 398.64 | 58600 | 0.8746 | | 0.3629 | 399.32 | 58700 | 0.9093 | | 0.3629 | 400.0 | 58800 | 0.8858 | | 0.354 | 400.68 | 58900 | 0.8902 | | 0.3487 | 401.36 | 59000 | 0.8693 | | 0.3467 | 402.04 | 59100 | 1.0825 | | 0.3467 | 402.72 | 59200 | 0.9697 | | 0.3517 | 403.4 | 59300 | 0.9169 | | 0.3696 | 404.08 | 59400 | 0.9237 | | 0.3696 | 404.76 | 59500 | 0.9033 | | 0.3629 | 405.44 | 59600 | 0.9062 | | 0.3559 | 406.12 | 59700 | 0.9159 | | 0.3559 | 406.8 | 59800 | 0.8730 | | 0.3603 | 407.48 | 59900 | 0.8732 | | 0.3676 | 408.16 | 60000 | 0.8897 | | 0.3676 | 408.84 | 60100 | 0.7334 | | 0.3584 | 409.52 | 60200 | 0.8494 | | 0.3449 | 410.2 | 60300 | 0.8944 | | 0.3449 | 410.88 | 60400 | 0.8014 | | 0.3513 | 411.56 | 60500 | 0.8673 | | 0.3497 | 412.24 | 60600 | 0.9071 | | 0.3497 | 412.93 | 60700 | 0.8574 | | 0.3582 | 413.61 | 60800 | 0.9135 | | 0.3555 | 414.29 | 60900 | 0.8723 | | 0.3555 | 414.97 | 61000 | 0.8772 | | 0.3453 | 415.65 | 61100 | 0.9041 | | 0.3451 | 416.33 | 61200 | 0.8647 | | 0.3499 | 417.01 | 61300 | 0.9790 | | 0.3499 | 417.69 | 61400 | 0.9859 | | 0.3502 | 418.37 | 61500 | 0.8996 | | 0.3534 | 419.05 | 61600 | 0.9446 | | 0.3534 | 419.73 | 61700 | 0.8965 | | 0.3388 | 420.41 | 61800 | 0.9314 | | 0.3441 | 421.09 | 61900 | 0.8792 | | 0.3441 | 421.77 | 62000 | 0.9422 | | 0.3443 | 422.45 | 62100 | 0.9591 | | 0.3681 | 423.13 | 62200 | 0.9265 | | 0.3681 | 423.81 | 62300 | 0.8872 | | 0.3557 | 424.49 | 62400 | 0.8528 | | 0.3553 | 425.17 | 62500 | 0.9701 | | 0.3553 | 425.85 | 62600 | 0.9512 | | 0.3523 | 426.53 | 62700 | 0.9026 | | 0.3467 | 427.21 | 62800 | 0.9087 | | 0.3467 | 427.89 | 62900 | 1.0169 | | 0.3521 | 428.57 | 63000 | 0.9314 | | 0.3411 | 429.25 | 63100 | 0.9291 | | 0.3411 | 429.93 | 63200 | 
0.9567 | | 0.3469 | 430.61 | 63300 | 0.9458 | | 0.3449 | 431.29 | 63400 | 0.9337 | | 0.3449 | 431.97 | 63500 | 0.9503 | | 0.3369 | 432.65 | 63600 | 0.8987 | | 0.3384 | 433.33 | 63700 | 0.8578 | | 0.3265 | 434.01 | 63800 | 0.9543 | | 0.3265 | 434.69 | 63900 | 0.9231 | | 0.3356 | 435.37 | 64000 | 0.9121 | | 0.3388 | 436.05 | 64100 | 0.9279 | | 0.3388 | 436.73 | 64200 | 0.8939 | | 0.3351 | 437.41 | 64300 | 0.8934 | | 0.3386 | 438.1 | 64400 | 0.9469 | | 0.3386 | 438.78 | 64500 | 0.9149 | | 0.3439 | 439.46 | 64600 | 0.8963 | | 0.3381 | 440.14 | 64700 | 0.8653 | | 0.3381 | 440.82 | 64800 | 0.8633 | | 0.3339 | 441.5 | 64900 | 0.8783 | | 0.3242 | 442.18 | 65000 | 0.9143 | | 0.3242 | 442.86 | 65100 | 0.9553 | | 0.3271 | 443.54 | 65200 | 0.8563 | | 0.3281 | 444.22 | 65300 | 0.9003 | | 0.3281 | 444.9 | 65400 | 0.8555 | | 0.3367 | 445.58 | 65500 | 0.9146 | | 0.3228 | 446.26 | 65600 | 0.9052 | | 0.3228 | 446.94 | 65700 | 0.9237 | | 0.3328 | 447.62 | 65800 | 0.9128 | | 0.324 | 448.3 | 65900 | 0.9159 | | 0.324 | 448.98 | 66000 | 0.8867 | | 0.3305 | 449.66 | 66100 | 0.9694 | | 0.3329 | 450.34 | 66200 | 0.9833 | | 0.3348 | 451.02 | 66300 | 0.9344 | | 0.3348 | 451.7 | 66400 | 0.9303 | | 0.321 | 452.38 | 66500 | 0.9275 | | 0.3335 | 453.06 | 66600 | 0.9419 | | 0.3335 | 453.74 | 66700 | 0.9502 | | 0.3189 | 454.42 | 66800 | 0.9341 | | 0.3386 | 455.1 | 66900 | 0.9404 | | 0.3386 | 455.78 | 67000 | 0.9660 | | 0.3273 | 456.46 | 67100 | 0.9323 | | 0.339 | 457.14 | 67200 | 0.9266 | | 0.339 | 457.82 | 67300 | 0.9289 | | 0.3326 | 458.5 | 67400 | 0.9248 | | 0.3207 | 459.18 | 67500 | 0.9374 | | 0.3207 | 459.86 | 67600 | 0.8996 | | 0.3339 | 460.54 | 67700 | 0.9271 | | 0.3198 | 461.22 | 67800 | 0.9627 | | 0.3198 | 461.9 | 67900 | 0.9429 | | 0.3208 | 462.59 | 68000 | 0.9561 | | 0.3147 | 463.27 | 68100 | 0.8795 | | 0.3147 | 463.95 | 68200 | 0.8876 | | 0.3222 | 464.63 | 68300 | 0.9007 | | 0.3241 | 465.31 | 68400 | 0.9475 | | 0.3241 | 465.99 | 68500 | 0.9403 | | 0.3312 | 466.67 | 68600 | 0.9368 | | 
0.3302 | 467.35 | 68700 | 0.8937 | | 0.3201 | 468.03 | 68800 | 0.9319 | | 0.3201 | 468.71 | 68900 | 0.9094 | | 0.3217 | 469.39 | 69000 | 0.9517 | | 0.3193 | 470.07 | 69100 | 0.8895 | | 0.3193 | 470.75 | 69200 | 0.9202 | | 0.3352 | 471.43 | 69300 | 0.9320 | | 0.3249 | 472.11 | 69400 | 0.9640 | | 0.3249 | 472.79 | 69500 | 0.9452 | | 0.3097 | 473.47 | 69600 | 0.9311 | | 0.327 | 474.15 | 69700 | 0.9392 | | 0.327 | 474.83 | 69800 | 0.9525 | | 0.3271 | 475.51 | 69900 | 0.9064 | | 0.3165 | 476.19 | 70000 | 0.9455 | | 0.3165 | 476.87 | 70100 | 0.9435 | | 0.3103 | 477.55 | 70200 | 0.8891 | | 0.3189 | 478.23 | 70300 | 0.9199 | | 0.3189 | 478.91 | 70400 | 0.9362 | | 0.3264 | 479.59 | 70500 | 0.9289 | | 0.313 | 480.27 | 70600 | 0.9246 | | 0.313 | 480.95 | 70700 | 0.9549 | | 0.3289 | 481.63 | 70800 | 0.9513 | | 0.3189 | 482.31 | 70900 | 0.9798 | | 0.3189 | 482.99 | 71000 | 0.9027 | | 0.3177 | 483.67 | 71100 | 0.8823 | | 0.3219 | 484.35 | 71200 | 0.9269 | | 0.3175 | 485.03 | 71300 | 0.8984 | | 0.3175 | 485.71 | 71400 | 0.8696 | | 0.3167 | 486.39 | 71500 | 0.8722 | | 0.318 | 487.07 | 71600 | 0.8909 | | 0.318 | 487.76 | 71700 | 0.8783 | | 0.3128 | 488.44 | 71800 | 0.8144 | | 0.315 | 489.12 | 71900 | 0.8250 | | 0.315 | 489.8 | 72000 | 0.8791 | | 0.3085 | 490.48 | 72100 | 0.9192 | | 0.3081 | 491.16 | 72200 | 0.8403 | | 0.3081 | 491.84 | 72300 | 0.9223 | | 0.31 | 492.52 | 72400 | 0.8974 | | 0.3054 | 493.2 | 72500 | 0.9169 | | 0.3054 | 493.88 | 72600 | 0.8845 | | 0.3134 | 494.56 | 72700 | 0.9554 | | 0.3083 | 495.24 | 72800 | 0.9337 | | 0.3083 | 495.92 | 72900 | 0.9209 | | 0.3028 | 496.6 | 73000 | 0.9142 | | 0.3016 | 497.28 | 73100 | 0.9345 | | 0.3016 | 497.96 | 73200 | 0.9100 | | 0.3075 | 498.64 | 73300 | 0.8989 | | 0.3105 | 499.32 | 73400 | 0.8598 | | 0.3105 | 500.0 | 73500 | 0.9177 | | 0.3059 | 500.68 | 73600 | 0.9242 | | 0.3018 | 501.36 | 73700 | 0.9403 | | 0.3159 | 502.04 | 73800 | 0.9011 | | 0.3159 | 502.72 | 73900 | 0.9442 | | 0.2996 | 503.4 | 74000 | 0.9575 | | 0.3016 | 504.08 
| 74100 | 0.9119 | | 0.3016 | 504.76 | 74200 | 0.9072 | | 0.3072 | 505.44 | 74300 | 0.9389 | | 0.3042 | 506.12 | 74400 | 0.9038 | | 0.3042 | 506.8 | 74500 | 0.8814 | | 0.3142 | 507.48 | 74600 | 0.9452 | | 0.3099 | 508.16 | 74700 | 0.9395 | | 0.3099 | 508.84 | 74800 | 0.9604 | | 0.3081 | 509.52 | 74900 | 0.9176 | | 0.3175 | 510.2 | 75000 | 0.8799 | | 0.3175 | 510.88 | 75100 | 0.8732 | | 0.3052 | 511.56 | 75200 | 0.8323 | | 0.2961 | 512.24 | 75300 | 0.8956 | | 0.2961 | 512.93 | 75400 | 0.8629 | | 0.3012 | 513.61 | 75500 | 0.8523 | | 0.2999 | 514.29 | 75600 | 0.8276 | | 0.2999 | 514.97 | 75700 | 0.9008 | | 0.298 | 515.65 | 75800 | 0.8051 | | 0.2968 | 516.33 | 75900 | 0.8240 | | 0.2907 | 517.01 | 76000 | 0.9271 | | 0.2907 | 517.69 | 76100 | 0.8934 | | 0.2859 | 518.37 | 76200 | 0.9044 | | 0.306 | 519.05 | 76300 | 0.8994 | | 0.306 | 519.73 | 76400 | 0.8539 | | 0.2947 | 520.41 | 76500 | 0.9063 | | 0.2977 | 521.09 | 76600 | 0.9074 | | 0.2977 | 521.77 | 76700 | 0.9297 | | 0.2991 | 522.45 | 76800 | 0.9109 | | 0.3013 | 523.13 | 76900 | 0.9491 | | 0.3013 | 523.81 | 77000 | 0.8518 | | 0.3 | 524.49 | 77100 | 0.9199 | | 0.3009 | 525.17 | 77200 | 0.9277 | | 0.3009 | 525.85 | 77300 | 0.9617 | | 0.3054 | 526.53 | 77400 | 0.9254 | | 0.2994 | 527.21 | 77500 | 0.8886 | | 0.2994 | 527.89 | 77600 | 0.8579 | | 0.2957 | 528.57 | 77700 | 0.9694 | | 0.3082 | 529.25 | 77800 | 0.9411 | | 0.3082 | 529.93 | 77900 | 0.8823 | | 0.2928 | 530.61 | 78000 | 0.8684 | | 0.2936 | 531.29 | 78100 | 0.9942 | | 0.2936 | 531.97 | 78200 | 0.8861 | | 0.2964 | 532.65 | 78300 | 0.8939 | | 0.2914 | 533.33 | 78400 | 0.9633 | | 0.2928 | 534.01 | 78500 | 0.8713 | | 0.2928 | 534.69 | 78600 | 0.8938 | | 0.2909 | 535.37 | 78700 | 0.8905 | | 0.2966 | 536.05 | 78800 | 0.9006 | | 0.2966 | 536.73 | 78900 | 0.9431 | | 0.2886 | 537.41 | 79000 | 0.9343 | | 0.2922 | 538.1 | 79100 | 0.9032 | | 0.2922 | 538.78 | 79200 | 0.9507 | | 0.2817 | 539.46 | 79300 | 0.9199 | | 0.2917 | 540.14 | 79400 | 0.9156 | | 0.2917 | 540.82 | 79500 | 
0.9175 | | 0.29 | 541.5 | 79600 | 0.9104 | | 0.291 | 542.18 | 79700 | 0.9223 | | 0.291 | 542.86 | 79800 | 0.9622 | | 0.3055 | 543.54 | 79900 | 0.8998 | | 0.2842 | 544.22 | 80000 | 0.9216 | | 0.2842 | 544.9 | 80100 | 0.9475 | | 0.2952 | 545.58 | 80200 | 0.9345 | | 0.278 | 546.26 | 80300 | 0.9923 | | 0.278 | 546.94 | 80400 | 0.9217 | | 0.2882 | 547.62 | 80500 | 0.9385 | | 0.286 | 548.3 | 80600 | 0.9422 | | 0.286 | 548.98 | 80700 | 0.9100 | | 0.2828 | 549.66 | 80800 | 0.9751 | | 0.2903 | 550.34 | 80900 | 0.9360 | | 0.2803 | 551.02 | 81000 | 0.9827 | | 0.2803 | 551.7 | 81100 | 0.9771 | | 0.282 | 552.38 | 81200 | 1.0085 | | 0.2901 | 553.06 | 81300 | 0.9342 | | 0.2901 | 553.74 | 81400 | 1.0034 | | 0.2822 | 554.42 | 81500 | 0.9586 | | 0.281 | 555.1 | 81600 | 0.9590 | | 0.281 | 555.78 | 81700 | 0.9488 | | 0.2824 | 556.46 | 81800 | 0.9709 | | 0.287 | 557.14 | 81900 | 0.9507 | | 0.287 | 557.82 | 82000 | 0.9429 | | 0.2873 | 558.5 | 82100 | 0.9334 | | 0.2806 | 559.18 | 82200 | 0.9271 | | 0.2806 | 559.86 | 82300 | 0.9470 | | 0.2892 | 560.54 | 82400 | 0.9602 | | 0.2772 | 561.22 | 82500 | 0.9843 | | 0.2772 | 561.9 | 82600 | 0.9335 | | 0.2881 | 562.59 | 82700 | 0.9451 | | 0.2816 | 563.27 | 82800 | 0.9621 | | 0.2816 | 563.95 | 82900 | 0.9989 | | 0.2813 | 564.63 | 83000 | 0.9163 | | 0.2804 | 565.31 | 83100 | 0.9638 | | 0.2804 | 565.99 | 83200 | 0.9520 | | 0.2748 | 566.67 | 83300 | 0.9263 | | 0.2795 | 567.35 | 83400 | 0.9293 | | 0.2804 | 568.03 | 83500 | 0.9620 | | 0.2804 | 568.71 | 83600 | 0.9169 | | 0.2741 | 569.39 | 83700 | 0.9286 | | 0.2718 | 570.07 | 83800 | 0.9334 | | 0.2718 | 570.75 | 83900 | 0.9654 | | 0.2782 | 571.43 | 84000 | 0.9761 | | 0.2843 | 572.11 | 84100 | 0.9883 | | 0.2843 | 572.79 | 84200 | 0.9993 | | 0.2805 | 573.47 | 84300 | 0.9312 | | 0.2793 | 574.15 | 84400 | 0.9932 | | 0.2793 | 574.83 | 84500 | 0.9828 | | 0.2722 | 575.51 | 84600 | 0.9558 | | 0.273 | 576.19 | 84700 | 0.9739 | | 0.273 | 576.87 | 84800 | 0.9193 | | 0.2706 | 577.55 | 84900 | 0.9511 | | 0.2745 | 
578.23 | 85000 | 0.9054 | | 0.2745 | 578.91 | 85100 | 0.9574 | | 0.2715 | 579.59 | 85200 | 0.9881 | | 0.2715 | 580.27 | 85300 | 0.9603 | | 0.2715 | 580.95 | 85400 | 1.0218 | | 0.2789 | 581.63 | 85500 | 0.9076 | | 0.274 | 582.31 | 85600 | 0.9393 | | 0.274 | 582.99 | 85700 | 0.8968 | | 0.2762 | 583.67 | 85800 | 0.9474 | | 0.2767 | 584.35 | 85900 | 0.9883 | | 0.2688 | 585.03 | 86000 | 0.9717 | | 0.2688 | 585.71 | 86100 | 1.0013 | | 0.2706 | 586.39 | 86200 | 0.9569 | | 0.2739 | 587.07 | 86300 | 0.9369 | | 0.2739 | 587.76 | 86400 | 0.8882 | | 0.2716 | 588.44 | 86500 | 0.9189 | | 0.2693 | 589.12 | 86600 | 0.9402 | | 0.2693 | 589.8 | 86700 | 0.9262 | | 0.2667 | 590.48 | 86800 | 0.9782 | | 0.268 | 591.16 | 86900 | 0.9457 | | 0.268 | 591.84 | 87000 | 0.9509 | | 0.2726 | 592.52 | 87100 | 0.9320 | | 0.275 | 593.2 | 87200 | 0.9357 | | 0.275 | 593.88 | 87300 | 0.9786 | | 0.2673 | 594.56 | 87400 | 0.9770 | | 0.2684 | 595.24 | 87500 | 0.9389 | | 0.2684 | 595.92 | 87600 | 0.9558 | | 0.2664 | 596.6 | 87700 | 0.9698 | | 0.2691 | 597.28 | 87800 | 1.0059 | | 0.2691 | 597.96 | 87900 | 0.9660 | | 0.2753 | 598.64 | 88000 | 0.9761 | | 0.2547 | 599.32 | 88100 | 0.9627 | | 0.2547 | 600.0 | 88200 | 0.9621 | | 0.2691 | 600.68 | 88300 | 0.9752 | | 0.266 | 601.36 | 88400 | 0.9677 | | 0.2675 | 602.04 | 88500 | 0.9663 | | 0.2675 | 602.72 | 88600 | 0.9749 | | 0.2747 | 603.4 | 88700 | 0.9452 | | 0.2674 | 604.08 | 88800 | 0.9587 | | 0.2674 | 604.76 | 88900 | 0.9693 | | 0.2801 | 605.44 | 89000 | 0.9513 | | 0.2722 | 606.12 | 89100 | 0.9783 | | 0.2722 | 606.8 | 89200 | 0.9452 | | 0.2731 | 607.48 | 89300 | 0.9678 | | 0.2723 | 608.16 | 89400 | 0.9786 | | 0.2723 | 608.84 | 89500 | 0.9852 | | 0.2651 | 609.52 | 89600 | 0.9570 | | 0.2811 | 610.2 | 89700 | 0.9567 | | 0.2811 | 610.88 | 89800 | 0.9049 | | 0.2688 | 611.56 | 89900 | 0.9634 | | 0.2624 | 612.24 | 90000 | 0.8975 | | 0.2624 | 612.93 | 90100 | 0.9899 | | 0.2616 | 613.61 | 90200 | 0.9626 | | 0.2603 | 614.29 | 90300 | 0.9310 | | 0.2603 | 614.97 | 90400 
| 0.9788 | | 0.2721 | 615.65 | 90500 | 0.9413 | | 0.2622 | 616.33 | 90600 | 0.9807 | | 0.2683 | 617.01 | 90700 | 0.9218 | | 0.2683 | 617.69 | 90800 | 0.9893 | | 0.2573 | 618.37 | 90900 | 0.9086 | | 0.2654 | 619.05 | 91000 | 0.9373 | | 0.2654 | 619.73 | 91100 | 0.9583 | | 0.2647 | 620.41 | 91200 | 0.9232 | | 0.2616 | 621.09 | 91300 | 0.9738 | | 0.2616 | 621.77 | 91400 | 0.9405 | | 0.258 | 622.45 | 91500 | 0.9601 | | 0.2632 | 623.13 | 91600 | 0.9567 | | 0.2632 | 623.81 | 91700 | 0.9362 | | 0.2636 | 624.49 | 91800 | 0.9496 | | 0.2636 | 625.17 | 91900 | 1.0030 | | 0.2636 | 625.85 | 92000 | 0.9785 | | 0.2454 | 626.53 | 92100 | 0.9485 | | 0.2533 | 627.21 | 92200 | 0.9630 | | 0.2533 | 627.89 | 92300 | 0.9709 | | 0.2596 | 628.57 | 92400 | 0.9479 | | 0.256 | 629.25 | 92500 | 0.9214 | | 0.256 | 629.93 | 92600 | 0.9570 | | 0.255 | 630.61 | 92700 | 0.9472 | | 0.2613 | 631.29 | 92800 | 0.9457 | | 0.2613 | 631.97 | 92900 | 0.9615 | | 0.2703 | 632.65 | 93000 | 0.9583 | | 0.2582 | 633.33 | 93100 | 0.9601 | | 0.2634 | 634.01 | 93200 | 0.9444 | | 0.2634 | 634.69 | 93300 | 0.9499 | | 0.259 | 635.37 | 93400 | 0.9512 | | 0.2617 | 636.05 | 93500 | 0.9543 | | 0.2617 | 636.73 | 93600 | 0.9303 | | 0.2611 | 637.41 | 93700 | 0.9388 | | 0.2513 | 638.1 | 93800 | 0.9443 | | 0.2513 | 638.78 | 93900 | 0.9276 | | 0.2571 | 639.46 | 94000 | 0.9073 | | 0.2636 | 640.14 | 94100 | 0.9122 | | 0.2636 | 640.82 | 94200 | 0.9132 | | 0.2673 | 641.5 | 94300 | 0.9055 | | 0.2594 | 642.18 | 94400 | 0.9299 | | 0.2594 | 642.86 | 94500 | 0.9161 | | 0.2552 | 643.54 | 94600 | 0.9347 | | 0.254 | 644.22 | 94700 | 0.9239 | | 0.254 | 644.9 | 94800 | 0.9454 | | 0.2522 | 645.58 | 94900 | 0.9481 | | 0.2556 | 646.26 | 95000 | 0.9153 | | 0.2556 | 646.94 | 95100 | 0.9141 | | 0.2583 | 647.62 | 95200 | 0.9280 | | 0.2645 | 648.3 | 95300 | 0.9218 | | 0.2645 | 648.98 | 95400 | 0.9603 | | 0.2512 | 649.66 | 95500 | 0.9017 | | 0.2602 | 650.34 | 95600 | 0.9101 | | 0.255 | 651.02 | 95700 | 0.9184 | | 0.255 | 651.7 | 95800 | 0.9234 | | 
0.2547 | 652.38 | 95900 | 0.9194 | | 0.2546 | 653.06 | 96000 | 0.9825 | | 0.2546 | 653.74 | 96100 | 0.9515 | | 0.2526 | 654.42 | 96200 | 0.9067 | | 0.261 | 655.1 | 96300 | 0.9282 | | 0.261 | 655.78 | 96400 | 0.9561 | | 0.2545 | 656.46 | 96500 | 0.9466 | | 0.2509 | 657.14 | 96600 | 0.9294 | | 0.2509 | 657.82 | 96700 | 0.9114 | | 0.2503 | 658.5 | 96800 | 1.0040 | | 0.2482 | 659.18 | 96900 | 0.9106 | | 0.2482 | 659.86 | 97000 | 0.9159 | | 0.2523 | 660.54 | 97100 | 0.9490 | | 0.2528 | 661.22 | 97200 | 0.9538 | | 0.2528 | 661.9 | 97300 | 0.9570 | | 0.2455 | 662.59 | 97400 | 0.8882 | | 0.2502 | 663.27 | 97500 | 0.9164 | | 0.2502 | 663.95 | 97600 | 0.9269 | | 0.2465 | 664.63 | 97700 | 0.9628 | | 0.2524 | 665.31 | 97800 | 0.8976 | | 0.2524 | 665.99 | 97900 | 0.9017 | | 0.2479 | 666.67 | 98000 | 0.9197 | | 0.249 | 667.35 | 98100 | 0.9282 | | 0.2533 | 668.03 | 98200 | 0.9342 | | 0.2533 | 668.71 | 98300 | 0.9494 | | 0.2501 | 669.39 | 98400 | 0.9430 | | 0.2444 | 670.07 | 98500 | 0.9252 | | 0.2444 | 670.75 | 98600 | 0.9799 | | 0.243 | 671.43 | 98700 | 0.9195 | | 0.249 | 672.11 | 98800 | 0.9142 | | 0.249 | 672.79 | 98900 | 0.9553 | | 0.2528 | 673.47 | 99000 | 0.9196 | | 0.244 | 674.15 | 99100 | 0.9640 | | 0.244 | 674.83 | 99200 | 0.9809 | | 0.2462 | 675.51 | 99300 | 0.9868 | | 0.247 | 676.19 | 99400 | 0.9640 | | 0.247 | 676.87 | 99500 | 0.9228 | | 0.2615 | 677.55 | 99600 | 0.9172 | | 0.2487 | 678.23 | 99700 | 0.9166 | | 0.2487 | 678.91 | 99800 | 0.8928 | | 0.242 | 679.59 | 99900 | 0.8830 | | 0.2448 | 680.27 | 100000 | 0.9209 | | 0.2448 | 680.95 | 100100 | 0.9139 | | 0.2488 | 681.63 | 100200 | 0.8970 | | 0.2504 | 682.31 | 100300 | 0.9254 | | 0.2504 | 682.99 | 100400 | 0.9437 | | 0.2381 | 683.67 | 100500 | 0.9419 | | 0.245 | 684.35 | 100600 | 0.9379 | | 0.2452 | 685.03 | 100700 | 0.9465 | | 0.2452 | 685.71 | 100800 | 0.9626 | | 0.2482 | 686.39 | 100900 | 0.9472 | | 0.2456 | 687.07 | 101000 | 0.9434 | | 0.2456 | 687.76 | 101100 | 0.9426 | | 0.2388 | 688.44 | 101200 | 0.9440 | | 
0.2496 | 689.12 | 101300 | 0.9311 | | 0.2496 | 689.8 | 101400 | 0.9338 | | 0.2399 | 690.48 | 101500 | 0.9290 | | 0.2427 | 691.16 | 101600 | 0.9347 | | 0.2427 | 691.84 | 101700 | 0.9197 | | 0.2709 | 692.52 | 101800 | 0.9046 | | 0.2474 | 693.2 | 101900 | 0.9455 | | 0.2474 | 693.88 | 102000 | 0.9212 | | 0.2411 | 694.56 | 102100 | 0.9508 | | 0.242 | 695.24 | 102200 | 0.9558 | | 0.242 | 695.92 | 102300 | 0.9846 | | 0.2443 | 696.6 | 102400 | 0.9656 | | 0.2356 | 697.28 | 102500 | 0.9428 | | 0.2356 | 697.96 | 102600 | 0.9238 | | 0.2422 | 698.64 | 102700 | 0.9156 | | 0.2341 | 699.32 | 102800 | 0.9324 | | 0.2341 | 700.0 | 102900 | 0.9372 | | 0.2382 | 700.68 | 103000 | 0.9374 | | 0.2407 | 701.36 | 103100 | 0.9342 | | 0.2427 | 702.04 | 103200 | 0.9400 | | 0.2427 | 702.72 | 103300 | 0.9451 | | 0.2373 | 703.4 | 103400 | 0.9355 | | 0.2439 | 704.08 | 103500 | 0.9281 | | 0.2439 | 704.76 | 103600 | 0.9282 | | 0.247 | 705.44 | 103700 | 0.9186 | | 0.2391 | 706.12 | 103800 | 0.8933 | | 0.2391 | 706.8 | 103900 | 0.9392 | | 0.2467 | 707.48 | 104000 | 0.9764 | | 0.238 | 708.16 | 104100 | 0.9495 | | 0.238 | 708.84 | 104200 | 0.9409 | | 0.2436 | 709.52 | 104300 | 0.9296 | | 0.2396 | 710.2 | 104400 | 0.9472 | | 0.2396 | 710.88 | 104500 | 0.9574 | | 0.2476 | 711.56 | 104600 | 0.9231 | | 0.2397 | 712.24 | 104700 | 0.8930 | | 0.2397 | 712.93 | 104800 | 0.9173 | | 0.2448 | 713.61 | 104900 | 0.9187 | | 0.2448 | 714.29 | 105000 | 0.9194 | | 0.2448 | 714.97 | 105100 | 0.9242 | | 0.2365 | 715.65 | 105200 | 0.9254 | | 0.2374 | 716.33 | 105300 | 0.8915 | | 0.2417 | 717.01 | 105400 | 0.9117 | | 0.2417 | 717.69 | 105500 | 0.9284 | | 0.2355 | 718.37 | 105600 | 0.9527 | | 0.2344 | 719.05 | 105700 | 0.9486 | | 0.2344 | 719.73 | 105800 | 0.9683 | | 0.2387 | 720.41 | 105900 | 0.9552 | | 0.2395 | 721.09 | 106000 | 0.9223 | | 0.2395 | 721.77 | 106100 | 0.9092 | | 0.2433 | 722.45 | 106200 | 0.9380 | | 0.2353 | 723.13 | 106300 | 0.9535 | | 0.2353 | 723.81 | 106400 | 0.9584 | | 0.2375 | 724.49 | 106500 | 0.9346 | 
| 0.2325 | 725.17 | 106600 | 0.9275 | | 0.2325 | 725.85 | 106700 | 0.9382 | | 0.2335 | 726.53 | 106800 | 0.9199 | | 0.234 | 727.21 | 106900 | 0.9580 | | 0.234 | 727.89 | 107000 | 0.9703 | | 0.2324 | 728.57 | 107100 | 0.9901 | | 0.2303 | 729.25 | 107200 | 0.9820 | | 0.2303 | 729.93 | 107300 | 0.9646 | | 0.2291 | 730.61 | 107400 | 0.9548 | | 0.2415 | 731.29 | 107500 | 0.9346 | | 0.2415 | 731.97 | 107600 | 0.9242 | | 0.2385 | 732.65 | 107700 | 0.9535 | | 0.2324 | 733.33 | 107800 | 0.9393 | | 0.233 | 734.01 | 107900 | 0.9851 | | 0.233 | 734.69 | 108000 | 0.9890 | | 0.2324 | 735.37 | 108100 | 0.9940 | | 0.2332 | 736.05 | 108200 | 0.9818 | | 0.2332 | 736.73 | 108300 | 0.9922 | | 0.2317 | 737.41 | 108400 | 0.9977 | | 0.2343 | 738.1 | 108500 | 0.9593 | | 0.2343 | 738.78 | 108600 | 0.9995 | | 0.2288 | 739.46 | 108700 | 1.0020 | | 0.2291 | 740.14 | 108800 | 0.9864 | | 0.2291 | 740.82 | 108900 | 0.9740 | | 0.2308 | 741.5 | 109000 | 0.9993 | | 0.2347 | 742.18 | 109100 | 1.0078 | | 0.2347 | 742.86 | 109200 | 0.9762 | | 0.2309 | 743.54 | 109300 | 0.9776 | | 0.2264 | 744.22 | 109400 | 0.9629 | | 0.2264 | 744.9 | 109500 | 0.9830 | | 0.2312 | 745.58 | 109600 | 0.9737 | | 0.2253 | 746.26 | 109700 | 1.0124 | | 0.2253 | 746.94 | 109800 | 1.0121 | | 0.2275 | 747.62 | 109900 | 0.9924 | | 0.2331 | 748.3 | 110000 | 0.9456 | | 0.2331 | 748.98 | 110100 | 0.9486 | | 0.2275 | 749.66 | 110200 | 0.9388 | | 0.2303 | 750.34 | 110300 | 0.9762 | | 0.2322 | 751.02 | 110400 | 0.9643 | | 0.2322 | 751.7 | 110500 | 0.9878 | | 0.232 | 752.38 | 110600 | 0.9780 | | 0.2348 | 753.06 | 110700 | 0.9774 | | 0.2348 | 753.74 | 110800 | 1.0018 | | 0.2312 | 754.42 | 110900 | 0.9684 | | 0.2304 | 755.1 | 111000 | 0.9828 | | 0.2304 | 755.78 | 111100 | 0.9591 | | 0.2412 | 756.46 | 111200 | 0.9862 | | 0.2313 | 757.14 | 111300 | 0.9796 | | 0.2313 | 757.82 | 111400 | 0.9653 | | 0.2309 | 758.5 | 111500 | 0.9666 | | 0.2293 | 759.18 | 111600 | 1.0382 | | 0.2293 | 759.86 | 111700 | 1.0208 | | 0.235 | 760.54 | 111800 | 1.0372 
| | 0.2337 | 761.22 | 111900 | 1.0057 | | 0.2337 | 761.9 | 112000 | 1.0245 | | 0.2309 | 762.59 | 112100 | 0.9766 | | 0.2275 | 763.27 | 112200 | 0.9449 | | 0.2275 | 763.95 | 112300 | 0.9659 | | 0.2263 | 764.63 | 112400 | 0.9614 | | 0.2325 | 765.31 | 112500 | 0.9605 | | 0.2325 | 765.99 | 112600 | 0.9494 | | 0.2292 | 766.67 | 112700 | 0.9632 | | 0.2246 | 767.35 | 112800 | 0.9762 | | 0.2268 | 768.03 | 112900 | 0.9754 | | 0.2268 | 768.71 | 113000 | 0.9704 | | 0.2274 | 769.39 | 113100 | 0.9722 | | 0.2234 | 770.07 | 113200 | 0.9678 | | 0.2234 | 770.75 | 113300 | 0.9736 | | 0.2238 | 771.43 | 113400 | 1.0298 | | 0.2242 | 772.11 | 113500 | 0.9642 | | 0.2242 | 772.79 | 113600 | 0.9844 | | 0.2257 | 773.47 | 113700 | 0.9649 | | 0.225 | 774.15 | 113800 | 0.9992 | | 0.225 | 774.83 | 113900 | 0.9868 | | 0.2262 | 775.51 | 114000 | 1.0092 | | 0.2266 | 776.19 | 114100 | 0.9961 | | 0.2266 | 776.87 | 114200 | 0.9714 | | 0.2314 | 777.55 | 114300 | 0.9864 | | 0.217 | 778.23 | 114400 | 0.9824 | | 0.217 | 778.91 | 114500 | 0.9910 | | 0.2248 | 779.59 | 114600 | 0.9945 | | 0.223 | 780.27 | 114700 | 0.9858 | | 0.223 | 780.95 | 114800 | 0.9657 | | 0.2312 | 781.63 | 114900 | 1.0191 | | 0.2223 | 782.31 | 115000 | 1.0089 | | 0.2223 | 782.99 | 115100 | 1.0103 | | 0.2222 | 783.67 | 115200 | 1.0265 | | 0.2231 | 784.35 | 115300 | 1.0014 | | 0.2276 | 785.03 | 115400 | 0.9888 | | 0.2276 | 785.71 | 115500 | 0.9721 | | 0.222 | 786.39 | 115600 | 0.9885 | | 0.2142 | 787.07 | 115700 | 0.9856 | | 0.2142 | 787.76 | 115800 | 0.9973 | | 0.2208 | 788.44 | 115900 | 0.9472 | | 0.223 | 789.12 | 116000 | 0.9729 | | 0.223 | 789.8 | 116100 | 0.9979 | | 0.2207 | 790.48 | 116200 | 0.9717 | | 0.2329 | 791.16 | 116300 | 0.9832 | | 0.2329 | 791.84 | 116400 | 0.9535 | | 0.2174 | 792.52 | 116500 | 0.9792 | | 0.219 | 793.2 | 116600 | 0.9819 | | 0.219 | 793.88 | 116700 | 1.0191 | | 0.2262 | 794.56 | 116800 | 1.0070 | | 0.2202 | 795.24 | 116900 | 0.9743 | | 0.2202 | 795.92 | 117000 | 0.9888 | | 0.2205 | 796.6 | 117100 | 0.9719 
| | 0.2217 | 797.28 | 117200 | 0.9671 | | 0.2217 | 797.96 | 117300 | 0.9480 | | 0.226 | 798.64 | 117400 | 0.9839 | | 0.2181 | 799.32 | 117500 | 0.9551 | | 0.2181 | 800.0 | 117600 | 0.9727 | | 0.2178 | 800.68 | 117700 | 0.9849 | | 0.2226 | 801.36 | 117800 | 0.9799 | | 0.2151 | 802.04 | 117900 | 0.9489 | | 0.2151 | 802.72 | 118000 | 0.9519 | | 0.2284 | 803.4 | 118100 | 0.9786 | | 0.2168 | 804.08 | 118200 | 0.9589 | | 0.2168 | 804.76 | 118300 | 0.9683 | | 0.2161 | 805.44 | 118400 | 0.9861 | | 0.2113 | 806.12 | 118500 | 0.9648 | | 0.2113 | 806.8 | 118600 | 0.9970 | | 0.2201 | 807.48 | 118700 | 0.9777 | | 0.2105 | 808.16 | 118800 | 0.9693 | | 0.2105 | 808.84 | 118900 | 0.9831 | | 0.2139 | 809.52 | 119000 | 0.9316 | | 0.2263 | 810.2 | 119100 | 0.9245 | | 0.2263 | 810.88 | 119200 | 0.9254 | | 0.2275 | 811.56 | 119300 | 0.9750 | | 0.2133 | 812.24 | 119400 | 0.9973 | | 0.2133 | 812.93 | 119500 | 0.9579 | | 0.2132 | 813.61 | 119600 | 0.9847 | | 0.2167 | 814.29 | 119700 | 0.9638 | | 0.2167 | 814.97 | 119800 | 0.9713 | | 0.2161 | 815.65 | 119900 | 0.9488 | | 0.2224 | 816.33 | 120000 | 1.0207 | | 0.215 | 817.01 | 120100 | 0.9745 | | 0.215 | 817.69 | 120200 | 0.9800 | | 0.2142 | 818.37 | 120300 | 0.9843 | | 0.2146 | 819.05 | 120400 | 0.9693 | | 0.2146 | 819.73 | 120500 | 0.9966 | | 0.2169 | 820.41 | 120600 | 0.9695 | | 0.2137 | 821.09 | 120700 | 0.9613 | | 0.2137 | 821.77 | 120800 | 0.9962 | | 0.2141 | 822.45 | 120900 | 0.9930 | | 0.2185 | 823.13 | 121000 | 0.9766 | | 0.2185 | 823.81 | 121100 | 0.9663 | | 0.2104 | 824.49 | 121200 | 0.9545 | | 0.2167 | 825.17 | 121300 | 0.9401 | | 0.2167 | 825.85 | 121400 | 0.9651 | | 0.2123 | 826.53 | 121500 | 0.9568 | | 0.2174 | 827.21 | 121600 | 0.9756 | | 0.2174 | 827.89 | 121700 | 0.9679 | | 0.2195 | 828.57 | 121800 | 0.9835 | | 0.2204 | 829.25 | 121900 | 0.9675 | | 0.2204 | 829.93 | 122000 | 0.9839 | | 0.2139 | 830.61 | 122100 | 0.9765 | | 0.2218 | 831.29 | 122200 | 0.9590 | | 0.2218 | 831.97 | 122300 | 0.9659 | | 0.2178 | 832.65 | 122400 | 
0.9701 | | 0.2113 | 833.33 | 122500 | 0.9306 | | 0.2159 | 834.01 | 122600 | 0.9616 | | 0.2159 | 834.69 | 122700 | 0.9466 | | 0.2158 | 835.37 | 122800 | 0.9510 | | 0.2145 | 836.05 | 122900 | 0.9692 | | 0.2145 | 836.73 | 123000 | 0.9628 | | 0.2117 | 837.41 | 123100 | 0.9403 | | 0.2118 | 838.1 | 123200 | 0.9518 | | 0.2118 | 838.78 | 123300 | 0.9710 | | 0.2114 | 839.46 | 123400 | 0.9493 | | 0.2141 | 840.14 | 123500 | 0.9499 | | 0.2141 | 840.82 | 123600 | 0.9426 | | 0.2091 | 841.5 | 123700 | 0.9513 | | 0.2104 | 842.18 | 123800 | 0.9460 | | 0.2104 | 842.86 | 123900 | 0.9268 | | 0.2076 | 843.54 | 124000 | 0.9714 | | 0.2069 | 844.22 | 124100 | 0.9622 | | 0.2069 | 844.9 | 124200 | 0.9883 | | 0.2093 | 845.58 | 124300 | 0.9668 | | 0.2098 | 846.26 | 124400 | 0.9509 | | 0.2098 | 846.94 | 124500 | 0.9675 | | 0.2106 | 847.62 | 124600 | 0.9406 | | 0.2176 | 848.3 | 124700 | 0.9220 | | 0.2176 | 848.98 | 124800 | 0.9003 | | 0.2068 | 849.66 | 124900 | 0.9253 | | 0.2101 | 850.34 | 125000 | 0.8712 | | 0.2164 | 851.02 | 125100 | 0.9273 | | 0.2164 | 851.7 | 125200 | 0.9093 | | 0.214 | 852.38 | 125300 | 0.9479 | | 0.2191 | 853.06 | 125400 | 0.9132 | | 0.2191 | 853.74 | 125500 | 0.9244 | | 0.2205 | 854.42 | 125600 | 0.9187 | | 0.2082 | 855.1 | 125700 | 0.9112 | | 0.2082 | 855.78 | 125800 | 0.9785 | | 0.206 | 856.46 | 125900 | 1.0037 | | 0.203 | 857.14 | 126000 | 1.0003 | | 0.203 | 857.82 | 126100 | 0.9682 | | 0.2121 | 858.5 | 126200 | 0.9759 | | 0.2079 | 859.18 | 126300 | 0.9583 | | 0.2079 | 859.86 | 126400 | 0.9627 | | 0.2064 | 860.54 | 126500 | 0.9796 | | 0.2132 | 861.22 | 126600 | 0.9863 | | 0.2132 | 861.9 | 126700 | 0.9890 | | 0.2132 | 862.59 | 126800 | 1.0000 | | 0.2108 | 863.27 | 126900 | 0.9936 | | 0.2108 | 863.95 | 127000 | 0.9510 | | 0.2075 | 864.63 | 127100 | 0.9674 | | 0.2081 | 865.31 | 127200 | 0.9562 | | 0.2081 | 865.99 | 127300 | 0.9576 | | 0.2165 | 866.67 | 127400 | 0.9516 | | 0.2103 | 867.35 | 127500 | 0.9649 | | 0.2078 | 868.03 | 127600 | 0.9543 | | 0.2078 | 868.71 | 127700 
| 0.9340 | | 0.2001 | 869.39 | 127800 | 0.9447 | | 0.2086 | 870.07 | 127900 | 0.9299 | | 0.2086 | 870.75 | 128000 | 0.9294 | | 0.2034 | 871.43 | 128100 | 0.9396 | | 0.205 | 872.11 | 128200 | 0.9387 | | 0.205 | 872.79 | 128300 | 0.9331 | | 0.2083 | 873.47 | 128400 | 0.9292 | | 0.2118 | 874.15 | 128500 | 0.9468 | | 0.2118 | 874.83 | 128600 | 0.9398 | | 0.2061 | 875.51 | 128700 | 0.9466 | | 0.2117 | 876.19 | 128800 | 0.9093 | | 0.2117 | 876.87 | 128900 | 0.9129 | | 0.207 | 877.55 | 129000 | 0.9233 | | 0.2038 | 878.23 | 129100 | 0.9220 | | 0.2038 | 878.91 | 129200 | 0.9356 | | 0.207 | 879.59 | 129300 | 0.9280 | | 0.2088 | 880.27 | 129400 | 0.9434 | | 0.2088 | 880.95 | 129500 | 0.9478 | | 0.2077 | 881.63 | 129600 | 0.9528 | | 0.2027 | 882.31 | 129700 | 0.9433 | | 0.2027 | 882.99 | 129800 | 0.9510 | | 0.2054 | 883.67 | 129900 | 0.9538 | | 0.2049 | 884.35 | 130000 | 0.9634 | | 0.2022 | 885.03 | 130100 | 0.9260 | | 0.2022 | 885.71 | 130200 | 0.9655 | | 0.206 | 886.39 | 130300 | 0.9469 | | 0.2027 | 887.07 | 130400 | 0.9635 | | 0.2027 | 887.76 | 130500 | 0.9606 | | 0.2003 | 888.44 | 130600 | 0.9452 | | 0.2049 | 889.12 | 130700 | 0.9407 | | 0.2049 | 889.8 | 130800 | 0.9174 | | 0.2086 | 890.48 | 130900 | 0.9513 | | 0.2018 | 891.16 | 131000 | 0.9203 | | 0.2018 | 891.84 | 131100 | 0.9370 | | 0.2109 | 892.52 | 131200 | 0.9344 | | 0.2041 | 893.2 | 131300 | 0.9300 | | 0.2041 | 893.88 | 131400 | 0.9149 | | 0.2009 | 894.56 | 131500 | 0.9109 | | 0.2037 | 895.24 | 131600 | 0.9259 | | 0.2037 | 895.92 | 131700 | 0.9581 | | 0.2082 | 896.6 | 131800 | 0.9198 | | 0.2067 | 897.28 | 131900 | 0.9171 | | 0.2067 | 897.96 | 132000 | 0.8966 | | 0.2119 | 898.64 | 132100 | 0.9311 | | 0.2023 | 899.32 | 132200 | 0.9210 | | 0.2023 | 900.0 | 132300 | 0.9106 | | 0.2087 | 900.68 | 132400 | 0.9157 | | 0.2152 | 901.36 | 132500 | 0.9347 | | 0.2087 | 902.04 | 132600 | 0.9516 | | 0.2087 | 902.72 | 132700 | 0.9711 | | 0.2057 | 903.4 | 132800 | 0.9298 | | 0.2071 | 904.08 | 132900 | 0.9421 | | 0.2071 | 904.76 | 
133000 | 0.9209 | | 0.2097 | 905.44 | 133100 | 0.9325 | | 0.2081 | 906.12 | 133200 | 0.9231 | | 0.2081 | 906.8 | 133300 | 0.9227 | | 0.2012 | 907.48 | 133400 | 0.9220 | | 0.1995 | 908.16 | 133500 | 0.9500 | | 0.1995 | 908.84 | 133600 | 0.9587 | | 0.2058 | 909.52 | 133700 | 0.9579 | | 0.2011 | 910.2 | 133800 | 0.9512 | | 0.2011 | 910.88 | 133900 | 0.9445 | | 0.2083 | 911.56 | 134000 | 0.9482 | | 0.2022 | 912.24 | 134100 | 0.9282 | | 0.2022 | 912.93 | 134200 | 0.9387 | | 0.2003 | 913.61 | 134300 | 0.9509 | | 0.212 | 914.29 | 134400 | 0.9609 | | 0.212 | 914.97 | 134500 | 0.9430 | | 0.2045 | 915.65 | 134600 | 0.9330 | | 0.2045 | 916.33 | 134700 | 0.9764 | | 0.2049 | 917.01 | 134800 | 0.9311 | | 0.2049 | 917.69 | 134900 | 0.9344 | | 0.2028 | 918.37 | 135000 | 0.9538 | | 0.1993 | 919.05 | 135100 | 0.9359 | | 0.1993 | 919.73 | 135200 | 0.9695 | | 0.2068 | 920.41 | 135300 | 0.9354 | | 0.2036 | 921.09 | 135400 | 0.9817 | | 0.2036 | 921.77 | 135500 | 0.9404 | | 0.2054 | 922.45 | 135600 | 0.9537 | | 0.2017 | 923.13 | 135700 | 0.9613 | | 0.2017 | 923.81 | 135800 | 0.9340 | | 0.1973 | 924.49 | 135900 | 0.9313 | | 0.216 | 925.17 | 136000 | 0.9541 | | 0.216 | 925.85 | 136100 | 0.9556 | | 0.2032 | 926.53 | 136200 | 0.9236 | | 0.1984 | 927.21 | 136300 | 0.9243 | | 0.1984 | 927.89 | 136400 | 0.9497 | | 0.195 | 928.57 | 136500 | 0.9485 | | 0.196 | 929.25 | 136600 | 0.9370 | | 0.196 | 929.93 | 136700 | 0.9294 | | 0.1991 | 930.61 | 136800 | 0.9510 | | 0.2008 | 931.29 | 136900 | 0.9445 | | 0.2008 | 931.97 | 137000 | 0.9428 | | 0.1997 | 932.65 | 137100 | 0.9718 | | 0.1998 | 933.33 | 137200 | 0.9620 | | 0.1962 | 934.01 | 137300 | 0.9388 | | 0.1962 | 934.69 | 137400 | 0.9578 | | 0.1932 | 935.37 | 137500 | 0.9383 | | 0.1989 | 936.05 | 137600 | 0.9285 | | 0.1989 | 936.73 | 137700 | 0.9671 | | 0.1965 | 937.41 | 137800 | 0.9572 | | 0.1988 | 938.1 | 137900 | 0.9487 | | 0.1988 | 938.78 | 138000 | 0.9369 | | 0.2006 | 939.46 | 138100 | 0.9343 | | 0.1995 | 940.14 | 138200 | 0.9488 | | 0.1995 | 
940.82 | 138300 | 0.9242 | | 0.2047 | 941.5 | 138400 | 0.9214 | | 0.2118 | 942.18 | 138500 | 0.9054 | | 0.2118 | 942.86 | 138600 | 0.9391 | | 0.1934 | 943.54 | 138700 | 0.9256 | | 0.2012 | 944.22 | 138800 | 0.9372 | | 0.2012 | 944.9 | 138900 | 0.9355 | | 0.1984 | 945.58 | 139000 | 0.9284 | | 0.1953 | 946.26 | 139100 | 0.9206 | | 0.1953 | 946.94 | 139200 | 0.9281 | | 0.1974 | 947.62 | 139300 | 0.9300 | | 0.1919 | 948.3 | 139400 | 0.9566 | | 0.1919 | 948.98 | 139500 | 0.9674 | | 0.1951 | 949.66 | 139600 | 0.9739 | | 0.1986 | 950.34 | 139700 | 0.9548 | | 0.2041 | 951.02 | 139800 | 0.9510 | | 0.2041 | 951.7 | 139900 | 0.9621 | | 0.198 | 952.38 | 140000 | 0.9119 | | 0.1954 | 953.06 | 140100 | 0.9355 | | 0.1954 | 953.74 | 140200 | 0.9858 | | 0.1986 | 954.42 | 140300 | 0.9534 | | 0.2021 | 955.1 | 140400 | 0.9391 | | 0.2021 | 955.78 | 140500 | 0.9440 | | 0.2 | 956.46 | 140600 | 0.9461 | | 0.1928 | 957.14 | 140700 | 0.9493 | | 0.1928 | 957.82 | 140800 | 0.9452 | | 0.1953 | 958.5 | 140900 | 0.9946 | | 0.1982 | 959.18 | 141000 | 0.9450 | | 0.1982 | 959.86 | 141100 | 0.9513 | | 0.2022 | 960.54 | 141200 | 0.9530 | | 0.1939 | 961.22 | 141300 | 0.9312 | | 0.1939 | 961.9 | 141400 | 0.9523 | | 0.2007 | 962.59 | 141500 | 0.9353 | | 0.1884 | 963.27 | 141600 | 0.9613 | | 0.1884 | 963.95 | 141700 | 0.9531 | | 0.1993 | 964.63 | 141800 | 0.9392 | | 0.1971 | 965.31 | 141900 | 0.9484 | | 0.1971 | 965.99 | 142000 | 0.9328 | | 0.1961 | 966.67 | 142100 | 0.9410 | | 0.1977 | 967.35 | 142200 | 0.9437 | | 0.1998 | 968.03 | 142300 | 0.9449 | | 0.1998 | 968.71 | 142400 | 0.9371 | | 0.1982 | 969.39 | 142500 | 0.9450 | | 0.1996 | 970.07 | 142600 | 0.9448 | | 0.1996 | 970.75 | 142700 | 0.9493 | | 0.1964 | 971.43 | 142800 | 0.9377 | | 0.1938 | 972.11 | 142900 | 0.9306 | | 0.1938 | 972.79 | 143000 | 0.9513 | | 0.1897 | 973.47 | 143100 | 0.9496 | | 0.2045 | 974.15 | 143200 | 0.9461 | | 0.2045 | 974.83 | 143300 | 0.9329 | | 0.1946 | 975.51 | 143400 | 0.9688 | | 0.197 | 976.19 | 143500 | 0.9371 | | 0.197 
| 976.87 | 143600 | 0.9512 | | 0.2004 | 977.55 | 143700 | 0.9373 | | 0.2002 | 978.23 | 143800 | 0.9569 | | 0.2002 | 978.91 | 143900 | 0.9513 | | 0.1916 | 979.59 | 144000 | 0.9457 | | 0.1959 | 980.27 | 144100 | 0.9251 | | 0.1959 | 980.95 | 144200 | 0.9330 | | 0.1934 | 981.63 | 144300 | 0.9382 | | 0.1954 | 982.31 | 144400 | 0.9553 | | 0.1954 | 982.99 | 144500 | 0.9498 | | 0.1919 | 983.67 | 144600 | 0.9558 | | 0.1883 | 984.35 | 144700 | 0.9484 | | 0.1928 | 985.03 | 144800 | 0.9310 | | 0.1928 | 985.71 | 144900 | 0.9282 | | 0.1872 | 986.39 | 145000 | 0.9351 | | 0.1868 | 987.07 | 145100 | 0.9457 | | 0.1868 | 987.76 | 145200 | 0.9444 | | 0.1906 | 988.44 | 145300 | 0.9478 | | 0.1957 | 989.12 | 145400 | 0.9691 | | 0.1957 | 989.8 | 145500 | 0.9437 | | 0.1959 | 990.48 | 145600 | 0.9576 | | 0.1912 | 991.16 | 145700 | 0.9539 | | 0.1912 | 991.84 | 145800 | 0.9463 | | 0.1977 | 992.52 | 145900 | 0.9703 | | 0.1955 | 993.2 | 146000 | 0.9462 | | 0.1955 | 993.88 | 146100 | 0.9621 | | 0.1923 | 994.56 | 146200 | 0.9568 | | 0.1959 | 995.24 | 146300 | 0.9650 | | 0.1959 | 995.92 | 146400 | 0.9668 | | 0.1921 | 996.6 | 146500 | 0.9588 | | 0.1968 | 997.28 | 146600 | 0.9510 | | 0.1968 | 997.96 | 146700 | 0.9430 | | 0.1927 | 998.64 | 146800 | 0.9672 | | 0.1995 | 999.32 | 146900 | 0.9508 | | 0.1995 | 1000.0 | 147000 | 0.9548 | ### Framework versions - Transformers 4.33.1 - Pytorch 2.2.0.dev20230910+cu121 - Datasets 2.13.1 - Tokenizers 0.13.3
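The training log above records an (epoch, step) pair at every 100-step evaluation, and those pairs imply a fixed number of optimizer steps per epoch. A minimal pure-Python check (the pairs below are copied verbatim from the log; the batch size for this run is not shown in this excerpt, so only the step/epoch ratio is derived, not the dataset size):

```python
# (epoch, step) pairs copied verbatim from the evaluation log above.
logged = [(504.76, 74200), (600.0, 88200), (1000.0, 147000)]

# The final pair is exact: 147000 steps over 1000.0 epochs.
steps_per_epoch = round(147000 / 1000.0)
assert steps_per_epoch == 147

# Every logged epoch matches step / 147 to the table's two-decimal precision.
for epoch, step in logged:
    assert round(step / steps_per_epoch, 2) == epoch
```

Consistent pairs like these are a quick way to confirm that a long evaluation log was transcribed correctly.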
[ "block", "footer", "header" ]
m28yhtd/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
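The `linear` scheduler listed in the hyperparameters decays the learning rate from its initial 1e-05 to zero over the full run. A minimal sketch of that schedule in plain Python (assuming the zero-warmup default of the Trainer's linear schedule; `total` is a hypothetical step count, since the dataset size is not stated in this card):

```python
def linear_lr(step: int, total: int, base_lr: float = 1e-5) -> float:
    """Learning rate at `step` under a warmup-free linear decay to zero."""
    return base_lr * max(0.0, 1.0 - step / total)

total = 1000  # hypothetical: the actual total depends on the dataset size
assert linear_lr(0, total) == 1e-5                        # starts at the configured rate
assert linear_lr(total, total) == 0.0                     # reaches zero at the last step
assert abs(linear_lr(total // 2, total) - 5e-6) < 1e-12   # halfway: half the rate
```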
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
kacper-cierzniewski/daigram_detr_r50_albumentations
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # daigram_detr_r50_albumentations This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the bpmn-shapes dataset. It achieves the following results on the evaluation set: - Loss: 1.0088 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 500 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:------:|:----:|:---------------:| | 3.8163 | 2.63 | 50 | 3.0660 | | 2.9036 | 5.26 | 100 | 2.8878 | | 2.7516 | 7.89 | 150 | 2.8043 | | 2.6278 | 10.53 | 200 | 2.6820 | | 2.4806 | 13.16 | 250 | 2.5676 | | 2.3781 | 15.79 | 300 | 2.4282 | | 2.253 | 18.42 | 350 | 2.3161 | | 2.1405 | 21.05 | 400 | 2.1735 | | 2.0263 | 23.68 | 450 | 2.0909 | | 1.9732 | 26.32 | 500 | 2.0120 | | 1.8647 | 28.95 | 550 | 1.9260 | | 1.7793 | 31.58 | 600 | 1.8655 | | 1.7706 | 34.21 | 650 | 1.8166 | | 1.6792 | 36.84 | 700 | 1.7325 | | 1.5654 | 39.47 | 750 | 1.7061 | | 1.5802 | 42.11 | 800 | 1.6463 | | 1.5053 | 44.74 | 850 | 1.5985 | | 1.4858 | 47.37 | 900 | 1.6060 | | 1.4186 | 50.0 | 950 | 1.5563 | | 1.4391 | 52.63 | 1000 | 1.5219 | | 1.3938 | 55.26 | 1050 | 1.4995 | | 1.3734 | 57.89 | 1100 | 1.4661 | | 1.3379 | 60.53 | 1150 | 1.4451 | | 1.341 | 63.16 | 1200 | 1.4854 | | 1.3647 | 65.79 | 1250 | 1.4509 | | 1.3198 | 68.42 | 1300 | 1.4116 | | 1.3054 | 71.05 | 1350 | 1.3821 | | 1.2945 | 73.68 | 1400 | 1.3952 | | 1.2899 | 76.32 | 1450 | 1.3868 | | 
| 1.2533 | 78.95 | 1500 | 1.3580 |
| 1.2655 | 81.58 | 1550 | 1.3374 |
| 1.2649 | 84.21 | 1600 | 1.3451 |
| 1.2286 | 86.84 | 1650 | 1.2973 |
| 1.2497 | 89.47 | 1700 | 1.3322 |
| 1.2456 | 92.11 | 1750 | 1.3289 |
| 1.2234 | 94.74 | 1800 | 1.3080 |
| 1.1695 | 97.37 | 1850 | 1.3218 |
| 1.2265 | 100.0 | 1900 | 1.3280 |
| 1.1899 | 102.63 | 1950 | 1.2834 |
| 1.1914 | 105.26 | 2000 | 1.2931 |
| 1.1698 | 107.89 | 2050 | 1.3176 |
| 1.177 | 110.53 | 2100 | 1.2896 |
| 1.1625 | 113.16 | 2150 | 1.2936 |
| 1.1626 | 115.79 | 2200 | 1.2614 |
| 1.1698 | 118.42 | 2250 | 1.2545 |
| 1.1703 | 121.05 | 2300 | 1.2398 |
| 1.1659 | 123.68 | 2350 | 1.2254 |
| 1.1734 | 126.32 | 2400 | 1.2489 |
| 1.1234 | 128.95 | 2450 | 1.2072 |
| 1.1464 | 131.58 | 2500 | 1.1707 |
| 1.1268 | 134.21 | 2550 | 1.1971 |
| 1.1511 | 136.84 | 2600 | 1.2247 |
| 1.1234 | 139.47 | 2650 | 1.1921 |
| 1.0923 | 142.11 | 2700 | 1.1751 |
| 1.1267 | 144.74 | 2750 | 1.1905 |
| 1.1021 | 147.37 | 2800 | 1.1885 |
| 1.1075 | 150.0 | 2850 | 1.1780 |
| 1.1116 | 152.63 | 2900 | 1.1666 |
| 1.0987 | 155.26 | 2950 | 1.1694 |
| 1.0974 | 157.89 | 3000 | 1.1931 |
| 1.0867 | 160.53 | 3050 | 1.1461 |
| 1.1076 | 163.16 | 3100 | 1.1501 |
| 1.0912 | 165.79 | 3150 | 1.1611 |
| 1.0671 | 168.42 | 3200 | 1.1718 |
| 1.0981 | 171.05 | 3250 | 1.1961 |
| 1.0602 | 173.68 | 3300 | 1.1786 |
| 1.0305 | 176.32 | 3350 | 1.1640 |
| 1.0647 | 178.95 | 3400 | 1.1416 |
| 1.0628 | 181.58 | 3450 | 1.1296 |
| 1.0856 | 184.21 | 3500 | 1.1140 |
| 1.0626 | 186.84 | 3550 | 1.1214 |
| 1.0782 | 189.47 | 3600 | 1.1449 |
| 1.0601 | 192.11 | 3650 | 1.1441 |
| 1.0906 | 194.74 | 3700 | 1.1396 |
| 1.0376 | 197.37 | 3750 | 1.1271 |
| 1.0625 | 200.0 | 3800 | 1.1397 |
| 1.057 | 202.63 | 3850 | 1.1121 |
| 1.0448 | 205.26 | 3900 | 1.1376 |
| 1.0747 | 207.89 | 3950 | 1.1475 |
| 1.0605 | 210.53 | 4000 | 1.0916 |
| 1.0344 | 213.16 | 4050 | 1.1001 |
| 1.0443 | 215.79 | 4100 | 1.0976 |
| 1.0202 | 218.42 | 4150 | 1.1240 |
| 1.078 | 221.05 | 4200 | 1.1024 |
| 1.0251 | 223.68 | 4250 | 1.0793 |
| 1.0353 | 226.32 | 4300 | 1.1153 |
| 1.0047 | 228.95 | 4350 | 1.0972 |
| 1.0143 | 231.58 | 4400 | 1.0948 |
| 1.0172 | 234.21 | 4450 | 1.1265 |
| 1.0299 | 236.84 | 4500 | 1.1038 |
| 0.9968 | 239.47 | 4550 | 1.0901 |
| 1.0233 | 242.11 | 4600 | 1.0945 |
| 0.9943 | 244.74 | 4650 | 1.0918 |
| 1.0321 | 247.37 | 4700 | 1.1270 |
| 1.0113 | 250.0 | 4750 | 1.1060 |
| 1.0229 | 252.63 | 4800 | 1.0859 |
| 0.9945 | 255.26 | 4850 | 1.0875 |
| 1.0073 | 257.89 | 4900 | 1.0976 |
| 1.0096 | 260.53 | 4950 | 1.0933 |
| 1.0 | 263.16 | 5000 | 1.0821 |
| 1.0326 | 265.79 | 5050 | 1.0747 |
| 0.997 | 268.42 | 5100 | 1.0931 |
| 1.0056 | 271.05 | 5150 | 1.0853 |
| 0.9858 | 273.68 | 5200 | 1.0945 |
| 1.0005 | 276.32 | 5250 | 1.0669 |
| 1.0217 | 278.95 | 5300 | 1.0497 |
| 0.9777 | 281.58 | 5350 | 1.0672 |
| 0.9888 | 284.21 | 5400 | 1.0844 |
| 0.9662 | 286.84 | 5450 | 1.0524 |
| 1.0029 | 289.47 | 5500 | 1.0519 |
| 0.984 | 292.11 | 5550 | 1.0538 |
| 0.9724 | 294.74 | 5600 | 1.0524 |
| 0.991 | 297.37 | 5650 | 1.0553 |
| 0.9936 | 300.0 | 5700 | 1.0601 |
| 0.9817 | 302.63 | 5750 | 1.0524 |
| 0.9868 | 305.26 | 5800 | 1.0644 |
| 0.9982 | 307.89 | 5850 | 1.0523 |
| 0.9814 | 310.53 | 5900 | 1.0611 |
| 0.9761 | 313.16 | 5950 | 1.0505 |
| 0.9507 | 315.79 | 6000 | 1.0361 |
| 0.9786 | 318.42 | 6050 | 1.0275 |
| 0.9684 | 321.05 | 6100 | 1.0292 |
| 0.9759 | 323.68 | 6150 | 1.0529 |
| 0.9442 | 326.32 | 6200 | 1.0689 |
| 0.9653 | 328.95 | 6250 | 1.0696 |
| 0.9579 | 331.58 | 6300 | 1.0572 |
| 1.0016 | 334.21 | 6350 | 1.0660 |
| 0.9462 | 336.84 | 6400 | 1.0525 |
| 0.9596 | 339.47 | 6450 | 1.0505 |
| 0.9655 | 342.11 | 6500 | 1.0514 |
| 0.9713 | 344.74 | 6550 | 1.0616 |
| 0.952 | 347.37 | 6600 | 1.0497 |
| 0.9433 | 350.0 | 6650 | 1.0389 |
| 0.9619 | 352.63 | 6700 | 1.0404 |
| 0.9594 | 355.26 | 6750 | 1.0332 |
| 0.9586 | 357.89 | 6800 | 1.0323 |
| 0.9582 | 360.53 | 6850 | 1.0294 |
| 0.9437 | 363.16 | 6900 | 1.0329 |
| 0.9585 | 365.79 | 6950 | 1.0361 |
| 0.9661 | 368.42 | 7000 | 1.0428 |
| 0.9603 | 371.05 | 7050 | 1.0299 |
| 0.9619 | 373.68 | 7100 | 1.0416 |
| 0.9766 | 376.32 | 7150 | 1.0471 |
| 0.9547 | 378.95 | 7200 | 1.0498 |
| 0.967 | 381.58 | 7250 | 1.0318 |
| 0.9463 | 384.21 | 7300 | 1.0238 |
| 0.9531 | 386.84 | 7350 | 1.0329 |
| 0.9342 | 389.47 | 7400 | 1.0354 |
| 0.939 | 392.11 | 7450 | 1.0312 |
| 0.9635 | 394.74 | 7500 | 1.0325 |
| 0.9261 | 397.37 | 7550 | 1.0245 |
| 0.962 | 400.0 | 7600 | 1.0381 |
| 0.9385 | 402.63 | 7650 | 1.0243 |
| 0.9422 | 405.26 | 7700 | 1.0235 |
| 0.9285 | 407.89 | 7750 | 1.0286 |
| 0.9598 | 410.53 | 7800 | 1.0353 |
| 0.9529 | 413.16 | 7850 | 1.0361 |
| 0.928 | 415.79 | 7900 | 1.0316 |
| 0.935 | 418.42 | 7950 | 1.0263 |
| 0.9456 | 421.05 | 8000 | 1.0368 |
| 0.9387 | 423.68 | 8050 | 1.0440 |
| 0.9321 | 426.32 | 8100 | 1.0440 |
| 0.9236 | 428.95 | 8150 | 1.0394 |
| 0.9448 | 431.58 | 8200 | 1.0467 |
| 0.9151 | 434.21 | 8250 | 1.0516 |
| 0.9373 | 436.84 | 8300 | 1.0383 |
| 0.9577 | 439.47 | 8350 | 1.0190 |
| 0.9199 | 442.11 | 8400 | 1.0215 |
| 0.9321 | 444.74 | 8450 | 1.0184 |
| 0.9387 | 447.37 | 8500 | 1.0236 |
| 0.9382 | 450.0 | 8550 | 1.0259 |
| 0.9391 | 452.63 | 8600 | 1.0282 |
| 0.9392 | 455.26 | 8650 | 1.0193 |
| 0.9438 | 457.89 | 8700 | 1.0124 |
| 0.9398 | 460.53 | 8750 | 1.0060 |
| 0.9246 | 463.16 | 8800 | 1.0140 |
| 0.9383 | 465.79 | 8850 | 1.0145 |
| 0.9267 | 468.42 | 8900 | 1.0122 |
| 0.9253 | 471.05 | 8950 | 1.0144 |
| 0.9238 | 473.68 | 9000 | 1.0065 |
| 0.9082 | 476.32 | 9050 | 1.0136 |
| 0.9287 | 478.95 | 9100 | 1.0120 |
| 0.9161 | 481.58 | 9150 | 1.0120 |
| 0.9093 | 484.21 | 9200 | 1.0128 |
| 0.9264 | 486.84 | 9250 | 1.0125 |
| 0.9487 | 489.47 | 9300 | 1.0131 |
| 0.9398 | 492.11 | 9350 | 1.0101 |
| 0.9039 | 494.74 | 9400 | 1.0090 |
| 0.908 | 497.37 | 9450 | 1.0097 |
| 0.944 | 500.0 | 9500 | 1.0088 |


### Framework versions

- Transformers 4.34.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
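This run, like the other fine-tunes in this collection, uses `lr_scheduler_type: linear` with no warmup, so the learning rate decays from the configured value straight to zero over the run. A minimal sketch of that schedule, assuming the 9500-step total shown in the table (the function name is illustrative, not a library API):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 1e-05) -> float:
    """Learning rate under a linear decay schedule with no warmup:
    starts at base_lr at step 0 and reaches 0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Configured rate at the start, half of it at the midpoint, zero at the end.
print(linear_lr(0, 9500))     # 1e-05
print(linear_lr(4750, 9500))  # 5e-06
print(linear_lr(9500, 9500))  # 0.0
```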
[ "shapes", "arrow", "circle", "diamond", "pointer", "rectangle", "triangle" ]
gkalsrudals/use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]
koya3to/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
NoahMeissner/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
yoon6173/furniture_use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# furniture_use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]
faldeus0092/detr-finetuned-thermal-dogs-and-people
# object-detection: detr-finetuned-thermal-dogs-and-people

<!-- Provide a quick summary of what the model is/does. -->

This model is a fine-tuned version of [DETR](https://huggingface.co/facebook/detr-resnet-50) on the Roboflow [Thermal Dogs and People](https://public.roboflow.com/object-detection/thermal-dogs-and-people/1) dataset. It achieves the following results on the evaluation set:

```
Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.681
Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.870
Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.778
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.189
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.489
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.720
Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.641
Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.733
Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.746
Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.500
Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.542
Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.794
```

## Intended purpose

This model is intended solely for learning purposes. Thermal images have a wide array of applications: monitoring machine performance, seeing in low-light conditions, and adding another dimension to standard RGB scenarios. Infrared imaging is useful in security, wildlife detection, and hunting / outdoors recreation.
## Training and evaluation data

Training curves and evaluation data can be seen at [Weights and Biases](https://wandb.ai/faldeus0092/thermal-dogs-and-people/runs/zjt8bp9x?workspace=user-faldeus0092)

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-4
- lr_backbone: 1e-5
- weight_decay: 1e-4
- optimizer: AdamW
- train_batch_size: 4
- eval_batch_size: 2
- train_set: 142
- test_set: 41
- num_epochs: 68

### Example usage (transformers pipeline)

```py
# Use a pipeline as a high-level helper
from PIL import Image, ImageDraw
from transformers import pipeline

image = Image.open('/content/Thermal-Dogs-and-People-1/test/IMG_0006 5_jpg.rf.cd46e6a862d6ffb7fce6795067ce7cc7.jpg')
# image = Image.open(requests.get(url, stream=True).raw)  # if you want to open from a url

obj_detector = pipeline("object-detection", model="faldeus0092/detr-finetuned-thermal-dogs-and-people")
results = obj_detector(image)

draw = ImageDraw.Draw(image)
for result in results:
    box = result["box"]
    draw.rectangle((box["xmin"], box["ymin"], box["xmax"], box["ymax"]), outline="red", width=1)
    draw.text((box["xmin"], box["ymin"]), result["label"], fill="white")
image
```
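Each detection returned by the pipeline also carries a confidence score, and it is common to drop low-confidence hits before drawing boxes. A small sketch of that filtering step (this helper and the 0.9 default are illustrative, not part of the card's original code; the input mirrors the object-detection pipeline's list-of-dicts output):

```python
def filter_detections(detections, threshold=0.9):
    """Keep only detections whose score meets the threshold.

    `detections` is a list of dicts shaped like the transformers
    object-detection pipeline output: {"score", "label", "box"}.
    """
    return [d for d in detections if d["score"] >= threshold]

# Hypothetical pipeline output for demonstration.
sample = [
    {"score": 0.97, "label": "dog", "box": {"xmin": 1, "ymin": 2, "xmax": 3, "ymax": 4}},
    {"score": 0.42, "label": "person", "box": {"xmin": 5, "ymin": 6, "xmax": 7, "ymax": 8}},
]
print([d["label"] for d in filter_detections(sample)])  # ['dog']
```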
[ "label_0", "label_1", "label_2" ]
Zekrom997/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
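Fine-tuning DETR on a custom label set like cppe-5's requires `id2label`/`label2id` mappings built from the class list so the detection head and the card's labels agree. A minimal sketch of building those mappings (plain dict construction; the variable names are illustrative):

```python
# The five cppe-5 classes in index order.
classes = ["coverall", "face_shield", "gloves", "goggles", "mask"]

id2label = {i: name for i, name in enumerate(classes)}
label2id = {name: i for i, name in id2label.items()}

print(id2label[0])       # coverall
print(label2id["mask"])  # 4
```

These mappings would then typically be passed to the model config when loading the base checkpoint for fine-tuning.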
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
theodullin/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the blood-cell-object-detection dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "platelets", "rbc", "wbc" ]
theodullin/detr-resnet-50_finetuned_cppe5_30-epochs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5_30-epochs

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the blood-cell-object-detection dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "platelets", "rbc", "wbc" ]
theodullin/deta-resnet-50_finetuned_blood_cell_10epochs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# deta-resnet-50_finetuned_blood_cell_10epochs

This model is a fine-tuned version of [jozhang97/deta-resnet-50](https://huggingface.co/jozhang97/deta-resnet-50) on the blood-cell-object-detection dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "platelets", "rbc", "wbc" ]
theodullin/conditional-detr-resnet-50_finetuned_blood_cell_10epochs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# conditional-detr-resnet-50_finetuned_blood_cell_10epochs

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the blood-cell-object-detection dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "platelets", "rbc", "wbc" ]
theodullin/detr-resnet-50_finetuned_blood_cell_15epochs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_blood_cell_15epochs

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the blood-cell-object-detection dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15

### Training results

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "platelets", "rbc", "wbc" ]
LeeRuben/cppe5_use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# cppe5_use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]
EUNSEO56/use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]
NoahMeissner/detr-resnet-50_finetuned_uni_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_uni_model

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu121
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "door", "stairs" ]
panda47/cppe5_use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# cppe5_use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]
James332/cppe5_use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# cppe5_use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
gobk/furniture-ngpea_use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# furniture-ngpea_use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]
philona/use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]
miriaiml/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
taewon99/use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]
EUNSEO56/furniture_use_data_finetuning
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# furniture_use_data_finetuning

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
[ "furniture", "chair", "sofa", "table" ]