| model_id (stringlengths 9-102) | model_card (stringlengths 4-343k) | model_labels (listlengths 2-50.8k) |
|---|---|---|
cisimon7/trainer_output |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# trainer_output
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1867
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
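For reference, the hyperparameters above map one-to-one onto `transformers.TrainingArguments` keywords. A minimal sketch (keyword names follow the transformers API; `output_dir` is a placeholder, not taken from the card):

```python
# Sketch: the hyperparameters above expressed as TrainingArguments keywords.
# output_dir is a placeholder, not taken from the card.
training_kwargs = {
    "output_dir": "trainer_output",     # placeholder
    "learning_rate": 5e-05,
    "per_device_train_batch_size": 10,  # train_batch_size
    "per_device_eval_batch_size": 10,   # eval_batch_size
    "seed": 42,
    "optim": "adamw_torch",             # OptimizerNames.ADAMW_TORCH
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 50,
}
# With transformers installed: args = TrainingArguments(**training_kwargs)
print(training_kwargs["num_train_epochs"])  # 50
```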
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
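The `model_labels` column doubles as the label vocabulary used at fine-tuning time. A minimal sketch of the `id2label`/`label2id` mappings such scripts typically pass to the model config (the index order is an assumption, following the list order above):

```python
# Sketch: building id2label / label2id mappings from the model_labels column.
# The index order is an assumption; it follows the list order above.
labels = ["accessories", "bags", "clothing", "shoes"]
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}
print(id2label)  # {0: 'accessories', 1: 'bags', 2: 'clothing', 3: 'shoes'}
```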
buraq-k/detr-fashionpedia |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-fashionpedia
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2414
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
Iya-roos/detr-fashionpedia-output |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-fashionpedia-output
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 4.4742
- eval_model_preparation_time: 0.0071
- eval_runtime: 34.6373
- eval_samples_per_second: 8.661
- eval_steps_per_second: 2.165
- step: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
greete/detr-output |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-output
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4862
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
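With `lr_scheduler_type: linear` and `lr_scheduler_warmup_steps: 500`, the learning rate ramps up linearly for 500 steps and then decays linearly to zero. A sketch (`total_steps=5000` is a hypothetical value; the card does not state the total step count):

```python
# Sketch of transformers' "linear" schedule with warmup: the learning rate
# ramps from 0 to base_lr over warmup_steps, then decays linearly to 0.
# total_steps is a hypothetical value; the card does not state it.
def linear_schedule_lr(step, base_lr=1e-05, warmup_steps=500, total_steps=5000):
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_schedule_lr(0))     # 0.0 (start of warmup)
print(linear_schedule_lr(500))   # 1e-05 (peak, end of warmup)
print(linear_schedule_lr(5000))  # 0.0 (end of training)
```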
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
orlandosss/detr-fashionpedia |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-fashionpedia
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.0526
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
IlmarM/detr-fashion-clothes |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-fashion-clothes
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2290
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 7
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
heiliaavola/detr-resnet50-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet50-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0939
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 4
- mixed_precision_training: Native AMP
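The `total_train_batch_size` entry above is derived, not independently set: it is the per-device train batch size multiplied by the gradient accumulation steps. A quick check against the listed values:

```python
# Quick check: total_train_batch_size is the per-device batch size times the
# gradient accumulation steps, matching the values listed above.
train_batch_size = 4
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 8
```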
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.4455 | 1.0 | 200 | 3.2410 |
| 2.3826 | 2.0 | 400 | 2.3457 |
| 2.1902 | 3.0 | 600 | 2.1963 |
| 2.1091 | 4.0 | 800 | 2.1092 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
naveenpranesh/results |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3174
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.3475 | 2.0 | 500 | 1.6325 |
| 1.3375 | 4.0 | 1000 | 1.5893 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
AhmedWael11/detr-fashionpedia-4cat |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-fashionpedia-4cat
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3170
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
Xubi23/trainer_output |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# trainer_output
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4267
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
kelem2/detr-output |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-output
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1673
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 22
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.8638 | 1.0 | 375 | 1.7503 |
| 1.6596 | 2.0 | 750 | 1.5301 |
| 1.5249 | 3.0 | 1125 | 1.4235 |
| 1.4242 | 4.0 | 1500 | 1.3329 |
| 1.4742 | 5.0 | 1875 | 1.2970 |
| 1.4456 | 6.0 | 2250 | 1.2932 |
| 1.3141 | 7.0 | 2625 | 1.2498 |
| 1.324 | 8.0 | 3000 | 1.2866 |
| 1.2409 | 9.0 | 3375 | 1.2176 |
| 1.2627 | 10.0 | 3750 | 1.2144 |
| 1.2766 | 11.0 | 4125 | 1.2022 |
| 1.1991 | 12.0 | 4500 | 1.2160 |
| 1.2742 | 13.0 | 4875 | 1.1897 |
| 1.1589 | 14.0 | 5250 | 1.1766 |
| 1.2242 | 15.0 | 5625 | 1.1663 |
| 1.1877 | 16.0 | 6000 | 1.1514 |
| 1.1424 | 17.0 | 6375 | 1.1267 |
| 1.0781 | 18.0 | 6750 | 1.1135 |
| 1.1314 | 19.0 | 7125 | 1.1248 |
| 1.1174 | 20.0 | 7500 | 1.1034 |
| 1.0465 | 21.0 | 7875 | 1.1073 |
| 1.1111 | 22.0 | 8250 | 1.1166 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
Godouche/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.50.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"crack",
"overlap"
] |
mahernto/yolo_finetuned_fruits |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8209
- Map: 0.5813
- Map 50: 0.8161
- Map 75: 0.6682
- Map Small: -1.0
- Map Medium: 0.6283
- Map Large: 0.5888
- Mar 1: 0.4242
- Mar 10: 0.7055
- Mar 100: 0.7704
- Mar Small: -1.0
- Mar Medium: 0.6886
- Mar Large: 0.7816
- Map Banana: 0.4339
- Mar 100 Banana: 0.7225
- Map Orange: 0.6177
- Mar 100 Orange: 0.7857
- Map Apple: 0.6923
- Mar 100 Apple: 0.8029
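The overall `Map` above is the COCO-style mean over the per-class average precisions (the `-1.0` entries mark object-size buckets with no ground-truth boxes, which the COCO evaluator reports as -1). A quick check against the per-class values:

```python
# Quick check: the overall Map equals the mean of the three per-class
# average precisions reported above (COCO-style mAP over categories).
per_class_ap = {"banana": 0.4339, "orange": 0.6177, "apple": 0.6923}
overall_map = sum(per_class_ap.values()) / len(per_class_ap)
print(round(overall_map, 4))  # 0.5813
```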
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 2.1499 | 0.0136 | 0.0448 | 0.0056 | -1.0 | 0.0109 | 0.0158 | 0.075 | 0.1852 | 0.3357 | -1.0 | 0.1843 | 0.3553 | 0.013 | 0.37 | 0.0078 | 0.3143 | 0.0201 | 0.3229 |
| No log | 2.0 | 120 | 1.7782 | 0.0292 | 0.0773 | 0.0143 | -1.0 | 0.0276 | 0.0386 | 0.1073 | 0.2172 | 0.3738 | -1.0 | 0.1543 | 0.3976 | 0.0311 | 0.5425 | 0.0182 | 0.1476 | 0.0383 | 0.4314 |
| No log | 3.0 | 180 | 1.5906 | 0.0594 | 0.1414 | 0.0417 | -1.0 | 0.1115 | 0.0605 | 0.152 | 0.3742 | 0.5341 | -1.0 | 0.4 | 0.5537 | 0.0778 | 0.56 | 0.0529 | 0.5167 | 0.0476 | 0.5257 |
| No log | 4.0 | 240 | 1.5383 | 0.0861 | 0.202 | 0.0501 | -1.0 | 0.2612 | 0.0865 | 0.151 | 0.3671 | 0.5304 | -1.0 | 0.45 | 0.5377 | 0.1303 | 0.6025 | 0.0745 | 0.5 | 0.0535 | 0.4886 |
| No log | 5.0 | 300 | 1.1837 | 0.1558 | 0.2537 | 0.1816 | -1.0 | 0.2695 | 0.1583 | 0.2698 | 0.4915 | 0.6304 | -1.0 | 0.6171 | 0.6306 | 0.1721 | 0.665 | 0.1067 | 0.4976 | 0.1887 | 0.7286 |
| No log | 6.0 | 360 | 1.0734 | 0.157 | 0.2964 | 0.1582 | -1.0 | 0.3468 | 0.187 | 0.2915 | 0.5425 | 0.6648 | -1.0 | 0.6343 | 0.6689 | 0.2002 | 0.655 | 0.1633 | 0.6881 | 0.1074 | 0.6514 |
| No log | 7.0 | 420 | 1.0573 | 0.2775 | 0.4635 | 0.3247 | -1.0 | 0.4621 | 0.2992 | 0.3344 | 0.5898 | 0.6521 | -1.0 | 0.6143 | 0.6591 | 0.2421 | 0.6525 | 0.3061 | 0.581 | 0.2844 | 0.7229 |
| No log | 8.0 | 480 | 1.0384 | 0.2976 | 0.4884 | 0.3472 | -1.0 | 0.3785 | 0.3332 | 0.349 | 0.5867 | 0.6615 | -1.0 | 0.5629 | 0.6758 | 0.2774 | 0.655 | 0.2988 | 0.6095 | 0.3166 | 0.72 |
| 1.3795 | 9.0 | 540 | 1.0118 | 0.3836 | 0.6136 | 0.4243 | -1.0 | 0.5103 | 0.4155 | 0.3625 | 0.6428 | 0.7234 | -1.0 | 0.6757 | 0.7321 | 0.3059 | 0.7025 | 0.418 | 0.7048 | 0.4267 | 0.7629 |
| 1.3795 | 10.0 | 600 | 0.9245 | 0.435 | 0.6491 | 0.5092 | -1.0 | 0.5728 | 0.4373 | 0.3755 | 0.6479 | 0.7627 | -1.0 | 0.67 | 0.7771 | 0.3134 | 0.7225 | 0.4386 | 0.7571 | 0.5529 | 0.8086 |
| 1.3795 | 11.0 | 660 | 0.9402 | 0.4402 | 0.6789 | 0.4961 | -1.0 | 0.5685 | 0.4575 | 0.3954 | 0.6632 | 0.7531 | -1.0 | 0.6486 | 0.769 | 0.2956 | 0.7225 | 0.4795 | 0.7452 | 0.5453 | 0.7914 |
| 1.3795 | 12.0 | 720 | 0.9860 | 0.4799 | 0.732 | 0.5485 | -1.0 | 0.5748 | 0.4896 | 0.3923 | 0.6661 | 0.7248 | -1.0 | 0.64 | 0.7374 | 0.3637 | 0.6825 | 0.4651 | 0.7119 | 0.611 | 0.78 |
| 1.3795 | 13.0 | 780 | 0.9429 | 0.5169 | 0.7922 | 0.5961 | -1.0 | 0.5773 | 0.5318 | 0.3917 | 0.6751 | 0.7439 | -1.0 | 0.6871 | 0.7558 | 0.3606 | 0.6675 | 0.5592 | 0.7643 | 0.631 | 0.8 |
| 1.3795 | 14.0 | 840 | 0.8865 | 0.5173 | 0.758 | 0.5911 | -1.0 | 0.6596 | 0.5182 | 0.4012 | 0.678 | 0.7499 | -1.0 | 0.6986 | 0.7576 | 0.3531 | 0.705 | 0.5424 | 0.7619 | 0.6563 | 0.7829 |
| 1.3795 | 15.0 | 900 | 0.8419 | 0.5406 | 0.7763 | 0.6074 | -1.0 | 0.5919 | 0.5512 | 0.4255 | 0.6973 | 0.7671 | -1.0 | 0.7114 | 0.7778 | 0.4123 | 0.6975 | 0.5349 | 0.7952 | 0.6745 | 0.8086 |
| 1.3795 | 16.0 | 960 | 0.8329 | 0.5395 | 0.7552 | 0.6311 | -1.0 | 0.5883 | 0.5466 | 0.4152 | 0.7104 | 0.757 | -1.0 | 0.7 | 0.7684 | 0.4031 | 0.7 | 0.5438 | 0.7738 | 0.6716 | 0.7971 |
| 0.7998 | 17.0 | 1020 | 0.8817 | 0.534 | 0.7852 | 0.6453 | -1.0 | 0.5942 | 0.5434 | 0.3962 | 0.6775 | 0.7507 | -1.0 | 0.71 | 0.7613 | 0.4026 | 0.685 | 0.5503 | 0.7643 | 0.6492 | 0.8029 |
| 0.7998 | 18.0 | 1080 | 0.8657 | 0.5663 | 0.8226 | 0.6633 | -1.0 | 0.6353 | 0.5746 | 0.4164 | 0.6948 | 0.7529 | -1.0 | 0.7186 | 0.7613 | 0.415 | 0.685 | 0.5936 | 0.7595 | 0.6903 | 0.8143 |
| 0.7998 | 19.0 | 1140 | 0.8733 | 0.5511 | 0.8041 | 0.6633 | -1.0 | 0.5608 | 0.5704 | 0.402 | 0.7012 | 0.7453 | -1.0 | 0.6757 | 0.7573 | 0.4056 | 0.7025 | 0.5905 | 0.7619 | 0.6572 | 0.7714 |
| 0.7998 | 20.0 | 1200 | 0.8267 | 0.5838 | 0.8199 | 0.6795 | -1.0 | 0.6184 | 0.5922 | 0.4153 | 0.7223 | 0.7688 | -1.0 | 0.7086 | 0.779 | 0.4281 | 0.7075 | 0.6191 | 0.7905 | 0.7042 | 0.8086 |
| 0.7998 | 21.0 | 1260 | 0.8072 | 0.5746 | 0.8082 | 0.669 | -1.0 | 0.6242 | 0.5837 | 0.424 | 0.7139 | 0.774 | -1.0 | 0.7086 | 0.7843 | 0.417 | 0.7225 | 0.5945 | 0.7881 | 0.7124 | 0.8114 |
| 0.7998 | 22.0 | 1320 | 0.8209 | 0.5833 | 0.8172 | 0.6688 | -1.0 | 0.6298 | 0.5924 | 0.4248 | 0.7034 | 0.7666 | -1.0 | 0.7229 | 0.7737 | 0.4388 | 0.7175 | 0.6002 | 0.7738 | 0.7108 | 0.8086 |
| 0.7998 | 23.0 | 1380 | 0.8103 | 0.5882 | 0.8115 | 0.6759 | -1.0 | 0.6302 | 0.5949 | 0.4237 | 0.7178 | 0.7796 | -1.0 | 0.7571 | 0.7845 | 0.4453 | 0.725 | 0.6136 | 0.7881 | 0.7059 | 0.8257 |
| 0.7998 | 24.0 | 1440 | 0.8106 | 0.5867 | 0.8113 | 0.6811 | -1.0 | 0.6585 | 0.5931 | 0.4273 | 0.7175 | 0.7777 | -1.0 | 0.73 | 0.7851 | 0.4353 | 0.7275 | 0.6169 | 0.7857 | 0.7077 | 0.82 |
| 0.6151 | 25.0 | 1500 | 0.8246 | 0.5815 | 0.8161 | 0.6787 | -1.0 | 0.6404 | 0.5954 | 0.424 | 0.7167 | 0.7696 | -1.0 | 0.72 | 0.7772 | 0.4355 | 0.7175 | 0.615 | 0.7714 | 0.6941 | 0.82 |
| 0.6151 | 26.0 | 1560 | 0.8168 | 0.5812 | 0.8151 | 0.6754 | -1.0 | 0.6353 | 0.5892 | 0.4254 | 0.7088 | 0.7707 | -1.0 | 0.7229 | 0.778 | 0.4366 | 0.725 | 0.6096 | 0.7786 | 0.6972 | 0.8086 |
| 0.6151 | 27.0 | 1620 | 0.8339 | 0.5809 | 0.8164 | 0.6778 | -1.0 | 0.6162 | 0.5896 | 0.4188 | 0.7077 | 0.7702 | -1.0 | 0.7057 | 0.7798 | 0.4323 | 0.7225 | 0.6103 | 0.7738 | 0.7 | 0.8143 |
| 0.6151 | 28.0 | 1680 | 0.8239 | 0.5779 | 0.8163 | 0.6688 | -1.0 | 0.617 | 0.5864 | 0.4218 | 0.7038 | 0.7647 | -1.0 | 0.6786 | 0.7764 | 0.4304 | 0.715 | 0.6121 | 0.7762 | 0.6911 | 0.8029 |
| 0.6151 | 29.0 | 1740 | 0.8207 | 0.5819 | 0.8169 | 0.6689 | -1.0 | 0.6283 | 0.5899 | 0.4235 | 0.7046 | 0.767 | -1.0 | 0.6886 | 0.7778 | 0.4342 | 0.72 | 0.6167 | 0.781 | 0.6948 | 0.8 |
| 0.6151 | 30.0 | 1800 | 0.8209 | 0.5813 | 0.8161 | 0.6682 | -1.0 | 0.6283 | 0.5888 | 0.4242 | 0.7055 | 0.7704 | -1.0 | 0.6886 | 0.7816 | 0.4339 | 0.7225 | 0.6177 | 0.7857 | 0.6923 | 0.8029 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"banana",
"orange",
"apple"
] |
MarioGL/yolo_finetuned_fruits |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8341
- Map: 0.572
- Map 50: 0.8556
- Map 75: 0.6387
- Map Small: -1.0
- Map Medium: 0.5995
- Map Large: 0.5779
- Mar 1: 0.4112
- Mar 10: 0.7057
- Mar 100: 0.7578
- Mar Small: -1.0
- Mar Medium: 0.7325
- Mar Large: 0.7609
- Map Banana: 0.4363
- Mar 100 Banana: 0.7325
- Map Orange: 0.6275
- Mar 100 Orange: 0.781
- Map Apple: 0.6522
- Mar 100 Apple: 0.76
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
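With `lr_scheduler_type: cosine` and no warmup listed, the learning rate traces half a cosine wave from the base value down to zero. A sketch, using the 1800 total steps shown in the results table below:

```python
import math

# Sketch of cosine decay without warmup (lr_scheduler_type: cosine): the
# learning rate traces half a cosine wave from base_lr down to 0. The 1800
# total steps match the last step in the results table.
def cosine_lr(step, base_lr=5e-05, total_steps=1800):
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * step / total_steps))

print(cosine_lr(0))     # 5e-05 (start of training)
print(cosine_lr(900))   # mid-training, half the base rate
print(cosine_lr(1800))  # end of training, decayed to ~0
```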
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 1.5440 | 0.0387 | 0.0859 | 0.0348 | -1.0 | 0.1394 | 0.0336 | 0.149 | 0.2869 | 0.5454 | -1.0 | 0.485 | 0.5483 | 0.0331 | 0.5975 | 0.0514 | 0.4929 | 0.0315 | 0.5457 |
| No log | 2.0 | 120 | 1.5123 | 0.0855 | 0.2047 | 0.0596 | -1.0 | 0.2024 | 0.0905 | 0.176 | 0.366 | 0.5231 | -1.0 | 0.43 | 0.5303 | 0.0686 | 0.575 | 0.0483 | 0.3143 | 0.1395 | 0.68 |
| No log | 3.0 | 180 | 1.4236 | 0.0718 | 0.1411 | 0.0629 | -1.0 | 0.1677 | 0.0714 | 0.2189 | 0.4188 | 0.5707 | -1.0 | 0.6325 | 0.5636 | 0.0422 | 0.5925 | 0.0798 | 0.5738 | 0.0934 | 0.5457 |
| No log | 4.0 | 240 | 1.2437 | 0.1361 | 0.2456 | 0.1491 | -1.0 | 0.3305 | 0.1522 | 0.2948 | 0.5091 | 0.6615 | -1.0 | 0.625 | 0.6675 | 0.0816 | 0.6175 | 0.1462 | 0.6786 | 0.1805 | 0.6886 |
| No log | 5.0 | 300 | 1.1642 | 0.1941 | 0.3089 | 0.2199 | -1.0 | 0.3199 | 0.2035 | 0.3128 | 0.5666 | 0.6821 | -1.0 | 0.705 | 0.6824 | 0.0805 | 0.635 | 0.1943 | 0.6429 | 0.3076 | 0.7686 |
| No log | 6.0 | 360 | 1.1856 | 0.3147 | 0.5352 | 0.3616 | -1.0 | 0.3609 | 0.3281 | 0.3224 | 0.5628 | 0.66 | -1.0 | 0.57 | 0.6692 | 0.1343 | 0.63 | 0.3586 | 0.6214 | 0.4513 | 0.7286 |
| No log | 7.0 | 420 | 0.9729 | 0.3946 | 0.6053 | 0.4763 | -1.0 | 0.3824 | 0.4076 | 0.3595 | 0.6093 | 0.7112 | -1.0 | 0.6675 | 0.7153 | 0.2312 | 0.705 | 0.4634 | 0.7286 | 0.4894 | 0.7 |
| No log | 8.0 | 480 | 1.0144 | 0.4255 | 0.7172 | 0.4726 | -1.0 | 0.4703 | 0.4381 | 0.362 | 0.6152 | 0.6965 | -1.0 | 0.6825 | 0.7014 | 0.2774 | 0.6475 | 0.4481 | 0.6905 | 0.5511 | 0.7514 |
| 1.1634 | 9.0 | 540 | 0.9774 | 0.48 | 0.7801 | 0.5204 | -1.0 | 0.515 | 0.5061 | 0.3615 | 0.641 | 0.7079 | -1.0 | 0.6325 | 0.7183 | 0.32 | 0.67 | 0.5217 | 0.731 | 0.5984 | 0.7229 |
| 1.1634 | 10.0 | 600 | 1.0095 | 0.4681 | 0.7863 | 0.4974 | -1.0 | 0.5686 | 0.4764 | 0.3608 | 0.6471 | 0.7063 | -1.0 | 0.645 | 0.7137 | 0.3044 | 0.665 | 0.5478 | 0.731 | 0.5521 | 0.7229 |
| 1.1634 | 11.0 | 660 | 0.9365 | 0.4856 | 0.785 | 0.5537 | -1.0 | 0.5393 | 0.4932 | 0.3753 | 0.6683 | 0.7209 | -1.0 | 0.71 | 0.7258 | 0.3324 | 0.6675 | 0.5215 | 0.7667 | 0.603 | 0.7286 |
| 1.1634 | 12.0 | 720 | 0.9318 | 0.5065 | 0.7759 | 0.5698 | -1.0 | 0.4812 | 0.5166 | 0.3932 | 0.6754 | 0.7317 | -1.0 | 0.7025 | 0.7373 | 0.3646 | 0.685 | 0.4942 | 0.7357 | 0.6606 | 0.7743 |
| 1.1634 | 13.0 | 780 | 0.8694 | 0.5439 | 0.8237 | 0.6188 | -1.0 | 0.5939 | 0.5536 | 0.3957 | 0.6971 | 0.7484 | -1.0 | 0.755 | 0.7513 | 0.4012 | 0.7075 | 0.5879 | 0.7833 | 0.6427 | 0.7543 |
| 1.1634 | 14.0 | 840 | 0.8888 | 0.537 | 0.8231 | 0.5881 | -1.0 | 0.471 | 0.5495 | 0.3965 | 0.6842 | 0.7273 | -1.0 | 0.7275 | 0.7298 | 0.4131 | 0.6875 | 0.557 | 0.7571 | 0.6408 | 0.7371 |
| 1.1634 | 15.0 | 900 | 0.8759 | 0.5486 | 0.8215 | 0.6192 | -1.0 | 0.4901 | 0.5642 | 0.4162 | 0.6849 | 0.7504 | -1.0 | 0.7175 | 0.7571 | 0.4077 | 0.6975 | 0.5634 | 0.7738 | 0.6749 | 0.78 |
| 1.1634 | 16.0 | 960 | 0.8709 | 0.5503 | 0.856 | 0.6079 | -1.0 | 0.6038 | 0.5588 | 0.3988 | 0.6788 | 0.7389 | -1.0 | 0.6925 | 0.7459 | 0.4131 | 0.6925 | 0.5928 | 0.7643 | 0.645 | 0.76 |
| 0.739 | 17.0 | 1020 | 0.9051 | 0.5407 | 0.8343 | 0.6075 | -1.0 | 0.6395 | 0.544 | 0.3903 | 0.6884 | 0.7336 | -1.0 | 0.7475 | 0.7349 | 0.3945 | 0.685 | 0.5774 | 0.7643 | 0.6501 | 0.7514 |
| 0.739 | 18.0 | 1080 | 0.8992 | 0.5441 | 0.84 | 0.5738 | -1.0 | 0.6025 | 0.5492 | 0.4014 | 0.684 | 0.7301 | -1.0 | 0.705 | 0.7349 | 0.4046 | 0.685 | 0.5938 | 0.7738 | 0.6341 | 0.7314 |
| 0.739 | 19.0 | 1140 | 0.8874 | 0.5597 | 0.8492 | 0.6127 | -1.0 | 0.637 | 0.5648 | 0.4083 | 0.6959 | 0.7476 | -1.0 | 0.7375 | 0.7512 | 0.4149 | 0.7 | 0.6086 | 0.7857 | 0.6555 | 0.7571 |
| 0.739 | 20.0 | 1200 | 0.8511 | 0.5739 | 0.8539 | 0.6164 | -1.0 | 0.6501 | 0.5792 | 0.4119 | 0.7027 | 0.7512 | -1.0 | 0.765 | 0.7526 | 0.4278 | 0.685 | 0.598 | 0.7857 | 0.6958 | 0.7829 |
| 0.739 | 21.0 | 1260 | 0.8410 | 0.5585 | 0.8335 | 0.602 | -1.0 | 0.617 | 0.562 | 0.4049 | 0.6914 | 0.7379 | -1.0 | 0.7225 | 0.7408 | 0.4426 | 0.695 | 0.598 | 0.7786 | 0.635 | 0.74 |
| 0.739 | 22.0 | 1320 | 0.8601 | 0.5661 | 0.8578 | 0.6273 | -1.0 | 0.59 | 0.5698 | 0.402 | 0.6915 | 0.7349 | -1.0 | 0.69 | 0.7399 | 0.4617 | 0.7075 | 0.5998 | 0.7714 | 0.6367 | 0.7257 |
| 0.739 | 23.0 | 1380 | 0.8342 | 0.5768 | 0.8697 | 0.6525 | -1.0 | 0.5742 | 0.5857 | 0.4092 | 0.6926 | 0.7453 | -1.0 | 0.7125 | 0.7495 | 0.4508 | 0.715 | 0.6183 | 0.781 | 0.6612 | 0.74 |
| 0.739 | 24.0 | 1440 | 0.8332 | 0.5754 | 0.8542 | 0.6483 | -1.0 | 0.5912 | 0.5811 | 0.4106 | 0.6929 | 0.7493 | -1.0 | 0.735 | 0.7519 | 0.4558 | 0.7175 | 0.6252 | 0.7905 | 0.6453 | 0.74 |
| 0.5743 | 25.0 | 1500 | 0.8418 | 0.5749 | 0.8527 | 0.6509 | -1.0 | 0.589 | 0.5814 | 0.4114 | 0.6978 | 0.7517 | -1.0 | 0.725 | 0.7552 | 0.4595 | 0.7275 | 0.6192 | 0.7905 | 0.6461 | 0.7371 |
| 0.5743 | 26.0 | 1560 | 0.8364 | 0.573 | 0.854 | 0.6416 | -1.0 | 0.6126 | 0.5773 | 0.4096 | 0.6985 | 0.7505 | -1.0 | 0.745 | 0.752 | 0.4485 | 0.7225 | 0.6224 | 0.7833 | 0.6482 | 0.7457 |
| 0.5743 | 27.0 | 1620 | 0.8337 | 0.574 | 0.8561 | 0.6405 | -1.0 | 0.6115 | 0.579 | 0.4104 | 0.6971 | 0.7515 | -1.0 | 0.7325 | 0.754 | 0.4423 | 0.7225 | 0.6291 | 0.7833 | 0.6504 | 0.7486 |
| 0.5743 | 28.0 | 1680 | 0.8323 | 0.5702 | 0.8556 | 0.6335 | -1.0 | 0.6109 | 0.5749 | 0.4104 | 0.704 | 0.7544 | -1.0 | 0.7225 | 0.7583 | 0.4356 | 0.7275 | 0.6258 | 0.7786 | 0.6491 | 0.7571 |
| 0.5743 | 29.0 | 1740 | 0.8336 | 0.5719 | 0.8555 | 0.6387 | -1.0 | 0.5994 | 0.5779 | 0.4112 | 0.7057 | 0.7578 | -1.0 | 0.7325 | 0.7609 | 0.4356 | 0.7325 | 0.6275 | 0.781 | 0.6526 | 0.76 |
| 0.5743 | 30.0 | 1800 | 0.8341 | 0.572 | 0.8556 | 0.6387 | -1.0 | 0.5995 | 0.5779 | 0.4112 | 0.7057 | 0.7578 | -1.0 | 0.7325 | 0.7609 | 0.4363 | 0.7325 | 0.6275 | 0.781 | 0.6522 | 0.76 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"banana",
"orange",
"apple"
] |
JAAN555/detr-object-detection |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-object-detection
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8362
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
KollaneMesilane/detr-fashion |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-fashion
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2605
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
ceebdev/detr-fashionpedia |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-fashionpedia
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6505
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
- mixed_precision_training: Native AMP
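The cosine schedule with `lr_scheduler_warmup_ratio: 0.1` warms up linearly over the first 10% of steps, then decays to zero. A minimal sketch, using the 1250 total steps visible in the results table below (10 epochs × 125 steps):

```python
import math

def cosine_lr(step, total_steps=1250, base_lr=2e-5, warmup_ratio=0.1):
    """Linear warmup over the first warmup_ratio of steps, then cosine
    decay to zero -- mirroring lr_scheduler_type=cosine with
    lr_scheduler_warmup_ratio=0.1."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(125))   # 2e-05 (warmup just finished, peak rate)
print(cosine_lr(1250))  # 0.0  (end of training)
```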
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 4.1251 | 1.0 | 125 | 3.8414 |
| 2.3946 | 2.0 | 250 | 2.2576 |
| 2.0764 | 3.0 | 375 | 1.9349 |
| 1.8577 | 4.0 | 500 | 1.7802 |
| 1.7753 | 5.0 | 625 | 1.7106 |
| 1.7306 | 6.0 | 750 | 1.6355 |
| 1.6549 | 7.0 | 875 | 1.5866 |
| 1.634 | 8.0 | 1000 | 1.5723 |
| 1.6345 | 9.0 | 1125 | 1.5614 |
| 1.6894 | 10.0 | 1250 | 1.5593 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
Megameeth/trainer_output |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# trainer_output
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.6950
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 125 | 2.8205 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
opria123/detr-resnet-50-dc5-hardhat-finetuned |
# Model Card for Model ID
A fine-tuned model for object detection.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** Joshua Opria
- **Model type:** Object Detection
- **License:** MIT
- **Finetuned from model [optional]:** [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5)
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [opria123/detr-resnet-50-dc5-hardhat-finetuned](https://huggingface.co/opria123/detr-resnet-50-dc5-hardhat-finetuned)
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
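The card does not include a usage snippet; one post-processing detail worth knowing is that DETR-family models predict boxes as normalized `(cx, cy, w, h)`, which must be converted to pixel-space corners before drawing. A minimal sketch (the image size here is illustrative):

```python
def cxcywh_to_xyxy(box, img_w, img_h):
    """Convert one normalized DETR box (cx, cy, w, h) to absolute
    (x_min, y_min, x_max, y_max) pixel coordinates."""
    cx, cy, w, h = box
    return (
        (cx - w / 2) * img_w,
        (cy - h / 2) * img_h,
        (cx + w / 2) * img_w,
        (cy + h / 2) * img_h,
    )

# A centered box covering half the image in each dimension:
print(cxcywh_to_xyxy((0.5, 0.5, 0.5, 0.5), 640, 480))  # (160.0, 120.0, 480.0, 360.0)
```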
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[anindya64/hardhat](https://huggingface.co/datasets/anindya64/hardhat) | [
"head",
"helmet",
"person"
] |
alanahmet/rtdetr-v2-r50-cppe5-finetune-2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-v2-r50-cppe5-finetune-2
This model is a fine-tuned version of [PekingU/rtdetr_v2_r50vd](https://huggingface.co/PekingU/rtdetr_v2_r50vd) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 9.2996
- Map: 0.3651
- Map 50: 0.5466
- Map 75: 0.383
- Map Small: 0.2958
- Map Medium: 0.3972
- Map Large: 0.4626
- Mar 1: 0.2229
- Mar 10: 0.4668
- Mar 100: 0.5843
- Mar Small: 0.4974
- Mar Medium: 0.5996
- Mar Large: 0.6166
- Map Plane: 0.6432
- Mar 100 Plane: 0.7129
- Map Ship: 0.4814
- Mar 100 Ship: 0.5906
- Map Storage-tank: 0.5778
- Mar 100 Storage-tank: 0.74
- Map Baseball-diamond: 0.0738
- Mar 100 Baseball-diamond: 0.5
- Map Tennis-court: 0.5923
- Mar 100 Tennis-court: 0.68
- Map Basketball-court: 0.3908
- Mar 100 Basketball-court: 0.6833
- Map Ground-track-field: 0.4067
- Mar 100 Ground-track-field: 0.55
- Map Harbor: 0.2198
- Mar 100 Harbor: 0.5298
- Map Bridge: 0.045
- Mar 100 Bridge: 0.24
- Map Small-vehicle: 0.3045
- Mar 100 Small-vehicle: 0.4403
- Map Large-vehicle: 0.5136
- Mar 100 Large-vehicle: 0.6855
- Map Roundabout: 0.3772
- Mar 100 Roundabout: 0.7182
- Map Swimming-pool: 0.1756
- Mar 100 Swimming-pool: 0.36
- Map Helicopter: -1.0
- Mar 100 Helicopter: -1.0
- Map Soccer-ball-field: 0.3102
- Mar 100 Soccer-ball-field: 0.75
- Map Container-crane: -1.0
- Mar 100 Container-crane: -1.0
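The `-1.0` values for Helicopter and Container-crane above are sentinels the COCO evaluator reports for classes with no ground-truth instances in the evaluation split; the overall Map averages only the valid classes. A sketch of that convention:

```python
def mean_ap(per_class_ap):
    """Average per-class AP, skipping the -1.0 sentinel that pycocotools
    reports for classes absent from the evaluation set."""
    valid = [ap for ap in per_class_ap if ap >= 0]
    return sum(valid) / len(valid) if valid else -1.0

print(mean_ap([0.5, -1.0, 0.3]))  # 0.4
```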
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Plane | Mar 100 Plane | Map Ship | Mar 100 Ship | Map Storage-tank | Mar 100 Storage-tank | Map Baseball-diamond | Mar 100 Baseball-diamond | Map Tennis-court | Mar 100 Tennis-court | Map Basketball-court | Mar 100 Basketball-court | Map Ground-track-field | Mar 100 Ground-track-field | Map Harbor | Mar 100 Harbor | Map Bridge | Mar 100 Bridge | Map Small-vehicle | Mar 100 Small-vehicle | Map Large-vehicle | Mar 100 Large-vehicle | Map Roundabout | Mar 100 Roundabout | Map Swimming-pool | Mar 100 Swimming-pool | Map Helicopter | Mar 100 Helicopter | Map Soccer-ball-field | Mar 100 Soccer-ball-field | Map Container-crane | Mar 100 Container-crane |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:---------:|:-------------:|:--------:|:------------:|:----------------:|:--------------------:|:--------------------:|:------------------------:|:----------------:|:--------------------:|:--------------------:|:------------------------:|:----------------------:|:--------------------------:|:----------:|:--------------:|:----------:|:--------------:|:-----------------:|:---------------------:|:-----------------:|:---------------------:|:--------------:|:------------------:|:-----------------:|:---------------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:-------------------:|:-----------------------:|
| No log | 1.0 | 88 | 25.4517 | 0.0042 | 0.0068 | 0.0041 | 0.0025 | 0.0061 | 0.0314 | 0.0045 | 0.0145 | 0.0303 | 0.0074 | 0.0293 | 0.0986 | 0.0 | 0.0 | 0.0168 | 0.0379 | 0.0149 | 0.0263 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.24 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0213 | 0.1037 | 0.0093 | 0.014 | 0.0 | 0.0 | 0.0003 | 0.032 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 176 | 20.1694 | 0.0533 | 0.09 | 0.0587 | 0.0359 | 0.0726 | 0.0803 | 0.0371 | 0.1015 | 0.1769 | 0.1142 | 0.2239 | 0.3184 | 0.1794 | 0.4671 | 0.2892 | 0.4992 | 0.0037 | 0.3816 | 0.0 | 0.0 | 0.0241 | 0.1583 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0059 | 0.0291 | 0.0 | 0.0 | 0.1353 | 0.2807 | 0.1285 | 0.527 | 0.0303 | 0.2 | 0.004 | 0.11 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 3.0 | 264 | 14.9836 | 0.0846 | 0.1417 | 0.087 | 0.0635 | 0.1184 | 0.1826 | 0.0576 | 0.1991 | 0.291 | 0.2215 | 0.2839 | 0.4473 | 0.2161 | 0.5027 | 0.2589 | 0.5276 | 0.1487 | 0.6605 | 0.0 | 0.0 | 0.1102 | 0.2167 | 0.0 | 0.0 | 0.0002 | 0.12 | 0.0682 | 0.2646 | 0.0 | 0.0 | 0.1915 | 0.3319 | 0.2193 | 0.5586 | 0.0443 | 0.4222 | 0.0116 | 0.16 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0005 | 0.6 |
| No log | 4.0 | 352 | 14.1814 | 0.1282 | 0.2111 | 0.1337 | 0.1045 | 0.1603 | 0.2022 | 0.0609 | 0.2 | 0.282 | 0.2162 | 0.3251 | 0.5015 | 0.3962 | 0.5658 | 0.3724 | 0.5957 | 0.363 | 0.6447 | 0.0 | 0.0 | 0.1996 | 0.2333 | 0.0 | 0.0 | 0.0006 | 0.28 | 0.1563 | 0.4797 | 0.0 | 0.0 | 0.2066 | 0.3478 | 0.2081 | 0.5797 | 0.004 | 0.1778 | 0.0158 | 0.176 | -1.0 | -1.0 | 0.0004 | 0.15 | 0.0 | 0.0 |
| No log | 5.0 | 440 | 13.4076 | 0.1585 | 0.2597 | 0.1673 | 0.1216 | 0.2051 | 0.1611 | 0.0816 | 0.2437 | 0.3324 | 0.3078 | 0.3586 | 0.4479 | 0.399 | 0.5493 | 0.3981 | 0.5844 | 0.5165 | 0.6816 | 0.0765 | 0.3143 | 0.1683 | 0.1667 | 0.0 | 0.0 | 0.0001 | 0.18 | 0.1232 | 0.4595 | 0.0011 | 0.0889 | 0.2144 | 0.3478 | 0.3437 | 0.6486 | 0.0531 | 0.3333 | 0.0794 | 0.282 | -1.0 | -1.0 | 0.003 | 0.25 | 0.0003 | 0.1 |
| 29.0102 | 6.0 | 528 | 11.9939 | 0.1826 | 0.3022 | 0.1882 | 0.1223 | 0.2211 | 0.267 | 0.0845 | 0.2492 | 0.3486 | 0.2705 | 0.4467 | 0.503 | 0.425 | 0.5521 | 0.4321 | 0.6317 | 0.5051 | 0.7026 | 0.0762 | 0.2857 | 0.2724 | 0.3333 | 0.0 | 0.0 | 0.0059 | 0.44 | 0.1342 | 0.4848 | 0.0001 | 0.0111 | 0.2457 | 0.3793 | 0.4366 | 0.709 | 0.1391 | 0.3667 | 0.0663 | 0.308 | -1.0 | -1.0 | 0.0006 | 0.025 | 0.0 | 0.0 |
| 29.0102 | 7.0 | 616 | 11.6131 | 0.1816 | 0.2996 | 0.1866 | 0.1624 | 0.2153 | 0.2611 | 0.1083 | 0.2786 | 0.3697 | 0.3105 | 0.4143 | 0.5789 | 0.4016 | 0.5247 | 0.4322 | 0.6217 | 0.5639 | 0.7026 | 0.109 | 0.3429 | 0.2366 | 0.275 | 0.0 | 0.0 | 0.0029 | 0.46 | 0.1711 | 0.4911 | 0.0036 | 0.1111 | 0.2409 | 0.3801 | 0.3971 | 0.6874 | 0.1032 | 0.4333 | 0.0572 | 0.34 | -1.0 | -1.0 | 0.0053 | 0.175 | 0.0 | 0.0 |
| 29.0102 | 8.0 | 704 | 11.5147 | 0.197 | 0.3432 | 0.1936 | 0.1485 | 0.2397 | 0.2802 | 0.1312 | 0.3303 | 0.4256 | 0.2678 | 0.4227 | 0.6935 | 0.4432 | 0.5603 | 0.4805 | 0.6419 | 0.5221 | 0.6447 | 0.1193 | 0.5143 | 0.2468 | 0.3 | 0.0 | 0.0 | 0.022 | 0.64 | 0.233 | 0.5684 | 0.0162 | 0.1 | 0.2532 | 0.3784 | 0.3621 | 0.6374 | 0.0735 | 0.4 | 0.1258 | 0.298 | -1.0 | -1.0 | 0.0098 | 0.2 | 0.0477 | 0.5 |
| 29.0102 | 9.0 | 792 | 11.5088 | 0.2292 | 0.3903 | 0.2354 | 0.1937 | 0.2654 | 0.3181 | 0.1335 | 0.3328 | 0.4433 | 0.3076 | 0.4504 | 0.6811 | 0.4572 | 0.5973 | 0.4886 | 0.6639 | 0.5129 | 0.6842 | 0.211 | 0.5143 | 0.2526 | 0.35 | 0.0 | 0.0 | 0.0051 | 0.7 | 0.2677 | 0.5139 | 0.0063 | 0.1222 | 0.2591 | 0.3817 | 0.49 | 0.7131 | 0.2461 | 0.4556 | 0.1023 | 0.378 | -1.0 | -1.0 | 0.0054 | 0.175 | 0.1333 | 0.4 |
| 29.0102 | 10.0 | 880 | 11.1732 | 0.224 | 0.3831 | 0.2293 | 0.1834 | 0.2493 | 0.3322 | 0.1103 | 0.334 | 0.4504 | 0.3186 | 0.4659 | 0.6957 | 0.4729 | 0.5671 | 0.4927 | 0.6598 | 0.5245 | 0.6789 | 0.0966 | 0.5 | 0.2592 | 0.4 | 0.0 | 0.0 | 0.13 | 0.62 | 0.2712 | 0.5468 | 0.0134 | 0.1111 | 0.2703 | 0.3857 | 0.4913 | 0.709 | 0.0934 | 0.4 | 0.1138 | 0.378 | -1.0 | -1.0 | 0.0229 | 0.2 | 0.1083 | 0.6 |
| 29.0102 | 11.0 | 968 | 10.8764 | 0.211 | 0.3604 | 0.2169 | 0.156 | 0.2607 | 0.2969 | 0.0939 | 0.3353 | 0.4286 | 0.3252 | 0.4256 | 0.6603 | 0.4866 | 0.5918 | 0.4991 | 0.6678 | 0.5128 | 0.6658 | 0.1055 | 0.4714 | 0.2119 | 0.3083 | 0.0 | 0.0 | 0.0139 | 0.52 | 0.2939 | 0.5367 | 0.0182 | 0.1111 | 0.2688 | 0.3957 | 0.453 | 0.7212 | 0.167 | 0.4444 | 0.1301 | 0.394 | -1.0 | -1.0 | 0.0017 | 0.2 | 0.0031 | 0.4 |
| 15.7212 | 12.0 | 1056 | 10.6630 | 0.2335 | 0.4047 | 0.2246 | 0.1821 | 0.2791 | 0.3896 | 0.1324 | 0.3383 | 0.4547 | 0.3184 | 0.4906 | 0.6587 | 0.4914 | 0.5808 | 0.4921 | 0.6678 | 0.5194 | 0.6737 | 0.1295 | 0.4571 | 0.1784 | 0.3583 | 0.0 | 0.0 | 0.0229 | 0.6 | 0.3227 | 0.5823 | 0.0428 | 0.1778 | 0.266 | 0.3982 | 0.5104 | 0.7131 | 0.1807 | 0.4556 | 0.204 | 0.456 | -1.0 | -1.0 | 0.0036 | 0.1 | 0.1392 | 0.6 |
| 15.7212 | 13.0 | 1144 | 11.0687 | 0.23 | 0.387 | 0.2471 | 0.1712 | 0.2709 | 0.3033 | 0.1183 | 0.3425 | 0.4458 | 0.325 | 0.4967 | 0.6931 | 0.5014 | 0.6068 | 0.4947 | 0.6637 | 0.5254 | 0.6763 | 0.0657 | 0.3857 | 0.2253 | 0.3583 | 0.0 | 0.0 | 0.1651 | 0.64 | 0.3136 | 0.5835 | 0.0243 | 0.1556 | 0.2579 | 0.3849 | 0.4517 | 0.709 | 0.2071 | 0.4556 | 0.1705 | 0.368 | -1.0 | -1.0 | 0.0139 | 0.2 | 0.033 | 0.5 |
| 15.7212 | 14.0 | 1232 | 10.9887 | 0.2391 | 0.3985 | 0.2554 | 0.1828 | 0.3059 | 0.2786 | 0.1245 | 0.3792 | 0.4764 | 0.3224 | 0.513 | 0.6981 | 0.5166 | 0.5986 | 0.4857 | 0.6785 | 0.5542 | 0.7079 | 0.1125 | 0.5143 | 0.2716 | 0.3833 | 0.0 | 0.0 | 0.048 | 0.66 | 0.3393 | 0.6025 | 0.0761 | 0.1778 | 0.2643 | 0.3893 | 0.4831 | 0.7176 | 0.2294 | 0.4667 | 0.1915 | 0.45 | -1.0 | -1.0 | 0.0137 | 0.2 | 0.0011 | 0.6 |
| 15.7212 | 15.0 | 1320 | 10.7109 | 0.2637 | 0.4469 | 0.2786 | 0.1669 | 0.3115 | 0.3557 | 0.136 | 0.3748 | 0.4761 | 0.3325 | 0.4656 | 0.6819 | 0.5098 | 0.6027 | 0.4819 | 0.6683 | 0.5734 | 0.7237 | 0.1718 | 0.4286 | 0.2432 | 0.3 | 0.0 | 0.0 | 0.035 | 0.6 | 0.3464 | 0.6139 | 0.0467 | 0.1889 | 0.2751 | 0.4018 | 0.4934 | 0.7257 | 0.3354 | 0.4889 | 0.1784 | 0.474 | -1.0 | -1.0 | 0.0772 | 0.225 | 0.1875 | 0.7 |
| 15.7212 | 16.0 | 1408 | 10.5691 | 0.2425 | 0.4412 | 0.2375 | 0.1652 | 0.2995 | 0.2878 | 0.1185 | 0.3349 | 0.4241 | 0.3633 | 0.5052 | 0.5158 | 0.4893 | 0.6137 | 0.4967 | 0.6675 | 0.555 | 0.6868 | 0.0762 | 0.3714 | 0.1303 | 0.375 | 0.0004 | 0.0167 | 0.032 | 0.68 | 0.3526 | 0.6278 | 0.0519 | 0.1111 | 0.2677 | 0.3886 | 0.4878 | 0.6856 | 0.2026 | 0.3889 | 0.1958 | 0.448 | -1.0 | -1.0 | 0.0 | 0.0 | 0.3 | 0.3 |
| 15.7212 | 17.0 | 1496 | 10.4504 | 0.275 | 0.467 | 0.2711 | 0.1953 | 0.3251 | 0.3186 | 0.1366 | 0.3566 | 0.4787 | 0.3615 | 0.5271 | 0.7095 | 0.5494 | 0.6932 | 0.4902 | 0.6829 | 0.586 | 0.7263 | 0.1781 | 0.4286 | 0.1429 | 0.35 | 0.0 | 0.0 | 0.0715 | 0.7 | 0.3606 | 0.619 | 0.0286 | 0.1667 | 0.276 | 0.4056 | 0.5021 | 0.6937 | 0.2912 | 0.4556 | 0.2373 | 0.484 | -1.0 | -1.0 | 0.0007 | 0.175 | 0.4111 | 0.6 |
| 13.6954 | 18.0 | 1584 | 10.5193 | 0.2749 | 0.4593 | 0.2754 | 0.1839 | 0.313 | 0.3305 | 0.1285 | 0.3674 | 0.4762 | 0.3443 | 0.467 | 0.7019 | 0.5493 | 0.6452 | 0.5076 | 0.6816 | 0.6065 | 0.7342 | 0.2002 | 0.4571 | 0.1805 | 0.3083 | 0.0 | 0.0 | 0.0895 | 0.6 | 0.3764 | 0.6494 | 0.0045 | 0.1778 | 0.2813 | 0.3996 | 0.4814 | 0.6941 | 0.2725 | 0.4667 | 0.2098 | 0.454 | -1.0 | -1.0 | 0.0021 | 0.175 | 0.362 | 0.7 |
| 13.6954 | 19.0 | 1672 | 10.5809 | 0.2674 | 0.4699 | 0.2734 | 0.1747 | 0.3078 | 0.3692 | 0.1318 | 0.3668 | 0.4769 | 0.3993 | 0.469 | 0.6847 | 0.5343 | 0.6274 | 0.4819 | 0.6747 | 0.5896 | 0.7026 | 0.2098 | 0.4714 | 0.2291 | 0.3417 | 0.0 | 0.0 | 0.04 | 0.68 | 0.3682 | 0.6152 | 0.0235 | 0.2111 | 0.2784 | 0.4021 | 0.4625 | 0.6923 | 0.2632 | 0.5222 | 0.1979 | 0.438 | -1.0 | -1.0 | 0.0012 | 0.175 | 0.3313 | 0.6 |
| 13.6954 | 20.0 | 1760 | 10.3693 | 0.2694 | 0.4794 | 0.2715 | 0.1737 | 0.3039 | 0.369 | 0.1389 | 0.3637 | 0.5011 | 0.3714 | 0.5108 | 0.6987 | 0.5429 | 0.6575 | 0.4967 | 0.6806 | 0.5717 | 0.6789 | 0.2065 | 0.4714 | 0.2301 | 0.3083 | 0.0082 | 0.0417 | 0.0851 | 0.68 | 0.3568 | 0.6519 | 0.0214 | 0.2222 | 0.2882 | 0.4002 | 0.4941 | 0.7081 | 0.198 | 0.4333 | 0.2161 | 0.458 | -1.0 | -1.0 | 0.0015 | 0.425 | 0.3235 | 0.7 |
| 13.6954 | 21.0 | 1848 | 10.3957 | 0.2674 | 0.4494 | 0.2784 | 0.1869 | 0.3176 | 0.3623 | 0.1462 | 0.3646 | 0.461 | 0.3673 | 0.461 | 0.629 | 0.5379 | 0.6411 | 0.5007 | 0.6877 | 0.607 | 0.7105 | 0.2067 | 0.4714 | 0.2495 | 0.2667 | 0.0 | 0.0 | 0.1708 | 0.58 | 0.3583 | 0.6139 | 0.0665 | 0.3111 | 0.2838 | 0.4067 | 0.4652 | 0.7 | 0.273 | 0.4333 | 0.2261 | 0.442 | -1.0 | -1.0 | 0.0029 | 0.05 | 0.063 | 0.6 |
| 13.6954 | 22.0 | 1936 | 10.5870 | 0.2837 | 0.5025 | 0.3011 | 0.2016 | 0.319 | 0.3547 | 0.1312 | 0.3848 | 0.4868 | 0.3699 | 0.4782 | 0.6906 | 0.5304 | 0.6479 | 0.4985 | 0.6783 | 0.6024 | 0.7105 | 0.2921 | 0.5143 | 0.2317 | 0.3167 | 0.0041 | 0.05 | 0.1305 | 0.56 | 0.3841 | 0.6582 | 0.074 | 0.3556 | 0.2898 | 0.4075 | 0.4938 | 0.7032 | 0.2397 | 0.4444 | 0.2329 | 0.48 | -1.0 | -1.0 | 0.0011 | 0.175 | 0.25 | 0.6 |
| 12.5344 | 23.0 | 2024 | 10.5880 | 0.2836 | 0.5099 | 0.2943 | 0.1866 | 0.3357 | 0.3583 | 0.1269 | 0.3666 | 0.4672 | 0.3569 | 0.5405 | 0.6325 | 0.5484 | 0.6329 | 0.5029 | 0.6918 | 0.5995 | 0.7263 | 0.2531 | 0.5143 | 0.2404 | 0.375 | 0.0028 | 0.05 | 0.1286 | 0.66 | 0.3662 | 0.6354 | 0.0582 | 0.2889 | 0.2892 | 0.4014 | 0.5032 | 0.6712 | 0.3226 | 0.4333 | 0.2879 | 0.478 | -1.0 | -1.0 | 0.0008 | 0.05 | 0.15 | 0.4 |
| 12.5344 | 24.0 | 2112 | 10.3784 | 0.2725 | 0.4892 | 0.2814 | 0.1911 | 0.33 | 0.3301 | 0.125 | 0.3539 | 0.471 | 0.3633 | 0.5435 | 0.6243 | 0.5351 | 0.6658 | 0.5011 | 0.6859 | 0.6006 | 0.7211 | 0.1963 | 0.5143 | 0.2734 | 0.4083 | 0.0046 | 0.0417 | 0.0466 | 0.62 | 0.3587 | 0.6342 | 0.0568 | 0.2778 | 0.2892 | 0.4057 | 0.4952 | 0.6802 | 0.3295 | 0.4556 | 0.2775 | 0.48 | -1.0 | -1.0 | 0.0016 | 0.075 | 0.1214 | 0.4 |
| 12.5344 | 25.0 | 2200 | 10.5771 | 0.2856 | 0.5048 | 0.2945 | 0.1905 | 0.3346 | 0.3629 | 0.1378 | 0.3627 | 0.4881 | 0.3714 | 0.4738 | 0.7082 | 0.535 | 0.6548 | 0.5096 | 0.6931 | 0.5848 | 0.6868 | 0.2974 | 0.5571 | 0.2531 | 0.3167 | 0.0091 | 0.05 | 0.0533 | 0.58 | 0.3663 | 0.6342 | 0.0672 | 0.3222 | 0.2893 | 0.405 | 0.5009 | 0.6856 | 0.3279 | 0.4667 | 0.2681 | 0.494 | -1.0 | -1.0 | 0.0021 | 0.175 | 0.2197 | 0.6 |
| 12.5344 | 26.0 | 2288 | 10.4449 | 0.2858 | 0.4949 | 0.2954 | 0.191 | 0.3275 | 0.3666 | 0.142 | 0.3613 | 0.4886 | 0.3864 | 0.5312 | 0.6736 | 0.5488 | 0.6726 | 0.4946 | 0.6841 | 0.5982 | 0.7158 | 0.2536 | 0.4429 | 0.256 | 0.3667 | 0.0035 | 0.0583 | 0.0957 | 0.58 | 0.3819 | 0.6177 | 0.038 | 0.4 | 0.2885 | 0.4019 | 0.4975 | 0.6802 | 0.3247 | 0.4444 | 0.2817 | 0.49 | -1.0 | -1.0 | 0.0015 | 0.175 | 0.2222 | 0.6 |
| 12.5344 | 27.0 | 2376 | 10.5696 | 0.2848 | 0.5067 | 0.3072 | 0.1958 | 0.3306 | 0.3707 | 0.1398 | 0.3585 | 0.492 | 0.3713 | 0.5444 | 0.6851 | 0.5364 | 0.6616 | 0.5001 | 0.6801 | 0.6067 | 0.7079 | 0.3179 | 0.5286 | 0.2572 | 0.375 | 0.007 | 0.0667 | 0.091 | 0.66 | 0.3871 | 0.6203 | 0.0478 | 0.3 | 0.2897 | 0.3992 | 0.4964 | 0.6896 | 0.2966 | 0.4222 | 0.2978 | 0.494 | -1.0 | -1.0 | 0.0012 | 0.175 | 0.1385 | 0.6 |
| 12.5344 | 28.0 | 2464 | 10.6176 | 0.2853 | 0.5035 | 0.3096 | 0.1949 | 0.3304 | 0.3639 | 0.1329 | 0.3858 | 0.4886 | 0.3558 | 0.5407 | 0.7019 | 0.5433 | 0.6644 | 0.5061 | 0.6859 | 0.602 | 0.7026 | 0.3147 | 0.5429 | 0.2489 | 0.3583 | 0.0041 | 0.0417 | 0.0717 | 0.64 | 0.3868 | 0.6342 | 0.0604 | 0.2778 | 0.2874 | 0.4002 | 0.5051 | 0.695 | 0.3014 | 0.4222 | 0.2856 | 0.488 | -1.0 | -1.0 | 0.0012 | 0.175 | 0.1614 | 0.6 |
| 11.8366 | 29.0 | 2552 | 10.5827 | 0.2833 | 0.5027 | 0.3071 | 0.1993 | 0.3317 | 0.355 | 0.1358 | 0.3625 | 0.4866 | 0.3758 | 0.5408 | 0.648 | 0.536 | 0.6603 | 0.5022 | 0.6816 | 0.5885 | 0.7132 | 0.3216 | 0.5286 | 0.2557 | 0.35 | 0.0078 | 0.0667 | 0.0491 | 0.62 | 0.3801 | 0.6329 | 0.0619 | 0.3444 | 0.2867 | 0.3989 | 0.5099 | 0.7045 | 0.3289 | 0.4556 | 0.2806 | 0.468 | -1.0 | -1.0 | 0.0007 | 0.075 | 0.1392 | 0.6 |
| 11.8366 | 30.0 | 2640 | 10.5255 | 0.28 | 0.4993 | 0.2974 | 0.2079 | 0.3297 | 0.3254 | 0.1378 | 0.36 | 0.4847 | 0.3734 | 0.544 | 0.6451 | 0.5284 | 0.6562 | 0.5023 | 0.6872 | 0.5925 | 0.6974 | 0.2781 | 0.5429 | 0.2486 | 0.35 | 0.0085 | 0.0667 | 0.0526 | 0.62 | 0.3821 | 0.6304 | 0.0567 | 0.3111 | 0.2855 | 0.3999 | 0.5044 | 0.6968 | 0.3226 | 0.4444 | 0.3012 | 0.492 | -1.0 | -1.0 | 0.0002 | 0.075 | 0.136 | 0.6 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"plane",
"ship",
"storage-tank",
"baseball-diamond",
"tennis-court",
"basketball-court",
"ground-track-field",
"harbor",
"bridge",
"small-vehicle",
"large-vehicle",
"roundabout",
"swimming-pool",
"helicopter",
"soccer-ball-field",
"container-crane"
] |
Vasily-Lapinsky/Indoor-RT-Detrv2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr_v2_r18vd-finetuned-indoor-batch8-loss-finall
This model is a fine-tuned version of [PekingU/rtdetr_v2_r18vd](https://huggingface.co/PekingU/rtdetr_v2_r18vd) on the indoor dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7451
- Map: 0.7714
- Map 50: 0.9402
- Map 75: 0.8807
- Map Small: 0.4279
- Map Medium: 0.6977
- Map Large: 0.7784
- Mar 1: 0.704
- Mar 10: 0.8268
- Mar 100: 0.8378
- Mar Small: 0.4723
- Mar Medium: 0.7949
- Mar Large: 0.8414
- Map Exit: 0.7325
- Mar 100 Exit: 0.7923
- Map Fireextinguisher: 0.7845
- Mar 100 Fireextinguisher: 0.8203
- Map Chair: 0.7354
- Mar 100 Chair: 0.8301
- Map Clock: 0.7962
- Mar 100 Clock: 0.837
- Map Trashbin: 0.5619
- Mar 100 Trashbin: 0.725
- Map Printer: 0.8914
- Mar 100 Printer: 0.9375
- Map Screen: 0.8976
- Mar 100 Screen: 0.9222
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 1337
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 25.0
- mixed_precision_training: Native AMP
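`Map 50` and `Map 75` in the table below are mean average precision at IoU thresholds 0.50 and 0.75. A minimal IoU computation for two axis-aligned boxes (the coordinates are illustrative, not taken from this dataset):

```python
def iou(a, b):
    """Intersection-over-union of two (x_min, y_min, x_max, y_max) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7 ≈ 0.1429
```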
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Exit | Mar 100 Exit | Map Fireextinguisher | Mar 100 Fireextinguisher | Map Chair | Mar 100 Chair | Map Clock | Mar 100 Clock | Map Trashbin | Mar 100 Trashbin | Map Printer | Mar 100 Printer | Map Screen | Mar 100 Screen |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------:|:------------:|:--------------------:|:------------------------:|:---------:|:-------------:|:---------:|:-------------:|:------------:|:----------------:|:-----------:|:---------------:|:----------:|:--------------:|
| 0.8218 | 1.0 | 112 | 0.9142 | 0.6911 | 0.8734 | 0.7937 | 0.2594 | 0.6244 | 0.7302 | 0.6551 | 0.7783 | 0.7887 | 0.3233 | 0.7267 | 0.8169 | 0.6682 | 0.7519 | 0.6719 | 0.7531 | 0.6624 | 0.7687 | 0.761 | 0.7963 | 0.3997 | 0.64 | 0.8667 | 0.9 | 0.8078 | 0.9111 |
| 0.801 | 2.0 | 224 | 0.8670 | 0.7147 | 0.8926 | 0.8306 | 0.2687 | 0.62 | 0.7324 | 0.6732 | 0.7871 | 0.7982 | 0.3294 | 0.7339 | 0.8116 | 0.6883 | 0.7577 | 0.7121 | 0.7797 | 0.6997 | 0.7982 | 0.7695 | 0.7889 | 0.4928 | 0.685 | 0.871 | 0.9 | 0.7696 | 0.8778 |
| 0.7682 | 3.0 | 336 | 0.9007 | 0.7075 | 0.9011 | 0.8123 | 0.2875 | 0.6152 | 0.7243 | 0.6701 | 0.7838 | 0.794 | 0.401 | 0.7275 | 0.8217 | 0.687 | 0.7442 | 0.7061 | 0.7741 | 0.6667 | 0.7804 | 0.7607 | 0.7963 | 0.4857 | 0.72 | 0.8775 | 0.8875 | 0.769 | 0.8556 |
| 0.7507 | 4.0 | 448 | 0.8265 | 0.7392 | 0.9242 | 0.8584 | 0.3972 | 0.6715 | 0.7561 | 0.6817 | 0.808 | 0.8202 | 0.4516 | 0.7653 | 0.8352 | 0.705 | 0.7654 | 0.7247 | 0.7811 | 0.709 | 0.8117 | 0.783 | 0.8333 | 0.5202 | 0.715 | 0.8842 | 0.9125 | 0.848 | 0.9222 |
| 0.7301 | 5.0 | 560 | 0.8271 | 0.7227 | 0.9141 | 0.8282 | 0.3326 | 0.6377 | 0.7385 | 0.6691 | 0.7986 | 0.8064 | 0.4591 | 0.7376 | 0.8487 | 0.7092 | 0.7942 | 0.7371 | 0.7902 | 0.7047 | 0.7969 | 0.7924 | 0.8481 | 0.5106 | 0.675 | 0.8083 | 0.8625 | 0.7967 | 0.8778 |
| 0.7197 | 6.0 | 672 | 0.8127 | 0.7561 | 0.9331 | 0.8718 | 0.3553 | 0.6725 | 0.7661 | 0.6901 | 0.8124 | 0.8204 | 0.4527 | 0.7664 | 0.8489 | 0.7203 | 0.7788 | 0.7444 | 0.8007 | 0.6974 | 0.8 | 0.7891 | 0.8148 | 0.5839 | 0.725 | 0.8725 | 0.9125 | 0.8853 | 0.9111 |
| 0.7041 | 7.0 | 784 | 0.7735 | 0.7637 | 0.9447 | 0.8864 | 0.4044 | 0.6757 | 0.7658 | 0.6935 | 0.8205 | 0.8326 | 0.5206 | 0.7725 | 0.8235 | 0.7446 | 0.8 | 0.7543 | 0.8154 | 0.7246 | 0.8209 | 0.7881 | 0.8407 | 0.5854 | 0.715 | 0.8675 | 0.925 | 0.8814 | 0.9111 |
| 0.7111 | 8.0 | 896 | 0.7977 | 0.7312 | 0.9297 | 0.8389 | 0.3618 | 0.6202 | 0.7366 | 0.6735 | 0.7931 | 0.8146 | 0.419 | 0.7552 | 0.8517 | 0.7106 | 0.7981 | 0.7485 | 0.8028 | 0.7261 | 0.8178 | 0.7699 | 0.8074 | 0.486 | 0.665 | 0.8737 | 0.9 | 0.8038 | 0.9111 |
| 0.6902 | 9.0 | 1008 | 0.7777 | 0.7398 | 0.9262 | 0.8423 | 0.4019 | 0.6672 | 0.7482 | 0.6855 | 0.8141 | 0.8267 | 0.4626 | 0.7678 | 0.849 | 0.7328 | 0.7981 | 0.7337 | 0.8035 | 0.7393 | 0.8442 | 0.777 | 0.8074 | 0.499 | 0.685 | 0.8638 | 0.9375 | 0.8332 | 0.9111 |
| 0.6712 | 10.0 | 1120 | 0.7881 | 0.7498 | 0.9418 | 0.8711 | 0.4102 | 0.6709 | 0.7691 | 0.6778 | 0.8058 | 0.815 | 0.4473 | 0.7647 | 0.843 | 0.6979 | 0.7712 | 0.7497 | 0.8014 | 0.7325 | 0.8325 | 0.7693 | 0.8111 | 0.5833 | 0.725 | 0.8505 | 0.875 | 0.8655 | 0.8889 |
| 0.6763 | 11.0 | 1232 | 0.7937 | 0.7666 | 0.9359 | 0.8797 | 0.3943 | 0.6884 | 0.7803 | 0.6931 | 0.8147 | 0.8237 | 0.4368 | 0.7714 | 0.8322 | 0.7247 | 0.7788 | 0.7458 | 0.7944 | 0.727 | 0.8215 | 0.7673 | 0.8037 | 0.5958 | 0.695 | 0.9138 | 0.95 | 0.8916 | 0.9222 |
| 0.6792 | 12.0 | 1344 | 0.8028 | 0.7465 | 0.9163 | 0.8525 | 0.3173 | 0.6541 | 0.7807 | 0.6781 | 0.8056 | 0.8187 | 0.3531 | 0.7869 | 0.882 | 0.7167 | 0.775 | 0.7393 | 0.7874 | 0.7217 | 0.8166 | 0.7675 | 0.7889 | 0.5595 | 0.695 | 0.886 | 0.9125 | 0.8346 | 0.9556 |
| 0.6844 | 13.0 | 1456 | 0.7748 | 0.7528 | 0.9263 | 0.8639 | 0.3421 | 0.6686 | 0.776 | 0.6968 | 0.8168 | 0.8265 | 0.4386 | 0.7729 | 0.8377 | 0.7173 | 0.7923 | 0.7556 | 0.8021 | 0.7344 | 0.8252 | 0.7799 | 0.8222 | 0.5131 | 0.695 | 0.9105 | 0.9375 | 0.8591 | 0.9111 |
| 0.667 | 14.0 | 1568 | 0.7876 | 0.7551 | 0.9325 | 0.8741 | 0.3361 | 0.6796 | 0.7733 | 0.6921 | 0.8056 | 0.8199 | 0.3841 | 0.7683 | 0.8661 | 0.7108 | 0.7865 | 0.7369 | 0.793 | 0.7277 | 0.8141 | 0.7921 | 0.8259 | 0.5695 | 0.685 | 0.8835 | 0.9125 | 0.8654 | 0.9222 |
| 0.6596 | 15.0 | 1680 | 0.7691 | 0.7683 | 0.9294 | 0.8897 | 0.3874 | 0.6794 | 0.787 | 0.6952 | 0.8234 | 0.8342 | 0.4475 | 0.7897 | 0.8442 | 0.7322 | 0.7923 | 0.7666 | 0.8231 | 0.7281 | 0.8307 | 0.8055 | 0.837 | 0.5706 | 0.72 | 0.8986 | 0.925 | 0.8762 | 0.9111 |
| 0.6349 | 16.0 | 1792 | 0.7581 | 0.7683 | 0.9414 | 0.8772 | 0.3607 | 0.6942 | 0.7746 | 0.7047 | 0.8307 | 0.8391 | 0.4571 | 0.8117 | 0.8584 | 0.7252 | 0.8019 | 0.7592 | 0.8161 | 0.7307 | 0.8331 | 0.8069 | 0.8444 | 0.5626 | 0.72 | 0.9082 | 0.925 | 0.8853 | 0.9333 |
| 0.6433 | 17.0 | 1904 | 0.7571 | 0.7604 | 0.9339 | 0.881 | 0.3855 | 0.6851 | 0.7799 | 0.6981 | 0.8191 | 0.839 | 0.4629 | 0.8256 | 0.8329 | 0.7378 | 0.8077 | 0.7639 | 0.8168 | 0.7335 | 0.835 | 0.7824 | 0.8222 | 0.5558 | 0.725 | 0.8643 | 0.9 | 0.8849 | 0.9667 |
| 0.6348 | 18.0 | 2016 | 0.7545 | 0.7708 | 0.9345 | 0.8934 | 0.3228 | 0.6765 | 0.7954 | 0.7003 | 0.8238 | 0.8348 | 0.4469 | 0.7905 | 0.8576 | 0.7379 | 0.8019 | 0.7763 | 0.8182 | 0.7278 | 0.8245 | 0.7998 | 0.8407 | 0.5623 | 0.7 | 0.9131 | 0.925 | 0.8783 | 0.9333 |
| 0.6322 | 19.0 | 2128 | 0.7642 | 0.7651 | 0.931 | 0.8875 | 0.3233 | 0.6763 | 0.7673 | 0.6998 | 0.818 | 0.8329 | 0.4618 | 0.7852 | 0.835 | 0.7146 | 0.7827 | 0.7603 | 0.8273 | 0.7405 | 0.8294 | 0.795 | 0.8333 | 0.54 | 0.71 | 0.9184 | 0.925 | 0.887 | 0.9222 |
| 0.6219 | 20.0 | 2240 | 0.7587 | 0.7702 | 0.9434 | 0.8849 | 0.4091 | 0.6879 | 0.7744 | 0.6981 | 0.8254 | 0.8447 | 0.4631 | 0.8049 | 0.8641 | 0.734 | 0.8019 | 0.7715 | 0.8252 | 0.7387 | 0.8356 | 0.8136 | 0.8481 | 0.5692 | 0.72 | 0.8794 | 0.9375 | 0.8849 | 0.9444 |
| 0.6105 | 21.0 | 2352 | 0.7471 | 0.7701 | 0.9393 | 0.8868 | 0.4202 | 0.678 | 0.7805 | 0.6996 | 0.8218 | 0.8356 | 0.4765 | 0.7799 | 0.8389 | 0.7193 | 0.7923 | 0.7781 | 0.8259 | 0.7343 | 0.8337 | 0.8069 | 0.8444 | 0.5684 | 0.715 | 0.9144 | 0.9375 | 0.8691 | 0.9 |
| 0.6123 | 22.0 | 2464 | 0.7461 | 0.7691 | 0.9426 | 0.8919 | 0.4237 | 0.6834 | 0.7659 | 0.6897 | 0.8198 | 0.8336 | 0.4648 | 0.7924 | 0.8395 | 0.7271 | 0.7942 | 0.7794 | 0.828 | 0.7359 | 0.8301 | 0.7978 | 0.8333 | 0.5777 | 0.715 | 0.8965 | 0.9125 | 0.8695 | 0.9222 |
| 0.6125 | 23.0 | 2576 | 0.7466 | 0.7649 | 0.9344 | 0.8811 | 0.4351 | 0.6806 | 0.7737 | 0.6987 | 0.821 | 0.8323 | 0.4805 | 0.7883 | 0.8369 | 0.7274 | 0.7962 | 0.7802 | 0.8259 | 0.7339 | 0.8221 | 0.7908 | 0.8296 | 0.5378 | 0.705 | 0.911 | 0.925 | 0.8735 | 0.9222 |
| 0.616 | 24.0 | 2688 | 0.7563 | 0.7612 | 0.9365 | 0.872 | 0.4332 | 0.6812 | 0.7718 | 0.698 | 0.8246 | 0.8337 | 0.4809 | 0.7915 | 0.8422 | 0.7376 | 0.8019 | 0.7817 | 0.8273 | 0.7296 | 0.8184 | 0.793 | 0.8333 | 0.5472 | 0.72 | 0.8612 | 0.9125 | 0.8778 | 0.9222 |
| 0.6079 | 25.0 | 2800 | 0.7451 | 0.7714 | 0.9402 | 0.8807 | 0.4279 | 0.6977 | 0.7784 | 0.704 | 0.8268 | 0.8378 | 0.4723 | 0.7949 | 0.8414 | 0.7325 | 0.7923 | 0.7845 | 0.8203 | 0.7354 | 0.8301 | 0.7962 | 0.837 | 0.5619 | 0.725 | 0.8914 | 0.9375 | 0.8976 | 0.9222 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"exit",
"fireextinguisher",
"chair",
"clock",
"trashbin",
"printer",
"screen"
] |
ivangorbachenko/detr-fashion-output |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-fashion-output
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2650
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
iancu003/yolo_finetuned_fruits |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8676
- Map: 0.5394
- Map 50: 0.8117
- Map 75: 0.5772
- Map Small: -1.0
- Map Medium: 0.5578
- Map Large: 0.5596
- Mar 1: 0.4162
- Mar 10: 0.6989
- Mar 100: 0.7526
- Mar Small: -1.0
- Mar Medium: 0.6964
- Mar Large: 0.7625
- Map Banana: 0.3767
- Mar 100 Banana: 0.7025
- Map Orange: 0.6021
- Mar 100 Orange: 0.781
- Map Apple: 0.6395
- Mar 100 Apple: 0.7743
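Map 50 and Map 75 above use the same detections and differ only in the IoU overlap a prediction must reach to count as a true positive (0.5 vs. 0.75). A minimal, self-contained sketch of that overlap test on made-up boxes:

```python
import torch

def box_iou_xyxy(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Intersection-over-union of two boxes in (x1, y1, x2, y2) form."""
    x1, y1 = torch.maximum(a[0], b[0]), torch.maximum(a[1], b[1])
    x2, y2 = torch.minimum(a[2], b[2]), torch.minimum(a[3], b[3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

pred = torch.tensor([10.0, 10.0, 50.0, 50.0])  # hypothetical prediction
gt = torch.tensor([15.0, 15.0, 55.0, 55.0])    # hypothetical ground truth

iou = box_iou_xyxy(pred, gt).item()  # ~0.62
hit_at_50 = iou >= 0.5   # this detection counts toward Map 50...
hit_at_75 = iou >= 0.75  # ...but not toward Map 75
```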
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 2.0901 | 0.0074 | 0.0224 | 0.0017 | -1.0 | 0.0028 | 0.008 | 0.0246 | 0.0869 | 0.231 | -1.0 | 0.2071 | 0.2142 | 0.0197 | 0.47 | 0.0003 | 0.0286 | 0.0023 | 0.1943 |
| No log | 2.0 | 120 | 1.7436 | 0.0145 | 0.0392 | 0.0075 | -1.0 | 0.0379 | 0.0145 | 0.1026 | 0.2509 | 0.3812 | -1.0 | 0.4464 | 0.3701 | 0.0175 | 0.465 | 0.0122 | 0.3071 | 0.0137 | 0.3714 |
| No log | 3.0 | 180 | 1.7765 | 0.0153 | 0.0438 | 0.0066 | -1.0 | 0.1254 | 0.0127 | 0.0923 | 0.2462 | 0.3911 | -1.0 | 0.3452 | 0.391 | 0.0193 | 0.4775 | 0.0085 | 0.15 | 0.0181 | 0.5457 |
| No log | 4.0 | 240 | 1.4905 | 0.0578 | 0.1483 | 0.0341 | -1.0 | 0.0389 | 0.0583 | 0.1225 | 0.2717 | 0.4299 | -1.0 | 0.325 | 0.4311 | 0.1009 | 0.565 | 0.0586 | 0.5333 | 0.014 | 0.1914 |
| No log | 5.0 | 300 | 1.5330 | 0.0456 | 0.1036 | 0.0321 | -1.0 | 0.1046 | 0.042 | 0.1628 | 0.3144 | 0.4846 | -1.0 | 0.3512 | 0.4991 | 0.0605 | 0.5575 | 0.0272 | 0.1762 | 0.0492 | 0.72 |
| No log | 6.0 | 360 | 1.4123 | 0.0756 | 0.1598 | 0.0707 | -1.0 | 0.1356 | 0.0839 | 0.2321 | 0.4085 | 0.5868 | -1.0 | 0.525 | 0.5984 | 0.0484 | 0.56 | 0.094 | 0.4833 | 0.0844 | 0.7171 |
| No log | 7.0 | 420 | 1.2390 | 0.0987 | 0.1985 | 0.0931 | -1.0 | 0.26 | 0.1056 | 0.2354 | 0.4165 | 0.5435 | -1.0 | 0.4881 | 0.5502 | 0.0766 | 0.61 | 0.0658 | 0.2262 | 0.1536 | 0.7943 |
| No log | 8.0 | 480 | 1.1741 | 0.135 | 0.229 | 0.1462 | -1.0 | 0.2255 | 0.1517 | 0.3017 | 0.5152 | 0.6331 | -1.0 | 0.5488 | 0.6469 | 0.1319 | 0.6275 | 0.118 | 0.5119 | 0.1551 | 0.76 |
| 1.5201 | 9.0 | 540 | 1.1199 | 0.144 | 0.2737 | 0.1613 | -1.0 | 0.2836 | 0.133 | 0.3014 | 0.5292 | 0.6615 | -1.0 | 0.6571 | 0.6651 | 0.1324 | 0.6325 | 0.1457 | 0.5833 | 0.1538 | 0.7686 |
| 1.5201 | 10.0 | 600 | 1.1057 | 0.1897 | 0.3545 | 0.2102 | -1.0 | 0.3063 | 0.2052 | 0.3206 | 0.5446 | 0.6786 | -1.0 | 0.625 | 0.6912 | 0.1053 | 0.62 | 0.2139 | 0.5929 | 0.25 | 0.8229 |
| 1.5201 | 11.0 | 660 | 1.0601 | 0.2859 | 0.5094 | 0.3305 | -1.0 | 0.2939 | 0.3321 | 0.3744 | 0.605 | 0.7146 | -1.0 | 0.6524 | 0.7286 | 0.1843 | 0.6425 | 0.3504 | 0.7214 | 0.323 | 0.78 |
| 1.5201 | 12.0 | 720 | 0.9949 | 0.4173 | 0.6847 | 0.4656 | -1.0 | 0.4611 | 0.4292 | 0.368 | 0.6462 | 0.7211 | -1.0 | 0.6821 | 0.7285 | 0.2863 | 0.68 | 0.4488 | 0.7405 | 0.5169 | 0.7429 |
| 1.5201 | 13.0 | 780 | 0.9413 | 0.4504 | 0.7103 | 0.4867 | -1.0 | 0.5579 | 0.4581 | 0.3937 | 0.664 | 0.7316 | -1.0 | 0.6881 | 0.7424 | 0.2734 | 0.6525 | 0.5246 | 0.7452 | 0.5532 | 0.7971 |
| 1.5201 | 14.0 | 840 | 0.9419 | 0.4598 | 0.7369 | 0.4896 | -1.0 | 0.449 | 0.4773 | 0.3844 | 0.6482 | 0.7272 | -1.0 | 0.6917 | 0.7331 | 0.3544 | 0.6825 | 0.4781 | 0.719 | 0.5468 | 0.78 |
| 1.5201 | 15.0 | 900 | 0.8860 | 0.4941 | 0.7598 | 0.5238 | -1.0 | 0.5195 | 0.5081 | 0.408 | 0.6824 | 0.73 | -1.0 | 0.6786 | 0.7407 | 0.3449 | 0.6575 | 0.5216 | 0.7381 | 0.6159 | 0.7943 |
| 1.5201 | 16.0 | 960 | 0.8809 | 0.5304 | 0.8082 | 0.5719 | -1.0 | 0.5741 | 0.5432 | 0.4173 | 0.6913 | 0.7546 | -1.0 | 0.6952 | 0.7664 | 0.3713 | 0.69 | 0.5719 | 0.7595 | 0.648 | 0.8143 |
| 0.8101 | 17.0 | 1020 | 0.9158 | 0.4802 | 0.7448 | 0.5285 | -1.0 | 0.5376 | 0.4955 | 0.4039 | 0.6769 | 0.7491 | -1.0 | 0.6548 | 0.7643 | 0.3247 | 0.6925 | 0.4984 | 0.7548 | 0.6176 | 0.8 |
| 0.8101 | 18.0 | 1080 | 0.8549 | 0.5396 | 0.8097 | 0.6048 | -1.0 | 0.5375 | 0.5553 | 0.406 | 0.6998 | 0.7552 | -1.0 | 0.725 | 0.7632 | 0.3893 | 0.68 | 0.5748 | 0.7714 | 0.6548 | 0.8143 |
| 0.8101 | 19.0 | 1140 | 0.8724 | 0.5418 | 0.8146 | 0.6113 | -1.0 | 0.5818 | 0.551 | 0.4085 | 0.6925 | 0.7454 | -1.0 | 0.6893 | 0.7561 | 0.4059 | 0.69 | 0.5754 | 0.769 | 0.6442 | 0.7771 |
| 0.8101 | 20.0 | 1200 | 0.8617 | 0.5549 | 0.8222 | 0.6196 | -1.0 | 0.6036 | 0.5666 | 0.4141 | 0.6867 | 0.7508 | -1.0 | 0.6738 | 0.7637 | 0.3944 | 0.7025 | 0.6056 | 0.7786 | 0.6646 | 0.7714 |
| 0.8101 | 21.0 | 1260 | 0.8689 | 0.5427 | 0.8069 | 0.5713 | -1.0 | 0.562 | 0.5591 | 0.4159 | 0.689 | 0.7415 | -1.0 | 0.6631 | 0.7545 | 0.3838 | 0.6825 | 0.5622 | 0.7619 | 0.6822 | 0.78 |
| 0.8101 | 22.0 | 1320 | 0.8742 | 0.5497 | 0.8267 | 0.6029 | -1.0 | 0.5915 | 0.563 | 0.4059 | 0.6873 | 0.7472 | -1.0 | 0.681 | 0.7589 | 0.3903 | 0.695 | 0.5687 | 0.7667 | 0.6902 | 0.78 |
| 0.8101 | 23.0 | 1380 | 0.8810 | 0.5515 | 0.8169 | 0.6052 | -1.0 | 0.5805 | 0.5659 | 0.4156 | 0.6908 | 0.7519 | -1.0 | 0.6881 | 0.7627 | 0.3879 | 0.7075 | 0.5915 | 0.7595 | 0.675 | 0.7886 |
| 0.8101 | 24.0 | 1440 | 0.8649 | 0.5516 | 0.8241 | 0.6151 | -1.0 | 0.5987 | 0.5665 | 0.4212 | 0.6886 | 0.7512 | -1.0 | 0.6893 | 0.7621 | 0.3902 | 0.7025 | 0.6039 | 0.7738 | 0.6607 | 0.7771 |
| 0.5872 | 25.0 | 1500 | 0.8597 | 0.5432 | 0.8141 | 0.5873 | -1.0 | 0.5651 | 0.5612 | 0.4228 | 0.6995 | 0.7556 | -1.0 | 0.6964 | 0.7658 | 0.3837 | 0.705 | 0.6076 | 0.7905 | 0.6384 | 0.7714 |
| 0.5872 | 26.0 | 1560 | 0.8558 | 0.5455 | 0.8128 | 0.5911 | -1.0 | 0.5707 | 0.5635 | 0.4179 | 0.6965 | 0.7549 | -1.0 | 0.6893 | 0.766 | 0.3787 | 0.7075 | 0.6146 | 0.7857 | 0.6432 | 0.7714 |
| 0.5872 | 27.0 | 1620 | 0.8620 | 0.5494 | 0.8133 | 0.6002 | -1.0 | 0.5652 | 0.5681 | 0.4186 | 0.7004 | 0.7534 | -1.0 | 0.6964 | 0.7634 | 0.3837 | 0.7025 | 0.6187 | 0.7833 | 0.6459 | 0.7743 |
| 0.5872 | 28.0 | 1680 | 0.8668 | 0.5457 | 0.8118 | 0.589 | -1.0 | 0.5653 | 0.5655 | 0.4186 | 0.6971 | 0.7525 | -1.0 | 0.6964 | 0.7626 | 0.3839 | 0.7 | 0.6146 | 0.7833 | 0.6387 | 0.7743 |
| 0.5872 | 29.0 | 1740 | 0.8677 | 0.5392 | 0.8117 | 0.577 | -1.0 | 0.5573 | 0.5593 | 0.4162 | 0.6989 | 0.7526 | -1.0 | 0.6964 | 0.7625 | 0.3765 | 0.7025 | 0.6019 | 0.781 | 0.6392 | 0.7743 |
| 0.5872 | 30.0 | 1800 | 0.8676 | 0.5394 | 0.8117 | 0.5772 | -1.0 | 0.5578 | 0.5596 | 0.4162 | 0.6989 | 0.7526 | -1.0 | 0.6964 | 0.7625 | 0.3767 | 0.7025 | 0.6021 | 0.781 | 0.6395 | 0.7743 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"banana",
"orange",
"apple"
] |
zhengyu998/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1505
- Map: 0.2397
- Map 50: 0.4842
- Map 75: 0.2145
- Map Small: 0.0771
- Map Medium: 0.1892
- Map Large: 0.3666
- Mar 1: 0.2729
- Mar 10: 0.4204
- Mar 100: 0.4418
- Mar Small: 0.1732
- Mar Medium: 0.3953
- Mar Large: 0.6037
- Map Coverall: 0.5417
- Mar 100 Coverall: 0.6581
- Map Face Shield: 0.1556
- Mar 100 Face Shield: 0.4253
- Map Gloves: 0.1615
- Mar 100 Gloves: 0.3464
- Map Goggles: 0.0883
- Mar 100 Goggles: 0.3831
- Map Mask: 0.2513
- Mar 100 Mask: 0.396
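The overall Map and Mar 100 above are simply the unweighted means of the per-class values that follow them, which is easy to verify:

```python
# Per-class values copied from the evaluation summary above.
ap = {"coverall": 0.5417, "face_shield": 0.1556, "gloves": 0.1615,
      "goggles": 0.0883, "mask": 0.2513}
ar = {"coverall": 0.6581, "face_shield": 0.4253, "gloves": 0.3464,
      "goggles": 0.3831, "mask": 0.3960}

mean_ap = sum(ap.values()) / len(ap)
mean_ar = sum(ar.values()) / len(ar)

print(round(mean_ap, 4))  # 0.2397 -- matches the reported Map
print(round(mean_ar, 4))  # 0.4418 -- matches the reported Mar 100
```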
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 1.9254 | 0.0089 | 0.0292 | 0.0041 | 0.0076 | 0.0056 | 0.0152 | 0.0283 | 0.1434 | 0.1834 | 0.1216 | 0.1419 | 0.2288 | 0.021 | 0.3333 | 0.0041 | 0.138 | 0.0023 | 0.1237 | 0.0008 | 0.0646 | 0.0163 | 0.2573 |
| No log | 2.0 | 214 | 1.7315 | 0.0316 | 0.0824 | 0.0217 | 0.0064 | 0.0117 | 0.0431 | 0.0674 | 0.1563 | 0.2029 | 0.0736 | 0.137 | 0.2704 | 0.1228 | 0.4757 | 0.0142 | 0.0797 | 0.0036 | 0.1656 | 0.0012 | 0.0308 | 0.0164 | 0.2627 |
| No log | 3.0 | 321 | 1.6433 | 0.0295 | 0.0741 | 0.0226 | 0.0058 | 0.0264 | 0.0369 | 0.0783 | 0.1926 | 0.2497 | 0.0829 | 0.1988 | 0.3205 | 0.0975 | 0.5032 | 0.0078 | 0.1772 | 0.0044 | 0.192 | 0.0019 | 0.0923 | 0.0358 | 0.284 |
| No log | 4.0 | 428 | 1.5178 | 0.0511 | 0.1239 | 0.0353 | 0.0198 | 0.0536 | 0.067 | 0.1029 | 0.223 | 0.2837 | 0.1222 | 0.2297 | 0.3573 | 0.1569 | 0.6212 | 0.0298 | 0.1848 | 0.0053 | 0.217 | 0.0067 | 0.0508 | 0.0567 | 0.3449 |
| 2.3097 | 5.0 | 535 | 1.4526 | 0.0713 | 0.1539 | 0.0603 | 0.0169 | 0.0549 | 0.089 | 0.1343 | 0.277 | 0.3258 | 0.1162 | 0.268 | 0.4345 | 0.2458 | 0.6248 | 0.0208 | 0.2405 | 0.0144 | 0.2906 | 0.0139 | 0.1292 | 0.0617 | 0.344 |
| 2.3097 | 6.0 | 642 | 1.5010 | 0.0801 | 0.1644 | 0.0688 | 0.0056 | 0.0541 | 0.0964 | 0.1051 | 0.249 | 0.295 | 0.1058 | 0.2399 | 0.3782 | 0.3258 | 0.6401 | 0.0106 | 0.1873 | 0.0087 | 0.2219 | 0.0063 | 0.1169 | 0.0494 | 0.3089 |
| 2.3097 | 7.0 | 749 | 1.4414 | 0.1159 | 0.248 | 0.1043 | 0.0213 | 0.0832 | 0.1497 | 0.1409 | 0.3233 | 0.3544 | 0.1386 | 0.285 | 0.5048 | 0.401 | 0.6419 | 0.0515 | 0.3228 | 0.024 | 0.2634 | 0.0076 | 0.1954 | 0.0954 | 0.3484 |
| 2.3097 | 8.0 | 856 | 1.3548 | 0.1377 | 0.2836 | 0.1153 | 0.0262 | 0.1127 | 0.1769 | 0.1715 | 0.3524 | 0.3806 | 0.1773 | 0.3246 | 0.5433 | 0.4279 | 0.6063 | 0.0503 | 0.3291 | 0.0598 | 0.3241 | 0.0244 | 0.2769 | 0.126 | 0.3667 |
| 2.3097 | 9.0 | 963 | 1.3714 | 0.1387 | 0.3026 | 0.1118 | 0.0471 | 0.1076 | 0.1801 | 0.1768 | 0.3338 | 0.3622 | 0.1575 | 0.3076 | 0.4957 | 0.4347 | 0.6207 | 0.0763 | 0.3405 | 0.0557 | 0.2812 | 0.0108 | 0.2338 | 0.1161 | 0.3347 |
| 1.266 | 10.0 | 1070 | 1.3475 | 0.147 | 0.3108 | 0.1229 | 0.054 | 0.1173 | 0.2096 | 0.1726 | 0.3343 | 0.3685 | 0.1741 | 0.3064 | 0.5156 | 0.4417 | 0.6054 | 0.0569 | 0.319 | 0.0784 | 0.3071 | 0.0236 | 0.2646 | 0.1343 | 0.3462 |
| 1.266 | 11.0 | 1177 | 1.3020 | 0.1686 | 0.3368 | 0.1441 | 0.0417 | 0.1373 | 0.2309 | 0.196 | 0.3778 | 0.4038 | 0.1531 | 0.3567 | 0.5621 | 0.4751 | 0.6387 | 0.0649 | 0.3861 | 0.0861 | 0.3192 | 0.0425 | 0.2938 | 0.1745 | 0.3813 |
| 1.266 | 12.0 | 1284 | 1.2834 | 0.1783 | 0.3679 | 0.1553 | 0.0602 | 0.1293 | 0.2556 | 0.2056 | 0.3728 | 0.4095 | 0.1542 | 0.3621 | 0.5692 | 0.4902 | 0.6356 | 0.0793 | 0.4152 | 0.1067 | 0.3027 | 0.0312 | 0.3231 | 0.1842 | 0.3711 |
| 1.266 | 13.0 | 1391 | 1.2809 | 0.1884 | 0.3905 | 0.1642 | 0.0763 | 0.1413 | 0.2712 | 0.2209 | 0.3812 | 0.4131 | 0.149 | 0.3702 | 0.5711 | 0.5076 | 0.6514 | 0.1031 | 0.4241 | 0.1121 | 0.3152 | 0.0267 | 0.3138 | 0.1927 | 0.3609 |
| 1.266 | 14.0 | 1498 | 1.2472 | 0.2063 | 0.4264 | 0.1738 | 0.0719 | 0.165 | 0.2975 | 0.2314 | 0.392 | 0.4239 | 0.1678 | 0.3819 | 0.5731 | 0.5065 | 0.6468 | 0.1438 | 0.443 | 0.1169 | 0.3321 | 0.0456 | 0.3185 | 0.2188 | 0.3791 |
| 1.1184 | 15.0 | 1605 | 1.2362 | 0.1995 | 0.4184 | 0.1744 | 0.0717 | 0.1504 | 0.3102 | 0.2327 | 0.3969 | 0.4225 | 0.1598 | 0.3799 | 0.5834 | 0.5193 | 0.6414 | 0.1235 | 0.4139 | 0.1171 | 0.3237 | 0.0365 | 0.3631 | 0.201 | 0.3702 |
| 1.1184 | 16.0 | 1712 | 1.2272 | 0.2058 | 0.4247 | 0.1817 | 0.0802 | 0.1523 | 0.3163 | 0.2416 | 0.4039 | 0.4325 | 0.1692 | 0.381 | 0.6015 | 0.5089 | 0.6514 | 0.1292 | 0.4456 | 0.1208 | 0.3366 | 0.0421 | 0.3431 | 0.2278 | 0.3858 |
| 1.1184 | 17.0 | 1819 | 1.2129 | 0.2126 | 0.4398 | 0.1768 | 0.0687 | 0.1595 | 0.3418 | 0.2568 | 0.4052 | 0.4281 | 0.1488 | 0.3755 | 0.5999 | 0.5196 | 0.6505 | 0.1524 | 0.4405 | 0.1173 | 0.3179 | 0.0507 | 0.3492 | 0.2228 | 0.3822 |
| 1.1184 | 18.0 | 1926 | 1.1863 | 0.2217 | 0.4585 | 0.1855 | 0.0758 | 0.1655 | 0.3558 | 0.2671 | 0.418 | 0.4418 | 0.1641 | 0.3919 | 0.6083 | 0.5137 | 0.6491 | 0.1634 | 0.4608 | 0.1441 | 0.3326 | 0.0524 | 0.3631 | 0.2348 | 0.4036 |
| 0.9987 | 19.0 | 2033 | 1.1810 | 0.2248 | 0.4596 | 0.1896 | 0.085 | 0.1722 | 0.3442 | 0.2613 | 0.4204 | 0.441 | 0.1612 | 0.3939 | 0.6136 | 0.5193 | 0.6541 | 0.1567 | 0.4405 | 0.1443 | 0.3335 | 0.0503 | 0.3708 | 0.2533 | 0.4062 |
| 0.9987 | 20.0 | 2140 | 1.1736 | 0.2239 | 0.4592 | 0.1928 | 0.0785 | 0.1673 | 0.351 | 0.265 | 0.4142 | 0.4379 | 0.1884 | 0.3805 | 0.6085 | 0.5237 | 0.65 | 0.1457 | 0.4342 | 0.1585 | 0.3415 | 0.0583 | 0.3785 | 0.2332 | 0.3853 |
| 0.9987 | 21.0 | 2247 | 1.1634 | 0.2311 | 0.4658 | 0.2071 | 0.0757 | 0.1792 | 0.3625 | 0.2713 | 0.4179 | 0.4377 | 0.1625 | 0.3869 | 0.6066 | 0.5357 | 0.6572 | 0.1398 | 0.4241 | 0.1576 | 0.342 | 0.0803 | 0.3631 | 0.2423 | 0.4022 |
| 0.9987 | 22.0 | 2354 | 1.1715 | 0.2264 | 0.4584 | 0.2126 | 0.0775 | 0.179 | 0.3555 | 0.2674 | 0.4136 | 0.4337 | 0.1694 | 0.3896 | 0.5918 | 0.5298 | 0.65 | 0.1425 | 0.4165 | 0.1609 | 0.3442 | 0.0645 | 0.3662 | 0.2341 | 0.3916 |
| 0.9987 | 23.0 | 2461 | 1.1680 | 0.2304 | 0.4713 | 0.2057 | 0.0824 | 0.1768 | 0.3583 | 0.2659 | 0.4208 | 0.4387 | 0.1748 | 0.3881 | 0.6052 | 0.538 | 0.6545 | 0.1487 | 0.4329 | 0.1599 | 0.3353 | 0.0588 | 0.3831 | 0.2468 | 0.3876 |
| 0.9095 | 24.0 | 2568 | 1.1550 | 0.2405 | 0.4887 | 0.2174 | 0.0799 | 0.1828 | 0.3681 | 0.2698 | 0.4198 | 0.4399 | 0.1723 | 0.39 | 0.602 | 0.5444 | 0.659 | 0.1684 | 0.4241 | 0.1645 | 0.3411 | 0.0824 | 0.3877 | 0.2428 | 0.3876 |
| 0.9095 | 25.0 | 2675 | 1.1538 | 0.2397 | 0.488 | 0.2138 | 0.0792 | 0.1892 | 0.3626 | 0.273 | 0.423 | 0.4439 | 0.1731 | 0.3978 | 0.6037 | 0.5364 | 0.6581 | 0.1658 | 0.4392 | 0.1641 | 0.3433 | 0.0838 | 0.3846 | 0.2486 | 0.3942 |
| 0.9095 | 26.0 | 2782 | 1.1572 | 0.2427 | 0.4879 | 0.2162 | 0.0775 | 0.1905 | 0.3668 | 0.2728 | 0.4217 | 0.4414 | 0.1622 | 0.3959 | 0.6015 | 0.5404 | 0.6577 | 0.1703 | 0.4329 | 0.1614 | 0.342 | 0.091 | 0.38 | 0.2502 | 0.3942 |
| 0.9095 | 27.0 | 2889 | 1.1502 | 0.239 | 0.4833 | 0.2093 | 0.075 | 0.1866 | 0.3686 | 0.2698 | 0.4184 | 0.4403 | 0.1693 | 0.3919 | 0.6056 | 0.5419 | 0.6581 | 0.1542 | 0.4165 | 0.1592 | 0.3455 | 0.0893 | 0.3831 | 0.2506 | 0.3982 |
| 0.9095 | 28.0 | 2996 | 1.1521 | 0.2399 | 0.4842 | 0.2118 | 0.0775 | 0.1882 | 0.3694 | 0.2705 | 0.4206 | 0.4436 | 0.1705 | 0.3977 | 0.6073 | 0.5412 | 0.6577 | 0.1545 | 0.4241 | 0.1615 | 0.3464 | 0.0909 | 0.3923 | 0.2516 | 0.3973 |
| 0.858 | 29.0 | 3103 | 1.1511 | 0.2399 | 0.4839 | 0.2146 | 0.0767 | 0.1895 | 0.3676 | 0.2732 | 0.4208 | 0.4422 | 0.1726 | 0.3959 | 0.6043 | 0.5422 | 0.6586 | 0.1557 | 0.4253 | 0.1618 | 0.3469 | 0.0886 | 0.3846 | 0.2512 | 0.3956 |
| 0.858 | 30.0 | 3210 | 1.1505 | 0.2397 | 0.4842 | 0.2145 | 0.0771 | 0.1892 | 0.3666 | 0.2729 | 0.4204 | 0.4418 | 0.1732 | 0.3953 | 0.6037 | 0.5417 | 0.6581 | 0.1556 | 0.4253 | 0.1615 | 0.3464 | 0.0883 | 0.3831 | 0.2513 | 0.396 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.4.1+cu118
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
vs1739561/COL780 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
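Pending the card's own snippet, a generic Transformers object-detection starting point might look like the sketch below. The checkpoint id and blank test image are placeholders (a small public checkpoint named elsewhere in these cards is used so the snippet runs; substitute this repository's model id), not code confirmed by the authors:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder checkpoint; replace with this repository's model id.
checkpoint = "hustvl/yolos-tiny"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.new("RGB", (640, 480))  # stand-in for a real photo
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Turn raw logits/boxes into thresholded detections in pixel coordinates.
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=[image.size[::-1]]
)[0]
print(sorted(results.keys()))  # ['boxes', 'labels', 'scores']
```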
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8"
] |
archiii/detr-resnet-50-dc5-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3703
- Map: 0.3703
- Map 50: 0.4284
- Map 75: 0.3912
- Map Small: -1.0
- Map Medium: -1.0
- Map Large: 0.3703
- Mar 1: 0.4491
- Mar 10: 0.5038
- Mar 100: 0.5934
- Mar Small: -1.0
- Mar Medium: -1.0
- Mar Large: 0.5934
- Map Metal: 0.4782
- Mar 100 Metal: 0.6095
- Map Paper: 0.0261
- Mar 100 Paper: 0.4483
- Map Plastic: 0.6064
- Mar 100 Plastic: 0.7225
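Per-class scores like the ones above are computed from thresholded detections. The sketch below shows, on made-up tensors, how raw DETR-style outputs (per-query class logits with a trailing "no object" column, plus normalized boxes) are typically filtered; in practice the image processor's post-processing helper does this, so everything here is illustrative:

```python
import torch

id2label = {0: "metal", 1: "paper", 2: "plastic"}
num_queries, num_labels = 5, len(id2label)  # toy sizes

# Fake model outputs: the extra trailing logit column is "no object".
logits = torch.full((num_queries, num_labels + 1), -4.0)
logits[0, 2] = 4.0                          # query 0 confidently says "plastic"
boxes_cxcywh = torch.rand(num_queries, 4)   # normalized (cx, cy, w, h)

probs = logits.softmax(-1)[:, :-1]          # drop the no-object column
scores, labels = probs.max(-1)
keep = scores > 0.5                         # confidence threshold

detections = [
    {"score": s.item(), "label": id2label[l.item()], "box": b.tolist()}
    for s, l, b in zip(scores[keep], labels[keep], boxes_cxcywh[keep])
]
print(len(detections))  # 1: only the confident query survives
```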
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Background | Mar 100 Background | Map Metal | Mar 100 Metal | Map Paper | Mar 100 Paper | Map Plastic | Mar 100 Plastic |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------:|:------------------:|:---------:|:-------------:|:---------:|:-------------:|:-----------:|:---------------:|
| 1.9796 | 0.1018 | 50 | 2.1283 | 0.0237 | 0.0421 | 0.0238 | -1.0 | -1.0 | 0.0284 | 0.1464 | 0.2542 | 0.3226 | -1.0 | -1.0 | 0.3226 | -1.0 | -1.0 | 0.0546 | 0.6222 | 0.0015 | 0.1103 | 0.015 | 0.2353 |
| 1.9882 | 0.2037 | 100 | 1.6940 | 0.0243 | 0.0456 | 0.0214 | -1.0 | -1.0 | 0.0261 | 0.162 | 0.243 | 0.3188 | -1.0 | -1.0 | 0.3188 | -1.0 | -1.0 | 0.0461 | 0.7143 | 0.0028 | 0.0655 | 0.024 | 0.1765 |
| 2.1489 | 0.3055 | 150 | 1.3699 | 0.0273 | 0.053 | 0.026 | -1.0 | -1.0 | 0.0284 | 0.2013 | 0.2889 | 0.3846 | -1.0 | -1.0 | 0.3846 | -1.0 | -1.0 | 0.0474 | 0.719 | 0.0193 | 0.2759 | 0.0153 | 0.1588 |
| 1.4834 | 0.4073 | 200 | 1.1051 | 0.026 | 0.0432 | 0.0272 | -1.0 | -1.0 | 0.0274 | 0.1959 | 0.2952 | 0.3738 | -1.0 | -1.0 | 0.3738 | -1.0 | -1.0 | 0.06 | 0.846 | 0.0009 | 0.0655 | 0.0172 | 0.2098 |
| 0.8501 | 0.5092 | 250 | 0.9289 | 0.0354 | 0.0526 | 0.0387 | -1.0 | -1.0 | 0.0361 | 0.2408 | 0.3419 | 0.4098 | -1.0 | -1.0 | 0.4098 | -1.0 | -1.0 | 0.0755 | 0.8651 | 0.0017 | 0.1034 | 0.0289 | 0.2608 |
| 1.0472 | 0.6110 | 300 | 0.8789 | 0.0271 | 0.0397 | 0.028 | -1.0 | -1.0 | 0.028 | 0.2794 | 0.4266 | 0.4883 | -1.0 | -1.0 | 0.4883 | -1.0 | -1.0 | 0.0637 | 0.8508 | 0.0027 | 0.1897 | 0.015 | 0.4245 |
| 1.09 | 0.7128 | 350 | 0.8033 | 0.034 | 0.0482 | 0.0364 | -1.0 | -1.0 | 0.0355 | 0.2929 | 0.4048 | 0.4568 | -1.0 | -1.0 | 0.4568 | -1.0 | -1.0 | 0.0848 | 0.8841 | 0.003 | 0.1724 | 0.0142 | 0.3137 |
| 1.1819 | 0.8147 | 400 | 0.7859 | 0.0354 | 0.0485 | 0.039 | -1.0 | -1.0 | 0.0367 | 0.2686 | 0.3644 | 0.4162 | -1.0 | -1.0 | 0.4162 | -1.0 | -1.0 | 0.0947 | 0.8825 | 0.0018 | 0.0966 | 0.0098 | 0.2696 |
| 0.9802 | 0.9165 | 450 | 0.7289 | 0.0409 | 0.0559 | 0.0441 | -1.0 | -1.0 | 0.041 | 0.2321 | 0.3313 | 0.3894 | -1.0 | -1.0 | 0.3894 | -1.0 | -1.0 | 0.1141 | 0.8857 | 0.0008 | 0.0276 | 0.0078 | 0.2549 |
| 0.5236 | 1.0183 | 500 | 0.7572 | 0.0175 | 0.028 | 0.019 | -1.0 | -1.0 | 0.0176 | 0.1883 | 0.3483 | 0.434 | -1.0 | -1.0 | 0.434 | -1.0 | -1.0 | 0.0411 | 0.8254 | 0.0 | 0.0 | 0.0113 | 0.4765 |
| 0.7924 | 1.1202 | 550 | 0.7086 | 0.0306 | 0.0434 | 0.0344 | -1.0 | -1.0 | 0.0309 | 0.2802 | 0.4428 | 0.4935 | -1.0 | -1.0 | 0.4935 | -1.0 | -1.0 | 0.0687 | 0.846 | 0.0001 | 0.0069 | 0.0228 | 0.6275 |
| 1.0112 | 1.2220 | 600 | 0.6612 | 0.0405 | 0.0504 | 0.0436 | -1.0 | -1.0 | 0.0411 | 0.2854 | 0.4361 | 0.4979 | -1.0 | -1.0 | 0.4979 | -1.0 | -1.0 | 0.0938 | 0.8063 | 0.0 | 0.0 | 0.0277 | 0.6873 |
| 0.7764 | 1.3238 | 650 | 0.6513 | 0.031 | 0.0421 | 0.0343 | -1.0 | -1.0 | 0.0331 | 0.2634 | 0.4552 | 0.5265 | -1.0 | -1.0 | 0.5265 | -1.0 | -1.0 | 0.0614 | 0.846 | 0.0013 | 0.0414 | 0.0304 | 0.6922 |
| 0.7316 | 1.4257 | 700 | 0.6150 | 0.0367 | 0.0451 | 0.0409 | -1.0 | -1.0 | 0.0399 | 0.3747 | 0.513 | 0.5556 | -1.0 | -1.0 | 0.5556 | -1.0 | -1.0 | 0.0579 | 0.873 | 0.001 | 0.0448 | 0.0514 | 0.749 |
| 0.5439 | 1.5275 | 750 | 0.7063 | 0.0334 | 0.0471 | 0.038 | -1.0 | -1.0 | 0.0342 | 0.2454 | 0.4007 | 0.4784 | -1.0 | -1.0 | 0.4784 | -1.0 | -1.0 | 0.0568 | 0.6016 | 0.0005 | 0.0621 | 0.0429 | 0.7716 |
| 0.5359 | 1.6293 | 800 | 0.6803 | 0.0407 | 0.0578 | 0.0432 | -1.0 | -1.0 | 0.0415 | 0.2942 | 0.4281 | 0.4878 | -1.0 | -1.0 | 0.4878 | -1.0 | -1.0 | 0.0621 | 0.5841 | 0.0011 | 0.0793 | 0.059 | 0.8 |
| 0.8857 | 1.7312 | 850 | 0.6697 | 0.0539 | 0.077 | 0.0623 | -1.0 | -1.0 | 0.0556 | 0.3079 | 0.431 | 0.507 | -1.0 | -1.0 | 0.507 | -1.0 | -1.0 | 0.0609 | 0.5778 | 0.0353 | 0.1414 | 0.0654 | 0.802 |
| 1.4486 | 1.8330 | 900 | 0.6321 | 0.0436 | 0.0576 | 0.0514 | -1.0 | -1.0 | 0.05 | 0.3419 | 0.473 | 0.5298 | -1.0 | -1.0 | 0.5298 | -1.0 | -1.0 | 0.0473 | 0.6937 | 0.0103 | 0.1034 | 0.0732 | 0.7922 |
| 0.7727 | 1.9348 | 950 | 0.6602 | 0.0298 | 0.0406 | 0.034 | -1.0 | -1.0 | 0.0337 | 0.3176 | 0.4942 | 0.5725 | -1.0 | -1.0 | 0.5725 | -1.0 | -1.0 | 0.0371 | 0.7238 | 0.0108 | 0.2241 | 0.0416 | 0.7696 |
| 1.6886 | 2.0367 | 1000 | 0.6232 | 0.0744 | 0.1046 | 0.0872 | -1.0 | -1.0 | 0.0766 | 0.3938 | 0.5073 | 0.5349 | -1.0 | -1.0 | 0.5349 | -1.0 | -1.0 | 0.15 | 0.7873 | 0.0028 | 0.0241 | 0.0705 | 0.7931 |
| 0.5422 | 2.1385 | 1050 | 0.5752 | 0.0743 | 0.0964 | 0.086 | -1.0 | -1.0 | 0.0809 | 0.4497 | 0.5463 | 0.5649 | -1.0 | -1.0 | 0.5649 | -1.0 | -1.0 | 0.1314 | 0.8349 | 0.008 | 0.0655 | 0.0834 | 0.7941 |
| 0.5052 | 2.2403 | 1100 | 0.5267 | 0.0661 | 0.0825 | 0.0702 | -1.0 | -1.0 | 0.0796 | 0.4281 | 0.5492 | 0.5817 | -1.0 | -1.0 | 0.5817 | -1.0 | -1.0 | 0.0794 | 0.7667 | 0.0424 | 0.0862 | 0.0766 | 0.8922 |
| 0.6956 | 2.3422 | 1150 | 0.5499 | 0.0647 | 0.0912 | 0.0715 | -1.0 | -1.0 | 0.073 | 0.3427 | 0.4879 | 0.5429 | -1.0 | -1.0 | 0.5429 | -1.0 | -1.0 | 0.0743 | 0.6571 | 0.0556 | 0.0724 | 0.0644 | 0.899 |
| 0.5097 | 2.4440 | 1200 | 0.5346 | 0.0748 | 0.0924 | 0.0858 | -1.0 | -1.0 | 0.0819 | 0.4112 | 0.5247 | 0.558 | -1.0 | -1.0 | 0.558 | -1.0 | -1.0 | 0.1015 | 0.7254 | 0.0332 | 0.0379 | 0.0898 | 0.9108 |
| 0.5358 | 2.5458 | 1250 | 0.5622 | 0.095 | 0.1303 | 0.1139 | -1.0 | -1.0 | 0.0971 | 0.4281 | 0.533 | 0.5571 | -1.0 | -1.0 | 0.5571 | -1.0 | -1.0 | 0.1414 | 0.7683 | 0.0284 | 0.0345 | 0.1151 | 0.8686 |
| 1.2882 | 2.6477 | 1300 | 0.5378 | 0.0764 | 0.0945 | 0.089 | -1.0 | -1.0 | 0.0852 | 0.3972 | 0.4941 | 0.5378 | -1.0 | -1.0 | 0.5378 | -1.0 | -1.0 | 0.1073 | 0.6603 | 0.034 | 0.0483 | 0.0878 | 0.9049 |
| 0.8294 | 2.7495 | 1350 | 0.5371 | 0.0767 | 0.0961 | 0.0866 | -1.0 | -1.0 | 0.0844 | 0.4005 | 0.5034 | 0.5468 | -1.0 | -1.0 | 0.5468 | -1.0 | -1.0 | 0.1278 | 0.6302 | 0.0068 | 0.0966 | 0.0955 | 0.9137 |
| 0.9082 | 2.8513 | 1400 | 0.5802 | 0.0719 | 0.0917 | 0.0829 | -1.0 | -1.0 | 0.0781 | 0.3829 | 0.4934 | 0.536 | -1.0 | -1.0 | 0.536 | -1.0 | -1.0 | 0.0747 | 0.654 | 0.0037 | 0.1138 | 0.1374 | 0.8402 |
| 0.8392 | 2.9532 | 1450 | 0.5597 | 0.0599 | 0.0715 | 0.0663 | -1.0 | -1.0 | 0.064 | 0.3819 | 0.5131 | 0.5684 | -1.0 | -1.0 | 0.5684 | -1.0 | -1.0 | 0.0673 | 0.7476 | 0.0009 | 0.131 | 0.1115 | 0.8265 |
| 0.601 | 3.0550 | 1500 | 0.5938 | 0.0492 | 0.0625 | 0.0532 | -1.0 | -1.0 | 0.0545 | 0.3377 | 0.4861 | 0.5589 | -1.0 | -1.0 | 0.5589 | -1.0 | -1.0 | 0.0563 | 0.7 | 0.0037 | 0.1207 | 0.0874 | 0.8559 |
| 0.7053 | 3.1568 | 1550 | 0.5777 | 0.0739 | 0.094 | 0.0844 | -1.0 | -1.0 | 0.0793 | 0.3408 | 0.4853 | 0.5219 | -1.0 | -1.0 | 0.5219 | -1.0 | -1.0 | 0.0683 | 0.5952 | 0.0343 | 0.0931 | 0.1192 | 0.8775 |
| 0.7158 | 3.2587 | 1600 | 0.5854 | 0.0637 | 0.0843 | 0.0666 | -1.0 | -1.0 | 0.0667 | 0.3329 | 0.4633 | 0.5151 | -1.0 | -1.0 | 0.5151 | -1.0 | -1.0 | 0.0646 | 0.5683 | 0.0048 | 0.0966 | 0.1218 | 0.8804 |
| 0.9677 | 3.3605 | 1650 | 0.5985 | 0.0989 | 0.1221 | 0.1134 | -1.0 | -1.0 | 0.1111 | 0.2947 | 0.4177 | 0.4762 | -1.0 | -1.0 | 0.4762 | -1.0 | -1.0 | 0.0786 | 0.4968 | 0.0018 | 0.0966 | 0.2164 | 0.8353 |
| 0.8341 | 3.4623 | 1700 | 0.5789 | 0.1197 | 0.1507 | 0.134 | -1.0 | -1.0 | 0.126 | 0.3459 | 0.4435 | 0.4582 | -1.0 | -1.0 | 0.4582 | -1.0 | -1.0 | 0.1337 | 0.4524 | 0.0003 | 0.0586 | 0.2251 | 0.8637 |
| 0.962 | 3.5642 | 1750 | 0.5207 | 0.11 | 0.1313 | 0.1236 | -1.0 | -1.0 | 0.1173 | 0.3525 | 0.4689 | 0.4927 | -1.0 | -1.0 | 0.4927 | -1.0 | -1.0 | 0.1182 | 0.5032 | 0.001 | 0.0966 | 0.2107 | 0.8784 |
| 0.4853 | 3.6660 | 1800 | 0.5804 | 0.1082 | 0.1377 | 0.1261 | -1.0 | -1.0 | 0.1116 | 0.3381 | 0.4678 | 0.489 | -1.0 | -1.0 | 0.489 | -1.0 | -1.0 | 0.1781 | 0.5048 | 0.0023 | 0.1241 | 0.1441 | 0.8382 |
| 0.9292 | 3.7678 | 1850 | 0.5584 | 0.0925 | 0.1167 | 0.1028 | -1.0 | -1.0 | 0.099 | 0.3454 | 0.4664 | 0.5041 | -1.0 | -1.0 | 0.5041 | -1.0 | -1.0 | 0.1293 | 0.5619 | 0.0063 | 0.0828 | 0.142 | 0.8676 |
| 0.552 | 3.8697 | 1900 | 0.5131 | 0.1002 | 0.1219 | 0.1123 | -1.0 | -1.0 | 0.1138 | 0.3719 | 0.4999 | 0.5265 | -1.0 | -1.0 | 0.5265 | -1.0 | -1.0 | 0.13 | 0.5841 | 0.0026 | 0.1276 | 0.168 | 0.8676 |
| 0.6468 | 3.9715 | 1950 | 0.5834 | 0.1186 | 0.1491 | 0.1313 | -1.0 | -1.0 | 0.1292 | 0.3901 | 0.49 | 0.5076 | -1.0 | -1.0 | 0.5076 | -1.0 | -1.0 | 0.1583 | 0.519 | 0.01 | 0.1655 | 0.1875 | 0.8382 |
| 0.925 | 4.0733 | 2000 | 0.4856 | 0.1184 | 0.1539 | 0.1284 | -1.0 | -1.0 | 0.1381 | 0.4197 | 0.5425 | 0.5677 | -1.0 | -1.0 | 0.5677 | -1.0 | -1.0 | 0.1504 | 0.6048 | 0.0149 | 0.1897 | 0.1899 | 0.9088 |
| 0.5978 | 4.1752 | 2050 | 0.4895 | 0.115 | 0.1406 | 0.1295 | -1.0 | -1.0 | 0.1282 | 0.3839 | 0.5032 | 0.5218 | -1.0 | -1.0 | 0.5218 | -1.0 | -1.0 | 0.1538 | 0.5587 | 0.0038 | 0.131 | 0.1875 | 0.8755 |
| 0.4953 | 4.2770 | 2100 | 0.4927 | 0.1009 | 0.1264 | 0.1104 | -1.0 | -1.0 | 0.1084 | 0.367 | 0.5038 | 0.5164 | -1.0 | -1.0 | 0.5164 | -1.0 | -1.0 | 0.1419 | 0.5 | 0.0081 | 0.1483 | 0.1528 | 0.901 |
| 0.8966 | 4.3788 | 2150 | 0.4611 | 0.1002 | 0.1254 | 0.1132 | -1.0 | -1.0 | 0.1154 | 0.3965 | 0.5181 | 0.5348 | -1.0 | -1.0 | 0.5348 | -1.0 | -1.0 | 0.1167 | 0.5429 | 0.0086 | 0.1655 | 0.1753 | 0.8961 |
| 0.5029 | 4.4807 | 2200 | 0.4622 | 0.1134 | 0.1431 | 0.1231 | -1.0 | -1.0 | 0.1239 | 0.4481 | 0.5523 | 0.5951 | -1.0 | -1.0 | 0.5951 | -1.0 | -1.0 | 0.1493 | 0.6857 | 0.0132 | 0.2103 | 0.1777 | 0.8892 |
| 0.791 | 4.5825 | 2250 | 0.5093 | 0.0874 | 0.1051 | 0.0997 | -1.0 | -1.0 | 0.0966 | 0.3781 | 0.4987 | 0.5458 | -1.0 | -1.0 | 0.5458 | -1.0 | -1.0 | 0.1002 | 0.6333 | 0.0007 | 0.1138 | 0.1613 | 0.8902 |
| 0.6545 | 4.6843 | 2300 | 0.4820 | 0.0976 | 0.1167 | 0.1102 | -1.0 | -1.0 | 0.1174 | 0.4407 | 0.5537 | 0.5862 | -1.0 | -1.0 | 0.5862 | -1.0 | -1.0 | 0.1573 | 0.6984 | 0.0012 | 0.1483 | 0.1344 | 0.9118 |
| 0.5782 | 4.7862 | 2350 | 0.4653 | 0.1146 | 0.1374 | 0.1307 | -1.0 | -1.0 | 0.1423 | 0.4482 | 0.5451 | 0.5855 | -1.0 | -1.0 | 0.5855 | -1.0 | -1.0 | 0.1619 | 0.6667 | 0.0034 | 0.2034 | 0.1785 | 0.8863 |
| 0.5309 | 4.8880 | 2400 | 0.4830 | 0.1009 | 0.1269 | 0.1202 | -1.0 | -1.0 | 0.1223 | 0.4266 | 0.5057 | 0.5419 | -1.0 | -1.0 | 0.5419 | -1.0 | -1.0 | 0.1643 | 0.6317 | 0.0015 | 0.1448 | 0.137 | 0.849 |
| 0.4851 | 4.9898 | 2450 | 0.4866 | 0.1464 | 0.1851 | 0.1672 | -1.0 | -1.0 | 0.1619 | 0.4202 | 0.5202 | 0.5719 | -1.0 | -1.0 | 0.5719 | -1.0 | -1.0 | 0.2452 | 0.6905 | 0.0029 | 0.1724 | 0.1909 | 0.8529 |
| 0.7152 | 5.0916 | 2500 | 0.4744 | 0.1459 | 0.1799 | 0.1648 | -1.0 | -1.0 | 0.1547 | 0.4379 | 0.5467 | 0.5785 | -1.0 | -1.0 | 0.5785 | -1.0 | -1.0 | 0.2351 | 0.7333 | 0.0018 | 0.169 | 0.2008 | 0.8333 |
| 0.2953 | 5.1935 | 2550 | 0.5260 | 0.1592 | 0.1982 | 0.1827 | -1.0 | -1.0 | 0.1695 | 0.4385 | 0.5087 | 0.5527 | -1.0 | -1.0 | 0.5527 | -1.0 | -1.0 | 0.2435 | 0.6365 | 0.0037 | 0.1793 | 0.2304 | 0.8422 |
| 0.5106 | 5.2953 | 2600 | 0.5279 | 0.1627 | 0.2051 | 0.18 | -1.0 | -1.0 | 0.1706 | 0.4105 | 0.4839 | 0.5344 | -1.0 | -1.0 | 0.5344 | -1.0 | -1.0 | 0.2688 | 0.619 | 0.001 | 0.1931 | 0.2183 | 0.7912 |
| 1.5618 | 5.3971 | 2650 | 0.4933 | 0.1624 | 0.1978 | 0.1818 | -1.0 | -1.0 | 0.1707 | 0.3791 | 0.4933 | 0.5232 | -1.0 | -1.0 | 0.5232 | 0.2336 | 0.573 | 0.0021 | 0.2241 | 0.2515 | 0.7725 |
| 0.6616 | 5.4990 | 2700 | 0.4607 | 0.1411 | 0.1741 | 0.1561 | -1.0 | -1.0 | 0.1554 | 0.3631 | 0.5007 | 0.5528 | -1.0 | -1.0 | 0.5528 | 0.1914 | 0.5476 | 0.0036 | 0.2793 | 0.2283 | 0.8314 |
| 1.5876 | 5.6008 | 2750 | 0.4688 | 0.1714 | 0.2055 | 0.1822 | -1.0 | -1.0 | 0.1853 | 0.426 | 0.5362 | 0.5845 | -1.0 | -1.0 | 0.5845 | 0.2096 | 0.6063 | 0.0069 | 0.3069 | 0.2976 | 0.8402 |
| 0.9064 | 5.7026 | 2800 | 0.4834 | 0.1512 | 0.1818 | 0.1695 | -1.0 | -1.0 | 0.1715 | 0.4667 | 0.5708 | 0.6274 | -1.0 | -1.0 | 0.6274 | 0.2307 | 0.6968 | 0.0076 | 0.3414 | 0.2153 | 0.8441 |
| 0.6631 | 5.8045 | 2850 | 0.4966 | 0.1668 | 0.2034 | 0.185 | -1.0 | -1.0 | 0.1879 | 0.4371 | 0.5178 | 0.5715 | -1.0 | -1.0 | 0.5715 | 0.2648 | 0.6349 | 0.0099 | 0.2483 | 0.2257 | 0.8314 |
| 2.0204 | 5.9063 | 2900 | 0.5232 | 0.1848 | 0.2317 | 0.1976 | -1.0 | -1.0 | 0.2007 | 0.4219 | 0.5219 | 0.5843 | -1.0 | -1.0 | 0.5843 | 0.3123 | 0.681 | 0.0237 | 0.3483 | 0.2183 | 0.7235 |
| 0.6208 | 6.0081 | 2950 | 0.4736 | 0.1749 | 0.2204 | 0.1895 | -1.0 | -1.0 | 0.1871 | 0.4238 | 0.5162 | 0.5837 | -1.0 | -1.0 | 0.5837 | 0.2772 | 0.646 | 0.016 | 0.3138 | 0.2315 | 0.7912 |
| 0.7104 | 6.1100 | 3000 | 0.4668 | 0.1787 | 0.2152 | 0.1945 | -1.0 | -1.0 | 0.1884 | 0.4139 | 0.5343 | 0.6019 | -1.0 | -1.0 | 0.6019 | 0.2652 | 0.6127 | 0.0075 | 0.3862 | 0.2635 | 0.8069 |
| 0.7477 | 6.2118 | 3050 | 0.5456 | 0.1533 | 0.1916 | 0.1728 | -1.0 | -1.0 | 0.1588 | 0.4022 | 0.5193 | 0.6083 | -1.0 | -1.0 | 0.6083 | 0.2289 | 0.5714 | 0.0248 | 0.4897 | 0.2062 | 0.7637 |
| 0.6839 | 6.3136 | 3100 | 0.5166 | 0.1163 | 0.1449 | 0.1254 | -1.0 | -1.0 | 0.1238 | 0.3623 | 0.5177 | 0.6251 | -1.0 | -1.0 | 0.6251 | 0.182 | 0.6349 | 0.0081 | 0.4552 | 0.1588 | 0.7853 |
| 1.0881 | 6.4155 | 3150 | 0.4868 | 0.1468 | 0.1806 | 0.1631 | -1.0 | -1.0 | 0.1538 | 0.3639 | 0.4961 | 0.5872 | -1.0 | -1.0 | 0.5872 | 0.201 | 0.5063 | 0.0072 | 0.4828 | 0.2324 | 0.7725 |
| 0.6136 | 6.5173 | 3200 | 0.5087 | 0.1154 | 0.1432 | 0.1284 | -1.0 | -1.0 | 0.1218 | 0.3638 | 0.5158 | 0.5888 | -1.0 | -1.0 | 0.5888 | 0.189 | 0.5794 | 0.0028 | 0.3724 | 0.1544 | 0.8147 |
| 0.7524 | 6.6191 | 3250 | 0.4734 | 0.153 | 0.1884 | 0.1643 | -1.0 | -1.0 | 0.1582 | 0.3827 | 0.509 | 0.6 | -1.0 | -1.0 | 0.6 | 0.2354 | 0.573 | 0.002 | 0.4034 | 0.2215 | 0.8235 |
| 0.6668 | 6.7210 | 3300 | 0.4848 | 0.1622 | 0.1973 | 0.1718 | -1.0 | -1.0 | 0.1668 | 0.3723 | 0.4999 | 0.591 | -1.0 | -1.0 | 0.591 | 0.2335 | 0.5127 | 0.0024 | 0.4172 | 0.2508 | 0.8431 |
| 0.6529 | 6.8228 | 3350 | 0.4945 | 0.117 | 0.1462 | 0.1282 | -1.0 | -1.0 | 0.1258 | 0.3225 | 0.4903 | 0.5723 | -1.0 | -1.0 | 0.5723 | 0.1648 | 0.5048 | 0.0032 | 0.3966 | 0.1829 | 0.8157 |
| 0.4778 | 6.9246 | 3400 | 0.5310 | 0.1211 | 0.1577 | 0.1307 | -1.0 | -1.0 | 0.1278 | 0.3232 | 0.4381 | 0.5086 | -1.0 | -1.0 | 0.5086 | 0.1466 | 0.481 | 0.0033 | 0.3379 | 0.2133 | 0.7069 |
| 0.6646 | 7.0265 | 3450 | 0.5276 | 0.125 | 0.1627 | 0.1357 | -1.0 | -1.0 | 0.1333 | 0.3163 | 0.43 | 0.5238 | -1.0 | -1.0 | 0.5238 | 0.1462 | 0.4508 | 0.0072 | 0.4 | 0.2216 | 0.7206 |
| 0.8104 | 7.1283 | 3500 | 0.5576 | 0.1225 | 0.1635 | 0.1285 | -1.0 | -1.0 | 0.132 | 0.3087 | 0.4179 | 0.511 | -1.0 | -1.0 | 0.511 | 0.1528 | 0.4603 | 0.007 | 0.4 | 0.2077 | 0.6725 |
| 0.801 | 7.2301 | 3550 | 0.5051 | 0.1465 | 0.1832 | 0.1588 | -1.0 | -1.0 | 0.1558 | 0.3237 | 0.4378 | 0.5166 | -1.0 | -1.0 | 0.5166 | 0.1797 | 0.4444 | 0.0098 | 0.3828 | 0.25 | 0.7225 |
| 0.5484 | 7.3320 | 3600 | 0.4944 | 0.1617 | 0.1932 | 0.1775 | -1.0 | -1.0 | 0.1667 | 0.3525 | 0.4632 | 0.5622 | -1.0 | -1.0 | 0.5622 | 0.2367 | 0.4841 | 0.0066 | 0.4241 | 0.2417 | 0.7784 |
| 0.5271 | 7.4338 | 3650 | 0.4661 | 0.1811 | 0.2237 | 0.1965 | -1.0 | -1.0 | 0.1862 | 0.3673 | 0.4609 | 0.5317 | -1.0 | -1.0 | 0.5317 | 0.2413 | 0.4635 | 0.011 | 0.3345 | 0.291 | 0.7971 |
| 0.4769 | 7.5356 | 3700 | 0.4799 | 0.206 | 0.2478 | 0.2217 | -1.0 | -1.0 | 0.2101 | 0.3542 | 0.4457 | 0.5043 | -1.0 | -1.0 | 0.5043 | 0.2343 | 0.427 | 0.014 | 0.3103 | 0.3699 | 0.7755 |
| 0.5822 | 7.6375 | 3750 | 0.4613 | 0.1984 | 0.2378 | 0.2212 | -1.0 | -1.0 | 0.2083 | 0.3518 | 0.4378 | 0.5114 | -1.0 | -1.0 | 0.5114 | 0.227 | 0.4413 | 0.0079 | 0.3448 | 0.3605 | 0.748 |
| 0.5706 | 7.7393 | 3800 | 0.4651 | 0.2286 | 0.2703 | 0.2447 | -1.0 | -1.0 | 0.2381 | 0.3655 | 0.452 | 0.5405 | -1.0 | -1.0 | 0.5405 | 0.2425 | 0.4714 | 0.013 | 0.4069 | 0.4301 | 0.7431 |
| 0.5366 | 7.8411 | 3850 | 0.4488 | 0.2597 | 0.3022 | 0.2783 | -1.0 | -1.0 | 0.266 | 0.4008 | 0.4565 | 0.5151 | -1.0 | -1.0 | 0.5151 | 0.3008 | 0.5254 | 0.0121 | 0.2897 | 0.4663 | 0.7304 |
| 0.3441 | 7.9430 | 3900 | 0.4445 | 0.2552 | 0.2979 | 0.2728 | -1.0 | -1.0 | 0.2594 | 0.4021 | 0.4654 | 0.5401 | -1.0 | -1.0 | 0.5401 | 0.3088 | 0.5524 | 0.0186 | 0.3345 | 0.4383 | 0.7333 |
| 0.3097 | 8.0448 | 3950 | 0.4632 | 0.2153 | 0.2686 | 0.2372 | -1.0 | -1.0 | 0.2228 | 0.3759 | 0.4661 | 0.5719 | -1.0 | -1.0 | 0.5719 | 0.2775 | 0.5635 | 0.0102 | 0.4207 | 0.3581 | 0.7314 |
| 0.5967 | 8.1466 | 4000 | 0.4446 | 0.2026 | 0.249 | 0.2207 | -1.0 | -1.0 | 0.2129 | 0.3703 | 0.4724 | 0.5804 | -1.0 | -1.0 | 0.5804 | 0.2722 | 0.5397 | 0.0129 | 0.4379 | 0.3227 | 0.7637 |
| 0.5614 | 8.2485 | 4050 | 0.4194 | 0.2358 | 0.2855 | 0.2567 | -1.0 | -1.0 | 0.2442 | 0.3982 | 0.4728 | 0.5751 | -1.0 | -1.0 | 0.5751 | 0.2696 | 0.4952 | 0.0156 | 0.4448 | 0.4222 | 0.7853 |
| 0.4889 | 8.3503 | 4100 | 0.4300 | 0.217 | 0.2592 | 0.2456 | -1.0 | -1.0 | 0.2235 | 0.394 | 0.4763 | 0.5915 | -1.0 | -1.0 | 0.5915 | 0.2868 | 0.5317 | 0.0051 | 0.4379 | 0.3592 | 0.8049 |
| 1.2673 | 8.4521 | 4150 | 0.4346 | 0.2151 | 0.2509 | 0.2369 | -1.0 | -1.0 | 0.2218 | 0.3905 | 0.482 | 0.5879 | -1.0 | -1.0 | 0.5879 | 0.2719 | 0.5222 | 0.0087 | 0.4552 | 0.3646 | 0.7863 |
| 0.6729 | 8.5540 | 4200 | 0.4171 | 0.2327 | 0.2748 | 0.255 | -1.0 | -1.0 | 0.2413 | 0.4106 | 0.4912 | 0.5866 | -1.0 | -1.0 | 0.5866 | 0.2814 | 0.5175 | 0.008 | 0.4207 | 0.4085 | 0.8216 |
| 0.7014 | 8.6558 | 4250 | 0.4043 | 0.2677 | 0.3162 | 0.3003 | -1.0 | -1.0 | 0.2748 | 0.4198 | 0.4855 | 0.5993 | -1.0 | -1.0 | 0.5993 | 0.3233 | 0.5222 | 0.0083 | 0.4552 | 0.4714 | 0.8206 |
| 0.7492 | 8.7576 | 4300 | 0.4004 | 0.2642 | 0.3078 | 0.2941 | -1.0 | -1.0 | 0.271 | 0.4279 | 0.5052 | 0.6121 | -1.0 | -1.0 | 0.6121 | 0.3225 | 0.5635 | 0.0066 | 0.4759 | 0.4634 | 0.7971 |
| 0.4239 | 8.8595 | 4350 | 0.4437 | 0.2589 | 0.3104 | 0.2932 | -1.0 | -1.0 | 0.266 | 0.3972 | 0.4808 | 0.5762 | -1.0 | -1.0 | 0.5762 | 0.2964 | 0.4873 | 0.0077 | 0.4483 | 0.4727 | 0.7931 |
| 0.5212 | 8.9613 | 4400 | 0.4309 | 0.2673 | 0.3196 | 0.2887 | -1.0 | -1.0 | 0.2705 | 0.4035 | 0.4836 | 0.5894 | -1.0 | -1.0 | 0.5894 | 0.3463 | 0.5333 | 0.0077 | 0.4966 | 0.4479 | 0.7382 |
| 0.8995 | 9.0631 | 4450 | 0.4414 | 0.2576 | 0.3093 | 0.2763 | -1.0 | -1.0 | 0.2595 | 0.4094 | 0.4847 | 0.5996 | -1.0 | -1.0 | 0.5996 | 0.3314 | 0.5508 | 0.017 | 0.5 | 0.4245 | 0.748 |
| 1.5476 | 9.1650 | 4500 | 0.4524 | 0.242 | 0.2912 | 0.2629 | -1.0 | -1.0 | 0.2489 | 0.4091 | 0.4879 | 0.5937 | -1.0 | -1.0 | 0.5937 | 0.3223 | 0.5556 | 0.0154 | 0.4931 | 0.3882 | 0.7324 |
| 0.7314 | 9.2668 | 4550 | 0.4297 | 0.2381 | 0.2907 | 0.2602 | -1.0 | -1.0 | 0.244 | 0.4097 | 0.501 | 0.5964 | -1.0 | -1.0 | 0.5964 | 0.3204 | 0.554 | 0.0224 | 0.5138 | 0.3715 | 0.7216 |
| 0.3865 | 9.3686 | 4600 | 0.4467 | 0.2645 | 0.314 | 0.2845 | -1.0 | -1.0 | 0.2701 | 0.4264 | 0.5071 | 0.6163 | -1.0 | -1.0 | 0.6163 | 0.3504 | 0.5905 | 0.02 | 0.4966 | 0.423 | 0.7618 |
| 0.3017 | 9.4705 | 4650 | 0.4504 | 0.282 | 0.3464 | 0.3033 | -1.0 | -1.0 | 0.2892 | 0.4396 | 0.5249 | 0.6318 | -1.0 | -1.0 | 0.6318 | 0.416 | 0.6921 | 0.0176 | 0.4897 | 0.4124 | 0.7137 |
| 0.7409 | 9.5723 | 4700 | 0.4369 | 0.301 | 0.356 | 0.3257 | -1.0 | -1.0 | 0.3071 | 0.4518 | 0.5085 | 0.6085 | -1.0 | -1.0 | 0.6085 | 0.4307 | 0.6571 | 0.0215 | 0.4724 | 0.4508 | 0.6961 |
| 1.5215 | 9.6741 | 4750 | 0.4673 | 0.3043 | 0.3455 | 0.3268 | -1.0 | -1.0 | 0.3056 | 0.4372 | 0.5032 | 0.6009 | -1.0 | -1.0 | 0.6009 | 0.4026 | 0.6095 | 0.0084 | 0.4862 | 0.5019 | 0.7069 |
| 0.3358 | 9.7760 | 4800 | 0.4813 | 0.3042 | 0.3514 | 0.3316 | -1.0 | -1.0 | 0.306 | 0.4493 | 0.4972 | 0.6007 | -1.0 | -1.0 | 0.6007 | 0.3894 | 0.5889 | 0.014 | 0.4828 | 0.5092 | 0.7304 |
| 0.9629 | 9.8778 | 4850 | 0.4473 | 0.3066 | 0.3599 | 0.3268 | -1.0 | -1.0 | 0.3105 | 0.4398 | 0.497 | 0.5878 | -1.0 | -1.0 | 0.5878 | 0.399 | 0.5984 | 0.0233 | 0.4552 | 0.4975 | 0.7098 |
| 0.524 | 9.9796 | 4900 | 0.4228 | 0.3062 | 0.3519 | 0.3275 | -1.0 | -1.0 | 0.3086 | 0.4226 | 0.4831 | 0.5544 | -1.0 | -1.0 | 0.5544 | 0.3748 | 0.554 | 0.0186 | 0.3897 | 0.5251 | 0.7196 |
| 0.6995 | 10.0815 | 4950 | 0.4831 | 0.2857 | 0.3441 | 0.318 | -1.0 | -1.0 | 0.2873 | 0.4042 | 0.4572 | 0.5353 | -1.0 | -1.0 | 0.5353 | 0.318 | 0.4825 | 0.0201 | 0.3724 | 0.5189 | 0.751 |
| 0.3898 | 10.1833 | 5000 | 0.4427 | 0.2371 | 0.2802 | 0.2588 | -1.0 | -1.0 | 0.2426 | 0.3619 | 0.4637 | 0.5534 | -1.0 | -1.0 | 0.5534 | 0.2811 | 0.4683 | 0.0103 | 0.4517 | 0.4199 | 0.7402 |
| 1.4225 | 10.2851 | 5050 | 0.4479 | 0.2618 | 0.3105 | 0.2844 | -1.0 | -1.0 | 0.2657 | 0.3895 | 0.4483 | 0.5483 | -1.0 | -1.0 | 0.5483 | 0.2747 | 0.4476 | 0.0189 | 0.4345 | 0.4917 | 0.7627 |
| 0.538 | 10.3870 | 5100 | 0.4383 | 0.2825 | 0.3342 | 0.3014 | -1.0 | -1.0 | 0.2861 | 0.3928 | 0.4584 | 0.5561 | -1.0 | -1.0 | 0.5561 | 0.288 | 0.4317 | 0.0161 | 0.4414 | 0.5432 | 0.7951 |
| 0.3099 | 10.4888 | 5150 | 0.4078 | 0.2763 | 0.3263 | 0.2933 | -1.0 | -1.0 | 0.2824 | 0.402 | 0.477 | 0.5678 | -1.0 | -1.0 | 0.5678 | 0.275 | 0.4556 | 0.0213 | 0.4517 | 0.5324 | 0.7961 |
| 0.4364 | 10.5906 | 5200 | 0.4022 | 0.287 | 0.3355 | 0.3045 | -1.0 | -1.0 | 0.2901 | 0.4022 | 0.4638 | 0.5534 | -1.0 | -1.0 | 0.5534 | 0.2937 | 0.473 | 0.0187 | 0.4 | 0.5487 | 0.7873 |
| 0.5209 | 10.6925 | 5250 | 0.4358 | 0.2923 | 0.3345 | 0.3092 | -1.0 | -1.0 | 0.2941 | 0.4067 | 0.463 | 0.5251 | -1.0 | -1.0 | 0.5251 | 0.2984 | 0.4571 | 0.0152 | 0.3759 | 0.5634 | 0.7422 |
| 0.3867 | 10.7943 | 5300 | 0.4289 | 0.2775 | 0.3189 | 0.2994 | -1.0 | -1.0 | 0.2802 | 0.3961 | 0.4691 | 0.5462 | -1.0 | -1.0 | 0.5462 | 0.2774 | 0.454 | 0.0131 | 0.4345 | 0.5419 | 0.75 |
| 0.6977 | 10.8961 | 5350 | 0.3995 | 0.2918 | 0.3262 | 0.3128 | -1.0 | -1.0 | 0.2926 | 0.4155 | 0.4672 | 0.5683 | -1.0 | -1.0 | 0.5683 | 0.3237 | 0.5032 | 0.012 | 0.4724 | 0.5398 | 0.7294 |
| 0.3935 | 10.9980 | 5400 | 0.4075 | 0.304 | 0.3499 | 0.3286 | -1.0 | -1.0 | 0.3054 | 0.4403 | 0.4865 | 0.6049 | -1.0 | -1.0 | 0.6049 | 0.3784 | 0.5921 | 0.017 | 0.4931 | 0.5166 | 0.7294 |
| 0.3714 | 11.0998 | 5450 | 0.4077 | 0.2963 | 0.3365 | 0.3237 | -1.0 | -1.0 | 0.2983 | 0.4231 | 0.4831 | 0.6061 | -1.0 | -1.0 | 0.6061 | 0.3805 | 0.5873 | 0.0091 | 0.531 | 0.4992 | 0.7 |
| 1.6946 | 11.2016 | 5500 | 0.4363 | 0.3013 | 0.3373 | 0.3221 | -1.0 | -1.0 | 0.3021 | 0.4214 | 0.4836 | 0.5916 | -1.0 | -1.0 | 0.5916 | 0.3619 | 0.5571 | 0.0071 | 0.5069 | 0.5349 | 0.7108 |
| 0.329 | 11.3035 | 5550 | 0.4442 | 0.2945 | 0.3349 | 0.3152 | -1.0 | -1.0 | 0.2964 | 0.3984 | 0.4753 | 0.5868 | -1.0 | -1.0 | 0.5868 | 0.3299 | 0.5333 | 0.0164 | 0.5103 | 0.5373 | 0.7167 |
| 0.6864 | 11.4053 | 5600 | 0.4718 | 0.2981 | 0.3363 | 0.3213 | -1.0 | -1.0 | 0.3016 | 0.4057 | 0.466 | 0.5568 | -1.0 | -1.0 | 0.5568 | 0.3361 | 0.5159 | 0.0082 | 0.4241 | 0.5501 | 0.7304 |
| 0.5806 | 11.5071 | 5650 | 0.4585 | 0.2923 | 0.333 | 0.313 | -1.0 | -1.0 | 0.2993 | 0.3925 | 0.4592 | 0.5627 | -1.0 | -1.0 | 0.5627 | 0.3377 | 0.519 | 0.0069 | 0.4621 | 0.5323 | 0.7069 |
| 0.4736 | 11.6090 | 5700 | 0.4330 | 0.2974 | 0.3369 | 0.3147 | -1.0 | -1.0 | 0.302 | 0.3923 | 0.4525 | 0.5617 | -1.0 | -1.0 | 0.5617 | 0.3266 | 0.4873 | 0.0165 | 0.4655 | 0.549 | 0.7324 |
| 0.633 | 11.7108 | 5750 | 0.4385 | 0.2962 | 0.3312 | 0.3188 | -1.0 | -1.0 | 0.3 | 0.3825 | 0.4397 | 0.5478 | -1.0 | -1.0 | 0.5478 | 0.3222 | 0.4794 | 0.0076 | 0.4552 | 0.5588 | 0.7088 |
| 1.298 | 11.8126 | 5800 | 0.4444 | 0.3013 | 0.3393 | 0.323 | -1.0 | -1.0 | 0.3038 | 0.4001 | 0.4537 | 0.5502 | -1.0 | -1.0 | 0.5502 | 0.3416 | 0.5063 | 0.0096 | 0.4414 | 0.5528 | 0.7029 |
| 0.8351 | 11.9145 | 5850 | 0.4634 | 0.3017 | 0.3379 | 0.324 | -1.0 | -1.0 | 0.3082 | 0.3907 | 0.4491 | 0.5365 | -1.0 | -1.0 | 0.5365 | 0.3151 | 0.4746 | 0.0088 | 0.4034 | 0.5812 | 0.7314 |
| 1.0177 | 12.0163 | 5900 | 0.4514 | 0.3039 | 0.334 | 0.3228 | -1.0 | -1.0 | 0.3096 | 0.4035 | 0.4511 | 0.5477 | -1.0 | -1.0 | 0.5477 | 0.3202 | 0.4841 | 0.0064 | 0.4207 | 0.5853 | 0.7382 |
| 1.3682 | 12.1181 | 5950 | 0.4470 | 0.3057 | 0.3383 | 0.3269 | -1.0 | -1.0 | 0.3108 | 0.4154 | 0.4693 | 0.5624 | -1.0 | -1.0 | 0.5624 | 0.311 | 0.454 | 0.0069 | 0.4724 | 0.5993 | 0.7608 |
| 0.8238 | 12.2200 | 6000 | 0.4072 | 0.2973 | 0.327 | 0.3156 | -1.0 | -1.0 | 0.3013 | 0.3962 | 0.4594 | 0.5594 | -1.0 | -1.0 | 0.5594 | 0.3261 | 0.4698 | 0.0069 | 0.4828 | 0.5588 | 0.7255 |
| 0.6402 | 12.3218 | 6050 | 0.4230 | 0.3007 | 0.3494 | 0.3217 | -1.0 | -1.0 | 0.3043 | 0.3922 | 0.4391 | 0.5483 | -1.0 | -1.0 | 0.5483 | 0.3108 | 0.4333 | 0.0178 | 0.4793 | 0.5735 | 0.7324 |
| 0.537 | 12.4236 | 6100 | 0.4037 | 0.3244 | 0.3624 | 0.347 | -1.0 | -1.0 | 0.3287 | 0.4309 | 0.4829 | 0.5771 | -1.0 | -1.0 | 0.5771 | 0.3955 | 0.5698 | 0.0125 | 0.4448 | 0.5652 | 0.7167 |
| 0.4765 | 12.5255 | 6150 | 0.4080 | 0.33 | 0.3676 | 0.3505 | -1.0 | -1.0 | 0.3343 | 0.4206 | 0.4785 | 0.5532 | -1.0 | -1.0 | 0.5532 | 0.3935 | 0.5667 | 0.0143 | 0.3655 | 0.5821 | 0.7275 |
| 1.7486 | 12.6273 | 6200 | 0.4253 | 0.3357 | 0.3728 | 0.3523 | -1.0 | -1.0 | 0.3413 | 0.4408 | 0.4794 | 0.5461 | -1.0 | -1.0 | 0.5461 | 0.4151 | 0.5984 | 0.0193 | 0.3103 | 0.5726 | 0.7294 |
| 0.3369 | 12.7291 | 6250 | 0.4018 | 0.3303 | 0.3768 | 0.3485 | -1.0 | -1.0 | 0.3348 | 0.4193 | 0.4688 | 0.5343 | -1.0 | -1.0 | 0.5343 | 0.3943 | 0.5667 | 0.0199 | 0.3207 | 0.5766 | 0.7157 |
| 0.4465 | 12.8310 | 6300 | 0.4211 | 0.3166 | 0.3477 | 0.3357 | -1.0 | -1.0 | 0.3214 | 0.4098 | 0.4517 | 0.5413 | -1.0 | -1.0 | 0.5413 | 0.3691 | 0.5286 | 0.0099 | 0.3759 | 0.5709 | 0.7196 |
| 0.3394 | 12.9328 | 6350 | 0.4136 | 0.3198 | 0.3612 | 0.3356 | -1.0 | -1.0 | 0.3227 | 0.408 | 0.4577 | 0.5278 | -1.0 | -1.0 | 0.5278 | 0.3652 | 0.5111 | 0.0218 | 0.3448 | 0.5725 | 0.7275 |
| 0.4738 | 13.0346 | 6400 | 0.4102 | 0.3177 | 0.3612 | 0.3376 | -1.0 | -1.0 | 0.3228 | 0.4211 | 0.4689 | 0.5562 | -1.0 | -1.0 | 0.5562 | 0.3793 | 0.5492 | 0.0165 | 0.3862 | 0.5573 | 0.7333 |
| 1.1177 | 13.1365 | 6450 | 0.3955 | 0.311 | 0.3477 | 0.3333 | -1.0 | -1.0 | 0.3149 | 0.414 | 0.4836 | 0.5618 | -1.0 | -1.0 | 0.5618 | 0.3778 | 0.5667 | 0.0124 | 0.4 | 0.5427 | 0.7186 |
| 0.4671 | 13.2383 | 6500 | 0.3912 | 0.3208 | 0.3562 | 0.3411 | -1.0 | -1.0 | 0.3235 | 0.4182 | 0.4794 | 0.5495 | -1.0 | -1.0 | 0.5495 | 0.3871 | 0.5492 | 0.017 | 0.3759 | 0.5584 | 0.7235 |
| 0.4744 | 13.3401 | 6550 | 0.3923 | 0.3147 | 0.3527 | 0.3375 | -1.0 | -1.0 | 0.3184 | 0.4197 | 0.4639 | 0.5525 | -1.0 | -1.0 | 0.5525 | 0.3776 | 0.5476 | 0.0109 | 0.3862 | 0.5555 | 0.7235 |
| 0.5302 | 13.4420 | 6600 | 0.3963 | 0.3306 | 0.3744 | 0.3502 | -1.0 | -1.0 | 0.3364 | 0.4402 | 0.4883 | 0.5607 | -1.0 | -1.0 | 0.5607 | 0.4113 | 0.6063 | 0.0178 | 0.3483 | 0.5627 | 0.7275 |
| 0.3128 | 13.5438 | 6650 | 0.3925 | 0.3071 | 0.3531 | 0.3291 | -1.0 | -1.0 | 0.3108 | 0.4132 | 0.4631 | 0.5608 | -1.0 | -1.0 | 0.5608 | 0.3564 | 0.519 | 0.026 | 0.4517 | 0.5388 | 0.7118 |
| 0.3991 | 13.6456 | 6700 | 0.3757 | 0.3115 | 0.3548 | 0.3292 | -1.0 | -1.0 | 0.315 | 0.4058 | 0.4678 | 0.5529 | -1.0 | -1.0 | 0.5529 | 0.3718 | 0.5222 | 0.0191 | 0.4207 | 0.5435 | 0.7157 |
| 0.4445 | 13.7475 | 6750 | 0.3741 | 0.3134 | 0.3565 | 0.3357 | -1.0 | -1.0 | 0.3174 | 0.4255 | 0.4833 | 0.5707 | -1.0 | -1.0 | 0.5707 | 0.3946 | 0.581 | 0.018 | 0.431 | 0.5274 | 0.7 |
| 0.3567 | 13.8493 | 6800 | 0.3990 | 0.3027 | 0.3473 | 0.3238 | -1.0 | -1.0 | 0.3079 | 0.4061 | 0.465 | 0.5409 | -1.0 | -1.0 | 0.5409 | 0.3903 | 0.5841 | 0.0174 | 0.369 | 0.5003 | 0.6696 |
| 0.4034 | 13.9511 | 6850 | 0.3865 | 0.3056 | 0.348 | 0.3279 | -1.0 | -1.0 | 0.3117 | 0.4194 | 0.4654 | 0.5516 | -1.0 | -1.0 | 0.5516 | 0.3961 | 0.5873 | 0.0159 | 0.3862 | 0.5049 | 0.6814 |
| 0.5333 | 14.0530 | 6900 | 0.3983 | 0.3128 | 0.3539 | 0.3331 | -1.0 | -1.0 | 0.3159 | 0.4222 | 0.4709 | 0.5709 | -1.0 | -1.0 | 0.5709 | 0.4218 | 0.6238 | 0.0165 | 0.431 | 0.5001 | 0.6578 |
| 0.7354 | 14.1548 | 6950 | 0.4141 | 0.3272 | 0.372 | 0.3492 | -1.0 | -1.0 | 0.3285 | 0.4241 | 0.4748 | 0.5782 | -1.0 | -1.0 | 0.5782 | 0.4271 | 0.6079 | 0.0103 | 0.4552 | 0.5441 | 0.6716 |
| 1.6592 | 14.2566 | 7000 | 0.3964 | 0.3168 | 0.3669 | 0.332 | -1.0 | -1.0 | 0.3187 | 0.4248 | 0.4698 | 0.5514 | -1.0 | -1.0 | 0.5514 | 0.4006 | 0.5825 | 0.0296 | 0.4138 | 0.5203 | 0.6578 |
| 0.6544 | 14.3585 | 7050 | 0.4193 | 0.2988 | 0.3352 | 0.3178 | -1.0 | -1.0 | 0.3063 | 0.4099 | 0.4889 | 0.5591 | -1.0 | -1.0 | 0.5591 | 0.3766 | 0.5952 | 0.015 | 0.4172 | 0.5047 | 0.6647 |
| 0.5879 | 14.4603 | 7100 | 0.4189 | 0.2819 | 0.3277 | 0.2934 | -1.0 | -1.0 | 0.2917 | 0.3982 | 0.4751 | 0.5659 | -1.0 | -1.0 | 0.5659 | 0.3546 | 0.5778 | 0.0216 | 0.4897 | 0.4694 | 0.6304 |
| 0.9165 | 14.5621 | 7150 | 0.3850 | 0.298 | 0.3386 | 0.3164 | -1.0 | -1.0 | 0.3051 | 0.4053 | 0.4772 | 0.5795 | -1.0 | -1.0 | 0.5795 | 0.3774 | 0.5952 | 0.0191 | 0.5 | 0.4977 | 0.6431 |
| 0.4671 | 14.6640 | 7200 | 0.4131 | 0.2873 | 0.3272 | 0.3003 | -1.0 | -1.0 | 0.2908 | 0.3774 | 0.4611 | 0.5565 | -1.0 | -1.0 | 0.5565 | 0.371 | 0.5825 | 0.0109 | 0.4517 | 0.4799 | 0.6353 |
| 0.3135 | 14.7658 | 7250 | 0.3958 | 0.3117 | 0.357 | 0.3311 | -1.0 | -1.0 | 0.3136 | 0.4233 | 0.4766 | 0.5582 | -1.0 | -1.0 | 0.5582 | 0.3805 | 0.5651 | 0.0177 | 0.4172 | 0.537 | 0.6922 |
| 0.2984 | 14.8676 | 7300 | 0.4165 | 0.2985 | 0.3395 | 0.3232 | -1.0 | -1.0 | 0.2995 | 0.4044 | 0.4586 | 0.5298 | -1.0 | -1.0 | 0.5298 | 0.3353 | 0.4905 | 0.0154 | 0.3931 | 0.5447 | 0.7059 |
| 0.5092 | 14.9695 | 7350 | 0.4012 | 0.3097 | 0.3525 | 0.3261 | -1.0 | -1.0 | 0.3108 | 0.4195 | 0.4751 | 0.5544 | -1.0 | -1.0 | 0.5544 | 0.3381 | 0.4952 | 0.0212 | 0.4414 | 0.5697 | 0.7265 |
| 0.5818 | 15.0713 | 7400 | 0.4022 | 0.323 | 0.3556 | 0.3451 | -1.0 | -1.0 | 0.325 | 0.4258 | 0.4885 | 0.569 | -1.0 | -1.0 | 0.569 | 0.3931 | 0.5794 | 0.0113 | 0.4138 | 0.5648 | 0.7137 |
| 0.6533 | 15.1731 | 7450 | 0.3960 | 0.3241 | 0.3741 | 0.3378 | -1.0 | -1.0 | 0.3257 | 0.4284 | 0.4922 | 0.5554 | -1.0 | -1.0 | 0.5554 | 0.3926 | 0.5746 | 0.0229 | 0.3828 | 0.5567 | 0.7088 |
| 0.6955 | 15.2749 | 7500 | 0.4087 | 0.3219 | 0.3635 | 0.3385 | -1.0 | -1.0 | 0.3243 | 0.4285 | 0.4747 | 0.554 | -1.0 | -1.0 | 0.554 | 0.3785 | 0.5619 | 0.0271 | 0.4 | 0.5602 | 0.7 |
| 0.8639 | 15.3768 | 7550 | 0.3846 | 0.3315 | 0.3748 | 0.3533 | -1.0 | -1.0 | 0.3325 | 0.4372 | 0.4959 | 0.5614 | -1.0 | -1.0 | 0.5614 | 0.4088 | 0.5841 | 0.0225 | 0.4 | 0.563 | 0.7 |
| 1.3757 | 15.4786 | 7600 | 0.3801 | 0.325 | 0.3725 | 0.3428 | -1.0 | -1.0 | 0.3261 | 0.423 | 0.4748 | 0.5633 | -1.0 | -1.0 | 0.5633 | 0.3912 | 0.5651 | 0.0261 | 0.4414 | 0.5575 | 0.6833 |
| 0.3398 | 15.5804 | 7650 | 0.3954 | 0.3275 | 0.3848 | 0.3504 | -1.0 | -1.0 | 0.3278 | 0.426 | 0.4721 | 0.556 | -1.0 | -1.0 | 0.556 | 0.3895 | 0.5587 | 0.0312 | 0.4103 | 0.5619 | 0.699 |
| 0.3513 | 15.6823 | 7700 | 0.3872 | 0.3471 | 0.4016 | 0.3683 | -1.0 | -1.0 | 0.3472 | 0.4347 | 0.4777 | 0.5524 | -1.0 | -1.0 | 0.5524 | 0.4352 | 0.5952 | 0.0259 | 0.369 | 0.5801 | 0.6931 |
| 0.3229 | 15.7841 | 7750 | 0.3868 | 0.345 | 0.3949 | 0.365 | -1.0 | -1.0 | 0.345 | 0.4324 | 0.4707 | 0.5523 | -1.0 | -1.0 | 0.5523 | 0.4224 | 0.5698 | 0.0252 | 0.3862 | 0.5875 | 0.701 |
| 0.4801 | 15.8859 | 7800 | 0.3777 | 0.3469 | 0.3939 | 0.3674 | -1.0 | -1.0 | 0.3469 | 0.4322 | 0.4742 | 0.5558 | -1.0 | -1.0 | 0.5558 | 0.4443 | 0.5968 | 0.0232 | 0.3862 | 0.5731 | 0.6843 |
| 0.435 | 15.9878 | 7850 | 0.3780 | 0.341 | 0.396 | 0.3624 | -1.0 | -1.0 | 0.341 | 0.4307 | 0.4605 | 0.5433 | -1.0 | -1.0 | 0.5433 | 0.4309 | 0.581 | 0.0234 | 0.3655 | 0.5687 | 0.6833 |
| 0.7239 | 16.0896 | 7900 | 0.3660 | 0.3447 | 0.3955 | 0.3645 | -1.0 | -1.0 | 0.3456 | 0.425 | 0.4812 | 0.5628 | -1.0 | -1.0 | 0.5628 | 0.4352 | 0.581 | 0.0207 | 0.4172 | 0.5782 | 0.6902 |
| 1.2456 | 16.1914 | 7950 | 0.3725 | 0.3472 | 0.3957 | 0.3688 | -1.0 | -1.0 | 0.3479 | 0.4266 | 0.4852 | 0.5588 | -1.0 | -1.0 | 0.5588 | 0.4215 | 0.5524 | 0.0158 | 0.3966 | 0.6044 | 0.7275 |
| 0.2957 | 16.2933 | 8000 | 0.3768 | 0.3507 | 0.4022 | 0.3706 | -1.0 | -1.0 | 0.3508 | 0.4378 | 0.4779 | 0.5526 | -1.0 | -1.0 | 0.5526 | 0.4157 | 0.5397 | 0.0236 | 0.3759 | 0.6127 | 0.7422 |
| 0.4006 | 16.3951 | 8050 | 0.3907 | 0.3475 | 0.3892 | 0.3687 | -1.0 | -1.0 | 0.3475 | 0.4274 | 0.476 | 0.5599 | -1.0 | -1.0 | 0.5599 | 0.4305 | 0.5651 | 0.0158 | 0.3931 | 0.5961 | 0.7216 |
| 0.4983 | 16.4969 | 8100 | 0.3968 | 0.3424 | 0.3891 | 0.3656 | -1.0 | -1.0 | 0.3425 | 0.4373 | 0.479 | 0.5618 | -1.0 | -1.0 | 0.5618 | 0.4233 | 0.5746 | 0.019 | 0.4069 | 0.5849 | 0.7039 |
| 0.2923 | 16.5988 | 8150 | 0.3845 | 0.3447 | 0.3961 | 0.3655 | -1.0 | -1.0 | 0.3447 | 0.4211 | 0.4838 | 0.5712 | -1.0 | -1.0 | 0.5712 | 0.4258 | 0.5762 | 0.0234 | 0.4345 | 0.585 | 0.7029 |
| 0.84 | 16.7006 | 8200 | 0.3824 | 0.3363 | 0.3864 | 0.3554 | -1.0 | -1.0 | 0.3363 | 0.4302 | 0.4862 | 0.569 | -1.0 | -1.0 | 0.569 | 0.4107 | 0.5635 | 0.0244 | 0.4483 | 0.5738 | 0.6951 |
| 0.2858 | 16.8024 | 8250 | 0.3763 | 0.333 | 0.3824 | 0.3473 | -1.0 | -1.0 | 0.3332 | 0.4283 | 0.4733 | 0.5618 | -1.0 | -1.0 | 0.5618 | 0.4072 | 0.554 | 0.0287 | 0.4414 | 0.5629 | 0.6902 |
| 0.4264 | 16.9043 | 8300 | 0.3812 | 0.3304 | 0.3742 | 0.3496 | -1.0 | -1.0 | 0.3311 | 0.4134 | 0.4662 | 0.5397 | -1.0 | -1.0 | 0.5397 | 0.4157 | 0.5556 | 0.0241 | 0.3931 | 0.5515 | 0.6706 |
| 0.3899 | 17.0061 | 8350 | 0.3761 | 0.3397 | 0.3917 | 0.354 | -1.0 | -1.0 | 0.3406 | 0.4323 | 0.4779 | 0.5549 | -1.0 | -1.0 | 0.5549 | 0.4234 | 0.5714 | 0.0333 | 0.4138 | 0.5625 | 0.6794 |
| 0.5517 | 17.1079 | 8400 | 0.3705 | 0.3284 | 0.3807 | 0.343 | -1.0 | -1.0 | 0.329 | 0.4243 | 0.4733 | 0.5515 | -1.0 | -1.0 | 0.5515 | 0.4095 | 0.5524 | 0.0308 | 0.4138 | 0.5448 | 0.6882 |
| 0.544 | 17.2098 | 8450 | 0.3792 | 0.336 | 0.3857 | 0.3524 | -1.0 | -1.0 | 0.3368 | 0.4312 | 0.4852 | 0.5726 | -1.0 | -1.0 | 0.5726 | 0.4437 | 0.6127 | 0.0233 | 0.4414 | 0.541 | 0.6637 |
| 0.3121 | 17.3116 | 8500 | 0.3753 | 0.332 | 0.3765 | 0.3514 | -1.0 | -1.0 | 0.3329 | 0.4281 | 0.4854 | 0.5728 | -1.0 | -1.0 | 0.5728 | 0.4091 | 0.5524 | 0.0207 | 0.4621 | 0.5662 | 0.7039 |
| 0.3784 | 17.4134 | 8550 | 0.3753 | 0.3253 | 0.3734 | 0.346 | -1.0 | -1.0 | 0.3262 | 0.4168 | 0.4705 | 0.5613 | -1.0 | -1.0 | 0.5613 | 0.4143 | 0.5524 | 0.0272 | 0.4552 | 0.5346 | 0.6765 |
| 0.5904 | 17.5153 | 8600 | 0.3805 | 0.3321 | 0.3801 | 0.3479 | -1.0 | -1.0 | 0.333 | 0.4223 | 0.4804 | 0.57 | -1.0 | -1.0 | 0.57 | 0.4165 | 0.5524 | 0.0239 | 0.4655 | 0.5559 | 0.6922 |
| 0.9018 | 17.6171 | 8650 | 0.3653 | 0.3388 | 0.383 | 0.356 | -1.0 | -1.0 | 0.3396 | 0.4253 | 0.4848 | 0.563 | -1.0 | -1.0 | 0.563 | 0.4224 | 0.554 | 0.0234 | 0.431 | 0.5708 | 0.7039 |
| 0.5026 | 17.7189 | 8700 | 0.3699 | 0.3383 | 0.39 | 0.3527 | -1.0 | -1.0 | 0.3388 | 0.4107 | 0.4615 | 0.5512 | -1.0 | -1.0 | 0.5512 | 0.418 | 0.5397 | 0.0297 | 0.4276 | 0.5671 | 0.6863 |
| 0.4129 | 17.8208 | 8750 | 0.3559 | 0.3404 | 0.3899 | 0.3565 | -1.0 | -1.0 | 0.3407 | 0.417 | 0.4661 | 0.5523 | -1.0 | -1.0 | 0.5523 | 0.4173 | 0.5397 | 0.0274 | 0.4241 | 0.5766 | 0.6931 |
| 1.2525 | 17.9226 | 8800 | 0.3814 | 0.344 | 0.3841 | 0.364 | -1.0 | -1.0 | 0.344 | 0.4183 | 0.4756 | 0.5572 | -1.0 | -1.0 | 0.5572 | 0.4258 | 0.554 | 0.018 | 0.4069 | 0.5882 | 0.7108 |
| 0.8418 | 18.0244 | 8850 | 0.3730 | 0.3514 | 0.4029 | 0.3701 | -1.0 | -1.0 | 0.3514 | 0.4266 | 0.4815 | 0.5654 | -1.0 | -1.0 | 0.5654 | 0.4392 | 0.5651 | 0.0306 | 0.4241 | 0.5845 | 0.7069 |
| 0.495 | 18.1263 | 8900 | 0.3763 | 0.3444 | 0.3905 | 0.3612 | -1.0 | -1.0 | 0.3444 | 0.4186 | 0.4725 | 0.5656 | -1.0 | -1.0 | 0.5656 | 0.4318 | 0.554 | 0.0277 | 0.4517 | 0.5736 | 0.6912 |
| 0.36 | 18.2281 | 8950 | 0.3530 | 0.3521 | 0.4019 | 0.368 | -1.0 | -1.0 | 0.3521 | 0.4288 | 0.482 | 0.5705 | -1.0 | -1.0 | 0.5705 | 0.4369 | 0.5683 | 0.0334 | 0.4414 | 0.5861 | 0.702 |
| 0.4354 | 18.3299 | 9000 | 0.3601 | 0.3503 | 0.4012 | 0.3664 | -1.0 | -1.0 | 0.3503 | 0.4312 | 0.4799 | 0.5787 | -1.0 | -1.0 | 0.5787 | 0.4436 | 0.581 | 0.0306 | 0.4621 | 0.5767 | 0.6931 |
| 0.5433 | 18.4318 | 9050 | 0.3590 | 0.3488 | 0.3922 | 0.3697 | -1.0 | -1.0 | 0.349 | 0.4301 | 0.4779 | 0.6067 | -1.0 | -1.0 | 0.6067 | 0.4526 | 0.5968 | 0.0203 | 0.5379 | 0.5736 | 0.6853 |
| 0.6011 | 18.5336 | 9100 | 0.3560 | 0.3511 | 0.4051 | 0.3698 | -1.0 | -1.0 | 0.3515 | 0.4314 | 0.4811 | 0.6133 | -1.0 | -1.0 | 0.6133 | 0.4526 | 0.5968 | 0.0289 | 0.5586 | 0.5717 | 0.6843 |
| 0.2583 | 18.6354 | 9150 | 0.3576 | 0.3511 | 0.4031 | 0.3684 | -1.0 | -1.0 | 0.3514 | 0.4279 | 0.4781 | 0.6068 | -1.0 | -1.0 | 0.6068 | 0.4429 | 0.581 | 0.031 | 0.5483 | 0.5794 | 0.6912 |
| 0.5591 | 18.7373 | 9200 | 0.3600 | 0.3502 | 0.4005 | 0.3676 | -1.0 | -1.0 | 0.3503 | 0.4289 | 0.481 | 0.6006 | -1.0 | -1.0 | 0.6006 | 0.4438 | 0.581 | 0.0291 | 0.5345 | 0.5778 | 0.6863 |
| 0.4953 | 18.8391 | 9250 | 0.3594 | 0.3566 | 0.4109 | 0.3753 | -1.0 | -1.0 | 0.3566 | 0.4336 | 0.4857 | 0.6063 | -1.0 | -1.0 | 0.6063 | 0.459 | 0.5968 | 0.0304 | 0.531 | 0.5804 | 0.6912 |
| 0.4214 | 18.9409 | 9300 | 0.3589 | 0.3629 | 0.4123 | 0.3825 | -1.0 | -1.0 | 0.363 | 0.4404 | 0.4896 | 0.616 | -1.0 | -1.0 | 0.616 | 0.4711 | 0.6111 | 0.0314 | 0.5379 | 0.5863 | 0.699 |
| 0.3492 | 19.0428 | 9350 | 0.3619 | 0.3687 | 0.4228 | 0.385 | -1.0 | -1.0 | 0.3687 | 0.4475 | 0.5016 | 0.6258 | -1.0 | -1.0 | 0.6258 | 0.4855 | 0.6286 | 0.0326 | 0.5448 | 0.588 | 0.7039 |
| 0.2945 | 19.1446 | 9400 | 0.3604 | 0.3687 | 0.4216 | 0.3893 | -1.0 | -1.0 | 0.3687 | 0.4473 | 0.5009 | 0.6273 | -1.0 | -1.0 | 0.6273 | 0.4842 | 0.6238 | 0.0331 | 0.5552 | 0.5886 | 0.7029 |
| 0.4994 | 19.2464 | 9450 | 0.3693 | 0.3645 | 0.4175 | 0.3822 | -1.0 | -1.0 | 0.3645 | 0.4383 | 0.4914 | 0.596 | -1.0 | -1.0 | 0.596 | 0.4677 | 0.5968 | 0.0347 | 0.4862 | 0.5911 | 0.7049 |
| 0.6958 | 19.3483 | 9500 | 0.3723 | 0.367 | 0.4224 | 0.3906 | -1.0 | -1.0 | 0.367 | 0.4442 | 0.4995 | 0.6076 | -1.0 | -1.0 | 0.6076 | 0.475 | 0.6095 | 0.0348 | 0.5034 | 0.5913 | 0.7098 |
| 1.15 | 19.4501 | 9550 | 0.3687 | 0.3686 | 0.424 | 0.393 | -1.0 | -1.0 | 0.3686 | 0.446 | 0.5074 | 0.5925 | -1.0 | -1.0 | 0.5925 | 0.4795 | 0.6095 | 0.0256 | 0.4483 | 0.6008 | 0.7196 |
| 0.2574 | 19.5519 | 9600 | 0.3728 | 0.3677 | 0.4216 | 0.3886 | -1.0 | -1.0 | 0.3677 | 0.4432 | 0.5001 | 0.5851 | -1.0 | -1.0 | 0.5851 | 0.4751 | 0.5984 | 0.0244 | 0.4345 | 0.6036 | 0.7225 |
| 0.6085 | 19.6538 | 9650 | 0.3694 | 0.37 | 0.4247 | 0.3921 | -1.0 | -1.0 | 0.37 | 0.4478 | 0.5026 | 0.5923 | -1.0 | -1.0 | 0.5923 | 0.4797 | 0.6095 | 0.0263 | 0.4448 | 0.6041 | 0.7225 |
| 0.5051 | 19.7556 | 9700 | 0.3746 | 0.3675 | 0.4224 | 0.3897 | -1.0 | -1.0 | 0.3676 | 0.4438 | 0.4982 | 0.5856 | -1.0 | -1.0 | 0.5856 | 0.4792 | 0.6095 | 0.0257 | 0.4345 | 0.5976 | 0.7127 |
| 0.9448 | 19.8574 | 9750 | 0.3722 | 0.3666 | 0.4219 | 0.3883 | -1.0 | -1.0 | 0.3666 | 0.4438 | 0.5005 | 0.5948 | -1.0 | -1.0 | 0.5948 | 0.478 | 0.6095 | 0.0249 | 0.4621 | 0.5968 | 0.7127 |
| 0.4886 | 19.9593 | 9800 | 0.3704 | 0.3673 | 0.4248 | 0.3885 | -1.0 | -1.0 | 0.3673 | 0.4389 | 0.5011 | 0.6011 | -1.0 | -1.0 | 0.6011 | 0.4772 | 0.6111 | 0.0256 | 0.4793 | 0.599 | 0.7127 |
| 0.1613 | 20.0611 | 9850 | 0.3702 | 0.3704 | 0.4271 | 0.3914 | -1.0 | -1.0 | 0.3704 | 0.4433 | 0.5049 | 0.5934 | -1.0 | -1.0 | 0.5934 | 0.478 | 0.6095 | 0.0264 | 0.4483 | 0.6067 | 0.7225 |
| 0.3438 | 20.1629 | 9900 | 0.3700 | 0.3701 | 0.428 | 0.3914 | -1.0 | -1.0 | 0.3701 | 0.4419 | 0.5035 | 0.5931 | -1.0 | -1.0 | 0.5931 | 0.4787 | 0.6095 | 0.0258 | 0.4483 | 0.6057 | 0.7216 |
| 0.6573 | 20.2648 | 9950 | 0.3702 | 0.37 | 0.4245 | 0.3912 | -1.0 | -1.0 | 0.37 | 0.4491 | 0.5049 | 0.5934 | -1.0 | -1.0 | 0.5934 | 0.4783 | 0.6095 | 0.0253 | 0.4483 | 0.6063 | 0.7225 |
| 0.5515 | 20.3666 | 10000 | 0.3703 | 0.3703 | 0.4284 | 0.3912 | -1.0 | -1.0 | 0.3703 | 0.4491 | 0.5038 | 0.5934 | -1.0 | -1.0 | 0.5934 | 0.4782 | 0.6095 | 0.0261 | 0.4483 | 0.6064 | 0.7225 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"background",
"metal",
"paper",
"plastic"
] |
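These four classes are what the detection head predicts; in a `transformers` object-detection config they are typically wired up as `id2label`/`label2id` mappings. A minimal sketch (variable names are illustrative, not taken from the card):

```python
# Build the id2label / label2id mappings that an object-detection
# config typically carries for this label set.
labels = ["background", "metal", "paper", "plastic"]

id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}

print(id2label[1])        # metal
print(label2id["paper"])  # 2
```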
gorgor1000/detr-resnet-50-finetuned-fashionpedia |
# detr-resnet-50-finetuned-fashionpedia
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2560
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
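With the linear scheduler and zero warmup steps (the `Trainer` default), the learning rate decays linearly from 1e-4 to zero over training; with 500 steps per epoch and 10 epochs that is 5000 steps. A minimal sketch of the decay under those assumptions (not the exact `get_linear_schedule_with_warmup` implementation):

```python
# Linear LR decay from the initial learning rate to zero,
# assuming zero warmup steps and 5000 total optimizer steps.
def linear_lr(step, base_lr=1e-4, total_steps=5000):
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # 0.0001 at the start
print(linear_lr(2500))  # 5e-05 halfway through
print(linear_lr(5000))  # 0.0 at the end
```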
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.9388 | 1.0 | 500 | 1.6931 |
| 1.6687 | 2.0 | 1000 | 1.6105 |
| 1.6005 | 3.0 | 1500 | 1.4854 |
| 1.4927 | 4.0 | 2000 | 1.4709 |
| 1.433 | 5.0 | 2500 | 1.3741 |
| 1.3916 | 6.0 | 3000 | 1.3862 |
| 1.3196 | 7.0 | 3500 | 1.3261 |
| 1.2961 | 8.0 | 4000 | 1.2754 |
| 1.25 | 9.0 | 4500 | 1.2689 |
| 1.2318 | 10.0 | 5000 | 1.2476 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"accessories",
"bags",
"clothing",
"shoes"
] |
dagarcsot/yolo_finetuned_fruits |
# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7771
- Map: 0.5882
- Map 50: 0.8376
- Map 75: 0.6723
- Map Small: -1.0
- Map Medium: 0.6116
- Map Large: 0.5966
- Mar 1: 0.4201
- Mar 10: 0.7111
- Mar 100: 0.7683
- Mar Small: -1.0
- Mar Medium: 0.7071
- Mar Large: 0.7767
- Map Banana: 0.4758
- Mar 100 Banana: 0.7425
- Map Orange: 0.6281
- Mar 100 Orange: 0.8024
- Map Apple: 0.6608
- Mar 100 Apple: 0.76
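Map 50 and Map 75 above are average precision at IoU thresholds of 0.50 and 0.75. A minimal box-IoU sketch in pure Python, as an illustrative helper rather than the evaluation code actually used here:

```python
def box_iou(a, b):
    """IoU of two boxes given as (x_min, y_min, x_max, y_max) in pixels."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)
```

A prediction counts as a true positive at the 0.50 threshold when its IoU with a ground-truth box of the same class is at least 0.5.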
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
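With the `cosine` scheduler and 30 epochs of 60 steps each (1,800 steps total, per the table below), the learning rate follows a half-cosine from 5e-5 down to 0. A minimal sketch; the function name and zero-warmup assumption are illustrative:

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-5, warmup_steps=0):
    """Linear warmup then cosine decay to 0, mirroring HF's `cosine` schedule."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

At the halfway point the rate is half the base value, and it flattens out near the end of training.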
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 1.9700 | 0.0096 | 0.0268 | 0.0038 | -1.0 | 0.0155 | 0.0132 | 0.078 | 0.2026 | 0.3463 | -1.0 | 0.2343 | 0.3714 | 0.0132 | 0.2975 | 0.0096 | 0.3786 | 0.0058 | 0.3629 |
| No log | 2.0 | 120 | 1.6517 | 0.0553 | 0.1516 | 0.0414 | -1.0 | 0.111 | 0.0556 | 0.1359 | 0.2777 | 0.4308 | -1.0 | 0.3186 | 0.4454 | 0.0647 | 0.5175 | 0.0406 | 0.1976 | 0.0608 | 0.5771 |
| No log | 3.0 | 180 | 1.2778 | 0.1262 | 0.2428 | 0.1168 | -1.0 | 0.1877 | 0.1303 | 0.2519 | 0.5055 | 0.6286 | -1.0 | 0.5814 | 0.634 | 0.1024 | 0.6225 | 0.0983 | 0.4976 | 0.1778 | 0.7657 |
| No log | 4.0 | 240 | 1.0948 | 0.2377 | 0.4041 | 0.2352 | -1.0 | 0.4084 | 0.2402 | 0.3266 | 0.5759 | 0.7115 | -1.0 | 0.6371 | 0.7237 | 0.182 | 0.695 | 0.1717 | 0.7024 | 0.3596 | 0.7371 |
| No log | 5.0 | 300 | 1.0477 | 0.2746 | 0.4623 | 0.2895 | -1.0 | 0.2475 | 0.3142 | 0.3285 | 0.609 | 0.7315 | -1.0 | 0.6257 | 0.7458 | 0.221 | 0.7075 | 0.1828 | 0.7214 | 0.42 | 0.7657 |
| No log | 6.0 | 360 | 1.0028 | 0.3661 | 0.6059 | 0.4064 | -1.0 | 0.4221 | 0.3982 | 0.3651 | 0.6231 | 0.7251 | -1.0 | 0.6229 | 0.7379 | 0.2698 | 0.7 | 0.3568 | 0.7238 | 0.4716 | 0.7514 |
| No log | 7.0 | 420 | 0.9809 | 0.3532 | 0.5656 | 0.4002 | -1.0 | 0.4557 | 0.3731 | 0.3569 | 0.6472 | 0.7488 | -1.0 | 0.6829 | 0.7591 | 0.3239 | 0.715 | 0.3333 | 0.7714 | 0.4025 | 0.76 |
| No log | 8.0 | 480 | 0.9679 | 0.4348 | 0.6762 | 0.4868 | -1.0 | 0.5782 | 0.4375 | 0.3547 | 0.6527 | 0.7254 | -1.0 | 0.7343 | 0.7269 | 0.2877 | 0.68 | 0.4769 | 0.7619 | 0.5397 | 0.7343 |
| 1.2471 | 9.0 | 540 | 0.9173 | 0.4434 | 0.7005 | 0.5049 | -1.0 | 0.5147 | 0.4475 | 0.3646 | 0.6443 | 0.7348 | -1.0 | 0.6771 | 0.7408 | 0.3288 | 0.7225 | 0.4683 | 0.7619 | 0.5332 | 0.72 |
| 1.2471 | 10.0 | 600 | 0.8875 | 0.4834 | 0.7654 | 0.5497 | -1.0 | 0.5051 | 0.4991 | 0.369 | 0.6925 | 0.7589 | -1.0 | 0.6957 | 0.7689 | 0.3668 | 0.73 | 0.497 | 0.7952 | 0.5864 | 0.7514 |
| 1.2471 | 11.0 | 660 | 0.9261 | 0.4803 | 0.7507 | 0.5799 | -1.0 | 0.4907 | 0.4971 | 0.3818 | 0.6745 | 0.7525 | -1.0 | 0.6957 | 0.7629 | 0.3567 | 0.7175 | 0.5014 | 0.7714 | 0.5828 | 0.7686 |
| 1.2471 | 12.0 | 720 | 0.8520 | 0.4974 | 0.7451 | 0.5567 | -1.0 | 0.6198 | 0.4976 | 0.3946 | 0.691 | 0.7489 | -1.0 | 0.7157 | 0.7532 | 0.3709 | 0.7025 | 0.5588 | 0.7929 | 0.5626 | 0.7514 |
| 1.2471 | 13.0 | 780 | 0.8630 | 0.4998 | 0.7799 | 0.5682 | -1.0 | 0.546 | 0.5213 | 0.3848 | 0.6848 | 0.7519 | -1.0 | 0.6443 | 0.768 | 0.4078 | 0.7575 | 0.5624 | 0.7952 | 0.5292 | 0.7029 |
| 1.2471 | 14.0 | 840 | 0.8469 | 0.5071 | 0.776 | 0.5801 | -1.0 | 0.6247 | 0.5104 | 0.3913 | 0.7049 | 0.7579 | -1.0 | 0.6971 | 0.7682 | 0.3635 | 0.71 | 0.5271 | 0.781 | 0.6306 | 0.7829 |
| 1.2471 | 15.0 | 900 | 0.7995 | 0.5311 | 0.8059 | 0.5856 | -1.0 | 0.6156 | 0.5327 | 0.3958 | 0.7068 | 0.7576 | -1.0 | 0.7429 | 0.7592 | 0.3951 | 0.7175 | 0.5739 | 0.8095 | 0.6244 | 0.7457 |
| 1.2471 | 16.0 | 960 | 0.8150 | 0.5342 | 0.8046 | 0.6189 | -1.0 | 0.6285 | 0.5346 | 0.3974 | 0.7012 | 0.7505 | -1.0 | 0.7043 | 0.7556 | 0.4157 | 0.73 | 0.584 | 0.7929 | 0.603 | 0.7286 |
| 0.7135 | 17.0 | 1020 | 0.7887 | 0.5532 | 0.8155 | 0.6643 | -1.0 | 0.5982 | 0.5619 | 0.4184 | 0.7122 | 0.7656 | -1.0 | 0.6929 | 0.7758 | 0.4475 | 0.7425 | 0.5754 | 0.8 | 0.6365 | 0.7543 |
| 0.7135 | 18.0 | 1080 | 0.7961 | 0.5545 | 0.8237 | 0.6426 | -1.0 | 0.6024 | 0.5606 | 0.4042 | 0.7056 | 0.7583 | -1.0 | 0.6971 | 0.7648 | 0.4583 | 0.7425 | 0.6036 | 0.8095 | 0.6014 | 0.7229 |
| 0.7135 | 19.0 | 1140 | 0.7936 | 0.5726 | 0.8321 | 0.6599 | -1.0 | 0.6004 | 0.5838 | 0.4203 | 0.7209 | 0.7776 | -1.0 | 0.7071 | 0.7878 | 0.4648 | 0.75 | 0.5835 | 0.8 | 0.6695 | 0.7829 |
| 0.7135 | 20.0 | 1200 | 0.7948 | 0.5543 | 0.8208 | 0.638 | -1.0 | 0.5928 | 0.5617 | 0.4001 | 0.7032 | 0.7665 | -1.0 | 0.7 | 0.7747 | 0.4439 | 0.7525 | 0.5944 | 0.8071 | 0.6246 | 0.74 |
| 0.7135 | 21.0 | 1260 | 0.7850 | 0.5808 | 0.8357 | 0.6736 | -1.0 | 0.5831 | 0.5941 | 0.4118 | 0.7229 | 0.7766 | -1.0 | 0.7 | 0.7863 | 0.4928 | 0.765 | 0.6112 | 0.8048 | 0.6386 | 0.76 |
| 0.7135 | 22.0 | 1320 | 0.8025 | 0.5813 | 0.8356 | 0.6729 | -1.0 | 0.6177 | 0.5906 | 0.4188 | 0.7138 | 0.771 | -1.0 | 0.6871 | 0.7812 | 0.4719 | 0.755 | 0.6277 | 0.7952 | 0.6442 | 0.7629 |
| 0.7135 | 23.0 | 1380 | 0.7886 | 0.5795 | 0.83 | 0.6743 | -1.0 | 0.5957 | 0.589 | 0.4076 | 0.7065 | 0.7598 | -1.0 | 0.69 | 0.7679 | 0.4784 | 0.75 | 0.624 | 0.7952 | 0.6362 | 0.7343 |
| 0.7135 | 24.0 | 1440 | 0.8081 | 0.5787 | 0.8341 | 0.6563 | -1.0 | 0.5982 | 0.5875 | 0.4117 | 0.7084 | 0.7679 | -1.0 | 0.7114 | 0.7748 | 0.463 | 0.745 | 0.6192 | 0.7929 | 0.6538 | 0.7657 |
| 0.5383 | 25.0 | 1500 | 0.7858 | 0.5865 | 0.8318 | 0.6691 | -1.0 | 0.6285 | 0.5935 | 0.4216 | 0.7144 | 0.7729 | -1.0 | 0.7186 | 0.7792 | 0.473 | 0.75 | 0.624 | 0.8 | 0.6626 | 0.7686 |
| 0.5383 | 26.0 | 1560 | 0.7777 | 0.5935 | 0.8462 | 0.6778 | -1.0 | 0.6176 | 0.6011 | 0.4216 | 0.7151 | 0.7709 | -1.0 | 0.7143 | 0.7784 | 0.4799 | 0.7475 | 0.6363 | 0.8024 | 0.6643 | 0.7629 |
| 0.5383 | 27.0 | 1620 | 0.7821 | 0.5914 | 0.8388 | 0.6746 | -1.0 | 0.6231 | 0.5982 | 0.4209 | 0.7128 | 0.7685 | -1.0 | 0.7043 | 0.7771 | 0.4773 | 0.7375 | 0.6304 | 0.8024 | 0.6665 | 0.7657 |
| 0.5383 | 28.0 | 1680 | 0.7803 | 0.5918 | 0.8401 | 0.6739 | -1.0 | 0.6233 | 0.5987 | 0.4201 | 0.7129 | 0.7684 | -1.0 | 0.7143 | 0.7759 | 0.4768 | 0.74 | 0.6328 | 0.8024 | 0.6658 | 0.7629 |
| 0.5383 | 29.0 | 1740 | 0.7800 | 0.5886 | 0.8382 | 0.6727 | -1.0 | 0.6116 | 0.5971 | 0.4201 | 0.7111 | 0.7683 | -1.0 | 0.7071 | 0.7767 | 0.476 | 0.7425 | 0.629 | 0.8024 | 0.6608 | 0.76 |
| 0.5383 | 30.0 | 1800 | 0.7771 | 0.5882 | 0.8376 | 0.6723 | -1.0 | 0.6116 | 0.5966 | 0.4201 | 0.7111 | 0.7683 | -1.0 | 0.7071 | 0.7767 | 0.4758 | 0.7425 | 0.6281 | 0.8024 | 0.6608 | 0.76 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"banana",
"orange",
"apple"
] |
franciscomj0901/detr-fashion |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
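DETR-family checkpoints like this one predict boxes as `(cx, cy, w, h)` normalized to the image size; downstream code usually converts them to pixel corner coordinates. A minimal conversion sketch (the helper name is illustrative, not part of this model's API):

```python
def cxcywh_to_xyxy(box, img_w, img_h):
    """Convert a normalized (cx, cy, w, h) box to pixel (x_min, y_min, x_max, y_max)."""
    cx, cy, w, h = box  # each in [0, 1]
    return (
        (cx - w / 2) * img_w,
        (cy - h / 2) * img_h,
        (cx + w / 2) * img_w,
        (cy + h / 2) * img_h,
    )
```

In practice the equivalent conversion is handled by the image processor's post-processing step.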
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
franciscomj0901/checkpoints |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# checkpoints
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3614
- Map: 0.0116
- Map 50: 0.0224
- Map 75: 0.0105
- Map Small: 0.0058
- Map Medium: 0.0165
- Map Large: 0.0098
- Mar 1: 0.0323
- Mar 10: 0.0618
- Mar 100: 0.0643
- Mar Small: 0.0215
- Mar Medium: 0.0607
- Mar Large: 0.0811
- Map Shirt, blouse: 0.0
- Mar 100 Shirt, blouse: 0.0
- Map Top, t-shirt, sweatshirt: 0.0234
- Mar 100 Top, t-shirt, sweatshirt: 0.3322
- Map Sweater: 0.0
- Mar 100 Sweater: 0.0
- Map Cardigan: 0.0
- Mar 100 Cardigan: 0.0
- Map Jacket: 0.0
- Mar 100 Jacket: 0.0
- Map Vest: 0.0
- Mar 100 Vest: 0.0
- Map Pants: 0.1092
- Mar 100 Pants: 0.6471
- Map Shorts: 0.0
- Mar 100 Shorts: 0.0
- Map Skirt: 0.0
- Mar 100 Skirt: 0.0
- Map Coat: 0.0
- Mar 100 Coat: 0.0
- Map Dress: 0.0869
- Mar 100 Dress: 0.7291
- Map Jumpsuit: 0.0
- Mar 100 Jumpsuit: 0.0
- Map Cape: 0.0
- Mar 100 Cape: 0.0
- Map Glasses: 0.0
- Mar 100 Glasses: 0.0
- Map Hat: 0.0
- Mar 100 Hat: 0.0
- Map Headband, head covering, hair accessory: 0.0
- Mar 100 Headband, head covering, hair accessory: 0.0
- Map Tie: 0.0
- Mar 100 Tie: 0.0
- Map Glove: 0.0
- Mar 100 Glove: 0.0
- Map Watch: 0.0
- Mar 100 Watch: 0.0
- Map Belt: 0.0069
- Mar 100 Belt: 0.0043
- Map Leg warmer: 0.0
- Mar 100 Leg warmer: 0.0
- Map Tights, stockings: 0.0
- Mar 100 Tights, stockings: 0.0
- Map Sock: 0.0
- Mar 100 Sock: 0.0
- Map Shoe: 0.2046
- Mar 100 Shoe: 0.4928
- Map Bag, wallet: 0.0
- Mar 100 Bag, wallet: 0.0
- Map Scarf: 0.0
- Mar 100 Scarf: 0.0
- Map Umbrella: 0.0
- Mar 100 Umbrella: 0.0
- Map Hood: 0.0
- Mar 100 Hood: 0.0
- Map Collar: 0.0
- Mar 100 Collar: 0.0
- Map Lapel: 0.0
- Mar 100 Lapel: 0.0
- Map Epaulette: 0.0
- Mar 100 Epaulette: 0.0
- Map Sleeve: 0.0724
- Mar 100 Sleeve: 0.4613
- Map Pocket: 0.0001
- Mar 100 Pocket: 0.042
- Map Neckline: 0.0292
- Mar 100 Neckline: 0.2495
- Map Buckle: 0.0
- Mar 100 Buckle: 0.0
- Map Zipper: 0.0
- Mar 100 Zipper: 0.0
- Map Applique: 0.0
- Mar 100 Applique: 0.0
- Map Bead: 0.0
- Mar 100 Bead: 0.0
- Map Bow: 0.0
- Mar 100 Bow: 0.0
- Map Flower: 0.0
- Mar 100 Flower: 0.0
- Map Fringe: 0.0
- Mar 100 Fringe: 0.0
- Map Ribbon: 0.0
- Mar 100 Ribbon: 0.0
- Map Rivet: 0.0
- Mar 100 Rivet: 0.0
- Map Ruffle: 0.0
- Mar 100 Ruffle: 0.0
- Map Sequin: 0.0
- Mar 100 Sequin: 0.0
- Map Tassel: 0.0
- Mar 100 Tassel: 0.0
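The overall Map of 0.0116 is the mean AP over all 46 Fashionpedia categories, most of which score 0.0 at this checkpoint. A quick check against the per-class values listed above:

```python
# Non-zero per-class APs from the list above; the other 38 of the 46
# categories scored 0.0 at this checkpoint.
nonzero_ap = {
    "top, t-shirt, sweatshirt": 0.0234,
    "pants": 0.1092,
    "dress": 0.0869,
    "belt": 0.0069,
    "shoe": 0.2046,
    "sleeve": 0.0724,
    "pocket": 0.0001,
    "neckline": 0.0292,
}
overall_map = sum(nonzero_ap.values()) / 46  # mean over all 46 categories
print(round(overall_map, 4))  # matches the reported Map of 0.0116
```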
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
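The AdamW settings above (betas=(0.9, 0.999), epsilon=1e-08) follow the standard decoupled-weight-decay update. A per-scalar sketch of one step, illustrative rather than the actual `OptimizerNames.ADAMW_TORCH` implementation:

```python
def adamw_step(p, grad, m, v, t, lr=1e-5, b1=0.9, b2=0.999, eps=1e-8, wd=0.0):
    """One AdamW update for a scalar parameter (decoupled weight decay)."""
    m = b1 * m + (1 - b1) * grad          # first-moment EMA
    v = b2 * v + (1 - b2) * grad * grad   # second-moment EMA
    m_hat = m / (1 - b1 ** t)             # bias correction
    v_hat = v / (1 - b2 ** t)
    p = p - lr * (m_hat / (v_hat ** 0.5 + eps) + wd * p)
    return p, m, v
```

With bias correction, the very first step moves the parameter by roughly the full learning rate regardless of the gradient's scale.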
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Shirt, blouse | Mar 100 Shirt, blouse | Map Top, t-shirt, sweatshirt | Mar 100 Top, t-shirt, sweatshirt | Map Sweater | Mar 100 Sweater | Map Cardigan | Mar 100 Cardigan | Map Jacket | Mar 100 Jacket | Map Vest | Mar 100 Vest | Map Pants | Mar 100 Pants | Map Shorts | Mar 100 Shorts | Map Skirt | Mar 100 Skirt | Map Coat | Mar 100 Coat | Map Dress | Mar 100 Dress | Map Jumpsuit | Mar 100 Jumpsuit | Map Cape | Mar 100 Cape | Map Glasses | Mar 100 Glasses | Map Hat | Mar 100 Hat | Map Headband, head covering, hair accessory | Mar 100 Headband, head covering, hair accessory | Map Tie | Mar 100 Tie | Map Glove | Mar 100 Glove | Map Watch | Mar 100 Watch | Map Belt | Mar 100 Belt | Map Leg warmer | Mar 100 Leg warmer | Map Tights, stockings | Mar 100 Tights, stockings | Map Sock | Mar 100 Sock | Map Shoe | Mar 100 Shoe | Map Bag, wallet | Mar 100 Bag, wallet | Map Scarf | Mar 100 Scarf | Map Umbrella | Mar 100 Umbrella | Map Hood | Mar 100 Hood | Map Collar | Mar 100 Collar | Map Lapel | Mar 100 Lapel | Map Epaulette | Mar 100 Epaulette | Map Sleeve | Mar 100 Sleeve | Map Pocket | Mar 100 Pocket | Map Neckline | Mar 100 Neckline | Map Buckle | Mar 100 Buckle | Map Zipper | Mar 100 Zipper | Map Applique | Mar 100 Applique | Map Bead | Mar 100 Bead | Map Bow | Mar 100 Bow | Map Flower | Mar 100 Flower | Map Fringe | Mar 100 Fringe | Map Ribbon | Mar 100 Ribbon | Map Rivet | Mar 100 Rivet | Map Ruffle | Mar 100 Ruffle | Map Sequin | Mar 100 Sequin | Map Tassel | Mar 100 Tassel |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:----------------------------:|:--------------------------------:|:-----------:|:---------------:|:------------:|:----------------:|:----------:|:--------------:|:--------:|:------------:|:---------:|:-------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:-----------:|:---------------:|:-------:|:-----------:|:-------------------------------------------:|:-----------------------------------------------:|:-------:|:-----------:|:---------:|:-------------:|:---------:|:-------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:--------:|:------------:|:---------------:|:-------------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:----------:|:--------------:|:---------:|:-------------:|:-------------:|:-----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:--------:|:------------:|:-------:|:-----------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|
| 4.3049 | 0.0438 | 500 | 3.6778 | 0.0005 | 0.0016 | 0.0001 | 0.001 | 0.0006 | 0.0006 | 0.0013 | 0.0048 | 0.0091 | 0.0056 | 0.0112 | 0.0279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0169 | 0.1377 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0051 | 0.2822 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0072 | 0.0877 | 1000 | 3.2435 | 0.0021 | 0.0056 | 0.001 | 0.0023 | 0.0028 | 0.0005 | 0.0039 | 0.0114 | 0.0155 | 0.0104 | 0.0205 | 0.0298 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.083 | 0.3566 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0154 | 0.3576 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2281 | 0.1315 | 1500 | 3.0550 | 0.0033 | 0.0081 | 0.0021 | 0.0029 | 0.0046 | 0.0002 | 0.0059 | 0.015 | 0.0181 | 0.013 | 0.0235 | 0.0288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1173 | 0.3901 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0267 | 0.3585 | 0.0 | 0.0 | 0.0061 | 0.0772 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2953 | 0.1753 | 2000 | 2.9315 | 0.0034 | 0.0084 | 0.0021 | 0.0024 | 0.0053 | 0.0009 | 0.0076 | 0.0197 | 0.0227 | 0.0161 | 0.0265 | 0.024 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0098 | 0.1102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1012 | 0.4111 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.032 | 0.3735 | 0.0 | 0.0006 | 0.0118 | 0.1481 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5306 | 0.2192 | 2500 | 2.8260 | 0.0042 | 0.0105 | 0.0028 | 0.0028 | 0.0057 | 0.002 | 0.0093 | 0.0241 | 0.0273 | 0.0177 | 0.0284 | 0.0343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0318 | 0.2535 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1067 | 0.4251 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0354 | 0.3962 | 0.0 | 0.0026 | 0.0205 | 0.1784 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8446 | 0.2630 | 3000 | 2.7502 | 0.0051 | 0.0113 | 0.004 | 0.0034 | 0.0058 | 0.003 | 0.012 | 0.0283 | 0.0314 | 0.0161 | 0.0313 | 0.0446 | 0.0 | 0.0 | 0.0002 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0132 | 0.065 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0338 | 0.3398 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1305 | 0.4584 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0385 | 0.4044 | 0.0 | 0.0078 | 0.018 | 0.1688 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3798 | 0.3069 | 3500 | 2.6691 | 0.0065 | 0.0138 | 0.0053 | 0.0031 | 0.0065 | 0.0057 | 0.0185 | 0.0375 | 0.0407 | 0.0181 | 0.0334 | 0.056 | 0.0 | 0.0 | 0.001 | 0.0063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0486 | 0.2917 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0645 | 0.5028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1233 | 0.4518 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0419 | 0.3999 | 0.0 | 0.0108 | 0.0185 | 0.2072 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8111 | 0.3507 | 4000 | 2.6270 | 0.0074 | 0.0159 | 0.0062 | 0.0037 | 0.0075 | 0.0065 | 0.0219 | 0.0415 | 0.0443 | 0.0168 | 0.0354 | 0.0607 | 0.0 | 0.0 | 0.0086 | 0.0486 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0581 | 0.3519 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0753 | 0.5732 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1428 | 0.4334 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0409 | 0.4027 | 0.0 | 0.0216 | 0.0165 | 0.2056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3798 | 0.3945 | 4500 | 2.5748 | 0.0076 | 0.0157 | 0.0066 | 0.0041 | 0.01 | 0.0061 | 0.0233 | 0.0486 | 0.0515 | 0.018 | 0.042 | 0.074 | 0.0 | 0.0 | 0.013 | 0.1139 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0528 | 0.4863 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0582 | 0.6506 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1605 | 0.4713 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0394 | 0.4381 | 0.0 | 0.0184 | 0.0258 | 0.1922 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3017 | 0.4384 | 5000 | 2.5355 | 0.0079 | 0.0162 | 0.007 | 0.0044 | 0.0113 | 0.0063 | 0.0259 | 0.0521 | 0.0553 | 0.0198 | 0.0462 | 0.0793 | 0.0 | 0.0 | 0.0149 | 0.1621 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0607 | 0.5462 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0504 | 0.6648 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1652 | 0.4824 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0485 | 0.4417 | 0.0001 | 0.0277 | 0.0251 | 0.2184 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1818 | 0.4822 | 5500 | 2.5019 | 0.0088 | 0.0176 | 0.0077 | 0.005 | 0.0118 | 0.0071 | 0.0267 | 0.0547 | 0.0575 | 0.0197 | 0.0497 | 0.0811 | 0.0 | 0.0 | 0.0231 | 0.1979 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0749 | 0.5796 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0573 | 0.6902 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1711 | 0.4896 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0502 | 0.4463 | 0.0001 | 0.0307 | 0.026 | 0.2118 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7264 | 0.5260 | 6000 | 2.4801 | 0.0088 | 0.0182 | 0.0074 | 0.0046 | 0.013 | 0.0077 | 0.0271 | 0.0559 | 0.0587 | 0.0199 | 0.0517 | 0.0756 | 0.0 | 0.0 | 0.0226 | 0.256 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0777 | 0.5854 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.066 | 0.6799 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1535 | 0.459 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0612 | 0.4444 | 0.0001 | 0.0318 | 0.0248 | 0.2458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2463 | 0.5699 | 6500 | 2.4314 | 0.0102 | 0.0202 | 0.0093 | 0.0053 | 0.014 | 0.0087 | 0.0301 | 0.0584 | 0.061 | 0.02 | 0.0575 | 0.0802 | 0.0 | 0.0 | 0.0268 | 0.2933 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0838 | 0.6035 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0839 | 0.7102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1921 | 0.476 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0588 | 0.4496 | 0.0001 | 0.0387 | 0.0258 | 0.2329 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9215 | 0.6137 | 7000 | 2.4212 | 0.0103 | 0.0201 | 0.0094 | 0.0054 | 0.0144 | 0.0088 | 0.03 | 0.0598 | 0.0624 | 0.0211 | 0.058 | 0.0812 | 0.0 | 0.0 | 0.024 | 0.2928 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0984 | 0.6293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0724 | 0.7242 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1887 | 0.4963 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.063 | 0.4455 | 0.0001 | 0.0428 | 0.0268 | 0.2391 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9853 | 0.6575 | 7500 | 2.4040 | 0.0109 | 0.0209 | 0.01 | 0.0056 | 0.0152 | 0.0091 | 0.03 | 0.0601 | 0.0629 | 0.0216 | 0.0599 | 0.0792 | 0.0 | 0.0 | 0.0232 | 0.3118 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1049 | 0.6156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0731 | 0.7404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2057 | 0.4923 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0663 | 0.4531 | 0.0001 | 0.0368 | 0.0264 | 0.2431 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7586 | 0.7014 | 8000 | 2.4094 | 0.0106 | 0.0212 | 0.0097 | 0.0054 | 0.0157 | 0.0089 | 0.0304 | 0.0591 | 0.0619 | 0.0216 | 0.0588 | 0.0788 | 0.0 | 0.0 | 0.024 | 0.3101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1042 | 0.6143 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0751 | 0.7167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1962 | 0.4804 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0651 | 0.4502 | 0.0001 | 0.0385 | 0.024 | 0.2379 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5399 | 0.7452 | 8500 | 2.3802 | 0.0106 | 0.0207 | 0.0097 | 0.0055 | 0.0153 | 0.0089 | 0.0311 | 0.061 | 0.0633 | 0.0212 | 0.0594 | 0.0808 | 0.0 | 0.0 | 0.0236 | 0.344 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0952 | 0.6468 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0779 | 0.7134 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1989 | 0.4838 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.066 | 0.4451 | 0.0001 | 0.0422 | 0.0272 | 0.2379 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2245 | 0.7891 | 9000 | 2.3654 | 0.0112 | 0.0216 | 0.0102 | 0.0056 | 0.0155 | 0.0095 | 0.0318 | 0.0613 | 0.064 | 0.0217 | 0.0611 | 0.0808 | 0.0 | 0.0 | 0.0241 | 0.3248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1051 | 0.6465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0847 | 0.7272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.201 | 0.4872 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.071 | 0.458 | 0.0001 | 0.0433 | 0.029 | 0.2575 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3969 | 0.8329 | 9500 | 2.3674 | 0.0114 | 0.0221 | 0.0104 | 0.0057 | 0.0163 | 0.0095 | 0.0316 | 0.0615 | 0.0641 | 0.0212 | 0.0604 | 0.0804 | 0.0 | 0.0 | 0.0236 | 0.3309 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1071 | 0.6446 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0835 | 0.7287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0069 | 0.0043 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2008 | 0.4869 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0716 | 0.4598 | 0.0001 | 0.0418 | 0.0295 | 0.2494 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8285 | 0.8767 | 10000 | 2.3614 | 0.0116 | 0.0224 | 0.0105 | 0.0058 | 0.0165 | 0.0098 | 0.0323 | 0.0618 | 0.0643 | 0.0215 | 0.0607 | 0.0811 | 0.0 | 0.0 | 0.0234 | 0.3322 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1092 | 0.6471 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0869 | 0.7291 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0069 | 0.0043 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2046 | 0.4928 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0724 | 0.4613 | 0.0001 | 0.042 | 0.0292 | 0.2495 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
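The ordered label list above is what the Trainer turns into the model config's `id2label`/`label2id` mappings. A minimal sketch of that construction, assuming zero-based indexing in list order (only the first three of the 46 Fashionpedia labels are shown here for brevity):

```python
# Build the index <-> name mappings from the ordered label list.
# Truncated to three entries for brevity; the full 46-label list is above.
labels = ["shirt, blouse", "top, t-shirt, sweatshirt", "sweater"]

id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in id2label.items()}

print(id2label[0])          # -> shirt, blouse
print(label2id["sweater"])  # -> 2
```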
mahernto/yolo_wgisd |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolo_wgisd
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3685
- Map: 0.2438
- Map 50: 0.5928
- Map 75: 0.1625
- Map Small: -1.0
- Map Medium: 0.1357
- Map Large: 0.2539
- Mar 1: 0.0406
- Mar 10: 0.2643
- Mar 100: 0.4410
- Mar Small: -1.0
- Mar Medium: 0.2096
- Mar Large: 0.4572
- Map Cdy: 0.2934
- Mar 100 Cdy: 0.4522
- Map Cfr: 0.2654
- Mar 100 Cfr: 0.4522
- Map Csv: 0.2146
- Mar 100 Csv: 0.4441
- Map Svb: 0.2115
- Mar 100 Svb: 0.4011
- Map Syh: 0.2341
- Mar 100 Syh: 0.4553
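Map 50 and Map 75 above are average precision at IoU thresholds of 0.5 and 0.75; the large gap between them (0.5928 vs. 0.1625) suggests detections that find objects but localize them loosely. A minimal IoU sketch for corner-format boxes (an illustration of the metric, not the evaluator used to produce the numbers above):

```python
def box_iou(a, b):
    """Intersection-over-union for boxes in (x1, y1, x2, y2) corner format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A prediction shifted by half the box width in each axis:
print(box_iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7 ~= 0.143, below both thresholds
```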
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
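With `lr_scheduler_type: cosine` and no warmup listed, the learning rate decays from 5e-05 toward zero over the 1830 optimizer steps (30 epochs × 61 steps per epoch, per the results table). A sketch of that decay rule, mirroring the shape of a warmup-free cosine schedule — an illustration under the assumption of zero warmup steps, not a dump of the exact Trainer internals:

```python
import math

BASE_LR = 5e-5
TOTAL_STEPS = 1830  # 30 epochs x 61 steps per epoch, per the results table

def cosine_lr(step, total_steps=TOTAL_STEPS, base_lr=BASE_LR):
    # Cosine decay from base_lr at step 0 down to 0 at the final step.
    progress = min(step / total_steps, 1.0)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0))     # 5e-05
print(cosine_lr(915))   # half-way: ~2.5e-05
print(cosine_lr(1830))  # 0.0
```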
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Cdy | Mar 100 Cdy | Map Cfr | Mar 100 Cfr | Map Csv | Mar 100 Csv | Map Svb | Mar 100 Svb | Map Syh | Mar 100 Syh |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------:|:-----------:|:-------:|:-----------:|:-------:|:-----------:|:-------:|:-----------:|:-------:|:-----------:|
| No log | 1.0 | 61 | 2.7279 | 0.0157 | 0.0522 | 0.0051 | -1.0 | 0.0062 | 0.0169 | 0.006 | 0.0306 | 0.0916 | -1.0 | 0.0108 | 0.1001 | 0.0 | 0.0 | 0.0257 | 0.0975 | 0.0139 | 0.1225 | 0.0388 | 0.2378 | 0.0 | 0.0 |
| No log | 2.0 | 122 | 2.4121 | 0.0315 | 0.1117 | 0.0136 | -1.0 | 0.0044 | 0.0341 | 0.0144 | 0.0623 | 0.1865 | -1.0 | 0.0241 | 0.1966 | 0.0319 | 0.155 | 0.0744 | 0.3283 | 0.0329 | 0.3081 | 0.0120 | 0.0891 | 0.0064 | 0.0521 |
| No log | 3.0 | 183 | 2.0448 | 0.0595 | 0.1858 | 0.0184 | -1.0 | 0.0137 | 0.0638 | 0.0212 | 0.1135 | 0.2843 | -1.0 | 0.0497 | 0.2991 | 0.0804 | 0.2094 | 0.0584 | 0.2811 | 0.0414 | 0.3297 | 0.0580 | 0.2512 | 0.0593 | 0.35 |
| No log | 4.0 | 244 | 1.9249 | 0.0922 | 0.2703 | 0.0375 | -1.0 | 0.0254 | 0.0978 | 0.0291 | 0.1454 | 0.3272 | -1.0 | 0.0653 | 0.3438 | 0.1099 | 0.2717 | 0.1391 | 0.3805 | 0.0972 | 0.3811 | 0.0614 | 0.2770 | 0.0532 | 0.3255 |
| No log | 5.0 | 305 | 1.7604 | 0.1099 | 0.3298 | 0.0457 | -1.0 | 0.0354 | 0.1162 | 0.0249 | 0.1629 | 0.325 | -1.0 | 0.0989 | 0.3396 | 0.1542 | 0.3356 | 0.1203 | 0.3101 | 0.0814 | 0.3108 | 0.0852 | 0.2664 | 0.1081 | 0.4021 |
| No log | 6.0 | 366 | 1.7670 | 0.1354 | 0.3843 | 0.0588 | -1.0 | 0.0640 | 0.1434 | 0.0328 | 0.1796 | 0.3613 | -1.0 | 0.0996 | 0.3796 | 0.1785 | 0.3867 | 0.1670 | 0.3950 | 0.1139 | 0.3766 | 0.0867 | 0.2428 | 0.1310 | 0.4053 |
| No log | 7.0 | 427 | 1.6559 | 0.1525 | 0.4261 | 0.0751 | -1.0 | 0.0557 | 0.1614 | 0.0333 | 0.1969 | 0.3761 | -1.0 | 0.1459 | 0.3934 | 0.2127 | 0.4122 | 0.1674 | 0.3975 | 0.1435 | 0.4108 | 0.1020 | 0.2438 | 0.1370 | 0.4160 |
| No log | 8.0 | 488 | 1.6087 | 0.1641 | 0.4603 | 0.0686 | -1.0 | 0.0779 | 0.1715 | 0.0354 | 0.2009 | 0.3683 | -1.0 | 0.1294 | 0.3837 | 0.2265 | 0.4267 | 0.1834 | 0.3805 | 0.1368 | 0.3694 | 0.1260 | 0.2735 | 0.1476 | 0.3915 |
| 2.0067 | 9.0 | 549 | 1.5575 | 0.1714 | 0.4592 | 0.0873 | -1.0 | 0.0660 | 0.1798 | 0.0304 | 0.2140 | 0.3842 | -1.0 | 0.1357 | 0.3987 | 0.1833 | 0.3206 | 0.2131 | 0.4019 | 0.1741 | 0.4216 | 0.1505 | 0.3272 | 0.1358 | 0.45 |
| 2.0067 | 10.0 | 610 | 1.5390 | 0.1801 | 0.4554 | 0.0995 | -1.0 | 0.0957 | 0.1886 | 0.0348 | 0.2097 | 0.3912 | -1.0 | 0.1674 | 0.4077 | 0.2273 | 0.3644 | 0.2175 | 0.4050 | 0.1473 | 0.4234 | 0.1437 | 0.3357 | 0.1646 | 0.4277 |
| 2.0067 | 11.0 | 671 | 1.6056 | 0.1943 | 0.4944 | 0.0956 | -1.0 | 0.0628 | 0.2043 | 0.0366 | 0.2186 | 0.3931 | -1.0 | 0.1032 | 0.4112 | 0.2081 | 0.3589 | 0.2233 | 0.4145 | 0.1869 | 0.4099 | 0.1542 | 0.3322 | 0.1990 | 0.45 |
| 2.0067 | 12.0 | 732 | 1.5612 | 0.1807 | 0.5075 | 0.0896 | -1.0 | 0.0656 | 0.1889 | 0.0312 | 0.2104 | 0.3713 | -1.0 | 0.1525 | 0.3859 | 0.2146 | 0.3656 | 0.2326 | 0.4069 | 0.1213 | 0.3441 | 0.1575 | 0.3198 | 0.1777 | 0.4202 |
| 2.0067 | 13.0 | 793 | 1.6077 | 0.1834 | 0.4807 | 0.1035 | -1.0 | 0.1021 | 0.1911 | 0.0324 | 0.2194 | 0.3840 | -1.0 | 0.1405 | 0.3985 | 0.1861 | 0.3467 | 0.2512 | 0.4038 | 0.1423 | 0.3847 | 0.1633 | 0.353 | 0.1743 | 0.4319 |
| 2.0067 | 14.0 | 854 | 1.4885 | 0.2117 | 0.5304 | 0.1363 | -1.0 | 0.1370 | 0.2208 | 0.0379 | 0.2361 | 0.4089 | -1.0 | 0.2165 | 0.4239 | 0.2201 | 0.3756 | 0.2742 | 0.4497 | 0.1849 | 0.4216 | 0.1778 | 0.3435 | 0.2014 | 0.4543 |
| 2.0067 | 15.0 | 915 | 1.4687 | 0.2066 | 0.5315 | 0.1206 | -1.0 | 0.1115 | 0.2155 | 0.0393 | 0.2372 | 0.4026 | -1.0 | 0.1535 | 0.4180 | 0.2391 | 0.4111 | 0.2487 | 0.4384 | 0.1818 | 0.4297 | 0.1815 | 0.3452 | 0.1821 | 0.3883 |
| 2.0067 | 16.0 | 976 | 1.5170 | 0.2036 | 0.5142 | 0.1201 | -1.0 | 0.0960 | 0.2139 | 0.0355 | 0.2331 | 0.4217 | -1.0 | 0.1318 | 0.4398 | 0.2067 | 0.3622 | 0.2519 | 0.4453 | 0.1920 | 0.4522 | 0.1740 | 0.3551 | 0.1932 | 0.4936 |
| 1.3336 | 17.0 | 1037 | 1.4479 | 0.217 | 0.5558 | 0.1204 | -1.0 | 0.0950 | 0.2262 | 0.0333 | 0.2366 | 0.4110 | -1.0 | 0.1629 | 0.4264 | 0.2497 | 0.415 | 0.2530 | 0.4057 | 0.1625 | 0.3946 | 0.1952 | 0.3739 | 0.2246 | 0.4660 |
| 1.3336 | 18.0 | 1098 | 1.4191 | 0.2156 | 0.5465 | 0.1385 | -1.0 | 0.1155 | 0.2256 | 0.0387 | 0.2433 | 0.4202 | -1.0 | 0.1644 | 0.4368 | 0.2677 | 0.4372 | 0.2534 | 0.4377 | 0.1918 | 0.4351 | 0.1926 | 0.3590 | 0.1726 | 0.4319 |
| 1.3336 | 19.0 | 1159 | 1.3921 | 0.2291 | 0.5797 | 0.1464 | -1.0 | 0.1057 | 0.2398 | 0.0380 | 0.2526 | 0.4295 | -1.0 | 0.1674 | 0.4465 | 0.2816 | 0.4644 | 0.2437 | 0.4377 | 0.2059 | 0.4387 | 0.2050 | 0.3777 | 0.2095 | 0.4287 |
| 1.3336 | 20.0 | 1220 | 1.3693 | 0.2330 | 0.5773 | 0.1418 | -1.0 | 0.1104 | 0.2439 | 0.0367 | 0.2572 | 0.4410 | -1.0 | 0.1729 | 0.4576 | 0.2697 | 0.4306 | 0.252 | 0.4440 | 0.2076 | 0.4486 | 0.2171 | 0.4159 | 0.2188 | 0.4660 |
| 1.3336 | 21.0 | 1281 | 1.4042 | 0.2283 | 0.5682 | 0.1439 | -1.0 | 0.1151 | 0.2391 | 0.0329 | 0.2517 | 0.4354 | -1.0 | 0.2247 | 0.4524 | 0.2486 | 0.3939 | 0.2513 | 0.4472 | 0.2039 | 0.4640 | 0.1952 | 0.3912 | 0.2425 | 0.4808 |
| 1.3336 | 22.0 | 1342 | 1.4250 | 0.2269 | 0.5728 | 0.1470 | -1.0 | 0.1524 | 0.236 | 0.0412 | 0.2521 | 0.4265 | -1.0 | 0.2683 | 0.4399 | 0.2801 | 0.4378 | 0.2636 | 0.4258 | 0.1957 | 0.4342 | 0.1984 | 0.3880 | 0.1967 | 0.4468 |
| 1.3336 | 23.0 | 1403 | 1.3918 | 0.2379 | 0.5900 | 0.1472 | -1.0 | 0.1349 | 0.2482 | 0.0424 | 0.2566 | 0.4347 | -1.0 | 0.2361 | 0.4504 | 0.2820 | 0.4289 | 0.2593 | 0.4472 | 0.2047 | 0.4486 | 0.2044 | 0.3954 | 0.2391 | 0.4532 |
| 1.3336 | 24.0 | 1464 | 1.3905 | 0.2396 | 0.6033 | 0.1507 | -1.0 | 0.1387 | 0.2496 | 0.0428 | 0.2599 | 0.4411 | -1.0 | 0.2020 | 0.4567 | 0.2894 | 0.4489 | 0.2634 | 0.4434 | 0.1962 | 0.4531 | 0.2072 | 0.3951 | 0.2418 | 0.4649 |
| 1.1255 | 25.0 | 1525 | 1.3732 | 0.2353 | 0.5842 | 0.1362 | -1.0 | 0.1221 | 0.2455 | 0.0377 | 0.2589 | 0.4362 | -1.0 | 0.1973 | 0.4529 | 0.2795 | 0.4267 | 0.2530 | 0.4434 | 0.2039 | 0.4441 | 0.2129 | 0.4042 | 0.2271 | 0.4628 |
| 1.1255 | 26.0 | 1586 | 1.3720 | 0.2445 | 0.6011 | 0.1513 | -1.0 | 0.1338 | 0.2546 | 0.0414 | 0.2611 | 0.4417 | -1.0 | 0.1930 | 0.4581 | 0.3038 | 0.4717 | 0.2661 | 0.4553 | 0.2116 | 0.4387 | 0.2072 | 0.3951 | 0.2338 | 0.4479 |
| 1.1255 | 27.0 | 1647 | 1.3630 | 0.2406 | 0.5886 | 0.1596 | -1.0 | 0.1259 | 0.2511 | 0.0404 | 0.2629 | 0.4384 | -1.0 | 0.1915 | 0.4554 | 0.2912 | 0.4467 | 0.2579 | 0.4484 | 0.2094 | 0.4432 | 0.2077 | 0.4003 | 0.2368 | 0.4532 |
| 1.1255 | 28.0 | 1708 | 1.3697 | 0.2429 | 0.5916 | 0.1577 | -1.0 | 0.1313 | 0.2532 | 0.0397 | 0.2636 | 0.4399 | -1.0 | 0.1963 | 0.4566 | 0.2939 | 0.4489 | 0.2628 | 0.4509 | 0.2171 | 0.4469 | 0.2107 | 0.4028 | 0.2298 | 0.45 |
| 1.1255 | 29.0 | 1769 | 1.3682 | 0.2441 | 0.5927 | 0.1623 | -1.0 | 0.1357 | 0.2542 | 0.0406 | 0.2646 | 0.4419 | -1.0 | 0.2096 | 0.4582 | 0.2935 | 0.4533 | 0.2656 | 0.4528 | 0.2150 | 0.4460 | 0.2119 | 0.4011 | 0.2344 | 0.4564 |
| 1.1255 | 30.0 | 1830 | 1.3685 | 0.2438 | 0.5928 | 0.1625 | -1.0 | 0.1357 | 0.2539 | 0.0406 | 0.2643 | 0.4410 | -1.0 | 0.2096 | 0.4572 | 0.2934 | 0.4522 | 0.2654 | 0.4522 | 0.2146 | 0.4441 | 0.2115 | 0.4011 | 0.2341 | 0.4553 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"background",
"cdy",
"cfr",
"csv",
"svb",
"syh"
] |
MapacheFantasma/yolo_finetuned_fruits |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7503
- Map: 0.6085
- Map 50: 0.8475
- Map 75: 0.7364
- Map Small: -1.0
- Map Medium: 0.6015
- Map Large: 0.6265
- Mar 1: 0.4268
- Mar 10: 0.751
- Mar 100: 0.7961
- Mar Small: -1.0
- Mar Medium: 0.7229
- Mar Large: 0.8051
- Map Banana: 0.5094
- Mar 100 Banana: 0.785
- Map Orange: 0.618
- Mar 100 Orange: 0.769
- Map Apple: 0.698
- Mar 100 Apple: 0.8343
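The `-1.0` values for Map Small and Mar Small above are sentinels meaning the eval split contains no ground-truth boxes in the "small" size bucket, so that bucket cannot be scored. COCO-style evaluators split boxes by pixel area; a sketch of the standard thresholds (as used by pycocotools-style evaluation):

```python
def coco_size_bucket(box):
    """COCO-style area buckets: small < 32^2 px, medium < 96^2 px, else large."""
    x1, y1, x2, y2 = box
    area = (x2 - x1) * (y2 - y1)
    if area < 32 ** 2:
        return "small"
    if area < 96 ** 2:
        return "medium"
    return "large"

print(coco_size_bucket((0, 0, 10, 10)))    # small
print(coco_size_bucket((0, 0, 50, 50)))    # medium
print(coco_size_bucket((0, 0, 100, 100)))  # large
```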
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 1.8687 | 0.0129 | 0.038 | 0.0049 | -1.0 | 0.0067 | 0.0157 | 0.0572 | 0.1949 | 0.3652 | -1.0 | 0.2814 | 0.3769 | 0.0129 | 0.415 | 0.0218 | 0.469 | 0.0039 | 0.2114 |
| No log | 2.0 | 120 | 1.9912 | 0.0217 | 0.0645 | 0.0096 | -1.0 | 0.0976 | 0.0177 | 0.0771 | 0.1793 | 0.3634 | -1.0 | 0.31 | 0.3634 | 0.0221 | 0.5125 | 0.0223 | 0.3262 | 0.0207 | 0.2514 |
| No log | 3.0 | 180 | 1.3626 | 0.0825 | 0.1743 | 0.0701 | -1.0 | 0.2573 | 0.0705 | 0.2254 | 0.4597 | 0.6161 | -1.0 | 0.5471 | 0.6235 | 0.0727 | 0.6225 | 0.1038 | 0.5714 | 0.071 | 0.6543 |
| No log | 4.0 | 240 | 1.1473 | 0.2756 | 0.4616 | 0.3013 | -1.0 | 0.2822 | 0.2894 | 0.3357 | 0.5695 | 0.6993 | -1.0 | 0.5957 | 0.7144 | 0.219 | 0.6575 | 0.2073 | 0.669 | 0.4004 | 0.7714 |
| No log | 5.0 | 300 | 1.1179 | 0.2757 | 0.4919 | 0.2891 | -1.0 | 0.3702 | 0.2764 | 0.2987 | 0.5971 | 0.6843 | -1.0 | 0.6257 | 0.6906 | 0.2151 | 0.69 | 0.2161 | 0.6571 | 0.3957 | 0.7057 |
| No log | 6.0 | 360 | 0.9856 | 0.3562 | 0.5528 | 0.405 | -1.0 | 0.468 | 0.3741 | 0.3483 | 0.6138 | 0.7382 | -1.0 | 0.7286 | 0.7449 | 0.2702 | 0.6775 | 0.2062 | 0.7 | 0.5923 | 0.8371 |
| No log | 7.0 | 420 | 0.9100 | 0.4767 | 0.7183 | 0.5312 | -1.0 | 0.4923 | 0.4962 | 0.3951 | 0.6727 | 0.7679 | -1.0 | 0.69 | 0.7806 | 0.3461 | 0.7375 | 0.4555 | 0.7548 | 0.6285 | 0.8114 |
| No log | 8.0 | 480 | 0.8879 | 0.5102 | 0.7946 | 0.5966 | -1.0 | 0.5537 | 0.5229 | 0.3958 | 0.6899 | 0.7675 | -1.0 | 0.67 | 0.7813 | 0.3708 | 0.735 | 0.52 | 0.7762 | 0.64 | 0.7914 |
| 1.2703 | 9.0 | 540 | 0.8767 | 0.4935 | 0.7566 | 0.5666 | -1.0 | 0.5038 | 0.5153 | 0.3947 | 0.6888 | 0.7654 | -1.0 | 0.6971 | 0.7758 | 0.3741 | 0.74 | 0.5181 | 0.7619 | 0.5882 | 0.7943 |
| 1.2703 | 10.0 | 600 | 0.9414 | 0.4938 | 0.7676 | 0.5823 | -1.0 | 0.4991 | 0.5147 | 0.4014 | 0.685 | 0.7503 | -1.0 | 0.6771 | 0.761 | 0.3564 | 0.73 | 0.5156 | 0.7238 | 0.6094 | 0.7971 |
| 1.2703 | 11.0 | 660 | 0.8135 | 0.5144 | 0.7897 | 0.5938 | -1.0 | 0.508 | 0.5392 | 0.4156 | 0.7196 | 0.7767 | -1.0 | 0.7343 | 0.7836 | 0.4231 | 0.7625 | 0.5653 | 0.7762 | 0.5547 | 0.7914 |
| 1.2703 | 12.0 | 720 | 0.8786 | 0.4876 | 0.7543 | 0.5569 | -1.0 | 0.5132 | 0.4986 | 0.3891 | 0.6706 | 0.739 | -1.0 | 0.6914 | 0.7435 | 0.3739 | 0.74 | 0.5269 | 0.7286 | 0.5621 | 0.7486 |
| 1.2703 | 13.0 | 780 | 0.8729 | 0.5293 | 0.8224 | 0.5918 | -1.0 | 0.5589 | 0.5392 | 0.3945 | 0.679 | 0.7554 | -1.0 | 0.7114 | 0.7616 | 0.3989 | 0.7325 | 0.5524 | 0.7595 | 0.6366 | 0.7743 |
| 1.2703 | 14.0 | 840 | 0.9073 | 0.5443 | 0.813 | 0.6243 | -1.0 | 0.5372 | 0.563 | 0.4065 | 0.698 | 0.7671 | -1.0 | 0.6843 | 0.7808 | 0.3877 | 0.715 | 0.5517 | 0.7548 | 0.6934 | 0.8314 |
| 1.2703 | 15.0 | 900 | 0.7988 | 0.5792 | 0.8313 | 0.6911 | -1.0 | 0.5979 | 0.5993 | 0.4382 | 0.7344 | 0.7752 | -1.0 | 0.7243 | 0.7852 | 0.4579 | 0.74 | 0.6013 | 0.7571 | 0.6785 | 0.8286 |
| 1.2703 | 16.0 | 960 | 0.7813 | 0.5791 | 0.8403 | 0.6903 | -1.0 | 0.5997 | 0.5964 | 0.4227 | 0.7348 | 0.7898 | -1.0 | 0.71 | 0.8023 | 0.4825 | 0.775 | 0.574 | 0.7714 | 0.6808 | 0.8229 |
| 0.7137 | 17.0 | 1020 | 0.8336 | 0.5661 | 0.8326 | 0.687 | -1.0 | 0.5509 | 0.5899 | 0.4199 | 0.7257 | 0.7735 | -1.0 | 0.6871 | 0.7848 | 0.4837 | 0.7625 | 0.5681 | 0.7667 | 0.6465 | 0.7914 |
| 0.7137 | 18.0 | 1080 | 0.7945 | 0.5896 | 0.8523 | 0.6792 | -1.0 | 0.6043 | 0.6038 | 0.428 | 0.7363 | 0.789 | -1.0 | 0.7057 | 0.7996 | 0.4522 | 0.765 | 0.6042 | 0.7762 | 0.7124 | 0.8257 |
| 0.7137 | 19.0 | 1140 | 0.8319 | 0.5886 | 0.867 | 0.6988 | -1.0 | 0.6039 | 0.6003 | 0.4302 | 0.7234 | 0.7826 | -1.0 | 0.6929 | 0.792 | 0.4803 | 0.7825 | 0.591 | 0.7452 | 0.6946 | 0.82 |
| 0.7137 | 20.0 | 1200 | 0.7760 | 0.6031 | 0.8523 | 0.7223 | -1.0 | 0.6261 | 0.6134 | 0.429 | 0.7447 | 0.7875 | -1.0 | 0.7129 | 0.7964 | 0.4878 | 0.775 | 0.5966 | 0.7619 | 0.725 | 0.8257 |
| 0.7137 | 21.0 | 1260 | 0.7789 | 0.6091 | 0.8682 | 0.7337 | -1.0 | 0.5898 | 0.6269 | 0.4252 | 0.7343 | 0.7887 | -1.0 | 0.6771 | 0.8031 | 0.4982 | 0.78 | 0.6219 | 0.769 | 0.7071 | 0.8171 |
| 0.7137 | 22.0 | 1320 | 0.7605 | 0.6027 | 0.8448 | 0.6999 | -1.0 | 0.6072 | 0.6237 | 0.4281 | 0.7459 | 0.7911 | -1.0 | 0.7114 | 0.8011 | 0.4851 | 0.79 | 0.6207 | 0.769 | 0.7024 | 0.8143 |
| 0.7137 | 23.0 | 1380 | 0.7435 | 0.6084 | 0.8491 | 0.731 | -1.0 | 0.6307 | 0.6253 | 0.432 | 0.7536 | 0.8052 | -1.0 | 0.7429 | 0.8131 | 0.4922 | 0.7975 | 0.6328 | 0.781 | 0.7001 | 0.8371 |
| 0.7137 | 24.0 | 1440 | 0.7429 | 0.6063 | 0.8352 | 0.7323 | -1.0 | 0.6293 | 0.6206 | 0.4342 | 0.7492 | 0.7987 | -1.0 | 0.7257 | 0.8077 | 0.4852 | 0.7975 | 0.6289 | 0.7643 | 0.7048 | 0.8343 |
| 0.5485 | 25.0 | 1500 | 0.7587 | 0.6018 | 0.8351 | 0.7314 | -1.0 | 0.602 | 0.6199 | 0.4369 | 0.7473 | 0.7954 | -1.0 | 0.7157 | 0.8052 | 0.5002 | 0.79 | 0.6166 | 0.7619 | 0.6887 | 0.8343 |
| 0.5485 | 26.0 | 1560 | 0.7494 | 0.6089 | 0.8385 | 0.7347 | -1.0 | 0.6205 | 0.6252 | 0.4377 | 0.7566 | 0.8028 | -1.0 | 0.7257 | 0.8126 | 0.5078 | 0.795 | 0.6166 | 0.7762 | 0.7024 | 0.8371 |
| 0.5485 | 27.0 | 1620 | 0.7562 | 0.6066 | 0.8428 | 0.7343 | -1.0 | 0.5974 | 0.6242 | 0.4321 | 0.7513 | 0.7963 | -1.0 | 0.7129 | 0.8061 | 0.5057 | 0.79 | 0.6067 | 0.7619 | 0.7072 | 0.8371 |
| 0.5485 | 28.0 | 1680 | 0.7555 | 0.6034 | 0.845 | 0.7342 | -1.0 | 0.5912 | 0.6222 | 0.426 | 0.7502 | 0.7937 | -1.0 | 0.7129 | 0.8033 | 0.5072 | 0.7825 | 0.6061 | 0.7643 | 0.6969 | 0.8343 |
| 0.5485 | 29.0 | 1740 | 0.7505 | 0.6085 | 0.8472 | 0.7371 | -1.0 | 0.6015 | 0.6266 | 0.4268 | 0.7519 | 0.7969 | -1.0 | 0.7229 | 0.8059 | 0.5097 | 0.7875 | 0.6178 | 0.769 | 0.698 | 0.8343 |
| 0.5485 | 30.0 | 1800 | 0.7503 | 0.6085 | 0.8475 | 0.7364 | -1.0 | 0.6015 | 0.6265 | 0.4268 | 0.751 | 0.7961 | -1.0 | 0.7229 | 0.8051 | 0.5094 | 0.785 | 0.618 | 0.769 | 0.698 | 0.8343 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"banana",
"orange",
"apple"
] |
pabpelle/yolo_finetuned_fruits |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8212
- Map: 0.5422
- Map 50: 0.7536
- Map 75: 0.6124
- Map Small: -1.0
- Map Medium: 0.5556
- Map Large: 0.5523
- Mar 1: 0.4148
- Mar 10: 0.7053
- Mar 100: 0.7661
- Mar Small: -1.0
- Mar Medium: 0.7486
- Mar Large: 0.7706
- Map Banana: 0.4267
- Mar 100 Banana: 0.7325
- Map Orange: 0.5337
- Mar 100 Orange: 0.7714
- Map Apple: 0.6661
- Mar 100 Apple: 0.7943
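The headline Map and Mar 100 above are the unweighted means of the three per-class values. A quick arithmetic check, with the numbers copied from the list above:

```python
# Overall mAP / mAR are the unweighted means of the per-class values above.
per_class_map = {"banana": 0.4267, "orange": 0.5337, "apple": 0.6661}
per_class_mar = {"banana": 0.7325, "orange": 0.7714, "apple": 0.7943}

overall_map = sum(per_class_map.values()) / len(per_class_map)
overall_mar = sum(per_class_mar.values()) / len(per_class_mar)

print(round(overall_map, 4), round(overall_mar, 4))  # 0.5422 0.7661 -- matches the card
```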
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 1.9249 | 0.0109 | 0.0354 | 0.0052 | -1.0 | 0.0329 | 0.0109 | 0.0335 | 0.1437 | 0.2863 | -1.0 | 0.2843 | 0.2869 | 0.0046 | 0.34 | 0.0094 | 0.219 | 0.0186 | 0.3 |
| No log | 2.0 | 120 | 1.7552 | 0.0307 | 0.0686 | 0.027 | -1.0 | 0.1317 | 0.0312 | 0.1155 | 0.2331 | 0.3925 | -1.0 | 0.3071 | 0.4029 | 0.018 | 0.5075 | 0.0175 | 0.15 | 0.0566 | 0.52 |
| No log | 3.0 | 180 | 1.5625 | 0.0389 | 0.0978 | 0.0255 | -1.0 | 0.0548 | 0.0447 | 0.1446 | 0.2812 | 0.4478 | -1.0 | 0.2757 | 0.467 | 0.051 | 0.57 | 0.0204 | 0.2619 | 0.0452 | 0.5114 |
| No log | 4.0 | 240 | 1.6910 | 0.0592 | 0.1271 | 0.0474 | -1.0 | 0.1825 | 0.0515 | 0.1635 | 0.2915 | 0.4286 | -1.0 | 0.3357 | 0.4442 | 0.0439 | 0.44 | 0.0721 | 0.2714 | 0.0615 | 0.5743 |
| No log | 5.0 | 300 | 1.4597 | 0.0665 | 0.1389 | 0.0556 | -1.0 | 0.2186 | 0.0554 | 0.1602 | 0.362 | 0.4977 | -1.0 | 0.4129 | 0.5089 | 0.0538 | 0.5425 | 0.0681 | 0.319 | 0.0776 | 0.6314 |
| No log | 6.0 | 360 | 1.2406 | 0.0908 | 0.1801 | 0.0701 | -1.0 | 0.2087 | 0.0898 | 0.2304 | 0.4411 | 0.6045 | -1.0 | 0.4443 | 0.6203 | 0.0807 | 0.675 | 0.0816 | 0.4929 | 0.1101 | 0.6457 |
| No log | 7.0 | 420 | 1.2385 | 0.108 | 0.1981 | 0.098 | -1.0 | 0.296 | 0.1089 | 0.292 | 0.4907 | 0.604 | -1.0 | 0.5186 | 0.6145 | 0.0613 | 0.6325 | 0.097 | 0.4167 | 0.1658 | 0.7629 |
| No log | 8.0 | 480 | 1.1481 | 0.1272 | 0.2265 | 0.1263 | -1.0 | 0.3262 | 0.1267 | 0.3307 | 0.5254 | 0.6662 | -1.0 | 0.6186 | 0.6759 | 0.0771 | 0.5875 | 0.1323 | 0.7024 | 0.1721 | 0.7086 |
| 1.4397 | 9.0 | 540 | 1.0555 | 0.1758 | 0.304 | 0.1975 | -1.0 | 0.4053 | 0.1713 | 0.3339 | 0.5702 | 0.7069 | -1.0 | 0.6614 | 0.7126 | 0.1352 | 0.6725 | 0.1431 | 0.7024 | 0.2491 | 0.7457 |
| 1.4397 | 10.0 | 600 | 0.9996 | 0.2135 | 0.3317 | 0.2421 | -1.0 | 0.3245 | 0.2411 | 0.3571 | 0.6142 | 0.7234 | -1.0 | 0.6229 | 0.7368 | 0.1289 | 0.685 | 0.1954 | 0.731 | 0.3163 | 0.7543 |
| 1.4397 | 11.0 | 660 | 1.0359 | 0.313 | 0.4916 | 0.3638 | -1.0 | 0.4765 | 0.3136 | 0.3783 | 0.6039 | 0.6935 | -1.0 | 0.5971 | 0.7051 | 0.2058 | 0.64 | 0.2602 | 0.7119 | 0.4731 | 0.7286 |
| 1.4397 | 12.0 | 720 | 1.0051 | 0.3329 | 0.5244 | 0.3845 | -1.0 | 0.539 | 0.3262 | 0.3867 | 0.6153 | 0.7231 | -1.0 | 0.6914 | 0.7268 | 0.2134 | 0.6925 | 0.2743 | 0.7452 | 0.5111 | 0.7314 |
| 1.4397 | 13.0 | 780 | 0.9043 | 0.4053 | 0.6027 | 0.4549 | -1.0 | 0.491 | 0.4133 | 0.3862 | 0.6646 | 0.7517 | -1.0 | 0.71 | 0.7585 | 0.286 | 0.73 | 0.3504 | 0.7738 | 0.5796 | 0.7514 |
| 1.4397 | 14.0 | 840 | 0.9346 | 0.3972 | 0.589 | 0.45 | -1.0 | 0.4686 | 0.4016 | 0.3975 | 0.6671 | 0.7438 | -1.0 | 0.7171 | 0.7505 | 0.3249 | 0.69 | 0.3405 | 0.7786 | 0.5261 | 0.7629 |
| 1.4397 | 15.0 | 900 | 0.8932 | 0.46 | 0.6707 | 0.5185 | -1.0 | 0.4714 | 0.4758 | 0.4033 | 0.6815 | 0.7486 | -1.0 | 0.7171 | 0.7564 | 0.3733 | 0.7 | 0.395 | 0.7571 | 0.6115 | 0.7886 |
| 1.4397 | 16.0 | 960 | 0.9136 | 0.449 | 0.6785 | 0.5173 | -1.0 | 0.5232 | 0.4552 | 0.3999 | 0.659 | 0.7456 | -1.0 | 0.6843 | 0.7544 | 0.3302 | 0.7125 | 0.3977 | 0.75 | 0.619 | 0.7743 |
| 0.8233 | 17.0 | 1020 | 0.8842 | 0.472 | 0.7043 | 0.5647 | -1.0 | 0.5214 | 0.4805 | 0.3964 | 0.6667 | 0.7573 | -1.0 | 0.7386 | 0.7606 | 0.3924 | 0.7275 | 0.4447 | 0.7786 | 0.579 | 0.7657 |
| 0.8233 | 18.0 | 1080 | 0.9066 | 0.4877 | 0.7263 | 0.5845 | -1.0 | 0.5344 | 0.4988 | 0.3897 | 0.6804 | 0.7563 | -1.0 | 0.7357 | 0.7613 | 0.3998 | 0.71 | 0.4572 | 0.7619 | 0.606 | 0.7971 |
| 0.8233 | 19.0 | 1140 | 0.8907 | 0.5064 | 0.741 | 0.5837 | -1.0 | 0.5143 | 0.5225 | 0.3963 | 0.6844 | 0.756 | -1.0 | 0.71 | 0.7648 | 0.4011 | 0.715 | 0.4914 | 0.7786 | 0.6267 | 0.7743 |
| 0.8233 | 20.0 | 1200 | 0.8983 | 0.5226 | 0.779 | 0.6133 | -1.0 | 0.5526 | 0.5303 | 0.4044 | 0.6931 | 0.7544 | -1.0 | 0.76 | 0.7563 | 0.4215 | 0.695 | 0.507 | 0.7738 | 0.6393 | 0.7943 |
| 0.8233 | 21.0 | 1260 | 0.8372 | 0.5484 | 0.7701 | 0.641 | -1.0 | 0.5423 | 0.561 | 0.4135 | 0.7104 | 0.7793 | -1.0 | 0.8 | 0.7795 | 0.4446 | 0.7375 | 0.5457 | 0.7976 | 0.6548 | 0.8029 |
| 0.8233 | 22.0 | 1320 | 0.8065 | 0.5278 | 0.7465 | 0.6028 | -1.0 | 0.5488 | 0.5416 | 0.4163 | 0.7259 | 0.7807 | -1.0 | 0.7443 | 0.7871 | 0.4388 | 0.765 | 0.489 | 0.7857 | 0.6555 | 0.7914 |
| 0.8233 | 23.0 | 1380 | 0.8122 | 0.5423 | 0.7637 | 0.6037 | -1.0 | 0.53 | 0.5583 | 0.4075 | 0.6963 | 0.777 | -1.0 | 0.7457 | 0.7809 | 0.4426 | 0.7625 | 0.5304 | 0.7857 | 0.6538 | 0.7829 |
| 0.8233 | 24.0 | 1440 | 0.8222 | 0.5089 | 0.7194 | 0.5928 | -1.0 | 0.5741 | 0.5159 | 0.3979 | 0.6915 | 0.7724 | -1.0 | 0.7214 | 0.7786 | 0.3995 | 0.76 | 0.4906 | 0.7857 | 0.6366 | 0.7714 |
| 0.6232 | 25.0 | 1500 | 0.8286 | 0.5212 | 0.7314 | 0.5952 | -1.0 | 0.5522 | 0.5289 | 0.4019 | 0.7154 | 0.7805 | -1.0 | 0.7357 | 0.7866 | 0.4001 | 0.76 | 0.4929 | 0.7786 | 0.6707 | 0.8029 |
| 0.6232 | 26.0 | 1560 | 0.8298 | 0.5307 | 0.737 | 0.5891 | -1.0 | 0.5528 | 0.5435 | 0.406 | 0.7093 | 0.7718 | -1.0 | 0.7386 | 0.7775 | 0.3965 | 0.745 | 0.5269 | 0.7762 | 0.6686 | 0.7943 |
| 0.6232 | 27.0 | 1620 | 0.8292 | 0.5412 | 0.7501 | 0.6078 | -1.0 | 0.5566 | 0.5509 | 0.4173 | 0.7106 | 0.7721 | -1.0 | 0.7657 | 0.7756 | 0.4197 | 0.735 | 0.5364 | 0.7786 | 0.6676 | 0.8029 |
| 0.6232 | 28.0 | 1680 | 0.8215 | 0.5417 | 0.7518 | 0.6103 | -1.0 | 0.5583 | 0.5515 | 0.4131 | 0.7052 | 0.7676 | -1.0 | 0.7486 | 0.7725 | 0.4222 | 0.73 | 0.5368 | 0.7786 | 0.666 | 0.7943 |
| 0.6232 | 29.0 | 1740 | 0.8211 | 0.5424 | 0.7539 | 0.6124 | -1.0 | 0.5556 | 0.5525 | 0.4148 | 0.7053 | 0.7677 | -1.0 | 0.7486 | 0.7724 | 0.4272 | 0.7325 | 0.5339 | 0.7762 | 0.6661 | 0.7943 |
| 0.6232 | 30.0 | 1800 | 0.8212 | 0.5422 | 0.7536 | 0.6124 | -1.0 | 0.5556 | 0.5523 | 0.4148 | 0.7053 | 0.7661 | -1.0 | 0.7486 | 0.7706 | 0.4267 | 0.7325 | 0.5337 | 0.7714 | 0.6661 | 0.7943 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"banana",
"orange",
"apple"
] |
kylecsnow/detr-resnet-50-dc5-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-fashionpedia-finetuned
***THIS IS JUST A TEST, THIS MODEL IS NOT GOOD***
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 6.9081
- eval_map: 0.0098
- eval_map_50: 0.0281
- eval_map_75: 0.0
- eval_map_small: 0.0
- eval_map_medium: 0.0
- eval_map_large: 0.0196
- eval_mar_1: 0.0083
- eval_mar_10: 0.0111
- eval_mar_100: 0.0111
- eval_mar_small: 0.0
- eval_mar_medium: 0.0
- eval_mar_large: 0.0222
- eval_map_shirt, blouse: 0.0
- eval_mar_100_shirt, blouse: 0.0
- eval_map_top, t-shirt, sweatshirt: 0.0
- eval_mar_100_top, t-shirt, sweatshirt: 0.0
- eval_map_sweater: -1.0
- eval_mar_100_sweater: -1.0
- eval_map_cardigan: -1.0
- eval_mar_100_cardigan: -1.0
- eval_map_jacket: 0.1767
- eval_mar_100_jacket: 0.2
- eval_map_vest: -1.0
- eval_mar_100_vest: -1.0
- eval_map_pants: 0.0
- eval_mar_100_pants: 0.0
- eval_map_shorts: 0.0
- eval_mar_100_shorts: 0.0
- eval_map_skirt: 0.0
- eval_mar_100_skirt: 0.0
- eval_map_coat: 0.0
- eval_mar_100_coat: 0.0
- eval_map_dress: 0.0
- eval_mar_100_dress: 0.0
- eval_map_cape: -1.0
- eval_mar_100_cape: -1.0
- eval_map_glasses: 0.0
- eval_mar_100_glasses: 0.0
- eval_map_tie: -1.0
- eval_mar_100_tie: -1.0
- eval_map_glove: -1.0
- eval_mar_100_glove: -1.0
- eval_map_leg warmer: -1.0
- eval_mar_100_leg warmer: -1.0
- eval_map_tights, stockings: 0.0
- eval_mar_100_tights, stockings: 0.0
- eval_map_shoe: 0.0
- eval_mar_100_shoe: 0.0
- eval_map_umbrella: 0.0
- eval_mar_100_umbrella: 0.0
- eval_map_hood: -1.0
- eval_mar_100_hood: -1.0
- eval_map_collar: 0.0
- eval_mar_100_collar: 0.0
- eval_map_lapel: 0.0
- eval_mar_100_lapel: 0.0
- eval_map_sleeve: 0.0
- eval_mar_100_sleeve: 0.0
- eval_map_pocket: 0.0
- eval_mar_100_pocket: 0.0
- eval_map_neckline: 0.0
- eval_mar_100_neckline: 0.0
- eval_map_buckle: -1.0
- eval_mar_100_buckle: -1.0
- eval_map_zipper: 0.0
- eval_mar_100_zipper: 0.0
- eval_map_applique: -1.0
- eval_mar_100_applique: -1.0
- eval_map_bead: -1.0
- eval_mar_100_bead: -1.0
- eval_map_bow: -1.0
- eval_mar_100_bow: -1.0
- eval_map_fringe: -1.0
- eval_mar_100_fringe: -1.0
- eval_map_ribbon: -1.0
- eval_mar_100_ribbon: -1.0
- eval_map_rivet: -1.0
- eval_mar_100_rivet: -1.0
- eval_runtime: 4.5194
- eval_samples_per_second: 2.434
- eval_steps_per_second: 0.664
- epoch: 0.4386
- step: 50
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 1000
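This run is budgeted in steps (`training_steps: 1000`) rather than epochs, and with `lr_scheduler_type: linear` the learning rate falls linearly from 1e-05 to zero at the final step. A sketch of that rule, assuming no warmup (the card lists none):

```python
BASE_LR = 1e-5
TRAINING_STEPS = 1000

def linear_lr(step, total_steps=TRAINING_STEPS, base_lr=BASE_LR):
    # Linear decay to zero; clamp so the rate never goes negative.
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # 1e-05
print(linear_lr(500))   # 5e-06
print(linear_lr(1000))  # 0.0
```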
### Framework versions
- Transformers 4.50.3
- Pytorch 2.2.1
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
vncgabriel/raccoon_detector |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# raccoon_detector
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4696
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
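The step counts in the results table follow from the batch size: each epoch ends 40 steps after the last, which at `train_batch_size: 4` implies roughly 160 training images (an inference from the table, not a documented dataset size; it assumes no dropped last batch):

```python
train_batch_size = 4
steps_per_epoch = 40  # from the results table: epoch 1 ends at step 40
num_epochs = 20

# Rough dataset size implied by the schedule (assumption: every batch is full).
approx_train_images = steps_per_epoch * train_batch_size
total_steps = steps_per_epoch * num_epochs
print(approx_train_images, total_steps)  # 160 800 -- the table ends at step 800
```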
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 40 | 1.2814 |
| No log | 2.0 | 80 | 1.1424 |
| No log | 3.0 | 120 | 0.9869 |
| No log | 4.0 | 160 | 0.9042 |
| No log | 5.0 | 200 | 0.7036 |
| No log | 6.0 | 240 | 0.7302 |
| No log | 7.0 | 280 | 0.6320 |
| No log | 8.0 | 320 | 0.5814 |
| No log | 9.0 | 360 | 0.5468 |
| No log | 10.0 | 400 | 0.5419 |
| No log | 11.0 | 440 | 0.5130 |
| No log | 12.0 | 480 | 0.5220 |
| 0.8899 | 13.0 | 520 | 0.5162 |
| 0.8899 | 14.0 | 560 | 0.5061 |
| 0.8899 | 15.0 | 600 | 0.5066 |
| 0.8899 | 16.0 | 640 | 0.4785 |
| 0.8899 | 17.0 | 680 | 0.4817 |
| 0.8899 | 18.0 | 720 | 0.4782 |
| 0.8899 | 19.0 | 760 | 0.4873 |
| 0.8899 | 20.0 | 800 | 0.4696 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
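This detector's label set ends with "no_object", DETR's background class: each decoder query is scored over all classes and queries whose best class is "no_object" (or below a confidence threshold) are discarded. A dependency-free sketch of that filtering step, using made-up logits for three hypothetical queries:

```python
import math

LABELS = ["raccoon", "no_object"]  # last class is DETR's background / "no object"

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def filter_detections(query_logits, threshold=0.5):
    """Keep queries whose best non-background class clears the threshold."""
    detections = []
    for i, logits in enumerate(query_logits):
        probs = softmax(logits)
        # pick the best class, ignoring the trailing "no_object" slot
        cls = max(range(len(LABELS) - 1), key=lambda c: probs[c])
        if probs[cls] > threshold:
            detections.append((i, LABELS[cls], probs[cls]))
    return detections

# hypothetical logits for three decoder queries: [raccoon, no_object]
logits = [[2.0, -1.0], [-3.0, 4.0], [0.1, 0.0]]
dets = filter_detections(logits)
```

In practice `DetrImageProcessor.post_process_object_detection` does this (plus box rescaling) for you; the sketch only shows the scoring logic.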
| [
"raccoon",
"no_object"
] |
kylecsnow/detr-resnet-50-bloodcell-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-bloodcell-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1568
- Map: 0.0244
- Map 50: 0.0597
- Map 75: 0.02
- Map Small: 0.0049
- Map Medium: 0.0369
- Map Large: -1.0
- Mar 1: 0.0573
- Mar 10: 0.1748
- Mar 100: 0.2252
- Mar Small: 0.1568
- Mar Medium: 0.2833
- Mar Large: -1.0
- Map Platelets: 0.0
- Mar 100 Platelets: 0.0
- Map Rbc: 0.0614
- Mar 100 Rbc: 0.3613
- Map Wbc: 0.0118
- Mar 100 Wbc: 0.3143
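The Map/Mar figures above are COCO-style mean average precision and recall, which rest on box intersection-over-union: a prediction counts as a match only when its IoU with a ground-truth box clears a threshold (Map 50 uses 0.5, Map 75 uses 0.75). A minimal IoU sketch for corner-format boxes (a generic illustration, not the evaluator used here):

```python
def box_iou(a, b):
    """IoU of two boxes given as (x_min, y_min, x_max, y_max)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Two unit boxes overlapping in one corner of a 2x2 region share 1 unit of area out of 7, so their IoU is 1/7 and they would not match at either threshold.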
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
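With `lr_scheduler_type: linear` and `training_steps: 10000`, the learning rate decays linearly from 1e-05 to zero over the run (after any warmup; the Trainer default is zero warmup steps). A sketch of that schedule, assuming those defaults:

```python
def linear_lr(step, base_lr=1e-5, total_steps=10_000, warmup_steps=0):
    """Linear warmup then linear decay to zero, as in the 'linear' schedule."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

Halfway through training (step 5000) the rate is 5e-06; at or past step 10000 it is zero.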
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Platelets | Mar 100 Platelets | Map Rbc | Mar 100 Rbc | Map Wbc | Mar 100 Wbc |
|:-------------:|:--------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------------:|:-----------------:|:-------:|:-----------:|:-------:|:-----------:|
| 1.6586 | 0.7812 | 50 | 1.8480 | 0.0009 | 0.0023 | 0.0006 | 0.0015 | 0.0019 | -1.0 | 0.0075 | 0.0237 | 0.0935 | 0.0182 | 0.1383 | -1.0 | 0.0 | 0.0 | 0.0028 | 0.2806 | 0.0 | 0.0 |
| 1.9477 | 1.5625 | 100 | 1.8023 | 0.0014 | 0.004 | 0.0003 | 0.0002 | 0.0027 | -1.0 | 0.014 | 0.0355 | 0.0903 | 0.0045 | 0.1383 | -1.0 | 0.0 | 0.0 | 0.0042 | 0.271 | 0.0 | 0.0 |
| 2.2372 | 2.3438 | 150 | 1.9074 | 0.0024 | 0.0075 | 0.0015 | 0.0 | 0.0048 | -1.0 | 0.0054 | 0.0495 | 0.0661 | 0.0 | 0.1017 | -1.0 | 0.0 | 0.0 | 0.007 | 0.1839 | 0.0001 | 0.0143 |
| 1.6014 | 3.125 | 200 | 1.7235 | 0.0033 | 0.0113 | 0.0016 | 0.0008 | 0.0054 | -1.0 | 0.0075 | 0.057 | 0.0968 | 0.0227 | 0.1417 | -1.0 | 0.0 | 0.0 | 0.01 | 0.2903 | 0.0 | 0.0 |
| 2.1905 | 3.9062 | 250 | 1.7036 | 0.003 | 0.007 | 0.0017 | 0.0014 | 0.0049 | -1.0 | 0.0 | 0.0559 | 0.1097 | 0.0227 | 0.1617 | -1.0 | 0.0 | 0.0 | 0.0089 | 0.329 | 0.0 | 0.0 |
| 1.7382 | 4.6875 | 300 | 1.8214 | 0.0024 | 0.0069 | 0.0004 | 0.0005 | 0.0038 | -1.0 | 0.0011 | 0.0591 | 0.086 | 0.0273 | 0.1233 | -1.0 | 0.0 | 0.0 | 0.0071 | 0.2581 | 0.0 | 0.0 |
| 1.5236 | 5.4688 | 350 | 1.6356 | 0.003 | 0.008 | 0.0009 | 0.0004 | 0.0051 | -1.0 | 0.0022 | 0.071 | 0.1054 | 0.0227 | 0.155 | -1.0 | 0.0 | 0.0 | 0.009 | 0.3161 | 0.0 | 0.0 |
| 1.2473 | 6.25 | 400 | 1.6151 | 0.0031 | 0.0097 | 0.0005 | 0.0007 | 0.0048 | -1.0 | 0.0032 | 0.0538 | 0.0892 | 0.0273 | 0.1283 | -1.0 | 0.0 | 0.0 | 0.0092 | 0.2677 | 0.0 | 0.0 |
| 1.3519 | 7.0312 | 450 | 1.5804 | 0.0033 | 0.0087 | 0.002 | 0.002 | 0.0052 | -1.0 | 0.0 | 0.0677 | 0.1086 | 0.0455 | 0.1517 | -1.0 | 0.0 | 0.0 | 0.0098 | 0.3258 | 0.0 | 0.0 |
| 1.7836 | 7.8125 | 500 | 1.5477 | 0.005 | 0.0103 | 0.0032 | 0.0056 | 0.0081 | -1.0 | 0.0 | 0.0968 | 0.1204 | 0.0318 | 0.175 | -1.0 | 0.0 | 0.0 | 0.0149 | 0.3613 | 0.0 | 0.0 |
| 1.1599 | 8.5938 | 550 | 1.5482 | 0.0059 | 0.0148 | 0.0039 | 0.0126 | 0.0099 | -1.0 | 0.0043 | 0.1054 | 0.1172 | 0.05 | 0.1633 | -1.0 | 0.0 | 0.0 | 0.0178 | 0.3516 | 0.0 | 0.0 |
| 1.4864 | 9.375 | 600 | 1.5424 | 0.0057 | 0.0175 | 0.0019 | 0.0086 | 0.0089 | -1.0 | 0.0075 | 0.0978 | 0.1075 | 0.05 | 0.1483 | -1.0 | 0.0 | 0.0 | 0.017 | 0.3226 | 0.0 | 0.0 |
| 1.0296 | 10.1562 | 650 | 1.7076 | 0.0037 | 0.0103 | 0.0026 | 0.003 | 0.0062 | -1.0 | 0.0108 | 0.0806 | 0.1043 | 0.0318 | 0.15 | -1.0 | 0.0 | 0.0 | 0.011 | 0.3129 | 0.0 | 0.0 |
| 1.5607 | 10.9375 | 700 | 1.7597 | 0.004 | 0.011 | 0.0031 | 0.0046 | 0.0068 | -1.0 | 0.0161 | 0.0806 | 0.1011 | 0.0364 | 0.1433 | -1.0 | 0.0 | 0.0 | 0.0119 | 0.3032 | 0.0 | 0.0 |
| 1.2834 | 11.7188 | 750 | 1.5308 | 0.0063 | 0.0262 | 0.0037 | 0.0076 | 0.0093 | -1.0 | 0.0065 | 0.0935 | 0.129 | 0.0682 | 0.175 | -1.0 | 0.0 | 0.0 | 0.019 | 0.3871 | 0.0 | 0.0 |
| 1.732 | 12.5 | 800 | 1.5817 | 0.0045 | 0.0138 | 0.0025 | 0.0033 | 0.0069 | -1.0 | 0.0118 | 0.0613 | 0.0957 | 0.0364 | 0.135 | -1.0 | 0.0 | 0.0 | 0.0135 | 0.2871 | 0.0 | 0.0 |
| 1.6955 | 13.2812 | 850 | 1.3932 | 0.0077 | 0.0218 | 0.0065 | 0.0022 | 0.012 | -1.0 | 0.0161 | 0.1068 | 0.1412 | 0.0273 | 0.2067 | -1.0 | 0.0 | 0.0 | 0.0219 | 0.3806 | 0.0013 | 0.0429 |
| 1.4977 | 14.0625 | 900 | 1.4943 | 0.0067 | 0.019 | 0.0033 | 0.0011 | 0.0104 | -1.0 | 0.0097 | 0.0892 | 0.1161 | 0.0227 | 0.1717 | -1.0 | 0.0 | 0.0 | 0.0202 | 0.3484 | 0.0 | 0.0 |
| 1.3825 | 14.8438 | 950 | 1.6302 | 0.0052 | 0.0139 | 0.0032 | 0.0008 | 0.009 | -1.0 | 0.0118 | 0.071 | 0.0978 | 0.0091 | 0.1483 | -1.0 | 0.0 | 0.0 | 0.0157 | 0.2935 | 0.0 | 0.0 |
| 1.2735 | 15.625 | 1000 | 1.5745 | 0.0046 | 0.013 | 0.0022 | 0.0016 | 0.0074 | -1.0 | 0.0151 | 0.0768 | 0.1058 | 0.0318 | 0.1517 | -1.0 | 0.0 | 0.0 | 0.0135 | 0.3032 | 0.0003 | 0.0143 |
| 1.9496 | 16.4062 | 1050 | 1.5460 | 0.0058 | 0.0131 | 0.006 | 0.0005 | 0.0095 | -1.0 | 0.029 | 0.0774 | 0.1161 | 0.0136 | 0.175 | -1.0 | 0.0 | 0.0 | 0.0175 | 0.3484 | 0.0 | 0.0 |
| 1.1106 | 17.1875 | 1100 | 1.5203 | 0.0075 | 0.0177 | 0.0057 | 0.0014 | 0.0123 | -1.0 | 0.0333 | 0.0731 | 0.1075 | 0.0273 | 0.1567 | -1.0 | 0.0 | 0.0 | 0.0225 | 0.3226 | 0.0 | 0.0 |
| 1.053 | 17.9688 | 1150 | 1.4638 | 0.0052 | 0.0116 | 0.0045 | 0.0018 | 0.0084 | -1.0 | 0.0301 | 0.0882 | 0.1022 | 0.0182 | 0.1517 | -1.0 | 0.0 | 0.0 | 0.0157 | 0.3065 | 0.0 | 0.0 |
| 1.3288 | 18.75 | 1200 | 1.4683 | 0.0088 | 0.0225 | 0.0047 | 0.0012 | 0.0137 | -1.0 | 0.0333 | 0.0806 | 0.1108 | 0.0455 | 0.155 | -1.0 | 0.0 | 0.0 | 0.0265 | 0.3323 | 0.0 | 0.0 |
| 1.1287 | 19.5312 | 1250 | 1.4633 | 0.022 | 0.0767 | 0.0043 | 0.0084 | 0.0327 | -1.0 | 0.0439 | 0.096 | 0.1304 | 0.0364 | 0.1867 | -1.0 | 0.0 | 0.0 | 0.0359 | 0.3484 | 0.0302 | 0.0429 |
| 1.4217 | 20.3125 | 1300 | 1.3464 | 0.0093 | 0.0258 | 0.0038 | 0.0024 | 0.0173 | -1.0 | 0.0416 | 0.1088 | 0.1367 | 0.0364 | 0.195 | -1.0 | 0.0 | 0.0 | 0.0159 | 0.3387 | 0.0121 | 0.0714 |
| 2.5532 | 21.0938 | 1350 | 1.3783 | 0.0083 | 0.0232 | 0.0076 | 0.0155 | 0.0127 | -1.0 | 0.037 | 0.0983 | 0.131 | 0.0364 | 0.1883 | -1.0 | 0.0 | 0.0 | 0.0231 | 0.3645 | 0.0016 | 0.0286 |
| 1.5528 | 21.875 | 1400 | 1.2964 | 0.008 | 0.0196 | 0.004 | 0.0065 | 0.0119 | -1.0 | 0.029 | 0.0968 | 0.1412 | 0.05 | 0.1983 | -1.0 | 0.0 | 0.0 | 0.0235 | 0.3806 | 0.0004 | 0.0429 |
| 1.5319 | 22.6562 | 1450 | 1.3924 | 0.0098 | 0.0273 | 0.0041 | 0.003 | 0.0162 | -1.0 | 0.0476 | 0.1088 | 0.1558 | 0.0182 | 0.2283 | -1.0 | 0.0 | 0.0 | 0.0244 | 0.3387 | 0.0048 | 0.1286 |
| 2.0142 | 23.4375 | 1500 | 1.4189 | 0.0102 | 0.028 | 0.0022 | 0.0065 | 0.0159 | -1.0 | 0.0481 | 0.1061 | 0.1425 | 0.0545 | 0.1967 | -1.0 | 0.0 | 0.0 | 0.0266 | 0.3419 | 0.0041 | 0.0857 |
| 1.0811 | 24.2188 | 1550 | 1.4115 | 0.0112 | 0.0264 | 0.0036 | 0.0093 | 0.0174 | -1.0 | 0.0621 | 0.1115 | 0.1409 | 0.0455 | 0.1967 | -1.0 | 0.0 | 0.0 | 0.031 | 0.3226 | 0.0025 | 0.1 |
| 0.8783 | 25.0 | 1600 | 1.3619 | 0.009 | 0.019 | 0.0083 | 0.0056 | 0.0146 | -1.0 | 0.0647 | 0.1238 | 0.1495 | 0.0455 | 0.21 | -1.0 | 0.0 | 0.0 | 0.0236 | 0.3484 | 0.0035 | 0.1 |
| 0.9488 | 25.7812 | 1650 | 1.4108 | 0.0092 | 0.0225 | 0.0071 | 0.004 | 0.0142 | -1.0 | 0.0461 | 0.1008 | 0.1318 | 0.0318 | 0.1883 | -1.0 | 0.0 | 0.0 | 0.0246 | 0.3097 | 0.0029 | 0.0857 |
| 1.2976 | 26.5625 | 1700 | 1.3445 | 0.0085 | 0.0213 | 0.0041 | 0.0047 | 0.0134 | -1.0 | 0.0444 | 0.1072 | 0.1535 | 0.0636 | 0.2117 | -1.0 | 0.0 | 0.0 | 0.0205 | 0.4032 | 0.0051 | 0.0571 |
| 1.4806 | 27.3438 | 1750 | 1.3642 | 0.0079 | 0.0208 | 0.0058 | 0.0027 | 0.0122 | -1.0 | 0.0237 | 0.096 | 0.1447 | 0.05 | 0.2017 | -1.0 | 0.0 | 0.0 | 0.0218 | 0.3484 | 0.0018 | 0.0857 |
| 1.362 | 28.125 | 1800 | 1.3862 | 0.0078 | 0.0215 | 0.0045 | 0.0017 | 0.0119 | -1.0 | 0.0161 | 0.0951 | 0.1244 | 0.0318 | 0.1783 | -1.0 | 0.0 | 0.0 | 0.0232 | 0.3161 | 0.0003 | 0.0571 |
| 1.2454 | 28.9062 | 1850 | 1.4007 | 0.0069 | 0.0202 | 0.0037 | 0.0021 | 0.0105 | -1.0 | 0.0183 | 0.0854 | 0.1223 | 0.0273 | 0.1767 | -1.0 | 0.0 | 0.0 | 0.0203 | 0.3097 | 0.0003 | 0.0571 |
| 1.4498 | 29.6875 | 1900 | 1.3674 | 0.0068 | 0.0171 | 0.0032 | 0.0021 | 0.0102 | -1.0 | 0.0269 | 0.086 | 0.1177 | 0.0318 | 0.17 | -1.0 | 0.0 | 0.0 | 0.0203 | 0.3387 | 0.0 | 0.0143 |
| 1.2829 | 30.4688 | 1950 | 1.3086 | 0.0073 | 0.0177 | 0.0065 | 0.0011 | 0.0112 | -1.0 | 0.0204 | 0.0892 | 0.1108 | 0.0227 | 0.1633 | -1.0 | 0.0 | 0.0 | 0.0219 | 0.3323 | 0.0 | 0.0 |
| 1.5519 | 31.25 | 2000 | 1.3179 | 0.0115 | 0.0292 | 0.0049 | 0.0018 | 0.0195 | -1.0 | 0.0363 | 0.1353 | 0.1707 | 0.0409 | 0.2417 | -1.0 | 0.0 | 0.0 | 0.0237 | 0.3548 | 0.0107 | 0.1571 |
| 0.7896 | 32.0312 | 2050 | 1.2675 | 0.0106 | 0.0268 | 0.0027 | 0.0025 | 0.0182 | -1.0 | 0.0535 | 0.1289 | 0.1547 | 0.0273 | 0.2233 | -1.0 | 0.0 | 0.0 | 0.0202 | 0.3355 | 0.0116 | 0.1286 |
| 1.4217 | 32.8125 | 2100 | 1.3383 | 0.0109 | 0.0357 | 0.0041 | 0.0008 | 0.0169 | -1.0 | 0.0366 | 0.1061 | 0.1303 | 0.0227 | 0.19 | -1.0 | 0.0 | 0.0 | 0.031 | 0.3194 | 0.0016 | 0.0714 |
| 1.5639 | 33.5938 | 2150 | 1.3229 | 0.011 | 0.0368 | 0.0055 | 0.0042 | 0.0171 | -1.0 | 0.0349 | 0.1361 | 0.1479 | 0.0318 | 0.2133 | -1.0 | 0.0 | 0.0 | 0.0301 | 0.3581 | 0.003 | 0.0857 |
| 1.0576 | 34.375 | 2200 | 1.4207 | 0.0086 | 0.0224 | 0.0059 | 0.0033 | 0.0127 | -1.0 | 0.043 | 0.0903 | 0.1194 | 0.05 | 0.1667 | -1.0 | 0.0 | 0.0 | 0.0259 | 0.3581 | 0.0 | 0.0 |
| 1.0306 | 35.1562 | 2250 | 1.3139 | 0.0103 | 0.0263 | 0.0052 | 0.0009 | 0.0163 | -1.0 | 0.0376 | 0.1089 | 0.1516 | 0.0227 | 0.2217 | -1.0 | 0.0 | 0.0 | 0.0285 | 0.3548 | 0.0024 | 0.1 |
| 1.486 | 35.9375 | 2300 | 1.3837 | 0.0092 | 0.0253 | 0.0051 | 0.0014 | 0.0143 | -1.0 | 0.0258 | 0.1109 | 0.1324 | 0.0182 | 0.195 | -1.0 | 0.0 | 0.0 | 0.0261 | 0.3258 | 0.0015 | 0.0714 |
| 1.2642 | 36.7188 | 2350 | 1.3354 | 0.0112 | 0.0389 | 0.0064 | 0.0004 | 0.018 | -1.0 | 0.0398 | 0.1444 | 0.1637 | 0.0091 | 0.2433 | -1.0 | 0.0 | 0.0 | 0.0295 | 0.3484 | 0.0041 | 0.1429 |
| 1.1747 | 37.5 | 2400 | 1.3678 | 0.0098 | 0.027 | 0.0046 | 0.001 | 0.0151 | -1.0 | 0.0527 | 0.1174 | 0.1346 | 0.0318 | 0.1933 | -1.0 | 0.0 | 0.0 | 0.0285 | 0.3323 | 0.0009 | 0.0714 |
| 1.5575 | 38.2812 | 2450 | 1.3451 | 0.0098 | 0.0242 | 0.0069 | 0.0004 | 0.0151 | -1.0 | 0.0376 | 0.1035 | 0.1276 | 0.0273 | 0.185 | -1.0 | 0.0 | 0.0 | 0.0278 | 0.3258 | 0.0016 | 0.0571 |
| 1.4042 | 39.0625 | 2500 | 1.3499 | 0.0131 | 0.0322 | 0.0069 | 0.0011 | 0.0221 | -1.0 | 0.0502 | 0.12 | 0.1467 | 0.0227 | 0.2133 | -1.0 | 0.0 | 0.0 | 0.0287 | 0.3258 | 0.0107 | 0.1143 |
| 0.8908 | 39.8438 | 2550 | 1.2977 | 0.0165 | 0.0397 | 0.0077 | 0.0012 | 0.0291 | -1.0 | 0.0438 | 0.1344 | 0.1628 | 0.0318 | 0.235 | -1.0 | 0.0 | 0.0 | 0.0291 | 0.3742 | 0.0205 | 0.1143 |
| 1.5144 | 40.625 | 2600 | 1.3231 | 0.0138 | 0.0366 | 0.0078 | 0.0046 | 0.0215 | -1.0 | 0.0476 | 0.1381 | 0.1644 | 0.0409 | 0.2333 | -1.0 | 0.0 | 0.0 | 0.0313 | 0.3645 | 0.01 | 0.1286 |
| 1.1109 | 41.4062 | 2650 | 1.3453 | 0.0143 | 0.0414 | 0.0058 | 0.0024 | 0.0225 | -1.0 | 0.0379 | 0.1169 | 0.1389 | 0.0409 | 0.1967 | -1.0 | 0.0 | 0.0 | 0.0337 | 0.3452 | 0.0094 | 0.0714 |
| 1.1691 | 42.1875 | 2700 | 1.3283 | 0.0154 | 0.0455 | 0.0078 | 0.0037 | 0.0242 | -1.0 | 0.0536 | 0.1375 | 0.1716 | 0.0227 | 0.2483 | -1.0 | 0.0 | 0.0 | 0.0343 | 0.329 | 0.012 | 0.1857 |
| 1.9002 | 42.9688 | 2750 | 1.2725 | 0.0127 | 0.0338 | 0.0095 | 0.0017 | 0.0196 | -1.0 | 0.029 | 0.1174 | 0.1691 | 0.0364 | 0.2417 | -1.0 | 0.0 | 0.0 | 0.0328 | 0.3645 | 0.0052 | 0.1429 |
| 1.276 | 43.75 | 2800 | 1.3445 | 0.0143 | 0.0339 | 0.0071 | 0.0034 | 0.0217 | -1.0 | 0.0588 | 0.1273 | 0.1768 | 0.0227 | 0.255 | -1.0 | 0.0 | 0.0 | 0.0291 | 0.3161 | 0.0138 | 0.2143 |
| 1.1785 | 44.5312 | 2850 | 1.2616 | 0.0153 | 0.046 | 0.0068 | 0.002 | 0.0229 | -1.0 | 0.0478 | 0.126 | 0.1542 | 0.0318 | 0.2217 | -1.0 | 0.0 | 0.0 | 0.0341 | 0.3484 | 0.0117 | 0.1143 |
| 0.9947 | 45.3125 | 2900 | 1.2818 | 0.0186 | 0.0493 | 0.0114 | 0.0039 | 0.0278 | -1.0 | 0.0312 | 0.1461 | 0.2118 | 0.0409 | 0.2983 | -1.0 | 0.0 | 0.0 | 0.0375 | 0.3355 | 0.0184 | 0.3 |
| 1.2224 | 46.0938 | 2950 | 1.2723 | 0.0182 | 0.0533 | 0.0095 | 0.0028 | 0.0275 | -1.0 | 0.0461 | 0.1598 | 0.2203 | 0.0273 | 0.315 | -1.0 | 0.0 | 0.0 | 0.0294 | 0.3323 | 0.0253 | 0.3286 |
| 1.8751 | 46.875 | 3000 | 1.2760 | 0.026 | 0.0533 | 0.0312 | 0.0061 | 0.0376 | -1.0 | 0.0587 | 0.1318 | 0.1934 | 0.0455 | 0.2717 | -1.0 | 0.0 | 0.0 | 0.0321 | 0.3516 | 0.046 | 0.2286 |
| 1.38 | 47.6562 | 3050 | 1.2529 | 0.0147 | 0.0393 | 0.0076 | 0.0097 | 0.0212 | -1.0 | 0.0269 | 0.1217 | 0.2002 | 0.0455 | 0.28 | -1.0 | 0.0 | 0.0 | 0.0274 | 0.329 | 0.0169 | 0.2714 |
| 1.2834 | 48.4375 | 3100 | 1.2167 | 0.0183 | 0.0574 | 0.0055 | 0.0106 | 0.0264 | -1.0 | 0.0306 | 0.1461 | 0.2057 | 0.0818 | 0.2767 | -1.0 | 0.0 | 0.0 | 0.0342 | 0.3742 | 0.0207 | 0.2429 |
| 0.8984 | 49.2188 | 3150 | 1.2765 | 0.0232 | 0.0545 | 0.005 | 0.0117 | 0.0332 | -1.0 | 0.0453 | 0.1157 | 0.1736 | 0.0455 | 0.2417 | -1.0 | 0.0 | 0.0 | 0.031 | 0.3065 | 0.0386 | 0.2143 |
| 0.9636 | 50.0 | 3200 | 1.2292 | 0.021 | 0.0525 | 0.0093 | 0.0093 | 0.0302 | -1.0 | 0.0352 | 0.1659 | 0.1957 | 0.0591 | 0.2717 | -1.0 | 0.0 | 0.0 | 0.0315 | 0.3871 | 0.0315 | 0.2 |
| 1.4512 | 50.7812 | 3250 | 1.2248 | 0.0183 | 0.0574 | 0.0048 | 0.0047 | 0.0275 | -1.0 | 0.0316 | 0.112 | 0.1843 | 0.0636 | 0.2517 | -1.0 | 0.0 | 0.0 | 0.0415 | 0.3387 | 0.0133 | 0.2143 |
| 1.2392 | 51.5625 | 3300 | 1.2376 | 0.0297 | 0.0563 | 0.0371 | 0.0044 | 0.0435 | -1.0 | 0.0598 | 0.1386 | 0.1917 | 0.0318 | 0.2733 | -1.0 | 0.0 | 0.0 | 0.043 | 0.3323 | 0.0462 | 0.2429 |
| 1.6072 | 52.3438 | 3350 | 1.2323 | 0.0285 | 0.0606 | 0.0238 | 0.0034 | 0.042 | -1.0 | 0.0496 | 0.1527 | 0.1891 | 0.0455 | 0.265 | -1.0 | 0.0 | 0.0 | 0.044 | 0.3387 | 0.0417 | 0.2286 |
| 0.7403 | 53.125 | 3400 | 1.2880 | 0.0184 | 0.0479 | 0.0038 | 0.0094 | 0.0275 | -1.0 | 0.029 | 0.1012 | 0.1725 | 0.0273 | 0.2467 | -1.0 | 0.0 | 0.0 | 0.0424 | 0.3032 | 0.0128 | 0.2143 |
| 1.4539 | 53.9062 | 3450 | 1.2321 | 0.0185 | 0.052 | 0.0066 | 0.0042 | 0.0269 | -1.0 | 0.039 | 0.1318 | 0.168 | 0.0455 | 0.2367 | -1.0 | 0.0 | 0.0 | 0.0294 | 0.3613 | 0.0261 | 0.1429 |
| 0.7978 | 54.6875 | 3500 | 1.2440 | 0.0193 | 0.0435 | 0.0108 | 0.0094 | 0.0277 | -1.0 | 0.0535 | 0.1329 | 0.2132 | 0.0636 | 0.295 | -1.0 | 0.0 | 0.0 | 0.0355 | 0.3968 | 0.0223 | 0.2429 |
| 1.5656 | 55.4688 | 3550 | 1.2641 | 0.0182 | 0.0435 | 0.0085 | 0.013 | 0.0257 | -1.0 | 0.0459 | 0.1233 | 0.1724 | 0.0682 | 0.235 | -1.0 | 0.0 | 0.0 | 0.034 | 0.3742 | 0.0204 | 0.1429 |
| 1.5915 | 56.25 | 3600 | 1.2424 | 0.0201 | 0.0506 | 0.0083 | 0.0145 | 0.0289 | -1.0 | 0.0316 | 0.1313 | 0.1687 | 0.05 | 0.2367 | -1.0 | 0.0 | 0.0 | 0.0421 | 0.3774 | 0.0181 | 0.1286 |
| 0.983 | 57.0312 | 3650 | 1.2732 | 0.026 | 0.0552 | 0.0339 | 0.0023 | 0.0374 | -1.0 | 0.0533 | 0.1275 | 0.1777 | 0.0409 | 0.2533 | -1.0 | 0.0 | 0.0 | 0.0334 | 0.3903 | 0.0447 | 0.1429 |
| 1.3788 | 57.8125 | 3700 | 1.2715 | 0.0244 | 0.05 | 0.0261 | 0.0024 | 0.0352 | -1.0 | 0.0518 | 0.1301 | 0.1857 | 0.0409 | 0.265 | -1.0 | 0.0 | 0.0 | 0.0304 | 0.4 | 0.0427 | 0.1571 |
| 1.4864 | 58.5938 | 3750 | 1.2291 | 0.0232 | 0.0608 | 0.0135 | 0.0043 | 0.0345 | -1.0 | 0.0278 | 0.1412 | 0.2078 | 0.05 | 0.2917 | -1.0 | 0.0 | 0.0 | 0.0414 | 0.3806 | 0.028 | 0.2429 |
| 0.8287 | 59.375 | 3800 | 1.2190 | 0.024 | 0.0499 | 0.0219 | 0.0041 | 0.0353 | -1.0 | 0.0694 | 0.135 | 0.2286 | 0.0545 | 0.32 | -1.0 | 0.0 | 0.0 | 0.0478 | 0.4 | 0.0241 | 0.2857 |
| 0.8933 | 60.1562 | 3850 | 1.2321 | 0.0283 | 0.0578 | 0.0101 | 0.0018 | 0.042 | -1.0 | 0.0485 | 0.1361 | 0.2323 | 0.0364 | 0.3317 | -1.0 | 0.0 | 0.0 | 0.0462 | 0.3968 | 0.0389 | 0.3 |
| 1.1932 | 60.9375 | 3900 | 1.1959 | 0.0167 | 0.0438 | 0.0092 | 0.002 | 0.0259 | -1.0 | 0.0359 | 0.1131 | 0.1908 | 0.0364 | 0.2717 | -1.0 | 0.0 | 0.0 | 0.0417 | 0.3581 | 0.0084 | 0.2143 |
| 1.0248 | 61.7188 | 3950 | 1.2461 | 0.0185 | 0.0399 | 0.0117 | 0.0061 | 0.0281 | -1.0 | 0.0513 | 0.1255 | 0.1903 | 0.05 | 0.2667 | -1.0 | 0.0 | 0.0 | 0.0483 | 0.371 | 0.0071 | 0.2 |
| 0.5128 | 62.5 | 4000 | 1.2597 | 0.0202 | 0.04 | 0.0132 | 0.0088 | 0.03 | -1.0 | 0.07 | 0.1335 | 0.2301 | 0.0636 | 0.3183 | -1.0 | 0.0 | 0.0 | 0.0503 | 0.3903 | 0.0102 | 0.3 |
| 1.1046 | 63.2812 | 4050 | 1.1834 | 0.028 | 0.0612 | 0.0177 | 0.0078 | 0.0412 | -1.0 | 0.0599 | 0.1562 | 0.2323 | 0.0727 | 0.3183 | -1.0 | 0.0 | 0.0 | 0.0494 | 0.3968 | 0.0347 | 0.3 |
| 0.9547 | 64.0625 | 4100 | 1.2606 | 0.0197 | 0.044 | 0.0128 | 0.0079 | 0.0295 | -1.0 | 0.0562 | 0.1384 | 0.1803 | 0.0864 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0475 | 0.3839 | 0.0116 | 0.1571 |
| 0.9502 | 64.8438 | 4150 | 1.2495 | 0.018 | 0.0372 | 0.0139 | 0.0061 | 0.0268 | -1.0 | 0.0344 | 0.1538 | 0.1836 | 0.0727 | 0.25 | -1.0 | 0.0 | 0.0 | 0.051 | 0.3935 | 0.0029 | 0.1571 |
| 0.9012 | 65.625 | 4200 | 1.2171 | 0.0128 | 0.0358 | 0.0093 | 0.0053 | 0.0197 | -1.0 | 0.0366 | 0.1621 | 0.1957 | 0.1068 | 0.2567 | -1.0 | 0.0 | 0.0 | 0.0326 | 0.3871 | 0.0057 | 0.2 |
| 1.4715 | 66.4062 | 4250 | 1.2161 | 0.0159 | 0.0565 | 0.011 | 0.0047 | 0.0242 | -1.0 | 0.0445 | 0.1642 | 0.194 | 0.0727 | 0.2633 | -1.0 | 0.0 | 0.0 | 0.0366 | 0.3677 | 0.0111 | 0.2143 |
| 1.2676 | 67.1875 | 4300 | 1.2170 | 0.0302 | 0.0626 | 0.0216 | 0.0058 | 0.0447 | -1.0 | 0.0648 | 0.1797 | 0.2126 | 0.0727 | 0.29 | -1.0 | 0.0 | 0.0 | 0.0636 | 0.3806 | 0.027 | 0.2571 |
| 0.9165 | 67.9688 | 4350 | 1.1931 | 0.0267 | 0.0635 | 0.0138 | 0.0045 | 0.0403 | -1.0 | 0.0584 | 0.188 | 0.2312 | 0.0886 | 0.3133 | -1.0 | 0.0 | 0.0 | 0.0537 | 0.3935 | 0.0265 | 0.3 |
| 0.9043 | 68.75 | 4400 | 1.1971 | 0.0254 | 0.0626 | 0.0119 | 0.0052 | 0.0373 | -1.0 | 0.0648 | 0.1627 | 0.2063 | 0.1136 | 0.2717 | -1.0 | 0.0 | 0.0 | 0.0516 | 0.3903 | 0.0245 | 0.2286 |
| 1.1175 | 69.5312 | 4450 | 1.1998 | 0.0195 | 0.0662 | 0.0106 | 0.0073 | 0.0285 | -1.0 | 0.0327 | 0.1453 | 0.2043 | 0.0909 | 0.2733 | -1.0 | 0.0 | 0.0 | 0.0495 | 0.4129 | 0.0091 | 0.2 |
| 1.0341 | 70.3125 | 4500 | 1.2569 | 0.0277 | 0.092 | 0.0161 | 0.0034 | 0.0406 | -1.0 | 0.0364 | 0.1295 | 0.1685 | 0.0591 | 0.2317 | -1.0 | 0.0 | 0.0 | 0.0517 | 0.3484 | 0.0315 | 0.1571 |
| 1.1449 | 71.0938 | 4550 | 1.2740 | 0.0242 | 0.0614 | 0.0191 | 0.0058 | 0.0355 | -1.0 | 0.0386 | 0.1143 | 0.1925 | 0.0773 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0558 | 0.3774 | 0.0168 | 0.2 |
| 1.0001 | 71.875 | 4600 | 1.3106 | 0.0227 | 0.0616 | 0.0109 | 0.0037 | 0.0332 | -1.0 | 0.0476 | 0.104 | 0.1869 | 0.0545 | 0.2583 | -1.0 | 0.0 | 0.0 | 0.044 | 0.3323 | 0.024 | 0.2286 |
| 0.7522 | 72.6562 | 4650 | 1.3374 | 0.0239 | 0.0504 | 0.0199 | 0.0068 | 0.0375 | -1.0 | 0.0376 | 0.1221 | 0.1882 | 0.0773 | 0.2533 | -1.0 | 0.0 | 0.0 | 0.0659 | 0.3645 | 0.0059 | 0.2 |
| 0.9972 | 73.4375 | 4700 | 1.3006 | 0.0255 | 0.0529 | 0.0213 | 0.0044 | 0.0449 | -1.0 | 0.0599 | 0.131 | 0.1811 | 0.0727 | 0.2433 | -1.0 | 0.0 | 0.0 | 0.0577 | 0.329 | 0.0187 | 0.2143 |
| 0.9227 | 74.2188 | 4750 | 1.2410 | 0.0239 | 0.0454 | 0.0195 | 0.0033 | 0.0395 | -1.0 | 0.0487 | 0.1221 | 0.2178 | 0.0591 | 0.3017 | -1.0 | 0.0 | 0.0 | 0.0559 | 0.3677 | 0.0157 | 0.2857 |
| 1.0915 | 75.0 | 4800 | 1.2961 | 0.0163 | 0.0404 | 0.0091 | 0.0022 | 0.0267 | -1.0 | 0.0427 | 0.1137 | 0.1845 | 0.0545 | 0.2567 | -1.0 | 0.0 | 0.0 | 0.0377 | 0.3677 | 0.0112 | 0.1857 |
| 1.6626 | 75.7812 | 4850 | 1.2969 | 0.0146 | 0.0359 | 0.0129 | 0.0017 | 0.0233 | -1.0 | 0.0545 | 0.1169 | 0.1785 | 0.0318 | 0.255 | -1.0 | 0.0 | 0.0 | 0.0377 | 0.3355 | 0.006 | 0.2 |
| 1.1782 | 76.5625 | 4900 | 1.2964 | 0.0186 | 0.0413 | 0.021 | 0.0038 | 0.0282 | -1.0 | 0.0419 | 0.1154 | 0.1711 | 0.05 | 0.2383 | -1.0 | 0.0 | 0.0 | 0.0534 | 0.3419 | 0.0025 | 0.1714 |
| 1.131 | 77.3438 | 4950 | 1.2270 | 0.0186 | 0.0376 | 0.0124 | 0.004 | 0.0278 | -1.0 | 0.0396 | 0.1214 | 0.1814 | 0.0682 | 0.2483 | -1.0 | 0.0 | 0.0 | 0.0531 | 0.3871 | 0.0027 | 0.1571 |
| 1.0441 | 78.125 | 5000 | 1.2069 | 0.0189 | 0.0387 | 0.0169 | 0.003 | 0.029 | -1.0 | 0.0498 | 0.1164 | 0.1813 | 0.0545 | 0.2517 | -1.0 | 0.0 | 0.0 | 0.0531 | 0.3581 | 0.0036 | 0.1857 |
| 1.0901 | 78.9062 | 5050 | 1.1993 | 0.021 | 0.042 | 0.0222 | 0.0046 | 0.0317 | -1.0 | 0.0323 | 0.1335 | 0.2264 | 0.1068 | 0.3 | -1.0 | 0.0 | 0.0 | 0.0591 | 0.3935 | 0.004 | 0.2857 |
| 0.845 | 79.6875 | 5100 | 1.1731 | 0.0147 | 0.0362 | 0.0118 | 0.005 | 0.022 | -1.0 | 0.028 | 0.1432 | 0.2186 | 0.0818 | 0.2967 | -1.0 | 0.0 | 0.0 | 0.0411 | 0.4129 | 0.0032 | 0.2429 |
| 1.1312 | 80.4688 | 5150 | 1.1796 | 0.0243 | 0.056 | 0.0232 | 0.0046 | 0.0361 | -1.0 | 0.0441 | 0.1584 | 0.2252 | 0.1386 | 0.29 | -1.0 | 0.0 | 0.0 | 0.0584 | 0.3613 | 0.0144 | 0.3143 |
| 0.8801 | 81.25 | 5200 | 1.1800 | 0.0245 | 0.0575 | 0.0116 | 0.0069 | 0.0352 | -1.0 | 0.0642 | 0.1462 | 0.2094 | 0.0909 | 0.2783 | -1.0 | 0.0 | 0.0 | 0.041 | 0.371 | 0.0325 | 0.2571 |
| 1.0512 | 82.0312 | 5250 | 1.1787 | 0.033 | 0.0669 | 0.0226 | 0.0102 | 0.048 | -1.0 | 0.0733 | 0.1616 | 0.2453 | 0.1977 | 0.3017 | -1.0 | 0.0 | 0.0 | 0.0575 | 0.3645 | 0.0415 | 0.3714 |
| 1.3642 | 82.8125 | 5300 | 1.1866 | 0.0249 | 0.056 | 0.0228 | 0.0165 | 0.0367 | -1.0 | 0.0505 | 0.1877 | 0.2524 | 0.1705 | 0.3183 | -1.0 | 0.0 | 0.0 | 0.0635 | 0.4 | 0.0112 | 0.3571 |
| 0.672 | 83.5938 | 5350 | 1.1975 | 0.0227 | 0.0486 | 0.0131 | 0.0033 | 0.0349 | -1.0 | 0.0693 | 0.1948 | 0.2367 | 0.0545 | 0.3283 | -1.0 | 0.0 | 0.0 | 0.0548 | 0.3387 | 0.0132 | 0.3714 |
| 1.3491 | 84.375 | 5400 | 1.1901 | 0.0228 | 0.0505 | 0.0243 | 0.0027 | 0.0352 | -1.0 | 0.0323 | 0.1578 | 0.2289 | 0.0591 | 0.3167 | -1.0 | 0.0 | 0.0 | 0.0576 | 0.3581 | 0.0106 | 0.3286 |
| 1.3377 | 85.1562 | 5450 | 1.1934 | 0.0238 | 0.0521 | 0.0239 | 0.0108 | 0.0355 | -1.0 | 0.0685 | 0.1441 | 0.2263 | 0.2068 | 0.2717 | -1.0 | 0.0 | 0.0 | 0.0604 | 0.3645 | 0.0111 | 0.3143 |
| 1.0356 | 85.9375 | 5500 | 1.2006 | 0.024 | 0.0525 | 0.0198 | 0.0081 | 0.0367 | -1.0 | 0.0621 | 0.1367 | 0.2152 | 0.0682 | 0.295 | -1.0 | 0.0 | 0.0 | 0.058 | 0.3742 | 0.014 | 0.2714 |
| 0.8696 | 86.7188 | 5550 | 1.1967 | 0.0247 | 0.0604 | 0.0204 | 0.0121 | 0.0372 | -1.0 | 0.0541 | 0.1367 | 0.2051 | 0.0773 | 0.2767 | -1.0 | 0.0 | 0.0 | 0.0616 | 0.3581 | 0.0125 | 0.2571 |
| 1.0582 | 87.5 | 5600 | 1.2069 | 0.0245 | 0.0571 | 0.0206 | 0.0156 | 0.0357 | -1.0 | 0.0551 | 0.1379 | 0.1978 | 0.125 | 0.2533 | -1.0 | 0.0 | 0.0 | 0.0641 | 0.3935 | 0.0094 | 0.2 |
| 1.3351 | 88.2812 | 5650 | 1.1957 | 0.0237 | 0.0569 | 0.0167 | 0.0076 | 0.0345 | -1.0 | 0.0594 | 0.1399 | 0.2104 | 0.1318 | 0.27 | -1.0 | 0.0 | 0.0 | 0.059 | 0.3742 | 0.0121 | 0.2571 |
| 1.4034 | 89.0625 | 5700 | 1.2276 | 0.0223 | 0.0599 | 0.0118 | 0.0087 | 0.0331 | -1.0 | 0.059 | 0.1575 | 0.2083 | 0.0818 | 0.28 | -1.0 | 0.0 | 0.0 | 0.0595 | 0.3677 | 0.0075 | 0.2571 |
| 0.9675 | 89.8438 | 5750 | 1.2125 | 0.0259 | 0.0803 | 0.0132 | 0.0123 | 0.0375 | -1.0 | 0.0605 | 0.1447 | 0.2109 | 0.1364 | 0.2683 | -1.0 | 0.0 | 0.0 | 0.0518 | 0.3613 | 0.026 | 0.2714 |
| 0.7693 | 90.625 | 5800 | 1.1752 | 0.0215 | 0.0686 | 0.013 | 0.0117 | 0.0313 | -1.0 | 0.0525 | 0.1432 | 0.2083 | 0.1364 | 0.265 | -1.0 | 0.0 | 0.0 | 0.0538 | 0.3677 | 0.0107 | 0.2571 |
| 1.1744 | 91.4062 | 5850 | 1.2025 | 0.0198 | 0.0628 | 0.0112 | 0.0102 | 0.0285 | -1.0 | 0.0487 | 0.1395 | 0.2204 | 0.1318 | 0.2833 | -1.0 | 0.0 | 0.0 | 0.0503 | 0.3613 | 0.009 | 0.3 |
| 1.4173 | 92.1875 | 5900 | 1.1697 | 0.0257 | 0.0671 | 0.0238 | 0.0093 | 0.0374 | -1.0 | 0.0445 | 0.1644 | 0.2321 | 0.1818 | 0.2867 | -1.0 | 0.0 | 0.0 | 0.0558 | 0.3677 | 0.0214 | 0.3286 |
| 0.4756 | 92.9688 | 5950 | 1.1985 | 0.0177 | 0.0445 | 0.0127 | 0.0086 | 0.0273 | -1.0 | 0.0333 | 0.1607 | 0.239 | 0.1068 | 0.3167 | -1.0 | 0.0 | 0.0 | 0.0456 | 0.3742 | 0.0076 | 0.3429 |
| 0.7877 | 93.75 | 6000 | 1.2018 | 0.0174 | 0.0459 | 0.0128 | 0.0076 | 0.027 | -1.0 | 0.0392 | 0.149 | 0.2072 | 0.0591 | 0.2867 | -1.0 | 0.0 | 0.0 | 0.0468 | 0.3645 | 0.0054 | 0.2571 |
| 0.9659 | 94.5312 | 6050 | 1.1644 | 0.0201 | 0.0548 | 0.0136 | 0.006 | 0.03 | -1.0 | 0.0456 | 0.1505 | 0.2512 | 0.1909 | 0.31 | -1.0 | 0.0 | 0.0 | 0.0515 | 0.3677 | 0.009 | 0.3857 |
| 1.4117 | 95.3125 | 6100 | 1.1477 | 0.0205 | 0.0578 | 0.0143 | 0.0111 | 0.0302 | -1.0 | 0.043 | 0.1644 | 0.2412 | 0.1795 | 0.2983 | -1.0 | 0.0 | 0.0 | 0.0517 | 0.3806 | 0.0098 | 0.3429 |
| 1.3861 | 96.0938 | 6150 | 1.1695 | 0.0216 | 0.0493 | 0.0192 | 0.0105 | 0.0327 | -1.0 | 0.0398 | 0.1665 | 0.255 | 0.1 | 0.34 | -1.0 | 0.0 | 0.0 | 0.0552 | 0.3935 | 0.0096 | 0.3714 |
| 0.8848 | 96.875 | 6200 | 1.1615 | 0.0205 | 0.0494 | 0.013 | 0.0081 | 0.0302 | -1.0 | 0.0445 | 0.1536 | 0.2501 | 0.1705 | 0.3133 | -1.0 | 0.0 | 0.0 | 0.0492 | 0.3645 | 0.0123 | 0.3857 |
| 0.994 | 97.6562 | 6250 | 1.1699 | 0.0177 | 0.0508 | 0.0109 | 0.0057 | 0.0263 | -1.0 | 0.0376 | 0.1175 | 0.214 | 0.1614 | 0.265 | -1.0 | 0.0 | 0.0 | 0.0488 | 0.3419 | 0.0044 | 0.3 |
| 0.9745 | 98.4375 | 6300 | 1.1648 | 0.0203 | 0.05 | 0.0167 | 0.0105 | 0.0295 | -1.0 | 0.043 | 0.1373 | 0.2343 | 0.1705 | 0.2917 | -1.0 | 0.0 | 0.0 | 0.0565 | 0.3742 | 0.0045 | 0.3286 |
| 0.675 | 99.2188 | 6350 | 1.2046 | 0.0228 | 0.0527 | 0.0223 | 0.0055 | 0.0337 | -1.0 | 0.0387 | 0.1341 | 0.2066 | 0.1773 | 0.2517 | -1.0 | 0.0 | 0.0 | 0.0633 | 0.3484 | 0.005 | 0.2714 |
| 0.9219 | 100.0 | 6400 | 1.1805 | 0.0216 | 0.0538 | 0.0175 | 0.0055 | 0.0316 | -1.0 | 0.028 | 0.1352 | 0.2178 | 0.1909 | 0.2633 | -1.0 | 0.0 | 0.0 | 0.0595 | 0.3677 | 0.0053 | 0.2857 |
| 0.9901 | 100.7812 | 6450 | 1.1593 | 0.0221 | 0.0558 | 0.0141 | 0.0075 | 0.0323 | -1.0 | 0.0413 | 0.1432 | 0.2221 | 0.0864 | 0.2983 | -1.0 | 0.0 | 0.0 | 0.0588 | 0.3806 | 0.0076 | 0.2857 |
| 0.8857 | 101.5625 | 6500 | 1.1659 | 0.0178 | 0.0448 | 0.0114 | 0.0091 | 0.0259 | -1.0 | 0.0237 | 0.1479 | 0.2396 | 0.1136 | 0.3133 | -1.0 | 0.0 | 0.0 | 0.0458 | 0.3903 | 0.0078 | 0.3286 |
| 1.1024 | 102.3438 | 6550 | 1.1656 | 0.0209 | 0.0476 | 0.0136 | 0.0084 | 0.031 | -1.0 | 0.043 | 0.157 | 0.2301 | 0.0909 | 0.3083 | -1.0 | 0.0 | 0.0 | 0.0549 | 0.3903 | 0.0079 | 0.3 |
| 1.2697 | 103.125 | 6600 | 1.1850 | 0.0229 | 0.0551 | 0.02 | 0.0086 | 0.0335 | -1.0 | 0.0402 | 0.1501 | 0.2083 | 0.0955 | 0.275 | -1.0 | 0.0 | 0.0 | 0.0622 | 0.3677 | 0.0066 | 0.2571 |
| 0.9893 | 103.9062 | 6650 | 1.1688 | 0.024 | 0.0463 | 0.0236 | 0.0063 | 0.0356 | -1.0 | 0.0381 | 0.1553 | 0.249 | 0.0955 | 0.3317 | -1.0 | 0.0 | 0.0 | 0.0632 | 0.3613 | 0.0089 | 0.3857 |
| 1.0774 | 104.6875 | 6700 | 1.1826 | 0.0252 | 0.0623 | 0.0238 | 0.0054 | 0.0383 | -1.0 | 0.0467 | 0.1553 | 0.2241 | 0.0886 | 0.3017 | -1.0 | 0.0 | 0.0 | 0.0666 | 0.3581 | 0.0091 | 0.3143 |
| 0.5791 | 105.4688 | 6750 | 1.1881 | 0.0187 | 0.0462 | 0.0096 | 0.0071 | 0.0278 | -1.0 | 0.0269 | 0.151 | 0.196 | 0.1932 | 0.2333 | -1.0 | 0.0 | 0.0 | 0.0507 | 0.3452 | 0.0053 | 0.2429 |
| 1.0598 | 106.25 | 6800 | 1.1317 | 0.0192 | 0.0492 | 0.0145 | 0.0074 | 0.028 | -1.0 | 0.0387 | 0.1404 | 0.1982 | 0.2023 | 0.2333 | -1.0 | 0.0 | 0.0 | 0.0547 | 0.3516 | 0.0029 | 0.2429 |
| 1.1959 | 107.0312 | 6850 | 1.1462 | 0.0206 | 0.056 | 0.0168 | 0.0044 | 0.0314 | -1.0 | 0.0398 | 0.1462 | 0.1992 | 0.1545 | 0.25 | -1.0 | 0.0 | 0.0 | 0.0562 | 0.3548 | 0.0057 | 0.2429 |
| 0.7161 | 107.8125 | 6900 | 1.1465 | 0.02 | 0.0472 | 0.0152 | 0.0063 | 0.0301 | -1.0 | 0.0376 | 0.1432 | 0.2174 | 0.0932 | 0.2917 | -1.0 | 0.0 | 0.0 | 0.0546 | 0.3806 | 0.0053 | 0.2714 |
| 1.0016 | 108.5938 | 6950 | 1.1418 | 0.0221 | 0.0564 | 0.017 | 0.0087 | 0.0337 | -1.0 | 0.0409 | 0.1693 | 0.2513 | 0.0932 | 0.34 | -1.0 | 0.0 | 0.0 | 0.0582 | 0.3968 | 0.008 | 0.3571 |
| 0.7246 | 109.375 | 7000 | 1.1783 | 0.0233 | 0.0575 | 0.0233 | 0.0043 | 0.036 | -1.0 | 0.0488 | 0.1542 | 0.2343 | 0.0886 | 0.3167 | -1.0 | 0.0 | 0.0 | 0.061 | 0.3742 | 0.0089 | 0.3286 |
| 1.1884 | 110.1562 | 7050 | 1.1401 | 0.0224 | 0.0527 | 0.0265 | 0.0052 | 0.0338 | -1.0 | 0.0366 | 0.1524 | 0.2387 | 0.1227 | 0.315 | -1.0 | 0.0 | 0.0 | 0.0631 | 0.4161 | 0.0042 | 0.3 |
| 0.7453 | 110.9375 | 7100 | 1.1591 | 0.0195 | 0.0533 | 0.0173 | 0.0039 | 0.0299 | -1.0 | 0.043 | 0.1505 | 0.2332 | 0.1182 | 0.3067 | -1.0 | 0.0 | 0.0 | 0.0525 | 0.371 | 0.0061 | 0.3286 |
| 1.131 | 111.7188 | 7150 | 1.1320 | 0.0242 | 0.0626 | 0.0231 | 0.0042 | 0.037 | -1.0 | 0.0498 | 0.1575 | 0.2449 | 0.1682 | 0.31 | -1.0 | 0.0 | 0.0 | 0.0607 | 0.3774 | 0.0119 | 0.3571 |
| 0.9711 | 112.5 | 7200 | 1.1211 | 0.0232 | 0.0604 | 0.0143 | 0.0061 | 0.035 | -1.0 | 0.0455 | 0.168 | 0.2258 | 0.1909 | 0.275 | -1.0 | 0.0 | 0.0 | 0.0576 | 0.3774 | 0.012 | 0.3 |
| 0.9416 | 113.2812 | 7250 | 1.1775 | 0.0261 | 0.0616 | 0.0243 | 0.0038 | 0.0405 | -1.0 | 0.0508 | 0.1642 | 0.2198 | 0.1341 | 0.2833 | -1.0 | 0.0 | 0.0 | 0.0656 | 0.3452 | 0.0127 | 0.3143 |
| 1.3022 | 114.0625 | 7300 | 1.1642 | 0.0245 | 0.0642 | 0.0215 | 0.0041 | 0.0379 | -1.0 | 0.0498 | 0.18 | 0.2235 | 0.1727 | 0.2767 | -1.0 | 0.0 | 0.0 | 0.0602 | 0.3419 | 0.0133 | 0.3286 |
| 1.4988 | 114.8438 | 7350 | 1.1382 | 0.0264 | 0.0635 | 0.0199 | 0.0088 | 0.0391 | -1.0 | 0.0508 | 0.168 | 0.2343 | 0.2114 | 0.2817 | -1.0 | 0.0 | 0.0 | 0.0691 | 0.3742 | 0.0103 | 0.3286 |
| 1.4201 | 115.625 | 7400 | 1.1775 | 0.024 | 0.0616 | 0.0236 | 0.0047 | 0.0373 | -1.0 | 0.0429 | 0.1461 | 0.2134 | 0.1432 | 0.27 | -1.0 | 0.0 | 0.0 | 0.0604 | 0.3258 | 0.0116 | 0.3143 |
| 0.7683 | 116.4062 | 7450 | 1.1837 | 0.0249 | 0.0627 | 0.0231 | 0.0093 | 0.0371 | -1.0 | 0.0409 | 0.1528 | 0.1978 | 0.1045 | 0.2583 | -1.0 | 0.0 | 0.0 | 0.0694 | 0.3935 | 0.0053 | 0.2 |
| 0.9967 | 117.1875 | 7500 | 1.1343 | 0.026 | 0.0569 | 0.0247 | 0.0077 | 0.039 | -1.0 | 0.043 | 0.1747 | 0.2303 | 0.0909 | 0.31 | -1.0 | 0.0 | 0.0 | 0.0704 | 0.4194 | 0.0075 | 0.2714 |
| 1.1531 | 117.9688 | 7550 | 1.1178 | 0.0256 | 0.0531 | 0.025 | 0.0096 | 0.0382 | -1.0 | 0.0461 | 0.1799 | 0.2355 | 0.0909 | 0.3167 | -1.0 | 0.0 | 0.0 | 0.0682 | 0.4065 | 0.0086 | 0.3 |
| 1.3522 | 118.75 | 7600 | 1.1938 | 0.0234 | 0.0621 | 0.0225 | 0.0077 | 0.0347 | -1.0 | 0.0498 | 0.1473 | 0.212 | 0.0955 | 0.28 | -1.0 | 0.0 | 0.0 | 0.0613 | 0.3645 | 0.0088 | 0.2714 |
| 0.7913 | 119.5312 | 7650 | 1.1439 | 0.0248 | 0.0567 | 0.0212 | 0.0113 | 0.0366 | -1.0 | 0.0541 | 0.1544 | 0.2217 | 0.1 | 0.2933 | -1.0 | 0.0 | 0.0 | 0.0678 | 0.3935 | 0.0067 | 0.2714 |
| 1.1967 | 120.3125 | 7700 | 1.1705 | 0.0238 | 0.0607 | 0.0241 | 0.0098 | 0.035 | -1.0 | 0.0562 | 0.1453 | 0.2184 | 0.0955 | 0.29 | -1.0 | 0.0 | 0.0 | 0.0641 | 0.3839 | 0.0073 | 0.2714 |
| 1.1244 | 121.0938 | 7750 | 1.1540 | 0.0224 | 0.0538 | 0.0233 | 0.0059 | 0.0336 | -1.0 | 0.0476 | 0.1336 | 0.2078 | 0.0864 | 0.2783 | -1.0 | 0.0 | 0.0 | 0.0618 | 0.3806 | 0.0055 | 0.2429 |
| 0.6604 | 121.875 | 7800 | 1.1592 | 0.0231 | 0.0535 | 0.0236 | 0.0069 | 0.0345 | -1.0 | 0.0472 | 0.1332 | 0.2078 | 0.0773 | 0.2817 | -1.0 | 0.0 | 0.0 | 0.0645 | 0.3806 | 0.0047 | 0.2429 |
| 1.3675 | 122.6562 | 7850 | 1.1508 | 0.0245 | 0.0546 | 0.0225 | 0.0056 | 0.0365 | -1.0 | 0.0524 | 0.1442 | 0.1909 | 0.1045 | 0.2483 | -1.0 | 0.0 | 0.0 | 0.0655 | 0.3871 | 0.0081 | 0.1857 |
| 0.8144 | 123.4375 | 7900 | 1.2068 | 0.0239 | 0.0557 | 0.0266 | 0.0028 | 0.0365 | -1.0 | 0.053 | 0.1367 | 0.186 | 0.0909 | 0.245 | -1.0 | 0.0 | 0.0 | 0.0645 | 0.3581 | 0.0072 | 0.2 |
| 0.8341 | 124.2188 | 7950 | 1.1728 | 0.0227 | 0.0575 | 0.0228 | 0.0057 | 0.0334 | -1.0 | 0.0513 | 0.141 | 0.176 | 0.1091 | 0.225 | -1.0 | 0.0 | 0.0 | 0.0589 | 0.371 | 0.0093 | 0.1571 |
| 0.8846 | 125.0 | 8000 | 1.1473 | 0.0225 | 0.0563 | 0.0158 | 0.0075 | 0.0321 | -1.0 | 0.0439 | 0.1475 | 0.1803 | 0.1727 | 0.2133 | -1.0 | 0.0 | 0.0 | 0.0582 | 0.3839 | 0.0092 | 0.1571 |
| 1.1327 | 125.7812 | 8050 | 1.1435 | 0.0249 | 0.0618 | 0.0227 | 0.0081 | 0.0359 | -1.0 | 0.0556 | 0.1501 | 0.2015 | 0.1841 | 0.2417 | -1.0 | 0.0 | 0.0 | 0.062 | 0.3903 | 0.0128 | 0.2143 |
| 1.0436 | 126.5625 | 8100 | 1.1390 | 0.0249 | 0.0552 | 0.0221 | 0.0075 | 0.0358 | -1.0 | 0.0535 | 0.1507 | 0.2206 | 0.2091 | 0.2617 | -1.0 | 0.0 | 0.0 | 0.0625 | 0.3903 | 0.0123 | 0.2714 |
| 0.8491 | 127.3438 | 8150 | 1.1546 | 0.0263 | 0.0591 | 0.0301 | 0.0049 | 0.0397 | -1.0 | 0.0651 | 0.1559 | 0.2068 | 0.0977 | 0.275 | -1.0 | 0.0 | 0.0 | 0.0636 | 0.3774 | 0.0153 | 0.2429 |
| 0.8821 | 128.125 | 8200 | 1.2071 | 0.0246 | 0.0574 | 0.0243 | 0.0028 | 0.0367 | -1.0 | 0.0625 | 0.1361 | 0.1971 | 0.0682 | 0.2683 | -1.0 | 0.0 | 0.0 | 0.0589 | 0.3484 | 0.015 | 0.2429 |
| 0.7626 | 128.9062 | 8250 | 1.1649 | 0.026 | 0.0583 | 0.0165 | 0.0042 | 0.0399 | -1.0 | 0.0647 | 0.1399 | 0.2178 | 0.0682 | 0.2983 | -1.0 | 0.0 | 0.0 | 0.063 | 0.3677 | 0.015 | 0.2857 |
| 0.8785 | 129.6875 | 8300 | 1.1356 | 0.0265 | 0.058 | 0.0229 | 0.0051 | 0.0391 | -1.0 | 0.0614 | 0.1484 | 0.221 | 0.1273 | 0.2867 | -1.0 | 0.0 | 0.0 | 0.0641 | 0.3774 | 0.0154 | 0.2857 |
| 1.1451 | 130.4688 | 8350 | 1.1162 | 0.0278 | 0.0603 | 0.0257 | 0.0085 | 0.041 | -1.0 | 0.0562 | 0.172 | 0.2372 | 0.1205 | 0.3117 | -1.0 | 0.0 | 0.0 | 0.0714 | 0.4258 | 0.012 | 0.2857 |
| 1.0488 | 131.25 | 8400 | 1.1421 | 0.0268 | 0.0599 | 0.0238 | 0.0078 | 0.0393 | -1.0 | 0.0578 | 0.1555 | 0.2359 | 0.1409 | 0.3033 | -1.0 | 0.0 | 0.0 | 0.0666 | 0.3935 | 0.0138 | 0.3143 |
| 1.0632 | 132.0312 | 8450 | 1.1593 | 0.0251 | 0.0608 | 0.0229 | 0.005 | 0.037 | -1.0 | 0.0631 | 0.1495 | 0.2263 | 0.1318 | 0.2917 | -1.0 | 0.0 | 0.0 | 0.0612 | 0.3645 | 0.0139 | 0.3143 |
| 0.8655 | 132.8125 | 8500 | 1.1641 | 0.025 | 0.0622 | 0.0175 | 0.0048 | 0.0381 | -1.0 | 0.0631 | 0.1538 | 0.2226 | 0.1432 | 0.285 | -1.0 | 0.0 | 0.0 | 0.061 | 0.3677 | 0.0139 | 0.3 |
| 0.7642 | 133.5938 | 8550 | 1.1465 | 0.0258 | 0.0594 | 0.0157 | 0.0049 | 0.0383 | -1.0 | 0.0535 | 0.1538 | 0.2174 | 0.1273 | 0.2817 | -1.0 | 0.0 | 0.0 | 0.0637 | 0.3806 | 0.0137 | 0.2714 |
| 0.5636 | 134.375 | 8600 | 1.1321 | 0.0266 | 0.0609 | 0.0191 | 0.0064 | 0.0392 | -1.0 | 0.0648 | 0.1714 | 0.2207 | 0.15 | 0.28 | -1.0 | 0.0 | 0.0 | 0.0681 | 0.4194 | 0.0116 | 0.2429 |
| 1.2294 | 135.1562 | 8650 | 1.1447 | 0.0259 | 0.0601 | 0.025 | 0.006 | 0.0387 | -1.0 | 0.0659 | 0.1522 | 0.2095 | 0.1045 | 0.275 | -1.0 | 0.0 | 0.0 | 0.0705 | 0.4 | 0.0074 | 0.2286 |
| 1.8606 | 135.9375 | 8700 | 1.1614 | 0.0262 | 0.0576 | 0.0253 | 0.0067 | 0.0387 | -1.0 | 0.0627 | 0.1628 | 0.2143 | 0.1 | 0.2833 | -1.0 | 0.0 | 0.0 | 0.0708 | 0.4 | 0.0079 | 0.2429 |
| 1.1145 | 136.7188 | 8750 | 1.1493 | 0.0257 | 0.0582 | 0.0265 | 0.0087 | 0.0379 | -1.0 | 0.0553 | 0.1565 | 0.2175 | 0.1091 | 0.285 | -1.0 | 0.0 | 0.0 | 0.0715 | 0.4097 | 0.0058 | 0.2429 |
| 0.5476 | 137.5 | 8800 | 1.1837 | 0.0247 | 0.0642 | 0.0264 | 0.0055 | 0.0367 | -1.0 | 0.0531 | 0.1488 | 0.1945 | 0.0864 | 0.2583 | -1.0 | 0.0 | 0.0 | 0.0656 | 0.3548 | 0.0086 | 0.2286 |
| 1.2711 | 138.2812 | 8850 | 1.1837 | 0.026 | 0.0636 | 0.0244 | 0.0058 | 0.0386 | -1.0 | 0.067 | 0.1462 | 0.1992 | 0.1068 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0694 | 0.3548 | 0.0087 | 0.2429 |
| 0.7984 | 139.0625 | 8900 | 1.1582 | 0.0267 | 0.0611 | 0.0254 | 0.006 | 0.0395 | -1.0 | 0.0616 | 0.1501 | 0.2243 | 0.1159 | 0.2933 | -1.0 | 0.0 | 0.0 | 0.0706 | 0.3871 | 0.0095 | 0.2857 |
| 0.9046 | 139.8438 | 8950 | 1.1674 | 0.0266 | 0.0616 | 0.0248 | 0.0052 | 0.0396 | -1.0 | 0.0616 | 0.1564 | 0.22 | 0.1159 | 0.2867 | -1.0 | 0.0 | 0.0 | 0.0698 | 0.3742 | 0.0101 | 0.2857 |
| 1.0865 | 140.625 | 9000 | 1.1438 | 0.0281 | 0.0668 | 0.0251 | 0.0063 | 0.0419 | -1.0 | 0.0627 | 0.1708 | 0.2223 | 0.1295 | 0.2867 | -1.0 | 0.0 | 0.0 | 0.0731 | 0.4097 | 0.0114 | 0.2571 |
| 1.0426 | 141.4062 | 9050 | 1.1461 | 0.0284 | 0.0616 | 0.0262 | 0.006 | 0.0421 | -1.0 | 0.0653 | 0.1691 | 0.2524 | 0.175 | 0.3167 | -1.0 | 0.0 | 0.0 | 0.0716 | 0.4 | 0.0137 | 0.3571 |
| 1.3125 | 142.1875 | 9100 | 1.1321 | 0.0266 | 0.0688 | 0.0273 | 0.0046 | 0.0399 | -1.0 | 0.067 | 0.1653 | 0.2131 | 0.1659 | 0.2633 | -1.0 | 0.0 | 0.0 | 0.0664 | 0.3677 | 0.0135 | 0.2714 |
| 0.7798 | 142.9688 | 9150 | 1.1413 | 0.0281 | 0.0665 | 0.0257 | 0.0066 | 0.041 | -1.0 | 0.0707 | 0.1605 | 0.2046 | 0.1591 | 0.2517 | -1.0 | 0.0 | 0.0 | 0.0676 | 0.371 | 0.0168 | 0.2429 |
| 0.71 | 143.75 | 9200 | 1.1400 | 0.0273 | 0.0602 | 0.0259 | 0.0048 | 0.0406 | -1.0 | 0.0585 | 0.1754 | 0.2215 | 0.1477 | 0.2817 | -1.0 | 0.0 | 0.0 | 0.0702 | 0.3645 | 0.0118 | 0.3 |
| 1.0794 | 144.5312 | 9250 | 1.1419 | 0.0268 | 0.0635 | 0.0188 | 0.0069 | 0.0397 | -1.0 | 0.0381 | 0.175 | 0.2343 | 0.1727 | 0.2933 | -1.0 | 0.0 | 0.0 | 0.0667 | 0.3742 | 0.0138 | 0.3286 |
| 1.2372 | 145.3125 | 9300 | 1.1581 | 0.0259 | 0.0631 | 0.0188 | 0.0055 | 0.0386 | -1.0 | 0.0461 | 0.1733 | 0.2241 | 0.1682 | 0.28 | -1.0 | 0.0 | 0.0 | 0.0641 | 0.3581 | 0.0137 | 0.3143 |
| 1.1543 | 146.0938 | 9350 | 1.1619 | 0.0256 | 0.0596 | 0.019 | 0.0054 | 0.0385 | -1.0 | 0.0631 | 0.1685 | 0.2252 | 0.1727 | 0.28 | -1.0 | 0.0 | 0.0 | 0.0642 | 0.3613 | 0.0126 | 0.3143 |
| 0.5925 | 146.875 | 9400 | 1.1743 | 0.0243 | 0.0591 | 0.0202 | 0.005 | 0.0364 | -1.0 | 0.0594 | 0.1579 | 0.2167 | 0.1273 | 0.28 | -1.0 | 0.0 | 0.0 | 0.0632 | 0.3645 | 0.0097 | 0.2857 |
| 0.461 | 147.6562 | 9450 | 1.1802 | 0.0243 | 0.0601 | 0.0193 | 0.005 | 0.0364 | -1.0 | 0.0594 | 0.1531 | 0.212 | 0.1273 | 0.2733 | -1.0 | 0.0 | 0.0 | 0.0635 | 0.3645 | 0.0094 | 0.2714 |
| 1.614 | 148.4375 | 9500 | 1.1799 | 0.0253 | 0.0614 | 0.0206 | 0.0051 | 0.0379 | -1.0 | 0.0605 | 0.1627 | 0.2178 | 0.1318 | 0.28 | -1.0 | 0.0 | 0.0 | 0.0651 | 0.3677 | 0.0109 | 0.2857 |
| 0.8701 | 149.2188 | 9550 | 1.1699 | 0.0251 | 0.0613 | 0.02 | 0.0046 | 0.0375 | -1.0 | 0.0642 | 0.161 | 0.2172 | 0.1523 | 0.2733 | -1.0 | 0.0 | 0.0 | 0.0618 | 0.3516 | 0.0135 | 0.3 |
| 0.6805 | 150.0 | 9600 | 1.1687 | 0.0244 | 0.0606 | 0.02 | 0.0046 | 0.0366 | -1.0 | 0.0594 | 0.1668 | 0.222 | 0.1523 | 0.28 | -1.0 | 0.0 | 0.0 | 0.0612 | 0.3516 | 0.0121 | 0.3143 |
| 1.081 | 150.7812 | 9650 | 1.1657 | 0.0247 | 0.0608 | 0.02 | 0.0048 | 0.0371 | -1.0 | 0.0594 | 0.169 | 0.2336 | 0.1523 | 0.2967 | -1.0 | 0.0 | 0.0 | 0.0616 | 0.3581 | 0.0125 | 0.3429 |
| 1.271 | 151.5625 | 9700 | 1.1566 | 0.0244 | 0.0608 | 0.02 | 0.0049 | 0.0366 | -1.0 | 0.0573 | 0.169 | 0.2278 | 0.1727 | 0.2833 | -1.0 | 0.0 | 0.0 | 0.0611 | 0.3548 | 0.0121 | 0.3286 |
| 0.7807 | 152.3438 | 9750 | 1.1615 | 0.0248 | 0.0612 | 0.0209 | 0.0053 | 0.0371 | -1.0 | 0.0573 | 0.17 | 0.2336 | 0.1523 | 0.2967 | -1.0 | 0.0 | 0.0 | 0.062 | 0.3581 | 0.0124 | 0.3429 |
| 1.0392 | 153.125 | 9800 | 1.1611 | 0.0247 | 0.0608 | 0.0201 | 0.0047 | 0.037 | -1.0 | 0.0562 | 0.1727 | 0.2289 | 0.1773 | 0.2833 | -1.0 | 0.0 | 0.0 | 0.0613 | 0.3581 | 0.0126 | 0.3286 |
| 1.0513 | 153.9062 | 9850 | 1.1615 | 0.0246 | 0.061 | 0.02 | 0.0047 | 0.0369 | -1.0 | 0.0584 | 0.1748 | 0.223 | 0.1477 | 0.2833 | -1.0 | 0.0 | 0.0 | 0.0613 | 0.3548 | 0.0126 | 0.3143 |
| 0.9073 | 154.6875 | 9900 | 1.1592 | 0.0243 | 0.0601 | 0.02 | 0.005 | 0.0366 | -1.0 | 0.0536 | 0.1737 | 0.2241 | 0.1568 | 0.2817 | -1.0 | 0.0 | 0.0 | 0.0613 | 0.3581 | 0.0116 | 0.3143 |
| 0.9581 | 155.4688 | 9950 | 1.1593 | 0.0243 | 0.0598 | 0.02 | 0.0049 | 0.0366 | -1.0 | 0.0584 | 0.1748 | 0.2204 | 0.1568 | 0.2767 | -1.0 | 0.0 | 0.0 | 0.0615 | 0.3613 | 0.0114 | 0.3 |
| 1.3363 | 156.25 | 10000 | 1.1568 | 0.0244 | 0.0597 | 0.02 | 0.0049 | 0.0369 | -1.0 | 0.0573 | 0.1748 | 0.2252 | 0.1568 | 0.2833 | -1.0 | 0.0 | 0.0 | 0.0614 | 0.3613 | 0.0118 | 0.3143 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.3.1+cu121
- Datasets 3.5.0
- Tokenizers 0.19.1
| [
"platelets",
"rbc",
"wbc"
] |
diribes/yolo_finetuned_fruits |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7791
- Map: 0.4714
- Map 50: 0.6956
- Map 75: 0.5409
- Map Small: -1.0
- Map Medium: 0.4873
- Map Large: 0.5087
- Mar 1: 0.4392
- Mar 10: 0.71
- Mar 100: 0.7539
- Mar Small: -1.0
- Mar Medium: 0.6438
- Mar Large: 0.7737
- Map Banana: 0.3709
- Mar 100 Banana: 0.7275
- Map Orange: 0.4969
- Mar 100 Orange: 0.7429
- Map Apple: 0.5463
- Mar 100 Apple: 0.7914
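The Map/Mar figures above are COCO-style detection metrics: "Map 50" is average precision counting a prediction as correct when its IoU with a ground-truth box is at least 0.5, "Map 75" uses a 0.75 threshold, and the plain "Map" averages over thresholds from 0.5 to 0.95. A minimal sketch of the IoU computation those thresholds are based on (an illustrative helper, not part of this repo; boxes are assumed to be in `(x_min, y_min, x_max, y_max)` format):

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes given as (x_min, y_min, x_max, y_max)."""
    ix_min, iy_min = max(a[0], b[0]), max(a[1], b[1])
    ix_max, iy_max = min(a[2], b[2]), min(a[3], b[3])
    # Clamp to zero so non-overlapping boxes get an empty intersection.
    inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A prediction counts toward "Map 50" if IoU >= 0.5, toward "Map 75" if IoU >= 0.75.
print(box_iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 1/3 overlap
```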
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
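With `lr_scheduler_type: cosine`, the learning rate decays from 5e-05 toward zero over the run. A hedged sketch of that decay curve, assuming no warmup and the 1800 total steps inferred from the training table below (neither is stated explicitly in the card):

```python
import math

def cosine_lr(step, total_steps=1800, base_lr=5e-5):
    """Cosine decay from base_lr at step 0 to 0 at the final step
    (assumes no warmup, matching lr_scheduler_type: cosine above)."""
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * step / total_steps))

print(cosine_lr(0))    # 5e-05 at the start
print(cosine_lr(900))  # ~2.5e-05 at the midpoint
```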
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 1.9961 | 0.0048 | 0.0162 | 0.0025 | -1.0 | 0.0199 | 0.0051 | 0.0369 | 0.1339 | 0.2356 | -1.0 | 0.1223 | 0.2534 | 0.0042 | 0.245 | 0.0063 | 0.2619 | 0.0038 | 0.2 |
| No log | 2.0 | 120 | 1.7598 | 0.0221 | 0.0672 | 0.0129 | -1.0 | 0.0577 | 0.0216 | 0.0639 | 0.1861 | 0.3615 | -1.0 | 0.1723 | 0.3772 | 0.0439 | 0.5525 | 0.0137 | 0.1548 | 0.0088 | 0.3771 |
| No log | 3.0 | 180 | 1.5885 | 0.0344 | 0.0984 | 0.0195 | -1.0 | 0.04 | 0.0394 | 0.1276 | 0.2906 | 0.4912 | -1.0 | 0.2277 | 0.5277 | 0.0393 | 0.5725 | 0.0242 | 0.3381 | 0.0398 | 0.5629 |
| No log | 4.0 | 240 | 1.7055 | 0.0432 | 0.1236 | 0.0231 | -1.0 | 0.086 | 0.0453 | 0.1399 | 0.2912 | 0.4533 | -1.0 | 0.1991 | 0.493 | 0.0514 | 0.48 | 0.0468 | 0.3143 | 0.0313 | 0.5657 |
| No log | 5.0 | 300 | 1.5373 | 0.0441 | 0.0939 | 0.0325 | -1.0 | 0.0775 | 0.049 | 0.1611 | 0.3362 | 0.4907 | -1.0 | 0.117 | 0.5434 | 0.0504 | 0.595 | 0.0585 | 0.3429 | 0.0233 | 0.5343 |
| No log | 6.0 | 360 | 1.3281 | 0.078 | 0.1589 | 0.0696 | -1.0 | 0.1315 | 0.0958 | 0.2488 | 0.4444 | 0.6168 | -1.0 | 0.4062 | 0.6491 | 0.0527 | 0.6375 | 0.1145 | 0.6214 | 0.0667 | 0.5914 |
| No log | 7.0 | 420 | 1.1093 | 0.084 | 0.1721 | 0.0795 | -1.0 | 0.2091 | 0.0909 | 0.2741 | 0.5155 | 0.6735 | -1.0 | 0.5036 | 0.6996 | 0.0753 | 0.6875 | 0.0937 | 0.65 | 0.0829 | 0.6829 |
| No log | 8.0 | 480 | 1.0723 | 0.1198 | 0.2091 | 0.126 | -1.0 | 0.3302 | 0.1316 | 0.3113 | 0.5349 | 0.6929 | -1.0 | 0.5598 | 0.7189 | 0.0885 | 0.635 | 0.1448 | 0.681 | 0.1259 | 0.7629 |
| 1.4412 | 9.0 | 540 | 0.9907 | 0.1246 | 0.2195 | 0.1319 | -1.0 | 0.1973 | 0.1443 | 0.3778 | 0.5932 | 0.7191 | -1.0 | 0.5759 | 0.7445 | 0.1133 | 0.6925 | 0.1306 | 0.6762 | 0.1298 | 0.7886 |
| 1.4412 | 10.0 | 600 | 0.9855 | 0.1517 | 0.2615 | 0.1465 | -1.0 | 0.2562 | 0.1671 | 0.35 | 0.5819 | 0.6753 | -1.0 | 0.4375 | 0.7129 | 0.1153 | 0.685 | 0.1784 | 0.6381 | 0.1613 | 0.7029 |
| 1.4412 | 11.0 | 660 | 0.9734 | 0.1793 | 0.2934 | 0.1965 | -1.0 | 0.2641 | 0.1978 | 0.3564 | 0.6029 | 0.6922 | -1.0 | 0.4839 | 0.7249 | 0.1385 | 0.6975 | 0.1879 | 0.6476 | 0.2114 | 0.7314 |
| 1.4412 | 12.0 | 720 | 1.0457 | 0.2177 | 0.3468 | 0.2265 | -1.0 | 0.2242 | 0.2489 | 0.3676 | 0.6224 | 0.6648 | -1.0 | 0.4259 | 0.704 | 0.158 | 0.66 | 0.255 | 0.6286 | 0.2399 | 0.7057 |
| 1.4412 | 13.0 | 780 | 0.8756 | 0.2393 | 0.3799 | 0.2619 | -1.0 | 0.3545 | 0.2646 | 0.4163 | 0.6819 | 0.7369 | -1.0 | 0.6054 | 0.7577 | 0.1926 | 0.745 | 0.2709 | 0.7143 | 0.2546 | 0.7514 |
| 1.4412 | 14.0 | 840 | 0.9067 | 0.2987 | 0.4602 | 0.3327 | -1.0 | 0.4857 | 0.305 | 0.3873 | 0.6723 | 0.7256 | -1.0 | 0.6661 | 0.7371 | 0.2039 | 0.7025 | 0.3302 | 0.7429 | 0.362 | 0.7314 |
| 1.4412 | 15.0 | 900 | 0.9761 | 0.2658 | 0.4491 | 0.2969 | -1.0 | 0.3493 | 0.2926 | 0.3656 | 0.6225 | 0.7037 | -1.0 | 0.5911 | 0.7232 | 0.2068 | 0.685 | 0.3045 | 0.6976 | 0.286 | 0.7286 |
| 1.4412 | 16.0 | 960 | 0.9318 | 0.2791 | 0.4399 | 0.3218 | -1.0 | 0.3934 | 0.3035 | 0.3802 | 0.6542 | 0.7144 | -1.0 | 0.6054 | 0.733 | 0.2017 | 0.7 | 0.306 | 0.6976 | 0.3295 | 0.7457 |
| 0.8426 | 17.0 | 1020 | 0.8593 | 0.3076 | 0.4558 | 0.3454 | -1.0 | 0.42 | 0.3289 | 0.4046 | 0.678 | 0.7545 | -1.0 | 0.6589 | 0.7731 | 0.2056 | 0.7125 | 0.3782 | 0.7452 | 0.3389 | 0.8057 |
| 0.8426 | 18.0 | 1080 | 0.8634 | 0.3121 | 0.5056 | 0.3313 | -1.0 | 0.4074 | 0.347 | 0.4025 | 0.6684 | 0.7438 | -1.0 | 0.6393 | 0.7646 | 0.2542 | 0.6925 | 0.3362 | 0.7476 | 0.3459 | 0.7914 |
| 0.8426 | 19.0 | 1140 | 0.8064 | 0.3787 | 0.5725 | 0.3908 | -1.0 | 0.422 | 0.4135 | 0.4251 | 0.7022 | 0.7504 | -1.0 | 0.6313 | 0.774 | 0.2907 | 0.695 | 0.4079 | 0.7476 | 0.4376 | 0.8086 |
| 0.8426 | 20.0 | 1200 | 0.7830 | 0.4137 | 0.6134 | 0.4445 | -1.0 | 0.4597 | 0.446 | 0.4294 | 0.7059 | 0.7565 | -1.0 | 0.6259 | 0.7801 | 0.3242 | 0.7225 | 0.4305 | 0.75 | 0.4864 | 0.7971 |
| 0.8426 | 21.0 | 1260 | 0.7738 | 0.4224 | 0.6321 | 0.468 | -1.0 | 0.4326 | 0.4516 | 0.4314 | 0.7087 | 0.7619 | -1.0 | 0.6232 | 0.7866 | 0.3299 | 0.7325 | 0.4396 | 0.7619 | 0.4977 | 0.7914 |
| 0.8426 | 22.0 | 1320 | 0.8010 | 0.429 | 0.6543 | 0.4765 | -1.0 | 0.4505 | 0.4649 | 0.4238 | 0.7088 | 0.7403 | -1.0 | 0.6027 | 0.7643 | 0.3371 | 0.7175 | 0.4546 | 0.7262 | 0.4953 | 0.7771 |
| 0.8426 | 23.0 | 1380 | 0.7663 | 0.4368 | 0.6546 | 0.473 | -1.0 | 0.4744 | 0.4744 | 0.4363 | 0.7014 | 0.7561 | -1.0 | 0.6571 | 0.7754 | 0.3623 | 0.7125 | 0.4574 | 0.7643 | 0.4906 | 0.7914 |
| 0.8426 | 24.0 | 1440 | 0.7869 | 0.4652 | 0.704 | 0.5348 | -1.0 | 0.4769 | 0.4996 | 0.4319 | 0.7171 | 0.75 | -1.0 | 0.6295 | 0.7716 | 0.3588 | 0.7225 | 0.5102 | 0.7476 | 0.5265 | 0.78 |
| 0.6721 | 25.0 | 1500 | 0.7694 | 0.466 | 0.6751 | 0.5348 | -1.0 | 0.4933 | 0.5055 | 0.4348 | 0.7088 | 0.752 | -1.0 | 0.6509 | 0.7704 | 0.3672 | 0.725 | 0.4856 | 0.731 | 0.5452 | 0.8 |
| 0.6721 | 26.0 | 1560 | 0.7674 | 0.4673 | 0.6805 | 0.5337 | -1.0 | 0.4829 | 0.5079 | 0.4362 | 0.7093 | 0.7527 | -1.0 | 0.6375 | 0.7737 | 0.3688 | 0.7225 | 0.4927 | 0.7357 | 0.5405 | 0.8 |
| 0.6721 | 27.0 | 1620 | 0.7790 | 0.4742 | 0.7018 | 0.5431 | -1.0 | 0.486 | 0.5139 | 0.4414 | 0.7103 | 0.7557 | -1.0 | 0.6438 | 0.7759 | 0.3763 | 0.7275 | 0.5016 | 0.7452 | 0.5448 | 0.7943 |
| 0.6721 | 28.0 | 1680 | 0.7775 | 0.4726 | 0.6975 | 0.5412 | -1.0 | 0.4855 | 0.5109 | 0.4383 | 0.7101 | 0.7522 | -1.0 | 0.6366 | 0.7728 | 0.3748 | 0.7275 | 0.4959 | 0.7405 | 0.5472 | 0.7886 |
| 0.6721 | 29.0 | 1740 | 0.7791 | 0.4715 | 0.6957 | 0.5409 | -1.0 | 0.4873 | 0.5089 | 0.4392 | 0.7109 | 0.7539 | -1.0 | 0.6438 | 0.7737 | 0.3715 | 0.7275 | 0.4969 | 0.7429 | 0.5462 | 0.7914 |
| 0.6721 | 30.0 | 1800 | 0.7791 | 0.4714 | 0.6956 | 0.5409 | -1.0 | 0.4873 | 0.5087 | 0.4392 | 0.71 | 0.7539 | -1.0 | 0.6438 | 0.7737 | 0.3709 | 0.7275 | 0.4969 | 0.7429 | 0.5463 | 0.7914 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
| [
"banana",
"orange",
"apple"
] |
ivferns/yolo_finetuned_fruits |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7884
- Map: 0.5807
- Map 50: 0.8349
- Map 75: 0.6454
- Map Small: -1.0
- Map Medium: 0.6834
- Map Large: 0.5849
- Mar 1: 0.4218
- Mar 10: 0.7223
- Mar 100: 0.7898
- Mar Small: -1.0
- Mar Medium: 0.7143
- Mar Large: 0.8008
- Map Banana: 0.4033
- Mar 100 Banana: 0.75
- Map Orange: 0.6231
- Mar 100 Orange: 0.8024
- Map Apple: 0.7157
- Mar 100 Apple: 0.8171
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 2.0139 | 0.0066 | 0.0212 | 0.0027 | -1.0 | 0.0037 | 0.0079 | 0.0686 | 0.1798 | 0.2746 | -1.0 | 0.0929 | 0.2962 | 0.0071 | 0.3425 | 0.0011 | 0.0643 | 0.0115 | 0.4171 |
| No log | 2.0 | 120 | 1.7738 | 0.0212 | 0.0584 | 0.0119 | -1.0 | 0.02 | 0.0242 | 0.1142 | 0.2996 | 0.4433 | -1.0 | 0.3232 | 0.4584 | 0.0187 | 0.455 | 0.0179 | 0.3548 | 0.027 | 0.52 |
| No log | 3.0 | 180 | 1.5507 | 0.0482 | 0.1393 | 0.0261 | -1.0 | 0.0414 | 0.0497 | 0.1395 | 0.3359 | 0.4989 | -1.0 | 0.2732 | 0.5223 | 0.0632 | 0.55 | 0.0408 | 0.281 | 0.0405 | 0.6657 |
| No log | 4.0 | 240 | 1.5842 | 0.0761 | 0.19 | 0.0338 | -1.0 | 0.0625 | 0.0803 | 0.17 | 0.3695 | 0.4696 | -1.0 | 0.3571 | 0.484 | 0.1125 | 0.525 | 0.0436 | 0.2952 | 0.0722 | 0.5886 |
| No log | 5.0 | 300 | 1.5088 | 0.06 | 0.1382 | 0.0366 | -1.0 | 0.1775 | 0.0582 | 0.1912 | 0.3768 | 0.51 | -1.0 | 0.5179 | 0.51 | 0.0582 | 0.5175 | 0.0502 | 0.4095 | 0.0716 | 0.6029 |
| No log | 6.0 | 360 | 1.4455 | 0.1084 | 0.2423 | 0.0602 | -1.0 | 0.2456 | 0.1027 | 0.2208 | 0.4244 | 0.5362 | -1.0 | 0.4643 | 0.5476 | 0.0846 | 0.5325 | 0.1017 | 0.419 | 0.139 | 0.6571 |
| No log | 7.0 | 420 | 1.2631 | 0.1451 | 0.2461 | 0.163 | -1.0 | 0.2669 | 0.1543 | 0.2573 | 0.4622 | 0.6294 | -1.0 | 0.5768 | 0.6385 | 0.0585 | 0.5825 | 0.0888 | 0.5 | 0.2881 | 0.8057 |
| No log | 8.0 | 480 | 1.2531 | 0.1467 | 0.2328 | 0.1608 | -1.0 | 0.3191 | 0.1568 | 0.2923 | 0.468 | 0.655 | -1.0 | 0.6429 | 0.6583 | 0.1008 | 0.5925 | 0.118 | 0.581 | 0.2211 | 0.7914 |
| 1.508 | 9.0 | 540 | 1.1516 | 0.1744 | 0.2949 | 0.1852 | -1.0 | 0.2948 | 0.2079 | 0.3346 | 0.5257 | 0.6978 | -1.0 | 0.5607 | 0.7177 | 0.1149 | 0.59 | 0.1606 | 0.6976 | 0.2476 | 0.8057 |
| 1.508 | 10.0 | 600 | 1.1138 | 0.2979 | 0.4906 | 0.3257 | -1.0 | 0.4518 | 0.2998 | 0.3235 | 0.5682 | 0.7065 | -1.0 | 0.6839 | 0.7123 | 0.178 | 0.6225 | 0.3009 | 0.7286 | 0.4147 | 0.7686 |
| 1.508 | 11.0 | 660 | 1.0224 | 0.3595 | 0.5625 | 0.399 | -1.0 | 0.5083 | 0.3788 | 0.3651 | 0.6193 | 0.7398 | -1.0 | 0.6768 | 0.7489 | 0.223 | 0.6775 | 0.3392 | 0.7619 | 0.5165 | 0.78 |
| 1.508 | 12.0 | 720 | 0.9289 | 0.4241 | 0.6453 | 0.4613 | -1.0 | 0.5583 | 0.4292 | 0.392 | 0.6519 | 0.7582 | -1.0 | 0.7464 | 0.7642 | 0.2221 | 0.695 | 0.4663 | 0.7738 | 0.584 | 0.8057 |
| 1.508 | 13.0 | 780 | 0.9366 | 0.445 | 0.6901 | 0.507 | -1.0 | 0.5302 | 0.4553 | 0.3852 | 0.6642 | 0.7537 | -1.0 | 0.675 | 0.7646 | 0.2955 | 0.7225 | 0.4751 | 0.75 | 0.5643 | 0.7886 |
| 1.508 | 14.0 | 840 | 0.9113 | 0.4709 | 0.7198 | 0.5633 | -1.0 | 0.5496 | 0.485 | 0.399 | 0.6869 | 0.7525 | -1.0 | 0.7268 | 0.7589 | 0.3115 | 0.715 | 0.5103 | 0.7452 | 0.5909 | 0.7971 |
| 1.508 | 15.0 | 900 | 0.8645 | 0.5101 | 0.7637 | 0.5848 | -1.0 | 0.6017 | 0.5203 | 0.4123 | 0.6792 | 0.7602 | -1.0 | 0.7054 | 0.7705 | 0.3204 | 0.7225 | 0.5754 | 0.7667 | 0.6345 | 0.7914 |
| 1.508 | 16.0 | 960 | 0.8947 | 0.5143 | 0.7771 | 0.5891 | -1.0 | 0.6307 | 0.5175 | 0.4041 | 0.6809 | 0.7662 | -1.0 | 0.7054 | 0.773 | 0.3113 | 0.7275 | 0.5785 | 0.7595 | 0.653 | 0.8114 |
| 0.887 | 17.0 | 1020 | 0.8798 | 0.5558 | 0.8316 | 0.6245 | -1.0 | 0.6535 | 0.562 | 0.414 | 0.6906 | 0.7619 | -1.0 | 0.7125 | 0.7721 | 0.3877 | 0.7175 | 0.5929 | 0.7595 | 0.6868 | 0.8086 |
| 0.887 | 18.0 | 1080 | 0.8313 | 0.5469 | 0.8066 | 0.6245 | -1.0 | 0.6548 | 0.5489 | 0.4138 | 0.7113 | 0.7858 | -1.0 | 0.7357 | 0.7929 | 0.3797 | 0.7475 | 0.5876 | 0.7929 | 0.6735 | 0.8171 |
| 0.887 | 19.0 | 1140 | 0.8462 | 0.5478 | 0.8191 | 0.6445 | -1.0 | 0.6461 | 0.55 | 0.4089 | 0.7115 | 0.7856 | -1.0 | 0.7196 | 0.797 | 0.3853 | 0.735 | 0.5963 | 0.8048 | 0.6618 | 0.8171 |
| 0.887 | 20.0 | 1200 | 0.8010 | 0.5579 | 0.8275 | 0.6407 | -1.0 | 0.6591 | 0.5626 | 0.4085 | 0.7079 | 0.7739 | -1.0 | 0.7446 | 0.7822 | 0.3899 | 0.7275 | 0.6097 | 0.7857 | 0.6741 | 0.8086 |
| 0.887 | 21.0 | 1260 | 0.7917 | 0.5707 | 0.8343 | 0.6548 | -1.0 | 0.6462 | 0.5799 | 0.4081 | 0.7204 | 0.7783 | -1.0 | 0.7196 | 0.7876 | 0.3921 | 0.745 | 0.6316 | 0.7929 | 0.6884 | 0.7971 |
| 0.887 | 22.0 | 1320 | 0.8459 | 0.5535 | 0.8298 | 0.6178 | -1.0 | 0.6422 | 0.56 | 0.4051 | 0.7059 | 0.7803 | -1.0 | 0.7125 | 0.7914 | 0.3614 | 0.73 | 0.612 | 0.8167 | 0.6872 | 0.7943 |
| 0.887 | 23.0 | 1380 | 0.8255 | 0.5685 | 0.8346 | 0.6427 | -1.0 | 0.641 | 0.5772 | 0.4141 | 0.7213 | 0.7808 | -1.0 | 0.7143 | 0.791 | 0.3819 | 0.74 | 0.6176 | 0.7881 | 0.706 | 0.8143 |
| 0.887 | 24.0 | 1440 | 0.8337 | 0.5714 | 0.8358 | 0.6285 | -1.0 | 0.6683 | 0.5772 | 0.4062 | 0.7098 | 0.7751 | -1.0 | 0.7054 | 0.787 | 0.3992 | 0.7325 | 0.6136 | 0.7929 | 0.7013 | 0.8 |
| 0.6681 | 25.0 | 1500 | 0.7999 | 0.5757 | 0.8302 | 0.6332 | -1.0 | 0.6743 | 0.5821 | 0.4071 | 0.7108 | 0.7744 | -1.0 | 0.7268 | 0.7818 | 0.3908 | 0.735 | 0.6343 | 0.8024 | 0.7019 | 0.7857 |
| 0.6681 | 26.0 | 1560 | 0.7842 | 0.5788 | 0.835 | 0.6576 | -1.0 | 0.6764 | 0.5842 | 0.4184 | 0.7238 | 0.7821 | -1.0 | 0.7 | 0.7944 | 0.3921 | 0.745 | 0.626 | 0.7929 | 0.7183 | 0.8086 |
| 0.6681 | 27.0 | 1620 | 0.7925 | 0.5788 | 0.8317 | 0.6525 | -1.0 | 0.6884 | 0.5831 | 0.4096 | 0.7243 | 0.7792 | -1.0 | 0.7125 | 0.7894 | 0.3964 | 0.7425 | 0.6336 | 0.8095 | 0.7065 | 0.7857 |
| 0.6681 | 28.0 | 1680 | 0.7893 | 0.5791 | 0.8342 | 0.6494 | -1.0 | 0.6833 | 0.5833 | 0.42 | 0.7265 | 0.7898 | -1.0 | 0.7143 | 0.8008 | 0.3992 | 0.75 | 0.6242 | 0.8024 | 0.7139 | 0.8171 |
| 0.6681 | 29.0 | 1740 | 0.7884 | 0.581 | 0.8351 | 0.6492 | -1.0 | 0.6834 | 0.5853 | 0.4218 | 0.7231 | 0.7898 | -1.0 | 0.7143 | 0.8008 | 0.4047 | 0.75 | 0.6228 | 0.8024 | 0.7157 | 0.8171 |
| 0.6681 | 30.0 | 1800 | 0.7884 | 0.5807 | 0.8349 | 0.6454 | -1.0 | 0.6834 | 0.5849 | 0.4218 | 0.7223 | 0.7898 | -1.0 | 0.7143 | 0.8008 | 0.4033 | 0.75 | 0.6231 | 0.8024 | 0.7157 | 0.8171 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
| [
"banana",
"orange",
"apple"
] |
enamezto/yolo_finetuned_fruits |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7609
- Map: 0.5852
- Map 50: 0.8253
- Map 75: 0.6874
- Map Small: -1.0
- Map Medium: 0.5946
- Map Large: 0.6071
- Mar 1: 0.4315
- Mar 10: 0.7259
- Mar 100: 0.7891
- Mar Small: -1.0
- Mar Medium: 0.7768
- Mar Large: 0.7913
- Map Banana: 0.491
- Mar 100 Banana: 0.785
- Map Orange: 0.5933
- Mar 100 Orange: 0.7738
- Map Apple: 0.6711
- Mar 100 Apple: 0.8086
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 1.9279 | 0.0163 | 0.0554 | 0.0051 | -1.0 | 0.0207 | 0.0179 | 0.0527 | 0.1587 | 0.3376 | -1.0 | 0.2286 | 0.3472 | 0.0209 | 0.44 | 0.0118 | 0.3071 | 0.0161 | 0.2657 |
| No log | 2.0 | 120 | 1.7657 | 0.028 | 0.074 | 0.0149 | -1.0 | 0.0297 | 0.0379 | 0.1074 | 0.2925 | 0.4748 | -1.0 | 0.25 | 0.5115 | 0.0258 | 0.4825 | 0.0263 | 0.419 | 0.032 | 0.5229 |
| No log | 3.0 | 180 | 1.4961 | 0.0378 | 0.0907 | 0.0267 | -1.0 | 0.0315 | 0.0459 | 0.1775 | 0.3621 | 0.5732 | -1.0 | 0.3071 | 0.621 | 0.0376 | 0.52 | 0.0434 | 0.6024 | 0.0324 | 0.5971 |
| No log | 4.0 | 240 | 1.4776 | 0.0733 | 0.1538 | 0.0651 | -1.0 | 0.0786 | 0.0936 | 0.2241 | 0.4129 | 0.586 | -1.0 | 0.3839 | 0.623 | 0.0699 | 0.5375 | 0.0765 | 0.569 | 0.0734 | 0.6514 |
| No log | 5.0 | 300 | 1.2594 | 0.0588 | 0.1253 | 0.0526 | -1.0 | 0.1091 | 0.0676 | 0.2692 | 0.4808 | 0.6442 | -1.0 | 0.3964 | 0.6866 | 0.0457 | 0.6225 | 0.076 | 0.5929 | 0.0547 | 0.7171 |
| No log | 6.0 | 360 | 1.0993 | 0.1228 | 0.2516 | 0.1108 | -1.0 | 0.25 | 0.119 | 0.3001 | 0.5339 | 0.7206 | -1.0 | 0.5562 | 0.7541 | 0.1418 | 0.635 | 0.133 | 0.7095 | 0.0935 | 0.8171 |
| No log | 7.0 | 420 | 1.0335 | 0.1618 | 0.2956 | 0.1618 | -1.0 | 0.1892 | 0.1785 | 0.349 | 0.5538 | 0.7229 | -1.0 | 0.6009 | 0.7443 | 0.2141 | 0.7025 | 0.1402 | 0.7119 | 0.131 | 0.7543 |
| No log | 8.0 | 480 | 1.0224 | 0.224 | 0.4124 | 0.2229 | -1.0 | 0.2989 | 0.2616 | 0.3443 | 0.5907 | 0.7291 | -1.0 | 0.6071 | 0.7519 | 0.2272 | 0.6925 | 0.2419 | 0.7119 | 0.2028 | 0.7829 |
| 1.3854 | 9.0 | 540 | 0.9531 | 0.2635 | 0.4383 | 0.2886 | -1.0 | 0.4117 | 0.2979 | 0.3791 | 0.6271 | 0.7395 | -1.0 | 0.6045 | 0.7646 | 0.2093 | 0.6975 | 0.3114 | 0.7238 | 0.2698 | 0.7971 |
| 1.3854 | 10.0 | 600 | 0.9559 | 0.3768 | 0.635 | 0.4208 | -1.0 | 0.4203 | 0.4151 | 0.3885 | 0.6638 | 0.7276 | -1.0 | 0.6098 | 0.7482 | 0.2938 | 0.7075 | 0.4063 | 0.6952 | 0.4304 | 0.78 |
| 1.3854 | 11.0 | 660 | 0.8701 | 0.4757 | 0.7181 | 0.5387 | -1.0 | 0.5361 | 0.5064 | 0.4152 | 0.6861 | 0.7606 | -1.0 | 0.6812 | 0.7769 | 0.3535 | 0.7175 | 0.5277 | 0.7643 | 0.5459 | 0.8 |
| 1.3854 | 12.0 | 720 | 0.9818 | 0.4139 | 0.6727 | 0.4376 | -1.0 | 0.5323 | 0.4314 | 0.3565 | 0.6606 | 0.7069 | -1.0 | 0.6107 | 0.7248 | 0.2533 | 0.6775 | 0.4495 | 0.6833 | 0.539 | 0.76 |
| 1.3854 | 13.0 | 780 | 0.8000 | 0.5048 | 0.7425 | 0.5895 | -1.0 | 0.5617 | 0.5248 | 0.404 | 0.713 | 0.7628 | -1.0 | 0.6705 | 0.778 | 0.3873 | 0.7575 | 0.4994 | 0.7452 | 0.6278 | 0.7857 |
| 1.3854 | 14.0 | 840 | 0.8710 | 0.4943 | 0.7355 | 0.562 | -1.0 | 0.5574 | 0.5152 | 0.3905 | 0.6779 | 0.7563 | -1.0 | 0.7232 | 0.7624 | 0.3564 | 0.745 | 0.5118 | 0.7381 | 0.6147 | 0.7857 |
| 1.3854 | 15.0 | 900 | 0.8485 | 0.5397 | 0.8054 | 0.631 | -1.0 | 0.5787 | 0.5563 | 0.4008 | 0.7005 | 0.7638 | -1.0 | 0.7232 | 0.7707 | 0.4181 | 0.7575 | 0.5544 | 0.7452 | 0.6466 | 0.7886 |
| 1.3854 | 16.0 | 960 | 0.7850 | 0.5603 | 0.7975 | 0.6452 | -1.0 | 0.5588 | 0.5814 | 0.4194 | 0.7282 | 0.7779 | -1.0 | 0.692 | 0.7909 | 0.4533 | 0.785 | 0.5635 | 0.7429 | 0.6641 | 0.8057 |
| 0.7875 | 17.0 | 1020 | 0.8237 | 0.5428 | 0.7859 | 0.6431 | -1.0 | 0.5682 | 0.5714 | 0.4212 | 0.7101 | 0.7727 | -1.0 | 0.75 | 0.7766 | 0.4544 | 0.7675 | 0.5655 | 0.7619 | 0.6084 | 0.7886 |
| 0.7875 | 18.0 | 1080 | 0.8090 | 0.5326 | 0.7637 | 0.6097 | -1.0 | 0.5912 | 0.5568 | 0.4182 | 0.7185 | 0.7722 | -1.0 | 0.7723 | 0.7729 | 0.4126 | 0.76 | 0.5424 | 0.7452 | 0.6427 | 0.8114 |
| 0.7875 | 19.0 | 1140 | 0.7976 | 0.5499 | 0.7936 | 0.647 | -1.0 | 0.5647 | 0.5761 | 0.4244 | 0.7244 | 0.7835 | -1.0 | 0.7705 | 0.7866 | 0.4382 | 0.7675 | 0.555 | 0.7714 | 0.6566 | 0.8114 |
| 0.7875 | 20.0 | 1200 | 0.7670 | 0.5856 | 0.8227 | 0.6922 | -1.0 | 0.5578 | 0.6123 | 0.424 | 0.7272 | 0.7734 | -1.0 | 0.7339 | 0.7806 | 0.4982 | 0.7625 | 0.5985 | 0.769 | 0.6602 | 0.7886 |
| 0.7875 | 21.0 | 1260 | 0.7620 | 0.5696 | 0.7927 | 0.6667 | -1.0 | 0.5838 | 0.5908 | 0.4376 | 0.7332 | 0.7879 | -1.0 | 0.7339 | 0.7961 | 0.4481 | 0.79 | 0.5879 | 0.7595 | 0.6729 | 0.8143 |
| 0.7875 | 22.0 | 1320 | 0.7752 | 0.571 | 0.8108 | 0.6646 | -1.0 | 0.6126 | 0.5923 | 0.4308 | 0.7296 | 0.7848 | -1.0 | 0.7509 | 0.7912 | 0.4489 | 0.7725 | 0.5818 | 0.7476 | 0.6823 | 0.8343 |
| 0.7875 | 23.0 | 1380 | 0.7367 | 0.5871 | 0.8107 | 0.6975 | -1.0 | 0.6088 | 0.6081 | 0.4359 | 0.7355 | 0.7914 | -1.0 | 0.7973 | 0.7914 | 0.505 | 0.7775 | 0.5909 | 0.7738 | 0.6654 | 0.8229 |
| 0.7875 | 24.0 | 1440 | 0.7607 | 0.5811 | 0.8174 | 0.6883 | -1.0 | 0.6021 | 0.606 | 0.4331 | 0.7271 | 0.7881 | -1.0 | 0.7902 | 0.7881 | 0.4877 | 0.7825 | 0.6049 | 0.7762 | 0.6507 | 0.8057 |
| 0.5913 | 25.0 | 1500 | 0.7660 | 0.5832 | 0.8164 | 0.6855 | -1.0 | 0.6194 | 0.5996 | 0.4308 | 0.7277 | 0.7871 | -1.0 | 0.7902 | 0.7865 | 0.4895 | 0.785 | 0.598 | 0.7762 | 0.662 | 0.8 |
| 0.5913 | 26.0 | 1560 | 0.7584 | 0.592 | 0.8275 | 0.6891 | -1.0 | 0.6089 | 0.612 | 0.4371 | 0.7322 | 0.7891 | -1.0 | 0.7839 | 0.7902 | 0.4895 | 0.7825 | 0.6059 | 0.7762 | 0.6807 | 0.8086 |
| 0.5913 | 27.0 | 1620 | 0.7555 | 0.5885 | 0.8288 | 0.683 | -1.0 | 0.5944 | 0.6125 | 0.4353 | 0.7274 | 0.7875 | -1.0 | 0.7768 | 0.7896 | 0.489 | 0.78 | 0.6046 | 0.7738 | 0.6719 | 0.8086 |
| 0.5913 | 28.0 | 1680 | 0.7610 | 0.5835 | 0.8249 | 0.6864 | -1.0 | 0.595 | 0.6051 | 0.4297 | 0.7249 | 0.7874 | -1.0 | 0.7768 | 0.7891 | 0.4895 | 0.785 | 0.5909 | 0.7714 | 0.6702 | 0.8057 |
| 0.5913 | 29.0 | 1740 | 0.7596 | 0.5899 | 0.8315 | 0.6917 | -1.0 | 0.5946 | 0.6126 | 0.4338 | 0.7275 | 0.7917 | -1.0 | 0.7768 | 0.7943 | 0.4902 | 0.7875 | 0.6078 | 0.7762 | 0.6715 | 0.8114 |
| 0.5913 | 30.0 | 1800 | 0.7609 | 0.5852 | 0.8253 | 0.6874 | -1.0 | 0.5946 | 0.6071 | 0.4315 | 0.7259 | 0.7891 | -1.0 | 0.7768 | 0.7913 | 0.491 | 0.785 | 0.5933 | 0.7738 | 0.6711 | 0.8086 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
| [
"banana",
"orange",
"apple"
] |
kylecsnow/detr-resnet-50-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0044
- Map: 0.0402
- Map 50: 0.076
- Map 75: 0.0374
- Map Small: 0.0184
- Map Medium: 0.0397
- Map Large: 0.0565
- Mar 1: 0.0854
- Mar 10: 0.1208
- Mar 100: 0.122
- Mar Small: 0.0443
- Mar Medium: 0.1322
- Mar Large: 0.1494
- Map Shirt, blouse: 0.0
- Mar 100 Shirt, blouse: 0.0
- Map Top, t-shirt, sweatshirt: 0.0858
- Mar 100 Top, t-shirt, sweatshirt: 0.514
- Map Sweater: 0.0
- Mar 100 Sweater: 0.0
- Map Cardigan: 0.0
- Mar 100 Cardigan: 0.0
- Map Jacket: 0.1682
- Mar 100 Jacket: 0.5626
- Map Vest: 0.0
- Mar 100 Vest: 0.0
- Map Pants: 0.3009
- Mar 100 Pants: 0.6596
- Map Shorts: 0.0095
- Mar 100 Shorts: 0.0429
- Map Skirt: 0.1019
- Mar 100 Skirt: 0.4694
- Map Coat: 0.0
- Mar 100 Coat: 0.0
- Map Dress: 0.3346
- Mar 100 Dress: 0.7634
- Map Jumpsuit: 0.0
- Mar 100 Jumpsuit: 0.0
- Map Cape: 0.0
- Mar 100 Cape: 0.0
- Map Glasses: 0.0716
- Mar 100 Glasses: 0.269
- Map Hat: 0.0715
- Mar 100 Hat: 0.2466
- Map Headband, head covering, hair accessory: 0.0006
- Mar 100 Headband, head covering, hair accessory: 0.0257
- Map Tie: 0.0
- Mar 100 Tie: 0.0
- Map Glove: 0.0
- Mar 100 Glove: 0.0
- Map Watch: 0.0005
- Mar 100 Watch: 0.0113
- Map Belt: 0.007
- Mar 100 Belt: 0.1366
- Map Leg warmer: 0.0
- Mar 100 Leg warmer: 0.0
- Map Tights, stockings: 0.008
- Mar 100 Tights, stockings: 0.1575
- Map Sock: 0.0
- Mar 100 Sock: 0.0
- Map Shoe: 0.3226
- Mar 100 Shoe: 0.4942
- Map Bag, wallet: 0.0089
- Mar 100 Bag, wallet: 0.1296
- Map Scarf: 0.0
- Mar 100 Scarf: 0.0
- Map Umbrella: 0.0
- Mar 100 Umbrella: 0.0
- Map Hood: 0.0
- Mar 100 Hood: 0.0
- Map Collar: 0.0209
- Mar 100 Collar: 0.1548
- Map Lapel: 0.0266
- Mar 100 Lapel: 0.2037
- Map Epaulette: 0.0
- Mar 100 Epaulette: 0.0
- Map Sleeve: 0.2336
- Mar 100 Sleeve: 0.434
- Map Pocket: 0.0002
- Mar 100 Pocket: 0.0504
- Map Neckline: 0.0785
- Mar 100 Neckline: 0.2888
- Map Buckle: 0.0
- Mar 100 Buckle: 0.0
- Map Zipper: 0.0
- Mar 100 Zipper: 0.0
- Map Applique: 0.0
- Mar 100 Applique: 0.0
- Map Bead: 0.0
- Mar 100 Bead: 0.0
- Map Bow: 0.0
- Mar 100 Bow: 0.0
- Map Flower: 0.0
- Mar 100 Flower: 0.0
- Map Fringe: 0.0
- Mar 100 Fringe: 0.0
- Map Ribbon: 0.0
- Mar 100 Ribbon: 0.0
- Map Rivet: 0.0
- Mar 100 Rivet: 0.0
- Map Ruffle: 0.0
- Mar 100 Ruffle: 0.0
- Map Sequin: 0.0
- Mar 100 Sequin: 0.0
- Map Tassel: 0.0
- Mar 100 Tassel: 0.0
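The `Map 50` and `Map 75` figures above are COCO-style average precision at IoU thresholds of 0.50 and 0.75, while `Map` averages over thresholds from 0.50 to 0.95. A minimal sketch of the box-IoU computation those thresholds are applied to (a hypothetical helper, not part of the evaluation code used for this card):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A prediction offset by half the box width overlaps the ground truth at IoU = 1/3,
# so it would count as a true positive at a 0.25 threshold but not at 0.50 or 0.75.
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 0.3333333333333333
```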
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Shirt, blouse | Mar 100 Shirt, blouse | Map Top, t-shirt, sweatshirt | Mar 100 Top, t-shirt, sweatshirt | Map Sweater | Mar 100 Sweater | Map Cardigan | Mar 100 Cardigan | Map Jacket | Mar 100 Jacket | Map Vest | Mar 100 Vest | Map Pants | Mar 100 Pants | Map Shorts | Mar 100 Shorts | Map Skirt | Mar 100 Skirt | Map Coat | Mar 100 Coat | Map Dress | Mar 100 Dress | Map Jumpsuit | Mar 100 Jumpsuit | Map Cape | Mar 100 Cape | Map Glasses | Mar 100 Glasses | Map Hat | Mar 100 Hat | Map Headband, head covering, hair accessory | Mar 100 Headband, head covering, hair accessory | Map Tie | Mar 100 Tie | Map Glove | Mar 100 Glove | Map Watch | Mar 100 Watch | Map Belt | Mar 100 Belt | Map Leg warmer | Mar 100 Leg warmer | Map Tights, stockings | Mar 100 Tights, stockings | Map Sock | Mar 100 Sock | Map Shoe | Mar 100 Shoe | Map Bag, wallet | Mar 100 Bag, wallet | Map Scarf | Mar 100 Scarf | Map Umbrella | Mar 100 Umbrella | Map Hood | Mar 100 Hood | Map Collar | Mar 100 Collar | Map Lapel | Mar 100 Lapel | Map Epaulette | Mar 100 Epaulette | Map Sleeve | Mar 100 Sleeve | Map Pocket | Mar 100 Pocket | Map Neckline | Mar 100 Neckline | Map Buckle | Mar 100 Buckle | Map Zipper | Mar 100 Zipper | Map Applique | Mar 100 Applique | Map Bead | Mar 100 Bead | Map Bow | Mar 100 Bow | Map Flower | Mar 100 Flower | Map Fringe | Mar 100 Fringe | Map Ribbon | Mar 100 Ribbon | Map Rivet | Mar 100 Rivet | Map Ruffle | Mar 100 Ruffle | Map Sequin | Mar 100 Sequin | Map Tassel | Mar 100 Tassel |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:----------------------------:|:--------------------------------:|:-----------:|:---------------:|:------------:|:----------------:|:----------:|:--------------:|:--------:|:------------:|:---------:|:-------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:-----------:|:---------------:|:-------:|:-----------:|:-------------------------------------------:|:-----------------------------------------------:|:-------:|:-----------:|:---------:|:-------------:|:---------:|:-------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:--------:|:------------:|:---------------:|:-------------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:----------:|:--------------:|:---------:|:-------------:|:-------------:|:-----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:--------:|:------------:|:-------:|:-----------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|
| 4.9161 | 0.0088 | 50 | 4.9567 | 0.0001 | 0.0002 | 0.0 | 0.0001 | 0.0001 | 0.0003 | 0.0014 | 0.0035 | 0.0065 | 0.0028 | 0.0096 | 0.0047 | 0.0002 | 0.0168 | 0.0001 | 0.0371 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.001 | 0.0 | 0.001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.1399 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0037 | 0.0 | 0.0 | 0.0001 | 0.0075 | 0.0 | 0.0088 | 0.0001 | 0.0038 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0333 | 0.0 | 0.0 | 0.0005 | 0.0329 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1776 | 0.0175 | 100 | 3.9779 | 0.0003 | 0.0009 | 0.0001 | 0.0002 | 0.0004 | 0.0 | 0.0012 | 0.005 | 0.0079 | 0.0054 | 0.0115 | 0.0077 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0111 | 0.2752 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.084 | 0.0 | 0.0009 | 0.0002 | 0.0034 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6948 | 0.0263 | 150 | 3.7230 | 0.0003 | 0.0012 | 0.0001 | 0.0002 | 0.0005 | 0.0001 | 0.0008 | 0.0052 | 0.0076 | 0.0053 | 0.0101 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0137 | 0.2517 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0977 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1624 | 0.0351 | 200 | 3.4985 | 0.0008 | 0.0026 | 0.0002 | 0.0012 | 0.0011 | 0.0002 | 0.002 | 0.0078 | 0.0112 | 0.0061 | 0.0157 | 0.0198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0318 | 0.3046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0054 | 0.2123 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6542 | 0.0438 | 250 | 3.4477 | 0.0015 | 0.0046 | 0.0004 | 0.0015 | 0.002 | 0.0002 | 0.003 | 0.0084 | 0.0122 | 0.0055 | 0.0168 | 0.0195 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.058 | 0.2768 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0095 | 0.2822 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4352 | 0.0526 | 300 | 3.3234 | 0.0012 | 0.0038 | 0.0005 | 0.0009 | 0.0022 | 0.0005 | 0.002 | 0.0094 | 0.0136 | 0.0083 | 0.0184 | 0.0273 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0477 | 0.355 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0068 | 0.2634 | 0.0 | 0.0 | 0.0021 | 0.0086 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1122 | 0.0614 | 350 | 3.2007 | 0.0018 | 0.0051 | 0.001 | 0.0011 | 0.0033 | 0.0007 | 0.003 | 0.0107 | 0.0158 | 0.0117 | 0.0204 | 0.0293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.004 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0561 | 0.3913 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0126 | 0.306 | 0.0 | 0.0006 | 0.0094 | 0.0295 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9117 | 0.0701 | 400 | 3.2530 | 0.0018 | 0.0057 | 0.0007 | 0.0012 | 0.0033 | 0.0005 | 0.004 | 0.0119 | 0.0158 | 0.0097 | 0.0207 | 0.0273 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0046 | 0.0307 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0552 | 0.3236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0096 | 0.2773 | 0.0 | 0.0 | 0.0117 | 0.0931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1889 | 0.0789 | 450 | 3.1164 | 0.0023 | 0.0072 | 0.0009 | 0.0015 | 0.003 | 0.0013 | 0.0054 | 0.0148 | 0.0194 | 0.0111 | 0.022 | 0.0329 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0043 | 0.0109 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0213 | 0.1362 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0649 | 0.3347 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.3161 | 0.0 | 0.0013 | 0.0092 | 0.0916 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4447 | 0.0877 | 500 | 3.1064 | 0.003 | 0.0086 | 0.0014 | 0.0018 | 0.003 | 0.0019 | 0.0072 | 0.0192 | 0.0231 | 0.0094 | 0.0233 | 0.0319 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0187 | 0.0551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0235 | 0.2581 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0704 | 0.3287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0145 | 0.3011 | 0.0 | 0.0 | 0.0128 | 0.1179 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2958 | 0.0964 | 550 | 3.0700 | 0.0026 | 0.0074 | 0.0012 | 0.0014 | 0.0041 | 0.002 | 0.007 | 0.0211 | 0.0252 | 0.0118 | 0.0241 | 0.0382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0107 | 0.0423 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0256 | 0.3295 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0564 | 0.3777 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0122 | 0.2946 | 0.0 | 0.0013 | 0.013 | 0.1142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4857 | 0.1052 | 600 | 3.0383 | 0.0025 | 0.0074 | 0.0014 | 0.0011 | 0.0035 | 0.0019 | 0.0073 | 0.0226 | 0.0266 | 0.0095 | 0.0227 | 0.0394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0158 | 0.0958 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0313 | 0.4209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0504 | 0.3281 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0093 | 0.2548 | 0.0 | 0.0026 | 0.0099 | 0.1208 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.935 | 0.1140 | 650 | 2.9196 | 0.0041 | 0.0105 | 0.0025 | 0.0024 | 0.0042 | 0.0028 | 0.0142 | 0.0325 | 0.0367 | 0.0114 | 0.0287 | 0.0561 | 0.0 | 0.0 | 0.0002 | 0.0025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0334 | 0.3532 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0376 | 0.4984 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0905 | 0.3565 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0163 | 0.3455 | 0.0 | 0.0063 | 0.0111 | 0.1268 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2927 | 0.1227 | 700 | 2.9527 | 0.0038 | 0.0105 | 0.0021 | 0.0018 | 0.0049 | 0.003 | 0.0125 | 0.0311 | 0.0351 | 0.0078 | 0.0332 | 0.0579 | 0.0 | 0.0 | 0.0019 | 0.021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0349 | 0.3019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0443 | 0.5413 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0685 | 0.3121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0159 | 0.2897 | 0.0 | 0.0088 | 0.0087 | 0.1377 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1906 | 0.1315 | 750 | 2.9313 | 0.004 | 0.01 | 0.0026 | 0.0023 | 0.0051 | 0.0026 | 0.0151 | 0.0341 | 0.038 | 0.012 | 0.0317 | 0.0418 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0332 | 0.3869 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0356 | 0.5307 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0905 | 0.3475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0159 | 0.3252 | 0.0 | 0.0052 | 0.0098 | 0.1523 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7748 | 0.1403 | 800 | 2.9227 | 0.0041 | 0.0103 | 0.0027 | 0.002 | 0.0052 | 0.003 | 0.0131 | 0.0313 | 0.0357 | 0.0119 | 0.0309 | 0.0412 | 0.0 | 0.0 | 0.001 | 0.0167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0343 | 0.3439 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0494 | 0.5185 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0841 | 0.316 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0129 | 0.3118 | 0.0 | 0.0093 | 0.0064 | 0.1268 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2487 | 0.1490 | 850 | 2.8708 | 0.0048 | 0.0118 | 0.003 | 0.0021 | 0.0053 | 0.0043 | 0.0172 | 0.0359 | 0.0402 | 0.01 | 0.0368 | 0.0489 | 0.0 | 0.0 | 0.0029 | 0.0458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.039 | 0.3846 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0703 | 0.6069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0697 | 0.297 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0254 | 0.3489 | 0.0 | 0.0073 | 0.0131 | 0.1606 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4871 | 0.1578 | 900 | 2.8274 | 0.0049 | 0.0118 | 0.0035 | 0.0025 | 0.0066 | 0.0036 | 0.0177 | 0.042 | 0.0461 | 0.0132 | 0.039 | 0.0603 | 0.0 | 0.0 | 0.0101 | 0.14 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0335 | 0.4625 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0471 | 0.5772 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1008 | 0.397 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0193 | 0.35 | 0.0 | 0.0071 | 0.0143 | 0.1862 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1435 | 0.1666 | 950 | 2.8172 | 0.0053 | 0.013 | 0.0035 | 0.0027 | 0.0086 | 0.0039 | 0.0176 | 0.0423 | 0.0465 | 0.0127 | 0.042 | 0.0543 | 0.0 | 0.0 | 0.0128 | 0.1718 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0104 | 0.006 | 0.0 | 0.0 | 0.0386 | 0.4269 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0506 | 0.6368 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0982 | 0.364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0197 | 0.3465 | 0.0 | 0.0116 | 0.012 | 0.1748 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7159 | 0.1753 | 1000 | 2.7730 | 0.0062 | 0.0144 | 0.0044 | 0.0034 | 0.0068 | 0.0048 | 0.0208 | 0.0449 | 0.0488 | 0.0128 | 0.0428 | 0.0588 | 0.0 | 0.0 | 0.0123 | 0.2142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0469 | 0.5022 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0664 | 0.5752 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.117 | 0.3807 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0205 | 0.3686 | 0.0 | 0.0119 | 0.023 | 0.1941 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.059 | 0.1841 | 1050 | 2.7846 | 0.0053 | 0.0127 | 0.0038 | 0.0027 | 0.0071 | 0.0037 | 0.0163 | 0.0407 | 0.0441 | 0.012 | 0.0387 | 0.0499 | 0.0 | 0.0 | 0.0097 | 0.1413 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0356 | 0.4535 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0525 | 0.5734 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1107 | 0.3775 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0187 | 0.314 | 0.0 | 0.0153 | 0.0169 | 0.1539 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.749 | 0.1929 | 1100 | 2.7407 | 0.0051 | 0.0126 | 0.0033 | 0.0025 | 0.0081 | 0.0038 | 0.0188 | 0.0446 | 0.0481 | 0.0127 | 0.0411 | 0.0603 | 0.0 | 0.0 | 0.0115 | 0.1536 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0337 | 0.5199 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0514 | 0.6402 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0952 | 0.3801 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0235 | 0.3514 | 0.0 | 0.0149 | 0.0208 | 0.1521 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2432 | 0.2016 | 1150 | 2.7425 | 0.0053 | 0.0128 | 0.0039 | 0.0025 | 0.0064 | 0.0052 | 0.0199 | 0.043 | 0.0463 | 0.0123 | 0.0381 | 0.0572 | 0.0 | 0.0 | 0.0118 | 0.1697 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0044 | 0.0 | 0.0 | 0.0393 | 0.5045 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0639 | 0.5961 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0877 | 0.3931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0211 | 0.2974 | 0.0 | 0.0131 | 0.0141 | 0.1521 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6831 | 0.2104 | 1200 | 2.7058 | 0.0064 | 0.0149 | 0.0048 | 0.0033 | 0.0086 | 0.005 | 0.0222 | 0.0475 | 0.051 | 0.0143 | 0.0443 | 0.0604 | 0.0 | 0.0 | 0.0172 | 0.278 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0125 | 0.0192 | 0.0 | 0.0 | 0.0375 | 0.5087 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0632 | 0.602 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.116 | 0.4038 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0259 | 0.3521 | 0.0 | 0.0131 | 0.0225 | 0.167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5862 | 0.2192 | 1250 | 2.7894 | 0.0052 | 0.0126 | 0.0034 | 0.004 | 0.007 | 0.0031 | 0.0183 | 0.0441 | 0.0479 | 0.0124 | 0.0399 | 0.057 | 0.0 | 0.0 | 0.0136 | 0.2458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0289 | 0.4571 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0306 | 0.5967 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1212 | 0.3892 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0228 | 0.3256 | 0.0 | 0.0058 | 0.0228 | 0.1836 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0477 | 0.2280 | 1300 | 2.6845 | 0.0063 | 0.0145 | 0.0046 | 0.0034 | 0.0076 | 0.0057 | 0.0225 | 0.0476 | 0.0508 | 0.0129 | 0.0446 | 0.063 | 0.0 | 0.0 | 0.0153 | 0.2655 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0429 | 0.5189 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0753 | 0.601 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1112 | 0.4033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0271 | 0.3516 | 0.0 | 0.0142 | 0.0196 | 0.1813 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3961 | 0.2367 | 1350 | 2.7204 | 0.0065 | 0.0151 | 0.0046 | 0.003 | 0.0075 | 0.0052 | 0.0215 | 0.0449 | 0.0478 | 0.0098 | 0.0431 | 0.0608 | 0.0 | 0.0 | 0.0145 | 0.2699 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0511 | 0.4958 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0767 | 0.5943 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1089 | 0.337 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.024 | 0.3253 | 0.0 | 0.0153 | 0.017 | 0.1534 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7523 | 0.2455 | 1400 | 2.6409 | 0.0071 | 0.016 | 0.0056 | 0.0031 | 0.0094 | 0.006 | 0.0239 | 0.049 | 0.0524 | 0.0133 | 0.0471 | 0.0616 | 0.0 | 0.0 | 0.0171 | 0.2943 | 0.0 | 0.0 | 0.0 | 0.0 | 0.002 | 0.0011 | 0.0 | 0.0 | 0.0564 | 0.5301 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0832 | 0.6335 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0031 | 0.0067 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1127 | 0.3648 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0345 | 0.3824 | 0.0 | 0.0177 | 0.0197 | 0.1786 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1543 | 0.2543 | 1450 | 2.6222 | 0.0073 | 0.0162 | 0.0057 | 0.003 | 0.0088 | 0.0065 | 0.0233 | 0.0504 | 0.0532 | 0.0129 | 0.0487 | 0.0676 | 0.0 | 0.0 | 0.0204 | 0.2826 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0125 | 0.0538 | 0.0 | 0.0 | 0.0563 | 0.5228 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0795 | 0.6618 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0091 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1107 | 0.3492 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0345 | 0.3613 | 0.0 | 0.0188 | 0.0196 | 0.1866 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6159 | 0.2630 | 1500 | 2.6280 | 0.0063 | 0.0143 | 0.0049 | 0.0035 | 0.0121 | 0.0056 | 0.0222 | 0.0513 | 0.0548 | 0.0171 | 0.0514 | 0.0626 | 0.0 | 0.0 | 0.0238 | 0.3125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0131 | 0.0637 | 0.0 | 0.0 | 0.0393 | 0.5022 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0642 | 0.6593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0022 | 0.0122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0954 | 0.3978 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0346 | 0.3793 | 0.0001 | 0.0222 | 0.0189 | 0.1699 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9723 | 0.2718 | 1550 | 2.6108 | 0.0064 | 0.0151 | 0.0043 | 0.0041 | 0.009 | 0.0056 | 0.023 | 0.0528 | 0.0555 | 0.0128 | 0.0491 | 0.0723 | 0.0 | 0.0 | 0.0183 | 0.3337 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.0066 | 0.0 | 0.0 | 0.0396 | 0.5881 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0661 | 0.6565 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0032 | 0.0189 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0978 | 0.3676 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0402 | 0.3718 | 0.0 | 0.0162 | 0.0224 | 0.1956 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7771 | 0.2806 | 1600 | 2.6634 | 0.0066 | 0.0164 | 0.0041 | 0.0033 | 0.0102 | 0.006 | 0.0208 | 0.0476 | 0.0501 | 0.0107 | 0.0449 | 0.0612 | 0.0 | 0.0 | 0.028 | 0.2784 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0383 | 0.0956 | 0.0 | 0.0 | 0.0353 | 0.467 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0582 | 0.6116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 | 0.0061 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0982 | 0.3607 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0233 | 0.2909 | 0.0 | 0.02 | 0.0172 | 0.1758 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1096 | 0.2893 | 1650 | 2.6056 | 0.0072 | 0.0165 | 0.0054 | 0.0045 | 0.0107 | 0.0057 | 0.0233 | 0.0522 | 0.0548 | 0.0155 | 0.0503 | 0.0627 | 0.0 | 0.0 | 0.0236 | 0.2517 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0257 | 0.0555 | 0.0 | 0.0 | 0.0385 | 0.5474 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0544 | 0.6831 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1234 | 0.4098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0437 | 0.3424 | 0.0 | 0.0259 | 0.0201 | 0.1888 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4115 | 0.2981 | 1700 | 2.5755 | 0.0074 | 0.0171 | 0.0054 | 0.0062 | 0.0114 | 0.006 | 0.0237 | 0.0521 | 0.0549 | 0.0173 | 0.0529 | 0.0651 | 0.0 | 0.0 | 0.0198 | 0.3127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0049 | 0.0121 | 0.0 | 0.0 | 0.0582 | 0.5609 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0763 | 0.6457 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1189 | 0.3901 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0372 | 0.3403 | 0.0 | 0.0228 | 0.0257 | 0.2216 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9269 | 0.3069 | 1750 | 2.6026 | 0.0073 | 0.0169 | 0.0054 | 0.0046 | 0.0104 | 0.0062 | 0.0228 | 0.0509 | 0.0533 | 0.0138 | 0.0492 | 0.0626 | 0.0 | 0.0 | 0.0212 | 0.2786 | 0.0 | 0.0 | 0.0 | 0.0 | 0.014 | 0.0275 | 0.0 | 0.0 | 0.0425 | 0.5353 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0823 | 0.6587 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1087 | 0.3829 | 0.0007 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0431 | 0.3405 | 0.0 | 0.0205 | 0.0235 | 0.193 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6799 | 0.3156 | 1800 | 2.5833 | 0.0088 | 0.0195 | 0.007 | 0.0035 | 0.0107 | 0.0079 | 0.025 | 0.0522 | 0.0549 | 0.0142 | 0.0514 | 0.0634 | 0.0 | 0.0 | 0.0312 | 0.2992 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0426 | 0.117 | 0.0 | 0.0 | 0.0424 | 0.4894 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1008 | 0.6413 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1177 | 0.3945 | 0.0013 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0421 | 0.3553 | 0.0 | 0.0246 | 0.0241 | 0.1937 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.006 | 0.3244 | 1850 | 2.5880 | 0.0082 | 0.0186 | 0.0061 | 0.0037 | 0.0131 | 0.0066 | 0.0257 | 0.0549 | 0.058 | 0.0123 | 0.0534 | 0.0708 | 0.0 | 0.0 | 0.0345 | 0.3229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0363 | 0.1346 | 0.0 | 0.0 | 0.0388 | 0.5433 | 0.0 | 0.0 | 0.0079 | 0.005 | 0.0 | 0.0 | 0.0819 | 0.6921 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1325 | 0.3889 | 0.001 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0266 | 0.3536 | 0.0 | 0.0162 | 0.0172 | 0.1779 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5858 | 0.3332 | 1900 | 2.5454 | 0.0083 | 0.0187 | 0.0066 | 0.0083 | 0.0111 | 0.007 | 0.0258 | 0.0532 | 0.0558 | 0.0182 | 0.0509 | 0.0667 | 0.0 | 0.0 | 0.0312 | 0.3076 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0424 | 0.1027 | 0.0 | 0.0 | 0.0388 | 0.5221 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0787 | 0.6823 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1286 | 0.3915 | 0.0005 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0443 | 0.346 | 0.0 | 0.0237 | 0.0181 | 0.1758 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5725 | 0.3419 | 1950 | 2.5653 | 0.0072 | 0.0172 | 0.0051 | 0.0036 | 0.011 | 0.0057 | 0.0239 | 0.0518 | 0.0543 | 0.0133 | 0.051 | 0.0626 | 0.0 | 0.0 | 0.0297 | 0.3201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.004 | 0.0099 | 0.0 | 0.0 | 0.0362 | 0.5583 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0747 | 0.6368 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0232 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1194 | 0.4112 | 0.0008 | 0.0061 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0455 | 0.3313 | 0.0 | 0.0209 | 0.0207 | 0.179 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3282 | 0.3507 | 2000 | 2.5389 | 0.0075 | 0.019 | 0.0049 | 0.0046 | 0.0126 | 0.0062 | 0.0249 | 0.0533 | 0.0555 | 0.0139 | 0.0527 | 0.0667 | 0.0 | 0.0 | 0.0327 | 0.3345 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0309 | 0.0791 | 0.0 | 0.0 | 0.0346 | 0.5272 | 0.0 | 0.0 | 0.002 | 0.0144 | 0.0 | 0.0 | 0.0571 | 0.6435 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0152 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1172 | 0.3879 | 0.0003 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0486 | 0.328 | 0.0004 | 0.0265 | 0.0223 | 0.1917 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7413 | 0.3595 | 2050 | 2.5175 | 0.0082 | 0.0184 | 0.0063 | 0.0065 | 0.0113 | 0.0068 | 0.0254 | 0.0545 | 0.0565 | 0.0156 | 0.0526 | 0.0691 | 0.0 | 0.0 | 0.027 | 0.3201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0149 | 0.017 | 0.0 | 0.0 | 0.038 | 0.5875 | 0.0 | 0.0 | 0.0038 | 0.005 | 0.0 | 0.0 | 0.0896 | 0.7039 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0195 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1289 | 0.4177 | 0.0005 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0588 | 0.3411 | 0.0 | 0.017 | 0.0149 | 0.1652 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0964 | 0.3682 | 2100 | 2.4969 | 0.0087 | 0.0196 | 0.0065 | 0.0071 | 0.0128 | 0.0069 | 0.0288 | 0.057 | 0.0592 | 0.0177 | 0.0593 | 0.0662 | 0.0 | 0.0 | 0.0286 | 0.3347 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0172 | 0.039 | 0.0 | 0.0 | 0.0392 | 0.5468 | 0.0 | 0.0 | 0.0131 | 0.0312 | 0.0 | 0.0 | 0.0714 | 0.7169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0039 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0232 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1451 | 0.4155 | 0.0097 | 0.0188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0569 | 0.361 | 0.0 | 0.0284 | 0.0199 | 0.2037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5156 | 0.3770 | 2150 | 2.5189 | 0.0088 | 0.0198 | 0.0072 | 0.0045 | 0.013 | 0.0069 | 0.0265 | 0.0558 | 0.0583 | 0.0155 | 0.053 | 0.0702 | 0.0 | 0.0 | 0.0316 | 0.3254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0384 | 0.0665 | 0.0 | 0.0 | 0.0353 | 0.5301 | 0.0 | 0.0 | 0.0126 | 0.0262 | 0.0 | 0.0 | 0.0625 | 0.7154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0091 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1458 | 0.4241 | 0.0017 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0516 | 0.3588 | 0.0 | 0.0237 | 0.0246 | 0.188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4525 | 0.3858 | 2200 | 2.4799 | 0.0093 | 0.0209 | 0.0068 | 0.0076 | 0.0133 | 0.0076 | 0.0288 | 0.0581 | 0.0603 | 0.019 | 0.0648 | 0.0683 | 0.0 | 0.0 | 0.0307 | 0.354 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0027 | 0.0 | 0.0 | 0.0393 | 0.583 | 0.0 | 0.0 | 0.0206 | 0.0825 | 0.0 | 0.0 | 0.104 | 0.7022 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0274 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1433 | 0.4177 | 0.0022 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0589 | 0.3652 | 0.0 | 0.0271 | 0.0254 | 0.1888 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5135 | 0.3945 | 2250 | 2.4928 | 0.0089 | 0.0209 | 0.0063 | 0.0053 | 0.0129 | 0.0075 | 0.0281 | 0.0543 | 0.0566 | 0.0165 | 0.0544 | 0.0646 | 0.0 | 0.0 | 0.0289 | 0.3284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0046 | 0.0176 | 0.0 | 0.0 | 0.0418 | 0.5554 | 0.0 | 0.0 | 0.008 | 0.03 | 0.0 | 0.0 | 0.0966 | 0.6295 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1491 | 0.4096 | 0.0016 | 0.0178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0522 | 0.3494 | 0.0 | 0.0263 | 0.0243 | 0.1982 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.243 | 0.4033 | 2300 | 2.5027 | 0.0092 | 0.0214 | 0.0066 | 0.0047 | 0.012 | 0.008 | 0.0302 | 0.0562 | 0.0584 | 0.0158 | 0.0569 | 0.0661 | 0.0 | 0.0 | 0.0298 | 0.3701 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.0115 | 0.0 | 0.0 | 0.0589 | 0.5926 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0975 | 0.6219 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0225 | 0.0 | 0.0 | 0.0004 | 0.0138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0013 | 0.0451 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1552 | 0.3895 | 0.0011 | 0.0192 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0554 | 0.3741 | 0.0001 | 0.0252 | 0.0225 | 0.2011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4243 | 0.4121 | 2350 | 2.4586 | 0.0112 | 0.0243 | 0.0091 | 0.0078 | 0.0136 | 0.01 | 0.0335 | 0.0592 | 0.0612 | 0.0191 | 0.0586 | 0.0707 | 0.0 | 0.0 | 0.0312 | 0.3761 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0222 | 0.0687 | 0.0 | 0.0 | 0.0671 | 0.6058 | 0.0 | 0.0 | 0.001 | 0.0025 | 0.0 | 0.0 | 0.1231 | 0.6679 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0147 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0017 | 0.0396 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1703 | 0.4109 | 0.0004 | 0.015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0741 | 0.3793 | 0.0 | 0.0291 | 0.0221 | 0.2048 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1652 | 0.4208 | 2400 | 2.4811 | 0.0107 | 0.023 | 0.0089 | 0.0055 | 0.013 | 0.0104 | 0.0323 | 0.0587 | 0.0608 | 0.0184 | 0.0579 | 0.0722 | 0.0 | 0.0 | 0.0297 | 0.3538 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0144 | 0.0538 | 0.0 | 0.0 | 0.0598 | 0.5827 | 0.0 | 0.0 | 0.0048 | 0.015 | 0.0 | 0.0 | 0.1252 | 0.6722 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0132 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0323 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.153 | 0.4349 | 0.0011 | 0.0075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0756 | 0.3917 | 0.0001 | 0.036 | 0.0262 | 0.2031 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0276 | 0.4296 | 2450 | 2.4642 | 0.0118 | 0.025 | 0.0102 | 0.0085 | 0.0127 | 0.0122 | 0.0333 | 0.0577 | 0.0601 | 0.0224 | 0.0608 | 0.0689 | 0.0 | 0.0 | 0.03 | 0.3508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0117 | 0.0357 | 0.0 | 0.0 | 0.0759 | 0.5426 | 0.0 | 0.0 | 0.0105 | 0.04 | 0.0 | 0.0 | 0.162 | 0.6463 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0109 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0372 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1458 | 0.4323 | 0.0029 | 0.0216 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0735 | 0.3735 | 0.0001 | 0.0291 | 0.03 | 0.2444 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7157 | 0.4384 | 2500 | 2.4744 | 0.0114 | 0.0245 | 0.0094 | 0.007 | 0.0134 | 0.0112 | 0.0337 | 0.0604 | 0.0624 | 0.0203 | 0.0602 | 0.0728 | 0.0 | 0.0 | 0.0253 | 0.3511 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0221 | 0.0758 | 0.0 | 0.0 | 0.0601 | 0.583 | 0.0 | 0.0 | 0.013 | 0.0469 | 0.0 | 0.0 | 0.1373 | 0.6579 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.014 | 0.0 | 0.0 | 0.0002 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0317 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1623 | 0.4422 | 0.0017 | 0.0249 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0767 | 0.3772 | 0.0 | 0.0291 | 0.0259 | 0.2342 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4116 | 0.4471 | 2550 | 2.5058 | 0.0114 | 0.0247 | 0.0094 | 0.0055 | 0.0146 | 0.0113 | 0.0336 | 0.0582 | 0.0601 | 0.0175 | 0.0576 | 0.0806 | 0.0 | 0.0 | 0.0264 | 0.3239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0313 | 0.083 | 0.0 | 0.0 | 0.0666 | 0.5439 | 0.0 | 0.0 | 0.0184 | 0.0831 | 0.0 | 0.0 | 0.126 | 0.6717 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0013 | 0.0109 | 0.0 | 0.0 | 0.0003 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1619 | 0.4105 | 0.0008 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0722 | 0.3659 | 0.0 | 0.0274 | 0.0177 | 0.2046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3045 | 0.4559 | 2600 | 2.4346 | 0.0118 | 0.0263 | 0.0091 | 0.0047 | 0.0142 | 0.0113 | 0.0359 | 0.0617 | 0.0636 | 0.0162 | 0.0633 | 0.0811 | 0.0 | 0.0 | 0.0322 | 0.3284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0516 | 0.1225 | 0.0 | 0.0 | 0.0578 | 0.6096 | 0.0 | 0.0 | 0.0211 | 0.0806 | 0.0 | 0.0 | 0.1313 | 0.6831 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0244 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1462 | 0.3883 | 0.0016 | 0.0268 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.072 | 0.401 | 0.0001 | 0.0323 | 0.0287 | 0.2105 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7487 | 0.4647 | 2650 | 2.4230 | 0.012 | 0.026 | 0.0094 | 0.0059 | 0.0153 | 0.0109 | 0.0351 | 0.0606 | 0.0624 | 0.0162 | 0.0644 | 0.0759 | 0.0 | 0.0 | 0.0306 | 0.379 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0249 | 0.0478 | 0.0 | 0.0 | 0.0636 | 0.584 | 0.0 | 0.0 | 0.0264 | 0.1006 | 0.0 | 0.0 | 0.1258 | 0.6687 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0202 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0372 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1639 | 0.403 | 0.0022 | 0.0183 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.002 | 0.0015 | 0.0 | 0.0 | 0.0819 | 0.3821 | 0.0001 | 0.0317 | 0.0279 | 0.1973 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5341 | 0.4734 | 2700 | 2.4047 | 0.0123 | 0.0268 | 0.0102 | 0.0058 | 0.0154 | 0.0121 | 0.0366 | 0.063 | 0.0648 | 0.0184 | 0.0716 | 0.0732 | 0.0 | 0.0 | 0.0326 | 0.3693 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0239 | 0.0758 | 0.0 | 0.0 | 0.0698 | 0.5378 | 0.0 | 0.0 | 0.0201 | 0.1138 | 0.0 | 0.0 | 0.1383 | 0.7185 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0017 | 0.0604 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1568 | 0.4323 | 0.0012 | 0.0277 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0009 | 0.003 | 0.006 | 0.0 | 0.0 | 0.0872 | 0.3811 | 0.0001 | 0.0299 | 0.0314 | 0.211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9958 | 0.4822 | 2750 | 2.4361 | 0.0113 | 0.0248 | 0.0094 | 0.0054 | 0.0141 | 0.0107 | 0.034 | 0.0592 | 0.0609 | 0.0165 | 0.0621 | 0.08 | 0.0 | 0.0 | 0.0312 | 0.3646 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0151 | 0.0495 | 0.0 | 0.0 | 0.0782 | 0.566 | 0.0 | 0.0 | 0.0159 | 0.0756 | 0.0 | 0.0 | 0.1104 | 0.6681 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 | 0.0567 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1889 | 0.4517 | 0.0007 | 0.0141 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0571 | 0.3254 | 0.0004 | 0.0325 | 0.0217 | 0.1811 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1237 | 0.4910 | 2800 | 2.4345 | 0.0125 | 0.0265 | 0.0104 | 0.0055 | 0.0167 | 0.0111 | 0.0371 | 0.0658 | 0.0677 | 0.0183 | 0.07 | 0.0832 | 0.0 | 0.0 | 0.0316 | 0.3992 | 0.0 | 0.0 | 0.0 | 0.0 | 0.034 | 0.0681 | 0.0 | 0.0 | 0.0639 | 0.5715 | 0.0 | 0.0 | 0.0214 | 0.1037 | 0.0 | 0.0 | 0.0996 | 0.7461 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0132 | 0.0072 | 0.0397 | 0.0004 | 0.0138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0019 | 0.0579 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1948 | 0.4217 | 0.0021 | 0.0272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0071 | 0.0097 | 0.0035 | 0.0052 | 0.0 | 0.0 | 0.0802 | 0.3751 | 0.0001 | 0.0317 | 0.0255 | 0.2304 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7383 | 0.4997 | 2850 | 2.4580 | 0.014 | 0.0294 | 0.0114 | 0.0059 | 0.0157 | 0.017 | 0.0351 | 0.0585 | 0.06 | 0.015 | 0.0616 | 0.0781 | 0.0 | 0.0 | 0.026 | 0.3604 | 0.0 | 0.0 | 0.0 | 0.0 | 0.022 | 0.0214 | 0.0 | 0.0 | 0.0809 | 0.5119 | 0.0 | 0.0 | 0.0183 | 0.1144 | 0.0 | 0.0 | 0.1565 | 0.6553 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.014 | 0.0071 | 0.0233 | 0.0001 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.0524 | 0.0 | 0.0 | 0.0008 | 0.0058 | 0.0 | 0.0 | 0.2148 | 0.4119 | 0.0007 | 0.0169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0923 | 0.3404 | 0.0 | 0.0289 | 0.0205 | 0.1929 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3969 | 0.5085 | 2900 | 2.3994 | 0.0139 | 0.0298 | 0.0115 | 0.006 | 0.0171 | 0.0122 | 0.0408 | 0.0676 | 0.0693 | 0.0167 | 0.0653 | 0.0863 | 0.0 | 0.0 | 0.0385 | 0.4015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.054 | 0.1984 | 0.0 | 0.0 | 0.0743 | 0.6016 | 0.0 | 0.0 | 0.0321 | 0.1419 | 0.0 | 0.0 | 0.111 | 0.7096 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0248 | 0.0022 | 0.011 | 0.0003 | 0.0193 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0024 | 0.0598 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2075 | 0.4093 | 0.0008 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0059 | 0.0028 | 0.002 | 0.0015 | 0.0 | 0.0 | 0.0735 | 0.348 | 0.0001 | 0.0317 | 0.034 | 0.2201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8744 | 0.5173 | 2950 | 2.4236 | 0.0142 | 0.0302 | 0.0117 | 0.0062 | 0.0187 | 0.0127 | 0.0415 | 0.0688 | 0.0707 | 0.0164 | 0.0666 | 0.092 | 0.0 | 0.0 | 0.0364 | 0.4017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0809 | 0.2731 | 0.0 | 0.0 | 0.0636 | 0.592 | 0.0 | 0.0 | 0.0255 | 0.1325 | 0.0 | 0.0 | 0.1172 | 0.6996 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0178 | 0.0056 | 0.0151 | 0.0 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.0549 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2097 | 0.4215 | 0.0026 | 0.0362 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0018 | 0.0077 | 0.0082 | 0.0 | 0.0 | 0.0703 | 0.3609 | 0.0001 | 0.0287 | 0.031 | 0.2069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2069 | 0.5260 | 3000 | 2.3837 | 0.0143 | 0.0299 | 0.0122 | 0.0062 | 0.0167 | 0.0131 | 0.0414 | 0.0661 | 0.068 | 0.0177 | 0.0681 | 0.0858 | 0.0 | 0.0 | 0.035 | 0.382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0618 | 0.2099 | 0.0 | 0.0 | 0.0645 | 0.5782 | 0.0 | 0.0 | 0.02 | 0.1213 | 0.0 | 0.0 | 0.1504 | 0.6894 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0032 | 0.0209 | 0.0025 | 0.0068 | 0.0004 | 0.0193 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2023 | 0.4255 | 0.0026 | 0.0343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0882 | 0.3815 | 0.0 | 0.0293 | 0.0264 | 0.1881 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.471 | 0.5348 | 3050 | 2.3813 | 0.0143 | 0.0313 | 0.0112 | 0.0085 | 0.0157 | 0.0203 | 0.0422 | 0.066 | 0.0677 | 0.0189 | 0.067 | 0.0854 | 0.0 | 0.0 | 0.0297 | 0.3494 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0647 | 0.2203 | 0.0 | 0.0 | 0.0699 | 0.5606 | 0.0 | 0.0 | 0.0218 | 0.1425 | 0.0 | 0.0 | 0.1336 | 0.6982 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0171 | 0.0195 | 0.0397 | 0.0001 | 0.0174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0017 | 0.0506 | 0.0 | 0.0 | 0.0011 | 0.005 | 0.0 | 0.0 | 0.1917 | 0.4071 | 0.0009 | 0.0239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0014 | 0.002 | 0.0015 | 0.0 | 0.0 | 0.0911 | 0.3622 | 0.0001 | 0.0386 | 0.0269 | 0.1804 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5737 | 0.5436 | 3100 | 2.3910 | 0.0144 | 0.0307 | 0.0122 | 0.0069 | 0.017 | 0.0128 | 0.0415 | 0.0665 | 0.0682 | 0.0171 | 0.0645 | 0.0904 | 0.0 | 0.0 | 0.0344 | 0.3867 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0721 | 0.2582 | 0.0 | 0.0 | 0.0725 | 0.5795 | 0.0 | 0.0 | 0.0262 | 0.1388 | 0.0 | 0.0 | 0.1138 | 0.6502 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0132 | 0.0119 | 0.0247 | 0.0002 | 0.0128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0372 | 0.0 | 0.0 | 0.0001 | 0.0008 | 0.0 | 0.0 | 0.203 | 0.4221 | 0.0006 | 0.0244 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.003 | 0.0014 | 0.0181 | 0.0179 | 0.0 | 0.0 | 0.0723 | 0.3391 | 0.0001 | 0.0409 | 0.0341 | 0.1889 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0143 | 0.5523 | 3150 | 2.3837 | 0.0142 | 0.0321 | 0.0107 | 0.006 | 0.019 | 0.0167 | 0.0412 | 0.0647 | 0.0664 | 0.0178 | 0.0698 | 0.0834 | 0.0 | 0.0 | 0.0335 | 0.3383 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0634 | 0.2132 | 0.0 | 0.0 | 0.0699 | 0.5256 | 0.0 | 0.0 | 0.0248 | 0.1394 | 0.0 | 0.0 | 0.138 | 0.6937 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0186 | 0.0119 | 0.0082 | 0.0005 | 0.0266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.003 | 0.0561 | 0.0 | 0.0 | 0.0005 | 0.005 | 0.0 | 0.0 | 0.1948 | 0.4151 | 0.0007 | 0.0122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.004 | 0.0023 | 0.0051 | 0.0112 | 0.0 | 0.0 | 0.0665 | 0.3439 | 0.0001 | 0.0371 | 0.0353 | 0.2085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0058 | 0.5611 | 3200 | 2.3539 | 0.0163 | 0.0341 | 0.0143 | 0.0067 | 0.0198 | 0.0192 | 0.0451 | 0.0696 | 0.0712 | 0.0192 | 0.0717 | 0.0908 | 0.0 | 0.0 | 0.0348 | 0.3801 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0734 | 0.2516 | 0.0 | 0.0 | 0.0795 | 0.5734 | 0.0 | 0.0 | 0.0288 | 0.1725 | 0.0 | 0.0 | 0.1602 | 0.677 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0194 | 0.0139 | 0.0096 | 0.0001 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0022 | 0.0591 | 0.0 | 0.0 | 0.004 | 0.0033 | 0.0 | 0.0 | 0.2225 | 0.4366 | 0.001 | 0.0155 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.006 | 0.0107 | 0.0313 | 0.0 | 0.0 | 0.0795 | 0.368 | 0.0001 | 0.0375 | 0.0376 | 0.2274 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9248 | 0.5699 | 3250 | 2.3710 | 0.0152 | 0.0344 | 0.0116 | 0.0055 | 0.0183 | 0.0144 | 0.0458 | 0.0728 | 0.0744 | 0.0159 | 0.0723 | 0.0968 | 0.0 | 0.0 | 0.0381 | 0.3682 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1006 | 0.3775 | 0.0 | 0.0 | 0.079 | 0.5728 | 0.0 | 0.0 | 0.0366 | 0.2144 | 0.0 | 0.0 | 0.1095 | 0.6823 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0124 | 0.0109 | 0.0192 | 0.0002 | 0.0165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0019 | 0.0604 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1749 | 0.3696 | 0.0017 | 0.0254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0222 | 0.053 | 0.0 | 0.0 | 0.0861 | 0.3861 | 0.0005 | 0.0364 | 0.0356 | 0.2265 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4845 | 0.5786 | 3300 | 2.3575 | 0.0172 | 0.0358 | 0.0152 | 0.007 | 0.0211 | 0.0206 | 0.0465 | 0.0751 | 0.0769 | 0.0204 | 0.0789 | 0.0961 | 0.0 | 0.0 | 0.0338 | 0.39 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1084 | 0.3352 | 0.0 | 0.0 | 0.0901 | 0.5865 | 0.0 | 0.0 | 0.0256 | 0.1637 | 0.0 | 0.0 | 0.1222 | 0.6957 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.0395 | 0.0131 | 0.0192 | 0.0001 | 0.0101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.072 | 0.0 | 0.0 | 0.0002 | 0.005 | 0.0 | 0.0 | 0.2249 | 0.4389 | 0.0015 | 0.0315 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0074 | 0.0242 | 0.0515 | 0.0 | 0.0 | 0.1046 | 0.3955 | 0.0011 | 0.0308 | 0.0377 | 0.2663 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6436 | 0.5874 | 3350 | 2.3762 | 0.016 | 0.034 | 0.0139 | 0.0058 | 0.0193 | 0.0205 | 0.0413 | 0.0657 | 0.0674 | 0.0168 | 0.066 | 0.0903 | 0.0 | 0.0 | 0.0423 | 0.3475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1146 | 0.2973 | 0.0 | 0.0 | 0.082 | 0.559 | 0.0 | 0.0 | 0.0217 | 0.0969 | 0.0 | 0.0 | 0.1201 | 0.6346 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0248 | 0.0059 | 0.0123 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0366 | 0.0 | 0.0 | 0.0011 | 0.0083 | 0.0 | 0.0 | 0.2079 | 0.4224 | 0.0006 | 0.0221 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0149 | 0.0172 | 0.0 | 0.0 | 0.0955 | 0.3659 | 0.0 | 0.0334 | 0.028 | 0.2235 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8165 | 0.5962 | 3400 | 2.3591 | 0.0168 | 0.0365 | 0.0139 | 0.0059 | 0.0208 | 0.0241 | 0.0467 | 0.0731 | 0.0749 | 0.0174 | 0.0757 | 0.106 | 0.0 | 0.0 | 0.0347 | 0.3883 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1181 | 0.3434 | 0.0 | 0.0 | 0.0839 | 0.5728 | 0.0 | 0.0 | 0.029 | 0.1544 | 0.0 | 0.0 | 0.1241 | 0.6687 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0341 | 0.01 | 0.0288 | 0.0002 | 0.011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0037 | 0.0726 | 0.0 | 0.0 | 0.0006 | 0.0142 | 0.0 | 0.0 | 0.2038 | 0.3949 | 0.0013 | 0.0254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0069 | 0.0236 | 0.0672 | 0.0 | 0.0 | 0.0974 | 0.3919 | 0.0001 | 0.0366 | 0.0339 | 0.2364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6289 | 0.6049 | 3450 | 2.3350 | 0.0193 | 0.0396 | 0.0164 | 0.0072 | 0.023 | 0.0238 | 0.0485 | 0.0748 | 0.0766 | 0.02 | 0.0758 | 0.1012 | 0.0 | 0.0 | 0.0408 | 0.4006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1158 | 0.3297 | 0.0 | 0.0 | 0.1037 | 0.5962 | 0.0 | 0.0 | 0.0372 | 0.1556 | 0.0 | 0.0 | 0.1328 | 0.6974 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0017 | 0.0419 | 0.0162 | 0.0247 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0027 | 0.0622 | 0.0 | 0.0 | 0.0003 | 0.0058 | 0.0 | 0.0 | 0.2475 | 0.4331 | 0.0009 | 0.023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0009 | 0.0291 | 0.0627 | 0.0 | 0.0 | 0.1206 | 0.3855 | 0.0001 | 0.0405 | 0.0395 | 0.2619 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9005 | 0.6137 | 3500 | 2.3229 | 0.0181 | 0.0379 | 0.015 | 0.0066 | 0.0222 | 0.023 | 0.0504 | 0.077 | 0.079 | 0.0178 | 0.0783 | 0.1085 | 0.0 | 0.0 | 0.0441 | 0.3786 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1523 | 0.4462 | 0.0 | 0.0 | 0.0911 | 0.6067 | 0.0 | 0.0 | 0.0328 | 0.1513 | 0.0 | 0.0 | 0.1074 | 0.7035 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0295 | 0.0098 | 0.0301 | 0.0001 | 0.0064 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.0561 | 0.0 | 0.0 | 0.0004 | 0.0058 | 0.0 | 0.0 | 0.2127 | 0.4042 | 0.0015 | 0.0225 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0032 | 0.0111 | 0.0169 | 0.0903 | 0.0 | 0.0 | 0.1206 | 0.4023 | 0.0001 | 0.041 | 0.0379 | 0.2482 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.295 | 0.6225 | 3550 | 2.3397 | 0.0182 | 0.0381 | 0.0157 | 0.007 | 0.0213 | 0.0223 | 0.05 | 0.0778 | 0.0796 | 0.0186 | 0.0807 | 0.1027 | 0.0 | 0.0 | 0.0416 | 0.3835 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1718 | 0.4214 | 0.0 | 0.0 | 0.0627 | 0.6054 | 0.0 | 0.0 | 0.027 | 0.1444 | 0.0 | 0.0 | 0.1068 | 0.736 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0434 | 0.0109 | 0.0479 | 0.0 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0494 | 0.0 | 0.0 | 0.0008 | 0.0158 | 0.0 | 0.0 | 0.2304 | 0.4273 | 0.0003 | 0.0113 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0024 | 0.0147 | 0.0163 | 0.0746 | 0.0 | 0.0 | 0.1267 | 0.3987 | 0.0001 | 0.0496 | 0.039 | 0.2342 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.664 | 0.6312 | 3600 | 2.3075 | 0.0189 | 0.0386 | 0.0172 | 0.0072 | 0.0239 | 0.0226 | 0.0494 | 0.0755 | 0.077 | 0.0222 | 0.0815 | 0.0931 | 0.0 | 0.0 | 0.0438 | 0.3909 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1359 | 0.3659 | 0.0 | 0.0 | 0.0953 | 0.5885 | 0.0 | 0.0 | 0.0297 | 0.1575 | 0.0 | 0.0 | 0.1464 | 0.7272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0512 | 0.0198 | 0.0178 | 0.0003 | 0.0101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0019 | 0.0494 | 0.0 | 0.0 | 0.0001 | 0.005 | 0.0 | 0.0 | 0.2067 | 0.4464 | 0.0019 | 0.0254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0074 | 0.0168 | 0.0463 | 0.0 | 0.0 | 0.1319 | 0.3749 | 0.0 | 0.036 | 0.0383 | 0.2424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3103 | 0.6400 | 3650 | 2.3209 | 0.0191 | 0.0394 | 0.0169 | 0.0096 | 0.0212 | 0.0185 | 0.0519 | 0.0778 | 0.0794 | 0.0225 | 0.0798 | 0.1005 | 0.0 | 0.0 | 0.0549 | 0.421 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1577 | 0.3593 | 0.0 | 0.0 | 0.0847 | 0.6151 | 0.0 | 0.0 | 0.0355 | 0.1963 | 0.0 | 0.0 | 0.1194 | 0.7134 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.038 | 0.0022 | 0.0082 | 0.0001 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0039 | 0.0634 | 0.0 | 0.0 | 0.0004 | 0.0042 | 0.0 | 0.0 | 0.2359 | 0.4422 | 0.0031 | 0.0277 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0044 | 0.0134 | 0.0202 | 0.0672 | 0.0 | 0.0 | 0.1118 | 0.3757 | 0.0001 | 0.0457 | 0.042 | 0.2555 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7188 | 0.6488 | 3700 | 2.3122 | 0.0183 | 0.0388 | 0.0156 | 0.0068 | 0.0229 | 0.0237 | 0.0508 | 0.0781 | 0.0796 | 0.0195 | 0.0807 | 0.1046 | 0.0 | 0.0 | 0.0472 | 0.4246 | 0.0 | 0.0 | 0.0 | 0.0 | 0.111 | 0.306 | 0.0 | 0.0 | 0.0829 | 0.6026 | 0.0 | 0.0 | 0.0491 | 0.2537 | 0.0 | 0.0 | 0.1227 | 0.7313 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0326 | 0.0218 | 0.0685 | 0.0 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0044 | 0.0683 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.235 | 0.4297 | 0.0034 | 0.0347 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0031 | 0.0161 | 0.0144 | 0.0425 | 0.0 | 0.0 | 0.1053 | 0.3698 | 0.0001 | 0.0401 | 0.0408 | 0.2382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4934 | 0.6575 | 3750 | 2.3195 | 0.0178 | 0.0384 | 0.0141 | 0.0066 | 0.0228 | 0.018 | 0.0521 | 0.0779 | 0.0794 | 0.0189 | 0.0788 | 0.1065 | 0.0 | 0.0 | 0.044 | 0.4176 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1247 | 0.3462 | 0.0 | 0.0 | 0.0788 | 0.5926 | 0.0 | 0.0 | 0.0424 | 0.2275 | 0.0 | 0.0 | 0.1169 | 0.7004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0349 | 0.0174 | 0.1082 | 0.0005 | 0.0211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.0598 | 0.0 | 0.0 | 0.0008 | 0.01 | 0.0 | 0.0 | 0.2264 | 0.4133 | 0.0035 | 0.0418 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0055 | 0.0122 | 0.0455 | 0.0 | 0.0 | 0.109 | 0.3746 | 0.0001 | 0.0394 | 0.0405 | 0.2159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.652 | 0.6663 | 3800 | 2.3062 | 0.0188 | 0.0388 | 0.0168 | 0.0072 | 0.0206 | 0.0227 | 0.0507 | 0.0784 | 0.0798 | 0.0188 | 0.0772 | 0.1042 | 0.0 | 0.0 | 0.0459 | 0.4222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1322 | 0.3429 | 0.0 | 0.0 | 0.0787 | 0.5965 | 0.0 | 0.0 | 0.0445 | 0.2081 | 0.0 | 0.0 | 0.1157 | 0.7171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.0326 | 0.0387 | 0.0863 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0015 | 0.0512 | 0.0 | 0.0 | 0.0004 | 0.0167 | 0.0 | 0.0 | 0.2275 | 0.4498 | 0.003 | 0.0441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.0074 | 0.0208 | 0.0612 | 0.0 | 0.0 | 0.1063 | 0.3782 | 0.0001 | 0.0386 | 0.0461 | 0.22 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4515 | 0.6751 | 3850 | 2.3129 | 0.0192 | 0.0399 | 0.0162 | 0.0087 | 0.0217 | 0.0261 | 0.0479 | 0.0724 | 0.0737 | 0.0208 | 0.0757 | 0.0962 | 0.0 | 0.0 | 0.0461 | 0.4284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0712 | 0.1896 | 0.0 | 0.0 | 0.1288 | 0.5362 | 0.0015 | 0.0029 | 0.0363 | 0.1856 | 0.0 | 0.0 | 0.1532 | 0.6776 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0597 | 0.0378 | 0.0795 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0019 | 0.0646 | 0.0 | 0.0 | 0.0009 | 0.01 | 0.0 | 0.0 | 0.2305 | 0.4397 | 0.0038 | 0.046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0005 | 0.0152 | 0.0463 | 0.0 | 0.0 | 0.1182 | 0.3667 | 0.0001 | 0.0386 | 0.0375 | 0.2202 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.022 | 0.6839 | 3900 | 2.2908 | 0.0195 | 0.0405 | 0.0162 | 0.0079 | 0.0221 | 0.0273 | 0.0548 | 0.0819 | 0.0834 | 0.0202 | 0.0811 | 0.1161 | 0.0 | 0.0 | 0.0465 | 0.4252 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1172 | 0.3852 | 0.0 | 0.0 | 0.1043 | 0.5904 | 0.0 | 0.0 | 0.0587 | 0.26 | 0.0 | 0.0 | 0.1241 | 0.7116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0041 | 0.0698 | 0.0459 | 0.0712 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 | 0.0616 | 0.0 | 0.0 | 0.0012 | 0.0225 | 0.0 | 0.0 | 0.2331 | 0.4353 | 0.0054 | 0.0545 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0092 | 0.0187 | 0.0843 | 0.0 | 0.0 | 0.0914 | 0.3698 | 0.0001 | 0.0437 | 0.0451 | 0.2427 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2464 | 0.6926 | 3950 | 2.2897 | 0.0204 | 0.0435 | 0.0165 | 0.0069 | 0.0217 | 0.0292 | 0.0547 | 0.0797 | 0.0812 | 0.0207 | 0.0811 | 0.109 | 0.0 | 0.0 | 0.043 | 0.3985 | 0.0 | 0.0 | 0.0 | 0.0 | 0.113 | 0.3505 | 0.0 | 0.0 | 0.1428 | 0.5606 | 0.0 | 0.0 | 0.0436 | 0.2656 | 0.0 | 0.0 | 0.1676 | 0.7024 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.0837 | 0.0365 | 0.0986 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.064 | 0.0 | 0.0 | 0.0019 | 0.035 | 0.0 | 0.0 | 0.2033 | 0.3996 | 0.0053 | 0.0465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0078 | 0.0236 | 0.0731 | 0.0 | 0.0 | 0.1034 | 0.369 | 0.0001 | 0.0382 | 0.0498 | 0.2417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4056 | 0.7014 | 4000 | 2.2959 | 0.0201 | 0.0432 | 0.0162 | 0.0075 | 0.0215 | 0.0269 | 0.0533 | 0.0782 | 0.0798 | 0.0166 | 0.0771 | 0.1047 | 0.0 | 0.0 | 0.0583 | 0.4265 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1172 | 0.35 | 0.0 | 0.0 | 0.1024 | 0.599 | 0.0 | 0.0 | 0.0499 | 0.24 | 0.0 | 0.0 | 0.1493 | 0.6591 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0015 | 0.0434 | 0.0263 | 0.0904 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0567 | 0.0 | 0.0 | 0.0025 | 0.0492 | 0.0 | 0.0 | 0.2298 | 0.4159 | 0.0055 | 0.0484 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.0041 | 0.0247 | 0.0828 | 0.0 | 0.0 | 0.1172 | 0.365 | 0.0003 | 0.0461 | 0.0352 | 0.1939 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9531 | 0.7102 | 4050 | 2.2578 | 0.0225 | 0.0452 | 0.02 | 0.0078 | 0.0252 | 0.0288 | 0.0585 | 0.0879 | 0.0894 | 0.0206 | 0.0933 | 0.1094 | 0.0 | 0.0 | 0.0445 | 0.4208 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1388 | 0.4335 | 0.0 | 0.0 | 0.1216 | 0.6208 | 0.0 | 0.0 | 0.0507 | 0.3331 | 0.0 | 0.0 | 0.1659 | 0.7421 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0013 | 0.0512 | 0.0429 | 0.0945 | 0.0012 | 0.011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 | 0.0598 | 0.0 | 0.0 | 0.0024 | 0.0375 | 0.0 | 0.0 | 0.2319 | 0.4338 | 0.0034 | 0.0319 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0059 | 0.0341 | 0.0232 | 0.0978 | 0.0 | 0.0 | 0.1588 | 0.4211 | 0.0001 | 0.0382 | 0.042 | 0.2525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.445 | 0.7189 | 4100 | 2.2731 | 0.0212 | 0.0441 | 0.0184 | 0.0088 | 0.0245 | 0.0269 | 0.0593 | 0.0891 | 0.0902 | 0.022 | 0.0941 | 0.1127 | 0.0 | 0.0 | 0.0451 | 0.4449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1366 | 0.4264 | 0.0 | 0.0 | 0.123 | 0.6285 | 0.0 | 0.0 | 0.0577 | 0.3569 | 0.0 | 0.0 | 0.1732 | 0.7504 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0512 | 0.0225 | 0.1041 | 0.0002 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0036 | 0.0817 | 0.0 | 0.0 | 0.0039 | 0.03 | 0.0 | 0.0 | 0.2022 | 0.3936 | 0.0085 | 0.0484 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0086 | 0.0171 | 0.0219 | 0.1276 | 0.0 | 0.0 | 0.1284 | 0.3962 | 0.0001 | 0.0321 | 0.0378 | 0.2532 | 0.0 | 0.0 | 0.0007 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0963 | 0.7277 | 4150 | 2.3154 | 0.0204 | 0.044 | 0.0162 | 0.0071 | 0.0242 | 0.0252 | 0.0557 | 0.0852 | 0.0866 | 0.0186 | 0.0862 | 0.1182 | 0.0 | 0.0 | 0.0539 | 0.4515 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1181 | 0.4302 | 0.0 | 0.0 | 0.1298 | 0.626 | 0.0003 | 0.0019 | 0.0624 | 0.2988 | 0.0 | 0.0 | 0.1452 | 0.728 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0605 | 0.03 | 0.0877 | 0.0001 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0043 | 0.089 | 0.0 | 0.0 | 0.0013 | 0.0158 | 0.0 | 0.0 | 0.2163 | 0.378 | 0.0077 | 0.0352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0032 | 0.0152 | 0.0234 | 0.1269 | 0.0 | 0.0 | 0.1091 | 0.3613 | 0.0001 | 0.0285 | 0.0344 | 0.2468 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8857 | 0.7365 | 4200 | 2.2681 | 0.0216 | 0.0441 | 0.0181 | 0.0082 | 0.0256 | 0.0275 | 0.0584 | 0.0866 | 0.088 | 0.0213 | 0.0898 | 0.1214 | 0.0 | 0.0 | 0.05 | 0.4358 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1226 | 0.4192 | 0.0 | 0.0 | 0.1206 | 0.6103 | 0.0 | 0.0 | 0.055 | 0.3262 | 0.0 | 0.0 | 0.1739 | 0.7248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0442 | 0.0123 | 0.0685 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0066 | 0.097 | 0.0 | 0.0 | 0.0018 | 0.0333 | 0.0 | 0.0 | 0.2455 | 0.4239 | 0.0037 | 0.0516 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0089 | 0.0226 | 0.0251 | 0.1015 | 0.0 | 0.0 | 0.1281 | 0.3928 | 0.0001 | 0.0351 | 0.0378 | 0.2628 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6289 | 0.7452 | 4250 | 2.2522 | 0.0206 | 0.0433 | 0.017 | 0.0079 | 0.0244 | 0.0277 | 0.0583 | 0.0884 | 0.0899 | 0.0221 | 0.0937 | 0.1222 | 0.0 | 0.0 | 0.0418 | 0.4195 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1299 | 0.4582 | 0.0 | 0.0 | 0.1051 | 0.6167 | 0.0001 | 0.001 | 0.0481 | 0.3169 | 0.0 | 0.0 | 0.1654 | 0.7472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 | 0.0736 | 0.0365 | 0.1055 | 0.0 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.0738 | 0.0 | 0.0 | 0.0044 | 0.04 | 0.0 | 0.0 | 0.2179 | 0.3932 | 0.003 | 0.061 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0054 | 0.0295 | 0.0263 | 0.0851 | 0.0 | 0.0 | 0.1219 | 0.412 | 0.0001 | 0.0353 | 0.0372 | 0.2653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7337 | 0.7540 | 4300 | 2.2464 | 0.0219 | 0.0449 | 0.0196 | 0.0086 | 0.0259 | 0.0227 | 0.0593 | 0.0896 | 0.0912 | 0.0225 | 0.0947 | 0.1188 | 0.0 | 0.0 | 0.048 | 0.4297 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1479 | 0.433 | 0.0 | 0.0 | 0.0927 | 0.6314 | 0.0 | 0.0 | 0.0506 | 0.3344 | 0.0 | 0.0 | 0.1579 | 0.7055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0628 | 0.0305 | 0.1247 | 0.0003 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0027 | 0.0835 | 0.0 | 0.0 | 0.0042 | 0.0508 | 0.0 | 0.0 | 0.2543 | 0.438 | 0.0063 | 0.0681 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0062 | 0.0217 | 0.0227 | 0.0993 | 0.0 | 0.0 | 0.1349 | 0.4193 | 0.0001 | 0.0409 | 0.0471 | 0.2504 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2441 | 0.7628 | 4350 | 2.2675 | 0.0207 | 0.0454 | 0.0163 | 0.0075 | 0.0233 | 0.0246 | 0.0544 | 0.0823 | 0.0837 | 0.0192 | 0.0866 | 0.1135 | 0.0 | 0.0 | 0.0458 | 0.404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1233 | 0.3731 | 0.0 | 0.0 | 0.1252 | 0.609 | 0.0 | 0.0 | 0.0394 | 0.3088 | 0.0 | 0.0 | 0.1951 | 0.6789 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0527 | 0.0183 | 0.1082 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0628 | 0.0 | 0.0 | 0.0022 | 0.0367 | 0.0 | 0.0 | 0.2142 | 0.3965 | 0.0071 | 0.07 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.0189 | 0.0183 | 0.0828 | 0.0 | 0.0 | 0.1172 | 0.3609 | 0.0001 | 0.0403 | 0.0421 | 0.2466 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3729 | 0.7715 | 4400 | 2.2554 | 0.0231 | 0.0479 | 0.0191 | 0.0092 | 0.0279 | 0.0274 | 0.0584 | 0.0864 | 0.0879 | 0.025 | 0.0925 | 0.109 | 0.0 | 0.0 | 0.0487 | 0.3953 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1141 | 0.4527 | 0.0 | 0.0 | 0.1769 | 0.6038 | 0.0002 | 0.001 | 0.0553 | 0.2869 | 0.0 | 0.0 | 0.1943 | 0.7307 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 | 0.0682 | 0.0256 | 0.1192 | 0.0001 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.0793 | 0.0 | 0.0 | 0.0017 | 0.0233 | 0.0 | 0.0 | 0.213 | 0.4187 | 0.0081 | 0.0653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0054 | 0.0212 | 0.0242 | 0.0955 | 0.0 | 0.0 | 0.14 | 0.3754 | 0.0001 | 0.0418 | 0.0508 | 0.2615 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0254 | 0.7803 | 4450 | 2.2672 | 0.0226 | 0.0462 | 0.0194 | 0.0077 | 0.0243 | 0.0308 | 0.0576 | 0.0852 | 0.0867 | 0.0208 | 0.091 | 0.1138 | 0.0 | 0.0 | 0.0568 | 0.4121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1379 | 0.4808 | 0.0 | 0.0 | 0.1366 | 0.5907 | 0.004 | 0.0133 | 0.0493 | 0.2525 | 0.0 | 0.0 | 0.188 | 0.6762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 | 0.0519 | 0.0144 | 0.0795 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0494 | 0.0 | 0.0 | 0.0007 | 0.0192 | 0.0 | 0.0 | 0.2306 | 0.4435 | 0.0077 | 0.0531 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0056 | 0.0475 | 0.0161 | 0.141 | 0.0 | 0.0 | 0.1506 | 0.391 | 0.0001 | 0.0472 | 0.0387 | 0.2399 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6321 | 0.7891 | 4500 | 2.2413 | 0.0238 | 0.0471 | 0.0212 | 0.008 | 0.0247 | 0.032 | 0.0615 | 0.09 | 0.0916 | 0.024 | 0.0917 | 0.1235 | 0.0 | 0.0 | 0.0518 | 0.4235 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1546 | 0.4527 | 0.0 | 0.0 | 0.1439 | 0.6125 | 0.002 | 0.0105 | 0.0518 | 0.3106 | 0.0 | 0.0 | 0.1815 | 0.7156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.003 | 0.076 | 0.0327 | 0.1205 | 0.0001 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0044 | 0.0756 | 0.0 | 0.0 | 0.0038 | 0.055 | 0.0 | 0.0 | 0.2443 | 0.4562 | 0.0045 | 0.0577 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0036 | 0.0253 | 0.0262 | 0.1313 | 0.0 | 0.0 | 0.1411 | 0.4027 | 0.0001 | 0.0392 | 0.0441 | 0.2427 | 0.0 | 0.0 | 0.001 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3166 | 0.7978 | 4550 | 2.2315 | 0.025 | 0.0482 | 0.0238 | 0.0091 | 0.0275 | 0.03 | 0.0626 | 0.093 | 0.0946 | 0.0245 | 0.0923 | 0.1323 | 0.0 | 0.0 | 0.0567 | 0.465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1628 | 0.4335 | 0.0 | 0.0 | 0.1222 | 0.6244 | 0.0006 | 0.0029 | 0.0717 | 0.3262 | 0.0 | 0.0 | 0.1704 | 0.7344 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0036 | 0.0674 | 0.0455 | 0.1822 | 0.0001 | 0.0101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0106 | 0.0713 | 0.0 | 0.0 | 0.0031 | 0.0575 | 0.0 | 0.0 | 0.2591 | 0.4732 | 0.0076 | 0.0554 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0078 | 0.0262 | 0.1381 | 0.0 | 0.0 | 0.1558 | 0.4149 | 0.0001 | 0.0371 | 0.0507 | 0.2475 | 0.0 | 0.0 | 0.0004 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2897 | 0.8066 | 4600 | 2.2356 | 0.0253 | 0.0489 | 0.024 | 0.0107 | 0.0273 | 0.036 | 0.0631 | 0.0922 | 0.0937 | 0.0261 | 0.0968 | 0.1288 | 0.0 | 0.0 | 0.0485 | 0.454 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1537 | 0.3802 | 0.0 | 0.0 | 0.142 | 0.6237 | 0.0012 | 0.0038 | 0.066 | 0.3356 | 0.0 | 0.0 | 0.2018 | 0.749 | 0.0 | 0.0 | 0.0 | 0.0 | 0.005 | 0.0736 | 0.0532 | 0.1699 | 0.0002 | 0.0183 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0038 | 0.0652 | 0.0 | 0.0 | 0.0028 | 0.0617 | 0.0 | 0.0 | 0.2618 | 0.4626 | 0.0075 | 0.0653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0034 | 0.0143 | 0.0143 | 0.1284 | 0.0 | 0.0 | 0.1526 | 0.4104 | 0.0001 | 0.0341 | 0.0479 | 0.2611 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3089 | 0.8154 | 4650 | 2.2320 | 0.025 | 0.0495 | 0.0224 | 0.0097 | 0.0256 | 0.0366 | 0.0604 | 0.0887 | 0.0904 | 0.0264 | 0.0901 | 0.1232 | 0.0 | 0.0 | 0.0467 | 0.4195 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1425 | 0.3923 | 0.0 | 0.0 | 0.1551 | 0.5946 | 0.0045 | 0.0086 | 0.0586 | 0.3169 | 0.0 | 0.0 | 0.2098 | 0.7175 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0068 | 0.0977 | 0.0424 | 0.1315 | 0.0001 | 0.0147 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0036 | 0.0518 | 0.0 | 0.0 | 0.0037 | 0.0708 | 0.0 | 0.0 | 0.2668 | 0.4665 | 0.0046 | 0.0634 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0088 | 0.0163 | 0.1022 | 0.0 | 0.0 | 0.1416 | 0.4044 | 0.0001 | 0.042 | 0.0469 | 0.2539 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4478 | 0.8241 | 4700 | 2.2284 | 0.024 | 0.0483 | 0.0209 | 0.0085 | 0.026 | 0.0371 | 0.0626 | 0.092 | 0.0937 | 0.0271 | 0.0956 | 0.1232 | 0.0 | 0.0 | 0.0436 | 0.4025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1502 | 0.4637 | 0.0 | 0.0 | 0.1567 | 0.5782 | 0.0034 | 0.0162 | 0.0557 | 0.3219 | 0.0 | 0.0 | 0.2035 | 0.7366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0022 | 0.0969 | 0.0429 | 0.1671 | 0.0002 | 0.0156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0017 | 0.0543 | 0.0 | 0.0 | 0.003 | 0.0517 | 0.0 | 0.0 | 0.2328 | 0.4534 | 0.0049 | 0.0554 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0045 | 0.0355 | 0.019 | 0.1306 | 0.0 | 0.0 | 0.1336 | 0.4139 | 0.0001 | 0.0388 | 0.0465 | 0.279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5326 | 0.8329 | 4750 | 2.2199 | 0.0238 | 0.05 | 0.0201 | 0.0085 | 0.0268 | 0.0316 | 0.0598 | 0.0888 | 0.0905 | 0.0237 | 0.0925 | 0.121 | 0.0 | 0.0 | 0.0528 | 0.4163 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1544 | 0.439 | 0.0 | 0.0 | 0.1331 | 0.5804 | 0.0005 | 0.0076 | 0.0545 | 0.2894 | 0.0 | 0.0 | 0.1977 | 0.7049 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0024 | 0.0775 | 0.0317 | 0.1562 | 0.0002 | 0.0275 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0032 | 0.053 | 0.0 | 0.0 | 0.0064 | 0.0375 | 0.0 | 0.0 | 0.2521 | 0.4531 | 0.006 | 0.069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.0212 | 0.0214 | 0.1328 | 0.0 | 0.0 | 0.123 | 0.3976 | 0.0001 | 0.0388 | 0.0512 | 0.2605 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.57 | 0.8417 | 4800 | 2.2124 | 0.0234 | 0.0491 | 0.0196 | 0.0086 | 0.0265 | 0.0285 | 0.0595 | 0.09 | 0.0916 | 0.022 | 0.0957 | 0.1235 | 0.0 | 0.0 | 0.0566 | 0.4339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1458 | 0.4341 | 0.0 | 0.0 | 0.1062 | 0.609 | 0.0003 | 0.0038 | 0.0679 | 0.3056 | 0.0 | 0.0 | 0.193 | 0.7217 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0046 | 0.0504 | 0.0292 | 0.1151 | 0.0004 | 0.0275 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0026 | 0.0634 | 0.0 | 0.0 | 0.0052 | 0.055 | 0.0 | 0.0 | 0.2552 | 0.4596 | 0.0055 | 0.0714 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 | 0.0373 | 0.0207 | 0.1351 | 0.0 | 0.0 | 0.1366 | 0.407 | 0.0001 | 0.0399 | 0.0439 | 0.2442 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3935 | 0.8504 | 4850 | 2.1941 | 0.025 | 0.0506 | 0.0219 | 0.0092 | 0.0276 | 0.0342 | 0.0625 | 0.0934 | 0.0947 | 0.0219 | 0.0983 | 0.1245 | 0.0 | 0.0 | 0.0568 | 0.4195 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1584 | 0.4973 | 0.0 | 0.0 | 0.1103 | 0.6426 | 0.0 | 0.0 | 0.0805 | 0.325 | 0.0 | 0.0 | 0.1912 | 0.7598 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0056 | 0.0581 | 0.0417 | 0.1699 | 0.0003 | 0.0165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0034 | 0.0695 | 0.0 | 0.0 | 0.0035 | 0.0483 | 0.0 | 0.0 | 0.2723 | 0.4504 | 0.0045 | 0.0516 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0047 | 0.0585 | 0.0156 | 0.109 | 0.0 | 0.0 | 0.1534 | 0.4082 | 0.0001 | 0.0375 | 0.0466 | 0.2357 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3942 | 0.8592 | 4900 | 2.2235 | 0.0248 | 0.0505 | 0.0212 | 0.0102 | 0.0254 | 0.0285 | 0.0648 | 0.0945 | 0.0958 | 0.0251 | 0.1021 | 0.1139 | 0.0 | 0.0 | 0.0396 | 0.414 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1794 | 0.4824 | 0.0 | 0.0 | 0.1204 | 0.6253 | 0.0005 | 0.0019 | 0.0582 | 0.3375 | 0.0 | 0.0 | 0.2281 | 0.7685 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0073 | 0.107 | 0.0343 | 0.1849 | 0.0003 | 0.0229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 | 0.0902 | 0.0 | 0.0 | 0.0027 | 0.0425 | 0.0 | 0.0 | 0.2194 | 0.4191 | 0.0076 | 0.0577 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0117 | 0.0535 | 0.0291 | 0.1373 | 0.0 | 0.0 | 0.1532 | 0.3832 | 0.0001 | 0.0325 | 0.0445 | 0.2459 | 0.0 | 0.0 | 0.0002 | 0.0026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.27 | 0.8680 | 4950 | 2.2098 | 0.0271 | 0.0544 | 0.0243 | 0.0116 | 0.0276 | 0.0304 | 0.0658 | 0.098 | 0.0993 | 0.0273 | 0.1036 | 0.1252 | 0.0 | 0.0 | 0.0489 | 0.4436 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1619 | 0.5115 | 0.0 | 0.0 | 0.1646 | 0.6301 | 0.0 | 0.0 | 0.0601 | 0.3875 | 0.0 | 0.0 | 0.241 | 0.7337 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.1333 | 0.0448 | 0.1808 | 0.0 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.006 | 0.0976 | 0.0 | 0.0 | 0.0038 | 0.035 | 0.0 | 0.0 | 0.264 | 0.4588 | 0.007 | 0.0587 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0116 | 0.0512 | 0.021 | 0.1537 | 0.0 | 0.0 | 0.1566 | 0.3904 | 0.0001 | 0.0399 | 0.047 | 0.2534 | 0.0 | 0.0 | 0.0001 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6008 | 0.8767 | 5000 | 2.2116 | 0.0257 | 0.0506 | 0.0232 | 0.0099 | 0.0287 | 0.028 | 0.0671 | 0.1008 | 0.102 | 0.0278 | 0.1085 | 0.126 | 0.0 | 0.0 | 0.0547 | 0.4335 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1643 | 0.5703 | 0.0 | 0.0 | 0.1322 | 0.6535 | 0.0004 | 0.0057 | 0.0719 | 0.3475 | 0.0 | 0.0 | 0.1999 | 0.7614 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0092 | 0.1279 | 0.039 | 0.2123 | 0.0003 | 0.0174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0057 | 0.075 | 0.0 | 0.0 | 0.0017 | 0.0358 | 0.0 | 0.0 | 0.2631 | 0.4415 | 0.0049 | 0.0596 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0196 | 0.0779 | 0.0211 | 0.1866 | 0.0 | 0.0 | 0.1529 | 0.4091 | 0.0001 | 0.0312 | 0.0429 | 0.2437 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7406 | 0.8855 | 5050 | 2.2007 | 0.0255 | 0.0516 | 0.0223 | 0.0093 | 0.0284 | 0.0302 | 0.0633 | 0.0943 | 0.0956 | 0.0259 | 0.101 | 0.12 | 0.0 | 0.0 | 0.0486 | 0.4146 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1521 | 0.4989 | 0.0 | 0.0 | 0.1625 | 0.6385 | 0.0056 | 0.0086 | 0.0528 | 0.3369 | 0.0 | 0.0 | 0.2306 | 0.7606 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0977 | 0.0257 | 0.211 | 0.0 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0032 | 0.0646 | 0.0 | 0.0 | 0.0024 | 0.0467 | 0.0 | 0.0 | 0.245 | 0.4308 | 0.0037 | 0.0653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0093 | 0.0535 | 0.0174 | 0.0806 | 0.0 | 0.0 | 0.1533 | 0.4043 | 0.0001 | 0.0375 | 0.0526 | 0.2443 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9428 | 0.8943 | 5100 | 2.1794 | 0.0275 | 0.054 | 0.0251 | 0.01 | 0.0287 | 0.0339 | 0.0674 | 0.1009 | 0.1025 | 0.03 | 0.1073 | 0.1295 | 0.0 | 0.0 | 0.0491 | 0.4561 | 0.0 | 0.0 | 0.0 | 0.0 | 0.164 | 0.5473 | 0.0 | 0.0 | 0.1663 | 0.6606 | 0.0052 | 0.0181 | 0.0692 | 0.3606 | 0.0 | 0.0 | 0.2479 | 0.7715 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0049 | 0.1078 | 0.0369 | 0.2438 | 0.0002 | 0.0138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 | 0.0909 | 0.0 | 0.0 | 0.0035 | 0.0583 | 0.0 | 0.0 | 0.2657 | 0.4533 | 0.007 | 0.084 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0054 | 0.0382 | 0.0151 | 0.1015 | 0.0 | 0.0 | 0.1693 | 0.4329 | 0.0001 | 0.0366 | 0.0525 | 0.2379 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4088 | 0.9030 | 5150 | 2.2078 | 0.0272 | 0.0535 | 0.0248 | 0.0097 | 0.0282 | 0.0366 | 0.0643 | 0.0951 | 0.0966 | 0.0284 | 0.1002 | 0.1318 | 0.0 | 0.0 | 0.0497 | 0.4398 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1666 | 0.4797 | 0.0 | 0.0 | 0.1763 | 0.6385 | 0.0009 | 0.0067 | 0.0654 | 0.3363 | 0.0 | 0.0 | 0.253 | 0.7569 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0047 | 0.0946 | 0.0349 | 0.2288 | 0.0001 | 0.0147 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.0677 | 0.0 | 0.0 | 0.0013 | 0.0667 | 0.0 | 0.0 | 0.2539 | 0.439 | 0.0049 | 0.0817 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0116 | 0.0332 | 0.0236 | 0.0896 | 0.0 | 0.0 | 0.1517 | 0.3973 | 0.0001 | 0.0384 | 0.0483 | 0.2354 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2932 | 0.9118 | 5200 | 2.1739 | 0.0289 | 0.057 | 0.0255 | 0.0105 | 0.0292 | 0.0392 | 0.0674 | 0.0987 | 0.1 | 0.0263 | 0.1053 | 0.1279 | 0.0 | 0.0 | 0.0543 | 0.4381 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1806 | 0.533 | 0.0 | 0.0 | 0.1994 | 0.6423 | 0.0049 | 0.0133 | 0.0712 | 0.3738 | 0.0 | 0.0 | 0.2333 | 0.7374 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0152 | 0.1062 | 0.0552 | 0.2329 | 0.0001 | 0.0101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0024 | 0.078 | 0.0 | 0.0 | 0.0022 | 0.0767 | 0.0 | 0.0 | 0.2526 | 0.4309 | 0.0046 | 0.0629 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.0544 | 0.0198 | 0.1097 | 0.0 | 0.0 | 0.1761 | 0.4091 | 0.0001 | 0.044 | 0.0475 | 0.2461 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6092 | 0.9206 | 5250 | 2.1831 | 0.0271 | 0.0534 | 0.024 | 0.0115 | 0.0279 | 0.0337 | 0.0663 | 0.0966 | 0.0981 | 0.0285 | 0.1046 | 0.1207 | 0.0 | 0.0 | 0.0532 | 0.4525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1891 | 0.5071 | 0.0 | 0.0 | 0.156 | 0.6487 | 0.0064 | 0.0152 | 0.0613 | 0.3631 | 0.0 | 0.0 | 0.2392 | 0.74 | 0.0 | 0.0 | 0.0 | 0.0 | 0.007 | 0.1202 | 0.0282 | 0.1945 | 0.0002 | 0.0183 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0029 | 0.0701 | 0.0 | 0.0 | 0.0007 | 0.035 | 0.0 | 0.0 | 0.2654 | 0.4524 | 0.0081 | 0.085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0066 | 0.0521 | 0.0257 | 0.0978 | 0.0 | 0.0 | 0.1547 | 0.3956 | 0.0001 | 0.0418 | 0.0418 | 0.2243 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1242 | 0.9293 | 5300 | 2.1793 | 0.0274 | 0.0553 | 0.0237 | 0.0111 | 0.0303 | 0.034 | 0.0682 | 0.1009 | 0.1021 | 0.0282 | 0.1046 | 0.1245 | 0.0 | 0.0 | 0.0578 | 0.4678 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1881 | 0.5429 | 0.0 | 0.0 | 0.1534 | 0.6433 | 0.0023 | 0.0067 | 0.0807 | 0.3544 | 0.0 | 0.0 | 0.2184 | 0.7689 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0054 | 0.1264 | 0.042 | 0.2137 | 0.0001 | 0.0183 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0049 | 0.089 | 0.0 | 0.0 | 0.0016 | 0.0475 | 0.0 | 0.0 | 0.2352 | 0.4392 | 0.0089 | 0.077 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.011 | 0.0507 | 0.0283 | 0.1522 | 0.0 | 0.0 | 0.172 | 0.3992 | 0.0001 | 0.0384 | 0.0523 | 0.2628 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.287 | 0.9381 | 5350 | 2.1810 | 0.0273 | 0.0551 | 0.0242 | 0.0111 | 0.0304 | 0.0356 | 0.0685 | 0.1012 | 0.1025 | 0.0268 | 0.1058 | 0.1301 | 0.0 | 0.0 | 0.0617 | 0.4699 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1497 | 0.5681 | 0.0 | 0.0 | 0.1557 | 0.6619 | 0.0019 | 0.0067 | 0.0825 | 0.3506 | 0.0 | 0.0 | 0.2229 | 0.7565 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0088 | 0.1209 | 0.0388 | 0.237 | 0.0002 | 0.0202 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0028 | 0.0811 | 0.0 | 0.0 | 0.0022 | 0.0467 | 0.0 | 0.0 | 0.2562 | 0.4352 | 0.0067 | 0.0634 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0087 | 0.0475 | 0.0329 | 0.1463 | 0.0 | 0.0 | 0.1704 | 0.413 | 0.0001 | 0.0437 | 0.0537 | 0.2486 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.529 | 0.9469 | 5400 | 2.1900 | 0.0277 | 0.0552 | 0.0244 | 0.0134 | 0.0286 | 0.0358 | 0.0652 | 0.0982 | 0.0997 | 0.0281 | 0.0987 | 0.13 | 0.0 | 0.0 | 0.0658 | 0.4689 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1443 | 0.5516 | 0.0 | 0.0 | 0.1557 | 0.6506 | 0.001 | 0.0019 | 0.0792 | 0.3256 | 0.0 | 0.0 | 0.2318 | 0.7398 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0165 | 0.1155 | 0.0311 | 0.1562 | 0.0005 | 0.0349 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0022 | 0.0817 | 0.0 | 0.0 | 0.0038 | 0.0983 | 0.0 | 0.0 | 0.2752 | 0.4626 | 0.0092 | 0.0521 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0086 | 0.0327 | 0.0255 | 0.1343 | 0.0 | 0.0 | 0.1715 | 0.399 | 0.0002 | 0.0429 | 0.051 | 0.236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3909 | 0.9556 | 5450 | 2.1913 | 0.0279 | 0.0556 | 0.0243 | 0.0128 | 0.0275 | 0.0356 | 0.0637 | 0.0935 | 0.0948 | 0.0263 | 0.0968 | 0.1234 | 0.0 | 0.0 | 0.0553 | 0.4316 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1381 | 0.4819 | 0.0 | 0.0 | 0.1821 | 0.6542 | 0.002 | 0.0019 | 0.0654 | 0.3494 | 0.0 | 0.0 | 0.2732 | 0.7539 | 0.0 | 0.0 | 0.0 | 0.0 | 0.012 | 0.1031 | 0.0206 | 0.1096 | 0.0001 | 0.0083 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0634 | 0.0 | 0.0 | 0.0042 | 0.0892 | 0.0 | 0.0 | 0.268 | 0.453 | 0.0131 | 0.0897 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0039 | 0.0359 | 0.0133 | 0.0806 | 0.0 | 0.0 | 0.1805 | 0.3909 | 0.0001 | 0.0399 | 0.048 | 0.224 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4546 | 0.9644 | 5500 | 2.2053 | 0.0268 | 0.0544 | 0.0224 | 0.008 | 0.0273 | 0.0376 | 0.0618 | 0.0925 | 0.0937 | 0.022 | 0.0945 | 0.1217 | 0.0 | 0.0 | 0.0586 | 0.4504 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1431 | 0.506 | 0.0 | 0.0 | 0.1762 | 0.634 | 0.0003 | 0.0019 | 0.0642 | 0.3137 | 0.0 | 0.0 | 0.2602 | 0.7652 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0055 | 0.0829 | 0.033 | 0.1027 | 0.0001 | 0.0064 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0022 | 0.0701 | 0.0 | 0.0 | 0.0036 | 0.0942 | 0.0 | 0.0 | 0.233 | 0.4196 | 0.0108 | 0.0723 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0072 | 0.0488 | 0.0145 | 0.0843 | 0.0 | 0.0 | 0.1726 | 0.387 | 0.0001 | 0.0425 | 0.0455 | 0.2296 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0263 | 0.9732 | 5550 | 2.1659 | 0.0288 | 0.0567 | 0.0256 | 0.0111 | 0.0286 | 0.0379 | 0.0656 | 0.0966 | 0.098 | 0.0264 | 0.0986 | 0.1214 | 0.0 | 0.0 | 0.0545 | 0.4536 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1461 | 0.5187 | 0.0 | 0.0 | 0.2152 | 0.6401 | 0.0027 | 0.0162 | 0.0698 | 0.37 | 0.0 | 0.0 | 0.2707 | 0.7423 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0045 | 0.0853 | 0.023 | 0.1027 | 0.0003 | 0.0128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0036 | 0.1006 | 0.0 | 0.0 | 0.0018 | 0.0692 | 0.0 | 0.0 | 0.2728 | 0.458 | 0.0156 | 0.0836 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0065 | 0.0512 | 0.0229 | 0.1149 | 0.0 | 0.0 | 0.1689 | 0.4008 | 0.0001 | 0.0407 | 0.048 | 0.2467 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0187 | 0.9819 | 5600 | 2.1618 | 0.0292 | 0.0575 | 0.0259 | 0.0124 | 0.0303 | 0.0353 | 0.0658 | 0.0975 | 0.0986 | 0.027 | 0.0981 | 0.1271 | 0.0 | 0.0 | 0.0655 | 0.4684 | 0.0 | 0.0 | 0.0 | 0.0 | 0.151 | 0.517 | 0.0 | 0.0 | 0.2052 | 0.6628 | 0.0034 | 0.0114 | 0.0749 | 0.3594 | 0.0 | 0.0 | 0.2332 | 0.7362 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.093 | 0.0415 | 0.1233 | 0.0004 | 0.0128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0037 | 0.0884 | 0.0 | 0.0 | 0.0019 | 0.0842 | 0.0 | 0.0 | 0.2796 | 0.4662 | 0.0162 | 0.0695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0049 | 0.0401 | 0.0236 | 0.1179 | 0.0 | 0.0 | 0.1901 | 0.3975 | 0.0001 | 0.0424 | 0.0452 | 0.2454 | 0.0 | 0.0 | 0.0007 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2843 | 0.9907 | 5650 | 2.1621 | 0.0292 | 0.0579 | 0.0259 | 0.0106 | 0.0304 | 0.0371 | 0.0684 | 0.0991 | 0.1004 | 0.0279 | 0.1043 | 0.1265 | 0.0 | 0.0 | 0.0674 | 0.4701 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1536 | 0.4758 | 0.0 | 0.0 | 0.206 | 0.6446 | 0.0034 | 0.0143 | 0.0934 | 0.3819 | 0.0 | 0.0 | 0.2377 | 0.7661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0052 | 0.107 | 0.0267 | 0.1671 | 0.0002 | 0.0174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0047 | 0.0976 | 0.0 | 0.0 | 0.0028 | 0.0708 | 0.0 | 0.0 | 0.2495 | 0.433 | 0.0138 | 0.0695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0082 | 0.0562 | 0.03 | 0.15 | 0.0 | 0.0 | 0.1869 | 0.3975 | 0.0001 | 0.0405 | 0.0527 | 0.2554 | 0.0 | 0.0 | 0.0003 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3061 | 0.9995 | 5700 | 2.1518 | 0.03 | 0.0592 | 0.0261 | 0.0105 | 0.0302 | 0.0406 | 0.0689 | 0.0991 | 0.1006 | 0.0288 | 0.1075 | 0.121 | 0.0 | 0.0 | 0.061 | 0.4386 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1541 | 0.4434 | 0.0 | 0.0 | 0.2372 | 0.6423 | 0.0011 | 0.0067 | 0.073 | 0.3681 | 0.0 | 0.0 | 0.2791 | 0.7372 | 0.0 | 0.0 | 0.0 | 0.0 | 0.005 | 0.0977 | 0.0344 | 0.2041 | 0.0003 | 0.0229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0039 | 0.114 | 0.0 | 0.0 | 0.0072 | 0.1067 | 0.0 | 0.0 | 0.2573 | 0.4452 | 0.0115 | 0.0878 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0093 | 0.0673 | 0.0255 | 0.144 | 0.0 | 0.0 | 0.1682 | 0.3956 | 0.0001 | 0.0459 | 0.0508 | 0.2574 | 0.0 | 0.0 | 0.0004 | 0.0026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2847 | 1.0082 | 5750 | 2.1567 | 0.0294 | 0.0602 | 0.026 | 0.0098 | 0.0282 | 0.0416 | 0.0668 | 0.0974 | 0.0987 | 0.0255 | 0.107 | 0.1219 | 0.0 | 0.0 | 0.0565 | 0.4409 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1288 | 0.4247 | 0.0 | 0.0 | 0.224 | 0.6247 | 0.0013 | 0.0086 | 0.068 | 0.4013 | 0.0 | 0.0 | 0.2989 | 0.7348 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0088 | 0.1287 | 0.0481 | 0.1836 | 0.0002 | 0.0183 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0033 | 0.1189 | 0.0 | 0.0 | 0.0048 | 0.0942 | 0.0 | 0.0 | 0.2502 | 0.4321 | 0.0117 | 0.0723 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.0544 | 0.0266 | 0.1246 | 0.0 | 0.0 | 0.1657 | 0.3853 | 0.0001 | 0.0444 | 0.05 | 0.2506 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5527 | 1.0170 | 5800 | 2.1534 | 0.0288 | 0.0578 | 0.025 | 0.011 | 0.0294 | 0.0392 | 0.0681 | 0.1001 | 0.1016 | 0.0268 | 0.1052 | 0.1375 | 0.0 | 0.0 | 0.0676 | 0.4672 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1258 | 0.4703 | 0.0 | 0.0 | 0.1922 | 0.6272 | 0.0005 | 0.0057 | 0.0743 | 0.4094 | 0.0 | 0.0 | 0.2645 | 0.7156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0041 | 0.0915 | 0.0319 | 0.2027 | 0.0002 | 0.0193 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0029 | 0.1134 | 0.0 | 0.0 | 0.0068 | 0.1175 | 0.0 | 0.0 | 0.2701 | 0.4351 | 0.0101 | 0.0817 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.011 | 0.0438 | 0.0384 | 0.1642 | 0.0 | 0.0 | 0.1695 | 0.3968 | 0.0002 | 0.0506 | 0.0563 | 0.2602 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6725 | 1.0258 | 5850 | 2.1543 | 0.0282 | 0.058 | 0.0239 | 0.0093 | 0.0295 | 0.0388 | 0.0685 | 0.1003 | 0.1017 | 0.0223 | 0.1078 | 0.1361 | 0.0 | 0.0 | 0.0683 | 0.4479 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1438 | 0.5203 | 0.0 | 0.0 | 0.1877 | 0.6199 | 0.0042 | 0.0124 | 0.073 | 0.3762 | 0.0 | 0.0 | 0.2332 | 0.7372 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0181 | 0.1124 | 0.0418 | 0.1685 | 0.0014 | 0.0294 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0031 | 0.1043 | 0.0 | 0.0 | 0.0058 | 0.13 | 0.0 | 0.0 | 0.2568 | 0.4297 | 0.0088 | 0.0728 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0071 | 0.0521 | 0.0228 | 0.1776 | 0.0 | 0.0 | 0.1698 | 0.3892 | 0.0002 | 0.0483 | 0.0498 | 0.2438 | 0.0 | 0.0 | 0.0005 | 0.0057 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8928 | 1.0345 | 5900 | 2.1276 | 0.0307 | 0.06 | 0.0285 | 0.0114 | 0.0329 | 0.0403 | 0.0719 | 0.105 | 0.1065 | 0.0282 | 0.1128 | 0.1331 | 0.0 | 0.0 | 0.0634 | 0.4669 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1488 | 0.5577 | 0.0 | 0.0 | 0.208 | 0.6468 | 0.0001 | 0.0019 | 0.0764 | 0.4125 | 0.0 | 0.0 | 0.2666 | 0.7372 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0088 | 0.1132 | 0.0332 | 0.2041 | 0.0003 | 0.0193 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0034 | 0.0945 | 0.0 | 0.0 | 0.0078 | 0.1383 | 0.0 | 0.0 | 0.29 | 0.4645 | 0.0137 | 0.0864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0143 | 0.0673 | 0.0259 | 0.156 | 0.0 | 0.0 | 0.1933 | 0.4177 | 0.0002 | 0.0547 | 0.0562 | 0.2584 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3354 | 1.0433 | 5950 | 2.1253 | 0.0308 | 0.0603 | 0.0276 | 0.0132 | 0.0329 | 0.0412 | 0.0736 | 0.1077 | 0.1092 | 0.0312 | 0.1176 | 0.1363 | 0.0 | 0.0 | 0.0547 | 0.4826 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1284 | 0.5462 | 0.0 | 0.0 | 0.2132 | 0.6663 | 0.0005 | 0.0048 | 0.0706 | 0.4156 | 0.0 | 0.0 | 0.2871 | 0.7545 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0182 | 0.1357 | 0.0485 | 0.2466 | 0.0002 | 0.0138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0055 | 0.1189 | 0.0 | 0.0 | 0.0079 | 0.1192 | 0.0 | 0.0 | 0.2797 | 0.4489 | 0.0128 | 0.0911 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0145 | 0.0806 | 0.024 | 0.1716 | 0.0 | 0.0 | 0.1984 | 0.4229 | 0.0002 | 0.0511 | 0.0533 | 0.2526 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3447 | 1.0521 | 6000 | 2.1262 | 0.0308 | 0.0594 | 0.0286 | 0.0121 | 0.0324 | 0.0377 | 0.074 | 0.1079 | 0.1092 | 0.0291 | 0.1161 | 0.1406 | 0.0 | 0.0 | 0.0599 | 0.4727 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1453 | 0.5511 | 0.0 | 0.0 | 0.2069 | 0.6625 | 0.0 | 0.0 | 0.0788 | 0.4175 | 0.0 | 0.0 | 0.2718 | 0.765 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0154 | 0.1008 | 0.0349 | 0.2534 | 0.0002 | 0.0064 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0042 | 0.1085 | 0.0 | 0.0 | 0.0078 | 0.1183 | 0.0 | 0.0 | 0.2921 | 0.4665 | 0.0103 | 0.0779 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0128 | 0.0825 | 0.0236 | 0.2119 | 0.0 | 0.0 | 0.1977 | 0.4299 | 0.0001 | 0.0444 | 0.0565 | 0.2552 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3796 | 1.0608 | 6050 | 2.1284 | 0.0316 | 0.0607 | 0.0291 | 0.0138 | 0.0343 | 0.0412 | 0.0749 | 0.1086 | 0.11 | 0.036 | 0.1175 | 0.1417 | 0.0 | 0.0 | 0.0533 | 0.4754 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1478 | 0.5341 | 0.0 | 0.0 | 0.2302 | 0.6436 | 0.0 | 0.0 | 0.0717 | 0.4231 | 0.0 | 0.0 | 0.2734 | 0.7722 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0145 | 0.1333 | 0.0471 | 0.274 | 0.0003 | 0.0183 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0069 | 0.136 | 0.0 | 0.0 | 0.0075 | 0.0958 | 0.0 | 0.0 | 0.2932 | 0.4732 | 0.0095 | 0.084 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0088 | 0.0691 | 0.0332 | 0.1679 | 0.0 | 0.0 | 0.203 | 0.4333 | 0.0001 | 0.0457 | 0.0544 | 0.2792 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9545 | 1.0696 | 6100 | 2.1257 | 0.0316 | 0.0623 | 0.0285 | 0.0139 | 0.0343 | 0.0423 | 0.0739 | 0.1084 | 0.1097 | 0.0316 | 0.1181 | 0.1413 | 0.0 | 0.0 | 0.0619 | 0.4767 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1336 | 0.5247 | 0.0 | 0.0 | 0.2104 | 0.6439 | 0.0017 | 0.0048 | 0.0853 | 0.4075 | 0.0 | 0.0 | 0.2867 | 0.7689 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0193 | 0.1194 | 0.0516 | 0.2877 | 0.0005 | 0.0321 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.1079 | 0.0 | 0.0 | 0.0092 | 0.125 | 0.0 | 0.0 | 0.2895 | 0.453 | 0.0077 | 0.0986 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0105 | 0.0705 | 0.0251 | 0.1933 | 0.0 | 0.0 | 0.1927 | 0.4193 | 0.0001 | 0.0377 | 0.0587 | 0.2762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3951 | 1.0784 | 6150 | 2.1307 | 0.033 | 0.0637 | 0.0301 | 0.0141 | 0.0343 | 0.0439 | 0.0745 | 0.1082 | 0.1095 | 0.033 | 0.1197 | 0.1365 | 0.0 | 0.0 | 0.0594 | 0.4725 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1118 | 0.5203 | 0.0 | 0.0 | 0.2504 | 0.6359 | 0.002 | 0.0038 | 0.0836 | 0.4025 | 0.0 | 0.0 | 0.3144 | 0.7579 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0218 | 0.1287 | 0.0592 | 0.2712 | 0.0007 | 0.033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.009 | 0.1091 | 0.0 | 0.0 | 0.0076 | 0.12 | 0.0 | 0.0 | 0.2939 | 0.4603 | 0.0137 | 0.0869 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0175 | 0.1092 | 0.021 | 0.1776 | 0.0 | 0.0 | 0.1956 | 0.4322 | 0.0001 | 0.0362 | 0.0544 | 0.2755 | 0.0 | 0.0 | 0.0003 | 0.0026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4454 | 1.0871 | 6200 | 2.1275 | 0.0324 | 0.0628 | 0.0296 | 0.0125 | 0.0345 | 0.046 | 0.0739 | 0.1076 | 0.1088 | 0.0306 | 0.1165 | 0.1454 | 0.0 | 0.0 | 0.074 | 0.4703 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1483 | 0.5637 | 0.0 | 0.0 | 0.2274 | 0.6554 | 0.0 | 0.0 | 0.0904 | 0.4263 | 0.0 | 0.0 | 0.2861 | 0.7563 | 0.0 | 0.0 | 0.0 | 0.0 | 0.018 | 0.1 | 0.0541 | 0.2041 | 0.0008 | 0.0147 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0033 | 0.0976 | 0.0 | 0.0 | 0.0076 | 0.135 | 0.0 | 0.0 | 0.2857 | 0.4623 | 0.0109 | 0.084 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0139 | 0.1147 | 0.0235 | 0.1881 | 0.0 | 0.0 | 0.1916 | 0.4139 | 0.0001 | 0.0394 | 0.0557 | 0.2802 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0004 | 1.0959 | 6250 | 2.1419 | 0.0311 | 0.0599 | 0.0288 | 0.0133 | 0.0345 | 0.0418 | 0.0725 | 0.1035 | 0.1046 | 0.0288 | 0.1131 | 0.1385 | 0.0 | 0.0 | 0.0677 | 0.4684 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1473 | 0.5104 | 0.0 | 0.0 | 0.2277 | 0.6484 | 0.0009 | 0.0067 | 0.0708 | 0.4225 | 0.0 | 0.0 | 0.2847 | 0.7656 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0111 | 0.0868 | 0.0416 | 0.2068 | 0.0006 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0044 | 0.0976 | 0.0 | 0.0 | 0.0048 | 0.1175 | 0.0 | 0.0 | 0.2776 | 0.461 | 0.0087 | 0.0953 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0075 | 0.0627 | 0.0219 | 0.1239 | 0.0 | 0.0 | 0.1982 | 0.4168 | 0.0001 | 0.0384 | 0.0564 | 0.2537 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2314 | 1.1047 | 6300 | 2.1156 | 0.0326 | 0.0622 | 0.0301 | 0.0126 | 0.0335 | 0.0466 | 0.0739 | 0.1065 | 0.1079 | 0.0314 | 0.1151 | 0.1345 | 0.0 | 0.0 | 0.0661 | 0.4841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1673 | 0.5242 | 0.0 | 0.0 | 0.2156 | 0.6689 | 0.0008 | 0.0067 | 0.0895 | 0.4344 | 0.0 | 0.0 | 0.2855 | 0.7756 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0166 | 0.1 | 0.0463 | 0.2219 | 0.0006 | 0.0229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0042 | 0.0976 | 0.0 | 0.0 | 0.0073 | 0.1492 | 0.0 | 0.0 | 0.2771 | 0.4582 | 0.0118 | 0.0991 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0124 | 0.0682 | 0.0202 | 0.1276 | 0.0 | 0.0 | 0.2127 | 0.4199 | 0.0001 | 0.0479 | 0.0641 | 0.2582 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3041 | 1.1134 | 6350 | 2.1205 | 0.0321 | 0.0637 | 0.0298 | 0.0117 | 0.0333 | 0.0501 | 0.074 | 0.1087 | 0.1102 | 0.0324 | 0.1167 | 0.1395 | 0.0 | 0.0 | 0.0717 | 0.485 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1584 | 0.583 | 0.0 | 0.0 | 0.2246 | 0.6529 | 0.0 | 0.0 | 0.0916 | 0.4044 | 0.0 | 0.0 | 0.2552 | 0.7569 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0197 | 0.1403 | 0.0549 | 0.2178 | 0.0003 | 0.0101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0047 | 0.1128 | 0.0 | 0.0 | 0.0103 | 0.145 | 0.0 | 0.0 | 0.2611 | 0.4476 | 0.013 | 0.1061 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0176 | 0.0986 | 0.026 | 0.1664 | 0.0 | 0.0 | 0.2017 | 0.4179 | 0.0002 | 0.0496 | 0.0645 | 0.2705 | 0.0 | 0.0 | 0.0001 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2943 | 1.1222 | 6400 | 2.1115 | 0.0323 | 0.0627 | 0.03 | 0.0125 | 0.0335 | 0.0476 | 0.0753 | 0.108 | 0.1095 | 0.0308 | 0.1161 | 0.1382 | 0.0 | 0.0 | 0.0692 | 0.4864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1675 | 0.5467 | 0.0 | 0.0 | 0.2264 | 0.6513 | 0.0044 | 0.021 | 0.084 | 0.4281 | 0.0 | 0.0 | 0.2572 | 0.7573 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0186 | 0.1504 | 0.0534 | 0.2219 | 0.0003 | 0.011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0032 | 0.103 | 0.0 | 0.0 | 0.0072 | 0.1242 | 0.0 | 0.0 | 0.2882 | 0.4654 | 0.0126 | 0.1117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0147 | 0.0747 | 0.0274 | 0.1478 | 0.0 | 0.0 | 0.1974 | 0.4287 | 0.0002 | 0.0502 | 0.0523 | 0.2554 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2894 | 1.1310 | 6450 | 2.1048 | 0.0324 | 0.0629 | 0.03 | 0.012 | 0.0344 | 0.0441 | 0.0751 | 0.1084 | 0.1098 | 0.0299 | 0.1153 | 0.1377 | 0.0 | 0.0 | 0.0721 | 0.4767 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1755 | 0.5692 | 0.0 | 0.0 | 0.2248 | 0.6452 | 0.0023 | 0.0105 | 0.0823 | 0.4169 | 0.0 | 0.0 | 0.2573 | 0.7669 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0184 | 0.1605 | 0.0451 | 0.2767 | 0.0003 | 0.011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0027 | 0.0957 | 0.0 | 0.0 | 0.0066 | 0.1175 | 0.0 | 0.0 | 0.2817 | 0.4644 | 0.0154 | 0.0873 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0114 | 0.0622 | 0.0317 | 0.1634 | 0.0 | 0.0 | 0.1986 | 0.4304 | 0.0001 | 0.0429 | 0.0657 | 0.2552 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5445 | 1.1398 | 6500 | 2.0931 | 0.0336 | 0.0643 | 0.0315 | 0.0128 | 0.0347 | 0.0477 | 0.0749 | 0.1084 | 0.1099 | 0.0353 | 0.1162 | 0.143 | 0.0 | 0.0 | 0.0637 | 0.4691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1687 | 0.5159 | 0.0 | 0.0 | 0.2607 | 0.6478 | 0.0029 | 0.0152 | 0.081 | 0.4044 | 0.0 | 0.0 | 0.2721 | 0.7703 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0169 | 0.1527 | 0.0571 | 0.2753 | 0.0004 | 0.0239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0032 | 0.1134 | 0.0 | 0.0 | 0.0054 | 0.1292 | 0.0 | 0.0 | 0.2917 | 0.4734 | 0.0117 | 0.1174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0104 | 0.0553 | 0.0325 | 0.147 | 0.0 | 0.0 | 0.2039 | 0.438 | 0.0001 | 0.041 | 0.0635 | 0.2658 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1373 | 1.1485 | 6550 | 2.1068 | 0.0325 | 0.0633 | 0.0305 | 0.0114 | 0.0331 | 0.0464 | 0.0732 | 0.106 | 0.1074 | 0.0324 | 0.115 | 0.1396 | 0.0 | 0.0 | 0.0668 | 0.4814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.152 | 0.4566 | 0.0 | 0.0 | 0.2523 | 0.641 | 0.0019 | 0.0086 | 0.0987 | 0.3975 | 0.0 | 0.0 | 0.2611 | 0.7437 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0181 | 0.1729 | 0.0505 | 0.3329 | 0.0003 | 0.0266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0029 | 0.0933 | 0.0 | 0.0 | 0.0063 | 0.1183 | 0.0 | 0.0 | 0.2551 | 0.4298 | 0.0102 | 0.1056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0073 | 0.0401 | 0.0327 | 0.1478 | 0.0 | 0.0 | 0.212 | 0.4357 | 0.0002 | 0.0455 | 0.0668 | 0.2645 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0899 | 1.1573 | 6600 | 2.0957 | 0.034 | 0.0656 | 0.0312 | 0.0135 | 0.033 | 0.0488 | 0.0724 | 0.1052 | 0.1065 | 0.0347 | 0.1157 | 0.1338 | 0.0 | 0.0 | 0.0635 | 0.461 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1629 | 0.4984 | 0.0 | 0.0 | 0.2725 | 0.6202 | 0.0018 | 0.0095 | 0.0818 | 0.39 | 0.0 | 0.0 | 0.293 | 0.7585 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0181 | 0.162 | 0.0587 | 0.2767 | 0.0002 | 0.0321 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.0927 | 0.0 | 0.0 | 0.0072 | 0.1275 | 0.0 | 0.0 | 0.2779 | 0.4462 | 0.0092 | 0.1056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0098 | 0.0465 | 0.0288 | 0.1284 | 0.0 | 0.0 | 0.2055 | 0.4348 | 0.0001 | 0.0438 | 0.0679 | 0.2651 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6531 | 1.1661 | 6650 | 2.1063 | 0.0335 | 0.0668 | 0.0299 | 0.0135 | 0.0331 | 0.0485 | 0.0734 | 0.1047 | 0.106 | 0.032 | 0.116 | 0.1372 | 0.0 | 0.0 | 0.0699 | 0.4593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1374 | 0.4615 | 0.0 | 0.0 | 0.2721 | 0.6176 | 0.0019 | 0.0086 | 0.0909 | 0.3981 | 0.0 | 0.0 | 0.2954 | 0.7498 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0238 | 0.1643 | 0.0512 | 0.2575 | 0.0003 | 0.0376 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0039 | 0.111 | 0.0 | 0.0 | 0.0053 | 0.1083 | 0.0 | 0.0 | 0.2758 | 0.4511 | 0.0128 | 0.1052 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0114 | 0.0622 | 0.03 | 0.1545 | 0.0 | 0.0 | 0.1973 | 0.4149 | 0.0001 | 0.0431 | 0.06 | 0.2688 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3924 | 1.1748 | 6700 | 2.1004 | 0.034 | 0.0665 | 0.0309 | 0.0138 | 0.0331 | 0.0496 | 0.0739 | 0.1061 | 0.1074 | 0.0337 | 0.1163 | 0.1334 | 0.0 | 0.0 | 0.0705 | 0.4604 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1445 | 0.4593 | 0.0 | 0.0 | 0.261 | 0.6138 | 0.0031 | 0.0143 | 0.0898 | 0.395 | 0.0 | 0.0 | 0.2917 | 0.7516 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0274 | 0.1946 | 0.059 | 0.2712 | 0.0003 | 0.0367 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0044 | 0.1037 | 0.0 | 0.0 | 0.0064 | 0.1167 | 0.0 | 0.0 | 0.29 | 0.4675 | 0.0083 | 0.1056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0129 | 0.0558 | 0.0306 | 0.1552 | 0.0 | 0.0 | 0.1988 | 0.4177 | 0.0001 | 0.0457 | 0.0655 | 0.2735 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8972 | 1.1836 | 6750 | 2.1142 | 0.0321 | 0.0643 | 0.0287 | 0.0131 | 0.0334 | 0.047 | 0.0721 | 0.1037 | 0.105 | 0.031 | 0.1114 | 0.1331 | 0.0 | 0.0 | 0.078 | 0.4627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1532 | 0.4846 | 0.0 | 0.0 | 0.2444 | 0.6125 | 0.003 | 0.0248 | 0.1033 | 0.3794 | 0.0 | 0.0 | 0.2562 | 0.7348 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0148 | 0.1364 | 0.0433 | 0.2836 | 0.0002 | 0.0248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0062 | 0.1073 | 0.0 | 0.0 | 0.0055 | 0.0992 | 0.0 | 0.0 | 0.259 | 0.4448 | 0.0134 | 0.1146 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0052 | 0.0304 | 0.024 | 0.1604 | 0.0 | 0.0 | 0.1982 | 0.4143 | 0.0001 | 0.0399 | 0.0672 | 0.2772 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0404 | 1.1924 | 6800 | 2.0965 | 0.0338 | 0.0658 | 0.0307 | 0.0123 | 0.0338 | 0.0481 | 0.0795 | 0.1131 | 0.1143 | 0.0334 | 0.1217 | 0.144 | 0.0 | 0.0 | 0.0789 | 0.4877 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.5363 | 0.0 | 0.0 | 0.2355 | 0.6446 | 0.0017 | 0.0114 | 0.0955 | 0.4494 | 0.0 | 0.0 | 0.2908 | 0.736 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0209 | 0.1806 | 0.0448 | 0.2849 | 0.0003 | 0.0294 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0065 | 0.1415 | 0.0 | 0.0 | 0.008 | 0.1483 | 0.0 | 0.0 | 0.3004 | 0.4678 | 0.0114 | 0.1 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0151 | 0.0912 | 0.0311 | 0.2119 | 0.0 | 0.0 | 0.2047 | 0.4183 | 0.0002 | 0.0425 | 0.0677 | 0.2759 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7764 | 1.2011 | 6850 | 2.0876 | 0.0343 | 0.0658 | 0.0314 | 0.0134 | 0.0345 | 0.0481 | 0.0789 | 0.1131 | 0.1143 | 0.0359 | 0.1222 | 0.1353 | 0.0 | 0.0 | 0.0765 | 0.4911 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1476 | 0.533 | 0.0 | 0.0 | 0.2651 | 0.6391 | 0.0008 | 0.0038 | 0.1018 | 0.4325 | 0.0 | 0.0 | 0.2775 | 0.7512 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0223 | 0.186 | 0.0434 | 0.3274 | 0.0003 | 0.0404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.1293 | 0.0 | 0.0 | 0.0051 | 0.1117 | 0.0 | 0.0 | 0.3015 | 0.4811 | 0.0085 | 0.084 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.013 | 0.0972 | 0.0276 | 0.2067 | 0.0 | 0.0 | 0.2102 | 0.4234 | 0.0001 | 0.0394 | 0.0666 | 0.2794 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.373 | 1.2099 | 6900 | 2.0944 | 0.0343 | 0.0657 | 0.0312 | 0.0138 | 0.0341 | 0.0475 | 0.0788 | 0.1139 | 0.1152 | 0.0375 | 0.125 | 0.1375 | 0.0 | 0.0 | 0.0683 | 0.4947 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1306 | 0.5 | 0.0 | 0.0 | 0.2694 | 0.6497 | 0.0005 | 0.0019 | 0.1059 | 0.4531 | 0.0 | 0.0 | 0.2864 | 0.7551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0215 | 0.2101 | 0.0507 | 0.3507 | 0.0004 | 0.045 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0012 | 0.0074 | 0.1555 | 0.0 | 0.0 | 0.0043 | 0.1033 | 0.0 | 0.0 | 0.3039 | 0.4749 | 0.0088 | 0.0751 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.009 | 0.0848 | 0.0343 | 0.2 | 0.0 | 0.0 | 0.2072 | 0.4169 | 0.0002 | 0.0409 | 0.0673 | 0.285 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2988 | 1.2187 | 6950 | 2.0902 | 0.0346 | 0.0665 | 0.0323 | 0.0138 | 0.0347 | 0.0499 | 0.0791 | 0.1152 | 0.1164 | 0.0334 | 0.1244 | 0.1421 | 0.0 | 0.0 | 0.0777 | 0.5072 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1498 | 0.5566 | 0.0 | 0.0 | 0.2526 | 0.6583 | 0.0011 | 0.0086 | 0.1072 | 0.4563 | 0.0 | 0.0 | 0.2711 | 0.7555 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0304 | 0.1961 | 0.0612 | 0.3342 | 0.0005 | 0.0358 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.153 | 0.0 | 0.0 | 0.0054 | 0.1167 | 0.0 | 0.0 | 0.2994 | 0.4682 | 0.0108 | 0.0756 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0114 | 0.0876 | 0.0238 | 0.1978 | 0.0 | 0.0 | 0.2109 | 0.4193 | 0.0001 | 0.039 | 0.0687 | 0.2893 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9452 | 1.2274 | 7000 | 2.0789 | 0.035 | 0.0667 | 0.0324 | 0.0143 | 0.0347 | 0.0496 | 0.0803 | 0.1147 | 0.116 | 0.036 | 0.1251 | 0.1409 | 0.0 | 0.0 | 0.0741 | 0.522 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1408 | 0.5555 | 0.0 | 0.0 | 0.2483 | 0.6715 | 0.0021 | 0.0171 | 0.0997 | 0.4506 | 0.0 | 0.0 | 0.3015 | 0.7563 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0264 | 0.1721 | 0.0575 | 0.3205 | 0.0006 | 0.0294 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0108 | 0.1482 | 0.0 | 0.0 | 0.0074 | 0.0975 | 0.0 | 0.0 | 0.3074 | 0.4745 | 0.0069 | 0.0887 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0144 | 0.0954 | 0.0267 | 0.1888 | 0.0 | 0.0 | 0.2131 | 0.424 | 0.0001 | 0.0409 | 0.0725 | 0.2837 | 0.0 | 0.0 | 0.0001 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7197 | 1.2362 | 7050 | 2.0809 | 0.0333 | 0.0639 | 0.0302 | 0.0143 | 0.0344 | 0.0454 | 0.0778 | 0.1142 | 0.1155 | 0.0354 | 0.126 | 0.1389 | 0.0 | 0.0 | 0.0659 | 0.5034 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1479 | 0.5621 | 0.0 | 0.0 | 0.239 | 0.6625 | 0.0017 | 0.0124 | 0.0873 | 0.4425 | 0.0 | 0.0 | 0.2734 | 0.7732 | 0.0 | 0.0 | 0.0 | 0.0 | 0.018 | 0.1837 | 0.0595 | 0.3082 | 0.0006 | 0.0257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0056 | 0.1317 | 0.0 | 0.0 | 0.0065 | 0.1242 | 0.0 | 0.0 | 0.293 | 0.4677 | 0.009 | 0.1052 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0122 | 0.076 | 0.028 | 0.1881 | 0.0 | 0.0 | 0.2123 | 0.4245 | 0.0002 | 0.0455 | 0.07 | 0.2752 | 0.0 | 0.0 | 0.0002 | 0.0026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0763 | 1.2450 | 7100 | 2.0765 | 0.0334 | 0.0645 | 0.0308 | 0.0141 | 0.0351 | 0.0456 | 0.0774 | 0.1125 | 0.114 | 0.035 | 0.122 | 0.1424 | 0.0 | 0.0 | 0.0783 | 0.5167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1545 | 0.5357 | 0.0 | 0.0 | 0.2464 | 0.6615 | 0.0059 | 0.0257 | 0.091 | 0.4356 | 0.0 | 0.0 | 0.2627 | 0.7502 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.1527 | 0.0534 | 0.3384 | 0.0003 | 0.0266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0025 | 0.0057 | 0.139 | 0.0 | 0.0 | 0.0055 | 0.1017 | 0.0 | 0.0 | 0.3019 | 0.4788 | 0.0075 | 0.1099 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.0604 | 0.0281 | 0.1694 | 0.0 | 0.0 | 0.2133 | 0.4193 | 0.0002 | 0.0522 | 0.065 | 0.2667 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8273 | 1.2537 | 7150 | 2.0701 | 0.0336 | 0.0643 | 0.031 | 0.0134 | 0.0345 | 0.0471 | 0.0776 | 0.1134 | 0.1146 | 0.0356 | 0.1212 | 0.1388 | 0.0 | 0.0 | 0.0788 | 0.514 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1462 | 0.5302 | 0.0 | 0.0 | 0.2461 | 0.6622 | 0.0067 | 0.0276 | 0.0904 | 0.4375 | 0.0 | 0.0 | 0.2737 | 0.7516 | 0.0 | 0.0 | 0.0 | 0.0 | 0.01 | 0.1504 | 0.049 | 0.3726 | 0.0006 | 0.0339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.006 | 0.1488 | 0.0 | 0.0 | 0.0048 | 0.1008 | 0.0 | 0.0 | 0.303 | 0.4742 | 0.0067 | 0.0967 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0073 | 0.0571 | 0.0305 | 0.1769 | 0.0 | 0.0 | 0.2158 | 0.4237 | 0.0002 | 0.0397 | 0.0696 | 0.2701 | 0.0 | 0.0 | 0.0001 | 0.0036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5652 | 1.2625 | 7200 | 2.0800 | 0.0338 | 0.0644 | 0.0306 | 0.0138 | 0.0353 | 0.0506 | 0.0795 | 0.1165 | 0.1178 | 0.0356 | 0.1245 | 0.1474 | 0.0 | 0.0 | 0.0821 | 0.5206 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1448 | 0.5555 | 0.0 | 0.0 | 0.2237 | 0.6782 | 0.0016 | 0.0124 | 0.1057 | 0.4556 | 0.0 | 0.0 | 0.2802 | 0.762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0125 | 0.1744 | 0.0529 | 0.3808 | 0.0005 | 0.0367 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0114 | 0.1396 | 0.0 | 0.0 | 0.0054 | 0.1092 | 0.0 | 0.0 | 0.3076 | 0.4806 | 0.0087 | 0.1047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0114 | 0.094 | 0.0237 | 0.1791 | 0.0 | 0.0 | 0.2126 | 0.4193 | 0.0002 | 0.0399 | 0.0705 | 0.2743 | 0.0 | 0.0 | 0.0001 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.202 | 1.2713 | 7250 | 2.0764 | 0.0356 | 0.0686 | 0.0331 | 0.0173 | 0.0359 | 0.0508 | 0.0822 | 0.1202 | 0.1215 | 0.0375 | 0.1341 | 0.1462 | 0.0 | 0.0 | 0.0747 | 0.5089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1454 | 0.5555 | 0.0 | 0.0 | 0.2254 | 0.6657 | 0.0044 | 0.0267 | 0.1109 | 0.4669 | 0.0 | 0.0 | 0.2854 | 0.7821 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0401 | 0.224 | 0.0846 | 0.3384 | 0.0005 | 0.0541 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0013 | 0.0025 | 0.0066 | 0.1451 | 0.0 | 0.0 | 0.003 | 0.1133 | 0.0 | 0.0 | 0.3127 | 0.4793 | 0.0118 | 0.1056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0156 | 0.1452 | 0.0234 | 0.2187 | 0.0 | 0.0 | 0.2217 | 0.4254 | 0.0002 | 0.0461 | 0.07 | 0.2844 | 0.0 | 0.0 | 0.0001 | 0.001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2662 | 1.2800 | 7300 | 2.0737 | 0.035 | 0.067 | 0.0327 | 0.0128 | 0.0358 | 0.0505 | 0.0787 | 0.1165 | 0.118 | 0.0356 | 0.1288 | 0.1454 | 0.0 | 0.0 | 0.0804 | 0.507 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1415 | 0.5489 | 0.0 | 0.0 | 0.2251 | 0.6577 | 0.0029 | 0.0238 | 0.1166 | 0.4569 | 0.0 | 0.0 | 0.2954 | 0.762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0257 | 0.2256 | 0.0697 | 0.2959 | 0.0005 | 0.0495 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0063 | 0.139 | 0.0 | 0.0 | 0.0039 | 0.1017 | 0.0 | 0.0 | 0.311 | 0.484 | 0.0111 | 0.115 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0119 | 0.1028 | 0.0227 | 0.2022 | 0.0 | 0.0 | 0.2148 | 0.4142 | 0.0002 | 0.0474 | 0.0719 | 0.2925 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9212 | 1.2888 | 7350 | 2.0606 | 0.0361 | 0.0688 | 0.0337 | 0.0157 | 0.036 | 0.0558 | 0.0783 | 0.1154 | 0.1167 | 0.0369 | 0.1246 | 0.1468 | 0.0 | 0.0 | 0.082 | 0.514 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1342 | 0.5143 | 0.0 | 0.0 | 0.2544 | 0.6708 | 0.0012 | 0.0086 | 0.1015 | 0.4419 | 0.0 | 0.0 | 0.3102 | 0.7728 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0314 | 0.2178 | 0.0784 | 0.3027 | 0.0004 | 0.033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0037 | 0.0053 | 0.125 | 0.0 | 0.0 | 0.0076 | 0.1233 | 0.0 | 0.0 | 0.3155 | 0.4909 | 0.0119 | 0.1169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0099 | 0.0811 | 0.0204 | 0.1821 | 0.0 | 0.0 | 0.2234 | 0.4229 | 0.0002 | 0.0535 | 0.0738 | 0.2894 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4398 | 1.2976 | 7400 | 2.0734 | 0.0354 | 0.0678 | 0.0334 | 0.0146 | 0.0351 | 0.0559 | 0.0783 | 0.1137 | 0.1149 | 0.0359 | 0.1232 | 0.1461 | 0.0 | 0.0 | 0.0748 | 0.5068 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1481 | 0.5148 | 0.0 | 0.0 | 0.2523 | 0.6635 | 0.0021 | 0.0086 | 0.0908 | 0.4431 | 0.0 | 0.0 | 0.3035 | 0.7687 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0288 | 0.2093 | 0.0789 | 0.2808 | 0.0007 | 0.0349 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0025 | 0.0059 | 0.1421 | 0.0 | 0.0 | 0.0044 | 0.1033 | 0.0 | 0.0 | 0.3101 | 0.4814 | 0.0091 | 0.1113 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0141 | 0.0972 | 0.0282 | 0.1858 | 0.0 | 0.0 | 0.21 | 0.422 | 0.0002 | 0.0468 | 0.0669 | 0.2604 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7504 | 1.3063 | 7450 | 2.0585 | 0.0358 | 0.068 | 0.0333 | 0.0161 | 0.0355 | 0.0552 | 0.0783 | 0.1144 | 0.1156 | 0.0378 | 0.1265 | 0.1426 | 0.0 | 0.0 | 0.0719 | 0.5102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1495 | 0.5192 | 0.0 | 0.0 | 0.2698 | 0.6689 | 0.0003 | 0.0038 | 0.0928 | 0.4588 | 0.0 | 0.0 | 0.3127 | 0.7713 | 0.0 | 0.0 | 0.0 | 0.0 | 0.029 | 0.2116 | 0.0631 | 0.2918 | 0.0002 | 0.0211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0053 | 0.1402 | 0.0 | 0.0 | 0.0049 | 0.09 | 0.0 | 0.0 | 0.3072 | 0.4732 | 0.0099 | 0.1146 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0192 | 0.1134 | 0.0252 | 0.1716 | 0.0 | 0.0 | 0.2119 | 0.4273 | 0.0002 | 0.0511 | 0.073 | 0.2803 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2353 | 1.3151 | 7500 | 2.0669 | 0.0344 | 0.066 | 0.0316 | 0.0157 | 0.0338 | 0.0531 | 0.0764 | 0.1123 | 0.1137 | 0.0381 | 0.1228 | 0.1398 | 0.0 | 0.0 | 0.0712 | 0.4998 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1534 | 0.5071 | 0.0 | 0.0 | 0.2517 | 0.6612 | 0.0002 | 0.0019 | 0.0904 | 0.4663 | 0.0 | 0.0 | 0.3037 | 0.7699 | 0.0 | 0.0 | 0.0 | 0.0 | 0.027 | 0.2163 | 0.0545 | 0.2616 | 0.0002 | 0.0119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0041 | 0.1329 | 0.0 | 0.0 | 0.0058 | 0.11 | 0.0 | 0.0 | 0.3039 | 0.4732 | 0.0079 | 0.1089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0191 | 0.112 | 0.0258 | 0.1657 | 0.0 | 0.0 | 0.1894 | 0.4035 | 0.0002 | 0.0549 | 0.0722 | 0.2692 | 0.0 | 0.0 | 0.0002 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.5166 | 1.3239 | 7550 | 2.0691 | 0.0349 | 0.0668 | 0.032 | 0.0134 | 0.0356 | 0.0546 | 0.0763 | 0.1112 | 0.1125 | 0.0367 | 0.1188 | 0.1437 | 0.0 | 0.0 | 0.0809 | 0.5225 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1432 | 0.5027 | 0.0 | 0.0 | 0.2653 | 0.6798 | 0.0006 | 0.0057 | 0.1002 | 0.4469 | 0.0 | 0.0 | 0.2979 | 0.7504 | 0.0 | 0.0 | 0.0 | 0.0 | 0.017 | 0.1907 | 0.053 | 0.2493 | 0.0012 | 0.0239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0012 | 0.0062 | 0.1396 | 0.0 | 0.0 | 0.0054 | 0.105 | 0.0 | 0.0 | 0.2974 | 0.4655 | 0.0146 | 0.1164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0138 | 0.0853 | 0.024 | 0.15 | 0.0 | 0.0 | 0.2091 | 0.4176 | 0.0002 | 0.0528 | 0.0738 | 0.2701 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5791 | 1.3326 | 7600 | 2.0577 | 0.0357 | 0.0677 | 0.033 | 0.0144 | 0.0357 | 0.0545 | 0.0779 | 0.1117 | 0.1128 | 0.0376 | 0.1198 | 0.1452 | 0.0 | 0.0 | 0.0733 | 0.5138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1383 | 0.5165 | 0.0 | 0.0 | 0.2832 | 0.6679 | 0.0002 | 0.0029 | 0.0997 | 0.4412 | 0.0 | 0.0 | 0.3019 | 0.7512 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0193 | 0.1992 | 0.0651 | 0.2644 | 0.0003 | 0.0174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0041 | 0.1482 | 0.0 | 0.0 | 0.0066 | 0.0958 | 0.0 | 0.0 | 0.306 | 0.4758 | 0.0146 | 0.1254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0153 | 0.0945 | 0.0218 | 0.1306 | 0.0 | 0.0 | 0.2171 | 0.4202 | 0.0002 | 0.0491 | 0.0768 | 0.2769 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8534 | 1.3414 | 7650 | 2.0632 | 0.0352 | 0.067 | 0.0328 | 0.014 | 0.037 | 0.0532 | 0.0771 | 0.1112 | 0.1125 | 0.0364 | 0.1178 | 0.1436 | 0.0 | 0.0 | 0.0778 | 0.5059 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1301 | 0.5099 | 0.0 | 0.0 | 0.2802 | 0.6724 | 0.0005 | 0.0076 | 0.1024 | 0.4481 | 0.0 | 0.0 | 0.2832 | 0.7443 | 0.0 | 0.0 | 0.0 | 0.0 | 0.012 | 0.1775 | 0.0577 | 0.2712 | 0.0004 | 0.0266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0045 | 0.1512 | 0.0 | 0.0 | 0.0079 | 0.0925 | 0.0 | 0.0 | 0.303 | 0.4711 | 0.0151 | 0.1174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0147 | 0.0756 | 0.0283 | 0.1396 | 0.0 | 0.0 | 0.2248 | 0.4299 | 0.0002 | 0.0479 | 0.0785 | 0.285 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0 | 0.0 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0372 | 1.3502 | 7700 | 2.0622 | 0.0362 | 0.0686 | 0.034 | 0.0148 | 0.0375 | 0.0539 | 0.0795 | 0.1144 | 0.1157 | 0.0368 | 0.1227 | 0.1427 | 0.0 | 0.0 | 0.0791 | 0.5186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1405 | 0.5269 | 0.0 | 0.0 | 0.2762 | 0.6747 | 0.0046 | 0.0248 | 0.1139 | 0.4437 | 0.0 | 0.0 | 0.2869 | 0.7675 | 0.0 | 0.0 | 0.0 | 0.0 | 0.022 | 0.1667 | 0.054 | 0.2904 | 0.0036 | 0.0376 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0053 | 0.1445 | 0.0 | 0.0 | 0.0071 | 0.1167 | 0.0 | 0.0 | 0.3176 | 0.4869 | 0.0145 | 0.1089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.017 | 0.0945 | 0.026 | 0.156 | 0.0 | 0.0 | 0.2234 | 0.4359 | 0.0002 | 0.0431 | 0.0749 | 0.2829 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2954 | 1.3589 | 7750 | 2.0489 | 0.0366 | 0.07 | 0.0342 | 0.0142 | 0.0379 | 0.0566 | 0.0788 | 0.1147 | 0.116 | 0.0374 | 0.126 | 0.1471 | 0.0 | 0.0 | 0.0784 | 0.5097 | 0.0 | 0.0 | 0.0 | 0.0 | 0.156 | 0.5302 | 0.0 | 0.0 | 0.2802 | 0.6679 | 0.0019 | 0.0152 | 0.1047 | 0.4469 | 0.0 | 0.0 | 0.3061 | 0.7831 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0259 | 0.1922 | 0.0602 | 0.274 | 0.0016 | 0.0413 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0025 | 0.0057 | 0.1451 | 0.0 | 0.0 | 0.0048 | 0.1167 | 0.0 | 0.0 | 0.3027 | 0.4799 | 0.0136 | 0.1131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0203 | 0.1018 | 0.0238 | 0.1507 | 0.0 | 0.0 | 0.2184 | 0.4311 | 0.0001 | 0.0463 | 0.076 | 0.2852 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.521 | 1.3677 | 7800 | 2.0584 | 0.0362 | 0.0699 | 0.033 | 0.0153 | 0.0368 | 0.0556 | 0.0804 | 0.1151 | 0.1164 | 0.0383 | 0.1292 | 0.1486 | 0.0 | 0.0 | 0.0758 | 0.4936 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1549 | 0.5143 | 0.0 | 0.0 | 0.2973 | 0.6619 | 0.008 | 0.039 | 0.0856 | 0.4412 | 0.0 | 0.0 | 0.3105 | 0.7642 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0358 | 0.2171 | 0.0599 | 0.2753 | 0.0014 | 0.0367 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0025 | 0.0047 | 0.1402 | 0.0 | 0.0 | 0.0061 | 0.1192 | 0.0 | 0.0 | 0.2881 | 0.4659 | 0.0099 | 0.1141 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0168 | 0.1138 | 0.0262 | 0.1776 | 0.0 | 0.0 | 0.2086 | 0.4265 | 0.0003 | 0.058 | 0.0732 | 0.2852 | 0.0 | 0.0 | 0.0003 | 0.0031 | 0.0 | 0.0 | 0.0008 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2057 | 1.3765 | 7850 | 2.0523 | 0.0357 | 0.068 | 0.0331 | 0.0143 | 0.0362 | 0.0582 | 0.0789 | 0.1139 | 0.1154 | 0.0382 | 0.1226 | 0.1501 | 0.0 | 0.0 | 0.0765 | 0.4989 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1496 | 0.5297 | 0.0 | 0.0 | 0.2762 | 0.6596 | 0.0119 | 0.0419 | 0.0838 | 0.4319 | 0.0 | 0.0 | 0.3048 | 0.75 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0237 | 0.2016 | 0.0597 | 0.2644 | 0.0003 | 0.0183 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0012 | 0.0037 | 0.1348 | 0.0 | 0.0 | 0.0059 | 0.1442 | 0.0 | 0.0 | 0.3045 | 0.4818 | 0.0095 | 0.1174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0166 | 0.1092 | 0.0237 | 0.159 | 0.0 | 0.0 | 0.2167 | 0.4293 | 0.0002 | 0.0549 | 0.0768 | 0.2802 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9904 | 1.3852 | 7900 | 2.0525 | 0.0354 | 0.0686 | 0.0328 | 0.0135 | 0.0369 | 0.0552 | 0.0793 | 0.1139 | 0.1152 | 0.0353 | 0.1232 | 0.1508 | 0.0 | 0.0 | 0.0814 | 0.4879 | 0.0 | 0.0 | 0.0 | 0.0 | 0.158 | 0.5555 | 0.0 | 0.0 | 0.2656 | 0.6577 | 0.008 | 0.0238 | 0.0958 | 0.4137 | 0.0 | 0.0 | 0.2888 | 0.7587 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0167 | 0.1752 | 0.0696 | 0.2726 | 0.0009 | 0.0239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0012 | 0.0056 | 0.1384 | 0.0 | 0.0 | 0.0053 | 0.135 | 0.0 | 0.0 | 0.2964 | 0.4752 | 0.0089 | 0.1094 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0181 | 0.1115 | 0.0305 | 0.2067 | 0.0 | 0.0 | 0.2037 | 0.4177 | 0.0003 | 0.0502 | 0.0735 | 0.2821 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0287 | 1.3940 | 7950 | 2.0550 | 0.0362 | 0.0689 | 0.034 | 0.0152 | 0.0372 | 0.056 | 0.0775 | 0.1133 | 0.1147 | 0.0363 | 0.1258 | 0.147 | 0.0 | 0.0 | 0.0734 | 0.4748 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1621 | 0.5341 | 0.0 | 0.0 | 0.2756 | 0.6567 | 0.009 | 0.0286 | 0.0756 | 0.4288 | 0.0 | 0.0 | 0.3175 | 0.765 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0303 | 0.2163 | 0.0726 | 0.2438 | 0.0002 | 0.0229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0055 | 0.1305 | 0.0 | 0.0 | 0.0046 | 0.1225 | 0.0 | 0.0 | 0.3009 | 0.4791 | 0.0089 | 0.1174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0172 | 0.1147 | 0.022 | 0.1724 | 0.0 | 0.0 | 0.2125 | 0.4283 | 0.0002 | 0.059 | 0.0735 | 0.2742 | 0.0 | 0.0 | 0.0001 | 0.0036 | 0.0 | 0.0 | 0.0012 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4249 | 1.4028 | 8000 | 2.0509 | 0.0359 | 0.0688 | 0.0337 | 0.0148 | 0.0368 | 0.0593 | 0.0787 | 0.1136 | 0.1149 | 0.0375 | 0.122 | 0.1494 | 0.0 | 0.0 | 0.0797 | 0.4837 | 0.0 | 0.0 | 0.0 | 0.0 | 0.154 | 0.5209 | 0.0 | 0.0 | 0.2575 | 0.6545 | 0.0068 | 0.0238 | 0.0836 | 0.45 | 0.0 | 0.0 | 0.315 | 0.7589 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0185 | 0.1938 | 0.0697 | 0.2836 | 0.0002 | 0.0211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0012 | 0.0054 | 0.1476 | 0.0 | 0.0 | 0.0075 | 0.1358 | 0.0 | 0.0 | 0.3064 | 0.4787 | 0.008 | 0.1094 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0166 | 0.0954 | 0.0304 | 0.1746 | 0.0 | 0.0 | 0.2162 | 0.4163 | 0.0002 | 0.0562 | 0.075 | 0.2797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0371 | 1.4115 | 8050 | 2.0420 | 0.0359 | 0.0681 | 0.0342 | 0.0142 | 0.0372 | 0.0579 | 0.0793 | 0.1139 | 0.1152 | 0.0367 | 0.1236 | 0.1488 | 0.0 | 0.0 | 0.0849 | 0.4917 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1524 | 0.5346 | 0.0 | 0.0 | 0.267 | 0.6641 | 0.0129 | 0.0333 | 0.0803 | 0.4425 | 0.0 | 0.0 | 0.3139 | 0.7569 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0231 | 0.2132 | 0.0655 | 0.274 | 0.0002 | 0.0174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0025 | 0.0046 | 0.1384 | 0.0 | 0.0 | 0.0074 | 0.1358 | 0.0 | 0.0 | 0.3086 | 0.4769 | 0.0092 | 0.1023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0189 | 0.0917 | 0.0266 | 0.1724 | 0.0 | 0.0 | 0.2073 | 0.4171 | 0.0002 | 0.0535 | 0.0685 | 0.2739 | 0.0 | 0.0 | 0.0 | 0.0031 | 0.0 | 0.0 | 0.0007 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9464 | 1.4203 | 8100 | 2.0416 | 0.0359 | 0.0693 | 0.033 | 0.0154 | 0.0373 | 0.0571 | 0.0804 | 0.1153 | 0.1167 | 0.0389 | 0.1248 | 0.1526 | 0.0 | 0.0 | 0.0811 | 0.4915 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1489 | 0.5407 | 0.0 | 0.0 | 0.2689 | 0.6606 | 0.0134 | 0.0419 | 0.0916 | 0.4425 | 0.0 | 0.0 | 0.308 | 0.7498 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0227 | 0.2155 | 0.0629 | 0.2822 | 0.0004 | 0.0239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0012 | 0.0057 | 0.139 | 0.0 | 0.0 | 0.0081 | 0.1467 | 0.0 | 0.0 | 0.3002 | 0.4654 | 0.0083 | 0.1108 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0188 | 0.1028 | 0.0292 | 0.2007 | 0.0 | 0.0 | 0.2073 | 0.4141 | 0.0002 | 0.0547 | 0.0743 | 0.2832 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2194 | 1.4291 | 8150 | 2.0344 | 0.0362 | 0.069 | 0.0329 | 0.0149 | 0.0364 | 0.0552 | 0.0814 | 0.1166 | 0.1178 | 0.0368 | 0.1282 | 0.1479 | 0.0 | 0.0 | 0.0852 | 0.4951 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1491 | 0.5335 | 0.0 | 0.0 | 0.2566 | 0.6647 | 0.0079 | 0.0324 | 0.0921 | 0.4375 | 0.0 | 0.0 | 0.3083 | 0.7583 | 0.0 | 0.0 | 0.0 | 0.0 | 0.033 | 0.2295 | 0.0683 | 0.311 | 0.0007 | 0.0349 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0012 | 0.0041 | 0.1384 | 0.0 | 0.0 | 0.0072 | 0.145 | 0.0 | 0.0 | 0.3037 | 0.4691 | 0.0085 | 0.1178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0165 | 0.106 | 0.0294 | 0.194 | 0.0 | 0.0 | 0.2186 | 0.4228 | 0.0002 | 0.0463 | 0.0741 | 0.2812 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9928 | 1.4378 | 8200 | 2.0385 | 0.0372 | 0.0716 | 0.0339 | 0.0151 | 0.0355 | 0.0574 | 0.0801 | 0.1162 | 0.1174 | 0.037 | 0.1258 | 0.1479 | 0.0 | 0.0 | 0.0831 | 0.4981 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1445 | 0.5368 | 0.0 | 0.0 | 0.2743 | 0.667 | 0.0132 | 0.0476 | 0.078 | 0.4506 | 0.0079 | 0.0078 | 0.3179 | 0.7352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0456 | 0.2256 | 0.0613 | 0.2836 | 0.0007 | 0.0257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0042 | 0.1238 | 0.0 | 0.0 | 0.0074 | 0.1392 | 0.0 | 0.0 | 0.3175 | 0.4829 | 0.0113 | 0.1122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0191 | 0.1318 | 0.025 | 0.1627 | 0.0 | 0.0 | 0.2282 | 0.4309 | 0.0002 | 0.0524 | 0.0741 | 0.2832 | 0.0 | 0.0 | 0.0 | 0.0036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0025 | 1.4466 | 8250 | 2.0398 | 0.0376 | 0.0718 | 0.0343 | 0.0157 | 0.0366 | 0.056 | 0.0799 | 0.1156 | 0.117 | 0.0369 | 0.127 | 0.1499 | 0.0 | 0.0 | 0.0848 | 0.4953 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1438 | 0.5258 | 0.0 | 0.0 | 0.2837 | 0.6705 | 0.0132 | 0.0448 | 0.0834 | 0.4469 | 0.0089 | 0.0087 | 0.3254 | 0.7581 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0454 | 0.2264 | 0.0615 | 0.2589 | 0.0004 | 0.0147 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0025 | 0.006 | 0.1451 | 0.0 | 0.0 | 0.0069 | 0.1392 | 0.0 | 0.0 | 0.3059 | 0.4795 | 0.0124 | 0.1258 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.022 | 0.1184 | 0.0213 | 0.1463 | 0.0 | 0.0 | 0.2271 | 0.4301 | 0.0002 | 0.0528 | 0.0753 | 0.2853 | 0.0 | 0.0 | 0.0001 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3483 | 1.4554 | 8300 | 2.0424 | 0.037 | 0.0719 | 0.034 | 0.0142 | 0.0352 | 0.0564 | 0.0789 | 0.1142 | 0.1155 | 0.0366 | 0.1258 | 0.1457 | 0.0 | 0.0 | 0.0791 | 0.4964 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1426 | 0.4956 | 0.0 | 0.0 | 0.2932 | 0.65 | 0.008 | 0.0286 | 0.0808 | 0.4325 | 0.0 | 0.0 | 0.3359 | 0.7528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0422 | 0.238 | 0.0673 | 0.2699 | 0.0006 | 0.0156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.006 | 0.1409 | 0.0 | 0.0 | 0.008 | 0.1483 | 0.0 | 0.0 | 0.3033 | 0.4787 | 0.0136 | 0.116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0203 | 0.1235 | 0.0244 | 0.1754 | 0.0 | 0.0 | 0.2058 | 0.413 | 0.0002 | 0.0491 | 0.0728 | 0.2841 | 0.0 | 0.0 | 0.0 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1513 | 1.4641 | 8350 | 2.0401 | 0.0373 | 0.0729 | 0.0331 | 0.0152 | 0.0361 | 0.0592 | 0.0815 | 0.1156 | 0.1168 | 0.0383 | 0.1255 | 0.1502 | 0.0 | 0.0 | 0.0848 | 0.4928 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1534 | 0.5418 | 0.0 | 0.0 | 0.2889 | 0.6401 | 0.0054 | 0.0286 | 0.0842 | 0.445 | 0.0 | 0.0 | 0.3159 | 0.7555 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0461 | 0.2442 | 0.0714 | 0.2438 | 0.0008 | 0.0211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0083 | 0.1348 | 0.0 | 0.0 | 0.0107 | 0.1792 | 0.0 | 0.0 | 0.2976 | 0.4749 | 0.0101 | 0.1042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0226 | 0.1382 | 0.0249 | 0.1813 | 0.0 | 0.0 | 0.2175 | 0.425 | 0.0002 | 0.047 | 0.071 | 0.271 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.001 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7237 | 1.4729 | 8400 | 2.0362 | 0.0377 | 0.0731 | 0.0342 | 0.0155 | 0.0363 | 0.0532 | 0.0818 | 0.1163 | 0.1175 | 0.0376 | 0.1282 | 0.1457 | 0.0 | 0.0 | 0.0826 | 0.4898 | 0.0 | 0.0 | 0.0 | 0.0 | 0.152 | 0.5341 | 0.0 | 0.0 | 0.2857 | 0.6497 | 0.0078 | 0.0295 | 0.0879 | 0.4581 | 0.0 | 0.0 | 0.3276 | 0.7492 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0443 | 0.2426 | 0.0636 | 0.2726 | 0.0006 | 0.0165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0068 | 0.1354 | 0.0 | 0.0 | 0.0115 | 0.1658 | 0.0 | 0.0 | 0.3099 | 0.4877 | 0.0108 | 0.116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0234 | 0.1276 | 0.0263 | 0.1866 | 0.0 | 0.0 | 0.2185 | 0.4199 | 0.0002 | 0.0496 | 0.0739 | 0.2729 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8576 | 1.4817 | 8450 | 2.0393 | 0.0373 | 0.0721 | 0.0338 | 0.0178 | 0.0358 | 0.0538 | 0.0826 | 0.1162 | 0.1175 | 0.0395 | 0.1293 | 0.146 | 0.0 | 0.0 | 0.0848 | 0.5047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1491 | 0.5319 | 0.0 | 0.0 | 0.2812 | 0.6583 | 0.0093 | 0.041 | 0.0875 | 0.4556 | 0.0 | 0.0 | 0.3276 | 0.7429 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0498 | 0.2349 | 0.0615 | 0.2616 | 0.0005 | 0.0312 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0039 | 0.1213 | 0.0 | 0.0 | 0.0105 | 0.165 | 0.0 | 0.0 | 0.3085 | 0.4752 | 0.0129 | 0.116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0207 | 0.141 | 0.0212 | 0.1746 | 0.0 | 0.0 | 0.2187 | 0.4193 | 0.0002 | 0.0575 | 0.0699 | 0.2746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.57 | 1.4904 | 8500 | 2.0413 | 0.0377 | 0.0733 | 0.0343 | 0.0168 | 0.035 | 0.0566 | 0.082 | 0.1161 | 0.1176 | 0.0376 | 0.127 | 0.1486 | 0.0 | 0.0 | 0.0832 | 0.4807 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1668 | 0.5703 | 0.0 | 0.0 | 0.2745 | 0.6426 | 0.0118 | 0.0429 | 0.0811 | 0.425 | 0.0 | 0.0 | 0.3232 | 0.7573 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0684 | 0.2512 | 0.0669 | 0.2041 | 0.0005 | 0.0312 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.122 | 0.0 | 0.0 | 0.0111 | 0.185 | 0.0 | 0.0 | 0.2952 | 0.4755 | 0.0111 | 0.1146 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0207 | 0.1571 | 0.0214 | 0.1881 | 0.0 | 0.0 | 0.2218 | 0.4286 | 0.0002 | 0.0573 | 0.0708 | 0.2748 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.616 | 1.4992 | 8550 | 2.0360 | 0.0387 | 0.0747 | 0.0353 | 0.0159 | 0.0359 | 0.06 | 0.0823 | 0.1169 | 0.1182 | 0.039 | 0.1274 | 0.1468 | 0.0 | 0.0 | 0.0807 | 0.4839 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1655 | 0.5264 | 0.0 | 0.0 | 0.2855 | 0.6378 | 0.0145 | 0.0438 | 0.0851 | 0.4512 | 0.0079 | 0.0078 | 0.3243 | 0.761 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0627 | 0.2519 | 0.0721 | 0.2315 | 0.0007 | 0.0477 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.005 | 0.0045 | 0.1171 | 0.0 | 0.0 | 0.0145 | 0.1925 | 0.0 | 0.0 | 0.3044 | 0.4847 | 0.0105 | 0.1108 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0182 | 0.1401 | 0.0221 | 0.1709 | 0.0 | 0.0 | 0.2311 | 0.4345 | 0.0002 | 0.05 | 0.0746 | 0.2863 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3928 | 1.5080 | 8600 | 2.0387 | 0.0385 | 0.0736 | 0.0358 | 0.0156 | 0.037 | 0.0585 | 0.0812 | 0.1154 | 0.1167 | 0.038 | 0.1291 | 0.1437 | 0.0 | 0.0 | 0.0721 | 0.4835 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1702 | 0.5286 | 0.0 | 0.0 | 0.2925 | 0.6365 | 0.011 | 0.0333 | 0.0793 | 0.425 | 0.0 | 0.0 | 0.3331 | 0.7612 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0691 | 0.2504 | 0.0754 | 0.2315 | 0.0008 | 0.0596 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0037 | 0.0054 | 0.1323 | 0.0 | 0.0 | 0.0109 | 0.1667 | 0.0 | 0.0 | 0.3028 | 0.4859 | 0.0089 | 0.1028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0191 | 0.1267 | 0.0219 | 0.1649 | 0.0 | 0.0 | 0.2275 | 0.4363 | 0.0002 | 0.0506 | 0.0699 | 0.2897 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0271 | 1.5167 | 8650 | 2.0488 | 0.0376 | 0.0737 | 0.0343 | 0.0151 | 0.0373 | 0.0559 | 0.0803 | 0.1128 | 0.1141 | 0.0352 | 0.126 | 0.1433 | 0.0 | 0.0 | 0.0768 | 0.4833 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1609 | 0.5143 | 0.0 | 0.0 | 0.2885 | 0.6381 | 0.0122 | 0.04 | 0.0777 | 0.4162 | 0.0 | 0.0 | 0.3283 | 0.7433 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0627 | 0.2364 | 0.0736 | 0.2041 | 0.0006 | 0.0468 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.005 | 0.0042 | 0.1256 | 0.0 | 0.0 | 0.0116 | 0.155 | 0.0 | 0.0 | 0.3014 | 0.4791 | 0.0101 | 0.1263 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0171 | 0.1258 | 0.0222 | 0.1612 | 0.0 | 0.0 | 0.2104 | 0.4144 | 0.0002 | 0.0528 | 0.0717 | 0.2803 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.352 | 1.5255 | 8700 | 2.0375 | 0.0383 | 0.0733 | 0.0355 | 0.0161 | 0.0369 | 0.0578 | 0.0814 | 0.1164 | 0.1178 | 0.0378 | 0.1304 | 0.147 | 0.0 | 0.0 | 0.0788 | 0.4983 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1648 | 0.5379 | 0.0 | 0.0 | 0.2971 | 0.6506 | 0.0071 | 0.0381 | 0.0836 | 0.4456 | 0.0 | 0.0 | 0.3388 | 0.7573 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0596 | 0.2419 | 0.0747 | 0.2425 | 0.0008 | 0.0312 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.015 | 0.0052 | 0.1433 | 0.0 | 0.0 | 0.0102 | 0.1617 | 0.0 | 0.0 | 0.2963 | 0.48 | 0.0121 | 0.1272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0169 | 0.1249 | 0.0219 | 0.1649 | 0.0 | 0.0 | 0.2265 | 0.4323 | 0.0002 | 0.0487 | 0.0679 | 0.2786 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4356 | 1.5343 | 8750 | 2.0275 | 0.0394 | 0.0749 | 0.0369 | 0.0181 | 0.0381 | 0.0574 | 0.0832 | 0.119 | 0.1203 | 0.0411 | 0.1308 | 0.1495 | 0.0 | 0.0 | 0.0843 | 0.5066 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1737 | 0.5714 | 0.0 | 0.0 | 0.3026 | 0.6577 | 0.0109 | 0.0457 | 0.0965 | 0.4675 | 0.0 | 0.0 | 0.3224 | 0.7543 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0588 | 0.2659 | 0.0683 | 0.2342 | 0.0007 | 0.0275 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0063 | 0.0058 | 0.1396 | 0.0 | 0.0 | 0.0122 | 0.1517 | 0.0 | 0.0 | 0.31 | 0.484 | 0.0158 | 0.1225 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0197 | 0.1369 | 0.0231 | 0.1888 | 0.0 | 0.0 | 0.2312 | 0.4347 | 0.0002 | 0.0474 | 0.0758 | 0.2898 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4577 | 1.5430 | 8800 | 2.0297 | 0.0388 | 0.0747 | 0.0355 | 0.0165 | 0.038 | 0.0574 | 0.0829 | 0.118 | 0.1193 | 0.039 | 0.1291 | 0.1517 | 0.0 | 0.0 | 0.0861 | 0.5047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1706 | 0.5791 | 0.0 | 0.0 | 0.2863 | 0.6593 | 0.0089 | 0.0371 | 0.1045 | 0.4481 | 0.0 | 0.0 | 0.3201 | 0.7577 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0565 | 0.2388 | 0.0687 | 0.2425 | 0.0005 | 0.0202 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.005 | 0.0057 | 0.1317 | 0.0 | 0.0 | 0.0105 | 0.1408 | 0.0 | 0.0 | 0.3064 | 0.4768 | 0.0105 | 0.1211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0182 | 0.1507 | 0.0277 | 0.2075 | 0.0 | 0.0 | 0.2315 | 0.4286 | 0.0002 | 0.0489 | 0.0726 | 0.2858 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5893 | 1.5518 | 8850 | 2.0230 | 0.0388 | 0.0737 | 0.0356 | 0.0152 | 0.0371 | 0.0579 | 0.0812 | 0.1172 | 0.1183 | 0.0392 | 0.129 | 0.146 | 0.0 | 0.0 | 0.0779 | 0.504 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1687 | 0.5522 | 0.0 | 0.0 | 0.308 | 0.6538 | 0.0052 | 0.0314 | 0.0931 | 0.4638 | 0.0 | 0.0 | 0.3284 | 0.7622 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0556 | 0.2527 | 0.0676 | 0.2178 | 0.0006 | 0.0266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0075 | 0.0047 | 0.1256 | 0.0 | 0.0 | 0.0108 | 0.1492 | 0.0 | 0.0 | 0.3091 | 0.4883 | 0.0096 | 0.1141 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0199 | 0.1369 | 0.0266 | 0.1896 | 0.0 | 0.0 | 0.2268 | 0.4318 | 0.0002 | 0.0465 | 0.0719 | 0.2868 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0001 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9355 | 1.5606 | 8900 | 2.0280 | 0.0386 | 0.0737 | 0.0352 | 0.0162 | 0.0371 | 0.0568 | 0.0816 | 0.1171 | 0.1184 | 0.0382 | 0.1301 | 0.1483 | 0.0 | 0.0 | 0.0827 | 0.5155 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1655 | 0.5313 | 0.0 | 0.0 | 0.2905 | 0.6625 | 0.0076 | 0.0438 | 0.0925 | 0.4688 | 0.0 | 0.0 | 0.326 | 0.7553 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0654 | 0.2589 | 0.0646 | 0.2 | 0.0005 | 0.033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0075 | 0.0047 | 0.1274 | 0.0 | 0.0 | 0.0094 | 0.1517 | 0.0 | 0.0 | 0.3068 | 0.484 | 0.0111 | 0.115 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0174 | 0.1396 | 0.0266 | 0.1806 | 0.0 | 0.0 | 0.2288 | 0.4238 | 0.0002 | 0.0522 | 0.075 | 0.2936 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8093 | 1.5693 | 8950 | 2.0213 | 0.0395 | 0.0749 | 0.0369 | 0.0162 | 0.038 | 0.0571 | 0.0836 | 0.1195 | 0.1208 | 0.0405 | 0.1299 | 0.1496 | 0.0 | 0.0 | 0.085 | 0.5258 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1692 | 0.522 | 0.0 | 0.0 | 0.2835 | 0.658 | 0.0124 | 0.06 | 0.0943 | 0.475 | 0.0 | 0.0 | 0.3282 | 0.7587 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.2682 | 0.0651 | 0.2082 | 0.0005 | 0.0376 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0125 | 0.0041 | 0.125 | 0.0 | 0.0 | 0.0099 | 0.1592 | 0.0 | 0.0 | 0.3167 | 0.4947 | 0.0121 | 0.1249 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.02 | 0.1553 | 0.0273 | 0.1948 | 0.0 | 0.0 | 0.2359 | 0.4314 | 0.0002 | 0.047 | 0.0768 | 0.2944 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4777 | 1.5781 | 9000 | 2.0237 | 0.0387 | 0.0743 | 0.0356 | 0.0157 | 0.0375 | 0.0587 | 0.0823 | 0.1187 | 0.12 | 0.0396 | 0.1296 | 0.1542 | 0.0 | 0.0 | 0.0848 | 0.5222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1582 | 0.5269 | 0.0 | 0.0 | 0.2882 | 0.6542 | 0.0087 | 0.0429 | 0.0962 | 0.4669 | 0.0 | 0.0 | 0.3211 | 0.7486 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0823 | 0.276 | 0.0595 | 0.1918 | 0.0005 | 0.0349 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.015 | 0.0053 | 0.1409 | 0.0 | 0.0 | 0.0082 | 0.1608 | 0.0 | 0.0 | 0.3022 | 0.4798 | 0.0102 | 0.1155 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0211 | 0.1585 | 0.031 | 0.2164 | 0.0 | 0.0 | 0.2279 | 0.4254 | 0.0002 | 0.0513 | 0.0738 | 0.2896 | 0.0 | 0.0 | 0.0001 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7726 | 1.5869 | 9050 | 2.0241 | 0.039 | 0.0743 | 0.0356 | 0.015 | 0.0389 | 0.0571 | 0.0829 | 0.1184 | 0.1197 | 0.0396 | 0.1328 | 0.1478 | 0.0 | 0.0 | 0.0827 | 0.5159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1565 | 0.5225 | 0.0 | 0.0 | 0.299 | 0.6564 | 0.0103 | 0.041 | 0.0984 | 0.4712 | 0.0 | 0.0 | 0.3271 | 0.7579 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0754 | 0.2698 | 0.0655 | 0.2082 | 0.0005 | 0.0339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0125 | 0.0056 | 0.1341 | 0.0 | 0.0 | 0.0081 | 0.1367 | 0.0 | 0.0 | 0.3054 | 0.4854 | 0.0137 | 0.1188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0194 | 0.1535 | 0.03 | 0.2231 | 0.0 | 0.0 | 0.2227 | 0.4229 | 0.0002 | 0.0524 | 0.0729 | 0.2885 | 0.0 | 0.0 | 0.0001 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2811 | 1.5957 | 9100 | 2.0181 | 0.0392 | 0.0741 | 0.0366 | 0.0184 | 0.0389 | 0.0567 | 0.0836 | 0.1188 | 0.1201 | 0.0434 | 0.1329 | 0.1483 | 0.0 | 0.0 | 0.0847 | 0.5163 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1597 | 0.5291 | 0.0 | 0.0 | 0.301 | 0.6577 | 0.0118 | 0.0524 | 0.0941 | 0.4538 | 0.0 | 0.0 | 0.3258 | 0.7622 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0668 | 0.2798 | 0.0744 | 0.2384 | 0.0006 | 0.0339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0137 | 0.0057 | 0.147 | 0.0 | 0.0 | 0.0069 | 0.1283 | 0.0 | 0.0 | 0.3111 | 0.4929 | 0.0102 | 0.1225 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0188 | 0.1396 | 0.0271 | 0.1858 | 0.0 | 0.0 | 0.2298 | 0.4297 | 0.0002 | 0.047 | 0.0751 | 0.2933 | 0.0 | 0.0 | 0.0001 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.968 | 1.6044 | 9150 | 2.0146 | 0.0394 | 0.0742 | 0.0375 | 0.016 | 0.0392 | 0.0579 | 0.0843 | 0.1201 | 0.1213 | 0.0419 | 0.1343 | 0.1498 | 0.0 | 0.0 | 0.087 | 0.5189 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1598 | 0.5533 | 0.0 | 0.0 | 0.2941 | 0.659 | 0.016 | 0.0657 | 0.0978 | 0.4563 | 0.0 | 0.0 | 0.3224 | 0.7551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.065 | 0.2853 | 0.0803 | 0.2397 | 0.0005 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.01 | 0.0056 | 0.139 | 0.0 | 0.0 | 0.0076 | 0.1458 | 0.0 | 0.0 | 0.3122 | 0.4938 | 0.0097 | 0.1202 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0209 | 0.1452 | 0.0257 | 0.191 | 0.0 | 0.0 | 0.2339 | 0.4304 | 0.0002 | 0.0498 | 0.0726 | 0.2895 | 0.0 | 0.0 | 0.0001 | 0.0036 | 0.0 | 0.0 | 0.0001 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4473 | 1.6132 | 9200 | 2.0185 | 0.039 | 0.075 | 0.0363 | 0.0184 | 0.0397 | 0.0567 | 0.0847 | 0.1198 | 0.1211 | 0.0424 | 0.1308 | 0.1497 | 0.0 | 0.0 | 0.0912 | 0.5186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1644 | 0.5588 | 0.0 | 0.0 | 0.2866 | 0.6545 | 0.0145 | 0.0581 | 0.0995 | 0.4575 | 0.0 | 0.0 | 0.3125 | 0.7469 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0681 | 0.2698 | 0.0761 | 0.2548 | 0.0005 | 0.0266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.015 | 0.0051 | 0.128 | 0.0 | 0.0 | 0.0086 | 0.1625 | 0.0 | 0.0 | 0.3104 | 0.4953 | 0.0103 | 0.1282 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0196 | 0.147 | 0.0247 | 0.1873 | 0.0 | 0.0 | 0.229 | 0.4247 | 0.0002 | 0.0485 | 0.0743 | 0.2861 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1208 | 1.6220 | 9250 | 2.0144 | 0.0392 | 0.0743 | 0.0363 | 0.0181 | 0.0386 | 0.0576 | 0.0839 | 0.1198 | 0.121 | 0.0418 | 0.1332 | 0.1504 | 0.0 | 0.0 | 0.0868 | 0.5169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1576 | 0.5423 | 0.0 | 0.0 | 0.2964 | 0.6564 | 0.0108 | 0.0505 | 0.0986 | 0.4544 | 0.0 | 0.0 | 0.3208 | 0.7569 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0663 | 0.2806 | 0.0797 | 0.2589 | 0.0005 | 0.0303 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0088 | 0.0068 | 0.1366 | 0.0 | 0.0 | 0.007 | 0.1533 | 0.0 | 0.0 | 0.3145 | 0.4903 | 0.0092 | 0.1286 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0198 | 0.1396 | 0.0245 | 0.1948 | 0.0 | 0.0 | 0.2338 | 0.4288 | 0.0002 | 0.0498 | 0.0713 | 0.2863 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1113 | 1.6307 | 9300 | 2.0159 | 0.0394 | 0.0755 | 0.0365 | 0.0157 | 0.0393 | 0.0595 | 0.0839 | 0.1202 | 0.1213 | 0.041 | 0.1329 | 0.1438 | 0.0 | 0.0 | 0.0851 | 0.5127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1611 | 0.5527 | 0.0 | 0.0 | 0.2993 | 0.659 | 0.0109 | 0.0505 | 0.0985 | 0.4706 | 0.0 | 0.0 | 0.3144 | 0.7543 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0725 | 0.2868 | 0.072 | 0.2219 | 0.0004 | 0.0303 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0113 | 0.0068 | 0.1335 | 0.0 | 0.0 | 0.0073 | 0.1542 | 0.0 | 0.0 | 0.3148 | 0.4906 | 0.014 | 0.131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0217 | 0.1507 | 0.0246 | 0.1985 | 0.0 | 0.0 | 0.2352 | 0.4313 | 0.0002 | 0.0472 | 0.0752 | 0.2911 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2009 | 1.6395 | 9350 | 2.0120 | 0.0396 | 0.0757 | 0.0366 | 0.0186 | 0.0387 | 0.06 | 0.085 | 0.1205 | 0.1216 | 0.0426 | 0.1339 | 0.146 | 0.0 | 0.0 | 0.0829 | 0.5104 | 0.0 | 0.0 | 0.0 | 0.0 | 0.163 | 0.5516 | 0.0 | 0.0 | 0.2935 | 0.6577 | 0.0136 | 0.0562 | 0.0978 | 0.4569 | 0.0 | 0.0 | 0.3192 | 0.7508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0801 | 0.2736 | 0.0739 | 0.237 | 0.0003 | 0.0248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0137 | 0.0062 | 0.1335 | 0.0 | 0.0 | 0.0078 | 0.1642 | 0.0 | 0.0 | 0.3151 | 0.4884 | 0.0099 | 0.1338 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0205 | 0.1576 | 0.0241 | 0.2097 | 0.0 | 0.0 | 0.2374 | 0.4357 | 0.0002 | 0.0511 | 0.0759 | 0.2883 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1311 | 1.6483 | 9400 | 2.0104 | 0.0397 | 0.0755 | 0.037 | 0.0188 | 0.0394 | 0.0584 | 0.0845 | 0.1202 | 0.1215 | 0.0434 | 0.1328 | 0.1462 | 0.0 | 0.0 | 0.0853 | 0.5095 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1598 | 0.5687 | 0.0 | 0.0 | 0.2893 | 0.6593 | 0.0094 | 0.0495 | 0.0977 | 0.4506 | 0.0 | 0.0 | 0.3218 | 0.7488 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0849 | 0.2721 | 0.0743 | 0.2342 | 0.0005 | 0.0229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.01 | 0.0055 | 0.1335 | 0.0 | 0.0 | 0.0071 | 0.1575 | 0.0 | 0.0 | 0.3196 | 0.4939 | 0.0095 | 0.1366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0205 | 0.1576 | 0.0251 | 0.2037 | 0.0 | 0.0 | 0.2373 | 0.4341 | 0.0002 | 0.0513 | 0.0766 | 0.291 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0671 | 1.6570 | 9450 | 2.0155 | 0.0396 | 0.0758 | 0.0368 | 0.0182 | 0.0393 | 0.059 | 0.0846 | 0.1204 | 0.1217 | 0.0419 | 0.1327 | 0.1512 | 0.0 | 0.0 | 0.0855 | 0.5108 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1669 | 0.5879 | 0.0 | 0.0 | 0.2912 | 0.6651 | 0.0118 | 0.0543 | 0.1001 | 0.46 | 0.0 | 0.0 | 0.3204 | 0.7537 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0832 | 0.2659 | 0.0707 | 0.2247 | 0.0004 | 0.0229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0012 | 0.0063 | 0.1433 | 0.0 | 0.0 | 0.0061 | 0.1475 | 0.0 | 0.0 | 0.3142 | 0.4863 | 0.0107 | 0.1357 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0206 | 0.1594 | 0.0258 | 0.2097 | 0.0 | 0.0 | 0.2306 | 0.4282 | 0.0002 | 0.0515 | 0.0778 | 0.287 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3188 | 1.6658 | 9500 | 2.0094 | 0.0402 | 0.0762 | 0.0375 | 0.0192 | 0.0399 | 0.0566 | 0.0848 | 0.1214 | 0.1226 | 0.0448 | 0.1345 | 0.1507 | 0.0 | 0.0 | 0.0845 | 0.5125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1682 | 0.5714 | 0.0 | 0.0 | 0.296 | 0.6641 | 0.0137 | 0.0552 | 0.097 | 0.4681 | 0.0 | 0.0 | 0.3282 | 0.7618 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0828 | 0.2744 | 0.0738 | 0.2438 | 0.0005 | 0.0312 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0088 | 0.0052 | 0.1378 | 0.0 | 0.0 | 0.0069 | 0.1542 | 0.0 | 0.0 | 0.3222 | 0.4945 | 0.0134 | 0.1324 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.021 | 0.1576 | 0.0245 | 0.1993 | 0.0 | 0.0 | 0.2329 | 0.4326 | 0.0002 | 0.0528 | 0.0776 | 0.2893 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9184 | 1.6746 | 9550 | 2.0086 | 0.04 | 0.076 | 0.0371 | 0.0187 | 0.04 | 0.0571 | 0.0845 | 0.1206 | 0.1218 | 0.0426 | 0.1342 | 0.1495 | 0.0 | 0.0 | 0.0855 | 0.5165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1699 | 0.5654 | 0.0 | 0.0 | 0.2993 | 0.6606 | 0.0161 | 0.0581 | 0.0984 | 0.4619 | 0.0 | 0.0 | 0.3308 | 0.764 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0719 | 0.2659 | 0.0757 | 0.2425 | 0.0005 | 0.0275 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.01 | 0.0063 | 0.136 | 0.0 | 0.0 | 0.0072 | 0.155 | 0.0 | 0.0 | 0.3186 | 0.4932 | 0.009 | 0.1305 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0199 | 0.1525 | 0.025 | 0.194 | 0.0 | 0.0 | 0.2308 | 0.4302 | 0.0002 | 0.0487 | 0.0767 | 0.2904 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2244 | 1.6833 | 9600 | 2.0115 | 0.0393 | 0.0756 | 0.036 | 0.0183 | 0.0389 | 0.0553 | 0.084 | 0.1199 | 0.121 | 0.0415 | 0.1323 | 0.1494 | 0.0 | 0.0 | 0.0872 | 0.5167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1632 | 0.556 | 0.0 | 0.0 | 0.2902 | 0.6551 | 0.0118 | 0.0514 | 0.0987 | 0.4606 | 0.0 | 0.0 | 0.3264 | 0.7561 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0754 | 0.2659 | 0.0728 | 0.2397 | 0.0005 | 0.0294 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.01 | 0.0055 | 0.1451 | 0.0 | 0.0 | 0.0067 | 0.1542 | 0.0 | 0.0 | 0.3102 | 0.4818 | 0.0093 | 0.1286 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0199 | 0.1535 | 0.0258 | 0.2052 | 0.0 | 0.0 | 0.2266 | 0.4239 | 0.0001 | 0.0496 | 0.078 | 0.2851 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3524 | 1.6921 | 9650 | 2.0061 | 0.04 | 0.0758 | 0.0371 | 0.0182 | 0.0392 | 0.0572 | 0.085 | 0.1207 | 0.1218 | 0.0428 | 0.1321 | 0.1514 | 0.0 | 0.0 | 0.0869 | 0.5131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1629 | 0.5516 | 0.0 | 0.0 | 0.2975 | 0.6606 | 0.0106 | 0.0495 | 0.1013 | 0.4656 | 0.0 | 0.0 | 0.3336 | 0.7632 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0736 | 0.2682 | 0.0738 | 0.2521 | 0.0005 | 0.0294 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0088 | 0.0059 | 0.1463 | 0.0 | 0.0 | 0.0077 | 0.1625 | 0.0 | 0.0 | 0.3194 | 0.4884 | 0.0087 | 0.1291 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0207 | 0.1516 | 0.0258 | 0.1978 | 0.0 | 0.0 | 0.2344 | 0.4304 | 0.0001 | 0.0476 | 0.0783 | 0.2877 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0916 | 1.7009 | 9700 | 2.0070 | 0.0399 | 0.0755 | 0.0369 | 0.0179 | 0.0392 | 0.0565 | 0.0849 | 0.1209 | 0.1222 | 0.0419 | 0.1332 | 0.148 | 0.0 | 0.0 | 0.0857 | 0.518 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1649 | 0.5648 | 0.0 | 0.0 | 0.2977 | 0.6577 | 0.0083 | 0.0419 | 0.0996 | 0.4638 | 0.0 | 0.0 | 0.3334 | 0.7608 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0731 | 0.2674 | 0.0747 | 0.2575 | 0.0004 | 0.0266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.01 | 0.0061 | 0.1372 | 0.0 | 0.0 | 0.008 | 0.1692 | 0.0 | 0.0 | 0.3177 | 0.4861 | 0.0095 | 0.1282 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0205 | 0.1548 | 0.0261 | 0.2104 | 0.0 | 0.0 | 0.2315 | 0.4313 | 0.0001 | 0.0491 | 0.0774 | 0.2848 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.6948 | 1.7096 | 9750 | 2.0054 | 0.04 | 0.0759 | 0.037 | 0.0192 | 0.0394 | 0.0564 | 0.0848 | 0.1205 | 0.1217 | 0.0436 | 0.1321 | 0.1483 | 0.0 | 0.0 | 0.0872 | 0.5136 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1651 | 0.561 | 0.0 | 0.0 | 0.2996 | 0.659 | 0.0091 | 0.0419 | 0.0972 | 0.4594 | 0.0 | 0.0 | 0.3322 | 0.7644 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0718 | 0.2713 | 0.0702 | 0.2534 | 0.0003 | 0.0239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0088 | 0.0061 | 0.1348 | 0.0 | 0.0 | 0.0082 | 0.1692 | 0.0 | 0.0 | 0.3216 | 0.489 | 0.0103 | 0.1277 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.021 | 0.1544 | 0.0263 | 0.2 | 0.0 | 0.0 | 0.2333 | 0.4311 | 0.0002 | 0.0485 | 0.0779 | 0.2863 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2661 | 1.7184 | 9800 | 2.0057 | 0.0402 | 0.0764 | 0.0372 | 0.0186 | 0.0396 | 0.0559 | 0.0851 | 0.1205 | 0.1216 | 0.0429 | 0.1323 | 0.1488 | 0.0 | 0.0 | 0.0869 | 0.5159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1682 | 0.5577 | 0.0 | 0.0 | 0.2978 | 0.6567 | 0.0092 | 0.0438 | 0.1028 | 0.47 | 0.0 | 0.0 | 0.332 | 0.7598 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0742 | 0.269 | 0.071 | 0.2507 | 0.0004 | 0.0257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0088 | 0.0062 | 0.1366 | 0.0 | 0.0 | 0.0079 | 0.1683 | 0.0 | 0.0 | 0.3198 | 0.4898 | 0.0101 | 0.1249 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0206 | 0.1539 | 0.027 | 0.1978 | 0.0 | 0.0 | 0.2346 | 0.4324 | 0.0001 | 0.0485 | 0.0786 | 0.2856 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9217 | 1.7272 | 9850 | 2.0053 | 0.0401 | 0.0759 | 0.0371 | 0.0181 | 0.0397 | 0.0565 | 0.0849 | 0.1203 | 0.1215 | 0.0434 | 0.132 | 0.1477 | 0.0 | 0.0 | 0.0857 | 0.5155 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1655 | 0.556 | 0.0 | 0.0 | 0.3021 | 0.6603 | 0.0093 | 0.041 | 0.099 | 0.4656 | 0.0 | 0.0 | 0.3336 | 0.7632 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0718 | 0.269 | 0.071 | 0.2466 | 0.0004 | 0.0248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0088 | 0.0062 | 0.1348 | 0.0 | 0.0 | 0.0086 | 0.1725 | 0.0 | 0.0 | 0.3211 | 0.4921 | 0.01 | 0.1221 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0208 | 0.1516 | 0.0254 | 0.1963 | 0.0 | 0.0 | 0.2332 | 0.4345 | 0.0002 | 0.0487 | 0.079 | 0.2859 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0826 | 1.7359 | 9900 | 2.0036 | 0.0402 | 0.0758 | 0.0373 | 0.0181 | 0.0394 | 0.0565 | 0.0852 | 0.1206 | 0.1218 | 0.043 | 0.1319 | 0.1483 | 0.0 | 0.0 | 0.0856 | 0.5136 | 0.0 | 0.0 | 0.0 | 0.0 | 0.167 | 0.5593 | 0.0 | 0.0 | 0.3008 | 0.6596 | 0.0087 | 0.0429 | 0.1011 | 0.465 | 0.0 | 0.0 | 0.3352 | 0.7622 | 0.0 | 0.0 | 0.0 | 0.0 | 0.073 | 0.2713 | 0.0713 | 0.2507 | 0.0006 | 0.0257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0088 | 0.0054 | 0.1396 | 0.0 | 0.0 | 0.0082 | 0.16 | 0.0 | 0.0 | 0.3228 | 0.4931 | 0.0097 | 0.1272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0209 | 0.1525 | 0.0257 | 0.2 | 0.0 | 0.0 | 0.2326 | 0.4338 | 0.0001 | 0.0478 | 0.079 | 0.2877 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9125 | 1.7447 | 9950 | 2.0039 | 0.0402 | 0.076 | 0.0373 | 0.018 | 0.0395 | 0.0563 | 0.0851 | 0.1204 | 0.1216 | 0.0431 | 0.1319 | 0.1485 | 0.0 | 0.0 | 0.0855 | 0.5153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1679 | 0.5577 | 0.0 | 0.0 | 0.3005 | 0.6599 | 0.0097 | 0.0438 | 0.1017 | 0.47 | 0.0 | 0.0 | 0.3345 | 0.7618 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0725 | 0.2713 | 0.0714 | 0.2438 | 0.0006 | 0.0257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0088 | 0.0065 | 0.1378 | 0.0 | 0.0 | 0.0078 | 0.1533 | 0.0 | 0.0 | 0.3229 | 0.4946 | 0.0091 | 0.1263 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0205 | 0.1521 | 0.0257 | 0.2 | 0.0 | 0.0 | 0.2338 | 0.4338 | 0.0002 | 0.0494 | 0.0789 | 0.2876 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3548 | 1.7535 | 10000 | 2.0044 | 0.0402 | 0.076 | 0.0374 | 0.0184 | 0.0397 | 0.0565 | 0.0854 | 0.1208 | 0.122 | 0.0443 | 0.1322 | 0.1494 | 0.0 | 0.0 | 0.0858 | 0.514 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1682 | 0.5626 | 0.0 | 0.0 | 0.3009 | 0.6596 | 0.0095 | 0.0429 | 0.1019 | 0.4694 | 0.0 | 0.0 | 0.3346 | 0.7634 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0716 | 0.269 | 0.0715 | 0.2466 | 0.0006 | 0.0257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0113 | 0.007 | 0.1366 | 0.0 | 0.0 | 0.008 | 0.1575 | 0.0 | 0.0 | 0.3226 | 0.4942 | 0.0089 | 0.1296 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0209 | 0.1548 | 0.0266 | 0.2037 | 0.0 | 0.0 | 0.2336 | 0.434 | 0.0002 | 0.0504 | 0.0785 | 0.2888 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.3.1+cu121
- Datasets 3.5.0
- Tokenizers 0.19.1
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
sungkwan2/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
YANG-12/detr-resnet-50-dc5-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3991
- Map: 0.0109
- Map 50: 0.0212
- Map 75: 0.0099
- Map Small: 0.0052
- Map Medium: 0.0152
- Map Large: 0.0094
- Mar 1: 0.0333
- Mar 10: 0.0612
- Mar 100: 0.0641
- Mar Small: 0.0219
- Mar Medium: 0.0593
- Mar Large: 0.0779
- Map Shirt, blouse: 0.0
- Mar 100 Shirt, blouse: 0.0
- Map Top, t-shirt, sweatshirt: 0.0267
- Mar 100 Top, t-shirt, sweatshirt: 0.3125
- Map Sweater: 0.0
- Mar 100 Sweater: 0.0
- Map Cardigan: 0.0
- Mar 100 Cardigan: 0.0
- Map Jacket: 0.0
- Mar 100 Jacket: 0.0
- Map Vest: 0.0
- Mar 100 Vest: 0.0
- Map Pants: 0.0805
- Mar 100 Pants: 0.6679
- Map Shorts: 0.0
- Mar 100 Shorts: 0.0
- Map Skirt: 0.0099
- Mar 100 Skirt: 0.0063
- Map Coat: 0.0
- Mar 100 Coat: 0.0
- Map Dress: 0.1037
- Mar 100 Dress: 0.7209
- Map Jumpsuit: 0.0
- Mar 100 Jumpsuit: 0.0
- Map Cape: 0.0
- Mar 100 Cape: 0.0
- Map Glasses: 0.0
- Mar 100 Glasses: 0.0
- Map Hat: 0.0
- Mar 100 Hat: 0.0
- Map Headband, head covering, hair accessory: 0.0
- Mar 100 Headband, head covering, hair accessory: 0.0
- Map Tie: 0.0
- Mar 100 Tie: 0.0
- Map Glove: 0.0
- Mar 100 Glove: 0.0
- Map Watch: 0.0
- Mar 100 Watch: 0.0
- Map Belt: 0.0
- Mar 100 Belt: 0.0
- Map Leg warmer: 0.0
- Mar 100 Leg warmer: 0.0
- Map Tights, stockings: 0.0
- Mar 100 Tights, stockings: 0.0
- Map Sock: 0.0
- Mar 100 Sock: 0.0
- Map Shoe: 0.1858
- Mar 100 Shoe: 0.4936
- Map Bag, wallet: 0.0003
- Mar 100 Bag, wallet: 0.0023
- Map Scarf: 0.0
- Mar 100 Scarf: 0.0
- Map Umbrella: 0.0
- Mar 100 Umbrella: 0.0
- Map Hood: 0.0
- Mar 100 Hood: 0.0
- Map Collar: 0.0
- Mar 100 Collar: 0.0
- Map Lapel: 0.0
- Mar 100 Lapel: 0.0
- Map Epaulette: 0.0
- Mar 100 Epaulette: 0.0
- Map Sleeve: 0.0607
- Mar 100 Sleeve: 0.4527
- Map Pocket: 0.0001
- Mar 100 Pocket: 0.0437
- Map Neckline: 0.0348
- Mar 100 Neckline: 0.2492
- Map Buckle: 0.0
- Mar 100 Buckle: 0.0
- Map Zipper: 0.0
- Mar 100 Zipper: 0.0
- Map Applique: 0.0
- Mar 100 Applique: 0.0
- Map Bead: 0.0
- Mar 100 Bead: 0.0
- Map Bow: 0.0
- Mar 100 Bow: 0.0
- Map Flower: 0.0
- Mar 100 Flower: 0.0
- Map Fringe: 0.0
- Mar 100 Fringe: 0.0
- Map Ribbon: 0.0
- Mar 100 Ribbon: 0.0
- Map Rivet: 0.0
- Mar 100 Rivet: 0.0
- Map Ruffle: 0.0
- Mar 100 Ruffle: 0.0
- Map Sequin: 0.0
- Mar 100 Sequin: 0.0
- Map Tassel: 0.0
- Mar 100 Tassel: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Shirt, blouse | Mar 100 Shirt, blouse | Map Top, t-shirt, sweatshirt | Mar 100 Top, t-shirt, sweatshirt | Map Sweater | Mar 100 Sweater | Map Cardigan | Mar 100 Cardigan | Map Jacket | Mar 100 Jacket | Map Vest | Mar 100 Vest | Map Pants | Mar 100 Pants | Map Shorts | Mar 100 Shorts | Map Skirt | Mar 100 Skirt | Map Coat | Mar 100 Coat | Map Dress | Mar 100 Dress | Map Jumpsuit | Mar 100 Jumpsuit | Map Cape | Mar 100 Cape | Map Glasses | Mar 100 Glasses | Map Hat | Mar 100 Hat | Map Headband, head covering, hair accessory | Mar 100 Headband, head covering, hair accessory | Map Tie | Mar 100 Tie | Map Glove | Mar 100 Glove | Map Watch | Mar 100 Watch | Map Belt | Mar 100 Belt | Map Leg warmer | Mar 100 Leg warmer | Map Tights, stockings | Mar 100 Tights, stockings | Map Sock | Mar 100 Sock | Map Shoe | Mar 100 Shoe | Map Bag, wallet | Mar 100 Bag, wallet | Map Scarf | Mar 100 Scarf | Map Umbrella | Mar 100 Umbrella | Map Hood | Mar 100 Hood | Map Collar | Mar 100 Collar | Map Lapel | Mar 100 Lapel | Map Epaulette | Mar 100 Epaulette | Map Sleeve | Mar 100 Sleeve | Map Pocket | Mar 100 Pocket | Map Neckline | Mar 100 Neckline | Map Buckle | Mar 100 Buckle | Map Zipper | Mar 100 Zipper | Map Applique | Mar 100 Applique | Map Bead | Mar 100 Bead | Map Bow | Mar 100 Bow | Map Flower | Mar 100 Flower | Map Fringe | Mar 100 Fringe | Map Ribbon | Mar 100 Ribbon | Map Rivet | Mar 100 Rivet | Map Ruffle | Mar 100 Ruffle | Map Sequin | Mar 100 Sequin | Map Tassel | Mar 100 Tassel |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:----------------------------:|:--------------------------------:|:-----------:|:---------------:|:------------:|:----------------:|:----------:|:--------------:|:--------:|:------------:|:---------:|:-------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:-----------:|:---------------:|:-------:|:-----------:|:-------------------------------------------:|:-----------------------------------------------:|:-------:|:-----------:|:---------:|:-------------:|:---------:|:-------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:--------:|:------------:|:---------------:|:-------------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:----------:|:--------------:|:---------:|:-------------:|:-------------:|:-----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:--------:|:------------:|:-------:|:-----------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|
| 2.3489 | 0.4384 | 5000 | 2.5424 | 0.0078 | 0.0164 | 0.0067 | 0.004 | 0.0111 | 0.0068 | 0.0264 | 0.053 | 0.0564 | 0.0202 | 0.0517 | 0.0671 | 0.0 | 0.0 | 0.0185 | 0.1932 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0497 | 0.5615 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.089 | 0.6732 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1391 | 0.4866 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0384 | 0.417 | 0.0001 | 0.0289 | 0.0232 | 0.2333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1423 | 0.8767 | 10000 | 2.3991 | 0.0109 | 0.0212 | 0.0099 | 0.0052 | 0.0152 | 0.0094 | 0.0333 | 0.0612 | 0.0641 | 0.0219 | 0.0593 | 0.0779 | 0.0 | 0.0 | 0.0267 | 0.3125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0805 | 0.6679 | 0.0 | 0.0 | 0.0099 | 0.0063 | 0.0 | 0.0 | 0.1037 | 0.7209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1858 | 0.4936 | 0.0003 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0607 | 0.4527 | 0.0001 | 0.0437 | 0.0348 | 0.2492 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.7.0+cu126
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
zhengyu998/detr_finetuned_cppe5_Carla-COCO |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5_Carla-COCO
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the carla-coco-object-detection-dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9886
- Map: 0.2337
- Map 50: 0.3446
- Map 75: 0.233
- Map Small: 0.1595
- Map Medium: 0.5443
- Map Large: 0.9195
- Mar 1: 0.2413
- Mar 10: 0.4546
- Mar 100: 0.4658
- Mar Small: 0.3762
- Mar Medium: 0.7772
- Mar Large: 0.9417
- Map Coverall: 0.62
- Mar 100 Coverall: 0.6933
- Map Face Shield: 0.2334
- Mar 100 Face Shield: 0.4938
- Map Gloves: 0.0406
- Mar 100 Gloves: 0.4364
- Map Goggles: 0.2714
- Mar 100 Goggles: 0.3902
- Map Mask: 0.0032
- Mar 100 Mask: 0.3154
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 83 | 2.0575 | 0.0079 | 0.024 | 0.0032 | 0.0041 | 0.0388 | 0.4185 | 0.0149 | 0.0882 | 0.1529 | 0.1184 | 0.3278 | 0.7542 | 0.0358 | 0.4455 | 0.0004 | 0.0812 | 0.0009 | 0.1364 | 0.0022 | 0.1012 | 0.0 | 0.0 |
| No log | 2.0 | 166 | 1.8170 | 0.0125 | 0.0405 | 0.0024 | 0.0059 | 0.0564 | 0.3833 | 0.0208 | 0.0802 | 0.1121 | 0.0795 | 0.2078 | 0.8083 | 0.0607 | 0.373 | 0.0 | 0.0 | 0.0001 | 0.0364 | 0.0006 | 0.0665 | 0.001 | 0.0846 |
| No log | 3.0 | 249 | 1.7051 | 0.0333 | 0.0808 | 0.0214 | 0.017 | 0.0716 | 0.55 | 0.0522 | 0.1304 | 0.1636 | 0.1294 | 0.359 | 0.7833 | 0.161 | 0.5 | 0.001 | 0.1125 | 0.0008 | 0.0727 | 0.0037 | 0.1329 | 0.0 | 0.0 |
| No log | 4.0 | 332 | 1.5606 | 0.0621 | 0.1076 | 0.0725 | 0.0384 | 0.1191 | 0.6246 | 0.0636 | 0.1624 | 0.1908 | 0.1367 | 0.4395 | 0.875 | 0.3068 | 0.5573 | 0.0004 | 0.0562 | 0.0012 | 0.2 | 0.0023 | 0.125 | 0.0 | 0.0154 |
| No log | 5.0 | 415 | 1.4048 | 0.0597 | 0.1133 | 0.057 | 0.0354 | 0.2192 | 0.5628 | 0.0778 | 0.192 | 0.2163 | 0.1898 | 0.4136 | 0.875 | 0.2873 | 0.6028 | 0.0009 | 0.1063 | 0.0031 | 0.1545 | 0.0072 | 0.1793 | 0.0001 | 0.0385 |
| No log | 6.0 | 498 | 1.3473 | 0.0719 | 0.1314 | 0.0698 | 0.0455 | 0.2928 | 0.6042 | 0.0906 | 0.2615 | 0.2828 | 0.2319 | 0.5625 | 0.8958 | 0.3406 | 0.6197 | 0.0041 | 0.2625 | 0.0039 | 0.2727 | 0.0109 | 0.1976 | 0.0002 | 0.0615 |
| 2.6459 | 7.0 | 581 | 1.3085 | 0.0859 | 0.1431 | 0.0941 | 0.0461 | 0.2514 | 0.6921 | 0.1076 | 0.274 | 0.3119 | 0.2417 | 0.6042 | 0.9125 | 0.4091 | 0.6197 | 0.0058 | 0.3187 | 0.0053 | 0.4091 | 0.0092 | 0.1811 | 0.0001 | 0.0308 |
| 2.6459 | 8.0 | 664 | 1.3991 | 0.096 | 0.1896 | 0.0834 | 0.0514 | 0.3385 | 0.6803 | 0.1623 | 0.3106 | 0.3233 | 0.2407 | 0.6275 | 0.9042 | 0.3711 | 0.5567 | 0.0769 | 0.3625 | 0.0103 | 0.3909 | 0.0214 | 0.1677 | 0.0006 | 0.1385 |
| 2.6459 | 9.0 | 747 | 1.2441 | 0.1132 | 0.2009 | 0.108 | 0.0669 | 0.4426 | 0.8451 | 0.1713 | 0.3419 | 0.3688 | 0.2852 | 0.6747 | 0.9208 | 0.4431 | 0.6 | 0.0762 | 0.4437 | 0.0078 | 0.4 | 0.038 | 0.2232 | 0.0011 | 0.1769 |
| 2.6459 | 10.0 | 830 | 1.2353 | 0.1355 | 0.2241 | 0.1414 | 0.081 | 0.3865 | 0.8697 | 0.2051 | 0.3612 | 0.3879 | 0.291 | 0.7353 | 0.9208 | 0.5136 | 0.6275 | 0.0695 | 0.475 | 0.0114 | 0.4727 | 0.0822 | 0.2335 | 0.0008 | 0.1308 |
| 2.6459 | 11.0 | 913 | 1.2549 | 0.1024 | 0.2165 | 0.0832 | 0.0606 | 0.3461 | 0.8273 | 0.1455 | 0.3096 | 0.326 | 0.2378 | 0.6262 | 0.875 | 0.4035 | 0.5152 | 0.0322 | 0.4062 | 0.0087 | 0.3364 | 0.0659 | 0.203 | 0.0015 | 0.1692 |
| 2.6459 | 12.0 | 996 | 1.1793 | 0.1868 | 0.2971 | 0.199 | 0.1109 | 0.4837 | 0.8912 | 0.2007 | 0.3837 | 0.3907 | 0.2894 | 0.755 | 0.925 | 0.5445 | 0.6427 | 0.2159 | 0.4187 | 0.0164 | 0.5 | 0.1561 | 0.2537 | 0.0009 | 0.1385 |
| 1.197 | 13.0 | 1079 | 1.1864 | 0.1813 | 0.2901 | 0.1777 | 0.1047 | 0.4798 | 0.895 | 0.2218 | 0.3909 | 0.4018 | 0.3163 | 0.7097 | 0.9208 | 0.5205 | 0.6129 | 0.2186 | 0.4938 | 0.0227 | 0.4545 | 0.1435 | 0.2555 | 0.0013 | 0.1923 |
| 1.197 | 14.0 | 1162 | 1.2058 | 0.1626 | 0.2839 | 0.1641 | 0.1064 | 0.456 | 0.877 | 0.219 | 0.3737 | 0.3857 | 0.3036 | 0.6867 | 0.9125 | 0.5327 | 0.6247 | 0.1378 | 0.4125 | 0.0343 | 0.4455 | 0.1068 | 0.2152 | 0.0015 | 0.2308 |
| 1.197 | 15.0 | 1245 | 1.1169 | 0.1872 | 0.299 | 0.1926 | 0.1114 | 0.4861 | 0.8964 | 0.2142 | 0.3935 | 0.4056 | 0.3183 | 0.7157 | 0.9292 | 0.5492 | 0.6416 | 0.1994 | 0.4 | 0.023 | 0.5273 | 0.1631 | 0.2512 | 0.0015 | 0.2077 |
| 1.197 | 16.0 | 1328 | 1.1178 | 0.1917 | 0.3024 | 0.1915 | 0.1094 | 0.5089 | 0.8875 | 0.2112 | 0.3881 | 0.3943 | 0.3039 | 0.7433 | 0.9208 | 0.5703 | 0.6449 | 0.2019 | 0.425 | 0.0171 | 0.4909 | 0.1677 | 0.2415 | 0.0014 | 0.1692 |
| 1.197 | 17.0 | 1411 | 1.0844 | 0.1877 | 0.3046 | 0.1873 | 0.1144 | 0.5125 | 0.9234 | 0.2118 | 0.4051 | 0.4155 | 0.3145 | 0.7749 | 0.9417 | 0.569 | 0.6551 | 0.1696 | 0.4187 | 0.0266 | 0.4909 | 0.1722 | 0.2896 | 0.0013 | 0.2231 |
| 1.197 | 18.0 | 1494 | 1.0568 | 0.1993 | 0.3077 | 0.2146 | 0.1316 | 0.5029 | 0.9176 | 0.2257 | 0.405 | 0.4184 | 0.3395 | 0.7202 | 0.9458 | 0.5854 | 0.6702 | 0.18 | 0.4313 | 0.0283 | 0.4909 | 0.2013 | 0.2994 | 0.0017 | 0.2 |
| 0.9969 | 19.0 | 1577 | 1.0722 | 0.1971 | 0.3183 | 0.201 | 0.1323 | 0.4862 | 0.9307 | 0.2157 | 0.3928 | 0.406 | 0.3218 | 0.7345 | 0.95 | 0.5986 | 0.673 | 0.1767 | 0.475 | 0.0288 | 0.4364 | 0.1799 | 0.2841 | 0.0015 | 0.1615 |
| 0.9969 | 20.0 | 1660 | 1.0702 | 0.2098 | 0.3401 | 0.2093 | 0.135 | 0.5195 | 0.9179 | 0.2289 | 0.4078 | 0.4228 | 0.3344 | 0.7622 | 0.9417 | 0.5846 | 0.6629 | 0.2246 | 0.5 | 0.0318 | 0.4364 | 0.2059 | 0.2915 | 0.002 | 0.2231 |
| 0.9969 | 21.0 | 1743 | 1.0587 | 0.2084 | 0.3354 | 0.2009 | 0.1408 | 0.5043 | 0.9059 | 0.2252 | 0.4244 | 0.4351 | 0.3412 | 0.7655 | 0.9292 | 0.5983 | 0.677 | 0.1978 | 0.4875 | 0.0402 | 0.4455 | 0.2035 | 0.2963 | 0.0025 | 0.2692 |
| 0.9969 | 22.0 | 1826 | 1.0388 | 0.2072 | 0.324 | 0.2009 | 0.1457 | 0.5005 | 0.9025 | 0.2307 | 0.4229 | 0.4346 | 0.3444 | 0.7712 | 0.9292 | 0.5972 | 0.673 | 0.1531 | 0.4563 | 0.044 | 0.4636 | 0.239 | 0.3262 | 0.0027 | 0.2538 |
| 0.9969 | 23.0 | 1909 | 1.0332 | 0.209 | 0.3291 | 0.2113 | 0.1468 | 0.4986 | 0.9023 | 0.2379 | 0.4323 | 0.4391 | 0.3474 | 0.7586 | 0.9292 | 0.5936 | 0.6697 | 0.1648 | 0.4688 | 0.0452 | 0.4364 | 0.2382 | 0.336 | 0.0035 | 0.2846 |
| 0.9969 | 24.0 | 1992 | 1.0036 | 0.229 | 0.346 | 0.2347 | 0.1555 | 0.5326 | 0.9242 | 0.2418 | 0.4406 | 0.46 | 0.3714 | 0.7665 | 0.9458 | 0.6228 | 0.6983 | 0.2266 | 0.475 | 0.0378 | 0.4455 | 0.2546 | 0.3579 | 0.0033 | 0.3231 |
| 0.8821 | 25.0 | 2075 | 0.9946 | 0.2308 | 0.347 | 0.2345 | 0.16 | 0.5486 | 0.9167 | 0.2406 | 0.4457 | 0.4572 | 0.37 | 0.7728 | 0.9417 | 0.6219 | 0.6966 | 0.2397 | 0.4875 | 0.0363 | 0.4545 | 0.2532 | 0.3628 | 0.0032 | 0.2846 |
| 0.8821 | 26.0 | 2158 | 1.0003 | 0.2298 | 0.3428 | 0.2379 | 0.1521 | 0.5418 | 0.925 | 0.2481 | 0.4578 | 0.468 | 0.3785 | 0.7839 | 0.9458 | 0.6061 | 0.6843 | 0.2329 | 0.5063 | 0.0418 | 0.4727 | 0.2652 | 0.3768 | 0.0032 | 0.3 |
| 0.8821 | 27.0 | 2241 | 0.9896 | 0.2331 | 0.3446 | 0.234 | 0.1524 | 0.5515 | 0.9193 | 0.241 | 0.4543 | 0.4666 | 0.3777 | 0.7772 | 0.9417 | 0.6195 | 0.6938 | 0.2405 | 0.4938 | 0.0354 | 0.4455 | 0.2668 | 0.3848 | 0.0034 | 0.3154 |
| 0.8821 | 28.0 | 2324 | 0.9876 | 0.2329 | 0.3425 | 0.236 | 0.1597 | 0.5443 | 0.9217 | 0.2404 | 0.4529 | 0.4646 | 0.3746 | 0.776 | 0.9458 | 0.6215 | 0.6949 | 0.2299 | 0.4875 | 0.0407 | 0.4455 | 0.2693 | 0.3872 | 0.0033 | 0.3077 |
| 0.8821 | 29.0 | 2407 | 0.9894 | 0.2346 | 0.3462 | 0.2338 | 0.1591 | 0.5479 | 0.9194 | 0.24 | 0.4556 | 0.4662 | 0.3767 | 0.7772 | 0.9417 | 0.6212 | 0.6955 | 0.2374 | 0.4875 | 0.041 | 0.4455 | 0.27 | 0.3872 | 0.0033 | 0.3154 |
| 0.8821 | 30.0 | 2490 | 0.9886 | 0.2337 | 0.3446 | 0.233 | 0.1595 | 0.5443 | 0.9195 | 0.2413 | 0.4546 | 0.4658 | 0.3762 | 0.7772 | 0.9417 | 0.62 | 0.6933 | 0.2334 | 0.4938 | 0.0406 | 0.4364 | 0.2714 | 0.3902 | 0.0032 | 0.3154 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.4.1+cu118
- Datasets 3.5.1
- Tokenizers 0.21.1
| [
"automobile",
"bike",
"motorbike",
"traffic_light",
"traffic_sign"
] |
svetadomoi/rtdetr-v2-r34-cppe5-finetune-2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-v2-r34-cppe5-finetune-2
This model is a fine-tuned version of [PekingU/rtdetr_v2_r34vd](https://huggingface.co/PekingU/rtdetr_v2_r34vd) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 8.3132
- Map: 0.3055
- Map 50: 0.5372
- Map 75: 0.3008
- Map Small: 0.1094
- Map Medium: 0.241
- Map Large: 0.3888
- Mar 1: 0.2863
- Mar 10: 0.4999
- Mar 100: 0.578
- Mar Small: 0.3884
- Mar Medium: 0.4832
- Mar Large: 0.7099
- Map Coverall: 0.5596
- Mar 100 Coverall: 0.7275
- Map Face Shield: 0.2013
- Mar 100 Face Shield: 0.6342
- Map Gloves: 0.2742
- Mar 100 Gloves: 0.5192
- Map Goggles: 0.146
- Mar 100 Goggles: 0.4769
- Map Mask: 0.3462
- Mar 100 Mask: 0.5324
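The `Map`/`Mar` rows above are COCO-style detection metrics: `Map 50` counts a prediction as correct when its IoU with a ground-truth box is at least 0.5, `Map 75` at 0.75, plain `Map` averages over thresholds 0.50–0.95, and `Mar 100` is recall with up to 100 detections per image. A tiny sketch of the IoU matching underneath these numbers (the boxes are made up for illustration):

```python
# Toy IoU check underlying COCO-style mAP: a prediction counts as a true
# positive at a given threshold only if its IoU with a ground-truth box
# meets that threshold. Boxes are [x1, y1, x2, y2]; values are made up.
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

pred = [10, 10, 50, 50]   # predicted box
gt = [20, 10, 50, 50]     # ground-truth box
overlap = iou(pred, gt)
print(overlap)                           # 0.75
print(overlap >= 0.5, overlap >= 0.75)   # matches at both the 0.5 and 0.75 thresholds
```

The small/medium/large splits bucket objects by ground-truth area before computing the same averages, which is why sparse size buckets can sit near zero.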
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 10
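The linear schedule with 300 warmup steps above can be sketched with `get_scheduler` from 🤗 transformers; the toy model below and the 1070-step horizon (10 epochs × 107 steps, per the results table) are illustrative assumptions:

```python
import torch
from transformers import get_scheduler

# Toy model/optimizer just to illustrate the schedule shape; 1070 total steps
# (10 epochs x 107 steps per epoch) is an assumption taken from the table.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_scheduler(
    "linear",
    optimizer=optimizer,
    num_warmup_steps=300,
    num_training_steps=1070,
)

# The LR climbs linearly from 0 to 5e-5 over the first 300 steps, then decays to 0.
for _ in range(300):
    optimizer.step()
    scheduler.step()
print(scheduler.get_last_lr())  # peak LR at the end of warmup
```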
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 21.6622 | 0.0574 | 0.1029 | 0.0519 | 0.0001 | 0.0074 | 0.0702 | 0.0808 | 0.1764 | 0.2386 | 0.0255 | 0.1314 | 0.3738 | 0.263 | 0.6131 | 0.0098 | 0.1962 | 0.0056 | 0.1446 | 0.0002 | 0.0692 | 0.0085 | 0.1698 |
| No log | 2.0 | 214 | 12.3093 | 0.1473 | 0.2769 | 0.1347 | 0.04 | 0.108 | 0.2006 | 0.1941 | 0.3837 | 0.4464 | 0.1571 | 0.3411 | 0.6232 | 0.3946 | 0.6973 | 0.0871 | 0.438 | 0.067 | 0.392 | 0.0198 | 0.3308 | 0.1682 | 0.3742 |
| No log | 3.0 | 321 | 9.8181 | 0.2151 | 0.3903 | 0.2003 | 0.0862 | 0.1856 | 0.2692 | 0.2513 | 0.4746 | 0.5437 | 0.2899 | 0.4531 | 0.6932 | 0.4504 | 0.7221 | 0.1277 | 0.5734 | 0.1439 | 0.471 | 0.0493 | 0.4523 | 0.304 | 0.4996 |
| No log | 4.0 | 428 | 9.0262 | 0.2471 | 0.44 | 0.2372 | 0.0808 | 0.2084 | 0.3213 | 0.2685 | 0.4722 | 0.5494 | 0.2294 | 0.4611 | 0.7006 | 0.5037 | 0.7387 | 0.147 | 0.5823 | 0.2081 | 0.4938 | 0.0661 | 0.4338 | 0.3107 | 0.4982 |
| 27.2158 | 5.0 | 535 | 8.6126 | 0.276 | 0.4857 | 0.2769 | 0.104 | 0.2124 | 0.3508 | 0.2741 | 0.4908 | 0.5667 | 0.3254 | 0.4704 | 0.6999 | 0.5335 | 0.7261 | 0.156 | 0.6139 | 0.2473 | 0.5156 | 0.0958 | 0.4523 | 0.3474 | 0.5253 |
| 27.2158 | 6.0 | 642 | 8.4669 | 0.2826 | 0.5022 | 0.2795 | 0.095 | 0.2191 | 0.3575 | 0.2766 | 0.4948 | 0.5704 | 0.3249 | 0.461 | 0.7107 | 0.5455 | 0.7356 | 0.1524 | 0.6215 | 0.2571 | 0.5125 | 0.1208 | 0.4646 | 0.3372 | 0.5178 |
| 27.2158 | 7.0 | 749 | 8.3188 | 0.3003 | 0.5202 | 0.2958 | 0.1107 | 0.2348 | 0.3887 | 0.2879 | 0.5044 | 0.5766 | 0.3707 | 0.4823 | 0.7142 | 0.5545 | 0.7288 | 0.2019 | 0.6329 | 0.2655 | 0.5263 | 0.1341 | 0.4631 | 0.3453 | 0.532 |
| 27.2158 | 8.0 | 856 | 8.3084 | 0.2972 | 0.5265 | 0.2912 | 0.107 | 0.2409 | 0.3732 | 0.2811 | 0.5029 | 0.5811 | 0.3453 | 0.4772 | 0.7138 | 0.5617 | 0.7351 | 0.1649 | 0.6367 | 0.2773 | 0.5192 | 0.1337 | 0.4769 | 0.3485 | 0.5378 |
| 27.2158 | 9.0 | 963 | 8.2764 | 0.3064 | 0.5313 | 0.3049 | 0.1068 | 0.2421 | 0.3871 | 0.284 | 0.5073 | 0.5802 | 0.373 | 0.4783 | 0.7108 | 0.5621 | 0.7333 | 0.182 | 0.6291 | 0.2765 | 0.5299 | 0.1631 | 0.4738 | 0.3483 | 0.5347 |
| 11.8005 | 10.0 | 1070 | 8.3132 | 0.3055 | 0.5372 | 0.3008 | 0.1094 | 0.241 | 0.3888 | 0.2863 | 0.4999 | 0.578 | 0.3884 | 0.4832 | 0.7099 | 0.5596 | 0.7275 | 0.2013 | 0.6342 | 0.2742 | 0.5192 | 0.146 | 0.4769 | 0.3462 | 0.5324 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
svetadomoi/rtdetr-v2-r18-cppe5-finetune-2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-v2-r18-cppe5-finetune-2
This model is a fine-tuned version of [PekingU/rtdetr_v2_r18vd](https://huggingface.co/PekingU/rtdetr_v2_r18vd) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 6.2464
- Map: 0.391
- Map 50: 0.5884
- Map 75: 0.4136
- Map Small: 0.1258
- Map Medium: 0.2954
- Map Large: 0.54
- Mar 1: 0.3316
- Mar 10: 0.6539
- Mar 100: 0.7039
- Mar Small: 0.2625
- Mar Medium: 0.6011
- Mar Large: 0.8306
- Map Coverall: 0.5645
- Mar 100 Coverall: 0.8333
- Map Face Shield: 0.2243
- Mar 100 Face Shield: 0.7118
- Map Gloves: 0.3913
- Mar 100 Gloves: 0.6458
- Map Goggles: 0.2728
- Mar 100 Goggles: 0.6069
- Map Mask: 0.5023
- Mar 100 Mask: 0.7216
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 13.7317 | 0.0582 | 0.1117 | 0.0465 | 0.0008 | 0.0241 | 0.0634 | 0.0748 | 0.167 | 0.2365 | 0.0744 | 0.1806 | 0.3026 | 0.2735 | 0.5595 | 0.0006 | 0.1063 | 0.0041 | 0.2277 | 0.0005 | 0.0754 | 0.0122 | 0.2138 |
| No log | 2.0 | 214 | 9.1284 | 0.1153 | 0.2246 | 0.1016 | 0.0266 | 0.0796 | 0.1376 | 0.1715 | 0.3715 | 0.4531 | 0.1995 | 0.3849 | 0.5849 | 0.3802 | 0.6851 | 0.013 | 0.4291 | 0.0516 | 0.3871 | 0.0213 | 0.2923 | 0.1104 | 0.472 |
| No log | 3.0 | 321 | 7.9830 | 0.1638 | 0.3049 | 0.1557 | 0.0511 | 0.1136 | 0.2051 | 0.2104 | 0.4223 | 0.5024 | 0.199 | 0.4148 | 0.6338 | 0.4245 | 0.7185 | 0.0237 | 0.4911 | 0.0831 | 0.4232 | 0.0875 | 0.3354 | 0.2 | 0.5436 |
| No log | 4.0 | 428 | 7.6252 | 0.2065 | 0.3634 | 0.2029 | 0.1038 | 0.1285 | 0.2666 | 0.2349 | 0.4338 | 0.5114 | 0.2823 | 0.4169 | 0.6348 | 0.5146 | 0.7284 | 0.032 | 0.5025 | 0.1197 | 0.4295 | 0.1068 | 0.3462 | 0.2594 | 0.5507 |
| 19.8442 | 5.0 | 535 | 7.3826 | 0.2303 | 0.3983 | 0.2243 | 0.0944 | 0.1554 | 0.3224 | 0.254 | 0.4599 | 0.5318 | 0.2796 | 0.4541 | 0.6628 | 0.5438 | 0.7239 | 0.0556 | 0.5519 | 0.1415 | 0.4437 | 0.154 | 0.3954 | 0.2563 | 0.544 |
| 19.8442 | 6.0 | 642 | 7.2892 | 0.2359 | 0.4115 | 0.2391 | 0.084 | 0.1601 | 0.3395 | 0.2517 | 0.4673 | 0.5366 | 0.2873 | 0.4483 | 0.6742 | 0.5377 | 0.7311 | 0.0545 | 0.5684 | 0.1443 | 0.4464 | 0.1482 | 0.3923 | 0.2948 | 0.5449 |
| 19.8442 | 7.0 | 749 | 7.1910 | 0.2478 | 0.4306 | 0.2442 | 0.0709 | 0.1583 | 0.3803 | 0.2556 | 0.4735 | 0.5404 | 0.3185 | 0.4515 | 0.6852 | 0.5472 | 0.7275 | 0.065 | 0.5506 | 0.1771 | 0.4665 | 0.1536 | 0.3985 | 0.2962 | 0.5591 |
| 19.8442 | 8.0 | 856 | 7.1982 | 0.255 | 0.4381 | 0.2561 | 0.0743 | 0.1666 | 0.3783 | 0.2673 | 0.4773 | 0.5454 | 0.2991 | 0.4583 | 0.6846 | 0.5432 | 0.7315 | 0.0775 | 0.5544 | 0.1728 | 0.4714 | 0.1789 | 0.4138 | 0.3028 | 0.5556 |
| 19.8442 | 9.0 | 963 | 7.1636 | 0.2549 | 0.4427 | 0.2567 | 0.0713 | 0.1779 | 0.3697 | 0.2688 | 0.4821 | 0.5511 | 0.3067 | 0.4679 | 0.6859 | 0.5414 | 0.7252 | 0.0722 | 0.5646 | 0.1728 | 0.4732 | 0.1802 | 0.4338 | 0.308 | 0.5587 |
| 10.3959 | 10.0 | 1070 | 7.1785 | 0.247 | 0.4264 | 0.2567 | 0.0553 | 0.1709 | 0.3652 | 0.269 | 0.4752 | 0.5467 | 0.2664 | 0.4633 | 0.6939 | 0.5355 | 0.7342 | 0.069 | 0.5835 | 0.1719 | 0.4625 | 0.1548 | 0.3923 | 0.3037 | 0.5609 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
berng/detr-resnet-50-hardhat-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-hardhat-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on the anindya64/hardhat dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 1000
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Tokenizers 0.21.1
| [
"head",
"helmet",
"person"
] |
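A label list like the one above is what ends up in a detection config's `id2label`/`label2id` mappings. A small sketch of building them, using this card's three classes:

```python
# Build the id2label / label2id mappings a detection config expects
# from a label list like the one above.
labels = ["head", "helmet", "person"]
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in id2label.items()}

print(id2label[1])         # helmet
print(label2id["person"])  # 2
```

When fine-tuning, these mappings are typically passed to `from_pretrained(..., id2label=id2label, label2id=label2id, ignore_mismatched_sizes=True)` so the classification head is resized to the new classes.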
hemanthgaddey/detr-custom-deepmoon |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"street sign",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"hat",
"backpack",
"umbrella",
"shoe",
"eye glasses",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"plate",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"mirror",
"dining table",
"window",
"desk",
"toilet",
"door",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"blender",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
Rajerswari/detr-finetuned-cppe-5-10k-steps |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-finetuned-cppe-5-10k-steps
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6238
- Map: 0.1619
- Map 50: 0.3226
- Map 75: 0.1429
- Map Small: 0.0501
- Map Medium: 0.129
- Map Large: 0.227
- Mar 1: 0.1773
- Mar 10: 0.3167
- Mar 100: 0.3392
- Mar Small: 0.128
- Mar Medium: 0.2626
- Mar Large: 0.4711
- Map Coverall: 0.4278
- Mar 100 Coverall: 0.6532
- Map Face Shield: 0.1078
- Mar 100 Face Shield: 0.2937
- Map Gloves: 0.0679
- Mar 100 Gloves: 0.2991
- Map Goggles: 0.0102
- Mar 100 Goggles: 0.0985
- Map Mask: 0.1959
- Mar 100 Mask: 0.3516
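The Map 50 and Map 75 figures above are COCO-style average precision at IoU thresholds of 0.5 and 0.75. As an illustration (not part of this card's evaluation code), a minimal IoU computation shows how the threshold decides whether a predicted box counts as a match:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes in (x_min, y_min, x_max, y_max) format."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A slightly offset prediction: it matches at IoU >= 0.5 but not at IoU >= 0.75,
# so it contributes to Map 50 yet counts as a miss for Map 75.
pred, gt = (10, 10, 50, 50), (15, 15, 55, 55)
score = iou(pred, gt)
print(round(score, 3))
```

The box coordinates here are made up for the example; the actual evaluation uses the full COCO matching procedure over all predictions and ground truths.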
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10.0
- mixed_precision_training: Native AMP
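The optimizer line above refers to torch's AdamW with betas=(0.9, 0.999) and epsilon=1e-08. As a rough pure-Python sketch (not the actual training loop; the weight_decay value below is torch's default and is an assumption, since the card does not state it), a single AdamW update on a scalar parameter looks like:

```python
def adamw_step(param, grad, m, v, t, lr=5e-5, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=0.01):
    """One decoupled-weight-decay Adam update on a scalar parameter."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    param -= lr * weight_decay * param          # decoupled weight decay
    param -= lr * m_hat / (v_hat ** 0.5 + eps)  # Adam step
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adamw_step(p, grad=0.5, m=m, v=v, t=1)
print(p)
```

In training this update is applied element-wise to every parameter tensor, with `m` and `v` kept as per-parameter optimizer state.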
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Coverall | Map Face Shield | Map Gloves | Map Goggles | Map Mask | Map Large | Map Medium | Map Small | Mar 1 | Mar 10 | Mar 100 | Mar 100 Coverall | Mar 100 Face Shield | Mar 100 Gloves | Mar 100 Goggles | Mar 100 Mask | Mar Large | Mar Medium | Mar Small |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:------------:|:---------------:|:----------:|:-----------:|:--------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:----------------:|:-------------------:|:--------------:|:---------------:|:------------:|:---------:|:----------:|:---------:|
| 2.5499 | 1.0 | 213 | 2.3248 | 0.0367 | 0.0798 | 0.0325 | 0.1697 | 0.0 | 0.0078 | 0.0 | 0.0058 | 0.0417 | 0.0114 | 0.0018 | 0.0555 | 0.1278 | 0.1648 | 0.5104 | 0.0 | 0.1813 | 0.0 | 0.1324 | 0.2026 | 0.1044 | 0.0416 |
| 2.1119 | 2.0 | 426 | 2.0867 | 0.0493 | 0.1064 | 0.0379 | 0.1978 | 0.0 | 0.0253 | 0.0 | 0.0236 | 0.0574 | 0.0304 | 0.0072 | 0.0805 | 0.1667 | 0.2069 | 0.5874 | 0.0 | 0.204 | 0.0 | 0.2431 | 0.2325 | 0.1572 | 0.073 |
| 2.0052 | 3.0 | 639 | 2.1689 | 0.0563 | 0.1279 | 0.0441 | 0.0154 | 0.0414 | 0.0688 | 0.0818 | 0.1611 | 0.1793 | 0.0616 | 0.131 | 0.2199 | 0.2039 | 0.4757 | 0.0 | 0.0 | 0.0158 | 0.1942 | 0.0 | 0.0 | 0.0618 | 0.2267 |
| 1.9373 | 4.0 | 852 | 1.9264 | 0.0813 | 0.1755 | 0.0679 | 0.0125 | 0.056 | 0.0953 | 0.0916 | 0.1816 | 0.206 | 0.0662 | 0.1446 | 0.2549 | 0.3464 | 0.6302 | 0.0 | 0.0 | 0.0245 | 0.2004 | 0.0 | 0.0 | 0.0357 | 0.1996 |
| 1.8396 | 5.0 | 1065 | 1.8418 | 0.0958 | 0.208 | 0.0731 | 0.0194 | 0.0751 | 0.1163 | 0.116 | 0.2195 | 0.2399 | 0.095 | 0.1802 | 0.2939 | 0.3278 | 0.6054 | 0.0228 | 0.1127 | 0.031 | 0.2071 | 0.0 | 0.0 | 0.0975 | 0.2742 |
| 1.7659 | 6.0 | 1278 | 1.8737 | 0.1004 | 0.2399 | 0.0791 | 0.0377 | 0.0913 | 0.1207 | 0.1226 | 0.2204 | 0.2382 | 0.0808 | 0.1912 | 0.295 | 0.2906 | 0.5856 | 0.0439 | 0.1165 | 0.0497 | 0.2281 | 0.0 | 0.0 | 0.1178 | 0.2609 |
| 1.6415 | 7.0 | 1491 | 1.7200 | 0.1305 | 0.2822 | 0.105 | 0.044 | 0.1026 | 0.172 | 0.1505 | 0.2689 | 0.2891 | 0.1047 | 0.2239 | 0.386 | 0.3916 | 0.6257 | 0.0516 | 0.1962 | 0.0526 | 0.2554 | 0.0059 | 0.0523 | 0.1508 | 0.316 |
| 1.6405 | 8.0 | 1704 | 1.6820 | 0.1411 | 0.2866 | 0.1303 | 0.0542 | 0.1162 | 0.1854 | 0.1572 | 0.2867 | 0.3088 | 0.1204 | 0.2432 | 0.4042 | 0.4147 | 0.6347 | 0.0549 | 0.2468 | 0.0592 | 0.271 | 0.0053 | 0.0631 | 0.1713 | 0.3284 |
| 1.5513 | 9.0 | 1917 | 1.6380 | 0.1546 | 0.3144 | 0.1307 | 0.0569 | 0.1215 | 0.2131 | 0.17 | 0.3014 | 0.323 | 0.112 | 0.2563 | 0.4431 | 0.4352 | 0.6414 | 0.0804 | 0.2544 | 0.0657 | 0.2871 | 0.0062 | 0.0846 | 0.1853 | 0.3476 |
| 1.5564 | 10.0 | 2130 | 1.6238 | 0.1619 | 0.3226 | 0.1429 | 0.0501 | 0.129 | 0.227 | 0.1773 | 0.3167 | 0.3392 | 0.128 | 0.2626 | 0.4711 | 0.4278 | 0.6532 | 0.1078 | 0.2937 | 0.0679 | 0.2991 | 0.0102 | 0.0985 | 0.1959 | 0.3516 |
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.7.0+cu118
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
davanstrien/dfine-test |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dfine-test
This model is a fine-tuned version of [ustc-community/dfine-small-coco](https://huggingface.co/ustc-community/dfine-small-coco) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0724
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 10
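The cosine scheduler listed above decays the learning rate from 0.0002 toward zero over the run. A small sketch of that shape (an illustration, not the exact scheduler implementation; the 7120 total steps are taken from the results table below, 10 epochs of 712 steps):

```python
import math

def cosine_lr(step, base_lr=2e-4, total_steps=7120):
    """Cosine decay from base_lr to zero over the full run (no warmup)."""
    progress = min(step / total_steps, 1.0)
    return 0.5 * base_lr * (1 + math.cos(math.pi * progress))

print(cosine_lr(0))           # start: full learning rate
print(cosine_lr(7120 // 2))   # midpoint: half the learning rate
print(cosine_lr(7120))        # end: zero
```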
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 20.1936 | 1.0 | 712 | 2.6821 |
| 19.1183 | 2.0 | 1424 | 2.3533 |
| 16.1951 | 3.0 | 2136 | 1.7233 |
| 14.4265 | 4.0 | 2848 | 1.9414 |
| 14.3184 | 5.0 | 3560 | 2.3461 |
| 14.5909 | 6.0 | 4272 | 2.0509 |
| 14.6728 | 7.0 | 4984 | 2.0568 |
| 14.3686 | 8.0 | 5696 | 2.1048 |
| 14.6144 | 9.0 | 6408 | 2.0980 |
| 14.4749 | 10.0 | 7120 | 2.0721 |
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"0",
"1",
"2",
"3",
"4",
"5",
"6"
] |
BjngChjjljng/detr-fold-0 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4"
] |
BjngChjjljng/DETR-fold0-50epoch |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4"
] |
BjngChjjljng/DETR-fisheye-combine-10epoch |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4"
] |
BjngChjjljng/DETR-fisheye-combine-40epoch |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4"
] |
zeeshanmle3945/d-fine-m-cppe5-finetune-2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# d-fine-m-cppe5-finetune-2
This model is a fine-tuned version of [ustc-community/dfine-large-coco](https://huggingface.co/ustc-community/dfine-large-coco) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2336
- Map: 0.4365
- Map 50: 0.7271
- Map 75: 0.4035
- Map Small: 0.3811
- Map Medium: 0.5725
- Map Large: -1.0
- Mar 1: 0.3431
- Mar 10: 0.634
- Mar 100: 0.6851
- Mar Small: 0.6263
- Mar Medium: 0.7875
- Mar Large: -1.0
- Map Damaged: 0.4225
- Mar 100 Damaged: 0.6429
- Map Normal: 0.4505
- Mar 100 Normal: 0.7273
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 50
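Note that with 7 optimizer steps per epoch and 50 epochs (350 steps total, as the step column below confirms), the 300 warmup steps cover most of training: the learning rate ramps up to 5e-05 over the first 300 steps and then decays linearly to zero over the last 50. A small sketch of that schedule (following the usual linear-warmup-then-linear-decay rule):

```python
def linear_schedule_with_warmup(step, warmup_steps, total_steps, base_lr):
    """Learning rate under linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# 7 steps/epoch x 50 epochs = 350 total steps, 300 of them warmup:
print(linear_schedule_with_warmup(150, 300, 350, 5e-5))  # 2.5e-05 (mid-warmup)
print(linear_schedule_with_warmup(300, 300, 350, 5e-5))  # 5e-05   (peak)
print(linear_schedule_with_warmup(350, 300, 350, 5e-5))  # 0.0     (end of training)
```

This means the model trains at or near the peak learning rate only briefly, which can be worth keeping in mind when comparing against runs with shorter warmup.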
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Damaged | Mar 100 Damaged | Map Normal | Mar 100 Normal |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:----------:|:--------------:|
| No log | 1.0 | 7 | 5.4287 | 0.0334 | 0.0549 | 0.0398 | 0.0576 | 0.1954 | -1.0 | 0.0353 | 0.1007 | 0.2054 | 0.197 | 0.4833 | -1.0 | 0.007 | 0.1667 | 0.0599 | 0.2441 |
| No log | 2.0 | 14 | 5.0129 | 0.0347 | 0.0729 | 0.029 | 0.0736 | 0.1519 | -1.0 | 0.0221 | 0.1966 | 0.2662 | 0.252 | 0.5667 | -1.0 | 0.0103 | 0.2 | 0.059 | 0.3324 |
| No log | 3.0 | 21 | 4.8564 | 0.0246 | 0.075 | 0.0088 | 0.0574 | 0.0531 | -1.0 | 0.0 | 0.1855 | 0.287 | 0.2722 | 0.5667 | -1.0 | 0.0159 | 0.1917 | 0.0332 | 0.3824 |
| No log | 4.0 | 28 | 4.6438 | 0.0315 | 0.0733 | 0.0145 | 0.068 | 0.1254 | -1.0 | 0.0059 | 0.185 | 0.3049 | 0.2914 | 0.5833 | -1.0 | 0.0231 | 0.2333 | 0.04 | 0.3765 |
| No log | 5.0 | 35 | 4.4245 | 0.0365 | 0.0877 | 0.0246 | 0.0865 | 0.1342 | -1.0 | 0.0059 | 0.2137 | 0.3287 | 0.3015 | 0.6 | -1.0 | 0.0259 | 0.275 | 0.047 | 0.3824 |
| No log | 6.0 | 42 | 4.1167 | 0.0211 | 0.0422 | 0.0233 | 0.0641 | 0.1162 | -1.0 | 0.0118 | 0.1186 | 0.2272 | 0.2035 | 0.4667 | -1.0 | 0.0091 | 0.125 | 0.0332 | 0.3294 |
| No log | 7.0 | 49 | 3.8273 | 0.0253 | 0.0483 | 0.0302 | 0.0678 | 0.2761 | -1.0 | 0.0162 | 0.1586 | 0.2554 | 0.2116 | 0.6167 | -1.0 | 0.019 | 0.2167 | 0.0317 | 0.2941 |
| No log | 8.0 | 56 | 3.6119 | 0.0182 | 0.0389 | 0.0154 | 0.0481 | 0.2772 | -1.0 | 0.0059 | 0.1304 | 0.2581 | 0.1879 | 0.7 | -1.0 | 0.0181 | 0.225 | 0.0183 | 0.2912 |
| No log | 9.0 | 63 | 3.5401 | 0.0295 | 0.0587 | 0.0194 | 0.092 | 0.1699 | -1.0 | 0.0 | 0.1985 | 0.3576 | 0.304 | 0.6667 | -1.0 | 0.0289 | 0.3417 | 0.0302 | 0.3735 |
| No log | 10.0 | 70 | 3.0790 | 0.0426 | 0.0788 | 0.044 | 0.1145 | 0.1245 | -1.0 | 0.0147 | 0.1782 | 0.3576 | 0.3354 | 0.55 | -1.0 | 0.0236 | 0.2917 | 0.0615 | 0.4235 |
| No log | 11.0 | 77 | 2.8301 | 0.048 | 0.0919 | 0.0449 | 0.1321 | 0.0732 | -1.0 | 0.0147 | 0.151 | 0.3902 | 0.3419 | 0.65 | -1.0 | 0.0243 | 0.3333 | 0.0716 | 0.4471 |
| No log | 12.0 | 84 | 2.5556 | 0.059 | 0.0977 | 0.0706 | 0.1338 | 0.0886 | -1.0 | 0.0235 | 0.2637 | 0.4463 | 0.3975 | 0.7667 | -1.0 | 0.0279 | 0.375 | 0.0902 | 0.5176 |
| No log | 13.0 | 91 | 2.2365 | 0.0856 | 0.1322 | 0.1125 | 0.2129 | 0.0788 | -1.0 | 0.0235 | 0.2042 | 0.4811 | 0.452 | 0.65 | -1.0 | 0.0152 | 0.3417 | 0.156 | 0.6206 |
| No log | 14.0 | 98 | 2.1760 | 0.0781 | 0.1325 | 0.0952 | 0.1923 | 0.054 | -1.0 | 0.0368 | 0.1865 | 0.4566 | 0.4278 | 0.6167 | -1.0 | 0.0089 | 0.275 | 0.1473 | 0.6382 |
| No log | 15.0 | 105 | 2.0960 | 0.0874 | 0.1516 | 0.1002 | 0.1648 | 0.0212 | -1.0 | 0.0353 | 0.1662 | 0.4287 | 0.4359 | 0.4833 | -1.0 | 0.0056 | 0.225 | 0.1691 | 0.6324 |
| No log | 16.0 | 112 | 2.1255 | 0.1149 | 0.1903 | 0.1325 | 0.2015 | 0.1092 | -1.0 | 0.0235 | 0.2456 | 0.4581 | 0.4162 | 0.6333 | -1.0 | 0.0082 | 0.225 | 0.2216 | 0.6912 |
| No log | 17.0 | 119 | 2.1264 | 0.1148 | 0.1904 | 0.1358 | 0.2085 | 0.181 | -1.0 | 0.0088 | 0.327 | 0.5108 | 0.4535 | 0.7333 | -1.0 | 0.0193 | 0.3333 | 0.2103 | 0.6882 |
| No log | 18.0 | 126 | 1.9834 | 0.1364 | 0.2214 | 0.1578 | 0.2227 | 0.2601 | -1.0 | 0.0 | 0.3054 | 0.5801 | 0.5889 | 0.5833 | -1.0 | 0.0268 | 0.425 | 0.2459 | 0.7353 |
| No log | 19.0 | 133 | 1.8985 | 0.1459 | 0.2374 | 0.1391 | 0.199 | 0.0642 | -1.0 | 0.0412 | 0.2306 | 0.4444 | 0.4359 | 0.5167 | -1.0 | 0.0063 | 0.1917 | 0.2855 | 0.6971 |
| No log | 20.0 | 140 | 1.8476 | 0.2201 | 0.3492 | 0.2357 | 0.2643 | 0.1465 | -1.0 | 0.0676 | 0.3846 | 0.5882 | 0.5788 | 0.65 | -1.0 | 0.0208 | 0.45 | 0.4194 | 0.7265 |
| No log | 21.0 | 147 | 2.0165 | 0.1769 | 0.29 | 0.1981 | 0.2495 | 0.4362 | -1.0 | 0.0191 | 0.3657 | 0.5135 | 0.5 | 0.5833 | -1.0 | 0.0126 | 0.2917 | 0.3412 | 0.7353 |
| No log | 22.0 | 154 | 1.8840 | 0.181 | 0.2791 | 0.2002 | 0.244 | 0.2688 | -1.0 | 0.0324 | 0.3926 | 0.6179 | 0.5934 | 0.7167 | -1.0 | 0.0228 | 0.4917 | 0.3393 | 0.7441 |
| No log | 23.0 | 161 | 1.7835 | 0.2615 | 0.4098 | 0.2719 | 0.2779 | 0.4627 | -1.0 | 0.0632 | 0.4044 | 0.5958 | 0.5576 | 0.7333 | -1.0 | 0.0269 | 0.4417 | 0.4961 | 0.75 |
| No log | 24.0 | 168 | 1.6965 | 0.2772 | 0.4489 | 0.3035 | 0.2761 | 0.4897 | -1.0 | 0.075 | 0.449 | 0.6529 | 0.6495 | 0.6833 | -1.0 | 0.0358 | 0.55 | 0.5186 | 0.7559 |
| No log | 25.0 | 175 | 1.8051 | 0.2083 | 0.327 | 0.2425 | 0.2625 | 0.5626 | -1.0 | 0.0397 | 0.4277 | 0.6627 | 0.6551 | 0.75 | -1.0 | 0.041 | 0.5667 | 0.3757 | 0.7588 |
| No log | 26.0 | 182 | 1.7892 | 0.2495 | 0.3881 | 0.268 | 0.2711 | 0.2367 | -1.0 | 0.0779 | 0.4804 | 0.6654 | 0.648 | 0.7833 | -1.0 | 0.0487 | 0.575 | 0.4502 | 0.7559 |
| No log | 27.0 | 189 | 1.6875 | 0.2152 | 0.3626 | 0.2407 | 0.2245 | 0.4162 | -1.0 | 0.0721 | 0.4228 | 0.6699 | 0.6596 | 0.7167 | -1.0 | 0.0401 | 0.575 | 0.3902 | 0.7647 |
| No log | 28.0 | 196 | 1.5021 | 0.2585 | 0.4091 | 0.2992 | 0.2602 | 0.2923 | -1.0 | 0.0618 | 0.4696 | 0.7314 | 0.7283 | 0.75 | -1.0 | 0.0519 | 0.6833 | 0.4652 | 0.7794 |
| No log | 29.0 | 203 | 1.6260 | 0.2337 | 0.3626 | 0.2731 | 0.2383 | 0.2293 | -1.0 | 0.0662 | 0.5145 | 0.6517 | 0.6192 | 0.7667 | -1.0 | 0.0476 | 0.5417 | 0.4198 | 0.7618 |
| No log | 30.0 | 210 | 1.5923 | 0.2584 | 0.4095 | 0.2863 | 0.2534 | 0.6228 | -1.0 | 0.075 | 0.513 | 0.6667 | 0.6449 | 0.8 | -1.0 | 0.054 | 0.5833 | 0.4628 | 0.75 |
| No log | 31.0 | 217 | 1.3863 | 0.3067 | 0.4785 | 0.3098 | 0.3137 | 0.549 | -1.0 | 0.114 | 0.4971 | 0.7287 | 0.7283 | 0.7833 | -1.0 | 0.0965 | 0.675 | 0.5168 | 0.7824 |
| No log | 32.0 | 224 | 1.3657 | 0.3012 | 0.469 | 0.3407 | 0.2912 | 0.6131 | -1.0 | 0.1169 | 0.5507 | 0.7358 | 0.7369 | 0.7833 | -1.0 | 0.1156 | 0.6833 | 0.4867 | 0.7882 |
| No log | 33.0 | 231 | 1.5260 | 0.2572 | 0.4584 | 0.2362 | 0.2675 | 0.3052 | -1.0 | 0.1025 | 0.5515 | 0.6934 | 0.6864 | 0.8 | -1.0 | 0.107 | 0.675 | 0.4075 | 0.7118 |
| No log | 34.0 | 238 | 1.4021 | 0.2973 | 0.4746 | 0.3055 | 0.3356 | 0.5159 | -1.0 | 0.1213 | 0.5282 | 0.7201 | 0.7237 | 0.7667 | -1.0 | 0.1615 | 0.6667 | 0.4331 | 0.7735 |
| No log | 35.0 | 245 | 1.3941 | 0.2873 | 0.457 | 0.3431 | 0.2672 | 0.4816 | -1.0 | 0.0995 | 0.5414 | 0.7025 | 0.6944 | 0.8 | -1.0 | 0.1216 | 0.6667 | 0.453 | 0.7382 |
| No log | 36.0 | 252 | 1.3307 | 0.3157 | 0.5048 | 0.3613 | 0.3135 | 0.5915 | -1.0 | 0.1257 | 0.4904 | 0.7066 | 0.7071 | 0.7333 | -1.0 | 0.1968 | 0.675 | 0.4346 | 0.7382 |
| No log | 37.0 | 259 | 1.5590 | 0.291 | 0.4629 | 0.2967 | 0.2851 | 0.5128 | -1.0 | 0.0993 | 0.5578 | 0.7164 | 0.7197 | 0.7333 | -1.0 | 0.1639 | 0.6917 | 0.4181 | 0.7412 |
| No log | 38.0 | 266 | 1.3372 | 0.3179 | 0.5071 | 0.3317 | 0.2999 | 0.342 | -1.0 | 0.1159 | 0.5373 | 0.7007 | 0.6955 | 0.75 | -1.0 | 0.2407 | 0.675 | 0.3951 | 0.7265 |
| No log | 39.0 | 273 | 1.4323 | 0.3263 | 0.5236 | 0.3459 | 0.2973 | 0.4375 | -1.0 | 0.1488 | 0.5596 | 0.688 | 0.6838 | 0.7833 | -1.0 | 0.2593 | 0.6583 | 0.3932 | 0.7176 |
| No log | 40.0 | 280 | 1.3842 | 0.3433 | 0.5479 | 0.3753 | 0.3573 | 0.4017 | -1.0 | 0.1211 | 0.6066 | 0.6953 | 0.6929 | 0.7333 | -1.0 | 0.2 | 0.6583 | 0.4867 | 0.7324 |
| No log | 41.0 | 287 | 1.3547 | 0.3622 | 0.5522 | 0.4525 | 0.3451 | 0.3479 | -1.0 | 0.0963 | 0.6473 | 0.7221 | 0.7212 | 0.75 | -1.0 | 0.2097 | 0.7 | 0.5147 | 0.7441 |
| No log | 42.0 | 294 | 1.3390 | 0.3382 | 0.548 | 0.3632 | 0.3625 | 0.3271 | -1.0 | 0.1125 | 0.5838 | 0.7206 | 0.7197 | 0.75 | -1.0 | 0.2777 | 0.7 | 0.3987 | 0.7412 |
| No log | 43.0 | 301 | 1.3209 | 0.4168 | 0.6584 | 0.4652 | 0.4005 | 0.3415 | -1.0 | 0.1662 | 0.6076 | 0.7096 | 0.7045 | 0.75 | -1.0 | 0.3673 | 0.675 | 0.4664 | 0.7441 |
| No log | 44.0 | 308 | 1.3387 | 0.4075 | 0.662 | 0.4451 | 0.4079 | 0.4524 | -1.0 | 0.1794 | 0.6054 | 0.6877 | 0.6793 | 0.8 | -1.0 | 0.4433 | 0.6667 | 0.3716 | 0.7088 |
| No log | 45.0 | 315 | 1.2775 | 0.4181 | 0.6525 | 0.4332 | 0.402 | 0.4182 | -1.0 | 0.1466 | 0.65 | 0.7083 | 0.7061 | 0.7833 | -1.0 | 0.3603 | 0.6667 | 0.4759 | 0.75 |
| No log | 46.0 | 322 | 1.3260 | 0.418 | 0.6461 | 0.4934 | 0.3563 | 0.734 | -1.0 | 0.1757 | 0.6404 | 0.7191 | 0.7071 | 0.7833 | -1.0 | 0.3163 | 0.7 | 0.5197 | 0.7382 |
| No log | 47.0 | 329 | 1.3401 | 0.4192 | 0.6723 | 0.478 | 0.3677 | 0.5333 | -1.0 | 0.1549 | 0.6466 | 0.6919 | 0.6864 | 0.75 | -1.0 | 0.3797 | 0.675 | 0.4586 | 0.7088 |
| No log | 48.0 | 336 | 1.2702 | 0.4219 | 0.659 | 0.4297 | 0.3928 | 0.4873 | -1.0 | 0.1551 | 0.6483 | 0.725 | 0.7131 | 0.7833 | -1.0 | 0.368 | 0.7 | 0.4759 | 0.75 |
| No log | 49.0 | 343 | 1.2773 | 0.411 | 0.6646 | 0.4301 | 0.381 | 0.6014 | -1.0 | 0.1507 | 0.6203 | 0.723 | 0.7025 | 0.8167 | -1.0 | 0.3831 | 0.7167 | 0.439 | 0.7294 |
| No log | 50.0 | 350 | 1.2819 | 0.3927 | 0.6282 | 0.4219 | 0.3538 | 0.6117 | -1.0 | 0.1551 | 0.6103 | 0.7103 | 0.7035 | 0.7667 | -1.0 | 0.361 | 0.7 | 0.4243 | 0.7206 |
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"damaged",
"normal"
] |
urretxo/yolos-tiny-raccoon-detector |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolos-tiny-raccoon-detector
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.1
| [
"raccoon",
"person",
"skunk"
] |
deon03/deter-IDD20k |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"traffic light",
"person",
"train",
"rectification border",
"autorickshaw",
"vehicle fallback",
"polegroup",
"caravan",
"truck",
"out of roi",
"wall",
"bicycle",
"curb",
"traffic sign",
"drivable fallback",
"parking",
"ego vehicle",
"pole",
"fallback background",
"fence",
"car",
"obs-str-bar-fallback",
"vegetation",
"bridge",
"sidewalk",
"unlabeled",
"animal",
"motorcycle",
"rail track",
"tunnel",
"ground",
"rider",
"road",
"sky",
"trailer",
"non-drivable fallback",
"billboard",
"bus",
"building",
"guard rail",
"license plate"
] |
Nihel13/aii |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"table",
"table rotated"
] |
Nihel13/aii2 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"table",
"table rotated"
] |
Nihel13/aii3 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"table",
"table rotated"
] |
ajithkb/detr-finetuned-cppe-5-10k-steps |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-finetuned-cppe-5-10k-steps
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2733
- Map: 0.2965
- Map 50: 0.5888
- Map 75: 0.2549
- Map Small: 0.085
- Map Medium: 0.2348
- Map Large: 0.4635
- Mar 1: 0.2926
- Mar 10: 0.4628
- Mar 100: 0.4771
- Mar Small: 0.1972
- Mar Medium: 0.4214
- Mar Large: 0.6518
- Map Coverall: 0.5413
- Mar 100 Coverall: 0.6838
- Map Face Shield: 0.2526
- Mar 100 Face Shield: 0.4696
- Map Gloves: 0.2117
- Mar 100 Gloves: 0.4036
- Map Goggles: 0.1686
- Mar 100 Goggles: 0.4169
- Map Mask: 0.3083
- Mar 100 Mask: 0.4116
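The Map 50 and Map 75 figures above are COCO-style average precisions at IoU thresholds of 0.50 and 0.75 respectively. A rough sketch of the IoU computation behind those thresholds (illustrative only, not the exact COCO implementation):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes in (x0, y0, x1, y1) format."""
    x0 = max(box_a[0], box_b[0])
    y0 = max(box_a[1], box_b[1])
    x1 = min(box_a[2], box_b[2])
    y1 = min(box_a[3], box_b[3])
    inter = max(0.0, x1 - x0) * max(0.0, y1 - y0)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A prediction counts as a true positive at mAP@50 if IoU >= 0.5,
# but needs IoU >= 0.75 at the stricter mAP@75 threshold.
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 0.333...: passes neither threshold
```

This is why Map 75 (0.2549) sits well below Map 50 (0.5888): many detections overlap the ground truth loosely but not tightly.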
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100.0
- mixed_precision_training: Native AMP
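With a linear scheduler and no warmup configured, the Trainer decays the learning rate linearly from its initial value to zero over the full run. A small sketch of that decay (illustrative; it mirrors the behavior of `get_linear_schedule_with_warmup` with zero warmup steps):

```python
def linear_lr(step, total_steps, base_lr=5e-5, warmup_steps=0):
    """Linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

total = 100 * 107  # 100 epochs x 107 optimizer steps per epoch (from the table below)
print(linear_lr(0, total))            # 5e-05 at the start
print(linear_lr(total // 2, total))   # 2.5e-05 halfway through
```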
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| 2.9916 | 1.0 | 107 | 2.3798 | 0.0207 | 0.0465 | 0.0156 | 0.0025 | 0.0171 | 0.0226 | 0.0542 | 0.1129 | 0.1459 | 0.045 | 0.1086 | 0.1896 | 0.0949 | 0.3703 | 0.0 | 0.0 | 0.0021 | 0.1455 | 0.0 | 0.0 | 0.0066 | 0.2138 |
| 2.1118 | 2.0 | 214 | 2.2127 | 0.0433 | 0.0992 | 0.0362 | 0.0069 | 0.0396 | 0.0488 | 0.0612 | 0.1542 | 0.184 | 0.0695 | 0.1388 | 0.2132 | 0.1884 | 0.5167 | 0.0 | 0.0 | 0.0055 | 0.1571 | 0.0 | 0.0 | 0.0227 | 0.2462 |
| 1.966 | 3.0 | 321 | 2.1864 | 0.0394 | 0.0956 | 0.0315 | 0.0141 | 0.0377 | 0.0469 | 0.057 | 0.1419 | 0.1702 | 0.0537 | 0.1194 | 0.2138 | 0.1491 | 0.4977 | 0.0 | 0.0 | 0.0111 | 0.1237 | 0.0 | 0.0 | 0.0369 | 0.2298 |
| 1.8256 | 4.0 | 428 | 2.1195 | 0.0703 | 0.1626 | 0.0496 | 0.0133 | 0.0516 | 0.0855 | 0.0861 | 0.1734 | 0.2015 | 0.0733 | 0.1527 | 0.2385 | 0.2673 | 0.5793 | 0.0 | 0.0 | 0.0275 | 0.1946 | 0.0 | 0.0 | 0.0567 | 0.2333 |
| 1.8436 | 5.0 | 535 | 2.0064 | 0.0727 | 0.1655 | 0.0484 | 0.0208 | 0.0538 | 0.0909 | 0.0845 | 0.1843 | 0.2026 | 0.0687 | 0.1459 | 0.2537 | 0.2407 | 0.5559 | 0.0107 | 0.0241 | 0.0208 | 0.1522 | 0.0 | 0.0 | 0.091 | 0.2809 |
| 1.7839 | 6.0 | 642 | 1.9382 | 0.095 | 0.2243 | 0.0766 | 0.0187 | 0.0753 | 0.1129 | 0.0997 | 0.2116 | 0.227 | 0.0801 | 0.177 | 0.2746 | 0.3235 | 0.5847 | 0.0243 | 0.1215 | 0.0387 | 0.1942 | 0.0 | 0.0 | 0.0881 | 0.2347 |
| 1.7793 | 7.0 | 749 | 2.0159 | 0.0993 | 0.2193 | 0.0844 | 0.0138 | 0.0822 | 0.1267 | 0.109 | 0.2014 | 0.2158 | 0.0485 | 0.1595 | 0.2898 | 0.3168 | 0.573 | 0.009 | 0.0709 | 0.0317 | 0.1598 | 0.002 | 0.0046 | 0.1368 | 0.2707 |
| 1.6682 | 8.0 | 856 | 1.8843 | 0.1193 | 0.251 | 0.1034 | 0.046 | 0.096 | 0.1632 | 0.1403 | 0.2527 | 0.2721 | 0.1071 | 0.2293 | 0.3346 | 0.3373 | 0.5793 | 0.0484 | 0.2443 | 0.0536 | 0.2156 | 0.0105 | 0.0431 | 0.1467 | 0.2782 |
| 1.5992 | 9.0 | 963 | 1.7163 | 0.13 | 0.2917 | 0.1109 | 0.0485 | 0.1064 | 0.1829 | 0.1504 | 0.2702 | 0.2897 | 0.1265 | 0.2203 | 0.4034 | 0.3907 | 0.605 | 0.0522 | 0.2608 | 0.0489 | 0.2339 | 0.0129 | 0.0723 | 0.1453 | 0.2764 |
| 1.5705 | 10.0 | 1070 | 1.8223 | 0.1188 | 0.2687 | 0.0982 | 0.0336 | 0.1011 | 0.1673 | 0.1506 | 0.2584 | 0.2765 | 0.1133 | 0.2172 | 0.3697 | 0.3468 | 0.5626 | 0.0509 | 0.1962 | 0.0432 | 0.2232 | 0.0178 | 0.1138 | 0.1354 | 0.2867 |
| 1.5646 | 11.0 | 1177 | 1.8070 | 0.1276 | 0.28 | 0.1026 | 0.0396 | 0.1042 | 0.1678 | 0.1568 | 0.2738 | 0.2874 | 0.0919 | 0.2256 | 0.3972 | 0.3954 | 0.6104 | 0.0503 | 0.219 | 0.0451 | 0.1902 | 0.0204 | 0.1569 | 0.1268 | 0.2604 |
| 1.5597 | 12.0 | 1284 | 1.7073 | 0.1384 | 0.3048 | 0.1109 | 0.0372 | 0.11 | 0.2093 | 0.1687 | 0.3119 | 0.326 | 0.0985 | 0.2638 | 0.4789 | 0.4113 | 0.6144 | 0.0611 | 0.2975 | 0.0593 | 0.2335 | 0.016 | 0.1938 | 0.1441 | 0.2907 |
| 1.5311 | 13.0 | 1391 | 1.6969 | 0.1418 | 0.3228 | 0.1037 | 0.041 | 0.1189 | 0.2127 | 0.1644 | 0.3219 | 0.3378 | 0.1096 | 0.2698 | 0.4917 | 0.3827 | 0.6149 | 0.0651 | 0.3329 | 0.0605 | 0.2496 | 0.0281 | 0.1662 | 0.1725 | 0.3253 |
| 1.4938 | 14.0 | 1498 | 1.6642 | 0.143 | 0.334 | 0.1047 | 0.0395 | 0.1192 | 0.2081 | 0.1726 | 0.3124 | 0.3267 | 0.1149 | 0.2483 | 0.4744 | 0.3883 | 0.6126 | 0.0734 | 0.3203 | 0.0609 | 0.2036 | 0.026 | 0.1862 | 0.1666 | 0.3107 |
| 1.4657 | 15.0 | 1605 | 1.6948 | 0.1364 | 0.3063 | 0.1131 | 0.0337 | 0.1013 | 0.2102 | 0.1555 | 0.3133 | 0.3364 | 0.1096 | 0.2581 | 0.4803 | 0.4141 | 0.6297 | 0.0425 | 0.3177 | 0.0666 | 0.2625 | 0.0186 | 0.1923 | 0.14 | 0.2796 |
| 1.4562 | 16.0 | 1712 | 1.6193 | 0.1418 | 0.3303 | 0.1095 | 0.0511 | 0.1269 | 0.2024 | 0.1864 | 0.3335 | 0.3475 | 0.1638 | 0.2885 | 0.4642 | 0.3925 | 0.586 | 0.0768 | 0.3595 | 0.0695 | 0.2629 | 0.0194 | 0.2231 | 0.1507 | 0.3058 |
| 1.43 | 17.0 | 1819 | 1.6391 | 0.145 | 0.3296 | 0.1157 | 0.0463 | 0.1137 | 0.2198 | 0.185 | 0.3437 | 0.365 | 0.1456 | 0.3134 | 0.5049 | 0.3985 | 0.6216 | 0.0628 | 0.3354 | 0.0813 | 0.3022 | 0.0263 | 0.2738 | 0.1561 | 0.292 |
| 1.4022 | 18.0 | 1926 | 1.6259 | 0.1433 | 0.3396 | 0.1087 | 0.0728 | 0.1105 | 0.2262 | 0.1864 | 0.3285 | 0.3485 | 0.1529 | 0.2787 | 0.4878 | 0.3654 | 0.6023 | 0.0708 | 0.3253 | 0.0843 | 0.2866 | 0.0187 | 0.2323 | 0.1772 | 0.296 |
| 1.355 | 19.0 | 2033 | 1.5473 | 0.1642 | 0.3599 | 0.135 | 0.0857 | 0.1245 | 0.2476 | 0.2 | 0.3577 | 0.3764 | 0.1384 | 0.3008 | 0.5502 | 0.4262 | 0.6356 | 0.059 | 0.3532 | 0.0858 | 0.3143 | 0.0465 | 0.2585 | 0.2036 | 0.3204 |
| 1.3404 | 20.0 | 2140 | 1.5296 | 0.1764 | 0.385 | 0.1342 | 0.0676 | 0.1438 | 0.2758 | 0.2176 | 0.3689 | 0.3839 | 0.1691 | 0.3191 | 0.5462 | 0.433 | 0.6302 | 0.1159 | 0.3709 | 0.0974 | 0.3071 | 0.0778 | 0.3185 | 0.1577 | 0.2929 |
| 1.3367 | 21.0 | 2247 | 1.5219 | 0.1862 | 0.387 | 0.1502 | 0.0444 | 0.1448 | 0.2963 | 0.2081 | 0.3752 | 0.3897 | 0.1477 | 0.3255 | 0.5503 | 0.449 | 0.6207 | 0.1136 | 0.3797 | 0.1141 | 0.3451 | 0.0434 | 0.2708 | 0.2107 | 0.332 |
| 1.2998 | 22.0 | 2354 | 1.4768 | 0.1853 | 0.393 | 0.1535 | 0.0445 | 0.148 | 0.2998 | 0.213 | 0.3762 | 0.3963 | 0.1822 | 0.3353 | 0.5332 | 0.4589 | 0.6207 | 0.1081 | 0.3873 | 0.1085 | 0.3562 | 0.0433 | 0.2862 | 0.2075 | 0.3311 |
| 1.2803 | 23.0 | 2461 | 1.4946 | 0.1907 | 0.4068 | 0.1517 | 0.0497 | 0.1561 | 0.2981 | 0.2056 | 0.3706 | 0.3887 | 0.1585 | 0.3412 | 0.5334 | 0.4496 | 0.6126 | 0.1129 | 0.3582 | 0.1335 | 0.35 | 0.0413 | 0.2862 | 0.2163 | 0.3364 |
| 1.2791 | 24.0 | 2568 | 1.4827 | 0.1897 | 0.3987 | 0.16 | 0.0537 | 0.1406 | 0.3015 | 0.2125 | 0.381 | 0.4019 | 0.1595 | 0.35 | 0.5525 | 0.4633 | 0.6342 | 0.0952 | 0.3987 | 0.1261 | 0.3643 | 0.0598 | 0.2892 | 0.2042 | 0.3231 |
| 1.2687 | 25.0 | 2675 | 1.4795 | 0.1916 | 0.4021 | 0.1626 | 0.0603 | 0.1453 | 0.3003 | 0.2281 | 0.3934 | 0.4079 | 0.1611 | 0.3539 | 0.572 | 0.4529 | 0.6356 | 0.0956 | 0.3797 | 0.1434 | 0.342 | 0.053 | 0.3569 | 0.2131 | 0.3253 |
| 1.229 | 26.0 | 2782 | 1.4817 | 0.1922 | 0.4226 | 0.153 | 0.0566 | 0.1579 | 0.3048 | 0.2213 | 0.3825 | 0.4028 | 0.1948 | 0.3408 | 0.551 | 0.4338 | 0.6324 | 0.1391 | 0.381 | 0.1195 | 0.3554 | 0.0495 | 0.3231 | 0.219 | 0.3222 |
| 1.2537 | 27.0 | 2889 | 1.5223 | 0.1891 | 0.4338 | 0.1469 | 0.0775 | 0.146 | 0.3004 | 0.2045 | 0.3696 | 0.3838 | 0.1806 | 0.3195 | 0.5276 | 0.4444 | 0.6279 | 0.1133 | 0.3418 | 0.1295 | 0.3246 | 0.0538 | 0.3108 | 0.2043 | 0.3142 |
| 1.2377 | 28.0 | 2996 | 1.4548 | 0.201 | 0.438 | 0.1538 | 0.0666 | 0.1556 | 0.3186 | 0.2286 | 0.3815 | 0.3944 | 0.1411 | 0.3428 | 0.5578 | 0.4559 | 0.6464 | 0.1395 | 0.3684 | 0.1193 | 0.3237 | 0.0437 | 0.2954 | 0.2466 | 0.3382 |
| 1.217 | 29.0 | 3103 | 1.4579 | 0.1994 | 0.4129 | 0.176 | 0.0595 | 0.169 | 0.2965 | 0.2188 | 0.3871 | 0.4006 | 0.1797 | 0.3454 | 0.5306 | 0.4727 | 0.6586 | 0.1088 | 0.3747 | 0.1282 | 0.35 | 0.075 | 0.2846 | 0.2123 | 0.3351 |
| 1.2252 | 30.0 | 3210 | 1.4552 | 0.2073 | 0.4341 | 0.1633 | 0.0631 | 0.1543 | 0.3287 | 0.2294 | 0.3874 | 0.4038 | 0.1423 | 0.3425 | 0.5571 | 0.4684 | 0.6405 | 0.1528 | 0.4051 | 0.1395 | 0.3496 | 0.0438 | 0.2831 | 0.2319 | 0.3409 |
| 1.1804 | 31.0 | 3317 | 1.4339 | 0.2028 | 0.4352 | 0.16 | 0.0653 | 0.1578 | 0.3171 | 0.2269 | 0.3964 | 0.4141 | 0.152 | 0.3696 | 0.5676 | 0.4541 | 0.6419 | 0.1316 | 0.4139 | 0.1457 | 0.358 | 0.0667 | 0.3292 | 0.2159 | 0.3276 |
| 1.196 | 32.0 | 3424 | 1.4619 | 0.2181 | 0.4529 | 0.1874 | 0.0569 | 0.1666 | 0.346 | 0.2385 | 0.402 | 0.42 | 0.174 | 0.355 | 0.5965 | 0.4675 | 0.6144 | 0.1746 | 0.4 | 0.1536 | 0.3638 | 0.0662 | 0.3785 | 0.2285 | 0.3431 |
| 1.1698 | 33.0 | 3531 | 1.4126 | 0.2124 | 0.4458 | 0.1643 | 0.0633 | 0.164 | 0.3296 | 0.2287 | 0.3946 | 0.411 | 0.1707 | 0.3507 | 0.5705 | 0.4923 | 0.6613 | 0.1324 | 0.3848 | 0.1536 | 0.3656 | 0.0601 | 0.2969 | 0.2237 | 0.3462 |
| 1.143 | 34.0 | 3638 | 1.4011 | 0.2225 | 0.4598 | 0.1867 | 0.0671 | 0.175 | 0.3485 | 0.239 | 0.4124 | 0.4257 | 0.1549 | 0.3674 | 0.6062 | 0.4852 | 0.6477 | 0.1527 | 0.4177 | 0.1693 | 0.354 | 0.0755 | 0.3769 | 0.2296 | 0.332 |
| 1.1527 | 35.0 | 3745 | 1.4047 | 0.2309 | 0.4622 | 0.1979 | 0.0689 | 0.168 | 0.3785 | 0.2431 | 0.415 | 0.4277 | 0.1665 | 0.3677 | 0.5928 | 0.5132 | 0.6559 | 0.1427 | 0.3924 | 0.1703 | 0.3808 | 0.0946 | 0.3662 | 0.2337 | 0.3431 |
| 1.1219 | 36.0 | 3852 | 1.4185 | 0.2224 | 0.4747 | 0.1906 | 0.0727 | 0.1709 | 0.3493 | 0.2326 | 0.3971 | 0.4124 | 0.1584 | 0.355 | 0.5743 | 0.4772 | 0.6423 | 0.167 | 0.3899 | 0.1535 | 0.3219 | 0.0574 | 0.34 | 0.2569 | 0.368 |
| 1.1115 | 37.0 | 3959 | 1.3970 | 0.2271 | 0.4682 | 0.1883 | 0.071 | 0.1725 | 0.3602 | 0.2514 | 0.4163 | 0.4357 | 0.1589 | 0.3851 | 0.5976 | 0.4954 | 0.6577 | 0.1483 | 0.4013 | 0.1626 | 0.3754 | 0.0907 | 0.3969 | 0.2388 | 0.3471 |
| 1.0974 | 38.0 | 4066 | 1.4054 | 0.231 | 0.4814 | 0.1923 | 0.0625 | 0.1778 | 0.378 | 0.2464 | 0.4218 | 0.4397 | 0.1703 | 0.3886 | 0.6013 | 0.5061 | 0.6532 | 0.1707 | 0.4367 | 0.1685 | 0.3879 | 0.087 | 0.38 | 0.2229 | 0.3404 |
| 1.0815 | 39.0 | 4173 | 1.4458 | 0.2318 | 0.4829 | 0.1897 | 0.0721 | 0.1782 | 0.384 | 0.2433 | 0.4168 | 0.4381 | 0.1661 | 0.3886 | 0.6148 | 0.4742 | 0.655 | 0.1857 | 0.4127 | 0.174 | 0.3732 | 0.0922 | 0.3969 | 0.2328 | 0.3529 |
| 1.0897 | 40.0 | 4280 | 1.3788 | 0.2299 | 0.4735 | 0.1929 | 0.0679 | 0.1631 | 0.3756 | 0.2416 | 0.4173 | 0.4351 | 0.1662 | 0.3589 | 0.6283 | 0.5099 | 0.6712 | 0.1595 | 0.4203 | 0.1617 | 0.3723 | 0.0732 | 0.3431 | 0.2452 | 0.3689 |
| 1.0615 | 41.0 | 4387 | 1.3891 | 0.2411 | 0.5032 | 0.2139 | 0.0751 | 0.1771 | 0.3803 | 0.2494 | 0.4131 | 0.4292 | 0.187 | 0.3604 | 0.6034 | 0.519 | 0.664 | 0.1938 | 0.4354 | 0.1612 | 0.3504 | 0.0892 | 0.36 | 0.2421 | 0.336 |
| 1.0573 | 42.0 | 4494 | 1.3824 | 0.2419 | 0.4938 | 0.2121 | 0.068 | 0.1744 | 0.3942 | 0.2634 | 0.4166 | 0.4287 | 0.1677 | 0.3598 | 0.611 | 0.509 | 0.6667 | 0.1917 | 0.4316 | 0.1661 | 0.3607 | 0.1 | 0.3385 | 0.2427 | 0.3458 |
| 1.0504 | 43.0 | 4601 | 1.3565 | 0.2497 | 0.5184 | 0.2224 | 0.0703 | 0.1896 | 0.3926 | 0.2513 | 0.4333 | 0.4496 | 0.2086 | 0.3955 | 0.6028 | 0.5132 | 0.6662 | 0.2108 | 0.4418 | 0.1847 | 0.3951 | 0.0962 | 0.3862 | 0.2435 | 0.3587 |
| 1.0575 | 44.0 | 4708 | 1.3194 | 0.2326 | 0.4773 | 0.201 | 0.062 | 0.18 | 0.384 | 0.2408 | 0.4312 | 0.4467 | 0.2068 | 0.3814 | 0.6097 | 0.5039 | 0.6725 | 0.1953 | 0.462 | 0.1625 | 0.367 | 0.0723 | 0.3785 | 0.229 | 0.3533 |
| 1.0356 | 45.0 | 4815 | 1.3503 | 0.2355 | 0.493 | 0.2068 | 0.0704 | 0.1795 | 0.3794 | 0.2598 | 0.4328 | 0.4489 | 0.1803 | 0.3783 | 0.6254 | 0.5124 | 0.6752 | 0.1886 | 0.4658 | 0.1516 | 0.3518 | 0.0917 | 0.3954 | 0.2331 | 0.3564 |
| 1.0766 | 46.0 | 4922 | 1.3880 | 0.2407 | 0.5086 | 0.205 | 0.0848 | 0.1864 | 0.3725 | 0.2517 | 0.4316 | 0.4475 | 0.1847 | 0.3949 | 0.6098 | 0.5322 | 0.6734 | 0.172 | 0.4709 | 0.1544 | 0.3313 | 0.0981 | 0.4123 | 0.2467 | 0.3498 |
| 1.0632 | 47.0 | 5029 | 1.3330 | 0.2473 | 0.5065 | 0.2145 | 0.0583 | 0.1901 | 0.3987 | 0.2548 | 0.4249 | 0.4443 | 0.172 | 0.3909 | 0.6116 | 0.5287 | 0.6626 | 0.2018 | 0.4544 | 0.1714 | 0.3795 | 0.0818 | 0.3677 | 0.2531 | 0.3573 |
| 1.0451 | 48.0 | 5136 | 1.3742 | 0.2405 | 0.4981 | 0.2093 | 0.0741 | 0.1883 | 0.3839 | 0.2573 | 0.4241 | 0.4352 | 0.1595 | 0.3878 | 0.6145 | 0.5253 | 0.6815 | 0.1927 | 0.4342 | 0.1545 | 0.3388 | 0.0889 | 0.3708 | 0.2412 | 0.3507 |
| 1.0421 | 49.0 | 5243 | 1.3462 | 0.254 | 0.5327 | 0.2251 | 0.0803 | 0.2014 | 0.4042 | 0.2603 | 0.4233 | 0.4433 | 0.1951 | 0.388 | 0.6098 | 0.5287 | 0.673 | 0.2031 | 0.4291 | 0.1752 | 0.3647 | 0.0941 | 0.3723 | 0.2687 | 0.3773 |
| 1.0075 | 50.0 | 5350 | 1.3627 | 0.2554 | 0.5047 | 0.2309 | 0.0597 | 0.1887 | 0.4272 | 0.2636 | 0.4368 | 0.4541 | 0.17 | 0.3863 | 0.6523 | 0.5321 | 0.6703 | 0.2103 | 0.4772 | 0.182 | 0.3728 | 0.0993 | 0.3831 | 0.2534 | 0.3671 |
| 1.0206 | 51.0 | 5457 | 1.3867 | 0.2514 | 0.512 | 0.214 | 0.0628 | 0.1821 | 0.4166 | 0.2599 | 0.4278 | 0.4428 | 0.1467 | 0.3731 | 0.6462 | 0.5356 | 0.6788 | 0.1927 | 0.4468 | 0.1813 | 0.3558 | 0.0991 | 0.3723 | 0.2482 | 0.36 |
| 1.0196 | 52.0 | 5564 | 1.3444 | 0.2562 | 0.5201 | 0.2122 | 0.0562 | 0.1917 | 0.4182 | 0.2598 | 0.4333 | 0.4469 | 0.1661 | 0.3853 | 0.6292 | 0.5361 | 0.673 | 0.1917 | 0.4228 | 0.1788 | 0.3723 | 0.116 | 0.4123 | 0.2585 | 0.3542 |
| 0.9788 | 53.0 | 5671 | 1.3625 | 0.2484 | 0.5059 | 0.2031 | 0.063 | 0.1887 | 0.4126 | 0.2663 | 0.4392 | 0.4544 | 0.1656 | 0.3975 | 0.6338 | 0.5098 | 0.6734 | 0.1791 | 0.4468 | 0.1718 | 0.3612 | 0.1273 | 0.4108 | 0.2541 | 0.38 |
| 1.0195 | 54.0 | 5778 | 1.3207 | 0.2604 | 0.5269 | 0.2224 | 0.069 | 0.2027 | 0.4196 | 0.2626 | 0.4382 | 0.4527 | 0.1912 | 0.3909 | 0.6223 | 0.5341 | 0.6892 | 0.1927 | 0.457 | 0.1783 | 0.3696 | 0.1288 | 0.3785 | 0.2683 | 0.3693 |
| 0.9709 | 55.0 | 5885 | 1.3276 | 0.2648 | 0.5331 | 0.2205 | 0.0629 | 0.2026 | 0.443 | 0.2763 | 0.4414 | 0.4555 | 0.1709 | 0.4012 | 0.6281 | 0.5389 | 0.6923 | 0.1999 | 0.4443 | 0.1873 | 0.3759 | 0.1313 | 0.3938 | 0.2664 | 0.3711 |
| 0.9539 | 56.0 | 5992 | 1.3213 | 0.2621 | 0.5399 | 0.2236 | 0.0608 | 0.2069 | 0.4189 | 0.2664 | 0.4313 | 0.4497 | 0.2057 | 0.3859 | 0.6112 | 0.5355 | 0.6788 | 0.182 | 0.4304 | 0.1968 | 0.3982 | 0.1361 | 0.3815 | 0.2599 | 0.3596 |
| 0.9704 | 57.0 | 6099 | 1.3051 | 0.2687 | 0.5342 | 0.2298 | 0.0807 | 0.2104 | 0.4223 | 0.2824 | 0.4485 | 0.4641 | 0.1862 | 0.4063 | 0.6406 | 0.5348 | 0.6919 | 0.2088 | 0.4722 | 0.1978 | 0.3888 | 0.121 | 0.3938 | 0.2812 | 0.3738 |
| 0.9689 | 58.0 | 6206 | 1.3027 | 0.2675 | 0.5398 | 0.2294 | 0.0731 | 0.2088 | 0.4298 | 0.2793 | 0.4454 | 0.459 | 0.1659 | 0.4005 | 0.6364 | 0.5396 | 0.6874 | 0.1856 | 0.4595 | 0.189 | 0.3795 | 0.139 | 0.3892 | 0.2842 | 0.3796 |
| 0.9491 | 59.0 | 6313 | 1.3377 | 0.2575 | 0.5237 | 0.2173 | 0.0655 | 0.1992 | 0.412 | 0.265 | 0.4424 | 0.4569 | 0.2028 | 0.3954 | 0.6316 | 0.5307 | 0.668 | 0.1743 | 0.4532 | 0.195 | 0.3835 | 0.1042 | 0.4 | 0.2835 | 0.38 |
| 0.9635 | 60.0 | 6420 | 1.3185 | 0.265 | 0.5431 | 0.2265 | 0.0715 | 0.2135 | 0.4079 | 0.2812 | 0.4446 | 0.4581 | 0.1858 | 0.3986 | 0.6338 | 0.5458 | 0.6833 | 0.1812 | 0.4494 | 0.1877 | 0.3701 | 0.1196 | 0.3862 | 0.2909 | 0.4013 |
| 0.9593 | 61.0 | 6527 | 1.3583 | 0.2649 | 0.5335 | 0.2213 | 0.07 | 0.217 | 0.4138 | 0.2731 | 0.4422 | 0.4544 | 0.1639 | 0.3965 | 0.6381 | 0.539 | 0.677 | 0.195 | 0.4557 | 0.1766 | 0.3562 | 0.1344 | 0.4092 | 0.2795 | 0.3738 |
| 0.933 | 62.0 | 6634 | 1.3281 | 0.2729 | 0.5496 | 0.2352 | 0.0762 | 0.2242 | 0.4251 | 0.2746 | 0.4517 | 0.463 | 0.1973 | 0.4129 | 0.6345 | 0.5414 | 0.6874 | 0.2034 | 0.457 | 0.1926 | 0.371 | 0.1363 | 0.4108 | 0.2909 | 0.3889 |
| 0.9589 | 63.0 | 6741 | 1.3673 | 0.2617 | 0.5234 | 0.2331 | 0.0794 | 0.2033 | 0.4155 | 0.2597 | 0.4321 | 0.4496 | 0.1788 | 0.3914 | 0.6124 | 0.5325 | 0.6676 | 0.186 | 0.4772 | 0.1923 | 0.3504 | 0.1145 | 0.3815 | 0.2833 | 0.3711 |
| 0.9557 | 64.0 | 6848 | 1.2880 | 0.2704 | 0.5518 | 0.2401 | 0.0795 | 0.2233 | 0.4197 | 0.2705 | 0.4506 | 0.4692 | 0.1936 | 0.4216 | 0.6227 | 0.5492 | 0.6842 | 0.2033 | 0.4671 | 0.1963 | 0.3879 | 0.1148 | 0.4185 | 0.2887 | 0.3884 |
| 0.9207 | 65.0 | 6955 | 1.3327 | 0.2744 | 0.5535 | 0.2341 | 0.0692 | 0.2246 | 0.4331 | 0.2826 | 0.4535 | 0.4698 | 0.1961 | 0.4125 | 0.6362 | 0.5375 | 0.6811 | 0.1987 | 0.4684 | 0.2095 | 0.3978 | 0.1394 | 0.4154 | 0.2868 | 0.3862 |
| 0.9078 | 66.0 | 7062 | 1.3275 | 0.2615 | 0.5261 | 0.2218 | 0.0622 | 0.1935 | 0.4505 | 0.2698 | 0.4451 | 0.462 | 0.1656 | 0.4076 | 0.6531 | 0.5365 | 0.6752 | 0.1989 | 0.4785 | 0.1834 | 0.3929 | 0.1158 | 0.3877 | 0.2727 | 0.3756 |
| 0.9286 | 67.0 | 7169 | 1.3468 | 0.2635 | 0.5273 | 0.2271 | 0.0778 | 0.2009 | 0.434 | 0.2806 | 0.4361 | 0.4506 | 0.171 | 0.3927 | 0.6282 | 0.5425 | 0.6928 | 0.2065 | 0.4519 | 0.1856 | 0.3795 | 0.1034 | 0.3446 | 0.2795 | 0.384 |
| 0.918 | 68.0 | 7276 | 1.3399 | 0.2619 | 0.5326 | 0.2171 | 0.0731 | 0.1994 | 0.4347 | 0.2872 | 0.4402 | 0.4614 | 0.1772 | 0.4053 | 0.6393 | 0.5374 | 0.6883 | 0.2019 | 0.4582 | 0.1865 | 0.383 | 0.1157 | 0.3923 | 0.2681 | 0.3853 |
| 0.9117 | 69.0 | 7383 | 1.3267 | 0.2575 | 0.5197 | 0.2169 | 0.0737 | 0.2062 | 0.4202 | 0.2769 | 0.4475 | 0.4665 | 0.1975 | 0.4126 | 0.6317 | 0.5341 | 0.6833 | 0.2002 | 0.4684 | 0.1777 | 0.3924 | 0.1094 | 0.4046 | 0.2658 | 0.3836 |
| 0.8936 | 70.0 | 7490 | 1.3458 | 0.2572 | 0.525 | 0.2233 | 0.0594 | 0.1985 | 0.4259 | 0.282 | 0.4446 | 0.4629 | 0.1609 | 0.4085 | 0.6491 | 0.5157 | 0.6743 | 0.1914 | 0.4747 | 0.1781 | 0.383 | 0.1315 | 0.3954 | 0.2692 | 0.3871 |
| 0.896 | 71.0 | 7597 | 1.3293 | 0.2648 | 0.5316 | 0.2153 | 0.0647 | 0.2137 | 0.4176 | 0.2743 | 0.449 | 0.4654 | 0.1767 | 0.4172 | 0.6278 | 0.5403 | 0.6874 | 0.2002 | 0.4835 | 0.1774 | 0.3768 | 0.125 | 0.3846 | 0.2812 | 0.3947 |
| 0.8923 | 72.0 | 7704 | 1.3015 | 0.2709 | 0.5507 | 0.2271 | 0.083 | 0.2053 | 0.4296 | 0.2824 | 0.4503 | 0.4652 | 0.1908 | 0.4006 | 0.6412 | 0.5475 | 0.6901 | 0.2006 | 0.4443 | 0.1862 | 0.4013 | 0.1426 | 0.3985 | 0.2775 | 0.392 |
| 0.8798 | 73.0 | 7811 | 1.3031 | 0.2762 | 0.5469 | 0.2388 | 0.0779 | 0.2202 | 0.436 | 0.2835 | 0.4514 | 0.4681 | 0.1769 | 0.41 | 0.6419 | 0.5424 | 0.6829 | 0.2074 | 0.4582 | 0.2024 | 0.4062 | 0.1443 | 0.4 | 0.2844 | 0.3933 |
| 0.8847 | 74.0 | 7918 | 1.3144 | 0.268 | 0.5423 | 0.2231 | 0.0607 | 0.2132 | 0.4356 | 0.2825 | 0.4432 | 0.4594 | 0.1679 | 0.402 | 0.6463 | 0.5228 | 0.6626 | 0.2193 | 0.4684 | 0.1887 | 0.3826 | 0.1295 | 0.3985 | 0.2799 | 0.3849 |
| 0.8807 | 75.0 | 8025 | 1.3156 | 0.2786 | 0.5623 | 0.223 | 0.0799 | 0.2261 | 0.4339 | 0.28 | 0.4606 | 0.4783 | 0.1901 | 0.4283 | 0.6482 | 0.5193 | 0.6739 | 0.2351 | 0.4975 | 0.2034 | 0.408 | 0.1433 | 0.4185 | 0.2917 | 0.3938 |
| 0.8688 | 76.0 | 8132 | 1.2824 | 0.2785 | 0.5646 | 0.2328 | 0.0845 | 0.2194 | 0.4509 | 0.2801 | 0.4503 | 0.4689 | 0.1796 | 0.4177 | 0.6467 | 0.5337 | 0.6761 | 0.2323 | 0.4835 | 0.1959 | 0.404 | 0.1395 | 0.3846 | 0.291 | 0.3964 |
| 0.8766 | 77.0 | 8239 | 1.2948 | 0.279 | 0.5517 | 0.2413 | 0.0735 | 0.2307 | 0.4358 | 0.2809 | 0.4505 | 0.4694 | 0.1925 | 0.4164 | 0.6332 | 0.5427 | 0.682 | 0.2324 | 0.4684 | 0.1924 | 0.3964 | 0.1428 | 0.4 | 0.2849 | 0.4 |
| 0.8724 | 78.0 | 8346 | 1.3041 | 0.2797 | 0.5574 | 0.25 | 0.0824 | 0.226 | 0.4462 | 0.2828 | 0.4567 | 0.4742 | 0.2142 | 0.4164 | 0.6516 | 0.5322 | 0.677 | 0.2371 | 0.4785 | 0.1844 | 0.4098 | 0.1493 | 0.4031 | 0.2956 | 0.4027 |
| 0.859 | 79.0 | 8453 | 1.2883 | 0.2823 | 0.558 | 0.2424 | 0.0663 | 0.2175 | 0.455 | 0.289 | 0.4525 | 0.4694 | 0.199 | 0.4096 | 0.6468 | 0.553 | 0.6865 | 0.2147 | 0.4557 | 0.1929 | 0.404 | 0.1615 | 0.4062 | 0.2893 | 0.3947 |
| 0.8585 | 80.0 | 8560 | 1.2863 | 0.2819 | 0.56 | 0.2468 | 0.0881 | 0.226 | 0.4642 | 0.2869 | 0.4554 | 0.4732 | 0.2034 | 0.4168 | 0.6486 | 0.5333 | 0.6838 | 0.2378 | 0.4671 | 0.1948 | 0.4085 | 0.1424 | 0.4 | 0.3012 | 0.4067 |
| 0.8454 | 81.0 | 8667 | 1.2876 | 0.284 | 0.5586 | 0.2495 | 0.0808 | 0.2252 | 0.4634 | 0.2885 | 0.4597 | 0.4767 | 0.1803 | 0.425 | 0.664 | 0.5307 | 0.6838 | 0.2354 | 0.4709 | 0.2103 | 0.4098 | 0.1428 | 0.4077 | 0.3007 | 0.4116 |
| 0.8513 | 82.0 | 8774 | 1.2763 | 0.2848 | 0.5614 | 0.2489 | 0.0834 | 0.2375 | 0.4469 | 0.2822 | 0.4581 | 0.473 | 0.2089 | 0.4262 | 0.6339 | 0.5428 | 0.6788 | 0.2372 | 0.4582 | 0.2026 | 0.4103 | 0.1401 | 0.4 | 0.3013 | 0.4178 |
| 0.8371 | 83.0 | 8881 | 1.2659 | 0.2884 | 0.5674 | 0.252 | 0.089 | 0.2261 | 0.457 | 0.2812 | 0.4588 | 0.4706 | 0.1991 | 0.4131 | 0.6465 | 0.5492 | 0.6896 | 0.2394 | 0.4405 | 0.203 | 0.4062 | 0.1429 | 0.4046 | 0.3075 | 0.412 |
| 0.8459 | 84.0 | 8988 | 1.2862 | 0.2877 | 0.5667 | 0.2514 | 0.0876 | 0.2176 | 0.4602 | 0.2839 | 0.4587 | 0.4707 | 0.1898 | 0.4192 | 0.6436 | 0.5562 | 0.6959 | 0.2376 | 0.4519 | 0.2045 | 0.3879 | 0.1435 | 0.4062 | 0.2969 | 0.4116 |
| 0.8232 | 85.0 | 9095 | 1.2927 | 0.2878 | 0.5693 | 0.251 | 0.0894 | 0.2314 | 0.4523 | 0.2877 | 0.4589 | 0.4759 | 0.2037 | 0.4273 | 0.6478 | 0.5332 | 0.677 | 0.2346 | 0.4582 | 0.2112 | 0.404 | 0.1518 | 0.4215 | 0.3081 | 0.4187 |
| 0.8476 | 86.0 | 9202 | 1.2649 | 0.288 | 0.5683 | 0.2496 | 0.0723 | 0.2332 | 0.4597 | 0.2896 | 0.4624 | 0.4774 | 0.1947 | 0.4314 | 0.6485 | 0.5407 | 0.6838 | 0.2411 | 0.4759 | 0.206 | 0.4049 | 0.1508 | 0.4092 | 0.3013 | 0.4133 |
| 0.8191 | 87.0 | 9309 | 1.2794 | 0.2913 | 0.5706 | 0.2503 | 0.0819 | 0.2303 | 0.4605 | 0.2871 | 0.4619 | 0.4766 | 0.1945 | 0.4228 | 0.6558 | 0.5484 | 0.6869 | 0.2381 | 0.4658 | 0.2186 | 0.4071 | 0.1435 | 0.4138 | 0.3082 | 0.4093 |
| 0.8345 | 88.0 | 9416 | 1.2816 | 0.2887 | 0.568 | 0.2535 | 0.083 | 0.2213 | 0.4599 | 0.2912 | 0.4631 | 0.4765 | 0.1943 | 0.4239 | 0.647 | 0.5451 | 0.6847 | 0.2384 | 0.4671 | 0.2093 | 0.4071 | 0.1427 | 0.4092 | 0.3079 | 0.4142 |
| 0.8313 | 89.0 | 9523 | 1.2714 | 0.2885 | 0.5718 | 0.2531 | 0.087 | 0.2287 | 0.451 | 0.2879 | 0.4624 | 0.475 | 0.2005 | 0.4227 | 0.6405 | 0.5375 | 0.691 | 0.2392 | 0.457 | 0.2122 | 0.4031 | 0.1485 | 0.4185 | 0.3052 | 0.4053 |
| 0.8305 | 90.0 | 9630 | 1.2669 | 0.2939 | 0.5665 | 0.2578 | 0.0852 | 0.2304 | 0.4619 | 0.2927 | 0.4614 | 0.4737 | 0.1981 | 0.4197 | 0.6446 | 0.5448 | 0.6901 | 0.2471 | 0.4722 | 0.2069 | 0.3978 | 0.1628 | 0.4031 | 0.3077 | 0.4053 |
| 0.8059 | 91.0 | 9737 | 1.2680 | 0.2961 | 0.5782 | 0.2599 | 0.0836 | 0.2329 | 0.4598 | 0.29 | 0.4648 | 0.4782 | 0.1917 | 0.4262 | 0.6443 | 0.5487 | 0.686 | 0.2416 | 0.4785 | 0.2124 | 0.3969 | 0.1682 | 0.4154 | 0.3093 | 0.4142 |
| 0.8096 | 92.0 | 9844 | 1.2710 | 0.2943 | 0.5708 | 0.2545 | 0.087 | 0.2293 | 0.4586 | 0.2935 | 0.4668 | 0.4794 | 0.1969 | 0.4281 | 0.6435 | 0.5454 | 0.6851 | 0.2481 | 0.481 | 0.2074 | 0.4022 | 0.163 | 0.42 | 0.3076 | 0.4084 |
| 0.8043 | 93.0 | 9951 | 1.2747 | 0.2884 | 0.5648 | 0.2437 | 0.0865 | 0.2269 | 0.4489 | 0.2894 | 0.4615 | 0.4745 | 0.1989 | 0.4222 | 0.6391 | 0.5403 | 0.6784 | 0.2408 | 0.4722 | 0.204 | 0.3973 | 0.1516 | 0.4231 | 0.3054 | 0.4013 |
| 0.7964 | 94.0 | 10058 | 1.2638 | 0.2906 | 0.5734 | 0.2484 | 0.0881 | 0.2312 | 0.4542 | 0.29 | 0.4613 | 0.4738 | 0.1999 | 0.417 | 0.6446 | 0.5392 | 0.6775 | 0.2467 | 0.4785 | 0.2084 | 0.404 | 0.1538 | 0.4062 | 0.3048 | 0.4031 |
| 0.8133 | 95.0 | 10165 | 1.2622 | 0.3009 | 0.5852 | 0.2613 | 0.0871 | 0.2313 | 0.4755 | 0.2928 | 0.4669 | 0.4796 | 0.2031 | 0.4208 | 0.6534 | 0.5452 | 0.6829 | 0.2581 | 0.4848 | 0.2118 | 0.4058 | 0.1772 | 0.4092 | 0.312 | 0.4151 |
| 0.8176 | 96.0 | 10272 | 1.2640 | 0.2956 | 0.5816 | 0.2523 | 0.0897 | 0.2321 | 0.4621 | 0.2904 | 0.4618 | 0.4741 | 0.2036 | 0.4181 | 0.6464 | 0.5422 | 0.6784 | 0.256 | 0.4646 | 0.2087 | 0.4049 | 0.1638 | 0.4138 | 0.307 | 0.4089 |
| 0.7903 | 97.0 | 10379 | 1.2567 | 0.3006 | 0.5885 | 0.2594 | 0.0918 | 0.2321 | 0.4755 | 0.294 | 0.4705 | 0.4846 | 0.2013 | 0.4258 | 0.6651 | 0.5444 | 0.6883 | 0.2582 | 0.481 | 0.2132 | 0.4116 | 0.177 | 0.4277 | 0.3104 | 0.4142 |
| 0.8068 | 98.0 | 10486 | 1.2697 | 0.296 | 0.585 | 0.2573 | 0.0895 | 0.2316 | 0.464 | 0.2909 | 0.4663 | 0.4802 | 0.2035 | 0.4213 | 0.6567 | 0.5387 | 0.6851 | 0.2574 | 0.4772 | 0.2116 | 0.4058 | 0.1638 | 0.4215 | 0.3085 | 0.4111 |
| 0.7942 | 99.0 | 10593 | 1.2727 | 0.2972 | 0.5896 | 0.2589 | 0.0846 | 0.2341 | 0.4657 | 0.2927 | 0.4649 | 0.479 | 0.1962 | 0.422 | 0.657 | 0.5424 | 0.6851 | 0.2527 | 0.4696 | 0.2101 | 0.4049 | 0.1715 | 0.4215 | 0.3096 | 0.4138 |
| 0.8099 | 100.0 | 10700 | 1.2733 | 0.2965 | 0.5888 | 0.2549 | 0.085 | 0.2348 | 0.4635 | 0.2926 | 0.4628 | 0.4771 | 0.1972 | 0.4214 | 0.6518 | 0.5413 | 0.6838 | 0.2526 | 0.4696 | 0.2117 | 0.4036 | 0.1686 | 0.4169 | 0.3083 | 0.4116 |
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 2.21.0
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
deon03/deter-IDD20k-2 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
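Until the authors fill this in, here is a minimal sketch of the usual `transformers` loading pattern — assuming this checkpoint is a standard object-detection model (the class list below suggests detection on IDD-style driving scenes). The pipeline task and usage are assumptions, not confirmed by this card:

```python
def load_detector(model_id="deon03/deter-IDD20k-2"):
    """Load the checkpoint as an object-detection pipeline (requires `transformers` and `torch`)."""
    from transformers import pipeline  # imported lazily so this sketch stays self-contained
    return pipeline("object-detection", model=model_id)

# Usage (downloads the checkpoint on first call):
#   detector = load_detector()
#   for d in detector("street_scene.jpg"):
#       print(d["label"], round(d["score"], 3), d["box"])
```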
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"curb",
"bicycle",
"traffic light",
"out of roi",
"pole",
"sidewalk",
"sky",
"guard rail",
"vehicle fallback",
"animal",
"fallback background",
"license plate",
"person",
"trailer",
"train",
"non-drivable fallback",
"parking",
"vegetation",
"traffic sign",
"obs-str-bar-fallback",
"ground",
"rectification border",
"tunnel",
"ego vehicle",
"rider",
"motorcycle",
"caravan",
"bus",
"building",
"polegroup",
"fence",
"billboard",
"drivable fallback",
"wall",
"bridge",
"unlabeled",
"car",
"road",
"rail track",
"truck",
"autorickshaw"
] |
urretxo/practica_2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# practica_2
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.1
| [
"raccoon",
"person",
"skunk"
] |
moha-drk/practica_detr |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# practica_detr
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.1
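DETR heads emit boxes as normalized `(center_x, center_y, width, height)`; a sketch of the conversion to absolute corner coordinates (in practice `DetrImageProcessor.post_process_object_detection` does this, plus score filtering — the helper name here is illustrative):

```python
def cxcywh_to_xyxy(box, img_w, img_h):
    """Convert a normalized (cx, cy, w, h) DETR box to absolute (x0, y0, x1, y1) pixels."""
    cx, cy, w, h = box
    x0 = (cx - w / 2) * img_w
    y0 = (cy - h / 2) * img_h
    x1 = (cx + w / 2) * img_w
    y1 = (cy + h / 2) * img_h
    return (x0, y0, x1, y1)
```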
| [
"background",
"kangaroo"
] |
Ducco/rtdetr-v2-r50-cppe5-finetune-2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-v2-r50-cppe5-finetune-2
This model is a fine-tuned version of [PekingU/rtdetr_v2_r50vd](https://huggingface.co/PekingU/rtdetr_v2_r50vd) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 10.6505
- Map: 0.4323
- Map 50: 0.8963
- Map 75: 0.3219
- Map Small: 0.3756
- Map Medium: 0.5476
- Map Large: 0.7106
- Mar 1: 0.2852
- Mar 10: 0.4788
- Mar 100: 0.5715
- Mar Small: 0.5305
- Mar Medium: 0.6698
- Mar Large: 0.7583
- Map Football: 0.4833
- Mar 100 Football: 0.5867
- Map Player: 0.3814
- Mar 100 Player: 0.5564
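`Map 50` and `Map 75` above are mean average precision at IoU thresholds of 0.50 and 0.75; a sketch of the underlying IoU computation on corner-format `(x0, y0, x1, y1)` boxes:

```python
def iou(a, b):
    """Intersection over union of two (x0, y0, x1, y1) boxes."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```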
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 300
- num_epochs: 12
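A sketch of the cosine schedule with warmup used above (learning rate 1e-4, 300 warmup steps; the 744 total steps are taken from the results table, 62 steps per epoch over 12 epochs):

```python
import math

def cosine_lr(step, total_steps, warmup_steps, base_lr):
    """Linear warmup to base_lr, then cosine decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```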
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Football | Mar 100 Football | Map Player | Mar 100 Player |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:----------:|:--------------:|
| No log | 1.0 | 62 | 17.5576 | 0.1072 | 0.2555 | 0.0663 | 0.0878 | 0.1914 | 0.1325 | 0.0543 | 0.208 | 0.3404 | 0.2858 | 0.4734 | 0.5447 | 0.0102 | 0.2166 | 0.2042 | 0.4642 |
| No log | 2.0 | 124 | 9.5985 | 0.3615 | 0.8054 | 0.2547 | 0.3082 | 0.5041 | 0.5671 | 0.2189 | 0.4346 | 0.5265 | 0.4739 | 0.6782 | 0.6455 | 0.3255 | 0.4828 | 0.3975 | 0.5702 |
| No log | 3.0 | 186 | 9.6654 | 0.3647 | 0.8181 | 0.2656 | 0.3115 | 0.5271 | 0.6 | 0.2353 | 0.4404 | 0.5339 | 0.4772 | 0.685 | 0.7447 | 0.3696 | 0.5225 | 0.3597 | 0.5452 |
| No log | 4.0 | 248 | 10.0423 | 0.3637 | 0.8276 | 0.2328 | 0.3011 | 0.5362 | 0.6091 | 0.2209 | 0.4219 | 0.5103 | 0.4515 | 0.674 | 0.7477 | 0.345 | 0.474 | 0.3823 | 0.5467 |
| No log | 5.0 | 310 | 10.2540 | 0.3952 | 0.8224 | 0.3201 | 0.3304 | 0.5617 | 0.6733 | 0.2556 | 0.4589 | 0.548 | 0.4924 | 0.6929 | 0.7598 | 0.4203 | 0.5509 | 0.3702 | 0.5452 |
| No log | 6.0 | 372 | 10.4936 | 0.3862 | 0.8134 | 0.3167 | 0.3148 | 0.5569 | 0.6616 | 0.2459 | 0.4368 | 0.5254 | 0.4659 | 0.6818 | 0.797 | 0.3995 | 0.5089 | 0.373 | 0.5418 |
| No log | 7.0 | 434 | 10.6991 | 0.4119 | 0.8405 | 0.3383 | 0.348 | 0.5668 | 0.6627 | 0.2615 | 0.4528 | 0.5379 | 0.4824 | 0.6929 | 0.7515 | 0.4282 | 0.5219 | 0.3955 | 0.5539 |
| No log | 8.0 | 496 | 10.7472 | 0.4216 | 0.8338 | 0.3662 | 0.3592 | 0.5737 | 0.6884 | 0.2668 | 0.4624 | 0.5513 | 0.4966 | 0.701 | 0.7561 | 0.4488 | 0.5438 | 0.3943 | 0.5588 |
| 19.6326 | 9.0 | 558 | 10.9720 | 0.3984 | 0.8353 | 0.3183 | 0.3307 | 0.5564 | 0.7087 | 0.2605 | 0.4449 | 0.5241 | 0.4658 | 0.6661 | 0.797 | 0.4371 | 0.5302 | 0.3598 | 0.518 |
| 19.6326 | 10.0 | 620 | 10.9521 | 0.4117 | 0.8392 | 0.3436 | 0.3458 | 0.569 | 0.6785 | 0.2584 | 0.4495 | 0.5315 | 0.4753 | 0.6815 | 0.75 | 0.436 | 0.5195 | 0.3873 | 0.5435 |
| 19.6326 | 11.0 | 682 | 11.0008 | 0.4158 | 0.8449 | 0.3501 | 0.3509 | 0.5711 | 0.6707 | 0.2626 | 0.4517 | 0.5369 | 0.4837 | 0.6821 | 0.7447 | 0.4374 | 0.5237 | 0.3942 | 0.5502 |
| 19.6326 | 12.0 | 744 | 11.0544 | 0.411 | 0.8363 | 0.3415 | 0.3436 | 0.5727 | 0.6689 | 0.259 | 0.4496 | 0.5331 | 0.4775 | 0.6857 | 0.7439 | 0.4336 | 0.5225 | 0.3883 | 0.5438 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"football",
"player"
] |
guiiwfz/zafira-person |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"person",
"dog"
] |
guiiwfz/zafira |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"street sign",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"hat",
"backpack",
"umbrella",
"shoe",
"eye glasses",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"plate",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"mirror",
"dining table",
"window",
"desk",
"toilet",
"door",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"blender",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
hxwk507/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8041
- Map: 0.4041
- Map 50: 0.8246
- Map 75: 0.3402
- Map Small: 0.3079
- Map Medium: 0.3562
- Map Large: 0.6364
- Mar 1: 0.1839
- Mar 10: 0.4794
- Mar 100: 0.5657
- Mar Small: 0.4329
- Mar Medium: 0.5174
- Mar Large: 0.7856
- Map Hardhat: 0.4075
- Mar 100 Hardhat: 0.5473
- Map No-hardhat: 0.4007
- Mar 100 No-hardhat: 0.5842
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
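The optimizer above is AdamW with the default moment coefficients; a single-parameter sketch of one bias-corrected update step (weight decay omitted for brevity):

```python
import math

def adamw_step(p, grad, m, v, t, lr=5e-5, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdamW update for a scalar parameter; returns (new_p, new_m, new_v)."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    p = p - lr * m_hat / (math.sqrt(v_hat) + eps)
    return p, m, v
```

On the first step the bias-corrected update reduces to roughly `lr * sign(grad)`.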
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Hardhat | Mar 100 Hardhat | Map No-hardhat | Mar 100 No-hardhat |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------------:|:------------------:|
| No log | 1.0 | 125 | 1.1847 | 0.0822 | 0.1954 | 0.0553 | 0.0796 | 0.0937 | 0.2324 | 0.1312 | 0.3509 | 0.4354 | 0.2086 | 0.3775 | 0.6788 | 0.1426 | 0.5655 | 0.0218 | 0.3053 |
| No log | 2.0 | 250 | 1.0931 | 0.1205 | 0.2648 | 0.0965 | 0.1277 | 0.1601 | 0.1647 | 0.1587 | 0.3817 | 0.4625 | 0.2886 | 0.435 | 0.6636 | 0.1868 | 0.5618 | 0.0542 | 0.3632 |
| No log | 3.0 | 375 | 1.1882 | 0.1449 | 0.3805 | 0.0834 | 0.0775 | 0.1623 | 0.3615 | 0.1042 | 0.3262 | 0.4281 | 0.3086 | 0.347 | 0.5962 | 0.2321 | 0.5036 | 0.0577 | 0.3526 |
| 1.472 | 4.0 | 500 | 1.0418 | 0.2419 | 0.6022 | 0.1416 | 0.1374 | 0.3265 | 0.3723 | 0.1444 | 0.4138 | 0.4827 | 0.2986 | 0.5035 | 0.6114 | 0.2844 | 0.5127 | 0.1994 | 0.4526 |
| 1.472 | 5.0 | 625 | 1.0232 | 0.2349 | 0.5815 | 0.1564 | 0.1736 | 0.3006 | 0.3826 | 0.1395 | 0.3992 | 0.4974 | 0.2686 | 0.5149 | 0.6689 | 0.2891 | 0.5527 | 0.1807 | 0.4421 |
| 1.472 | 6.0 | 750 | 0.9985 | 0.293 | 0.6561 | 0.2108 | 0.2129 | 0.3184 | 0.426 | 0.1604 | 0.4373 | 0.5125 | 0.3129 | 0.5225 | 0.6879 | 0.3452 | 0.5618 | 0.2407 | 0.4632 |
| 1.472 | 7.0 | 875 | 0.9616 | 0.3145 | 0.7258 | 0.2615 | 0.2491 | 0.3435 | 0.4999 | 0.1381 | 0.4634 | 0.5455 | 0.3171 | 0.5568 | 0.7386 | 0.3164 | 0.5436 | 0.3126 | 0.5474 |
| 0.9939 | 8.0 | 1000 | 0.9688 | 0.3194 | 0.7461 | 0.1786 | 0.1922 | 0.3171 | 0.576 | 0.16 | 0.4325 | 0.5252 | 0.2943 | 0.5194 | 0.7462 | 0.2982 | 0.4873 | 0.3405 | 0.5632 |
| 0.9939 | 9.0 | 1125 | 0.9211 | 0.3572 | 0.7888 | 0.2944 | 0.2046 | 0.3196 | 0.5454 | 0.1482 | 0.4632 | 0.5346 | 0.2571 | 0.5625 | 0.7303 | 0.3545 | 0.5218 | 0.3599 | 0.5474 |
| 0.9939 | 10.0 | 1250 | 0.9664 | 0.3463 | 0.7569 | 0.2541 | 0.2503 | 0.3473 | 0.4924 | 0.1611 | 0.4344 | 0.5116 | 0.31 | 0.499 | 0.7258 | 0.3423 | 0.5127 | 0.3504 | 0.5105 |
| 0.9939 | 11.0 | 1375 | 0.9463 | 0.3261 | 0.8263 | 0.2012 | 0.2582 | 0.2676 | 0.5931 | 0.1643 | 0.4153 | 0.5134 | 0.32 | 0.4437 | 0.7629 | 0.3386 | 0.5164 | 0.3136 | 0.5105 |
| 0.8775 | 12.0 | 1500 | 0.9153 | 0.3571 | 0.7972 | 0.2882 | 0.2397 | 0.3273 | 0.5803 | 0.1587 | 0.4377 | 0.5556 | 0.3571 | 0.566 | 0.7189 | 0.3347 | 0.5164 | 0.3795 | 0.5947 |
| 0.8775 | 13.0 | 1625 | 0.9063 | 0.3512 | 0.8299 | 0.2471 | 0.2342 | 0.3119 | 0.6117 | 0.1695 | 0.4222 | 0.5016 | 0.2786 | 0.4672 | 0.7439 | 0.3422 | 0.4927 | 0.3602 | 0.5105 |
| 0.8775 | 14.0 | 1750 | 0.9384 | 0.3351 | 0.7633 | 0.2393 | 0.1938 | 0.3064 | 0.5551 | 0.1723 | 0.4257 | 0.5105 | 0.3371 | 0.4718 | 0.7076 | 0.3496 | 0.5 | 0.3206 | 0.5211 |
| 0.8775 | 15.0 | 1875 | 0.8734 | 0.3836 | 0.8279 | 0.3055 | 0.2541 | 0.348 | 0.614 | 0.1748 | 0.4373 | 0.531 | 0.3729 | 0.4941 | 0.7386 | 0.3671 | 0.5145 | 0.4002 | 0.5474 |
| 0.7888 | 16.0 | 2000 | 0.8470 | 0.3763 | 0.8437 | 0.2603 | 0.2854 | 0.3289 | 0.5821 | 0.1822 | 0.4556 | 0.5455 | 0.4314 | 0.4861 | 0.7568 | 0.3894 | 0.5436 | 0.3633 | 0.5474 |
| 0.7888 | 17.0 | 2125 | 0.8579 | 0.3708 | 0.8189 | 0.2792 | 0.2701 | 0.2976 | 0.6115 | 0.185 | 0.443 | 0.5206 | 0.3957 | 0.4633 | 0.7326 | 0.3986 | 0.5255 | 0.343 | 0.5158 |
| 0.7888 | 18.0 | 2250 | 0.8404 | 0.3714 | 0.7962 | 0.2522 | 0.2531 | 0.3327 | 0.6139 | 0.1778 | 0.4587 | 0.5433 | 0.3729 | 0.5084 | 0.7455 | 0.3709 | 0.5182 | 0.3719 | 0.5684 |
| 0.7888 | 19.0 | 2375 | 0.8268 | 0.3997 | 0.8285 | 0.3108 | 0.2915 | 0.3526 | 0.5942 | 0.1829 | 0.4882 | 0.5481 | 0.4157 | 0.4986 | 0.7644 | 0.4014 | 0.5436 | 0.3979 | 0.5526 |
| 0.7048 | 20.0 | 2500 | 0.8091 | 0.4209 | 0.8122 | 0.4316 | 0.2668 | 0.377 | 0.6569 | 0.1964 | 0.4669 | 0.5568 | 0.4 | 0.5206 | 0.7462 | 0.4154 | 0.54 | 0.4265 | 0.5737 |
| 0.7048 | 21.0 | 2625 | 0.8206 | 0.416 | 0.8227 | 0.303 | 0.3221 | 0.3747 | 0.6208 | 0.1839 | 0.4811 | 0.5401 | 0.3729 | 0.4992 | 0.747 | 0.4198 | 0.5382 | 0.4121 | 0.5421 |
| 0.7048 | 22.0 | 2750 | 0.8108 | 0.4266 | 0.8502 | 0.4021 | 0.3038 | 0.3847 | 0.6317 | 0.1965 | 0.4688 | 0.5534 | 0.4257 | 0.5125 | 0.7515 | 0.4264 | 0.5436 | 0.4269 | 0.5632 |
| 0.7048 | 23.0 | 2875 | 0.8239 | 0.4103 | 0.8158 | 0.3492 | 0.2874 | 0.3626 | 0.6316 | 0.1919 | 0.4572 | 0.5533 | 0.4114 | 0.5152 | 0.7462 | 0.417 | 0.5382 | 0.4036 | 0.5684 |
| 0.6439 | 24.0 | 3000 | 0.8092 | 0.4077 | 0.825 | 0.3504 | 0.3205 | 0.3525 | 0.6074 | 0.1893 | 0.4883 | 0.5641 | 0.4357 | 0.5228 | 0.7652 | 0.4129 | 0.5545 | 0.4026 | 0.5737 |
| 0.6439 | 25.0 | 3125 | 0.8076 | 0.4104 | 0.8432 | 0.3547 | 0.316 | 0.3559 | 0.6302 | 0.1893 | 0.4689 | 0.5535 | 0.4429 | 0.5027 | 0.7515 | 0.4187 | 0.5491 | 0.4021 | 0.5579 |
| 0.6439 | 26.0 | 3250 | 0.7988 | 0.4133 | 0.837 | 0.3469 | 0.3285 | 0.3631 | 0.6222 | 0.2035 | 0.4849 | 0.573 | 0.4643 | 0.5166 | 0.7902 | 0.4219 | 0.5618 | 0.4048 | 0.5842 |
| 0.6439 | 27.0 | 3375 | 0.8015 | 0.4082 | 0.832 | 0.3262 | 0.3104 | 0.3619 | 0.62 | 0.1699 | 0.4785 | 0.5693 | 0.4429 | 0.5174 | 0.7902 | 0.4165 | 0.5491 | 0.3998 | 0.5895 |
| 0.6079 | 28.0 | 3500 | 0.8043 | 0.4064 | 0.824 | 0.3412 | 0.3052 | 0.3588 | 0.6301 | 0.1857 | 0.4785 | 0.5631 | 0.4329 | 0.5111 | 0.7856 | 0.4145 | 0.5473 | 0.3982 | 0.5789 |
| 0.6079 | 29.0 | 3625 | 0.8043 | 0.403 | 0.8246 | 0.3378 | 0.3082 | 0.3533 | 0.6358 | 0.1839 | 0.4794 | 0.5667 | 0.44 | 0.5174 | 0.7856 | 0.4076 | 0.5491 | 0.3984 | 0.5842 |
| 0.6079 | 30.0 | 3750 | 0.8041 | 0.4041 | 0.8246 | 0.3402 | 0.3079 | 0.3562 | 0.6364 | 0.1839 | 0.4794 | 0.5657 | 0.4329 | 0.5174 | 0.7856 | 0.4075 | 0.5473 | 0.4007 | 0.5842 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"hardhat",
"no-hardhat"
] |
mrdbourke/rt_detrv2_finetuned_trashify_box_detector_v1 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rt_detrv2_finetuned_trashify_box_detector_v1
This model is a fine-tuned version of [PekingU/rtdetr_v2_r50vd](https://huggingface.co/PekingU/rtdetr_v2_r50vd) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 8.9000
- Map: 0.5134
- Map 50: 0.6917
- Map 75: 0.5749
- Map Small: 0.4
- Map Medium: 0.2845
- Map Large: 0.5538
- Mar 1: 0.5482
- Mar 10: 0.7189
- Mar 100: 0.7663
- Mar Small: 0.4
- Mar Medium: 0.533
- Mar Large: 0.7931
- Map Bin: 0.7876
- Mar 100 Bin: 0.8879
- Map Hand: 0.5723
- Mar 100 Hand: 0.8118
- Map Not Bin: 0.1797
- Mar 100 Not Bin: 0.6857
- Map Not Hand: -1.0
- Mar 100 Not Hand: -1.0
- Map Not Trash: 0.2679
- Mar 100 Not Trash: 0.625
- Map Trash: 0.6726
- Mar 100 Trash: 0.7876
- Map Trash Arm: 0.6
- Mar 100 Trash Arm: 0.8
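The `-1.0` entries for `not_hand` follow the common sentinel convention (used, e.g., by `torchmetrics`' mAP) for a class with no ground-truth instances in the evaluation split; a sketch of averaging per-class AP while skipping that sentinel (the mean over the six remaining classes reproduces the overall Map of 0.5134 above):

```python
def mean_ap(per_class_ap, sentinel=-1.0):
    """Average per-class AP, ignoring classes marked absent with the sentinel."""
    valid = [ap for ap in per_class_ap.values() if ap != sentinel]
    return sum(valid) / len(valid) if valid else sentinel
```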
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Bin | Mar 100 Bin | Map Hand | Mar 100 Hand | Map Not Bin | Mar 100 Not Bin | Map Not Hand | Mar 100 Not Hand | Map Not Trash | Mar 100 Not Trash | Map Trash | Mar 100 Trash | Map Trash Arm | Mar 100 Trash Arm |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------:|:-----------:|:--------:|:------------:|:-----------:|:---------------:|:------------:|:----------------:|:-------------:|:-----------------:|:---------:|:-------------:|:-------------:|:-----------------:|
| 75.2499 | 1.0 | 50 | 17.5113 | 0.2036 | 0.293 | 0.2137 | 0.0 | 0.0349 | 0.2153 | 0.2926 | 0.4248 | 0.508 | 0.0 | 0.1244 | 0.5579 | 0.5792 | 0.8312 | 0.2434 | 0.7696 | 0.0044 | 0.3429 | -1.0 | -1.0 | 0.0107 | 0.4639 | 0.3837 | 0.6407 | 0.0 | 0.0 |
| 23.852 | 2.0 | 100 | 11.4502 | 0.2711 | 0.3799 | 0.3015 | 0.05 | 0.1059 | 0.2818 | 0.3735 | 0.5918 | 0.6483 | 0.35 | 0.3608 | 0.6945 | 0.6972 | 0.9035 | 0.2595 | 0.8088 | 0.0109 | 0.5643 | -1.0 | -1.0 | 0.031 | 0.5958 | 0.6088 | 0.7504 | 0.0192 | 0.2667 |
| 18.2873 | 3.0 | 150 | 10.0729 | 0.4112 | 0.5678 | 0.4869 | 0.3655 | 0.2303 | 0.432 | 0.4785 | 0.6951 | 0.7657 | 0.45 | 0.4551 | 0.7968 | 0.7569 | 0.905 | 0.3534 | 0.8343 | 0.0278 | 0.6571 | -1.0 | -1.0 | 0.1497 | 0.6236 | 0.6421 | 0.7743 | 0.5371 | 0.8 |
| 15.8982 | 4.0 | 200 | 9.4929 | 0.48 | 0.6555 | 0.5578 | 0.4 | 0.2552 | 0.5051 | 0.524 | 0.7099 | 0.7588 | 0.4 | 0.4597 | 0.7931 | 0.753 | 0.8936 | 0.5989 | 0.8353 | 0.1333 | 0.6429 | -1.0 | -1.0 | 0.1993 | 0.6319 | 0.6537 | 0.7823 | 0.542 | 0.7667 |
| 14.6758 | 5.0 | 250 | 9.4786 | 0.47 | 0.6472 | 0.5411 | 0.4 | 0.2494 | 0.5009 | 0.5346 | 0.6907 | 0.7252 | 0.4 | 0.3784 | 0.7732 | 0.7641 | 0.8766 | 0.5657 | 0.8029 | 0.1636 | 0.5571 | -1.0 | -1.0 | 0.2588 | 0.6083 | 0.6364 | 0.7726 | 0.4312 | 0.7333 |
| 13.5443 | 6.0 | 300 | 9.2135 | 0.495 | 0.6699 | 0.5594 | 0.35 | 0.347 | 0.5225 | 0.5432 | 0.7086 | 0.7602 | 0.35 | 0.5625 | 0.7905 | 0.7808 | 0.895 | 0.5788 | 0.8157 | 0.1336 | 0.6286 | -1.0 | -1.0 | 0.2336 | 0.6208 | 0.6626 | 0.8009 | 0.5804 | 0.8 |
| 12.828 | 7.0 | 350 | 8.9653 | 0.5041 | 0.6851 | 0.5799 | 0.35 | 0.2242 | 0.5328 | 0.543 | 0.7152 | 0.7596 | 0.35 | 0.5034 | 0.7952 | 0.7919 | 0.8922 | 0.5883 | 0.8127 | 0.1407 | 0.6643 | -1.0 | -1.0 | 0.2459 | 0.6264 | 0.6884 | 0.7956 | 0.5692 | 0.7667 |
| 12.1564 | 8.0 | 400 | 8.8797 | 0.509 | 0.683 | 0.5708 | 0.35 | 0.2002 | 0.542 | 0.5565 | 0.7412 | 0.7722 | 0.35 | 0.5267 | 0.8006 | 0.782 | 0.8879 | 0.6009 | 0.8137 | 0.1517 | 0.6857 | -1.0 | -1.0 | 0.2626 | 0.6278 | 0.6564 | 0.785 | 0.6003 | 0.8333 |
| 11.5731 | 9.0 | 450 | 9.0043 | 0.5126 | 0.692 | 0.5879 | 0.4 | 0.2861 | 0.5548 | 0.5454 | 0.7211 | 0.7714 | 0.4 | 0.5199 | 0.8015 | 0.7828 | 0.8823 | 0.5674 | 0.8176 | 0.2052 | 0.6929 | -1.0 | -1.0 | 0.2661 | 0.6139 | 0.6843 | 0.7885 | 0.5698 | 0.8333 |
| 11.2251 | 10.0 | 500 | 8.9000 | 0.5134 | 0.6917 | 0.5749 | 0.4 | 0.2845 | 0.5538 | 0.5482 | 0.7189 | 0.7663 | 0.4 | 0.533 | 0.7931 | 0.7876 | 0.8879 | 0.5723 | 0.8118 | 0.1797 | 0.6857 | -1.0 | -1.0 | 0.2679 | 0.625 | 0.6726 | 0.7876 | 0.6 | 0.8 |
### Framework versions
- Transformers 4.52.3
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"bin",
"hand",
"not_bin",
"not_hand",
"not_trash",
"trash",
"trash_arm"
] |
maverickyip/detr-resnet-50-isom5240 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
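In the absence of an official snippet, a hedged loading sketch (the repo id is taken from this card's title; `load_detector` is an illustrative helper, and the call downloads weights from the Hub):

```python
def load_detector():
    # Deferred import keeps this sketch importable without transformers installed.
    from transformers import DetrImageProcessor, DetrForObjectDetection
    repo = "maverickyip/detr-resnet-50-isom5240"
    processor = DetrImageProcessor.from_pretrained(repo)
    model = DetrForObjectDetection.from_pretrained(repo)
    return processor, model
```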
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"street sign",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"hat",
"backpack",
"umbrella",
"shoe",
"eye glasses",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"plate",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"mirror",
"dining table",
"window",
"desk",
"toilet",
"door",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"blender",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
HichTala/diffusiondet-dota |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
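A minimal, hedged starting sketch in the meantime: the DOTA label mapping below comes from this card's label list, while the loader *assumes* the checkpoint works with the generic 🤗 object-detection auto classes (DiffusionDet may require a custom model class; `load_detector` and its defaults are illustrative, not confirmed by the card):

```python
# Label list copied from this card; indices follow its order.
DOTA_LABELS = [
    "plane", "ship", "storage-tank", "baseball-diamond", "tennis-court",
    "basketball-court", "ground-track-field", "harbor", "bridge",
    "small-vehicle", "large-vehicle", "roundabout", "swimming-pool",
    "helicopter", "soccer-ball-field", "container-crane",
]
id2label = dict(enumerate(DOTA_LABELS))
label2id = {name: idx for idx, name in id2label.items()}

def load_detector(model_id: str = "HichTala/diffusiondet-dota"):
    """Load processor and model. Assumes the generic auto classes can
    handle this checkpoint; imports are lazy so the label mapping above
    is usable without transformers installed."""
    from transformers import AutoImageProcessor, AutoModelForObjectDetection
    processor = AutoImageProcessor.from_pretrained(model_id)
    model = AutoModelForObjectDetection.from_pretrained(model_id)
    return processor, model
```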
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"plane",
"ship",
"storage-tank",
"baseball-diamond",
"tennis-court",
"basketball-court",
"ground-track-field",
"harbor",
"bridge",
"small-vehicle",
"large-vehicle",
"roundabout",
"swimming-pool",
"helicopter",
"soccer-ball-field",
"container-crane"
] |
alexlop/detr-finetuned-custom-coco |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
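No snippet is provided yet; the sketch below assumes the checkpoint follows the standard DETR object-detection API (suggested by the repository name, not confirmed by the card). The `detect` helper and the 0.5 threshold are illustrative choices:

```python
def filter_detections(scores, labels, boxes, threshold=0.5):
    """Keep only detections whose confidence clears the threshold
    (analogous to the score filtering that
    post_process_object_detection applies)."""
    keep = [i for i, s in enumerate(scores) if s >= threshold]
    return ([scores[i] for i in keep],
            [labels[i] for i in keep],
            [boxes[i] for i in keep])

def detect(image, model_id="alexlop/detr-finetuned-custom-coco", threshold=0.5):
    """Run object detection on a PIL image. Imports are lazy so the
    pure-Python helper above works without torch/transformers."""
    import torch
    from transformers import AutoImageProcessor, AutoModelForObjectDetection
    processor = AutoImageProcessor.from_pretrained(model_id)
    model = AutoModelForObjectDetection.from_pretrained(model_id)
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
    return processor.post_process_object_detection(
        outputs, threshold=threshold, target_sizes=target_sizes
    )[0]
```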
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"vascular pattern normal",
"vascular pattern obliterated",
"vascular pattern patchy obliteration",
"erosions",
"deep ulcer",
"superficial ulcer",
"marked erythema",
"moderate erythema",
"mucosal bleeding",
"luminal mild bleeding",
"luminal severe bleeding",
"low friability",
"moderate friability",
"severe friability",
"no object"
] |
Nihel13/tatr_model |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
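A hedged usage sketch in the meantime: the two labels on this card ("table", "table rotated") match the Table Transformer detection head, so the snippet assumes an object-detection checkpoint usable through the high-level `pipeline` API; the helper name and the 0.7 threshold are illustrative assumptions:

```python
# Label mapping copied from this card.
id2label = {0: "table", 1: "table rotated"}

def detect_tables(image_path, model_id="Nihel13/tatr_model", threshold=0.7):
    """Detect table regions in a document page image via the high-level
    pipeline API. The import is lazy so the label mapping above stays
    usable without transformers installed."""
    from transformers import pipeline
    detector = pipeline("object-detection", model=model_id)
    # Each result dict carries "score", "label", and a pixel "box".
    return [d for d in detector(image_path) if d["score"] >= threshold]
```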
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"table",
"table rotated"
] |
godminhkhoa/rtdetr-v2-r50-cppe5-finetune-2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-v2-r50-cppe5-finetune-2
This model is a fine-tuned version of [PekingU/rtdetr_v2_r50vd](https://huggingface.co/PekingU/rtdetr_v2_r50vd) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 9.6769
- Map: 0.5368
- Map 50: 0.8312
- Map 75: 0.5962
- Map Small: 0.5364
- Map Medium: 0.441
- Map Large: 0.7689
- Mar 1: 0.3954
- Mar 10: 0.6567
- Mar 100: 0.6967
- Mar Small: 0.6067
- Mar Medium: 0.6153
- Mar Large: 0.8557
- Map Coverall: 0.5756
- Mar 100 Coverall: 0.7821
- Map Face Shield: 0.6521
- Mar 100 Face Shield: 0.8059
- Map Gloves: 0.4261
- Mar 100 Gloves: 0.5627
- Map Goggles: 0.4722
- Mar 100 Goggles: 0.6897
- Map Mask: 0.5578
- Mar 100 Mask: 0.6431
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 40
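The listed hyperparameters map onto 🤗 `TrainingArguments` roughly as below. This is a sketch, not the actual training script; `output_dir` and any evaluation/save strategy are assumptions not stated in this card:

```python
# Mirror of the hyperparameters listed above.
HPARAMS = {
    "learning_rate": 5e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "warmup_steps": 300,
    "num_train_epochs": 40,
}

def build_training_args(output_dir="rtdetr-v2-r50-cppe5-finetune-2"):
    """Turn the dict above into TrainingArguments. The import is lazy so
    the dict itself is usable without transformers installed."""
    from transformers import TrainingArguments
    return TrainingArguments(output_dir=output_dir, **HPARAMS)
```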
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 25.4682 | 0.0465 | 0.0795 | 0.0437 | 0.0028 | 0.0102 | 0.0671 | 0.0688 | 0.1912 | 0.2765 | 0.1031 | 0.1968 | 0.5097 | 0.2085 | 0.5748 | 0.002 | 0.243 | 0.0029 | 0.1746 | 0.001 | 0.1492 | 0.0184 | 0.2409 |
| No log | 2.0 | 214 | 15.4462 | 0.1823 | 0.3465 | 0.1617 | 0.0681 | 0.1115 | 0.2591 | 0.2151 | 0.4239 | 0.4884 | 0.2985 | 0.4232 | 0.7013 | 0.4591 | 0.6716 | 0.0879 | 0.5013 | 0.0658 | 0.3906 | 0.0495 | 0.4308 | 0.2489 | 0.4476 |
| No log | 3.0 | 321 | 12.7644 | 0.2555 | 0.4657 | 0.2476 | 0.0847 | 0.191 | 0.4466 | 0.2627 | 0.4528 | 0.5148 | 0.2473 | 0.4805 | 0.7435 | 0.5487 | 0.7185 | 0.1397 | 0.5468 | 0.1609 | 0.4183 | 0.1243 | 0.4138 | 0.3041 | 0.4764 |
| No log | 4.0 | 428 | 12.1356 | 0.2919 | 0.5558 | 0.2588 | 0.1466 | 0.2271 | 0.5099 | 0.285 | 0.4706 | 0.5293 | 0.3241 | 0.4886 | 0.729 | 0.5238 | 0.6986 | 0.2541 | 0.6101 | 0.1794 | 0.4281 | 0.1888 | 0.4508 | 0.3136 | 0.4587 |
| 36.0648 | 5.0 | 535 | 11.8591 | 0.3218 | 0.5856 | 0.2997 | 0.1436 | 0.2566 | 0.5379 | 0.3028 | 0.4821 | 0.5359 | 0.3055 | 0.5098 | 0.7267 | 0.5498 | 0.7104 | 0.2962 | 0.6139 | 0.1811 | 0.4147 | 0.2364 | 0.46 | 0.3456 | 0.4804 |
| 36.0648 | 6.0 | 642 | 11.7190 | 0.3157 | 0.5685 | 0.2997 | 0.1426 | 0.2557 | 0.5212 | 0.2823 | 0.4795 | 0.5398 | 0.3338 | 0.4924 | 0.7223 | 0.5437 | 0.7099 | 0.2111 | 0.5886 | 0.2288 | 0.4482 | 0.255 | 0.4677 | 0.34 | 0.4844 |
| 36.0648 | 7.0 | 749 | 11.9062 | 0.3212 | 0.5979 | 0.2974 | 0.1367 | 0.259 | 0.5387 | 0.2942 | 0.4765 | 0.5428 | 0.3495 | 0.5085 | 0.7153 | 0.5386 | 0.6905 | 0.303 | 0.6139 | 0.2067 | 0.4522 | 0.209 | 0.4754 | 0.3487 | 0.4818 |
| 36.0648 | 8.0 | 856 | 11.6933 | 0.3183 | 0.5969 | 0.2901 | 0.1414 | 0.2541 | 0.54 | 0.2977 | 0.483 | 0.5404 | 0.3387 | 0.4954 | 0.7399 | 0.5594 | 0.705 | 0.2812 | 0.6 | 0.2174 | 0.4429 | 0.208 | 0.4785 | 0.3253 | 0.4756 |
| 36.0648 | 9.0 | 963 | 11.6233 | 0.3202 | 0.5826 | 0.3226 | 0.1359 | 0.2583 | 0.5543 | 0.3035 | 0.4884 | 0.5462 | 0.3383 | 0.5126 | 0.7321 | 0.5472 | 0.6937 | 0.2746 | 0.619 | 0.2086 | 0.4545 | 0.2237 | 0.4754 | 0.347 | 0.4884 |
| 15.3215 | 10.0 | 1070 | 11.4090 | 0.3421 | 0.6207 | 0.3279 | 0.1389 | 0.2796 | 0.5757 | 0.316 | 0.4935 | 0.5477 | 0.3142 | 0.5067 | 0.745 | 0.5714 | 0.6991 | 0.3224 | 0.6278 | 0.2331 | 0.4371 | 0.2325 | 0.4754 | 0.3511 | 0.4991 |
| 15.3215 | 11.0 | 1177 | 11.5408 | 0.3436 | 0.6394 | 0.3212 | 0.1544 | 0.2759 | 0.5751 | 0.3175 | 0.5003 | 0.5554 | 0.3501 | 0.5119 | 0.7464 | 0.5444 | 0.7059 | 0.329 | 0.5949 | 0.2259 | 0.4571 | 0.2647 | 0.5108 | 0.354 | 0.5084 |
| 15.3215 | 12.0 | 1284 | 11.7707 | 0.3296 | 0.6169 | 0.3078 | 0.1525 | 0.2669 | 0.5336 | 0.3037 | 0.4793 | 0.5399 | 0.3164 | 0.4945 | 0.7323 | 0.5477 | 0.6892 | 0.32 | 0.6101 | 0.2295 | 0.429 | 0.21 | 0.4862 | 0.3408 | 0.4849 |
| 15.3215 | 13.0 | 1391 | 11.6683 | 0.3415 | 0.6231 | 0.3243 | 0.1447 | 0.2795 | 0.5746 | 0.3181 | 0.4946 | 0.5536 | 0.3625 | 0.5028 | 0.7387 | 0.5571 | 0.6901 | 0.3245 | 0.6114 | 0.2424 | 0.4585 | 0.2356 | 0.5092 | 0.3479 | 0.4987 |
| 15.3215 | 14.0 | 1498 | 11.7344 | 0.3305 | 0.6156 | 0.3133 | 0.1478 | 0.2877 | 0.5388 | 0.3194 | 0.4876 | 0.5468 | 0.3387 | 0.5232 | 0.7116 | 0.5236 | 0.7113 | 0.327 | 0.6 | 0.2255 | 0.4563 | 0.2373 | 0.4738 | 0.3391 | 0.4924 |
| 13.4858 | 15.0 | 1605 | 11.6264 | 0.3307 | 0.6056 | 0.3161 | 0.1213 | 0.2799 | 0.5711 | 0.3242 | 0.4933 | 0.5506 | 0.3454 | 0.5066 | 0.7434 | 0.5581 | 0.7144 | 0.3034 | 0.5962 | 0.2163 | 0.4701 | 0.2598 | 0.4862 | 0.3159 | 0.4862 |
| 13.4858 | 16.0 | 1712 | 11.5521 | 0.3287 | 0.6044 | 0.3125 | 0.1686 | 0.2751 | 0.5519 | 0.3171 | 0.4922 | 0.5484 | 0.3757 | 0.4978 | 0.7246 | 0.5635 | 0.7162 | 0.3018 | 0.5861 | 0.234 | 0.4714 | 0.236 | 0.48 | 0.3084 | 0.4884 |
| 13.4858 | 17.0 | 1819 | 11.7578 | 0.3382 | 0.6292 | 0.3237 | 0.164 | 0.281 | 0.5516 | 0.3215 | 0.4924 | 0.548 | 0.3353 | 0.5037 | 0.7302 | 0.5505 | 0.709 | 0.3225 | 0.6076 | 0.2187 | 0.4402 | 0.2704 | 0.5031 | 0.3286 | 0.48 |
| 13.4858 | 18.0 | 1926 | 11.5963 | 0.3454 | 0.6381 | 0.3218 | 0.1607 | 0.2921 | 0.5647 | 0.3294 | 0.4951 | 0.5507 | 0.3516 | 0.489 | 0.7565 | 0.5498 | 0.705 | 0.348 | 0.6051 | 0.2177 | 0.45 | 0.2613 | 0.5138 | 0.3504 | 0.4796 |
| 12.4151 | 19.0 | 2033 | 11.5293 | 0.3469 | 0.6347 | 0.3237 | 0.1415 | 0.2967 | 0.5694 | 0.323 | 0.4923 | 0.5415 | 0.3126 | 0.4894 | 0.733 | 0.5663 | 0.7032 | 0.345 | 0.5975 | 0.2389 | 0.4469 | 0.2667 | 0.4846 | 0.3175 | 0.4756 |
| 12.4151 | 20.0 | 2140 | 11.5551 | 0.3414 | 0.6306 | 0.3143 | 0.1716 | 0.2916 | 0.5768 | 0.3312 | 0.4969 | 0.5521 | 0.3257 | 0.513 | 0.7292 | 0.5629 | 0.7243 | 0.3179 | 0.5633 | 0.2498 | 0.4696 | 0.267 | 0.52 | 0.3095 | 0.4831 |
| 12.4151 | 21.0 | 2247 | 11.9833 | 0.3286 | 0.6184 | 0.2991 | 0.1597 | 0.277 | 0.533 | 0.3224 | 0.4898 | 0.5452 | 0.3502 | 0.5003 | 0.7228 | 0.5478 | 0.6955 | 0.2979 | 0.5899 | 0.2414 | 0.4638 | 0.2361 | 0.4923 | 0.3197 | 0.4844 |
| 12.4151 | 22.0 | 2354 | 11.9215 | 0.3408 | 0.6259 | 0.3184 | 0.142 | 0.2864 | 0.5548 | 0.3264 | 0.4893 | 0.5399 | 0.3216 | 0.4872 | 0.744 | 0.5429 | 0.6923 | 0.3578 | 0.619 | 0.2483 | 0.4585 | 0.2269 | 0.4569 | 0.3282 | 0.4729 |
| 12.4151 | 23.0 | 2461 | 12.0853 | 0.3304 | 0.6162 | 0.3031 | 0.1564 | 0.2852 | 0.5542 | 0.3198 | 0.4856 | 0.5275 | 0.309 | 0.4927 | 0.7118 | 0.5404 | 0.7041 | 0.3271 | 0.5886 | 0.242 | 0.4237 | 0.2325 | 0.4492 | 0.3097 | 0.472 |
| 11.6364 | 24.0 | 2568 | 11.8409 | 0.3344 | 0.622 | 0.3186 | 0.1689 | 0.2871 | 0.5485 | 0.3217 | 0.4938 | 0.5446 | 0.3457 | 0.4936 | 0.7208 | 0.5455 | 0.7126 | 0.2952 | 0.5899 | 0.2615 | 0.4638 | 0.2534 | 0.4862 | 0.3164 | 0.4707 |
| 11.6364 | 25.0 | 2675 | 12.1816 | 0.3201 | 0.5981 | 0.3021 | 0.1342 | 0.2717 | 0.5455 | 0.3151 | 0.4803 | 0.5303 | 0.3205 | 0.4817 | 0.7252 | 0.5315 | 0.7023 | 0.2957 | 0.5911 | 0.2163 | 0.4415 | 0.2379 | 0.4462 | 0.3188 | 0.4707 |
| 11.6364 | 26.0 | 2782 | 11.9448 | 0.3291 | 0.6113 | 0.2964 | 0.1687 | 0.2751 | 0.5635 | 0.3163 | 0.4875 | 0.5385 | 0.3434 | 0.4928 | 0.7221 | 0.5433 | 0.6919 | 0.2817 | 0.5797 | 0.2582 | 0.4746 | 0.2367 | 0.4708 | 0.3254 | 0.4756 |
| 11.6364 | 27.0 | 2889 | 11.9042 | 0.322 | 0.6094 | 0.2899 | 0.1286 | 0.2739 | 0.5564 | 0.3211 | 0.4919 | 0.5404 | 0.3371 | 0.4771 | 0.7404 | 0.5306 | 0.7005 | 0.3051 | 0.5975 | 0.2411 | 0.4509 | 0.228 | 0.4769 | 0.3052 | 0.4764 |
| 11.6364 | 28.0 | 2996 | 12.1391 | 0.3242 | 0.6003 | 0.3057 | 0.1342 | 0.2714 | 0.5507 | 0.3144 | 0.483 | 0.5308 | 0.3222 | 0.4825 | 0.7136 | 0.5356 | 0.6986 | 0.298 | 0.5848 | 0.2454 | 0.4487 | 0.2544 | 0.4538 | 0.2875 | 0.468 |
| 11.0017 | 29.0 | 3103 | 12.0627 | 0.3371 | 0.6166 | 0.3215 | 0.1445 | 0.2846 | 0.5479 | 0.3212 | 0.4874 | 0.5441 | 0.3267 | 0.5051 | 0.7284 | 0.5411 | 0.7077 | 0.3292 | 0.6038 | 0.2528 | 0.4451 | 0.2572 | 0.4877 | 0.3052 | 0.4764 |
| 11.0017 | 30.0 | 3210 | 12.3028 | 0.3353 | 0.6079 | 0.3192 | 0.158 | 0.2837 | 0.5625 | 0.315 | 0.4875 | 0.5332 | 0.2965 | 0.4917 | 0.7294 | 0.5347 | 0.6905 | 0.3534 | 0.6 | 0.246 | 0.4464 | 0.242 | 0.4662 | 0.3001 | 0.4631 |
| 11.0017 | 31.0 | 3317 | 11.9750 | 0.339 | 0.6148 | 0.325 | 0.1401 | 0.2827 | 0.5603 | 0.3195 | 0.4821 | 0.5328 | 0.2975 | 0.4866 | 0.7262 | 0.5451 | 0.7063 | 0.3469 | 0.5848 | 0.2502 | 0.4522 | 0.2412 | 0.4585 | 0.3115 | 0.4622 |
| 11.0017 | 32.0 | 3424 | 12.0644 | 0.3361 | 0.6158 | 0.3151 | 0.1374 | 0.2836 | 0.5539 | 0.3197 | 0.4886 | 0.5368 | 0.2821 | 0.5061 | 0.7212 | 0.5472 | 0.6982 | 0.3281 | 0.5886 | 0.2436 | 0.4612 | 0.2432 | 0.4615 | 0.3186 | 0.4747 |
| 10.4746 | 33.0 | 3531 | 11.9360 | 0.3323 | 0.615 | 0.3027 | 0.1597 | 0.2821 | 0.5495 | 0.3162 | 0.4863 | 0.5366 | 0.294 | 0.4971 | 0.7228 | 0.531 | 0.7032 | 0.3396 | 0.5949 | 0.2532 | 0.4705 | 0.2255 | 0.4477 | 0.3121 | 0.4667 |
| 10.4746 | 34.0 | 3638 | 11.7375 | 0.3393 | 0.6215 | 0.3145 | 0.1483 | 0.2853 | 0.5579 | 0.326 | 0.4915 | 0.5427 | 0.3199 | 0.5005 | 0.7224 | 0.5444 | 0.6968 | 0.3343 | 0.6076 | 0.2515 | 0.4638 | 0.2479 | 0.4615 | 0.3183 | 0.4836 |
| 10.4746 | 35.0 | 3745 | 11.9828 | 0.3282 | 0.605 | 0.3017 | 0.1392 | 0.2717 | 0.5518 | 0.3145 | 0.4801 | 0.5305 | 0.2918 | 0.4795 | 0.7182 | 0.5313 | 0.6973 | 0.3315 | 0.5835 | 0.2456 | 0.4616 | 0.2217 | 0.4446 | 0.3108 | 0.4653 |
| 10.4746 | 36.0 | 3852 | 11.8752 | 0.3302 | 0.6169 | 0.31 | 0.1597 | 0.2683 | 0.5583 | 0.3127 | 0.4814 | 0.5278 | 0.297 | 0.4793 | 0.709 | 0.5367 | 0.6968 | 0.3315 | 0.5848 | 0.2429 | 0.4402 | 0.2198 | 0.4477 | 0.32 | 0.4693 |
| 10.4746 | 37.0 | 3959 | 11.9312 | 0.3304 | 0.6097 | 0.3073 | 0.1444 | 0.2765 | 0.5464 | 0.3185 | 0.4809 | 0.536 | 0.3159 | 0.4909 | 0.7086 | 0.5382 | 0.6923 | 0.3265 | 0.6013 | 0.2545 | 0.454 | 0.213 | 0.4538 | 0.3198 | 0.4787 |
| 9.9527 | 38.0 | 4066 | 11.9053 | 0.3355 | 0.6135 | 0.3116 | 0.1443 | 0.283 | 0.5541 | 0.3188 | 0.4856 | 0.5333 | 0.3009 | 0.4804 | 0.7224 | 0.5382 | 0.6919 | 0.3493 | 0.5962 | 0.2481 | 0.4527 | 0.2271 | 0.4523 | 0.3151 | 0.4733 |
| 9.9527 | 39.0 | 4173 | 11.9321 | 0.331 | 0.6118 | 0.3094 | 0.1417 | 0.2754 | 0.5537 | 0.313 | 0.4817 | 0.5325 | 0.3038 | 0.4847 | 0.7139 | 0.538 | 0.6995 | 0.3312 | 0.5949 | 0.2491 | 0.4549 | 0.2289 | 0.4415 | 0.3079 | 0.4716 |
| 9.9527 | 40.0 | 4280 | 11.8940 | 0.3317 | 0.6135 | 0.3061 | 0.1432 | 0.276 | 0.5536 | 0.3136 | 0.4796 | 0.5283 | 0.2983 | 0.477 | 0.715 | 0.5378 | 0.7009 | 0.3298 | 0.5823 | 0.2506 | 0.4496 | 0.2308 | 0.4446 | 0.3095 | 0.464 |
### Framework versions
- Transformers 4.52.3
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
goodcasper/rtdetr-finetuned-kvasir-test |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-finetuned-kvasir-test
This model is a fine-tuned version of [PekingU/rtdetr_v2_r18vd](https://huggingface.co/PekingU/rtdetr_v2_r18vd) on the goodcasper/kvasir dataset.
It achieves the following results on the evaluation set:
- Loss: 14.8933
- Map: 0.0002
- Map 50: 0.0005
- Map 75: 0.0001
- Map Small: 0.0
- Map Medium: 0.0004
- Map Large: 0.0006
- Mar 1: 0.0299
- Mar 10: 0.0624
- Mar 100: 0.1045
- Mar Small: 0.0
- Mar Medium: 0.1529
- Mar Large: 0.0518
- Map Ampulla of vater: -1.0
- Mar 100 Ampulla of vater: -1.0
- Map Angiectasia: 0.0
- Mar 100 Angiectasia: 0.0
- Map Blood - fresh: 0.0
- Mar 100 Blood - fresh: 0.0
- Map Blood - hematin: 0.0004
- Mar 100 Blood - hematin: 0.2
- Map Erosion: 0.0
- Mar 100 Erosion: 0.0
- Map Erythema: 0.0004
- Mar 100 Erythema: 0.5571
- Map Foreign body: 0.0
- Mar 100 Foreign body: 0.0071
- Map Ileocecal valve: -1.0
- Mar 100 Ileocecal valve: -1.0
- Map Lymphangiectasia: 0.0011
- Mar 100 Lymphangiectasia: 0.1759
- Map Normal clean mucosa: -1.0
- Mar 100 Normal clean mucosa: -1.0
- Map Polyp: 0.0
- Mar 100 Polyp: 0.0
- Map Pylorus: -1.0
- Mar 100 Pylorus: -1.0
- Map Reduced mucosal view: -1.0
- Mar 100 Reduced mucosal view: -1.0
- Map Ulcer: 0.0
- Mar 100 Ulcer: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 12
- eval_batch_size: 12
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 120.0
- mixed_precision_training: Native AMP
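The `Map`/`Mar` columns reported for this model are COCO-style (mean) average precision and recall, which are built on box IoU. For reference, a minimal IoU helper in pure Python (an illustrative snippet, not the evaluation code used for this card):

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes in (x_min, y_min, x_max, y_max)
    format -- the overlap measure underlying the mAP/mAR columns."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# "Map 50" counts a prediction as correct when its IoU with a ground-truth
# box reaches 0.5; "Map 75" tightens that threshold to 0.75.
```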
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Ampulla of vater | Mar 100 Ampulla of vater | Map Angiectasia | Mar 100 Angiectasia | Map Blood - fresh | Mar 100 Blood - fresh | Map Blood - hematin | Mar 100 Blood - hematin | Map Erosion | Mar 100 Erosion | Map Erythema | Mar 100 Erythema | Map Foreign body | Mar 100 Foreign body | Map Ileocecal valve | Mar 100 Ileocecal valve | Map Lymphangiectasia | Mar 100 Lymphangiectasia | Map Normal clean mucosa | Mar 100 Normal clean mucosa | Map Polyp | Mar 100 Polyp | Map Pylorus | Mar 100 Pylorus | Map Reduced mucosal view | Mar 100 Reduced mucosal view | Map Ulcer | Mar 100 Ulcer |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------:|:------------------------:|:---------------:|:-------------------:|:-----------------:|:---------------------:|:-------------------:|:-----------------------:|:-----------:|:---------------:|:------------:|:----------------:|:----------------:|:--------------------:|:-------------------:|:-----------------------:|:--------------------:|:------------------------:|:-----------------------:|:---------------------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------------------:|:----------------------------:|:---------:|:-------------:|
| 23.0656 | 1.0 | 1130 | 12.8286 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0002 | 0.0003 | 0.0435 | 0.1464 | 0.2434 | 0.0 | 0.1973 | 0.2904 | -1.0 | -1.0 | 0.0003 | 0.353 | 0.0009 | 0.6233 | 0.0 | 0.0 | 0.0 | 0.0346 | 0.0 | 0.0 | 0.0002 | 0.3417 | -1.0 | -1.0 | 0.0001 | 0.1569 | -1.0 | -1.0 | 0.0001 | 0.3714 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0001 | 0.3094 |
| 11.4993 | 2.0 | 2260 | 12.7716 | 0.012 | 0.0217 | 0.0124 | 0.0 | 0.014 | 0.0196 | 0.148 | 0.2772 | 0.3332 | 0.0083 | 0.28 | 0.3574 | -1.0 | -1.0 | 0.0027 | 0.484 | 0.003 | 0.5372 | 0.0 | 0.0 | 0.0004 | 0.2981 | 0.0001 | 0.05 | 0.0994 | 0.4845 | -1.0 | -1.0 | 0.0009 | 0.3707 | -1.0 | -1.0 | 0.0013 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0001 | 0.0741 |
| 10.1596 | 3.0 | 3390 | 13.5678 | 0.001 | 0.0019 | 0.0009 | 0.0 | 0.0011 | 0.0014 | 0.1885 | 0.2807 | 0.325 | 0.0 | 0.2348 | 0.4219 | -1.0 | -1.0 | 0.0018 | 0.451 | 0.0031 | 0.7395 | 0.0 | 0.0 | 0.0004 | 0.3519 | 0.0001 | 0.1143 | 0.0033 | 0.5381 | -1.0 | -1.0 | 0.0 | 0.0448 | -1.0 | -1.0 | 0.0005 | 0.6857 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 9.5887 | 4.0 | 4520 | 13.4943 | 0.0763 | 0.1217 | 0.0757 | 0.0 | 0.0589 | 0.1644 | 0.2046 | 0.331 | 0.3635 | 0.0 | 0.33 | 0.3914 | 0.0569 | 0.569 | 0.2351 | 0.7349 | 0.0 | 0.0 | 0.0055 | 0.4635 | 0.0 | 0.0 | 0.3087 | 0.719 | -1.0 | -1.0 | 0.0804 | 0.5034 | -1.0 | -1.0 | 0.0005 | 0.2571 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0247 |
| 9.0855 | 5.0 | 5650 | 12.8922 | 0.0226 | 0.0338 | 0.0242 | 0.0 | 0.0484 | 0.0158 | 0.2178 | 0.3741 | 0.4743 | 0.1042 | 0.4772 | 0.4636 | -1.0 | -1.0 | 0.0092 | 0.587 | 0.0034 | 0.7419 | 0.0001 | 0.2 | 0.0008 | 0.55 | 0.0 | 0.05 | 0.1248 | 0.6905 | -1.0 | -1.0 | 0.0638 | 0.7259 | -1.0 | -1.0 | 0.0005 | 0.5143 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0004 | 0.2094 |
| 8.7544 | 6.0 | 6780 | 13.2963 | 0.036 | 0.0644 | 0.0373 | 0.0 | 0.0503 | 0.0453 | 0.1654 | 0.3183 | 0.3796 | 0.05 | 0.3867 | 0.4199 | 0.0235 | 0.617 | 0.0113 | 0.6884 | 0.0 | 0.0 | 0.0009 | 0.4019 | 0.0 | 0.0786 | 0.0834 | 0.6774 | -1.0 | -1.0 | 0.2051 | 0.6845 | -1.0 | -1.0 | 0.0001 | 0.2143 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0001 | 0.0541 |
| 8.4693 | 7.0 | 7910 | 13.7101 | 0.0081 | 0.0115 | 0.0078 | 0.0001 | 0.0116 | 0.012 | 0.2236 | 0.3891 | 0.4653 | 0.1083 | 0.3974 | 0.437 | -1.0 | -1.0 | 0.0121 | 0.678 | 0.0433 | 0.6907 | 0.0 | 0.0 | 0.0006 | 0.3365 | 0.0009 | 0.3429 | 0.0058 | 0.6964 | -1.0 | -1.0 | 0.0093 | 0.7034 | -1.0 | -1.0 | 0.0005 | 0.5571 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0004 | 0.1824 |
| 8.3685 | 8.0 | 9040 | 13.6692 | 0.0184 | 0.0337 | 0.0207 | 0.0 | 0.0317 | 0.0261 | 0.2478 | 0.3994 | 0.4885 | 0.0583 | 0.4261 | 0.488 | -1.0 | -1.0 | 0.0712 | 0.678 | 0.0027 | 0.6395 | 0.0 | 0.0 | 0.0015 | 0.5712 | 0.0013 | 0.35 | 0.0671 | 0.6833 | -1.0 | -1.0 | 0.0208 | 0.7069 | -1.0 | -1.0 | 0.0011 | 0.6571 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0002 | 0.1106 |
| 8.103 | 9.0 | 10170 | 13.8539 | 0.0074 | 0.0141 | 0.007 | 0.0 | 0.0127 | 0.02 | 0.1581 | 0.4004 | 0.4628 | 0.05 | 0.4423 | 0.4492 | -1.0 | -1.0 | 0.0345 | 0.6 | 0.0072 | 0.6884 | 0.0 | 0.0 | 0.0011 | 0.45 | 0.0041 | 0.3429 | 0.011 | 0.6488 | -1.0 | -1.0 | 0.0078 | 0.6259 | -1.0 | -1.0 | 0.0004 | 0.4857 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0009 | 0.3235 |
| 8.006 | 10.0 | 11300 | 12.9860 | 0.0048 | 0.0108 | 0.0028 | 0.0 | 0.0061 | 0.0155 | 0.1911 | 0.3426 | 0.4245 | 0.1 | 0.413 | 0.41 | -1.0 | -1.0 | 0.028 | 0.515 | 0.001 | 0.7233 | 0.0 | 0.0 | 0.0003 | 0.3192 | 0.0001 | 0.2643 | 0.0124 | 0.6726 | -1.0 | -1.0 | 0.001 | 0.4672 | -1.0 | -1.0 | 0.0004 | 0.5 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0004 | 0.3588 |
| 7.8882 | 11.0 | 12430 | 13.5480 | 0.0008 | 0.0024 | 0.0005 | 0.0 | 0.0023 | 0.0005 | 0.094 | 0.1139 | 0.1226 | 0.0 | 0.0312 | 0.1299 | -1.0 | -1.0 | 0.0018 | 0.042 | 0.0011 | 0.4442 | 0.0 | 0.0 | 0.0 | 0.0154 | 0.0001 | 0.05 | 0.0015 | 0.0952 | 0.0023 | 0.031 | -1.0 | -1.0 | 0.0007 | 0.4 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0259 |
| 7.8637 | 12.0 | 13560 | 13.1137 | 0.0103 | 0.0173 | 0.0109 | 0.0 | 0.0247 | 0.0061 | 0.1256 | 0.2193 | 0.289 | 0.0 | 0.2732 | 0.2931 | -1.0 | -1.0 | 0.0026 | 0.375 | 0.0014 | 0.4744 | 0.0 | 0.05 | 0.0011 | 0.3058 | 0.0027 | 0.3357 | 0.0804 | 0.3643 | -1.0 | -1.0 | 0.0032 | 0.2828 | -1.0 | -1.0 | 0.0002 | 0.1429 | -1.0 | -1.0 | -1.0 | -1.0 | 0.001 | 0.2706 |
| 7.6162 | 13.0 | 14690 | 13.6007 | 0.0006 | 0.0013 | 0.0005 | 0.0 | 0.0012 | 0.0013 | 0.1325 | 0.2767 | 0.4051 | 0.0083 | 0.414 | 0.4031 | -1.0 | -1.0 | 0.0014 | 0.533 | 0.0004 | 0.4744 | 0.0 | 0.0 | 0.0011 | 0.5308 | 0.0002 | 0.4214 | 0.0007 | 0.5107 | -1.0 | -1.0 | 0.0006 | 0.5052 | 0.0001 | 0.2 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0007 | 0.4706 |
| 7.6058 | 14.0 | 15820 | 14.1225 | 0.0019 | 0.0041 | 0.0015 | 0.0 | 0.0036 | 0.0021 | 0.1857 | 0.3594 | 0.4348 | 0.0667 | 0.4327 | 0.4148 | -1.0 | -1.0 | 0.0047 | 0.621 | 0.0013 | 0.4023 | 0.0 | 0.05 | 0.001 | 0.2673 | 0.0009 | 0.6143 | 0.0035 | 0.5131 | -1.0 | -1.0 | 0.0025 | 0.55 | 0.0006 | 0.4286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0023 | 0.4671 |
| 7.5284 | 15.0 | 16950 | 13.9916 | 0.003 | 0.0057 | 0.0032 | 0.0001 | 0.0053 | 0.0031 | 0.0964 | 0.2568 | 0.3347 | 0.1583 | 0.3406 | 0.3148 | -1.0 | -1.0 | 0.0191 | 0.677 | 0.0015 | 0.586 | 0.0 | 0.075 | 0.0016 | 0.4365 | 0.0002 | 0.2 | 0.0032 | 0.5548 | 0.0016 | 0.4534 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0294 |
| 7.3309 | 16.0 | 18080 | 14.0685 | 0.0008 | 0.0018 | 0.0006 | 0.0001 | 0.0014 | 0.0012 | 0.0768 | 0.2363 | 0.3649 | 0.125 | 0.3749 | 0.3369 | -1.0 | -1.0 | 0.0029 | 0.614 | 0.001 | 0.6977 | 0.0 | 0.025 | 0.0006 | 0.3346 | 0.0001 | 0.3143 | 0.0013 | 0.5167 | -1.0 | -1.0 | 0.0009 | 0.4552 | 0.0 | 0.1429 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0003 | 0.1835 |
| 7.346 | 17.0 | 19210 | 13.6449 | 0.0056 | 0.0131 | 0.0013 | 0.0001 | 0.0095 | 0.0023 | 0.0941 | 0.2553 | 0.3142 | 0.1833 | 0.3827 | 0.2142 | -1.0 | -1.0 | 0.0457 | 0.661 | 0.0003 | 0.2419 | 0.0 | 0.05 | 0.0008 | 0.3423 | 0.0002 | 0.3714 | 0.0014 | 0.4536 | -1.0 | -1.0 | 0.0009 | 0.3914 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0014 | 0.3165 |
| 7.0734 | 18.0 | 20340 | 14.4955 | 0.0019 | 0.0036 | 0.0023 | 0.0 | 0.004 | 0.0021 | 0.1271 | 0.3806 | 0.4575 | 0.1 | 0.4243 | 0.4525 | -1.0 | -1.0 | 0.0043 | 0.641 | 0.001 | 0.6767 | 0.0 | 0.05 | 0.0014 | 0.5404 | 0.0002 | 0.3214 | 0.0076 | 0.4369 | -1.0 | -1.0 | 0.0017 | 0.6345 | 0.0004 | 0.5714 | -1.0 | -1.0 | 0.0007 | 0.2447 |
| 7.1618 | 19.0 | 21470 | 14.8730 | 0.0003 | 0.0008 | 0.0002 | 0.0 | 0.0013 | 0.0007 | 0.0366 | 0.1972 | 0.2822 | 0.0667 | 0.3443 | 0.2106 | -1.0 | -1.0 | 0.001 | 0.292 | 0.0003 | 0.407 | 0.0 | 0.125 | 0.0003 | 0.3327 | 0.0001 | 0.4357 | 0.0002 | 0.2024 | -1.0 | -1.0 | 0.0006 | 0.6172 | 0.0 | 0.0286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0001 | 0.0988 |
| 7.1004 | 20.0 | 22600 | 14.1832 | 0.002 | 0.0028 | 0.0023 | 0.0 | 0.0031 | 0.0035 | 0.0354 | 0.2085 | 0.2809 | 0.0833 | 0.2512 | 0.2418 | -1.0 | -1.0 | 0.0007 | 0.443 | 0.0003 | 0.4 | 0.0 | 0.0 | 0.0002 | 0.2558 | 0.0 | 0.15 | 0.0 | 0.0667 | -1.0 | -1.0 | 0.0005 | 0.5431 | 0.0002 | 0.4857 | -1.0 | -1.0 | 0.016 | 0.1835 |
| 7.2783 | 21.0 | 23730 | 13.9560 | 0.0001 | 0.0004 | 0.0001 | 0.0 | 0.0002 | 0.0001 | 0.0343 | 0.0843 | 0.1149 | 0.0 | 0.1094 | 0.1154 | -1.0 | -1.0 | 0.0003 | 0.192 | 0.0002 | 0.2465 | 0.0 | 0.0 | 0.0 | 0.075 | 0.0 | 0.1071 | 0.0 | 0.0095 | -1.0 | -1.0 | 0.0003 | 0.331 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0003 | 0.0729 |
| 7.0712 | 22.0 | 24860 | 13.7040 | 0.0002 | 0.0006 | 0.0001 | 0.0 | 0.0004 | 0.0003 | 0.0308 | 0.1331 | 0.1993 | 0.0 | 0.1819 | 0.2177 | -1.0 | -1.0 | 0.001 | 0.443 | 0.0001 | 0.2209 | 0.0 | 0.125 | 0.0001 | 0.1346 | 0.0001 | 0.2286 | 0.0 | 0.0238 | 0.0003 | 0.3155 | 0.0 | 0.2857 | -1.0 | -1.0 | 0.0 | 0.0165 |
| 6.9433 | 23.0 | 25990 | 13.4392 | 0.0001 | 0.0005 | 0.0 | 0.0 | 0.0003 | 0.0002 | 0.0467 | 0.1145 | 0.1248 | 0.0 | 0.1174 | 0.0822 | -1.0 | -1.0 | 0.0004 | 0.217 | 0.0 | 0.0884 | 0.0 | 0.0 | 0.0001 | 0.1115 | 0.0001 | 0.1357 | 0.0 | 0.0155 | -1.0 | -1.0 | 0.0005 | 0.2931 | -1.0 | -1.0 | 0.0 | 0.2 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0001 | 0.0624 |
| 6.8614 | 24.0 | 27120 | 13.6936 | 0.0002 | 0.0009 | 0.0001 | 0.0 | 0.0004 | 0.0006 | 0.0286 | 0.0936 | 0.1287 | 0.0 | 0.069 | 0.1425 | -1.0 | -1.0 | 0.0003 | 0.091 | 0.0002 | 0.1372 | 0.0 | 0.0 | 0.0 | 0.0346 | 0.0 | 0.0714 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0008 | 0.3034 | 0.0002 | 0.4714 | -1.0 | -1.0 | 0.0004 | 0.0494 |
| 6.9735 | 25.0 | 28250 | 13.4046 | 0.0044 | 0.0077 | 0.0042 | 0.0 | 0.0064 | 0.0015 | 0.0466 | 0.2859 | 0.3242 | 0.0125 | 0.3453 | 0.2269 | -1.0 | -1.0 | 0.0116 | 0.292 | 0.0003 | 0.2233 | 0.0 | 0.325 | 0.0 | 0.0692 | 0.0006 | 0.3714 | 0.0 | 0.025 | 0.0012 | 0.6466 | 0.0002 | 0.5286 | 0.0259 | 0.4365 |
| 6.8376 | 26.0 | 29380 | 14.1655 | 0.0004 | 0.001 | 0.0003 | 0.0 | 0.0008 | 0.0022 | 0.0569 | 0.2003 | 0.2438 | 0.0375 | 0.2226 | 0.1802 | -1.0 | -1.0 | 0.0001 | 0.05 | 0.0003 | 0.1535 | 0.0 | 0.325 | 0.0 | 0.0346 | 0.0001 | 0.2 | 0.0001 | 0.056 | 0.0023 | 0.5845 | 0.0004 | 0.6714 | -1.0 | -1.0 | 0.0004 | 0.1188 |
| 6.7287 | 27.0 | 30510 | 13.3048 | 0.0097 | 0.0157 | 0.0083 | 0.0 | 0.0158 | 0.0059 | 0.1031 | 0.3368 | 0.362 | 0.0 | 0.3505 | 0.3247 | -1.0 | -1.0 | 0.05 | 0.397 | 0.0006 | 0.2256 | 0.0 | 0.125 | 0.0 | 0.05 | 0.0006 | 0.4286 | 0.0007 | 0.1631 | 0.004 | 0.6776 | 0.0012 | 0.7143 | 0.0304 | 0.4765 |
| 6.7635 | 28.0 | 31640 | 13.4379 | 0.0003 | 0.0007 | 0.0001 | 0.0 | 0.0006 | 0.0008 | 0.0238 | 0.1812 | 0.2376 | 0.0 | 0.1684 | 0.2218 | -1.0 | -1.0 | 0.0003 | 0.161 | 0.0002 | 0.2279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.2071 | 0.0 | 0.0452 | 0.0008 | 0.3862 | 0.0003 | 0.7571 | 0.0006 | 0.3541 |
| 6.6492 | 29.0 | 32770 | 13.8339 | 0.0027 | 0.0036 | 0.0028 | 0.0 | 0.0037 | 0.0046 | 0.0536 | 0.3045 | 0.3476 | 0.0 | 0.3104 | 0.2655 | -1.0 | -1.0 | 0.0006 | 0.238 | 0.0025 | 0.3372 | 0.0 | 0.2 | 0.0001 | 0.1385 | 0.0002 | 0.4429 | 0.0 | 0.0274 | 0.0018 | 0.4914 | 0.0004 | 0.8429 | -1.0 | -1.0 | 0.0188 | 0.4106 |
| 6.6418 | 30.0 | 33900 | 13.8242 | 0.0025 | 0.0046 | 0.0034 | 0.0 | 0.004 | 0.0007 | 0.07 | 0.1488 | 0.1813 | 0.0792 | 0.1671 | 0.0963 | -1.0 | -1.0 | 0.0003 | 0.144 | 0.0002 | 0.1674 | 0.0 | 0.3 | 0.0 | 0.0096 | 0.0 | 0.0929 | 0.0001 | 0.0583 | 0.0001 | 0.1138 | 0.0003 | 0.5571 | -1.0 | -1.0 | 0.0216 | 0.1882 |
| 6.623 | 31.0 | 35030 | 14.0148 | 0.0004 | 0.0006 | 0.0005 | 0.0 | 0.0021 | 0.0002 | 0.0208 | 0.0819 | 0.1131 | 0.025 | 0.1036 | 0.074 | -1.0 | -1.0 | 0.0 | 0.005 | 0.0 | 0.0233 | 0.0 | 0.275 | 0.0 | 0.0192 | 0.0 | 0.0286 | 0.0 | 0.0381 | 0.0 | 0.0086 | 0.0003 | 0.4286 | 0.0033 | 0.1918 |
| 6.4636 | 32.0 | 36160 | 13.8160 | 0.0004 | 0.001 | 0.0004 | 0.0 | 0.002 | 0.0008 | 0.065 | 0.1639 | 0.1959 | 0.125 | 0.2454 | 0.0936 | -1.0 | -1.0 | 0.0001 | 0.075 | 0.0 | 0.0814 | 0.0 | 0.375 | 0.0 | 0.0385 | 0.0001 | 0.1786 | 0.0001 | 0.0476 | 0.0008 | 0.2776 | 0.0001 | 0.3 | -1.0 | -1.0 | 0.0027 | 0.3894 |
| 6.5183 | 33.0 | 37290 | 14.2449 | 0.0013 | 0.0027 | 0.0002 | 0.0 | 0.0028 | 0.004 | 0.026 | 0.1704 | 0.2005 | 0.0417 | 0.2052 | 0.1239 | -1.0 | -1.0 | 0.0004 | 0.165 | 0.0003 | 0.2512 | 0.0 | 0.2 | 0.0001 | 0.1019 | 0.01 | 0.2929 | 0.0001 | 0.0845 | 0.0 | 0.0379 | 0.0001 | 0.3143 | 0.001 | 0.3565 |
| 6.4632 | 34.0 | 38420 | 14.4384 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0004 | 0.0013 | 0.0107 | 0.1623 | 0.2245 | 0.0625 | 0.2376 | 0.1293 | -1.0 | -1.0 | 0.0 | 0.037 | 0.0005 | 0.3512 | 0.0 | 0.375 | 0.0 | 0.0212 | 0.0001 | 0.3143 | 0.0001 | 0.0667 | 0.0 | 0.0621 | 0.0001 | 0.5286 | 0.0006 | 0.2647 |
| 6.5022 | 35.0 | 39550 | 13.6652 | 0.0045 | 0.009 | 0.0 | 0.0 | 0.0057 | 0.0004 | 0.014 | 0.1185 | 0.1345 | 0.0375 | 0.1052 | 0.0733 | -1.0 | -1.0 | 0.0002 | 0.122 | 0.0 | 0.1093 | 0.0 | 0.175 | 0.0 | 0.0212 | 0.0396 | 0.1571 | 0.0001 | 0.1 | 0.0 | 0.0172 | 0.0002 | 0.4 | -1.0 | -1.0 | 0.0001 | 0.1082 |
| 6.3441 | 36.0 | 40680 | 14.4204 | 0.0003 | 0.0005 | 0.0002 | 0.0 | 0.0015 | 0.0002 | 0.0089 | 0.1323 | 0.1613 | 0.0125 | 0.1639 | 0.0737 | -1.0 | -1.0 | 0.0001 | 0.068 | 0.0 | 0.0 | 0.0 | 0.225 | 0.0 | 0.0038 | 0.0007 | 0.2286 | 0.0001 | 0.0857 | 0.0001 | 0.0966 | 0.0001 | 0.4286 | 0.0014 | 0.3153 |
| 6.3272 | 37.0 | 41810 | 14.0379 | 0.0082 | 0.0138 | 0.0038 | 0.0 | 0.0112 | 0.0007 | 0.0709 | 0.1923 | 0.2029 | 0.0 | 0.1685 | 0.1176 | -1.0 | -1.0 | 0.0181 | 0.109 | 0.0003 | 0.1535 | 0.0 | 0.075 | 0.0 | 0.0173 | 0.053 | 0.3 | 0.0005 | 0.1869 | 0.0 | 0.0397 | 0.0007 | 0.7 | -1.0 | -1.0 | 0.0011 | 0.2447 |
| 6.3569 | 38.0 | 42940 | 14.4949 | 0.0036 | 0.009 | 0.0001 | 0.0 | 0.005 | 0.0001 | 0.0272 | 0.0893 | 0.1154 | 0.0125 | 0.1332 | 0.0483 | -1.0 | -1.0 | 0.0003 | 0.025 | 0.0 | 0.0 | 0.0 | 0.25 | 0.0 | 0.0154 | 0.0318 | 0.2357 | 0.0001 | 0.1024 | 0.0 | 0.0138 | 0.0002 | 0.3571 | 0.0 | 0.0388 |
| 6.3652 | 39.0 | 44070 | 15.1024 | 0.0002 | 0.0007 | 0.0001 | 0.0 | 0.0005 | 0.0008 | 0.0395 | 0.1122 | 0.1345 | 0.025 | 0.1781 | 0.0637 | -1.0 | -1.0 | 0.0 | 0.009 | 0.0001 | 0.1093 | 0.0 | 0.325 | 0.0001 | 0.1038 | 0.0 | 0.0643 | 0.001 | 0.2583 | 0.0 | 0.0293 | 0.0001 | 0.1714 | 0.0003 | 0.14 |
| 6.2451 | 40.0 | 45200 | 14.4485 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0008 | 0.0003 | 0.0248 | 0.085 | 0.1097 | 0.0375 | 0.1507 | 0.0346 | -1.0 | -1.0 | 0.0001 | 0.072 | 0.0 | 0.0977 | 0.0 | 0.225 | 0.0 | 0.0635 | 0.0001 | 0.2214 | 0.0 | 0.05 | 0.0002 | 0.1224 | -1.0 | -1.0 | 0.0 | 0.0857 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0002 | 0.0494 |
| 6.219 | 41.0 | 46330 | 14.5542 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0002 | 0.0051 | 0.0593 | 0.0811 | 0.0375 | 0.102 | 0.0286 | -1.0 | -1.0 | 0.0 | 0.027 | 0.0 | 0.0116 | 0.0 | 0.275 | 0.0 | 0.0 | 0.0 | 0.15 | 0.0 | 0.0 | 0.0 | 0.0121 | -1.0 | -1.0 | 0.0001 | 0.2286 | -1.0 | -1.0 | 0.0 | 0.0259 |
| 6.2522 | 42.0 | 47460 | 14.5800 | 0.0026 | 0.0061 | 0.0009 | 0.0 | 0.0068 | 0.0007 | 0.0327 | 0.1678 | 0.2275 | 0.1125 | 0.259 | 0.0886 | -1.0 | -1.0 | 0.0001 | 0.071 | 0.0 | 0.0442 | 0.0 | 0.5 | 0.0 | 0.0173 | 0.016 | 0.3143 | 0.0002 | 0.1583 | -1.0 | -1.0 | 0.0008 | 0.3328 | -1.0 | -1.0 | 0.0001 | 0.3143 | 0.0065 | 0.2953 |
| 6.1892 | 43.0 | 48590 | 14.4996 | 0.0045 | 0.009 | 0.0 | 0.0 | 0.0059 | 0.0003 | 0.0245 | 0.1013 | 0.1311 | 0.025 | 0.1694 | 0.0376 | -1.0 | -1.0 | 0.0 | 0.04 | 0.0 | 0.0605 | 0.0 | 0.325 | 0.0 | 0.0 | 0.0398 | 0.3714 | 0.0001 | 0.0536 | 0.0003 | 0.1241 | -1.0 | -1.0 | 0.0001 | 0.1857 | 0.0 | 0.02 |
| 6.1124 | 44.0 | 49720 | 14.6023 | 0.0001 | 0.0002 | 0.0001 | 0.0 | 0.0003 | 0.0008 | 0.0623 | 0.1563 | 0.2015 | 0.1625 | 0.2211 | 0.1339 | -1.0 | -1.0 | 0.0 | 0.04 | 0.0003 | 0.3093 | 0.0001 | 0.45 | 0.0 | 0.0135 | 0.0002 | 0.4071 | 0.0001 | 0.0929 | 0.0001 | 0.0655 | -1.0 | -1.0 | 0.0001 | 0.4 | -1.0 | -1.0 | 0.0 | 0.0353 |
| 6.1198 | 45.0 | 50850 | 14.0431 | 0.0006 | 0.0011 | 0.001 | 0.0 | 0.0074 | 0.0 | 0.0213 | 0.0613 | 0.1115 | 0.0875 | 0.1668 | 0.0059 | -1.0 | -1.0 | 0.0 | 0.016 | 0.0 | 0.0488 | 0.0 | 0.475 | 0.0 | 0.0154 | 0.0054 | 0.2714 | 0.0001 | 0.0619 | 0.0 | 0.0724 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0001 | 0.0424 |
| 6.0799 | 46.0 | 51980 | 13.9452 | 0.0001 | 0.0002 | 0.0001 | 0.0 | 0.0004 | 0.0 | 0.0401 | 0.0694 | 0.0979 | 0.0625 | 0.1263 | 0.0135 | -1.0 | -1.0 | 0.0 | 0.022 | 0.0 | 0.0209 | 0.0 | 0.325 | 0.0 | 0.0 | 0.0001 | 0.1929 | 0.0001 | 0.0405 | -1.0 | -1.0 | 0.0004 | 0.1569 | -1.0 | -1.0 | 0.0 | 0.0857 | -1.0 | -1.0 | 0.0001 | 0.0376 |
| 6.0491 | 47.0 | 53110 | 13.8778 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0135 | 0.0405 | 0.0544 | 0.0375 | 0.0815 | 0.0 | -1.0 | -1.0 | 0.0 | 0.022 | 0.0 | 0.0 | 0.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.1143 | 0.0 | 0.0083 | 0.0001 | 0.0741 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0212 |
| 6.0493 | 48.0 | 54240 | 13.6438 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0003 | 0.0 | 0.0286 | 0.0698 | 0.1123 | 0.0875 | 0.1791 | 0.0025 | -1.0 | -1.0 | 0.0 | 0.04 | 0.0 | 0.0302 | 0.0 | 0.55 | 0.0 | 0.0 | 0.0002 | 0.3214 | 0.0 | 0.0083 | 0.0 | 0.031 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0294 |
| 5.9579 | 49.0 | 55370 | 14.1698 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0002 | 0.0005 | 0.0292 | 0.0802 | 0.1152 | 0.05 | 0.1265 | 0.0339 | -1.0 | -1.0 | 0.0 | 0.016 | 0.0 | 0.0 | 0.0 | 0.35 | 0.0 | 0.0 | 0.0001 | 0.2571 | 0.0 | 0.006 | 0.0001 | 0.1259 | -1.0 | -1.0 | 0.0001 | 0.2714 | -1.0 | -1.0 | 0.0 | 0.0106 |
| 5.9985 | 50.0 | 56500 | 13.6975 | 0.0001 | 0.0001 | 0.0 | 0.0 | 0.0003 | 0.0001 | 0.0316 | 0.0988 | 0.1157 | 0.0625 | 0.1548 | 0.0247 | -1.0 | -1.0 | 0.0 | 0.015 | 0.0 | 0.0 | 0.0001 | 0.425 | 0.0 | 0.0058 | 0.0001 | 0.2214 | 0.0 | 0.0226 | 0.0002 | 0.1828 | -1.0 | -1.0 | 0.0001 | 0.1143 | 0.0 | 0.0541 |
| 5.9686 | 51.0 | 57630 | 14.7675 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0025 | 0.0444 | 0.0663 | 0.075 | 0.0999 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.475 | 0.0 | 0.0 | 0.0 | 0.0714 | 0.0 | 0.0 | 0.0 | 0.05 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.984 | 52.0 | 58760 | 14.5306 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0003 | 0.0001 | 0.0165 | 0.0894 | 0.1001 | 0.075 | 0.1256 | 0.0143 | -1.0 | -1.0 | 0.0 | 0.007 | 0.0 | 0.0 | 0.0001 | 0.4 | 0.0 | 0.0 | 0.0003 | 0.2929 | 0.0 | 0.0083 | 0.0002 | 0.1069 | -1.0 | -1.0 | 0.0 | 0.0857 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.8086 | 53.0 | 59890 | 14.3081 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0001 | 0.0135 | 0.0582 | 0.0913 | 0.0875 | 0.1096 | 0.0143 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.475 | 0.0 | 0.0 | 0.0001 | 0.2 | 0.0 | 0.0 | 0.0 | 0.0328 | -1.0 | -1.0 | 0.0001 | 0.1143 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.8179 | 54.0 | 61020 | 14.4136 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0127 | 0.0466 | 0.0696 | 0.025 | 0.091 | 0.0143 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.275 | 0.0 | 0.0 | 0.0001 | 0.2286 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0086 | -1.0 | -1.0 | 0.0001 | 0.1143 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.807 | 55.0 | 62150 | 15.0746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0 | 0.0313 | 0.0313 | 0.0 | 0.0375 | 0.0143 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.0 | 0.0 | 0.0 | 0.0429 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.1143 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.8097 | 56.0 | 63280 | 14.8018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0001 | 0.001 | 0.0459 | 0.0515 | 0.0 | 0.0727 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.225 | 0.0 | 0.0 | 0.0 | 0.0857 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0241 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.7555 | 57.0 | 64410 | 14.9772 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0164 | 0.0275 | 0.0 | 0.0594 | 0.0 | -1.0 | -1.0 | 0.0 | 0.008 | 0.0 | 0.0 | 0.0 | 0.225 | 0.0 | 0.0 | 0.0 | 0.0143 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.7641 | 58.0 | 65540 | 14.5698 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0025 | 0.0287 | 0.037 | 0.0125 | 0.0656 | 0.0 | -1.0 | -1.0 | 0.0 | 0.007 | 0.0 | 0.0 | 0.0 | 0.225 | 0.0 | 0.0 | 0.0001 | 0.0857 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0002 | 0.0153 |
| 5.729 | 59.0 | 66670 | 14.3530 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0022 | 0.0086 | 0.028 | 0.0 | 0.0606 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.225 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0001 | 0.0224 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0047 |
| 5.706 | 60.0 | 67800 | 14.0699 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0227 | 0.0392 | 0.0503 | 0.0 | 0.0848 | 0.0 | -1.0 | -1.0 | 0.0 | 0.007 | 0.0 | 0.0 | 0.0 | 0.175 | 0.0 | 0.0 | 0.0002 | 0.2286 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0001 | 0.0362 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 5.6547 | 61.0 | 68930 | 14.6443 | 0.0001 | 0.0004 | 0.0 | 0.0 | 0.0004 | 0.0 | 0.0186 | 0.0566 | 0.061 | 0.0 | 0.1024 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2 | 0.0 | 0.0 | 0.0003 | 0.2714 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0007 | 0.0776 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.6021 | 62.0 | 70060 | 14.8467 | 0.0001 | 0.0001 | 0.0001 | 0.0 | 0.0002 | 0.0001 | 0.0224 | 0.0682 | 0.0745 | 0.0125 | 0.0924 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.175 | 0.0 | 0.0 | 0.0002 | 0.2929 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0002 | 0.0741 | -1.0 | -1.0 | 0.0002 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.6218 | 63.0 | 71190 | 14.4030 | 0.0002 | 0.0006 | 0.0001 | 0.0 | 0.0006 | 0.0008 | 0.04 | 0.0769 | 0.0896 | 0.0375 | 0.1194 | 0.017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.35 | 0.0 | 0.0 | 0.0001 | 0.1643 | 0.0 | 0.0 | 0.0013 | 0.1638 | -1.0 | -1.0 | 0.0004 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.5575 | 64.0 | 72320 | 14.3306 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0001 | 0.0002 | 0.0251 | 0.0501 | 0.0612 | 0.0125 | 0.0741 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.007 | 0.0 | 0.0 | 0.0 | 0.175 | 0.0 | 0.0 | 0.0002 | 0.2071 | 0.0 | 0.0 | 0.0001 | 0.0328 | -1.0 | -1.0 | 0.0003 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.5542 | 65.0 | 73450 | 14.1196 | 0.0001 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0002 | 0.0241 | 0.0595 | 0.0678 | 0.0375 | 0.0781 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.006 | 0.0 | 0.0 | 0.0 | 0.25 | 0.0 | 0.0 | 0.0001 | 0.1929 | 0.0 | 0.0 | 0.0001 | 0.0328 | -1.0 | -1.0 | 0.0002 | 0.1286 | 0.0 | 0.0 |
| 5.5203 | 66.0 | 74580 | 14.2382 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0001 | 0.0216 | 0.0667 | 0.0725 | 0.0125 | 0.097 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.25 | 0.0 | 0.0019 | 0.0001 | 0.2286 | 0.0 | 0.0 | 0.0 | 0.0431 | -1.0 | -1.0 | 0.0002 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.531 | 67.0 | 75710 | 14.1907 | 0.0001 | 0.0002 | 0.0001 | 0.0 | 0.0004 | 0.0004 | 0.0102 | 0.0612 | 0.0719 | 0.025 | 0.085 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.008 | 0.0 | 0.0 | 0.0 | 0.2 | 0.0 | 0.0 | 0.0003 | 0.25 | 0.0 | 0.0 | 0.0001 | 0.0603 | -1.0 | -1.0 | 0.0001 | 0.1286 | 0.0 | 0.0 |
| 5.5003 | 68.0 | 76840 | 14.2906 | 0.0004 | 0.0006 | 0.0004 | 0.0 | 0.0007 | 0.0028 | 0.0772 | 0.1291 | 0.1518 | 0.025 | 0.142 | 0.0812 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.275 | 0.0 | 0.0 | 0.0004 | 0.4571 | 0.0 | 0.0 | 0.0004 | 0.1345 | -1.0 | -1.0 | 0.0025 | 0.5 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.4263 | 69.0 | 77970 | 14.6374 | 0.0001 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0002 | 0.0357 | 0.0768 | 0.08 | 0.0 | 0.095 | 0.0473 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.0 | 0.0 | 0.0003 | 0.4143 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0001 | 0.0517 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.4144 | 70.0 | 79100 | 14.7071 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.001 | 0.0472 | 0.0662 | 0.0 | 0.108 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2 | 0.0 | 0.0 | 0.0001 | 0.3786 | 0.0 | 0.0 | 0.0 | 0.0172 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.4466 | 71.0 | 80230 | 14.5211 | 0.0001 | 0.0002 | 0.0001 | 0.0 | 0.0002 | 0.0174 | 0.0178 | 0.0982 | 0.1173 | 0.0 | 0.135 | 0.0723 | -1.0 | -1.0 | 0.0 | 0.009 | 0.0 | 0.0 | 0.0 | 0.2 | 0.0 | 0.0 | 0.0004 | 0.5714 | 0.0 | 0.0 | 0.0 | 0.0466 | -1.0 | -1.0 | 0.0007 | 0.2286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.3988 | 72.0 | 81360 | 14.5190 | 0.0002 | 0.0003 | 0.0001 | 0.0 | 0.0003 | 0.0027 | 0.0472 | 0.0919 | 0.1113 | 0.0 | 0.1301 | 0.0321 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.175 | 0.0 | 0.0 | 0.0003 | 0.4571 | 0.0 | 0.006 | 0.0003 | 0.1069 | -1.0 | -1.0 | 0.0008 | 0.2571 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.3631 | 73.0 | 82490 | 14.8369 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0001 | 0.0003 | 0.0271 | 0.0668 | 0.0735 | 0.0125 | 0.0954 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.225 | 0.0 | 0.0 | 0.0003 | 0.2857 | 0.0 | 0.0 | 0.0 | 0.0224 | -1.0 | -1.0 | 0.0002 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.3915 | 74.0 | 83620 | 15.2755 | 0.0001 | 0.0002 | 0.0001 | 0.0 | 0.0002 | 0.0012 | 0.0351 | 0.0922 | 0.1009 | 0.0 | 0.138 | 0.0179 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.25 | 0.0 | 0.0 | 0.0006 | 0.5 | 0.0 | 0.0 | 0.0 | 0.0155 | -1.0 | -1.0 | 0.0003 | 0.1429 | 0.0 | 0.0 |
| 5.399 | 75.0 | 84750 | 14.6546 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0094 | 0.0526 | 0.0649 | 0.0 | 0.1073 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.2 | 0.0 | 0.0 | 0.0003 | 0.3786 | 0.0 | 0.006 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.3341 | 76.0 | 85880 | 14.6457 | 0.0001 | 0.0003 | 0.0001 | 0.0 | 0.0001 | 0.0014 | 0.0305 | 0.0631 | 0.0714 | 0.0125 | 0.0923 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.225 | 0.0 | 0.0 | 0.0003 | 0.2786 | 0.0 | 0.0 | 0.0 | 0.0103 | -1.0 | -1.0 | 0.0009 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.2991 | 77.0 | 87010 | 14.7693 | 0.0001 | 0.0001 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.016 | 0.0557 | 0.0668 | 0.0125 | 0.1026 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.2 | 0.0 | 0.0 | 0.0005 | 0.3857 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0155 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.2078 | 78.0 | 88140 | 14.1360 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0174 | 0.0494 | 0.0593 | 0.0 | 0.0785 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.008 | 0.0 | 0.0 | 0.0 | 0.175 | 0.0 | 0.0 | 0.0001 | 0.2143 | 0.0 | 0.006 | 0.0 | 0.0017 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.2135 | 79.0 | 89270 | 14.2477 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0001 | 0.0005 | 0.0279 | 0.063 | 0.0729 | 0.0 | 0.091 | 0.0223 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.0 | 0.0 | 0.0003 | 0.3429 | 0.0 | 0.006 | 0.0001 | 0.0483 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 5.1707 | 80.0 | 90400 | 14.7664 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0107 | 0.025 | 0.0417 | 0.0125 | 0.0719 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.225 | 0.0 | 0.0 | 0.0001 | 0.15 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.1876 | 81.0 | 91530 | 14.5303 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0001 | 0.0002 | 0.0266 | 0.0472 | 0.0536 | 0.0 | 0.0698 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.175 | 0.0 | 0.0 | 0.0002 | 0.1786 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.1259 | 82.0 | 92660 | 14.9224 | 0.0001 | 0.0001 | 0.0 | 0.0 | 0.0002 | 0.0004 | 0.0327 | 0.075 | 0.0837 | 0.0 | 0.1176 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.25 | 0.0 | 0.0 | 0.0003 | 0.3429 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0259 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 5.1194 | 83.0 | 93790 | 14.6970 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0139 | 0.0503 | 0.0615 | 0.0 | 0.1053 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.225 | 0.0 | 0.0 | 0.0003 | 0.2643 | 0.0 | 0.0 | 0.0002 | 0.0638 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.0911 | 84.0 | 94920 | 14.6329 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0118 | 0.0515 | 0.0706 | 0.0125 | 0.1128 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.25 | 0.0 | 0.0 | 0.0002 | 0.3643 | 0.0 | 0.0 | 0.0 | 0.0207 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.2024 | 85.0 | 96050 | 14.6733 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0008 | 0.0325 | 0.0464 | 0.025 | 0.0719 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.225 | 0.0 | 0.0 | 0.0002 | 0.1929 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.073 | 86.0 | 97180 | 14.5716 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0001 | 0.0002 | 0.0307 | 0.066 | 0.0835 | 0.0125 | 0.1063 | 0.0196 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2 | 0.0 | 0.0 | 0.0002 | 0.3143 | 0.0 | 0.0 | 0.0002 | 0.1086 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 5.0546 | 87.0 | 98310 | 14.2911 | 0.0001 | 0.0003 | 0.0001 | 0.0 | 0.0003 | 0.0001 | 0.0532 | 0.1038 | 0.1127 | 0.0 | 0.1601 | 0.0161 | -1.0 | -1.0 | 0.0001 | 0.049 | 0.0 | 0.0 | 0.0001 | 0.275 | 0.0 | 0.0 | 0.0004 | 0.4143 | 0.0 | 0.0 | 0.0005 | 0.1414 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 5.0524 | 88.0 | 99440 | 14.7944 | 0.0001 | 0.0002 | 0.0001 | 0.0 | 0.0002 | 0.0021 | 0.0264 | 0.0778 | 0.0905 | 0.0 | 0.1312 | 0.0179 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.3 | 0.0 | 0.0 | 0.0002 | 0.2571 | 0.0 | 0.0 | 0.0003 | 0.1086 | -1.0 | -1.0 | 0.0001 | 0.1429 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 5.0537 | 89.0 | 100570 | 14.5410 | 0.0012 | 0.0025 | 0.0001 | 0.0 | 0.0016 | 0.0008 | 0.0295 | 0.0853 | 0.1064 | 0.0 | 0.1494 | 0.0161 | -1.0 | -1.0 | 0.0 | 0.008 | 0.0 | 0.0 | 0.0001 | 0.25 | 0.0 | 0.0 | 0.0004 | 0.4286 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0006 | 0.1362 | -1.0 | -1.0 | 0.0002 | 0.1286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0099 | 0.0059 |
| 4.984 | 90.0 | 101700 | 14.8847 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0004 | 0.0006 | 0.0463 | 0.0789 | 0.1035 | 0.0625 | 0.1207 | 0.0143 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.275 | 0.0 | 0.0 | 0.0003 | 0.3714 | 0.0 | 0.006 | -1.0 | -1.0 | 0.0008 | 0.1586 | -1.0 | -1.0 | 0.0001 | 0.1143 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0003 | 0.0059 |
| 4.8968 | 91.0 | 102830 | 14.9682 | 0.0004 | 0.001 | 0.0002 | 0.0 | 0.0028 | 0.0003 | 0.0544 | 0.0891 | 0.1066 | 0.0125 | 0.1409 | 0.0223 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 | 0.225 | 0.0 | 0.0 | 0.0005 | 0.4143 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0014 | 0.1914 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.8949 | 92.0 | 103960 | 14.5189 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0003 | 0.0004 | 0.0536 | 0.0997 | 0.1168 | 0.0 | 0.1325 | 0.0393 | -1.0 | -1.0 | 0.0001 | 0.034 | 0.0 | 0.0 | 0.0 | 0.15 | 0.0 | 0.0 | 0.0002 | 0.3714 | 0.0 | 0.006 | -1.0 | -1.0 | 0.0009 | 0.2121 | -1.0 | -1.0 | 0.0003 | 0.2714 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 4.896 | 93.0 | 105090 | 15.0880 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0004 | 0.0001 | 0.0248 | 0.0827 | 0.1018 | 0.0125 | 0.1576 | 0.0214 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.3 | 0.0 | 0.0 | 0.0003 | 0.4571 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0005 | 0.1586 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.8753 | 94.0 | 106220 | 14.8092 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0004 | 0.0002 | 0.034 | 0.0731 | 0.0888 | 0.0 | 0.1382 | 0.0071 | -1.0 | -1.0 | 0.0 | 0.008 | 0.0 | 0.0 | 0.0001 | 0.2 | 0.0 | 0.0 | 0.0003 | 0.4143 | 0.0 | 0.0 | -1.0 | -1.0 | 0.001 | 0.1707 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 4.8306 | 95.0 | 107350 | 15.0052 | 0.0003 | 0.0006 | 0.0001 | 0.0 | 0.0008 | 0.0 | 0.0474 | 0.0699 | 0.0937 | 0.0 | 0.1442 | 0.0125 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.175 | 0.0 | 0.0 | 0.0005 | 0.4714 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0011 | 0.1897 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.8491 | 96.0 | 108480 | 15.0691 | 0.0003 | 0.0004 | 0.0003 | 0.0 | 0.0008 | 0.0 | 0.0498 | 0.0695 | 0.0927 | 0.0 | 0.1456 | 0.0125 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.225 | 0.0 | 0.0 | 0.0005 | 0.4714 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0005 | 0.1379 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.8434 | 97.0 | 109610 | 15.1213 | 0.0005 | 0.001 | 0.0006 | 0.0 | 0.0013 | 0.0004 | 0.0764 | 0.1115 | 0.1283 | 0.0 | 0.1699 | 0.0446 | -1.0 | -1.0 | 0.0 | 0.003 | 0.0 | 0.0 | 0.0016 | 0.175 | 0.0 | 0.0 | 0.0005 | 0.5214 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0025 | 0.3138 | -1.0 | -1.0 | 0.0001 | 0.1286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 4.8544 | 98.0 | 110740 | 15.1997 | 0.0003 | 0.0006 | 0.0002 | 0.0 | 0.0008 | 0.0005 | 0.0479 | 0.0924 | 0.1217 | 0.0 | 0.1766 | 0.0518 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.2 | 0.0 | 0.0 | 0.0004 | 0.5571 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0016 | 0.331 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.768 | 99.0 | 111870 | 14.8149 | 0.0004 | 0.0009 | 0.0002 | 0.0 | 0.0009 | 0.0006 | 0.0522 | 0.0907 | 0.1108 | 0.0 | 0.1596 | 0.05 | -1.0 | -1.0 | 0.0 | 0.009 | 0.0 | 0.0 | 0.0004 | 0.175 | 0.0 | 0.0 | 0.0005 | 0.4929 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0024 | 0.3069 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0003 | 0.0059 |
| 4.7151 | 100.0 | 113000 | 14.6654 | 0.0002 | 0.0006 | 0.0001 | 0.0 | 0.0006 | 0.0004 | 0.0414 | 0.0713 | 0.1024 | 0.0 | 0.1442 | 0.0446 | -1.0 | -1.0 | 0.0 | 0.009 | 0.0 | 0.0 | 0.0002 | 0.125 | 0.0 | 0.0 | 0.0004 | 0.5143 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0016 | 0.2603 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 4.7191 | 101.0 | 114130 | 14.9368 | 0.0004 | 0.0008 | 0.0002 | 0.0 | 0.0009 | 0.0003 | 0.0429 | 0.0804 | 0.1016 | 0.0 | 0.1462 | 0.0321 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.125 | 0.0 | 0.0 | 0.0004 | 0.5214 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0018 | 0.2552 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 4.7791 | 102.0 | 115260 | 14.7370 | 0.0002 | 0.0005 | 0.0002 | 0.0 | 0.0005 | 0.0003 | 0.0374 | 0.0924 | 0.1128 | 0.0 | 0.1652 | 0.0482 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2 | 0.0 | 0.0 | 0.0005 | 0.55 | 0.0 | 0.006 | -1.0 | -1.0 | 0.0015 | 0.2534 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 4.7545 | 103.0 | 116390 | 14.8525 | 0.0003 | 0.0007 | 0.0002 | 0.0 | 0.0008 | 0.0003 | 0.0341 | 0.0746 | 0.1032 | 0.0 | 0.1501 | 0.0348 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.15 | 0.0 | 0.0 | 0.0004 | 0.55 | 0.0 | 0.006 | -1.0 | -1.0 | 0.0017 | 0.2224 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.7168 | 104.0 | 117520 | 14.8952 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0005 | 0.0001 | 0.0396 | 0.0815 | 0.0983 | 0.0 | 0.145 | 0.0375 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.175 | 0.0 | 0.0 | 0.0006 | 0.55 | 0.0 | 0.006 | -1.0 | -1.0 | 0.0007 | 0.1483 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 4.6545 | 105.0 | 118650 | 14.8651 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0002 | 0.0001 | 0.0185 | 0.0537 | 0.0914 | 0.0 | 0.1322 | 0.0375 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.15 | 0.0 | 0.0 | 0.0003 | 0.5786 | 0.0 | 0.006 | -1.0 | -1.0 | 0.0002 | 0.0879 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.6944 | 106.0 | 119780 | 14.9816 | 0.0003 | 0.0009 | 0.0002 | 0.0 | 0.0007 | 0.0075 | 0.0523 | 0.0965 | 0.1157 | 0.0 | 0.1454 | 0.067 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.175 | 0.0 | 0.0 | 0.0006 | 0.55 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0014 | 0.1879 | -1.0 | -1.0 | 0.0002 | 0.1286 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.6429 | 107.0 | 120910 | 14.7832 | 0.0003 | 0.0009 | 0.0001 | 0.0 | 0.0006 | 0.002 | 0.0355 | 0.0779 | 0.1001 | 0.0 | 0.1438 | 0.0491 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.175 | 0.0 | 0.0 | 0.0004 | 0.5071 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0016 | 0.219 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.6496 | 108.0 | 122040 | 14.9303 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0005 | 0.0007 | 0.0334 | 0.0705 | 0.0977 | 0.0 | 0.1408 | 0.0464 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.175 | 0.0 | 0.0 | 0.0004 | 0.5214 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0009 | 0.1828 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.5957 | 109.0 | 123170 | 14.7485 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0004 | 0.0009 | 0.033 | 0.0756 | 0.1012 | 0.0 | 0.145 | 0.0571 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.175 | 0.0 | 0.0 | 0.0004 | 0.5429 | 0.0 | 0.006 | -1.0 | -1.0 | 0.0011 | 0.181 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 4.5829 | 110.0 | 124300 | 14.8069 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0004 | 0.0005 | 0.0293 | 0.0722 | 0.0962 | 0.0 | 0.1342 | 0.0607 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.15 | 0.0 | 0.0 | 0.0004 | 0.55 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0008 | 0.1655 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.6242 | 111.0 | 125430 | 14.7643 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0004 | 0.0006 | 0.0308 | 0.0706 | 0.1063 | 0.0 | 0.151 | 0.0509 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.15 | 0.0 | 0.0 | 0.0003 | 0.5429 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0012 | 0.2569 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.5899 | 112.0 | 126560 | 14.8818 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0003 | 0.0004 | 0.0238 | 0.0686 | 0.1039 | 0.0 | 0.1489 | 0.0571 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.175 | 0.0 | 0.0 | 0.0004 | 0.5643 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0008 | 0.1828 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0059 |
| 4.5274 | 113.0 | 127690 | 14.8934 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0004 | 0.0003 | 0.0298 | 0.0706 | 0.1085 | 0.0 | 0.1599 | 0.0455 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.2 | 0.0 | 0.0 | 0.0003 | 0.5571 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0009 | 0.2121 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.4824 | 114.0 | 128820 | 14.8513 | 0.0003 | 0.0007 | 0.0002 | 0.0 | 0.0007 | 0.0012 | 0.0333 | 0.0736 | 0.1079 | 0.0 | 0.1594 | 0.0446 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.2 | 0.0 | 0.0 | 0.0004 | 0.55 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0016 | 0.2138 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.4949 | 115.0 | 129950 | 14.8759 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0004 | 0.0003 | 0.033 | 0.0676 | 0.1089 | 0.0 | 0.1618 | 0.0384 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.2 | 0.0 | 0.0 | 0.0004 | 0.5571 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0011 | 0.2155 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.5534 | 116.0 | 131080 | 14.8811 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0004 | 0.0005 | 0.0303 | 0.0783 | 0.1041 | 0.0 | 0.1526 | 0.0509 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.2 | 0.0 | 0.0 | 0.0004 | 0.5571 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0009 | 0.1724 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.4644 | 117.0 | 132210 | 14.8950 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0004 | 0.0002 | 0.0312 | 0.0761 | 0.1062 | 0.0 | 0.1556 | 0.0509 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.2 | 0.0 | 0.0 | 0.0004 | 0.5643 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0009 | 0.1845 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.515 | 118.0 | 133340 | 14.8908 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0004 | 0.0004 | 0.031 | 0.0666 | 0.1039 | 0.0 | 0.1545 | 0.0384 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.2 | 0.0 | 0.0 | 0.0004 | 0.55 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0009 | 0.1776 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.4987 | 119.0 | 134470 | 14.8879 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0004 | 0.0005 | 0.0303 | 0.0722 | 0.1015 | 0.0 | 0.1452 | 0.058 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.175 | 0.0 | 0.0 | 0.0004 | 0.5643 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.001 | 0.1672 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 4.4693 | 120.0 | 135600 | 14.8933 | 0.0002 | 0.0005 | 0.0001 | 0.0 | 0.0004 | 0.0006 | 0.0299 | 0.0624 | 0.1045 | 0.0 | 0.1529 | 0.0518 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.2 | 0.0 | 0.0 | 0.0004 | 0.5571 | 0.0 | 0.0071 | -1.0 | -1.0 | 0.0011 | 0.1759 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.53.0.dev0
- Pytorch 2.5.1
- Datasets 3.2.0
- Tokenizers 0.21.1
| [
"ampulla of vater",
"angiectasia",
"blood - fresh",
"blood - hematin",
"erosion",
"erythema",
"foreign body",
"ileocecal valve",
"lymphangiectasia",
"normal clean mucosa",
"polyp",
"pylorus",
"reduced mucosal view",
"ulcer"
] |
zoros-ai/deta-swin-large-all-objects-epoch-0 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
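Since no official snippet is provided yet, the sketch below shows one plausible way to wire up the five labels listed on this card (`n/a`, `internal room`, `window`, `door`, `joinery`) as the `id2label`/`label2id` mappings that 🤗 Transformers object-detection checkpoints expect. The repository id and the use of `AutoModelForObjectDetection` are assumptions for illustration, not confirmed by this card.

```python
# Build the label mappings used by Transformers detection configs.
labels = ["n/a", "internal room", "window", "door", "joinery"]
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in id2label.items()}

# Hypothetical loading step -- repo id and auto-class are assumptions:
# from transformers import AutoImageProcessor, AutoModelForObjectDetection
# processor = AutoImageProcessor.from_pretrained(
#     "zoros-ai/deta-swin-large-all-objects-epoch-0")
# model = AutoModelForObjectDetection.from_pretrained(
#     "zoros-ai/deta-swin-large-all-objects-epoch-0",
#     id2label=id2label,
#     label2id=label2id,
# )

print(id2label[2])  # -> window
```

The commented-out lines would download the checkpoint; the mapping construction itself is independent of the model weights.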
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-1 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-2 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-3 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-4 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-5 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
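No official snippet has been published for this checkpoint yet. Until the authors fill this in, here is a minimal hedged sketch, assuming the repository follows the standard 🤗 Transformers object-detection API (`AutoImageProcessor` / `AutoModelForObjectDetection`); the input image path is hypothetical:

```python
# Hypothetical usage sketch -- assumes this checkpoint exposes the standard
# Transformers object-detection interface; not an official example.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

model_id = "zoros-ai/deta-swin-large-all-objects-epoch-5"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForObjectDetection.from_pretrained(model_id)

# "floor_plan.png" is a placeholder for your own input image.
image = Image.open("floor_plan.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Keep detections above a confidence threshold and map label ids to names
# (the label set for this repo is: n/a, internal room, window, door, joinery).
target_sizes = torch.tensor([image.size[::-1]])
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```

Each printed line gives a predicted class name, its confidence score, and an `[x_min, y_min, x_max, y_max]` bounding box in pixel coordinates of the original image.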
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-6 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-7 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-8 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-9 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-10 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
zoros-ai/deta-swin-large-all-objects-epoch-11 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal room",
"window",
"door",
"joinery"
] |
goodcasper/rtdetr_r50vd_coco_o365 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr_r50vd_coco_o365
This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 27.8159
- Map: 0.016
- Map 50: 0.0242
- Map 75: 0.0174
- Map Small: -1.0
- Map Medium: 0.0
- Map Large: 0.018
- Mar 1: 0.0162
- Mar 10: 0.0162
- Mar 100: 0.0162
- Mar Small: -1.0
- Mar Medium: 0.0
- Mar Large: 0.0182
- Map Ampulla of vater: -1.0
- Mar 100 Ampulla of vater: -1.0
- Map Angiectasia: 0.0
- Mar 100 Angiectasia: 0.0
- Map Blood - fresh: 0.1443
- Mar 100 Blood - fresh: 0.1455
- Map Blood - hematin: 0.0
- Mar 100 Blood - hematin: 0.0
- Map Erosion: 0.0
- Mar 100 Erosion: 0.0
- Map Erythema: 0.0
- Mar 100 Erythema: 0.0
- Map Foreign body: 0.0
- Mar 100 Foreign body: 0.0
- Map Lymphangiectasia: 0.0
- Mar 100 Lymphangiectasia: 0.0
- Map Polyp: 0.0
- Mar 100 Polyp: 0.0
- Map Ulcer: 0.0
- Mar 100 Ulcer: 0.0
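In COCO-style evaluation, a value of -1.0 marks a category with no ground-truth instances in the split (here, "Ampulla of vater"); such entries are excluded when averaging. A minimal sketch of that convention, using a hypothetical helper and the per-class AP values reported above:

```python
def mean_ap(per_class_ap):
    """Average per-class AP values, skipping COCO's -1.0 sentinel
    (category absent from the evaluation split)."""
    valid = [v for v in per_class_ap.values() if v != -1.0]
    return sum(valid) / len(valid) if valid else 0.0

# Per-class AP from the evaluation set above (sentinel kept for illustration)
ap = {
    "ampulla_of_vater": -1.0,  # no instances in the eval split
    "angiectasia": 0.0,
    "blood_fresh": 0.1443,
    "blood_hematin": 0.0,
    "erosion": 0.0,
    "erythema": 0.0,
    "foreign_body": 0.0,
    "lymphangiectasia": 0.0,
    "polyp": 0.0,
    "ulcer": 0.0,
}
print(round(mean_ap(ap), 3))  # 0.1443 / 9 classes, matching the reported Map of 0.016
```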
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 120
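With a linear schedule and 300 warmup steps, the learning rate ramps from 0 up to 1e-4 over the first 300 steps, then decays linearly back to 0 by the final step (216,000, per the table below: 120 epochs × 1,800 steps). A plain-Python sketch of that schedule, mirroring the behaviour of a "linear" scheduler with warmup (illustrative, not the Trainer's actual implementation):

```python
def linear_lr(step, base_lr=1e-4, warmup_steps=300, total_steps=216_000):
    """Learning rate for linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        # Warmup: ramp linearly from 0 to base_lr
        return base_lr * step / warmup_steps
    # Decay: fall linearly from base_lr down to 0 at total_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / (total_steps - warmup_steps)

print(linear_lr(150))      # halfway through warmup: 5e-05
print(linear_lr(300))      # warmup complete: 1e-04 (the configured peak)
print(linear_lr(216_000))  # end of training: 0.0
```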
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Ampulla of vater | Mar 100 Ampulla of vater | Map Angiectasia | Mar 100 Angiectasia | Map Blood - fresh | Mar 100 Blood - fresh | Map Blood - hematin | Mar 100 Blood - hematin | Map Erosion | Mar 100 Erosion | Map Erythema | Mar 100 Erythema | Map Foreign body | Mar 100 Foreign body | Map Ileocecal valve | Mar 100 Ileocecal valve | Map Lymphangiectasia | Mar 100 Lymphangiectasia | Map Polyp | Mar 100 Polyp | Map Pylorus | Mar 100 Pylorus | Map Ulcer | Mar 100 Ulcer |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------:|:------------------------:|:---------------:|:-------------------:|:-----------------:|:---------------------:|:-------------------:|:-----------------------:|:-----------:|:---------------:|:------------:|:----------------:|:----------------:|:--------------------:|:-------------------:|:-----------------------:|:--------------------:|:------------------------:|:---------:|:-------------:|:-----------:|:---------------:|:---------:|:-------------:|
| 21.0002 | 1.0 | 1800 | 26.0375 | 0.0009 | 0.0015 | 0.0007 | -1.0 | 0.0017 | 0.0011 | 0.0057 | 0.0057 | 0.0057 | -1.0 | 0.0022 | 0.008 | -1.0 | -1.0 | 0.0031 | 0.0154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0027 | 0.0147 | -1.0 | -1.0 | 0.0012 | 0.0143 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0009 | 0.0066 |
| 15.5547 | 2.0 | 3600 | 24.4976 | 0.0135 | 0.0218 | 0.0144 | -1.0 | 0.0119 | 0.0235 | 0.0303 | 0.0311 | 0.0311 | -1.0 | 0.0113 | 0.0583 | -1.0 | -1.0 | 0.0622 | 0.139 | 0.0004 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0437 | 0.0729 | -1.0 | -1.0 | 0.0034 | 0.0214 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0117 | 0.0397 |
| 13.8787 | 3.0 | 5400 | 27.1006 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 13.4717 | 4.0 | 7200 | 26.6639 | 0.0132 | 0.021 | 0.0152 | -1.0 | 0.0038 | 0.0147 | 0.0151 | 0.0151 | 0.0151 | -1.0 | 0.004 | 0.0171 | 0.0165 | 0.0285 | 0.1021 | 0.1073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 12.7677 | 5.0 | 9000 | 24.2837 | 0.0033 | 0.0063 | 0.0036 | -1.0 | 0.0025 | 0.0036 | 0.0036 | 0.0036 | 0.0036 | -1.0 | 0.0022 | 0.0043 | -1.0 | -1.0 | 0.0143 | 0.0163 | 0.0119 | 0.0109 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 | 0.0054 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 12.2905 | 6.0 | 10800 | 26.7668 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 11.8799 | 7.0 | 12600 | 28.5057 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 11.5336 | 8.0 | 14400 | 27.8159 | 0.016 | 0.0242 | 0.0174 | -1.0 | 0.0 | 0.018 | 0.0162 | 0.0162 | 0.0162 | -1.0 | 0.0 | 0.0182 | -1.0 | -1.0 | 0.0 | 0.0 | 0.1443 | 0.1455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 11.4347 | 9.0 | 16200 | 26.5658 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 11.399 | 10.0 | 18000 | 27.6005 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 11.2479 | 11.0 | 19800 | 26.9804 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 10.671 | 12.0 | 21600 | 29.2107 | 0.0035 | 0.0089 | 0.0021 | -1.0 | 0.0 | 0.0039 | 0.0075 | 0.0077 | 0.0077 | -1.0 | 0.0 | 0.0086 | 0.0 | 0.0 | 0.0313 | 0.0691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 10.6281 | 13.0 | 23400 | 29.4607 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 10.4952 | 14.0 | 25200 | 27.4934 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 10.0251 | 15.0 | 27000 | 28.3769 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 9.499 | 16.0 | 28800 | 27.2366 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 10.1165 | 17.0 | 30600 | 27.8705 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 9.8297 | 18.0 | 32400 | 26.7598 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 9.8481 | 19.0 | 34200 | 28.3368 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 9.4386 | 20.0 | 36000 | 29.8882 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 9.2099 | 21.0 | 37800 | 29.6816 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 9.5721 | 22.0 | 39600 | 26.7578 | 0.0025 | 0.0033 | 0.0022 | -1.0 | 0.0 | 0.0041 | 0.0023 | 0.0023 | 0.0023 | -1.0 | 0.0 | 0.0035 | -1.0 | -1.0 | 0.005 | 0.0041 | 0.0178 | 0.0164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.8532 | 23.0 | 41400 | 26.7121 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 9.125 | 24.0 | 43200 | 29.2926 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 9.1671 | 25.0 | 45000 | 30.2412 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.6856 | 26.0 | 46800 | 29.1818 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.6695 | 27.0 | 48600 | 30.0137 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.6153 | 28.0 | 50400 | 30.8425 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.3526 | 29.0 | 52200 | 29.4365 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.613 | 30.0 | 54000 | 30.9884 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.09 | 31.0 | 55800 | 33.0556 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.3259 | 32.0 | 57600 | 30.0973 | 0.0015 | 0.0022 | 0.0022 | -1.0 | 0.0 | 0.0017 | 0.0014 | 0.0014 | 0.0014 | -1.0 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.0139 | 0.0127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.9391 | 33.0 | 59400 | 29.9558 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.0617 | 34.0 | 61200 | 27.0266 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.4808 | 35.0 | 63000 | 26.7873 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.9955 | 36.0 | 64800 | 26.9714 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.2621 | 37.0 | 66600 | 28.3118 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.8695 | 38.0 | 68400 | 26.1976 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 8.0564 | 39.0 | 70200 | 28.0545 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.8677 | 40.0 | 72000 | 27.3127 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.9055 | 41.0 | 73800 | 29.2290 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.6467 | 42.0 | 75600 | 27.8649 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.5781 | 43.0 | 77400 | 28.6441 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.3713 | 44.0 | 79200 | 30.2219 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.6103 | 45.0 | 81000 | 31.2263 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.6402 | 46.0 | 82800 | 30.6451 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.3907 | 47.0 | 84600 | 32.2594 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.271 | 48.0 | 86400 | 29.9771 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.2283 | 49.0 | 88200 | 31.1746 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.2988 | 50.0 | 90000 | 31.8863 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.2238 | 51.0 | 91800 | 29.2843 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.8274 | 52.0 | 93600 | 28.4396 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.8842 | 53.0 | 95400 | 31.0102 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 7.0953 | 54.0 | 97200 | 29.2886 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.954 | 55.0 | 99000 | 30.2360 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.8262 | 56.0 | 100800 | 29.6451 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.5058 | 57.0 | 102600 | 28.0813 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.4493 | 58.0 | 104400 | 27.2198 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.5762 | 59.0 | 106200 | 26.9053 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.4407 | 60.0 | 108000 | 28.5780 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.3071 | 61.0 | 109800 | 26.6372 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.5321 | 62.0 | 111600 | 26.7619 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.2312 | 63.0 | 113400 | 28.2786 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.2913 | 64.0 | 115200 | 28.2667 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.4635 | 65.0 | 117000 | 28.1047 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.1449 | 66.0 | 118800 | 28.4823 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.0342 | 67.0 | 120600 | 29.9479 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.8726 | 68.0 | 122400 | 27.3161 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.7609 | 69.0 | 124200 | 26.3775 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.9319 | 70.0 | 126000 | 27.2571 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.7682 | 71.0 | 127800 | 26.6236 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.9883 | 72.0 | 129600 | 28.0503 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.7841 | 73.0 | 131400 | 27.5336 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.7405 | 74.0 | 133200 | 28.3340 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.8629 | 75.0 | 135000 | 27.0366 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.8325 | 76.0 | 136800 | 27.5247 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.7049 | 77.0 | 138600 | 27.6555 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.6417 | 78.0 | 140400 | 26.9032 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.635 | 79.0 | 142200 | 27.5711 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.7888 | 80.0 | 144000 | 26.4394 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.6334 | 81.0 | 145800 | 26.8811 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.6256 | 82.0 | 147600 | 27.1629 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.5289 | 83.0 | 149400 | 26.2994 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.4807 | 84.0 | 151200 | 27.1147 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.4047 | 85.0 | 153000 | 26.8552 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.3842 | 86.0 | 154800 | 27.6915 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.3706 | 87.0 | 156600 | 27.8776 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.2643 | 88.0 | 158400 | 27.3786 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.4693 | 89.0 | 160200 | 28.4332 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.0632 | 90.0 | 162000 | 27.2075 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.0348 | 91.0 | 163800 | 27.4420 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.3628 | 92.0 | 165600 | 26.9981 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.9328 | 93.0 | 167400 | 27.8303 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.0108 | 94.0 | 169200 | 27.6372 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.2565 | 95.0 | 171000 | 27.6750 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.8395 | 96.0 | 172800 | 27.5674 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.9181 | 97.0 | 174600 | 27.3259 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.84 | 98.0 | 176400 | 27.4295 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.7649 | 99.0 | 178200 | 26.5326 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.9249 | 100.0 | 180000 | 27.0068 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.729 | 101.0 | 181800 | 26.8027 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.7564 | 102.0 | 183600 | 26.4852 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.7072 | 103.0 | 185400 | 26.3657 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.8089 | 104.0 | 187200 | 26.4242 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.712 | 105.0 | 189000 | 26.6715 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.5872 | 106.0 | 190800 | 27.1173 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.6578 | 107.0 | 192600 | 27.0949 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4741 | 108.0 | 194400 | 27.0716 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.5447 | 109.0 | 196200 | 26.8918 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.404 | 110.0 | 198000 | 26.8743 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.6108 | 111.0 | 199800 | 27.0441 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4738 | 112.0 | 201600 | 27.0646 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.3679 | 113.0 | 203400 | 27.0223 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4184 | 114.0 | 205200 | 27.0707 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4131 | 115.0 | 207000 | 27.1931 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.241 | 116.0 | 208800 | 27.1367 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.5176 | 117.0 | 210600 | 27.2458 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.262 | 118.0 | 212400 | 27.2294 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.34 | 119.0 | 214200 | 27.1732 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2806 | 120.0 | 216000 | 27.1519 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.53.0.dev0
- Pytorch 2.5.1
- Datasets 3.2.0
- Tokenizers 0.21.1
| [
"ampulla of vater",
"angiectasia",
"blood - fresh",
"blood - hematin",
"erosion",
"erythema",
"foreign body",
"ileocecal valve",
"lymphangiectasia",
"normal clean mucosa",
"polyp",
"pylorus",
"reduced mucosal view",
"ulcer"
] |
zoros-ai/deta-swin-large-room-detector-test-epoch-0 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
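As a stopgap while the section above is unfilled, here is a minimal, hypothetical usage sketch. It assumes the checkpoint loads through the standard 🤗 Transformers object-detection Auto classes and exposes the usual `post_process_object_detection` API; the repository name, the example image path, and the 0.5 score threshold are illustrative assumptions, so verify them against the actual repo files before relying on this.

```python
# Hypothetical sketch — assumes the checkpoint follows the standard
# transformers object-detection API; not confirmed by this model card.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "zoros-ai/deta-swin-large-room-detector-test-epoch-0"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("room.jpg").convert("RGB")  # any RGB photo
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# target_sizes expects (height, width); PIL's image.size is (width, height),
# so reverse it before post-processing.
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=[image.size[::-1]]
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```

Detections below the threshold are discarded; lower the threshold to inspect weaker candidates.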
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal_room",
"outdoor"
] |
zoros-ai/deta-swin-large-room-detector-test-epoch-1 |
[More Information Needed] | [
"n/a",
"internal_room",
"outdoor"
] |
zoros-ai/deta-swin-large-room-detector-test-epoch-2 |
[More Information Needed] | [
"n/a",
"internal_room",
"outdoor"
] |
zoros-ai/deta-swin-large-room-detector-test-epoch-3 |
[More Information Needed] | [
"n/a",
"internal_room",
"outdoor"
] |
zoros-ai/deta-swin-large-room-detector-test-epoch-4 |
[More Information Needed] | [
"n/a",
"internal_room",
"outdoor"
] |
zoros-ai/deta-swin-large-room-detector-test-epoch-5 |
[More Information Needed] | [
"n/a",
"internal_room",
"outdoor"
] |
zoros-ai/deta-swin-large-room-detector-test-epoch-6 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
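Until the authors fill this section in, here is a minimal sketch, assuming the checkpoint loads through the standard `transformers` object-detection pipeline (the function name and score threshold are illustrative, not part of the released model):

```python
def detect_rooms(image, score_threshold=0.5,
                 model_id="zoros-ai/deta-swin-large-room-detector-test-epoch-6"):
    """Run the checkpoint via the transformers object-detection pipeline
    and keep only predictions above the score threshold."""
    from transformers import pipeline  # imported lazily; requires `transformers` installed
    detector = pipeline("object-detection", model=model_id)
    return [d for d in detector(image) if d["score"] >= score_threshold]

# Usage (downloads the checkpoint on first call):
#   detect_rooms("photo.jpg")
```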
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"internal_room",
"outdoor"
] |
goodcasper/cppe-5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# cppe-5
This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 125.0968
- eval_model_preparation_time: 0.0034
- eval_map: 0.0
- eval_map_50: 0.0003
- eval_map_75: 0.0
- eval_map_small: 0.0
- eval_map_medium: 0.0001
- eval_map_large: 0.0002
- eval_mar_1: 0.0013
- eval_mar_10: 0.0027
- eval_mar_100: 0.0076
- eval_mar_small: 0.0
- eval_mar_medium: 0.0146
- eval_mar_large: 0.0218
- eval_map_Coverall: 0.0
- eval_mar_100_Coverall: 0.0072
- eval_map_Face_Shield: 0.0
- eval_mar_100_Face_Shield: 0.0025
- eval_map_Gloves: 0.0001
- eval_mar_100_Gloves: 0.0004
- eval_map_Goggles: 0.0
- eval_mar_100_Goggles: 0.0108
- eval_map_Mask: 0.0
- eval_mar_100_Mask: 0.0169
- eval_runtime: 3.3548
- eval_samples_per_second: 44.713
- eval_steps_per_second: 5.664
- step: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 120
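The linear scheduler with 300 warmup steps listed above can be sketched in plain Python. The total-step count below is a placeholder (it depends on dataset size); the shape — linear warmup to the base rate, then linear decay to zero — follows the usual `transformers` linear schedule:

```python
def linear_lr(step, base_lr=1e-4, warmup_steps=300, total_steps=108_000):
    """Learning rate at a given optimizer step: linear warmup, then linear decay.
    total_steps is a placeholder value, not taken from this training run."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```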
### Framework versions
- Transformers 4.53.0.dev0
- Pytorch 2.5.1
- Datasets 3.2.0
- Tokenizers 0.21.1
| [
"ampulla of vater",
"angiectasia",
"blood - fresh",
"blood - hematin",
"erosion",
"erythema",
"foreign body",
"ileocecal valve",
"lymphangiectasia",
"normal clean mucosa",
"polyp",
"pylorus",
"reduced mucosal view",
"ulcer"
] |
ddddddpppppppp/yolos-base-fashionpedia-finetuned-v1-1-1 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# yolos-base-fashionpedia-finetuned-v1-1-1
This model is a fine-tuned version of [hustvl/yolos-base](https://huggingface.co/hustvl/yolos-base) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.03
- training_steps: 5
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.44.2
- Pytorch 2.6.0+cu124
- Datasets 2.21.0
- Tokenizers 0.19.1
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
goodcasper/rtdetr-r50-cppe5-finetune |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-r50-cppe5-finetune
This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 11.4048
- Map: 0.3698
- Map 50: 0.6663
- Map 75: 0.3596
- Map Small: 0.2524
- Map Medium: 0.3166
- Map Large: 0.5839
- Mar 1: 0.332
- Mar 10: 0.4962
- Mar 100: 0.5113
- Mar Small: 0.3579
- Mar Medium: 0.4461
- Mar Large: 0.7014
- Map Coverall: 0.5578
- Mar 100 Coverall: 0.6901
- Map Face Shield: 0.435
- Mar 100 Face Shield: 0.5468
- Map Gloves: 0.2703
- Mar 100 Gloves: 0.4138
- Map Goggles: 0.247
- Mar 100 Goggles: 0.4431
- Map Mask: 0.3388
- Mar 100 Mask: 0.4627
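The Map/Mar figures above are COCO-style metrics, which are built on box IoU. A self-contained sketch of that underlying computation, assuming corner-format `(x_min, y_min, x_max, y_max)` boxes:

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes in (x_min, y_min, x_max, y_max) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])  # intersection top-left
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])  # intersection bottom-right
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

For example, the "Map 50" column counts a detection as correct when `box_iou` with a ground-truth box is at least 0.5.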
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 30
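Reading the training-results table together with the hyperparameters above: at 107 optimization steps per epoch, the 300 warmup steps cover roughly the first 9% of the run. A quick check of that arithmetic:

```python
steps_per_epoch = 107                       # from the training-results table
num_epochs = 30
warmup_steps = 300                          # lr_scheduler_warmup_steps above
total_steps = steps_per_epoch * num_epochs  # matches the table's final step count
warmup_fraction = warmup_steps / total_steps
print(total_steps, round(warmup_fraction, 3))  # → 3210 0.093
```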
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 38.1173 | 0.0239 | 0.0536 | 0.0182 | 0.0017 | 0.0063 | 0.0375 | 0.045 | 0.1248 | 0.1733 | 0.0817 | 0.1162 | 0.2999 | 0.1064 | 0.391 | 0.0011 | 0.1595 | 0.0009 | 0.0674 | 0.0001 | 0.0677 | 0.0111 | 0.1809 |
| No log | 2.0 | 214 | 20.2500 | 0.0786 | 0.1471 | 0.0694 | 0.0378 | 0.0366 | 0.1131 | 0.1295 | 0.3169 | 0.3967 | 0.2123 | 0.3301 | 0.5708 | 0.2668 | 0.6315 | 0.0395 | 0.4772 | 0.0063 | 0.1781 | 0.0108 | 0.3015 | 0.0694 | 0.3951 |
| No log | 3.0 | 321 | 14.0771 | 0.1696 | 0.3114 | 0.1609 | 0.1019 | 0.125 | 0.2663 | 0.2055 | 0.4412 | 0.5148 | 0.3882 | 0.433 | 0.7183 | 0.4291 | 0.6932 | 0.0585 | 0.4987 | 0.0727 | 0.4357 | 0.0159 | 0.4354 | 0.2719 | 0.5111 |
| No log | 4.0 | 428 | 12.8123 | 0.2421 | 0.4463 | 0.2257 | 0.1836 | 0.1973 | 0.4273 | 0.2903 | 0.4809 | 0.5287 | 0.3982 | 0.4544 | 0.7118 | 0.5089 | 0.6991 | 0.1559 | 0.543 | 0.1756 | 0.3964 | 0.0668 | 0.4923 | 0.3032 | 0.5124 |
| 40.7658 | 5.0 | 535 | 12.0783 | 0.3013 | 0.5536 | 0.2782 | 0.2043 | 0.2539 | 0.5116 | 0.3139 | 0.4932 | 0.5297 | 0.3962 | 0.4534 | 0.7173 | 0.5221 | 0.6869 | 0.274 | 0.5468 | 0.2288 | 0.4214 | 0.1387 | 0.4846 | 0.3427 | 0.5084 |
| 40.7658 | 6.0 | 642 | 11.9476 | 0.3233 | 0.5871 | 0.3042 | 0.2256 | 0.2655 | 0.5617 | 0.3154 | 0.4967 | 0.5339 | 0.3914 | 0.4604 | 0.7117 | 0.5475 | 0.6982 | 0.3259 | 0.5544 | 0.226 | 0.4196 | 0.1744 | 0.4985 | 0.3426 | 0.4987 |
| 40.7658 | 7.0 | 749 | 12.0143 | 0.3191 | 0.5775 | 0.3 | 0.2074 | 0.2837 | 0.5463 | 0.3135 | 0.4943 | 0.5242 | 0.3958 | 0.4549 | 0.7014 | 0.5407 | 0.7005 | 0.3394 | 0.5468 | 0.2554 | 0.4263 | 0.1487 | 0.4708 | 0.3116 | 0.4764 |
| 40.7658 | 8.0 | 856 | 11.7715 | 0.323 | 0.5883 | 0.312 | 0.245 | 0.2818 | 0.5729 | 0.3169 | 0.4901 | 0.5167 | 0.3926 | 0.4433 | 0.7148 | 0.5428 | 0.6964 | 0.3181 | 0.5165 | 0.2554 | 0.4339 | 0.1594 | 0.4462 | 0.3392 | 0.4907 |
| 40.7658 | 9.0 | 963 | 11.5823 | 0.3418 | 0.6091 | 0.3348 | 0.2329 | 0.2916 | 0.5665 | 0.3219 | 0.5008 | 0.5334 | 0.4041 | 0.4671 | 0.7151 | 0.5559 | 0.7099 | 0.3539 | 0.5443 | 0.2411 | 0.4531 | 0.2406 | 0.4677 | 0.3175 | 0.492 |
| 13.7224 | 10.0 | 1070 | 11.6300 | 0.3581 | 0.6369 | 0.362 | 0.2516 | 0.3175 | 0.5714 | 0.3324 | 0.5096 | 0.5319 | 0.4041 | 0.4692 | 0.7025 | 0.5515 | 0.705 | 0.4138 | 0.543 | 0.2426 | 0.417 | 0.2285 | 0.4938 | 0.3542 | 0.5009 |
| 13.7224 | 11.0 | 1177 | 11.6619 | 0.3652 | 0.6597 | 0.344 | 0.2459 | 0.3263 | 0.5773 | 0.3305 | 0.5129 | 0.5405 | 0.3997 | 0.4736 | 0.7207 | 0.5566 | 0.7108 | 0.4033 | 0.5494 | 0.2706 | 0.4487 | 0.2514 | 0.5031 | 0.344 | 0.4907 |
| 13.7224 | 12.0 | 1284 | 11.8307 | 0.3288 | 0.6261 | 0.3112 | 0.219 | 0.2902 | 0.5476 | 0.3111 | 0.4881 | 0.5164 | 0.3599 | 0.4488 | 0.7113 | 0.5372 | 0.7059 | 0.3679 | 0.5456 | 0.2415 | 0.4027 | 0.1971 | 0.4508 | 0.3003 | 0.4769 |
| 13.7224 | 13.0 | 1391 | 11.7494 | 0.3458 | 0.6348 | 0.3297 | 0.2254 | 0.3003 | 0.5931 | 0.3272 | 0.5064 | 0.5295 | 0.3798 | 0.4552 | 0.7058 | 0.5416 | 0.7045 | 0.4048 | 0.5443 | 0.265 | 0.4326 | 0.1986 | 0.4846 | 0.319 | 0.4813 |
| 13.7224 | 14.0 | 1498 | 11.6319 | 0.3567 | 0.6434 | 0.3523 | 0.2294 | 0.303 | 0.5846 | 0.3245 | 0.4986 | 0.5265 | 0.3667 | 0.4571 | 0.7032 | 0.5587 | 0.6991 | 0.4268 | 0.5734 | 0.255 | 0.4107 | 0.2166 | 0.4662 | 0.3264 | 0.4831 |
| 11.7687 | 15.0 | 1605 | 11.7305 | 0.3557 | 0.6509 | 0.3442 | 0.2408 | 0.3136 | 0.5382 | 0.3188 | 0.4873 | 0.5143 | 0.3672 | 0.4534 | 0.6755 | 0.5477 | 0.695 | 0.4122 | 0.5608 | 0.2519 | 0.3969 | 0.2532 | 0.4585 | 0.3137 | 0.4604 |
| 11.7687 | 16.0 | 1712 | 11.4537 | 0.3478 | 0.6403 | 0.3341 | 0.2193 | 0.3057 | 0.5617 | 0.3218 | 0.4949 | 0.5209 | 0.3553 | 0.4458 | 0.7135 | 0.5613 | 0.7027 | 0.3627 | 0.5418 | 0.2732 | 0.4393 | 0.2056 | 0.4492 | 0.3363 | 0.4716 |
| 11.7687 | 17.0 | 1819 | 11.5661 | 0.3576 | 0.6407 | 0.3421 | 0.2485 | 0.3213 | 0.5554 | 0.3299 | 0.5023 | 0.5256 | 0.4024 | 0.449 | 0.6992 | 0.5424 | 0.6959 | 0.4262 | 0.5785 | 0.262 | 0.417 | 0.214 | 0.4569 | 0.3435 | 0.4796 |
| 11.7687 | 18.0 | 1926 | 11.4580 | 0.3564 | 0.6423 | 0.3545 | 0.2224 | 0.318 | 0.5983 | 0.3288 | 0.4976 | 0.5194 | 0.3526 | 0.4523 | 0.711 | 0.546 | 0.6973 | 0.4153 | 0.5405 | 0.2618 | 0.4259 | 0.2044 | 0.4477 | 0.3546 | 0.4858 |
| 10.479 | 19.0 | 2033 | 11.5507 | 0.3608 | 0.6498 | 0.3486 | 0.2252 | 0.3254 | 0.587 | 0.3291 | 0.5001 | 0.5211 | 0.3523 | 0.4462 | 0.7184 | 0.5407 | 0.6977 | 0.4341 | 0.5392 | 0.2655 | 0.4366 | 0.2224 | 0.4508 | 0.3413 | 0.4809 |
| 10.479 | 20.0 | 2140 | 11.7187 | 0.3576 | 0.6502 | 0.3485 | 0.2247 | 0.319 | 0.5606 | 0.322 | 0.4913 | 0.5105 | 0.3577 | 0.4402 | 0.6818 | 0.5599 | 0.6964 | 0.3959 | 0.5367 | 0.2602 | 0.3938 | 0.2339 | 0.4477 | 0.3382 | 0.4778 |
| 10.479 | 21.0 | 2247 | 11.2984 | 0.363 | 0.6442 | 0.3638 | 0.2116 | 0.3125 | 0.6121 | 0.3344 | 0.5051 | 0.527 | 0.3712 | 0.4552 | 0.719 | 0.5621 | 0.7 | 0.4212 | 0.5747 | 0.2714 | 0.4317 | 0.2142 | 0.4538 | 0.346 | 0.4747 |
| 10.479 | 22.0 | 2354 | 11.5029 | 0.3654 | 0.6576 | 0.3668 | 0.2177 | 0.3175 | 0.5762 | 0.3344 | 0.5016 | 0.5186 | 0.3388 | 0.4581 | 0.7034 | 0.5487 | 0.6955 | 0.4263 | 0.5595 | 0.2562 | 0.4125 | 0.2452 | 0.4492 | 0.3506 | 0.4764 |
| 10.479 | 23.0 | 2461 | 11.4438 | 0.3625 | 0.6551 | 0.3527 | 0.2173 | 0.3068 | 0.6091 | 0.3361 | 0.5003 | 0.5183 | 0.3378 | 0.4499 | 0.7181 | 0.5512 | 0.6937 | 0.4094 | 0.5481 | 0.2637 | 0.4165 | 0.2516 | 0.4585 | 0.3366 | 0.4747 |
| 9.4021 | 24.0 | 2568 | 11.4254 | 0.3631 | 0.6577 | 0.3507 | 0.2274 | 0.3116 | 0.5933 | 0.3297 | 0.4955 | 0.5166 | 0.3529 | 0.4438 | 0.7136 | 0.5599 | 0.695 | 0.4109 | 0.538 | 0.2742 | 0.4375 | 0.2418 | 0.4538 | 0.3285 | 0.4587 |
| 9.4021 | 25.0 | 2675 | 11.6428 | 0.354 | 0.6519 | 0.3337 | 0.2378 | 0.3008 | 0.5779 | 0.3268 | 0.4892 | 0.5029 | 0.3615 | 0.4249 | 0.7016 | 0.5466 | 0.6892 | 0.4023 | 0.5203 | 0.258 | 0.404 | 0.2308 | 0.44 | 0.3321 | 0.4609 |
| 9.4021 | 26.0 | 2782 | 11.4190 | 0.3659 | 0.6554 | 0.3551 | 0.2343 | 0.3126 | 0.6047 | 0.333 | 0.4993 | 0.5179 | 0.3872 | 0.4418 | 0.7056 | 0.5518 | 0.6932 | 0.4314 | 0.5582 | 0.2773 | 0.4205 | 0.2282 | 0.46 | 0.341 | 0.4573 |
| 9.4021 | 27.0 | 2889 | 11.3988 | 0.3661 | 0.6618 | 0.3654 | 0.2523 | 0.3161 | 0.5919 | 0.3314 | 0.506 | 0.5185 | 0.3714 | 0.4545 | 0.709 | 0.5592 | 0.6959 | 0.4322 | 0.562 | 0.2704 | 0.4134 | 0.2311 | 0.4508 | 0.3374 | 0.4702 |
| 9.4021 | 28.0 | 2996 | 11.4137 | 0.3635 | 0.6549 | 0.3624 | 0.2519 | 0.3113 | 0.5752 | 0.3305 | 0.4981 | 0.5141 | 0.3739 | 0.4446 | 0.6948 | 0.5584 | 0.6995 | 0.4198 | 0.5532 | 0.2665 | 0.4125 | 0.2305 | 0.4431 | 0.3425 | 0.4622 |
| 8.5484 | 29.0 | 3103 | 11.4048 | 0.3698 | 0.6663 | 0.3596 | 0.2524 | 0.3166 | 0.5839 | 0.332 | 0.4962 | 0.5113 | 0.3579 | 0.4461 | 0.7014 | 0.5578 | 0.6901 | 0.435 | 0.5468 | 0.2703 | 0.4138 | 0.247 | 0.4431 | 0.3388 | 0.4627 |
| 8.5484 | 30.0 | 3210 | 11.4198 | 0.3626 | 0.6611 | 0.3605 | 0.2487 | 0.3088 | 0.5785 | 0.3296 | 0.4903 | 0.5045 | 0.3532 | 0.4404 | 0.6922 | 0.5612 | 0.7009 | 0.4241 | 0.5316 | 0.2636 | 0.4027 | 0.2319 | 0.4277 | 0.3323 | 0.4596 |
### Framework versions
- Transformers 4.53.0.dev0
- Pytorch 2.5.1
- Datasets 3.2.0
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
goodcasper/kvasir_rtdetrv2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# kvasir_rtdetrv2
This model is a fine-tuned version of [jadechoghari/RT-DETRv2](https://huggingface.co/jadechoghari/RT-DETRv2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 8.5313
- Map: 0.279
- Map 50: 0.3513
- Map 75: 0.3148
- Map Small: -1.0
- Map Medium: 0.1617
- Map Large: 0.3331
- Mar 1: 0.3197
- Mar 10: 0.3197
- Mar 100: 0.3197
- Mar Small: -1.0
- Mar Medium: 0.1869
- Mar Large: 0.3819
- Map Ampulla of vater: -1.0
- Mar 100 Ampulla of vater: -1.0
- Map Angiectasia: 0.2223
- Mar 100 Angiectasia: 0.3358
- Map Blood - fresh: 0.6514
- Mar 100 Blood - fresh: 0.6855
- Map Blood - hematin: 0.0
- Mar 100 Blood - hematin: 0.0
- Map Erosion: 0.0807
- Mar 100 Erosion: 0.125
- Map Erythema: 0.0693
- Mar 100 Erythema: 0.0667
- Map Foreign body: 0.4394
- Mar 100 Foreign body: 0.4922
- Map Lymphangiectasia: 0.4583
- Mar 100 Lymphangiectasia: 0.5092
- Map Polyp: 0.3218
- Mar 100 Polyp: 0.3714
- Map Ulcer: 0.2681
- Mar 100 Ulcer: 0.2917
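Several entries above are reported as -1.0 (e.g. Map Small, Map Ampulla of vater); in COCO-style evaluation this is a sentinel meaning no ground-truth boxes fell in that size bucket or class, not a score. When aggregating these numbers it is safer to drop the sentinels first — a small sketch:

```python
def drop_absent_buckets(metrics):
    """Remove COCO-style metrics reported as the -1.0 sentinel
    (no ground truth in that size bucket or class)."""
    return {k: v for k, v in metrics.items() if v != -1.0}
```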
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 120
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Ampulla of vater | Mar 100 Ampulla of vater | Map Angiectasia | Mar 100 Angiectasia | Map Blood - fresh | Mar 100 Blood - fresh | Map Blood - hematin | Mar 100 Blood - hematin | Map Erosion | Mar 100 Erosion | Map Erythema | Mar 100 Erythema | Map Foreign body | Mar 100 Foreign body | Map Ileocecal valve | Mar 100 Ileocecal valve | Map Lymphangiectasia | Mar 100 Lymphangiectasia | Map Normal clean mucosa | Mar 100 Normal clean mucosa | Map Polyp | Mar 100 Polyp | Map Pylorus | Mar 100 Pylorus | Map Reduced mucosal view | Mar 100 Reduced mucosal view | Map Ulcer | Mar 100 Ulcer |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------:|:------------------------:|:---------------:|:-------------------:|:-----------------:|:---------------------:|:-------------------:|:-----------------------:|:-----------:|:---------------:|:------------:|:----------------:|:----------------:|:--------------------:|:-------------------:|:-----------------------:|:--------------------:|:------------------------:|:-----------------------:|:---------------------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------------------:|:----------------------------:|:---------:|:-------------:|
| 79.2744 | 1.0 | 900 | 10.7812 | 0.0328 | 0.0625 | 0.0298 | -1.0 | 0.0012 | 0.0374 | 0.1799 | 0.2489 | 0.2642 | -1.0 | 0.0739 | 0.3264 | -1.0 | -1.0 | 0.0019 | 0.2146 | 0.2642 | 0.5473 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0261 | 0.5264 | -1.0 | -1.0 | 0.0014 | 0.3571 | -1.0 | -1.0 | 0.0012 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0322 |
| 10.6582 | 2.0 | 1800 | 10.3337 | 0.0736 | 0.1291 | 0.0695 | -1.0 | 0.0114 | 0.0836 | 0.1681 | 0.194 | 0.1946 | -1.0 | 0.0336 | 0.2376 | -1.0 | -1.0 | 0.0482 | 0.2236 | 0.4152 | 0.6036 | 0.0 | 0.0 | 0.0001 | 0.0477 | 0.0 | 0.0 | 0.1817 | 0.3419 | -1.0 | -1.0 | 0.0004 | 0.0663 | -1.0 | -1.0 | 0.0163 | 0.4571 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0107 |
| 9.2063 | 3.0 | 2700 | 10.6762 | 0.039 | 0.0724 | 0.0367 | -1.0 | 0.0308 | 0.0397 | 0.1387 | 0.1525 | 0.1572 | -1.0 | 0.0496 | 0.1911 | -1.0 | -1.0 | 0.0487 | 0.178 | 0.2236 | 0.4236 | 0.0 | 0.0 | 0.0006 | 0.0455 | 0.0 | 0.0 | 0.0766 | 0.2512 | -1.0 | -1.0 | 0.0007 | 0.1143 | -1.0 | -1.0 | 0.001 | 0.4 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0025 |
| 8.297 | 4.0 | 3600 | 10.3237 | 0.091 | 0.1369 | 0.0931 | -1.0 | 0.0333 | 0.1065 | 0.1766 | 0.1823 | 0.1831 | -1.0 | 0.0645 | 0.2251 | -1.0 | -1.0 | 0.0335 | 0.1585 | 0.5323 | 0.7073 | 0.0 | 0.0 | 0.0004 | 0.0284 | 0.0 | 0.0 | 0.247 | 0.4047 | -1.0 | -1.0 | 0.0035 | 0.1255 | -1.0 | -1.0 | 0.0025 | 0.2143 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0001 | 0.0091 |
| 8.0561 | 5.0 | 4500 | 10.1934 | 0.0654 | 0.111 | 0.0614 | -1.0 | 0.0533 | 0.0742 | 0.1453 | 0.1553 | 0.1554 | -1.0 | 0.0852 | 0.1806 | -1.0 | -1.0 | 0.0752 | 0.2366 | 0.2632 | 0.4109 | 0.0 | 0.0 | 0.0062 | 0.0227 | 0.0 | 0.0 | 0.2376 | 0.3519 | -1.0 | -1.0 | 0.0046 | 0.1765 | -1.0 | -1.0 | 0.0012 | 0.1857 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0001 | 0.014 |
| 7.8983 | 6.0 | 5400 | 10.7110 | 0.0409 | 0.0693 | 0.0405 | -1.0 | 0.0041 | 0.05 | 0.0957 | 0.1006 | 0.1006 | -1.0 | 0.01 | 0.1326 | -1.0 | -1.0 | 0.0302 | 0.1049 | 0.2532 | 0.5145 | 0.0 | 0.0 | 0.002 | 0.0125 | 0.0 | 0.0 | 0.0829 | 0.1736 | -1.0 | -1.0 | 0.0001 | 0.0357 | -1.0 | -1.0 | 0.0001 | 0.0571 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0066 |
| 7.5869 | 7.0 | 6300 | 10.6131 | 0.0649 | 0.0852 | 0.0659 | -1.0 | 0.0151 | 0.076 | 0.0956 | 0.096 | 0.096 | -1.0 | 0.0154 | 0.1153 | -1.0 | -1.0 | 0.0189 | 0.0553 | 0.481 | 0.5727 | 0.0 | 0.0 | 0.0001 | 0.0034 | 0.0 | 0.0 | 0.0617 | 0.1039 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0223 | 0.1286 | -1.0 | -1.0 | 0.0 | 0.0 |
| 7.3776 | 8.0 | 7200 | 10.5249 | 0.0716 | 0.0924 | 0.0756 | -1.0 | 0.0012 | 0.0849 | 0.0921 | 0.0922 | 0.0922 | -1.0 | 0.0011 | 0.1142 | -1.0 | -1.0 | 0.0248 | 0.0496 | 0.4052 | 0.4564 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.076 | 0.1318 | -1.0 | -1.0 | 0.0002 | 0.0061 | -1.0 | -1.0 | 0.1386 | 0.1857 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 7.2382 | 9.0 | 8100 | 11.2648 | 0.0429 | 0.0583 | 0.0455 | -1.0 | 0.0146 | 0.0492 | 0.0479 | 0.048 | 0.048 | -1.0 | 0.0143 | 0.0546 | -1.0 | -1.0 | 0.003 | 0.0024 | 0.2283 | 0.2582 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0361 | 0.0519 | 0.0003 | 0.0051 | -1.0 | -1.0 | 0.1188 | 0.1143 | -1.0 | -1.0 | 0.0 | 0.0 |
| 6.831 | 10.0 | 9000 | 10.7233 | 0.0528 | 0.0733 | 0.0541 | -1.0 | 0.0161 | 0.0593 | 0.0668 | 0.0668 | 0.0668 | -1.0 | 0.0158 | 0.0786 | -1.0 | -1.0 | 0.0088 | 0.0171 | 0.3129 | 0.3382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1213 | 0.1798 | 0.0319 | 0.0663 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 6.7327 | 11.0 | 9900 | 10.5114 | 0.0408 | 0.0499 | 0.0466 | -1.0 | 0.0033 | 0.0473 | 0.0531 | 0.0531 | 0.0531 | -1.0 | 0.0111 | 0.0592 | -1.0 | -1.0 | 0.0 | 0.0 | 0.3085 | 0.3709 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0588 | 0.1008 | -1.0 | -1.0 | 0.0 | 0.0061 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 |
| 6.4742 | 12.0 | 10800 | 10.3285 | 0.0807 | 0.1329 | 0.0795 | -1.0 | 0.0146 | 0.0912 | 0.0897 | 0.0901 | 0.0901 | -1.0 | 0.0146 | 0.1023 | 0.0007 | 0.0033 | 0.3936 | 0.4036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1238 | 0.1682 | 0.0002 | 0.0071 | 0.2076 | 0.2286 | 0.0 | 0.0 |
| 6.3868 | 13.0 | 11700 | 10.5031 | 0.0399 | 0.0471 | 0.0427 | -1.0 | 0.0089 | 0.0449 | 0.0422 | 0.0422 | 0.0422 | -1.0 | 0.0086 | 0.0476 | 0.0 | 0.0 | 0.2838 | 0.2891 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0752 | 0.0907 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.2545 | 14.0 | 12600 | 10.7381 | 0.0744 | 0.091 | 0.0853 | -1.0 | 0.0 | 0.084 | 0.0757 | 0.0757 | 0.0757 | -1.0 | 0.0 | 0.0855 | 0.0 | 0.0 | 0.4021 | 0.4091 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0519 | 0.0581 | 0.0 | 0.0 | 0.2158 | 0.2143 | 0.0 | 0.0 |
| 6.224 | 15.0 | 13500 | 9.9789 | 0.1098 | 0.1532 | 0.1177 | -1.0 | 0.0062 | 0.1249 | 0.1182 | 0.1182 | 0.1182 | -1.0 | 0.0063 | 0.1369 | 0.0055 | 0.0187 | 0.5409 | 0.5655 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1279 | 0.1651 | 0.0 | 0.0 | 0.3139 | 0.3143 | 0.0 | 0.0 |
| 6.0771 | 16.0 | 14400 | 10.0770 | 0.105 | 0.1321 | 0.1192 | -1.0 | 0.0175 | 0.1193 | 0.1091 | 0.1091 | 0.1091 | -1.0 | 0.0171 | 0.1244 | 0.0004 | 0.0016 | 0.4296 | 0.4327 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1507 | 0.1752 | 0.0228 | 0.0153 | 0.3416 | 0.3571 | 0.0 | 0.0 |
| 6.0795 | 17.0 | 15300 | 9.5517 | 0.1306 | 0.1799 | 0.1463 | -1.0 | 0.0156 | 0.1519 | 0.1493 | 0.1493 | 0.1493 | -1.0 | 0.0197 | 0.1794 | -1.0 | -1.0 | 0.019 | 0.0569 | 0.6091 | 0.6545 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2918 | 0.3581 | 0.0397 | 0.0459 | 0.2158 | 0.2286 | 0.0 | 0.0 |
| 6.0202 | 18.0 | 16200 | 10.1174 | 0.0805 | 0.1046 | 0.0831 | -1.0 | 0.0088 | 0.092 | 0.0859 | 0.0859 | 0.0859 | -1.0 | 0.0086 | 0.1019 | -1.0 | -1.0 | 0.0053 | 0.0211 | 0.4644 | 0.4655 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1359 | 0.1721 | 0.0 | 0.0 | 0.1188 | 0.1143 | 0.0 | 0.0 |
| 5.7902 | 19.0 | 17100 | 10.5153 | 0.0924 | 0.1166 | 0.1017 | -1.0 | 0.0119 | 0.1051 | 0.1045 | 0.1045 | 0.1045 | -1.0 | 0.0114 | 0.1222 | -1.0 | -1.0 | 0.0027 | 0.0171 | 0.4373 | 0.4491 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1928 | 0.2457 | 0.0 | 0.0 | 0.1983 | 0.2286 | 0.0 | 0.0 |
| 5.774 | 20.0 | 18000 | 10.6290 | 0.1295 | 0.1553 | 0.1475 | -1.0 | 0.0211 | 0.1474 | 0.1436 | 0.1436 | 0.1436 | -1.0 | 0.0202 | 0.1694 | -1.0 | -1.0 | 0.0048 | 0.0285 | 0.4796 | 0.4873 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3066 | 0.3829 | 0.0053 | 0.0082 | 0.3696 | 0.3857 | 0.0 | 0.0 |
| 5.6177 | 21.0 | 18900 | 9.8951 | 0.135 | 0.1749 | 0.1554 | -1.0 | 0.035 | 0.1561 | 0.1516 | 0.1516 | 0.1516 | -1.0 | 0.0354 | 0.1793 | -1.0 | -1.0 | 0.0139 | 0.0415 | 0.5542 | 0.5691 | 0.0 | 0.0 | 0.0004 | 0.0011 | 0.0 | 0.0 | 0.3607 | 0.4364 | 0.055 | 0.0878 | 0.2307 | 0.2286 | 0.0 | 0.0 |
| 5.6471 | 22.0 | 19800 | 10.2807 | 0.0776 | 0.0958 | 0.0819 | -1.0 | 0.0299 | 0.0881 | 0.0811 | 0.0811 | 0.0811 | -1.0 | 0.0293 | 0.0948 | 0.0048 | 0.0179 | 0.482 | 0.4836 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1962 | 0.2202 | 0.0158 | 0.0082 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.5442 | 23.0 | 20700 | 10.2248 | 0.0997 | 0.1248 | 0.1044 | -1.0 | 0.0209 | 0.1138 | 0.1052 | 0.1052 | 0.1052 | -1.0 | 0.0205 | 0.1239 | -1.0 | -1.0 | 0.0047 | 0.022 | 0.5267 | 0.5418 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2273 | 0.2543 | 0.0198 | 0.0143 | 0.1188 | 0.1143 | 0.0 | 0.0 |
| 5.3507 | 24.0 | 21600 | 10.8557 | 0.1004 | 0.1178 | 0.111 | -1.0 | 0.0149 | 0.1144 | 0.1033 | 0.1033 | 0.1033 | -1.0 | 0.0143 | 0.1189 | 0.0036 | 0.0065 | 0.4187 | 0.4218 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2233 | 0.2372 | 0.0139 | 0.0071 | 0.2446 | 0.2571 | 0.0 | 0.0 |
| 5.395 | 25.0 | 22500 | 10.4770 | 0.1333 | 0.1663 | 0.1389 | -1.0 | 0.0348 | 0.1512 | 0.147 | 0.147 | 0.147 | -1.0 | 0.0343 | 0.1739 | -1.0 | -1.0 | 0.0078 | 0.0423 | 0.5931 | 0.6127 | 0.0 | 0.0 | 0.0028 | 0.0216 | 0.0 | 0.0 | 0.2955 | 0.3318 | 0.0 | 0.0 | 0.3 | 0.3143 | 0.0 | 0.0 |
| 5.2179 | 26.0 | 23400 | 9.9679 | 0.158 | 0.1981 | 0.173 | -1.0 | 0.0501 | 0.1833 | 0.1932 | 0.1936 | 0.1936 | -1.0 | 0.0532 | 0.2361 | -1.0 | -1.0 | 0.0341 | 0.1228 | 0.5985 | 0.6091 | 0.0 | 0.0 | 0.0003 | 0.0045 | 0.0 | 0.0 | 0.3278 | 0.3814 | 0.0666 | 0.0959 | 0.3944 | 0.5286 | 0.0 | 0.0 |
| 5.4533 | 27.0 | 24300 | 10.5207 | 0.1164 | 0.1355 | 0.1231 | -1.0 | 0.032 | 0.1303 | 0.131 | 0.131 | 0.131 | -1.0 | 0.0384 | 0.1454 | 0.0063 | 0.035 | 0.478 | 0.4818 | 0.0 | 0.0 | 0.0002 | 0.0023 | 0.0 | 0.0 | 0.1891 | 0.2279 | 0.0145 | 0.0316 | 0.3594 | 0.4 | 0.0 | 0.0 |
| 5.2858 | 28.0 | 25200 | 9.7756 | 0.1548 | 0.1938 | 0.1742 | -1.0 | 0.0434 | 0.1778 | 0.1723 | 0.1723 | 0.1723 | -1.0 | 0.0479 | 0.2053 | -1.0 | -1.0 | 0.0658 | 0.1561 | 0.511 | 0.5182 | 0.0 | 0.0 | 0.0008 | 0.0068 | 0.0 | 0.0 | 0.2647 | 0.2853 | 0.0795 | 0.099 | 0.4711 | 0.4857 | 0.0 | 0.0 |
| 5.2589 | 29.0 | 26100 | 9.7499 | 0.1537 | 0.1951 | 0.1685 | -1.0 | 0.0614 | 0.1863 | 0.1775 | 0.1775 | 0.1775 | -1.0 | 0.0729 | 0.2149 | -1.0 | -1.0 | 0.0437 | 0.1016 | 0.5447 | 0.5582 | 0.0 | 0.0 | 0.044 | 0.0602 | 0.0 | 0.0 | 0.2797 | 0.3132 | 0.194 | 0.2071 | 0.2769 | 0.3571 | 0.0 | 0.0 |
| 5.1659 | 30.0 | 27000 | 10.3720 | 0.1176 | 0.1385 | 0.1273 | -1.0 | 0.0436 | 0.1391 | 0.1241 | 0.1241 | 0.1241 | -1.0 | 0.0478 | 0.144 | 0.0734 | 0.1008 | 0.3363 | 0.3455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0884 | 0.0969 | 0.1696 | 0.1735 | 0.3906 | 0.4 | 0.0 | 0.0 |
| 5.1328 | 31.0 | 27900 | 9.0129 | 0.2236 | 0.3056 | 0.2417 | -1.0 | 0.0961 | 0.2664 | 0.2691 | 0.2691 | 0.2691 | -1.0 | 0.1086 | 0.3197 | -1.0 | -1.0 | 0.187 | 0.2984 | 0.6421 | 0.6545 | 0.0 | 0.0 | 0.0143 | 0.0273 | 0.0 | 0.0 | 0.3934 | 0.4372 | 0.2577 | 0.2908 | 0.5028 | 0.6714 | 0.0155 | 0.0421 |
| 5.0064 | 32.0 | 28800 | 10.2932 | 0.0894 | 0.1094 | 0.0957 | -1.0 | 0.0252 | 0.1051 | 0.0933 | 0.0933 | 0.0933 | -1.0 | 0.0323 | 0.1125 | -1.0 | -1.0 | 0.0171 | 0.0358 | 0.4061 | 0.4182 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1577 | 0.162 | 0.09 | 0.0949 | 0.1337 | 0.1286 | 0.0 | 0.0 |
| 4.9407 | 33.0 | 29700 | 10.7048 | 0.1219 | 0.1466 | 0.1331 | -1.0 | 0.036 | 0.1392 | 0.1394 | 0.1394 | 0.1394 | -1.0 | 0.0376 | 0.1675 | -1.0 | -1.0 | 0.0546 | 0.1171 | 0.5656 | 0.5709 | 0.0 | 0.0 | 0.001 | 0.0057 | 0.0 | 0.0 | 0.1269 | 0.1496 | 0.0315 | 0.0541 | 0.3173 | 0.3571 | 0.0 | 0.0 |
| 4.8729 | 34.0 | 30600 | 10.2508 | 0.1721 | 0.2085 | 0.1948 | -1.0 | 0.0473 | 0.1904 | 0.1878 | 0.1878 | 0.1878 | -1.0 | 0.0527 | 0.2119 | -1.0 | -1.0 | 0.0804 | 0.1358 | 0.5668 | 0.5836 | 0.0 | 0.0 | 0.0056 | 0.017 | 0.0 | 0.0 | 0.3077 | 0.3217 | 0.0932 | 0.1133 | 0.4946 | 0.5143 | 0.0003 | 0.0041 |
| 4.8488 | 35.0 | 31500 | 10.0591 | 0.1599 | 0.2129 | 0.168 | -1.0 | 0.0606 | 0.1816 | 0.1791 | 0.1791 | 0.1791 | -1.0 | 0.0613 | 0.2028 | -1.0 | -1.0 | 0.0399 | 0.0699 | 0.6811 | 0.7109 | 0.0 | 0.0 | 0.005 | 0.0227 | 0.0 | 0.0 | 0.2554 | 0.2729 | 0.0588 | 0.0704 | 0.3972 | 0.4571 | 0.0014 | 0.0083 |
| 4.8473 | 36.0 | 32400 | 9.7629 | 0.1791 | 0.2272 | 0.2013 | -1.0 | 0.041 | 0.213 | 0.2048 | 0.2048 | 0.2048 | -1.0 | 0.0447 | 0.2498 | -1.0 | -1.0 | 0.0699 | 0.1268 | 0.6334 | 0.6836 | 0.0 | 0.0 | 0.0044 | 0.0261 | 0.0 | 0.0 | 0.4428 | 0.4775 | 0.0918 | 0.1224 | 0.3689 | 0.4 | 0.0007 | 0.0066 |
| 4.7609 | 37.0 | 33300 | 9.4269 | 0.1982 | 0.2427 | 0.2185 | -1.0 | 0.0577 | 0.24 | 0.2165 | 0.2165 | 0.2165 | -1.0 | 0.0637 | 0.2576 | -1.0 | -1.0 | 0.1136 | 0.1496 | 0.6424 | 0.6582 | 0.0 | 0.0 | 0.045 | 0.0739 | 0.0 | 0.0 | 0.3606 | 0.3891 | 0.2512 | 0.2735 | 0.3696 | 0.3857 | 0.0018 | 0.0182 |
| 4.6472 | 38.0 | 34200 | 10.1803 | 0.1411 | 0.1705 | 0.1488 | -1.0 | 0.0508 | 0.1567 | 0.1685 | 0.1685 | 0.1685 | -1.0 | 0.0548 | 0.1875 | -1.0 | -1.0 | 0.031 | 0.065 | 0.4748 | 0.4873 | 0.0 | 0.0 | 0.0211 | 0.0273 | 0.0 | 0.0 | 0.3377 | 0.3519 | 0.1146 | 0.1276 | 0.2908 | 0.4571 | 0.0 | 0.0 |
| 4.6291 | 39.0 | 35100 | 9.9849 | 0.1439 | 0.1643 | 0.1611 | -1.0 | 0.0458 | 0.1635 | 0.1589 | 0.1589 | 0.1589 | -1.0 | 0.0474 | 0.1825 | -1.0 | -1.0 | 0.0454 | 0.078 | 0.5588 | 0.5691 | 0.0 | 0.0 | 0.0188 | 0.0136 | 0.0 | 0.0 | 0.2635 | 0.2752 | 0.1127 | 0.1367 | 0.2958 | 0.3571 | 0.0 | 0.0 |
| 4.6207 | 40.0 | 36000 | 10.4053 | 0.1035 | 0.1165 | 0.1116 | -1.0 | 0.0415 | 0.1157 | 0.1055 | 0.1055 | 0.1055 | -1.0 | 0.042 | 0.1162 | 0.052 | 0.0715 | 0.4691 | 0.4673 | 0.0 | 0.0 | 0.0173 | 0.0159 | 0.0 | 0.0 | 0.1972 | 0.2023 | 0.0474 | 0.05 | 0.1485 | 0.1429 | 0.0 | 0.0 |
| 4.6075 | 41.0 | 36900 | 10.3330 | 0.1099 | 0.1305 | 0.1159 | -1.0 | 0.0597 | 0.1156 | 0.1166 | 0.1166 | 0.1166 | -1.0 | 0.064 | 0.1199 | -1.0 | -1.0 | 0.1209 | 0.1667 | 0.4629 | 0.4655 | 0.0 | 0.0 | 0.0123 | 0.0216 | 0.0 | 0.0 | 0.2205 | 0.2364 | 0.0238 | 0.0163 | 0.1485 | 0.1429 | 0.0 | 0.0 |
| 4.5587 | 42.0 | 37800 | 10.0813 | 0.1694 | 0.2019 | 0.1948 | -1.0 | 0.0677 | 0.188 | 0.1835 | 0.1835 | 0.1835 | -1.0 | 0.0698 | 0.2018 | -1.0 | -1.0 | 0.0863 | 0.1293 | 0.5907 | 0.6073 | 0.0 | 0.0 | 0.0167 | 0.0352 | 0.0 | 0.0 | 0.242 | 0.2597 | 0.1077 | 0.1204 | 0.4812 | 0.5 | 0.0 | 0.0 |
| 4.4395 | 43.0 | 38700 | 10.9231 | 0.1042 | 0.1153 | 0.1127 | -1.0 | 0.0455 | 0.1119 | 0.108 | 0.108 | 0.108 | -1.0 | 0.0434 | 0.1179 | 0.0234 | 0.0439 | 0.4344 | 0.4364 | 0.0 | 0.0 | 0.0104 | 0.0159 | 0.0 | 0.0 | 0.1767 | 0.1829 | 0.0332 | 0.0357 | 0.2594 | 0.2571 | 0.0 | 0.0 |
| 4.521 | 44.0 | 39600 | 10.6241 | 0.1061 | 0.1202 | 0.113 | -1.0 | 0.0601 | 0.1181 | 0.1231 | 0.1231 | 0.1231 | -1.0 | 0.0642 | 0.138 | -1.0 | -1.0 | 0.0371 | 0.0724 | 0.3946 | 0.3982 | 0.0 | 0.0 | 0.0047 | 0.0148 | 0.0 | 0.0 | 0.2077 | 0.2171 | 0.1382 | 0.1622 | 0.1728 | 0.2429 | 0.0 | 0.0 |
| 4.4526 | 45.0 | 40500 | 9.8171 | 0.1453 | 0.1663 | 0.1632 | -1.0 | 0.0629 | 0.1544 | 0.1564 | 0.1564 | 0.1564 | -1.0 | 0.0651 | 0.1699 | -1.0 | -1.0 | 0.0485 | 0.0984 | 0.5592 | 0.5709 | 0.0 | 0.0 | 0.016 | 0.0182 | 0.0 | 0.0 | 0.2109 | 0.2171 | 0.0845 | 0.1031 | 0.3888 | 0.4 | 0.0 | 0.0 |
| 4.3424 | 46.0 | 41400 | 10.0793 | 0.1458 | 0.1735 | 0.1595 | -1.0 | 0.0935 | 0.1615 | 0.1625 | 0.1625 | 0.1625 | -1.0 | 0.1036 | 0.1809 | 0.1148 | 0.1837 | 0.451 | 0.4691 | 0.0 | 0.0 | 0.0436 | 0.0625 | 0.0 | 0.0 | 0.2497 | 0.2589 | 0.1852 | 0.2194 | 0.2594 | 0.2571 | 0.0089 | 0.0116 |
| 4.2732 | 47.0 | 42300 | 9.5580 | 0.1741 | 0.2259 | 0.1939 | -1.0 | 0.1167 | 0.1966 | 0.1976 | 0.1976 | 0.1976 | -1.0 | 0.132 | 0.2317 | -1.0 | -1.0 | 0.2186 | 0.322 | 0.431 | 0.4436 | 0.0 | 0.0 | 0.0532 | 0.0795 | 0.0 | 0.0 | 0.3535 | 0.3744 | 0.2987 | 0.3255 | 0.1485 | 0.1429 | 0.0631 | 0.0901 |
| 4.2635 | 48.0 | 43200 | 9.5880 | 0.1969 | 0.24 | 0.2237 | -1.0 | 0.1303 | 0.2253 | 0.2167 | 0.2167 | 0.2167 | -1.0 | 0.1444 | 0.2487 | -1.0 | -1.0 | 0.1506 | 0.2341 | 0.5663 | 0.5764 | 0.0 | 0.0 | 0.0469 | 0.0716 | 0.0624 | 0.06 | 0.3414 | 0.362 | 0.4027 | 0.4306 | 0.1485 | 0.1429 | 0.053 | 0.0727 |
| 4.3473 | 49.0 | 44100 | 9.1821 | 0.2263 | 0.2787 | 0.2537 | -1.0 | 0.1357 | 0.2767 | 0.2491 | 0.2491 | 0.2491 | -1.0 | 0.1573 | 0.2973 | 0.3114 | 0.374 | 0.5264 | 0.5345 | 0.0 | 0.0 | 0.0276 | 0.0841 | 0.0624 | 0.06 | 0.3972 | 0.4194 | 0.3691 | 0.3959 | 0.2594 | 0.2571 | 0.0831 | 0.1165 |
| 4.2556 | 50.0 | 45000 | 10.2247 | 0.1449 | 0.1713 | 0.1604 | -1.0 | 0.0754 | 0.1615 | 0.1551 | 0.1551 | 0.1551 | -1.0 | 0.0787 | 0.1788 | -1.0 | -1.0 | 0.13 | 0.1772 | 0.3987 | 0.4 | 0.0 | 0.0 | 0.0194 | 0.033 | 0.0 | 0.0 | 0.3078 | 0.324 | 0.1868 | 0.198 | 0.2594 | 0.2571 | 0.002 | 0.0066 |
| 4.1977 | 51.0 | 45900 | 9.2216 | 0.1874 | 0.2343 | 0.2091 | -1.0 | 0.1337 | 0.2085 | 0.217 | 0.217 | 0.217 | -1.0 | 0.155 | 0.2429 | -1.0 | -1.0 | 0.1896 | 0.2862 | 0.4743 | 0.4764 | 0.0 | 0.0 | 0.0366 | 0.0841 | 0.0 | 0.0 | 0.3866 | 0.4062 | 0.3421 | 0.3755 | 0.1485 | 0.1429 | 0.1093 | 0.1818 |
| 4.1898 | 52.0 | 46800 | 9.7273 | 0.1163 | 0.1369 | 0.1346 | -1.0 | 0.0681 | 0.1338 | 0.1406 | 0.1406 | 0.1406 | -1.0 | 0.0695 | 0.1616 | 0.0517 | 0.0911 | 0.3446 | 0.3455 | 0.0 | 0.0 | 0.0337 | 0.0443 | 0.0 | 0.0 | 0.3858 | 0.4031 | 0.1734 | 0.2153 | 0.0495 | 0.1429 | 0.0081 | 0.0231 |
| 4.1393 | 53.0 | 47700 | 9.0779 | 0.2185 | 0.2708 | 0.2442 | -1.0 | 0.1417 | 0.241 | 0.2516 | 0.2516 | 0.2516 | -1.0 | 0.1619 | 0.2854 | -1.0 | -1.0 | 0.1238 | 0.2358 | 0.5532 | 0.5636 | 0.0 | 0.0 | 0.0498 | 0.0773 | 0.0 | 0.0 | 0.4332 | 0.4566 | 0.3604 | 0.4163 | 0.2594 | 0.2571 | 0.1867 | 0.2579 |
| 4.1674 | 54.0 | 48600 | 9.2724 | 0.2032 | 0.2494 | 0.2284 | -1.0 | 0.1114 | 0.2295 | 0.2268 | 0.2268 | 0.2268 | -1.0 | 0.1284 | 0.2619 | -1.0 | -1.0 | 0.1533 | 0.2528 | 0.4144 | 0.4164 | 0.0 | 0.0 | 0.0603 | 0.108 | 0.0 | 0.0 | 0.3506 | 0.369 | 0.3706 | 0.399 | 0.3795 | 0.3857 | 0.1001 | 0.1107 |
| 4.062 | 55.0 | 49500 | 9.8853 | 0.1788 | 0.2132 | 0.1987 | -1.0 | 0.0857 | 0.2054 | 0.2002 | 0.2002 | 0.2002 | -1.0 | 0.0915 | 0.2295 | 0.11 | 0.1537 | 0.4711 | 0.4709 | 0.0 | 0.0 | 0.0284 | 0.0523 | 0.0 | 0.0 | 0.3651 | 0.3876 | 0.27 | 0.2969 | 0.3208 | 0.3857 | 0.0442 | 0.0545 |
| 4.0447 | 56.0 | 50400 | 9.4406 | 0.1889 | 0.2299 | 0.2152 | -1.0 | 0.1054 | 0.2198 | 0.2189 | 0.2189 | 0.2189 | -1.0 | 0.1204 | 0.2565 | -1.0 | -1.0 | 0.2242 | 0.3065 | 0.6034 | 0.6145 | 0.0 | 0.0 | 0.0543 | 0.1136 | 0.0 | 0.0 | 0.3333 | 0.355 | 0.2405 | 0.2551 | 0.1809 | 0.2429 | 0.0632 | 0.0826 |
| 4.0525 | 57.0 | 51300 | 8.5313 | 0.279 | 0.3513 | 0.3148 | -1.0 | 0.1617 | 0.3331 | 0.3197 | 0.3197 | 0.3197 | -1.0 | 0.1869 | 0.3819 | -1.0 | -1.0 | 0.2223 | 0.3358 | 0.6514 | 0.6855 | 0.0 | 0.0 | 0.0807 | 0.125 | 0.0693 | 0.0667 | 0.4394 | 0.4922 | 0.4583 | 0.5092 | 0.3218 | 0.3714 | 0.2681 | 0.2917 |
| 3.9868 | 58.0 | 52200 | 9.4776 | 0.2031 | 0.2472 | 0.2321 | -1.0 | 0.0976 | 0.2392 | 0.2349 | 0.2349 | 0.2349 | -1.0 | 0.1079 | 0.2783 | -1.0 | -1.0 | 0.1263 | 0.1813 | 0.5434 | 0.5636 | 0.0 | 0.0 | 0.0305 | 0.0591 | 0.0 | 0.0 | 0.4666 | 0.4977 | 0.3126 | 0.3653 | 0.3045 | 0.3857 | 0.0436 | 0.0612 |
| 3.985 | 59.0 | 53100 | 9.0269 | 0.2231 | 0.2748 | 0.2558 | -1.0 | 0.1278 | 0.2574 | 0.2467 | 0.2467 | 0.2467 | -1.0 | 0.1432 | 0.2854 | 0.181 | 0.2447 | 0.5378 | 0.5455 | 0.0 | 0.0 | 0.0371 | 0.0795 | 0.0 | 0.0 | 0.4728 | 0.5039 | 0.4175 | 0.4388 | 0.2594 | 0.2571 | 0.1024 | 0.1504 |
| 3.9343 | 60.0 | 54000 | 8.8573 | 0.2573 | 0.3279 | 0.2847 | -1.0 | 0.1713 | 0.2993 | 0.2878 | 0.2878 | 0.2878 | -1.0 | 0.1934 | 0.3424 | -1.0 | -1.0 | 0.2269 | 0.3415 | 0.5622 | 0.5782 | 0.0 | 0.0 | 0.0949 | 0.142 | 0.1178 | 0.1133 | 0.436 | 0.4589 | 0.403 | 0.4418 | 0.2594 | 0.2571 | 0.2153 | 0.257 |
| 3.8614 | 61.0 | 54900 | 8.9312 | 0.2245 | 0.2794 | 0.2618 | -1.0 | 0.1398 | 0.2543 | 0.2579 | 0.2579 | 0.2579 | -1.0 | 0.1603 | 0.2941 | -1.0 | -1.0 | 0.1612 | 0.239 | 0.5292 | 0.5436 | 0.0 | 0.0 | 0.0557 | 0.0682 | 0.0 | 0.0 | 0.5101 | 0.5426 | 0.4532 | 0.4969 | 0.1977 | 0.2429 | 0.1136 | 0.1876 |
| 3.8713 | 62.0 | 55800 | 10.4252 | 0.1653 | 0.1981 | 0.1872 | -1.0 | 0.118 | 0.1725 | 0.1811 | 0.1811 | 0.1811 | -1.0 | 0.1276 | 0.1868 | -1.0 | -1.0 | 0.1148 | 0.1642 | 0.404 | 0.3982 | 0.0 | 0.0 | 0.0452 | 0.0682 | 0.0 | 0.0 | 0.3502 | 0.3659 | 0.3017 | 0.3418 | 0.2594 | 0.2571 | 0.0126 | 0.0347 |
| 3.8619 | 63.0 | 56700 | 9.7946 | 0.198 | 0.2331 | 0.2177 | -1.0 | 0.0805 | 0.2209 | 0.2354 | 0.2354 | 0.2354 | -1.0 | 0.0869 | 0.2635 | -1.0 | -1.0 | 0.1488 | 0.2016 | 0.6364 | 0.6618 | 0.0 | 0.0 | 0.022 | 0.0284 | 0.0 | 0.0 | 0.4538 | 0.469 | 0.2603 | 0.298 | 0.245 | 0.3857 | 0.016 | 0.0744 |
| 3.7579 | 64.0 | 57600 | 9.9298 | 0.1695 | 0.2009 | 0.1844 | -1.0 | 0.0654 | 0.184 | 0.187 | 0.187 | 0.187 | -1.0 | 0.0749 | 0.2051 | -1.0 | -1.0 | 0.1455 | 0.2163 | 0.6024 | 0.6182 | 0.0 | 0.0 | 0.0542 | 0.0648 | 0.0 | 0.0 | 0.2904 | 0.2992 | 0.1685 | 0.199 | 0.2594 | 0.2571 | 0.0049 | 0.0281 |
| 3.8098 | 65.0 | 58500 | 9.9744 | 0.1621 | 0.1939 | 0.1829 | -1.0 | 0.1164 | 0.1675 | 0.1887 | 0.1887 | 0.1887 | -1.0 | 0.1267 | 0.196 | -1.0 | -1.0 | 0.1121 | 0.1593 | 0.4968 | 0.5236 | 0.0 | 0.0 | 0.0813 | 0.0898 | 0.0 | 0.0 | 0.3013 | 0.3147 | 0.1916 | 0.2286 | 0.2132 | 0.2429 | 0.0624 | 0.1397 |
| 3.7683 | 66.0 | 59400 | 8.7909 | 0.2397 | 0.2972 | 0.2721 | -1.0 | 0.1646 | 0.2569 | 0.2918 | 0.2918 | 0.2918 | -1.0 | 0.1961 | 0.321 | -1.0 | -1.0 | 0.204 | 0.3154 | 0.5382 | 0.5491 | 0.0 | 0.0 | 0.0935 | 0.1682 | 0.0 | 0.0 | 0.4722 | 0.4977 | 0.3572 | 0.4204 | 0.3315 | 0.3857 | 0.1609 | 0.2893 |
| 3.7267 | 67.0 | 60300 | 10.1599 | 0.1421 | 0.1752 | 0.1592 | -1.0 | 0.1203 | 0.1417 | 0.1724 | 0.1724 | 0.1724 | -1.0 | 0.1422 | 0.1699 | -1.0 | -1.0 | 0.1458 | 0.2203 | 0.404 | 0.4109 | 0.0 | 0.0 | 0.0483 | 0.0807 | 0.0 | 0.0 | 0.2656 | 0.2736 | 0.133 | 0.151 | 0.1563 | 0.2429 | 0.1256 | 0.1719 |
| 3.7199 | 68.0 | 61200 | 8.8641 | 0.2725 | 0.3427 | 0.3062 | -1.0 | 0.165 | 0.3088 | 0.3302 | 0.3302 | 0.3302 | -1.0 | 0.1866 | 0.3821 | -1.0 | -1.0 | 0.192 | 0.3049 | 0.6518 | 0.6764 | 0.0 | 0.0 | 0.0653 | 0.1045 | 0.0 | 0.0 | 0.4767 | 0.5 | 0.4832 | 0.5418 | 0.3478 | 0.5286 | 0.2359 | 0.3157 |
| 3.6778 | 69.0 | 62100 | 9.6715 | 0.2081 | 0.2582 | 0.2364 | -1.0 | 0.1571 | 0.2137 | 0.239 | 0.239 | 0.239 | -1.0 | 0.1809 | 0.2499 | -1.0 | -1.0 | 0.1625 | 0.2789 | 0.4731 | 0.4855 | 0.0 | 0.0 | 0.0654 | 0.0875 | 0.0 | 0.0 | 0.4162 | 0.4372 | 0.2975 | 0.3337 | 0.2594 | 0.2571 | 0.199 | 0.2711 |
| 3.6991 | 70.0 | 63000 | 9.6208 | 0.1455 | 0.1812 | 0.1655 | -1.0 | 0.1164 | 0.1521 | 0.1695 | 0.1695 | 0.1695 | -1.0 | 0.1272 | 0.1814 | -1.0 | -1.0 | 0.1428 | 0.2073 | 0.33 | 0.3382 | 0.0 | 0.0 | 0.0657 | 0.0818 | 0.0 | 0.0 | 0.4265 | 0.4465 | 0.1722 | 0.2071 | 0.1188 | 0.1143 | 0.0533 | 0.1306 |
| 3.6509 | 71.0 | 63900 | 9.7932 | 0.1775 | 0.2123 | 0.1978 | -1.0 | 0.1122 | 0.1929 | 0.2024 | 0.2024 | 0.2024 | -1.0 | 0.1233 | 0.2224 | -1.0 | -1.0 | 0.1002 | 0.1545 | 0.547 | 0.5764 | 0.0 | 0.0 | 0.0489 | 0.0636 | 0.0 | 0.0 | 0.44 | 0.4612 | 0.1909 | 0.2143 | 0.1485 | 0.1429 | 0.1221 | 0.2091 |
| 3.6655 | 72.0 | 64800 | 9.8964 | 0.1653 | 0.21 | 0.1912 | -1.0 | 0.1433 | 0.1751 | 0.1946 | 0.1946 | 0.1946 | -1.0 | 0.1648 | 0.2115 | -1.0 | -1.0 | 0.1848 | 0.278 | 0.1953 | 0.2236 | 0.0 | 0.0 | 0.0765 | 0.117 | 0.0 | 0.0 | 0.3586 | 0.3721 | 0.2494 | 0.2765 | 0.2594 | 0.2571 | 0.164 | 0.2273 |
| 3.614 | 73.0 | 65700 | 9.1561 | 0.2011 | 0.2487 | 0.2256 | -1.0 | 0.1361 | 0.2228 | 0.2284 | 0.2284 | 0.2284 | -1.0 | 0.1516 | 0.2567 | -1.0 | -1.0 | 0.1439 | 0.213 | 0.428 | 0.4491 | 0.0 | 0.0 | 0.042 | 0.0943 | 0.0 | 0.0 | 0.4314 | 0.4543 | 0.3205 | 0.3459 | 0.2594 | 0.2571 | 0.1846 | 0.2421 |
| 3.5871 | 74.0 | 66600 | 8.8299 | 0.2385 | 0.3006 | 0.2697 | -1.0 | 0.136 | 0.2656 | 0.2824 | 0.2824 | 0.2824 | -1.0 | 0.1616 | 0.3206 | -1.0 | -1.0 | 0.1886 | 0.2894 | 0.583 | 0.6109 | 0.0 | 0.0 | 0.0675 | 0.1352 | 0.0 | 0.0 | 0.4898 | 0.5194 | 0.287 | 0.3184 | 0.3092 | 0.3714 | 0.2208 | 0.2967 |
| 3.5323 | 75.0 | 67500 | 9.1346 | 0.2164 | 0.2716 | 0.2464 | -1.0 | 0.1572 | 0.2322 | 0.2676 | 0.2676 | 0.2676 | -1.0 | 0.1824 | 0.2938 | -1.0 | -1.0 | 0.1771 | 0.2837 | 0.5506 | 0.6 | 0.0 | 0.0 | 0.0665 | 0.1114 | 0.0 | 0.0 | 0.5254 | 0.5512 | 0.3222 | 0.3582 | 0.1258 | 0.2429 | 0.1804 | 0.2612 |
| 3.5715 | 76.0 | 68400 | 9.0672 | 0.1969 | 0.2543 | 0.2269 | -1.0 | 0.161 | 0.2057 | 0.2291 | 0.2291 | 0.2291 | -1.0 | 0.1847 | 0.245 | -1.0 | -1.0 | 0.1891 | 0.2886 | 0.2864 | 0.3182 | 0.0 | 0.0 | 0.0718 | 0.1023 | 0.0 | 0.0 | 0.4419 | 0.4643 | 0.3133 | 0.348 | 0.2594 | 0.2571 | 0.2101 | 0.2835 |
| 3.5523 | 77.0 | 69300 | 9.2915 | 0.1925 | 0.2454 | 0.2208 | -1.0 | 0.1451 | 0.207 | 0.2196 | 0.2196 | 0.2196 | -1.0 | 0.1599 | 0.243 | -1.0 | -1.0 | 0.2282 | 0.3057 | 0.3223 | 0.3382 | 0.0 | 0.0 | 0.0659 | 0.0977 | 0.0 | 0.0 | 0.3725 | 0.393 | 0.3095 | 0.3255 | 0.2594 | 0.2571 | 0.175 | 0.2595 |
| 3.5193 | 78.0 | 70200 | 9.0734 | 0.205 | 0.2597 | 0.2407 | -1.0 | 0.1565 | 0.2278 | 0.2399 | 0.2399 | 0.2399 | -1.0 | 0.1779 | 0.2736 | -1.0 | -1.0 | 0.1832 | 0.2886 | 0.453 | 0.4855 | 0.0 | 0.0 | 0.0593 | 0.1 | 0.0 | 0.0 | 0.4275 | 0.4465 | 0.3793 | 0.4061 | 0.1188 | 0.1143 | 0.2236 | 0.3182 |
| 3.4495 | 79.0 | 71100 | 10.0510 | 0.1351 | 0.1747 | 0.1584 | -1.0 | 0.1208 | 0.1439 | 0.1489 | 0.1489 | 0.1489 | -1.0 | 0.1345 | 0.1585 | 0.1153 | 0.1813 | 0.0879 | 0.0836 | 0.0 | 0.0 | 0.0499 | 0.0591 | 0.0 | 0.0 | 0.2831 | 0.293 | 0.2206 | 0.2276 | 0.2594 | 0.2571 | 0.2 | 0.238 |
| 3.455 | 80.0 | 72000 | 9.5833 | 0.1653 | 0.2049 | 0.1854 | -1.0 | 0.1352 | 0.1728 | 0.197 | 0.197 | 0.197 | -1.0 | 0.151 | 0.2099 | -1.0 | -1.0 | 0.1233 | 0.2252 | 0.3726 | 0.3927 | 0.0 | 0.0 | 0.0435 | 0.0591 | 0.0 | 0.0 | 0.2698 | 0.2791 | 0.1893 | 0.1929 | 0.2871 | 0.3857 | 0.2021 | 0.238 |
| 3.4406 | 81.0 | 72900 | 9.7269 | 0.1534 | 0.1879 | 0.1779 | -1.0 | 0.1228 | 0.1579 | 0.1668 | 0.1668 | 0.1668 | -1.0 | 0.1314 | 0.1733 | 0.119 | 0.1732 | 0.3015 | 0.3182 | 0.0 | 0.0 | 0.0489 | 0.0523 | 0.0 | 0.0 | 0.3038 | 0.3171 | 0.1879 | 0.1898 | 0.2594 | 0.2571 | 0.1597 | 0.1934 |
| 3.4391 | 82.0 | 73800 | 9.4147 | 0.169 | 0.2173 | 0.2035 | -1.0 | 0.1504 | 0.1803 | 0.1936 | 0.1937 | 0.1937 | -1.0 | 0.1643 | 0.2128 | -1.0 | -1.0 | 0.1551 | 0.2407 | 0.2475 | 0.2655 | 0.0 | 0.0 | 0.0596 | 0.0716 | 0.0 | 0.0 | 0.3349 | 0.3512 | 0.2202 | 0.2347 | 0.2371 | 0.2429 | 0.2662 | 0.3372 |
| 3.4209 | 83.0 | 74700 | 9.6021 | 0.1496 | 0.1985 | 0.1746 | -1.0 | 0.1504 | 0.1607 | 0.1706 | 0.1706 | 0.1706 | -1.0 | 0.1682 | 0.183 | 0.1401 | 0.2293 | 0.2354 | 0.2455 | 0.0 | 0.0 | 0.0535 | 0.0795 | 0.0 | 0.0 | 0.2488 | 0.2566 | 0.2304 | 0.2469 | 0.1188 | 0.1143 | 0.3197 | 0.3636 |
| 3.4073 | 84.0 | 75600 | 9.0867 | 0.1865 | 0.2371 | 0.2167 | -1.0 | 0.1441 | 0.2076 | 0.2148 | 0.2148 | 0.2148 | -1.0 | 0.1619 | 0.2416 | -1.0 | -1.0 | 0.117 | 0.2138 | 0.3945 | 0.42 | 0.0 | 0.0 | 0.0567 | 0.0909 | 0.0 | 0.0 | 0.3661 | 0.3822 | 0.2991 | 0.3163 | 0.1337 | 0.1286 | 0.3112 | 0.381 |
| 3.389 | 85.0 | 76500 | 9.4807 | 0.1768 | 0.2228 | 0.208 | -1.0 | 0.1415 | 0.1947 | 0.2009 | 0.2009 | 0.2009 | -1.0 | 0.1564 | 0.2257 | 0.1146 | 0.2138 | 0.203 | 0.2091 | 0.0 | 0.0 | 0.0497 | 0.0716 | 0.0 | 0.0 | 0.4156 | 0.4364 | 0.2752 | 0.2878 | 0.2594 | 0.2571 | 0.2738 | 0.3322 |
| 3.3456 | 86.0 | 77400 | 9.0557 | 0.2022 | 0.2525 | 0.2352 | -1.0 | 0.1575 | 0.2134 | 0.2282 | 0.2282 | 0.2282 | -1.0 | 0.1775 | 0.2422 | 0.127 | 0.226 | 0.4286 | 0.4455 | 0.0 | 0.0 | 0.0616 | 0.0932 | 0.0 | 0.0 | 0.3986 | 0.4217 | 0.2707 | 0.2908 | 0.2594 | 0.2571 | 0.2742 | 0.3198 |
| 3.3397 | 87.0 | 78300 | 9.3463 | 0.161 | 0.2092 | 0.1875 | -1.0 | 0.1475 | 0.1714 | 0.1976 | 0.1979 | 0.1979 | -1.0 | 0.1743 | 0.212 | 0.1265 | 0.2976 | 0.266 | 0.2818 | 0.0 | 0.0 | 0.0482 | 0.0864 | 0.0 | 0.0 | 0.2308 | 0.262 | 0.2237 | 0.248 | 0.2446 | 0.2571 | 0.3093 | 0.3479 |
| 3.3091 | 88.0 | 79200 | 8.8186 | 0.2121 | 0.2705 | 0.2469 | -1.0 | 0.1642 | 0.2347 | 0.2561 | 0.2561 | 0.2561 | -1.0 | 0.1892 | 0.2862 | 0.1803 | 0.3463 | 0.5122 | 0.5327 | 0.0 | 0.0 | 0.0572 | 0.1057 | 0.0 | 0.0 | 0.346 | 0.3597 | 0.2837 | 0.3071 | 0.178 | 0.2429 | 0.3515 | 0.4107 |
| 3.2853 | 89.0 | 80100 | 9.6335 | 0.17 | 0.2189 | 0.196 | -1.0 | 0.1388 | 0.1855 | 0.1947 | 0.1947 | 0.1947 | -1.0 | 0.1572 | 0.2136 | 0.166 | 0.2724 | 0.1945 | 0.2109 | 0.0 | 0.0 | 0.0618 | 0.0909 | 0.0 | 0.0 | 0.3438 | 0.3589 | 0.2247 | 0.2347 | 0.2594 | 0.2571 | 0.2797 | 0.3273 |
| 3.2441 | 90.0 | 81000 | 8.9347 | 0.212 | 0.2747 | 0.2508 | -1.0 | 0.156 | 0.241 | 0.2679 | 0.2681 | 0.2681 | -1.0 | 0.1794 | 0.3129 | -1.0 | -1.0 | 0.1655 | 0.326 | 0.3231 | 0.3673 | 0.0 | 0.0 | 0.0552 | 0.1 | 0.0 | 0.0 | 0.4246 | 0.4519 | 0.3288 | 0.3592 | 0.259 | 0.3857 | 0.3515 | 0.4231 |
| 3.2649 | 91.0 | 81900 | 9.2574 | 0.1865 | 0.2348 | 0.2174 | -1.0 | 0.1437 | 0.2068 | 0.2119 | 0.212 | 0.212 | -1.0 | 0.1631 | 0.2348 | 0.1361 | 0.2374 | 0.2932 | 0.3127 | 0.0 | 0.0 | 0.0514 | 0.0818 | 0.0 | 0.0 | 0.3714 | 0.3876 | 0.2742 | 0.2867 | 0.2594 | 0.2571 | 0.2927 | 0.3446 |
| 3.2003 | 92.0 | 82800 | 9.0196 | 0.2008 | 0.2581 | 0.2375 | -1.0 | 0.1537 | 0.2312 | 0.2323 | 0.2323 | 0.2323 | -1.0 | 0.1726 | 0.2709 | 0.1416 | 0.278 | 0.3877 | 0.4036 | 0.0 | 0.0 | 0.051 | 0.0864 | 0.0 | 0.0 | 0.4113 | 0.4333 | 0.3422 | 0.3673 | 0.1188 | 0.1143 | 0.3548 | 0.4074 |
| 3.2487 | 93.0 | 83700 | 9.3034 | 0.189 | 0.247 | 0.2236 | -1.0 | 0.1452 | 0.2184 | 0.2212 | 0.2212 | 0.2212 | -1.0 | 0.1653 | 0.2598 | 0.145 | 0.2789 | 0.2808 | 0.2982 | 0.0 | 0.0 | 0.0651 | 0.1114 | 0.0 | 0.0 | 0.4198 | 0.4411 | 0.3158 | 0.3388 | 0.1188 | 0.1143 | 0.3559 | 0.4083 |
| 3.2358 | 94.0 | 84600 | 9.0121 | 0.1913 | 0.2486 | 0.2262 | -1.0 | 0.1612 | 0.2164 | 0.224 | 0.224 | 0.224 | -1.0 | 0.1794 | 0.261 | 0.1607 | 0.3016 | 0.3036 | 0.3182 | 0.0 | 0.0 | 0.0677 | 0.1136 | 0.0 | 0.0 | 0.3968 | 0.4194 | 0.312 | 0.3286 | 0.1188 | 0.1143 | 0.3619 | 0.4207 |
| 3.2117 | 95.0 | 85500 | 9.3648 | 0.1753 | 0.2306 | 0.2071 | -1.0 | 0.1642 | 0.1976 | 0.207 | 0.207 | 0.207 | -1.0 | 0.1891 | 0.2329 | 0.1731 | 0.3057 | 0.2413 | 0.2473 | 0.0 | 0.0 | 0.0502 | 0.1057 | 0.0 | 0.0 | 0.3482 | 0.362 | 0.2897 | 0.3061 | 0.1188 | 0.1143 | 0.356 | 0.4215 |
| 3.2056 | 96.0 | 86400 | 9.3275 | 0.184 | 0.2436 | 0.2132 | -1.0 | 0.1546 | 0.2111 | 0.2224 | 0.2225 | 0.2225 | -1.0 | 0.1743 | 0.2547 | 0.168 | 0.2976 | 0.2201 | 0.2291 | 0.0 | 0.0 | 0.0492 | 0.0864 | 0.0 | 0.0 | 0.382 | 0.4047 | 0.337 | 0.3541 | 0.1661 | 0.2429 | 0.3335 | 0.3876 |
| 3.1736 | 97.0 | 87300 | 9.2785 | 0.1761 | 0.2313 | 0.208 | -1.0 | 0.1516 | 0.204 | 0.2101 | 0.2104 | 0.2104 | -1.0 | 0.1681 | 0.2506 | -1.0 | -1.0 | 0.1519 | 0.2691 | 0.2109 | 0.2436 | 0.0 | 0.0 | 0.0584 | 0.1045 | 0.0 | 0.0 | 0.3568 | 0.3837 | 0.3109 | 0.3357 | 0.1188 | 0.1143 | 0.3767 | 0.4421 |
| 3.1516 | 98.0 | 88200 | 9.1423 | 0.1878 | 0.2444 | 0.2213 | -1.0 | 0.1493 | 0.2108 | 0.2164 | 0.2166 | 0.2166 | -1.0 | 0.1666 | 0.2485 | -1.0 | -1.0 | 0.1642 | 0.2707 | 0.2692 | 0.2745 | 0.0 | 0.0 | 0.0568 | 0.1023 | 0.0 | 0.0 | 0.3523 | 0.3783 | 0.2575 | 0.2735 | 0.2371 | 0.2429 | 0.3533 | 0.4074 |
| 3.1589 | 99.0 | 89100 | 9.2758 | 0.1966 | 0.2583 | 0.2323 | -1.0 | 0.1694 | 0.2291 | 0.2274 | 0.2274 | 0.2274 | -1.0 | 0.1886 | 0.267 | -1.0 | -1.0 | 0.2091 | 0.3252 | 0.2661 | 0.2818 | 0.0 | 0.0 | 0.0629 | 0.1023 | 0.0 | 0.0 | 0.3853 | 0.407 | 0.3418 | 0.3663 | 0.1188 | 0.1143 | 0.3857 | 0.4496 |
| 3.1424 | 100.0 | 90000 | 9.6651 | 0.1641 | 0.2165 | 0.1939 | -1.0 | 0.1559 | 0.1813 | 0.1905 | 0.1905 | 0.1905 | -1.0 | 0.1714 | 0.2155 | -1.0 | -1.0 | 0.1715 | 0.2683 | 0.1911 | 0.2109 | 0.0 | 0.0 | 0.0576 | 0.0852 | 0.0 | 0.0 | 0.3186 | 0.3357 | 0.283 | 0.3112 | 0.1188 | 0.1143 | 0.3363 | 0.3893 |
| 3.0981 | 101.0 | 90900 | 9.2337 | 0.19 | 0.248 | 0.2244 | -1.0 | 0.164 | 0.2137 | 0.2299 | 0.2302 | 0.2302 | -1.0 | 0.1887 | 0.2625 | -1.0 | -1.0 | 0.1621 | 0.3098 | 0.1972 | 0.2291 | 0.0 | 0.0 | 0.0637 | 0.1261 | 0.0 | 0.0 | 0.3354 | 0.3698 | 0.3019 | 0.3378 | 0.2594 | 0.2571 | 0.3901 | 0.4421 |
| 3.0978 | 102.0 | 91800 | 8.8596 | 0.2145 | 0.2814 | 0.252 | -1.0 | 0.1901 | 0.2396 | 0.2516 | 0.252 | 0.252 | -1.0 | 0.2153 | 0.2832 | -1.0 | -1.0 | 0.1882 | 0.3041 | 0.2823 | 0.3018 | 0.0 | 0.0 | 0.0632 | 0.1341 | 0.0624 | 0.06 | 0.4214 | 0.4643 | 0.3736 | 0.4133 | 0.1188 | 0.1143 | 0.4205 | 0.476 |
| 3.0633 | 103.0 | 92700 | 8.8539 | 0.2087 | 0.2734 | 0.2512 | -1.0 | 0.1727 | 0.2422 | 0.247 | 0.2473 | 0.2473 | -1.0 | 0.196 | 0.2916 | -1.0 | -1.0 | 0.2013 | 0.3447 | 0.2797 | 0.3018 | 0.0 | 0.0 | 0.0678 | 0.1261 | 0.0 | 0.0 | 0.4138 | 0.4581 | 0.4035 | 0.4398 | 0.1188 | 0.1143 | 0.3935 | 0.4405 |
| 3.0413 | 104.0 | 93600 | 8.6098 | 0.2477 | 0.3234 | 0.2979 | -1.0 | 0.1961 | 0.2835 | 0.286 | 0.2863 | 0.2863 | -1.0 | 0.2206 | 0.3329 | -1.0 | -1.0 | 0.2421 | 0.3691 | 0.4016 | 0.4236 | 0.0 | 0.0 | 0.0789 | 0.1477 | 0.0554 | 0.0533 | 0.4569 | 0.4977 | 0.4375 | 0.4755 | 0.1188 | 0.1143 | 0.4379 | 0.495 |
| 3.0627 | 105.0 | 94500 | 9.0454 | 0.2242 | 0.2876 | 0.262 | -1.0 | 0.1644 | 0.26 | 0.2572 | 0.2572 | 0.2572 | -1.0 | 0.1859 | 0.2985 | 0.1921 | 0.3146 | 0.351 | 0.3673 | 0.0 | 0.0 | 0.0618 | 0.1182 | 0.0 | 0.0 | 0.408 | 0.4333 | 0.3617 | 0.3939 | 0.2594 | 0.2571 | 0.384 | 0.4306 |
| 3.0311 | 106.0 | 95400 | 8.5642 | 0.264 | 0.3394 | 0.3099 | -1.0 | 0.1957 | 0.3008 | 0.302 | 0.3021 | 0.3021 | -1.0 | 0.2235 | 0.3483 | 0.2405 | 0.3789 | 0.3999 | 0.4145 | 0.0 | 0.0 | 0.072 | 0.1364 | 0.0624 | 0.06 | 0.4704 | 0.5163 | 0.4218 | 0.451 | 0.2594 | 0.2571 | 0.4499 | 0.505 |
| 3.0275 | 107.0 | 96300 | 8.7584 | 0.2422 | 0.3099 | 0.2843 | -1.0 | 0.1623 | 0.2867 | 0.2792 | 0.2793 | 0.2793 | -1.0 | 0.1879 | 0.3312 | 0.201 | 0.339 | 0.3975 | 0.4145 | 0.0 | 0.0 | 0.0732 | 0.1375 | 0.0 | 0.0 | 0.438 | 0.4791 | 0.3918 | 0.4214 | 0.2594 | 0.2571 | 0.4191 | 0.4653 |
| 2.9986 | 108.0 | 97200 | 8.8765 | 0.2343 | 0.3021 | 0.2755 | -1.0 | 0.1788 | 0.2655 | 0.2691 | 0.2692 | 0.2692 | -1.0 | 0.2055 | 0.3037 | 0.2017 | 0.3293 | 0.3655 | 0.38 | 0.0 | 0.0 | 0.0669 | 0.1284 | 0.0 | 0.0 | 0.4343 | 0.4628 | 0.3651 | 0.3969 | 0.2594 | 0.2571 | 0.4156 | 0.4686 |
| 2.9829 | 109.0 | 98100 | 8.8965 | 0.2281 | 0.2935 | 0.2683 | -1.0 | 0.1711 | 0.2631 | 0.2646 | 0.2649 | 0.2649 | -1.0 | 0.1935 | 0.3099 | 0.192 | 0.3447 | 0.3279 | 0.3491 | 0.0 | 0.0 | 0.0709 | 0.1205 | 0.0 | 0.0 | 0.4205 | 0.4504 | 0.3721 | 0.4051 | 0.2594 | 0.2571 | 0.4103 | 0.457 |
| 2.9701 | 110.0 | 99000 | 8.8118 | 0.2344 | 0.3012 | 0.2761 | -1.0 | 0.174 | 0.2694 | 0.2716 | 0.2717 | 0.2717 | -1.0 | 0.1962 | 0.3161 | 0.1832 | 0.3236 | 0.3535 | 0.38 | 0.0 | 0.0 | 0.0694 | 0.1273 | 0.0 | 0.0 | 0.4355 | 0.4659 | 0.4009 | 0.4347 | 0.2594 | 0.2571 | 0.4078 | 0.457 |
| 2.953 | 111.0 | 99900 | 8.8947 | 0.2303 | 0.2986 | 0.273 | -1.0 | 0.1721 | 0.2658 | 0.268 | 0.2684 | 0.2684 | -1.0 | 0.1943 | 0.3124 | 0.1789 | 0.3154 | 0.3476 | 0.3673 | 0.0 | 0.0 | 0.0666 | 0.1261 | 0.0 | 0.0 | 0.4472 | 0.4806 | 0.3824 | 0.4143 | 0.2446 | 0.2571 | 0.4059 | 0.4545 |
| 2.9259 | 112.0 | 100800 | 8.8265 | 0.2332 | 0.3019 | 0.274 | -1.0 | 0.1637 | 0.2746 | 0.269 | 0.2694 | 0.2694 | -1.0 | 0.1857 | 0.3199 | 0.2121 | 0.3439 | 0.3501 | 0.3673 | 0.0 | 0.0 | 0.0684 | 0.1284 | 0.0 | 0.0 | 0.4288 | 0.4643 | 0.3591 | 0.3888 | 0.2594 | 0.2571 | 0.4208 | 0.4744 |
| 2.9515 | 113.0 | 101700 | 9.0842 | 0.2178 | 0.2828 | 0.2573 | -1.0 | 0.1701 | 0.2505 | 0.2493 | 0.2495 | 0.2495 | -1.0 | 0.1933 | 0.2861 | 0.1957 | 0.3163 | 0.2567 | 0.2655 | 0.0 | 0.0 | 0.0695 | 0.1193 | 0.0 | 0.0 | 0.4211 | 0.4504 | 0.3596 | 0.3878 | 0.2594 | 0.2571 | 0.3985 | 0.4488 |
| 2.9245 | 114.0 | 102600 | 8.9998 | 0.2242 | 0.2932 | 0.2672 | -1.0 | 0.1679 | 0.2626 | 0.2583 | 0.2585 | 0.2585 | -1.0 | 0.1904 | 0.3046 | 0.2053 | 0.3366 | 0.2567 | 0.2655 | 0.0 | 0.0 | 0.0692 | 0.1182 | 0.0 | 0.0 | 0.4168 | 0.455 | 0.4047 | 0.4367 | 0.2594 | 0.2571 | 0.4061 | 0.457 |
| 2.8905 | 115.0 | 103500 | 8.8322 | 0.2384 | 0.3093 | 0.2825 | -1.0 | 0.1688 | 0.2805 | 0.277 | 0.2772 | 0.2772 | -1.0 | 0.1946 | 0.328 | 0.2081 | 0.3602 | 0.3419 | 0.3655 | 0.0 | 0.0 | 0.0688 | 0.1364 | 0.0 | 0.0 | 0.437 | 0.469 | 0.4001 | 0.4306 | 0.2594 | 0.2571 | 0.4306 | 0.476 |
| 2.9102 | 116.0 | 104400 | 8.9404 | 0.2228 | 0.2887 | 0.2616 | -1.0 | 0.1648 | 0.2628 | 0.2581 | 0.2584 | 0.2584 | -1.0 | 0.1873 | 0.3033 | 0.1968 | 0.3228 | 0.2853 | 0.3018 | 0.0 | 0.0 | 0.0627 | 0.1261 | 0.0 | 0.0 | 0.4243 | 0.4566 | 0.3693 | 0.4031 | 0.2594 | 0.2571 | 0.407 | 0.4579 |
| 2.8889 | 117.0 | 105300 | 8.9507 | 0.2298 | 0.2987 | 0.2726 | -1.0 | 0.1689 | 0.2675 | 0.2636 | 0.2639 | 0.2639 | -1.0 | 0.1915 | 0.3069 | 0.2044 | 0.326 | 0.3122 | 0.3309 | 0.0 | 0.0 | 0.0685 | 0.1284 | 0.0 | 0.0 | 0.4283 | 0.462 | 0.3953 | 0.4214 | 0.2594 | 0.2571 | 0.4006 | 0.4488 |
| 2.8859 | 118.0 | 106200 | 8.9731 | 0.2261 | 0.2935 | 0.268 | -1.0 | 0.1659 | 0.2642 | 0.2591 | 0.2594 | 0.2594 | -1.0 | 0.1885 | 0.3039 | 0.1984 | 0.3228 | 0.2985 | 0.3127 | 0.0 | 0.0 | 0.067 | 0.125 | 0.0 | 0.0 | 0.4224 | 0.4558 | 0.3867 | 0.4122 | 0.2594 | 0.2571 | 0.4027 | 0.4488 |
| 2.8696 | 119.0 | 107100 | 8.9422 | 0.2248 | 0.2919 | 0.2647 | -1.0 | 0.1645 | 0.2632 | 0.2575 | 0.2577 | 0.2577 | -1.0 | 0.1867 | 0.3002 | 0.2017 | 0.3154 | 0.2861 | 0.3018 | 0.0 | 0.0 | 0.0631 | 0.1193 | 0.0 | 0.0 | 0.4303 | 0.4667 | 0.3836 | 0.4133 | 0.2594 | 0.2571 | 0.3991 | 0.4455 |
| 2.8514 | 120.0 | 108000 | 8.9632 | 0.2254 | 0.292 | 0.2645 | -1.0 | 0.1667 | 0.2632 | 0.2582 | 0.2584 | 0.2584 | -1.0 | 0.1908 | 0.3001 | 0.2041 | 0.3325 | 0.2904 | 0.3018 | 0.0 | 0.0 | 0.0647 | 0.1193 | 0.0 | 0.0 | 0.4248 | 0.462 | 0.3882 | 0.4143 | 0.2594 | 0.2571 | 0.3973 | 0.4388 |
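The long results table makes it hard to spot the strongest checkpoint at a glance. As a reading aid only (not part of the training code), the sketch below scans a few `(epoch, validation loss)` pairs transcribed from the rows above and reports the minimum:

```python
# A handful of (epoch, validation loss) pairs transcribed from the table above.
results = [
    (57, 8.5313),
    (104, 8.6098),
    (106, 8.5642),
    (115, 8.8322),
    (120, 8.9632),
]

# Pick the epoch with the lowest validation loss.
best_epoch, best_loss = min(results, key=lambda pair: pair[1])
print(f"best epoch: {best_epoch}, validation loss: {best_loss}")
```

Among the rows shown, epoch 57 has the lowest validation loss and also the highest overall mAP (0.279).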
### Framework versions
- Transformers 4.53.0.dev0
- Pytorch 2.5.1
- Datasets 3.2.0
- Tokenizers 0.21.1
[
  "ampulla of vater",
  "angiectasia",
  "blood - fresh",
  "blood - hematin",
  "erosion",
  "erythema",
  "foreign body",
  "ileocecal valve",
  "lymphangiectasia",
  "normal clean mucosa",
  "polyp",
  "pylorus",
  "reduced mucosal view",
  "ulcer"
]
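The class names above can be turned into the `id2label`/`label2id` dictionaries that DETR-style configs in Transformers expect. A minimal sketch, assuming the labels are indexed in the order listed:

```python
labels = [
    "ampulla of vater",
    "angiectasia",
    "blood - fresh",
    "blood - hematin",
    "erosion",
    "erythema",
    "foreign body",
    "ileocecal valve",
    "lymphangiectasia",
    "normal clean mucosa",
    "polyp",
    "pylorus",
    "reduced mucosal view",
    "ulcer",
]

# Build the forward and reverse mappings used by object-detection configs.
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in id2label.items()}

print(id2label[10])  # -> "polyp"
```

These dictionaries would typically be passed to the model config (e.g. via `from_pretrained(..., id2label=id2label, label2id=label2id)`) so that predicted class indices render as readable names.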