EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-07-24_ent_g75

This model is a fine-tuned version of microsoft/layoutlmv3-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.8247
  • Accuracy: 0.7225
  • Exit 0 Accuracy: 0.0725
  • Exit 1 Accuracy: 0.0625
  • Exit 2 Accuracy: 0.4775
  • Exit 3 Accuracy: 0.68
  • Exit 4 Accuracy: 0.71
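The per-exit accuracies above come from the model's early-exit heads ("EE" in the model name): intermediate classifiers that can return a prediction before the full forward pass completes. The exact exit policy of this checkpoint is not documented here; as a hedged, generic sketch, a common confidence-threshold policy takes the first exit whose top softmax probability clears a threshold:

```python
import math

def softmax(logits):
    """Convert one exit head's logits to probabilities."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def first_confident_exit(logits_per_exit, threshold=0.9):
    """Return (exit_index, predicted_class) for the first exit whose top
    probability clears `threshold`; fall back to the final exit otherwise.
    This is a generic confidence-based early-exit policy, not necessarily
    the one used by this checkpoint."""
    for i, logits in enumerate(logits_per_exit):
        probs = softmax(logits)
        top = max(probs)
        if top >= threshold or i == len(logits_per_exit) - 1:
            return i, probs.index(top)
```

With such a policy, the low exit-0/exit-1 accuracies above would mean those heads rarely clear a reasonable threshold, so most inputs fall through to the deeper exits.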

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 2
  • eval_batch_size: 1
  • seed: 42
  • gradient_accumulation_steps: 24
  • total_train_batch_size: 48
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 60
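The reported total train batch size follows directly from the per-device batch size and gradient accumulation, and the linear scheduler decays the learning rate toward zero over training. A minimal sketch of both (the no-warmup decay and the total step count used below are illustrative assumptions, not values from the log):

```python
train_batch_size = 2
gradient_accumulation_steps = 24

# Effective (total) train batch size: per-device batch x accumulation steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 48

def linear_lr(step, total_steps, base_lr=2e-05):
    """Linear decay from base_lr to 0 over training (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

So each optimizer step sees 48 examples even though only 2 fit on the device at once, and the learning rate falls linearly from 2e-05 at step 0 to 0 at the final step.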

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy | Exit 2 Accuracy | Exit 3 Accuracy | Exit 4 Accuracy |
|---|---|---|---|---|---|---|---|---|---|
| No log | 0.96 | 16 | 2.6912 | 0.1525 | 0.0625 | 0.0675 | 0.1075 | 0.0625 | 0.0625 |
| No log | 1.98 | 33 | 2.5217 | 0.24 | 0.0725 | 0.0675 | 0.1175 | 0.0675 | 0.0625 |
| No log | 3.0 | 50 | 2.3171 | 0.325 | 0.085 | 0.0675 | 0.1475 | 0.11 | 0.085 |
| No log | 3.96 | 66 | 2.1709 | 0.3675 | 0.0625 | 0.0675 | 0.2125 | 0.17 | 0.12 |
| No log | 4.98 | 83 | 2.0866 | 0.4625 | 0.0625 | 0.0675 | 0.2075 | 0.1825 | 0.1875 |
| No log | 6.0 | 100 | 1.8124 | 0.52 | 0.065 | 0.065 | 0.2175 | 0.2525 | 0.3775 |
| No log | 6.96 | 116 | 1.5919 | 0.58 | 0.0675 | 0.065 | 0.23 | 0.3125 | 0.575 |
| No log | 7.98 | 133 | 1.3590 | 0.67 | 0.0675 | 0.065 | 0.2475 | 0.3425 | 0.61 |
| No log | 9.0 | 150 | 1.2656 | 0.675 | 0.0675 | 0.065 | 0.245 | 0.405 | 0.6575 |
| No log | 9.96 | 166 | 1.1703 | 0.715 | 0.07 | 0.065 | 0.2575 | 0.435 | 0.695 |
| No log | 10.98 | 183 | 1.0771 | 0.7075 | 0.0625 | 0.065 | 0.275 | 0.505 | 0.685 |
| No log | 12.0 | 200 | 1.0342 | 0.73 | 0.065 | 0.065 | 0.285 | 0.535 | 0.6925 |
| No log | 12.96 | 216 | 1.1164 | 0.695 | 0.0675 | 0.065 | 0.29 | 0.5475 | 0.675 |
| No log | 13.98 | 233 | 1.2252 | 0.6675 | 0.075 | 0.065 | 0.31 | 0.5475 | 0.6875 |
| No log | 15.0 | 250 | 1.1467 | 0.7025 | 0.0625 | 0.065 | 0.285 | 0.5875 | 0.6875 |
| No log | 15.96 | 266 | 1.1496 | 0.705 | 0.07 | 0.065 | 0.3225 | 0.5775 | 0.7025 |
| No log | 16.98 | 283 | 1.2554 | 0.705 | 0.065 | 0.065 | 0.315 | 0.59 | 0.7025 |
| No log | 18.0 | 300 | 1.1977 | 0.7075 | 0.0675 | 0.065 | 0.335 | 0.605 | 0.705 |
| No log | 18.96 | 316 | 1.2663 | 0.71 | 0.0675 | 0.065 | 0.345 | 0.6125 | 0.7125 |
| No log | 19.98 | 333 | 1.2708 | 0.72 | 0.0675 | 0.065 | 0.3425 | 0.63 | 0.72 |
| No log | 21.0 | 350 | 1.2362 | 0.7225 | 0.0675 | 0.065 | 0.36 | 0.63 | 0.72 |
| No log | 21.96 | 366 | 1.3372 | 0.6975 | 0.0675 | 0.0625 | 0.365 | 0.64 | 0.715 |
| No log | 22.98 | 383 | 1.3829 | 0.7125 | 0.0725 | 0.0625 | 0.3625 | 0.6375 | 0.715 |
| No log | 24.0 | 400 | 1.3804 | 0.72 | 0.0725 | 0.0625 | 0.3725 | 0.65 | 0.715 |
| No log | 24.96 | 416 | 1.3353 | 0.725 | 0.0925 | 0.0625 | 0.39 | 0.6425 | 0.725 |
| No log | 25.98 | 433 | 1.4828 | 0.7125 | 0.0725 | 0.0625 | 0.3925 | 0.6425 | 0.7125 |
| No log | 27.0 | 450 | 1.4568 | 0.725 | 0.07 | 0.0625 | 0.43 | 0.655 | 0.72 |
| No log | 27.96 | 466 | 1.5439 | 0.71 | 0.0675 | 0.0625 | 0.415 | 0.6625 | 0.705 |
| No log | 28.98 | 483 | 1.5584 | 0.7125 | 0.0725 | 0.0625 | 0.44 | 0.67 | 0.72 |
| 1.8318 | 30.0 | 500 | 1.5625 | 0.7125 | 0.0725 | 0.0625 | 0.455 | 0.665 | 0.7125 |
| 1.8318 | 30.96 | 516 | 1.5523 | 0.7125 | 0.0675 | 0.0625 | 0.445 | 0.6725 | 0.71 |
| 1.8318 | 31.98 | 533 | 1.6609 | 0.6925 | 0.0675 | 0.0625 | 0.46 | 0.675 | 0.705 |
| 1.8318 | 33.0 | 550 | 1.6248 | 0.72 | 0.0775 | 0.0625 | 0.45 | 0.675 | 0.7175 |
| 1.8318 | 33.96 | 566 | 1.6584 | 0.7225 | 0.0675 | 0.0625 | 0.45 | 0.68 | 0.7175 |
| 1.8318 | 34.98 | 583 | 1.6129 | 0.7275 | 0.0675 | 0.0625 | 0.4525 | 0.675 | 0.725 |
| 1.8318 | 36.0 | 600 | 1.6905 | 0.7225 | 0.065 | 0.0625 | 0.465 | 0.6775 | 0.725 |
| 1.8318 | 36.96 | 616 | 1.7220 | 0.7125 | 0.065 | 0.0625 | 0.4525 | 0.6825 | 0.71 |
| 1.8318 | 37.98 | 633 | 1.6845 | 0.7175 | 0.0675 | 0.0625 | 0.46 | 0.685 | 0.72 |
| 1.8318 | 39.0 | 650 | 1.7377 | 0.7175 | 0.0725 | 0.0625 | 0.4625 | 0.6775 | 0.7125 |
| 1.8318 | 39.96 | 666 | 1.7170 | 0.72 | 0.0875 | 0.0625 | 0.4575 | 0.685 | 0.7275 |
| 1.8318 | 40.98 | 683 | 1.7412 | 0.7125 | 0.0725 | 0.0625 | 0.465 | 0.685 | 0.715 |
| 1.8318 | 42.0 | 700 | 1.7694 | 0.7125 | 0.07 | 0.0625 | 0.4575 | 0.6825 | 0.715 |
| 1.8318 | 42.96 | 716 | 1.7794 | 0.715 | 0.0725 | 0.0625 | 0.4675 | 0.675 | 0.7125 |
| 1.8318 | 43.98 | 733 | 1.7864 | 0.7175 | 0.07 | 0.0625 | 0.4625 | 0.6775 | 0.7175 |
| 1.8318 | 45.0 | 750 | 1.8285 | 0.7075 | 0.07 | 0.0625 | 0.47 | 0.6775 | 0.7125 |
| 1.8318 | 45.96 | 766 | 1.8063 | 0.7125 | 0.07 | 0.0625 | 0.4775 | 0.68 | 0.71 |
| 1.8318 | 46.98 | 783 | 1.8103 | 0.72 | 0.0725 | 0.0625 | 0.4725 | 0.6825 | 0.725 |
| 1.8318 | 48.0 | 800 | 1.8323 | 0.7175 | 0.0725 | 0.0625 | 0.47 | 0.68 | 0.72 |
| 1.8318 | 48.96 | 816 | 1.8306 | 0.72 | 0.0725 | 0.0625 | 0.475 | 0.685 | 0.715 |
| 1.8318 | 49.98 | 833 | 1.8208 | 0.7225 | 0.07 | 0.0625 | 0.48 | 0.6825 | 0.7175 |
| 1.8318 | 51.0 | 850 | 1.8062 | 0.72 | 0.07 | 0.0625 | 0.4775 | 0.685 | 0.7175 |
| 1.8318 | 51.96 | 866 | 1.8022 | 0.7175 | 0.0725 | 0.0625 | 0.48 | 0.68 | 0.7175 |
| 1.8318 | 52.98 | 883 | 1.8090 | 0.7175 | 0.07 | 0.0625 | 0.475 | 0.685 | 0.715 |
| 1.8318 | 54.0 | 900 | 1.8197 | 0.7175 | 0.0725 | 0.0625 | 0.4775 | 0.68 | 0.7125 |
| 1.8318 | 54.96 | 916 | 1.8260 | 0.72 | 0.0725 | 0.0625 | 0.4775 | 0.6825 | 0.7125 |
| 1.8318 | 55.98 | 933 | 1.8257 | 0.72 | 0.0725 | 0.0625 | 0.4775 | 0.68 | 0.7125 |
| 1.8318 | 57.0 | 950 | 1.8264 | 0.72 | 0.0725 | 0.0625 | 0.4775 | 0.68 | 0.7125 |
| 1.8318 | 57.6 | 960 | 1.8247 | 0.7225 | 0.0725 | 0.0625 | 0.4775 | 0.68 | 0.71 |
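Reading the final evaluation row, each deeper exit buys accuracy at the cost of computing more layers, and the deepest exit comes closest to the full classifier. A small sketch (pure Python, values copied from the final row above) makes the comparison explicit:

```python
# Final-row accuracy for each early-exit head (values from the table above).
final_exit_accuracy = {0: 0.0725, 1: 0.0625, 2: 0.4775, 3: 0.68, 4: 0.71}
full_model_accuracy = 0.7225  # final classifier

# The most accurate exit, and its gap to the full model.
best_exit = max(final_exit_accuracy, key=final_exit_accuracy.get)
gap = full_model_accuracy - final_exit_accuracy[best_exit]
```

Here exit 4 trails the full classifier by only 1.25 points, while exits 0 and 1 remain near chance throughout training.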

Framework versions

  • Transformers 4.31.0
  • Pytorch 1.13.1+cu117
  • Datasets 2.13.1
  • Tokenizers 0.13.3
Model tree for Omar95farag/EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-07-24_ent_g75

  • Finetuned from: microsoft/layoutlmv3-base