layoutlm-funsd

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on the FUNSD dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6864
  • Answer: precision 0.6980, recall 0.8084, F1 0.7491 (809 entities)
  • Header: precision 0.2969, recall 0.3193, F1 0.3077 (119 entities)
  • Question: precision 0.7653, recall 0.8113, F1 0.7876 (1065 entities)
  • Overall Precision: 0.7092
  • Overall Recall: 0.7807
  • Overall F1: 0.7433
  • Overall Accuracy: 0.8094
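
These per-entity numbers are entity-level (not token-level) scores in the style produced by seqeval, which the Transformers token-classification examples use; whether this exact card used seqeval is an assumption. A small illustration with made-up tag sequences:

```python
# Illustration of entity-level scoring in the style of the metrics above.
# The tag sequences here are invented, not taken from FUNSD.
from seqeval.metrics import classification_report, f1_score, precision_score, recall_score

y_true = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O", "B-HEADER"]]
y_pred = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "B-ANSWER", "B-HEADER"]]

print(classification_report(y_true, y_pred))
print(precision_score(y_true, y_pred), recall_score(y_true, y_pred), f1_score(y_true, y_pred))
```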

Model description

LayoutLM extends a BERT-style encoder with 2-D position embeddings for word bounding boxes, letting it use document layout in addition to text. This checkpoint fine-tunes microsoft/layoutlm-base-uncased with a token-classification head on FUNSD forms, tagging words as header, question, or answer entities.

Intended uses & limitations

The model is intended for form understanding on scanned documents similar to FUNSD: labeling OCR'd words as header, question (field label), or answer (field value) entities that can then be paired into key-value structures. Header detection is a known weak spot; with only 119 header entities in the evaluation set, header F1 reaches just 0.31, so applications should not rely on it.
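
A minimal inference sketch follows. It assumes the checkpoint is available on the Hub as aparnajjn/layoutlm-funsd and that you already have OCR words with bounding boxes normalized to a 0-1000 grid; the words and boxes below are dummy values.

```python
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

model_id = "aparnajjn/layoutlm-funsd"  # assumed Hub id for this checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)

# Dummy OCR output: words plus bounding boxes normalized to 0-1000.
words = ["Date:", "March", "3,", "1998"]
word_boxes = [[57, 60, 118, 72], [124, 60, 175, 72], [180, 60, 198, 72], [204, 60, 252, 72]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# LayoutLM needs one box per token: copy each word's box to its subword
# tokens and give special tokens a zero box.
token_boxes = [
    word_boxes[idx] if idx is not None else [0, 0, 0, 0]
    for idx in encoding.word_ids(0)
]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits
pred_ids = logits.argmax(-1).squeeze(0).tolist()
print([model.config.id2label[i] for i in pred_ids])
```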

Training and evaluation data

The model was trained and evaluated on FUNSD (Form Understanding in Noisy Scanned Documents), a dataset of 199 scanned forms annotated with words, bounding boxes, and entity labels (header, question, answer). The evaluation split used above contains 809 answer, 119 header, and 1065 question entities.
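
For reference, a sketch of the BIO label set these metrics imply; the id ordering is an assumption, not read from this checkpoint's config.

```python
# FUNSD token-classification labels in BIO form; the id ordering here is
# an assumption, not taken from this checkpoint's config.json.
labels = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION", "B-ANSWER", "I-ANSWER"]
id2label = dict(enumerate(labels))
label2id = {label: i for i, label in id2label.items()}
```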

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
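
As a reproduction aid, here is a minimal sketch mapping the list above onto TrainingArguments. The output_dir is a placeholder, fp16=True stands in for Native AMP, and the evaluation/save cadence is not recorded in this card.

```python
from transformers import TrainingArguments

# Sketch only: output_dir is a placeholder and eval/save strategy is assumed.
training_args = TrainingArguments(
    output_dir="layoutlm-funsd",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,  # Native AMP mixed precision
)
```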

Training results

Per-entity cells give precision / recall / F1; entity support is constant across epochs (809 answers, 119 headers, 1065 questions).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.7694 | 1.0 | 10 | 1.6060 | 0.0243 / 0.0136 / 0.0174 | 0.0000 / 0.0000 / 0.0000 | 0.3227 / 0.1427 / 0.1979 | 0.1764 | 0.0818 | 0.1118 | 0.3371 |
| 1.4560 | 2.0 | 20 | 1.2789 | 0.2174 / 0.3152 / 0.2573 | 0.0000 / 0.0000 / 0.0000 | 0.3726 / 0.5493 / 0.4440 | 0.3062 | 0.4215 | 0.3547 | 0.5669 |
| 1.1265 | 3.0 | 30 | 0.9633 | 0.4671 / 0.6143 / 0.5307 | 0.0000 / 0.0000 / 0.0000 | 0.5228 / 0.6685 / 0.5867 | 0.4984 | 0.6066 | 0.5472 | 0.6822 |
| 0.8681 | 4.0 | 40 | 0.8085 | 0.5849 / 0.7454 / 0.6554 | 0.0000 / 0.0000 / 0.0000 | 0.6260 / 0.6883 / 0.6556 | 0.6026 | 0.6703 | 0.6347 | 0.7390 |
| 0.6998 | 5.0 | 50 | 0.7327 | 0.6188 / 0.7342 / 0.6716 | 0.1429 / 0.0756 / 0.0989 | 0.6389 / 0.7840 / 0.7040 | 0.6172 | 0.7215 | 0.6653 | 0.7721 |
| 0.5889 | 6.0 | 60 | 0.6932 | 0.6111 / 0.7886 / 0.6886 | 0.1791 / 0.1008 / 0.1290 | 0.7080 / 0.7399 / 0.7236 | 0.6466 | 0.7215 | 0.6820 | 0.7731 |
| 0.5103 | 7.0 | 70 | 0.6603 | 0.6570 / 0.7862 / 0.7158 | 0.3059 / 0.2185 / 0.2549 | 0.7272 / 0.7859 / 0.7554 | 0.6801 | 0.7521 | 0.7143 | 0.7886 |
| 0.4557 | 8.0 | 80 | 0.6577 | 0.6499 / 0.8010 / 0.7176 | 0.2642 / 0.2353 / 0.2489 | 0.7243 / 0.7944 / 0.7577 | 0.6702 | 0.7637 | 0.7139 | 0.7959 |
| 0.3927 | 9.0 | 90 | 0.6559 | 0.6730 / 0.7936 / 0.7283 | 0.2931 / 0.2857 / 0.2894 | 0.7452 / 0.7991 / 0.7712 | 0.6903 | 0.7662 | 0.7263 | 0.8041 |
| 0.3806 | 10.0 | 100 | 0.6697 | 0.6778 / 0.8010 / 0.7343 | 0.2719 / 0.2605 / 0.2661 | 0.7642 / 0.8188 / 0.7906 | 0.7015 | 0.7782 | 0.7379 | 0.8083 |
| 0.3299 | 11.0 | 110 | 0.6691 | 0.6905 / 0.7998 / 0.7411 | 0.3058 / 0.3109 / 0.3083 | 0.7676 / 0.8310 / 0.7980 | 0.7096 | 0.7873 | 0.7464 | 0.8102 |
| 0.3093 | 12.0 | 120 | 0.6782 | 0.6955 / 0.8047 / 0.7461 | 0.3008 / 0.3109 / 0.3058 | 0.7582 / 0.8009 / 0.7790 | 0.7056 | 0.7732 | 0.7379 | 0.8096 |
| 0.2923 | 13.0 | 130 | 0.6818 | 0.6924 / 0.8096 / 0.7464 | 0.3217 / 0.3109 / 0.3162 | 0.7752 / 0.8225 / 0.7982 | 0.7157 | 0.7868 | 0.7495 | 0.8091 |
| 0.2694 | 14.0 | 140 | 0.6849 | 0.7018 / 0.8059 / 0.7503 | 0.2913 / 0.3109 / 0.3008 | 0.7717 / 0.8188 / 0.7945 | 0.7141 | 0.7832 | 0.7471 | 0.8097 |
| 0.2738 | 15.0 | 150 | 0.6864 | 0.6980 / 0.8084 / 0.7491 | 0.2969 / 0.3193 / 0.3077 | 0.7653 / 0.8113 / 0.7876 | 0.7092 | 0.7807 | 0.7433 | 0.8094 |

Framework versions

  • Transformers 4.49.0
  • Pytorch 2.6.0+cu124
  • Datasets 3.4.1
  • Tokenizers 0.21.1