This model is a fine-tuned version of huawei-noah/TinyBERT_General_4L_312D on the SQuAD question-answering dataset.
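A minimal inference sketch using the Transformers question-answering pipeline is shown below. The repo id is a placeholder, since the card does not state the checkpoint name of this fine-tuned model; substitute the actual model id.

```python
from transformers import pipeline

# "your-username/tinybert-squad" is a hypothetical placeholder for this
# fine-tuned checkpoint; replace it with the real repo id.
qa = pipeline(
    "question-answering",
    model="your-username/tinybert-squad",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is TinyBERT_General_4L_312D fine-tuned on the SQuAD dataset.",
)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': 'SQuAD'}
```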
Training parameters
- learning_rate: 1e-05
- train_batch_size: 20
- eval_batch_size: 32
- optimizer: paged_adamw_32bit
- num_epochs: 1
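The hyperparameters above map onto `TrainingArguments` roughly as in the sketch below. This is only an assumed reconstruction of the training configuration; the SQuAD preprocessing (tokenization with offset mapping and answer-span alignment) and the `Trainer` call are omitted, the output directory is hypothetical, and `optim="paged_adamw_32bit"` requires bitsandbytes.

```python
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    TrainingArguments,
)

model_name = "huawei-noah/TinyBERT_General_4L_312D"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

# Hyperparameters taken from the list above.
training_args = TrainingArguments(
    output_dir="tinybert-squad",        # hypothetical output directory
    learning_rate=1e-5,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=32,
    num_train_epochs=1,
    optim="paged_adamw_32bit",          # needs bitsandbytes installed
)
```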
Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0