Description
This model is an mDeBERTa-v3-base model that was first fine-tuned on the SQuAD v2 dataset and then further adapted to the COVID-19 domain for Greek-language question answering.
Training Details
- Training Dataset: COVID-QA-el-small
- Batch Size: 8
- Number of Epochs: 3
- Learning Rate: 3e-05
- Gradient Accumulation Steps: 2
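The resulting checkpoint can be used as a standard extractive question-answering model. Below is a minimal usage sketch with the Hugging Face `transformers` question-answering pipeline; the question and context strings are placeholder examples, and running it requires downloading the model weights.

```python
from transformers import pipeline

# Load the fine-tuned Greek COVID-19 QA model from the Hugging Face Hub.
qa = pipeline(
    "question-answering",
    model="panosgriz/mdeberta-v3-base-squad2-covid-el-small",
)

# Placeholder Greek question/context; substitute your own text.
result = qa(
    question="Πώς μεταδίδεται ο COVID-19;",
    context="Ο COVID-19 μεταδίδεται κυρίως μέσω σταγονιδίων του αναπνευστικού.",
)

# The pipeline returns a dict with the extracted answer span and a confidence score.
print(result["answer"], result["score"])
```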
Model tree for panosgriz/mdeberta-v3-base-squad2-covid-el-small
- Base model: microsoft/mdeberta-v3-base
- Finetuned from: timpal0l/mdeberta-v3-base-squad2