# Cypriot-Greek BERT Model 🇨🇾
A specialized BERT model fine-tuned for masked language modeling (MLM) on pairs of Cypriot Greek (dialect) and Standard Modern Greek text.
## Model Details
- Base Model: nlpaueb/bert-base-greek-uncased-v1
- Model Type: Masked Language Model (MLM)
- Languages: Cypriot Greek (dialect), Modern Greek (standard)
- Dataset Size: 30,000 Cypriot-Greek language pairs
- Training Task: Bidirectional masked language modeling
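The bidirectional MLM objective hides a fraction of input tokens and trains the model to reconstruct them from both left and right context. BERT's standard recipe selects 15% of tokens; of those, 80% become `[MASK]`, 10% are swapped for a random token, and 10% are left unchanged. A minimal pure-Python sketch of that selection rule (the toy sentence, vocabulary, and function name are illustrative, not taken from the training code):

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """BERT-style MLM masking: of the selected 15% of tokens,
    80% -> [MASK], 10% -> a random vocab token, 10% -> unchanged."""
    rng = rng or random.Random(0)
    masked = list(tokens)
    labels = [None] * len(tokens)  # None = token is not predicted
    for i, tok in enumerate(tokens):
        if rng.random() >= mask_prob:
            continue               # token not selected for prediction
        labels[i] = tok            # the model must recover the original
        roll = rng.random()
        if roll < 0.8:
            masked[i] = "[MASK]"
        elif roll < 0.9:
            masked[i] = rng.choice(vocab)
        # else: keep the original token (but still predict it)
    return masked, labels

tokens = "η κυπρος ειναι ενα ομορφο νησι".split()
masked, labels = mask_tokens(tokens, vocab=tokens, rng=random.Random(42))
print(masked)  # with this seed, 'κυπρος' is replaced by [MASK]
```

In the real pipeline this logic is handled by `transformers.DataCollatorForLanguageModeling` over token IDs rather than strings.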
## Training Configuration
### Hyperparameters
- Learning Rate: 5e-5
- Batch Size: 16 per device
- Gradient Accumulation: 1 step
- Epochs: 8
- Warmup Steps: 1,000
- Weight Decay: 0.01
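Assuming all 30,000 pairs land in the training split and a single device (the card lists one A100), these hyperparameters imply the following schedule arithmetic:

```python
import math

# Values from the card above; single-GPU training is an assumption.
dataset_size = 30_000      # Cypriot-Greek pairs
per_device_batch = 16
grad_accum = 1
epochs = 8
warmup_steps = 1_000

steps_per_epoch = math.ceil(dataset_size / (per_device_batch * grad_accum))
total_steps = steps_per_epoch * epochs
warmup_fraction = warmup_steps / total_steps

print(steps_per_epoch)              # 1875 optimizer steps per epoch
print(total_steps)                  # 15000 steps over 8 epochs
print(round(warmup_fraction, 3))    # 0.067: warmup covers ~6.7% of training
```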
### Hardware & Optimization
- Precision: bfloat16 (bf16)
- Gradient Checkpointing: Disabled
- Memory Optimization: Pin memory enabled
- Data Loading: 4 workers for parallel processing
- Hardware: NVIDIA A100 GPU (40 GB)
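Taken together, the two lists above map onto `transformers.TrainingArguments` fields roughly as follows. This is a reconstruction from the card, not the authors' actual training script:

```python
# Keys mirror transformers.TrainingArguments parameter names;
# values are the ones listed on this card.
training_config = {
    "learning_rate": 5e-5,
    "per_device_train_batch_size": 16,
    "gradient_accumulation_steps": 1,
    "num_train_epochs": 8,
    "warmup_steps": 1000,
    "weight_decay": 0.01,
    "bf16": True,                    # bfloat16 mixed precision
    "gradient_checkpointing": False,
    "dataloader_pin_memory": True,
    "dataloader_num_workers": 4,
}
```

These keys can be splatted into the real object, e.g. `TrainingArguments(output_dir="out", **training_config)` (the `output_dir` value is a placeholder).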
## Training Progress

| Step | Train Loss | Eval Loss |
|---|---|---|
| 500 | 1.90 | 1.8217 |
| 750 | 1.9252 | 1.6843 |
| 1000 | 1.7017 | 1.6174 |
| 1500 | 1.3602 | 1.5243 |
| 2200 | 0.95 | 1.42 |
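Since the model is not deployed on a hosted inference provider, it can be tried locally with the standard `transformers` fill-mask pipeline. The example sentence is illustrative and the predictions depend on the checkpoint:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub.
fill = pipeline("fill-mask", model="Elormiden/bert-base-cypriot-greek")

# The base Greek BERT is uncased, so lowercase input works best.
preds = fill("η κυπρος ειναι ενα ομορφο [MASK].")
for p in preds:
    print(p["token_str"], round(p["score"], 3))
```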
## Model Tree
- Model: Elormiden/bert-base-cypriot-greek
- Base model: nlpaueb/bert-base-greek-uncased-v1