---
license: apache-2.0
base_model: tasal9/ZamAI-mT5-Pashto
tags:
- generated_from_trainer
model-index:
- name: ZamAI-mT5-Pashto
  results: []
---

# ZamAI-mT5-Pashto

This model is a fine-tuned version of [tasal9/ZamAI-mT5-Pashto](https://huggingface.co/tasal9/ZamAI-mT5-Pashto) on an unknown dataset.
It achieves the following results on the evaluation set:

- eval_loss: 0.0529
- eval_runtime: 71.7668
- eval_samples_per_second: 13.934
- eval_steps_per_second: 13.934
- epoch: 0.64
- step: 400

## Model description

More information needed
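In the absence of a fuller description, here is a minimal inference sketch, assuming the standard `transformers` seq2seq API for mT5 checkpoints; the Pashto prompt and generation settings below are illustrative placeholders, not taken from the training setup:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "tasal9/ZamAI-mT5-Pashto"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Illustrative Pashto input ("your text here"); the task format the
# model expects is not documented, so adjust the prompt accordingly.
inputs = tokenizer("ستاسو متن دلته", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```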

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1

### Framework versions

- Transformers 4.42.4
- Pytorch 2.8.0+cu128
- Datasets 2.20.0
- Tokenizers 0.19.1
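For reference, the hyperparameters above map onto the `transformers` Trainer API roughly as follows. This is a sketch rather than the published training script: the `output_dir` name is a placeholder, and dataset/model setup is omitted.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="ZamAI-mT5-Pashto",  # placeholder output path
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=8,  # total train batch size: 1 * 8 = 8
    num_train_epochs=1,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,                 # Adam settings as reported above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```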
