indictrans-en-ne-checkpoint-1B

This model is a fine-tuned version of indictrans-en-ne-checkpoint-1B/checkpoint-431 on an unknown dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the metrics):

  • Loss: 0.1392
  • Bleu: 36.0128
  • Chrf: 60.1977
  • Num Input Tokens Seen: 196,608,000
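
For orientation, here is a minimal inference sketch using the standard transformers API. The repo id, the input sentence, and the preprocessing are illustrative assumptions: the card does not document usage, and IndicTrans-family repos typically ship custom modeling code, hence trust_remote_code=True. Per the card metadata, the ~1.1B-parameter checkpoint is stored in BF16.

```python
# Minimal inference sketch. Assumptions: the repo id, the input handling,
# and any source/target-language preprocessing are illustrative; they are
# not documented in this card.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

repo_id = "indictrans-en-ne-checkpoint-1B"  # hypothetical hub id

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # checkpoint is stored in BF16
    trust_remote_code=True,      # repo contains custom modeling code
)

inputs = tokenizer("How are you today?", return_tensors="pt")
with torch.inference_mode():
    out = model.generate(**inputs, max_new_tokens=128, num_beams=5)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```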

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Seq2SeqTrainingArguments sketch follows the list):

  • learning_rate: 0.0002
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 256
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: inverse_sqrt
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 1
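
The total train batch size of 256 is the per-device train batch size (32) multiplied by the gradient accumulation steps (8), which implies a single device. As a sketch only (the training script is not published with the card), these values map onto transformers.Seq2SeqTrainingArguments as follows:

```python
# Sketch: the listed hyperparameters expressed as Seq2SeqTrainingArguments
# (argument names per Transformers 4.44; output_dir is hypothetical).
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="indictrans-en-ne-checkpoint-1B",
    learning_rate=2e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=8,  # 32 * 8 = effective batch size of 256
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="inverse_sqrt",
    warmup_steps=500,
    num_train_epochs=1,
)
```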

Training results

Training Loss   Epoch    Step   Validation Loss   Bleu      Chrf      Input Tokens Seen
0.1655          0.2897   1000   0.1563            32.3875   57.9890   65,536,000
0.1545          0.5795   2000   0.1433            35.2047   59.6328   131,072,000
0.1471          0.8692   3000   0.1392            36.0128   60.1977   196,608,000
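
The Bleu and Chrf columns are corpus-level scores. The exact evaluation pipeline is not documented in the card; below is a minimal sketch of how such scores are typically computed with sacrebleu (the example strings are placeholders):

```python
# Illustrative metric computation with sacrebleu; the card does not
# specify the actual evaluation script or tokenization settings.
import sacrebleu

hypotheses = ["तिमीलाई आज कस्तो छ?"]     # model outputs (placeholder)
references = [["तिमीलाई आज कस्तो छ?"]]   # one list per reference set

bleu = sacrebleu.corpus_bleu(hypotheses, references)
chrf = sacrebleu.corpus_chrf(hypotheses, references)
print(f"BLEU = {bleu.score:.4f}, chrF = {chrf.score:.4f}")
```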

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.2.1+cu121
  • Datasets 3.0.0
  • Tokenizers 0.19.1
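
To check a local environment against the same stack, a quick version check (a convenience sketch, not from the card):

```python
# Print installed versions to compare against the pins listed above.
import datasets, tokenizers, torch, transformers

print("Transformers", transformers.__version__)  # expected 4.44.2
print("PyTorch", torch.__version__)              # expected 2.2.1+cu121
print("Datasets", datasets.__version__)          # expected 3.0.0
print("Tokenizers", tokenizers.__version__)      # expected 0.19.1
```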