ElanMT

This model is a tiny variant of ElanMT-BT-ja-en, trained from scratch exclusively on openly licensed data and on Wikipedia data back-translated with ElanMT-base-en-ja.

Model Details

This is a Japanese-to-English translation model based on the Marian MT architecture: a 4-layer encoder-decoder Transformer with a SentencePiece tokenizer.

Usage

See here.
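In the absence of the linked instructions, a minimal usage sketch with the Hugging Face `transformers` library might look like the following; it assumes the checkpoint loads as a standard MarianMT-style seq2seq model, which has not been verified here.

```python
# Minimal sketch: translate Japanese to English with this checkpoint.
# Assumes a standard seq2seq (MarianMT-style) model and tokenizer.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "Mitsua/elan-mt-tiny-ja-en"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Tokenize a Japanese sentence and generate an English translation.
inputs = tokenizer("こんにちは、世界！", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
translation = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(translation)
```

For batch translation, pass a list of sentences to the tokenizer with `padding=True`.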

Training Data

See here.

Training Procedure

See here.

Evaluation

See here.

Disclaimer

Translations produced by this model may be incorrect, harmful, or biased. The model was developed to investigate the performance achievable with only a relatively small, openly licensed corpus, and it is not suitable for use cases requiring high translation accuracy. Under Section 5 of the CC BY-SA 4.0 License, ELAN MITSUA Project / Abstract Engine is not responsible for any direct or indirect loss caused by the use of the model.

Model Size

15.6M parameters (FP16, Safetensors format).
