T5ForConditionalGeneration files for the MADLAD-400 3B parameter machine-translation model.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

model = T5ForConditionalGeneration.from_pretrained("iliemihai/madlad400-3b-mt")
tokenizer = T5Tokenizer.from_pretrained("iliemihai/madlad400-3b-mt")

text = "The quick brown fox jumped over the lazy dog."
# The <2ro> prefix selects the target language (here Romanian);
# the tokenizer appends the </s> token automatically.
input_ids = tokenizer(f"<2ro> {text}", return_tensors="pt").input_ids
outputs = model.generate(
    input_ids,
    do_sample=True,
    top_p=0.92,
    top_k=50,
    temperature=0.3,
    max_length=256,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
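The target language is chosen purely through the `<2xx>` prompt prefix, so the same model translates into any of its supported languages by swapping the code. A minimal sketch of that convention, assuming MADLAD-400-style two-letter language tags (the `build_prompt` helper is illustrative, not part of the library):

```python
def build_prompt(text: str, target_lang: str) -> str:
    """Prefix the source text with a MADLAD-400 target-language tag
    such as <2ro> (Romanian) or <2de> (German)."""
    return f"<2{target_lang}> {text}"

# Same sentence, two different target languages:
print(build_prompt("The quick brown fox jumped over the lazy dog.", "ro"))
print(build_prompt("The quick brown fox jumped over the lazy dog.", "de"))
```

Each prompt string would then be tokenized and passed to `model.generate` exactly as in the snippet above.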

**Kudos to jbochi for releasing the Colab and conversion code.** The Colab used to generate these files is here.

Safetensors · Model size: 2.94B params · Tensor type: F32
