# mlx-community/sarvam-translate-mlx-4bit
This model was converted to MLX format from [`sarvamai/sarvam-translate`](https://huggingface.co/sarvamai/sarvam-translate) using mlx-lm version 0.25.2.
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/sarvam-translate-mlx-4bit")

def translate(lang, text):
    return f"""<bos><start_of_turn>user
Translate the text below to {lang}.
{text}<end_of_turn>
"""

generate(
    model,
    tokenizer,
    prompt=translate("Kannada", "How are you?"),
    verbose=True,
)
# Output:
# ನೀವು ಹೇಗಿದ್ದೀರಿ?
```
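The `translate` helper above wraps the request in Gemma-style chat-turn markers (`<bos>`, `<start_of_turn>user`, `<end_of_turn>`). A minimal standalone sketch of the same template, runnable without downloading the model (the function name `build_translate_prompt` is illustrative, not part of mlx-lm):

```python
def build_translate_prompt(lang: str, text: str) -> str:
    # Builds the same Gemma-style single-user-turn prompt used above:
    # the instruction line names the target language, followed by the
    # source text and the end-of-turn marker.
    return (
        "<bos><start_of_turn>user\n"
        f"Translate the text below to {lang}.\n"
        f"{text}<end_of_turn>\n"
    )

prompt = build_translate_prompt("Kannada", "How are you?")
print(prompt)
```

Any string in this shape can be passed directly as the `prompt` argument to `generate`.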