Model Card for GaMS-9B-SFT-Translator

GaMS-9B-SFT-Translator is a version of GaMS-9B-Instruct further trained with supervised fine-tuning (SFT) on approximately 130k English-to-Slovene translation pairs.


Acknowledgment

The model was developed within the PoVeJMo research program (Adaptive Natural Language Processing with Large Language Models), particularly within the research project titled SloLLaMai -- Open-access computationally efficient models for Slovenian. The program is funded within the Recovery and Resilience Plan by the Slovenian Research and Innovation Agency (ARIS) and NextGenerationEU. The authors also acknowledge the financial support from the Slovenian Research and Innovation Agency (research core funding No. P6-0411 -- Language Resources and Technologies for Slovene).

Usage

The model can be run through the pipeline API using the following code:

from transformers import pipeline

model_id = "GaMS-Beta/GaMS-9B-SFT-Translator"

pline = pipeline(
    "text-generation",
    model=model_id,
    device_map="cuda" # replace with "mps" to run on a Mac device
)

# Example of response generation
text_to_translate = "A fast brown fox jumped over the lazy dog."
message = [{"role": "user", "content": f"Prevedi naslednje besedilo v slovenščino.\n{text_to_translate}"}]
response = pline(message, max_new_tokens=512)
print("Slovene translation:", response[0]["generated_text"][-1]["content"])

For multi-GPU inference, set device_map to "auto":
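A minimal sketch of the same pipeline call with multi-GPU sharding; the only change from the single-device example above is device_map="auto", which lets Accelerate place the model's layers across all visible GPUs automatically.

```python
from transformers import pipeline

model_id = "GaMS-Beta/GaMS-9B-SFT-Translator"

# device_map="auto" shards the model across all available GPUs (requires accelerate)
pline = pipeline(
    "text-generation",
    model=model_id,
    device_map="auto"
)

# Same prompt format as the single-GPU example
text_to_translate = "A fast brown fox jumped over the lazy dog."
message = [{"role": "user", "content": f"Prevedi naslednje besedilo v slovenščino.\n{text_to_translate}"}]
response = pline(message, max_new_tokens=512)
print("Slovene translation:", response[0]["generated_text"][-1]["content"])
```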

Model size: 9.24B parameters (Safetensors, BF16)

Model tree for GaMS-Beta/GaMS-9B-SFT-Translator

Base model: google/gemma-2-9b → finetuned: cjvt/GaMS-9B → finetuned: this model
Quantizations: 2 models