Model Overview
Barcha-7B-Instruct is an open, instruction-tuned model for Tunisian Derja. It is a continually pre-trained and aligned version of Qwen/Qwen2-7B-Instruct, trained on the Tunisian_Derja_Dataset.
Uploaded model
- Developed by: Linagora
- License: apache-2.0
- Finetuned from model: Qwen/Qwen2-7B-Instruct
Usage
Below are some code snippets to help you get started quickly with running the model. First, install the Transformers library:
```
pip install transformers
```
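If you load the model with device_map (as in the second snippet below), the accelerate package is also required:
```
pip install accelerate
```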
Running with the pipeline API
```python
import torch
from transformers import pipeline

model_id = "linagora/Barcha-7B-Instruct"

pipe = pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device="cuda",  # replace with "mps" to run on a Mac device
)

messages = [
    {"role": "user", "content": "شنو معنى برشا"},  # "What does 'barcha' mean?"
]

outputs = pipe(messages, max_new_tokens=128, do_sample=False)  # greedy decoding
assistant_response = outputs[0]["generated_text"][-1]["content"].strip()
print(assistant_response)
```
- Response: برشّا هي كلمة تعني كتر من واحد حاجة (roughly: "Barcha is a word that means a lot of something")
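Greedy decoding is deterministic; for more varied output you can enable sampling in the same pipeline call. The parameter values below are illustrative, not recommendations from the model authors:
```python
outputs = pipe(messages, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.9)
print(outputs[0]["generated_text"][-1]["content"].strip())
```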
Running the model on a single / multi GPU
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "linagora/Barcha-7B-Instruct"

# device_map places the model on the available GPU(s)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="cuda",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

messages = [
    {"role": "user", "content": "شنو معنى لاباس"},  # "What does 'labes' mean?"
]

input_ids = tokenizer.apply_chat_template(
    messages,
    return_tensors="pt",
    return_dict=True,
    add_generation_prompt=True,
).to(model.device)

outputs = model.generate(**input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
```
- Response: لاباس هو كلمة جاية من العربية، معناها هل أنت بخير (roughly: "Labes is a word that comes from Arabic; it means 'are you well?'")
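The same chat template also handles multi-turn conversations. A minimal sketch reusing the model and tokenizer loaded above; the follow-up question is illustrative and not from the model card:
```python
# Continue the conversation with the previous answer and a follow-up question
messages = [
    {"role": "user", "content": "شنو معنى لاباس"},
    {"role": "assistant", "content": "لاباس هو كلمة جاية من العربية، معناها هل أنت بخير"},
    {"role": "user", "content": "أعطيني مثال في جملة"},  # illustrative follow-up: "give me an example in a sentence"
]

inputs = tokenizer.apply_chat_template(
    messages,
    return_tensors="pt",
    return_dict=True,
    add_generation_prompt=True,
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=128)
# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```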
Citations
When using this model **Barcha-7B-Instruct**, please cite:
```bibtex
@misc{linagora2025LLM-tn,
  author = {Wajdi Ghezaiel and Jean-Pierre Lorré},
  title  = {Barcha-7B-Instruct: Tunisian Arabic Derja LLM based on Qwen2-7B},
  year   = {2025},
  month  = {July},
  url    = {https://huggingface.co/datasets/linagora/Barcha-7B-Instruct}
}
```