Qwen2.5-Sign
Qwen2.5-Sign is a text-to-Chinese-sign-language translation model based on Qwen2.5.
| Parameter | Value |
|---|---|
| learning_rate | 5e-05 |
| train_batch_size | 4 |
| eval_batch_size | 4 |
| gradient_accumulation_steps | 8 |
| total_train_batch_size | 32 |
| lr_scheduler_type | cosine |
| lr_scheduler_warmup_steps | 100 |
| num_epochs | 4 |
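The reported `total_train_batch_size` follows from the other values: per-device batch size times gradient accumulation steps times the number of devices. A minimal sketch of that arithmetic, assuming a single training device (the device count is not stated in the table):

```python
# Effective (total) train batch size =
#   per-device batch size x gradient accumulation steps x number of devices.
per_device_batch_size = 4        # train_batch_size from the table
gradient_accumulation_steps = 8  # from the table
num_devices = 1                  # assumption: the reported total implies one device

total_train_batch_size = (
    per_device_batch_size * gradient_accumulation_steps * num_devices
)
print(total_train_batch_size)  # 32, matching the table
```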
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained(
    "thundax/Qwen2.5-1.5B-Sign",
    torch_dtype="auto",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("thundax/Qwen2.5-1.5B-Sign")

text = "站一个制高点看上海，上海的弄堂是壮观的景象。它是这城市背景一样的东西。"
input_text = f"Translate sentence into labels\n{text}\n"
model_inputs = tokenizer([input_text], return_tensors="pt").to(device)

generated_ids = model.generate(
    model_inputs.input_ids,
    max_new_tokens=512,
)
# generate() returns prompt + continuation; keep only the newly generated tokens
generated_ids = [
    output_ids[len(input_ids):]
    for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(response)
```
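The list comprehension above trims the echoed prompt: `generate()` returns each full sequence (prompt plus continuation), so slicing off the first `len(input_ids)` tokens leaves only the generated sign labels. A standalone sketch of that step with plain lists standing in for token-id tensors (`strip_prompt` is a hypothetical helper name, not part of the model's API):

```python
def strip_prompt(input_id_batch, output_id_batch):
    """Drop the echoed prompt tokens from each generated sequence."""
    return [
        output_ids[len(input_ids):]
        for input_ids, output_ids in zip(input_id_batch, output_id_batch)
    ]

# Toy example: the prompt [101, 7, 8] is echoed at the start of the output.
prompts = [[101, 7, 8]]
outputs = [[101, 7, 8, 42, 43]]
print(strip_prompt(prompts, outputs))  # [[42, 43]]
```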
If you find our work helpful, please cite us:
```bibtex
@software{qwen2-sign,
  author = {thundax},
  title = {qwen2-sign: A Tool for Text to Sign},
  year = {2025},
  url = {https://github.com/thundax-lyp},
}
```
Base model: Qwen/Qwen2.5-0.5B