Minueza-2-96M
The second version of the Minueza series. Base model and its fine-tunings.
This model is a fine-tuned version of Felladrin/Minueza-2-96M on the English totally-not-an-llm/EverythingLM-data-V2-sharegpt dataset.
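ShareGPT-format datasets typically store each conversation as a list of turns with `from`/`value` fields, which must be mapped to the `role`/`content` messages format used by `apply_chat_template`. A minimal sketch of that conversion (the exact field names and role labels are assumptions based on the common ShareGPT convention, not confirmed from this dataset):

```python
# Map ShareGPT speaker labels to chat-template roles
# (labels assumed from the common ShareGPT convention).
ROLE_MAP = {"system": "system", "human": "user", "gpt": "assistant"}

def sharegpt_to_messages(record):
    """Convert one ShareGPT record into role/content messages."""
    return [
        {"role": ROLE_MAP[turn["from"]], "content": turn["value"]}
        for turn in record["conversations"]
    ]

record = {
    "conversations": [
        {"from": "human", "value": "How to become a healthier person?"},
        {"from": "gpt", "value": "Start with consistent sleep and regular exercise."},
    ]
}

messages = sharegpt_to_messages(record)
```

The resulting `messages` list has the same shape as the one passed to the pipeline in the usage example below.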
```sh
pip install transformers==4.51.1 torch==2.6.0
```

```python
from transformers import pipeline, TextStreamer
import torch

generate_text = pipeline(
    "text-generation",
    model="Felladrin/Minueza-2-96M-Instruct-Variant-04",
    device=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
)

messages = [
    {
        "role": "user",
        "content": "How to become a healthier person?",
    },
]

generate_text(
    generate_text.tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    ),
    streamer=TextStreamer(generate_text.tokenizer, skip_special_tokens=True),
    max_new_tokens=512,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    top_k=0,
    min_p=0.1,
    repetition_penalty=1.17,
)
```
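The `min_p=0.1` setting filters the sampling pool relative to the most likely token: a candidate survives only if its probability is at least 10% of the top token's probability. A minimal, self-contained sketch of that filtering step (plain Python, not the Transformers implementation):

```python
import math

def min_p_filter(logits, min_p):
    """Mask logits whose probability falls below min_p times the
    probability of the most likely token (set them to -inf)."""
    # Numerically stable softmax over the raw logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Threshold scales with the top token's probability.
    threshold = min_p * max(probs)
    return [
        logit if p >= threshold else float("-inf")
        for logit, p in zip(logits, probs)
    ]

# Low-probability tokens are dropped; high-probability ones survive.
filtered = min_p_filter([4.0, 3.0, 0.0, -2.0], min_p=0.1)
```

Unlike a fixed `top_p` cutoff, this threshold adapts to the model's confidence: when the distribution is peaked, fewer tokens survive; when it is flat, more do.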
The following hyperparameters were used during training:
This model is licensed under the Apache License 2.0.