HyperNovaSynth-12B

From the void where giants fall, a deeper silence erupts. Darker, heavier, stranger.
What follows is not light but gravity itself made song.
This is no ordinary flare, but the whisper of something vast unraveling.

πŸ”§ Recommended Sampling Settings:

- Temperature: 0.75 to 1.25
- Min P: 0.035
- Context length: stable at 12k tokens, with possible support for extended contexts
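Min-P keeps only tokens whose probability is at least `min_p` times the most likely token's probability, so the candidate pool shrinks when the model is confident and widens when it is uncertain. A minimal, dependency-free sketch of the filtering rule (the function name is illustrative, not part of any library API):

```python
import math

def min_p_filter(logits, min_p=0.035):
    # Convert raw logits to probabilities via a numerically stable softmax.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Keep token indices whose probability is at least min_p * max probability.
    threshold = min_p * max(probs)
    return [i for i, p in enumerate(probs) if p >= threshold]

# With these logits, only the two strongest tokens survive the cutoff.
kept = min_p_filter([5.0, 4.0, 1.0, -2.0], min_p=0.035)
```

In `transformers`, the equivalent behavior is enabled by passing `min_p` to `generate` (supported in recent versions); samplers like llama.cpp expose the same knob.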

πŸ’¬ Prompt Format

Supports ChatML-style messages. Example:

```
<|im_start|>user
Your question here.
<|im_end|>
<|im_start|>assistant
```
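In practice the tokenizer's chat template produces this format for you (see the Usage section below). Purely for illustration, here is a hand-rolled sketch of the ChatML rendering; the helper name is hypothetical:

```python
def to_chatml(messages, add_generation_prompt=True):
    # Render each message as <|im_start|>role\ncontent<|im_end|>.
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        parts.append("<|im_start|>assistant")
    return "\n".join(parts)

prompt = to_chatml([{"role": "user", "content": "Your question here."}])
```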

HyperNovaSynth-12B is a merge of the following models using LazyMergekit:

* Marcjoni/SuperNovaSynth-12B (base)
* yamatazen/LorablatedStock-12B

🧩 Configuration

```yaml
merge_method: slerp
base_model: Marcjoni/SuperNovaSynth-12B
models:
  - model: yamatazen/LorablatedStock-12B
parameters:
  t:
    - filter: "mlp"
      value: 0.75
    - filter: "attn"
      value: 0.35
    - value: 0.55
dtype: bfloat16
```
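slerp (spherical linear interpolation) blends each pair of weight tensors along the arc between them rather than along a straight line, with `t` controlling the mix per filter above: 0.75 for MLP layers, 0.35 for attention, 0.55 elsewhere, where `t = 0` keeps the base model's weights and `t = 1` the other model's. A minimal sketch of the interpolation on plain vectors (illustrative only, not mergekit's actual implementation):

```python
import math

def slerp(t, v0, v1):
    # Spherical linear interpolation between two weight vectors.
    dot = sum(a * b for a, b in zip(v0, v1))
    norm0 = math.sqrt(sum(a * a for a in v0))
    norm1 = math.sqrt(sum(b * b for b in v1))
    cos_theta = max(-1.0, min(1.0, dot / (norm0 * norm1)))
    theta = math.acos(cos_theta)
    if theta < 1e-6:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# t=0 returns the first (base) vector; t=1 returns the second.
print(slerp(0.0, [1.0, 0.0], [0.0, 1.0]))  # [1.0, 0.0]
```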

πŸ’» Usage

```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "Marcjoni/HyperNovaSynth-12B"
messages = [{"role": "user", "content": "What is a large language model?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=1, top_k=0, top_p=1)
print(outputs[0]["generated_text"])
```
Model size: 12.2B parameters Β· Tensor type: BF16 Β· Safetensors