Carasique-v0.1 / mergekit_config.yml
models:
  - model: unsloth/Mistral-Nemo-Instruct-2407
    parameters:
      weight: 0.1
      density: 0.4
  - model: nlpguy/StableProse
    parameters:
      weight: 0.12
      density: 0.5
  - model: Sao10K/MN-12B-Lyra-v2a1
    parameters:
      weight: 0.2
      density: 0.6
  - model: TheDrummer/Rocinante-12B-v1
    parameters:
      weight: 0.25
      density: 0.7
  - model: nbeerbower/mistral-nemo-gutenberg-12B-v3
    parameters:
      weight: 0.33
      density: 0.8
merge_method: della_linear
base_model: unsloth/Mistral-Nemo-Base-2407
parameters:
  epsilon: 0.05
  lambda: 1
dtype: bfloat16
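
For context, this config describes a DELLA-linear merge: five Mistral-Nemo-family finetunes are merged on top of unsloth/Mistral-Nemo-Base-2407 and cast to bfloat16. Roughly, weight is each model's mixing coefficient, density the fraction of delta parameters retained after pruning, epsilon the spread of the magnitude-ranked drop probabilities around each density, and lambda a final scaling factor on the merged deltas. Assuming the standard mergekit CLI, a config like this would typically be executed with mergekit-yaml; the output directory name and the optional flags below are illustrative, not part of the uploaded config:

mergekit-yaml mergekit_config.yml ./Carasique-v0.1 --cuda --lazy-unpickle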