# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the DARE TIES merge method, with mistralai/Mistral-Nemo-Base-2407 as the base.
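DARE TIES works on each model's task vector (its delta from the base model): it randomly drops a fraction of the delta parameters, rescales the survivors, and then uses TIES sign election to resolve conflicts between models before adding the result back to the base. The `density` and `weight` values in the configuration below control this step. As a minimal illustrative sketch (not mergekit's actual implementation), the DARE drop-and-rescale keeps each element with probability `density` and rescales by `1/density` so the expected delta is unchanged:

```python
# Minimal sketch of DARE sparsification, assuming a task vector
# delta = finetuned_weights - base_weights. Illustrative only; the real
# logic lives inside mergekit's dare_ties merge method.
import torch

def dare_sparsify(delta: torch.Tensor, density: float) -> torch.Tensor:
    """Keep each element with probability `density`, rescale survivors by 1/density."""
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density
```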
### Models Merged
The following models were included in the merge:
- PocketDoc/Dans-SakuraKaze-V1.0.0-12b
- mistralai/Mistral-Nemo-Instruct-2407
- TheDrummer/Rocinante-12B-v1.1
- ReadyArt/Forgotten-Safeword-12B-3.6
- PocketDoc/Dans-DangerousWinds-V1.1.0-12b
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: mistralai/Mistral-Nemo-Base-2407
    # No parameters necessary for base model
  - model: mistralai/Mistral-Nemo-Instruct-2407
    parameters:
      density: 0.50 # Mid-level density for general instruction tuning
      weight: 0.20 # Moderate influence for balanced instruction-following
  - model: TheDrummer/Rocinante-12B-v1.1 # Highest influence (strong reasoning/language balance)
    parameters:
      density: 0.60 # Higher density for deeper reasoning and coherence
      weight: 0.30 # Primary influence model
  - model: ReadyArt/Forgotten-Safeword-12B-3.6 # Creativity & conversational nuance
    parameters:
      density: 0.50 # Balanced density for creative and nuanced responses
      weight: 0.15 # Mid-tier influence
  - model: PocketDoc/Dans-SakuraKaze-V1.0.0-12b # Second-highest influence (natural conversation flow)
    parameters:
      density: 0.55 # Slightly high density for fluid conversation
      weight: 0.20 # Substantial influence in dialogue
  - model: PocketDoc/Dans-DangerousWinds-V1.1.0-12b # Reinforcement of strong responses
    parameters:
      density: 0.60 # High density for reinforcement learning-style response shaping
      weight: 0.15 # Secondary reinforcement
merge_method: dare_ties
base_model: mistralai/Mistral-Nemo-Base-2407
parameters:
  normalize: true # Ensures weight distribution remains balanced
  int8_mask: true # Reduces memory usage while keeping precision
dtype: bfloat16 # Optimal balance between performance and efficiency
```
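Assuming mergekit is installed (`pip install mergekit`), saving the configuration above as `config.yaml` and running `mergekit-yaml config.yaml ./merged-model --cuda` should reproduce the merge. Below is a minimal sketch of loading and prompting the result with transformers; `./merged-model` is a placeholder for wherever mergekit wrote the output:

```python
# Minimal usage sketch, assuming the merge was written to ./merged-model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./merged-model")
model = AutoModelForCausalLM.from_pretrained(
    "./merged-model",
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "Write a short scene set on a rainy night."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```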