# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the DARE TIES merge method, with unsloth/Mistral-Small-Instruct-2409 as the base model.
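In broad strokes, DARE randomly drops a fraction (1 − density) of each model's parameter delta from the base and rescales the surviving entries by 1/density, and TIES then resolves sign conflicts between the models' contributions before summing them onto the base. The snippet below is a minimal NumPy sketch of that idea for a single tensor, not mergekit's actual implementation: the function name is made up for illustration, `weights`/`densities`/`lam` stand in for the per-model `weight`, `density`, and top-level `lambda` values from the configuration, and mergekit-specific details (such as the `epsilon` parameter) are omitted.

```python
import numpy as np

def dare_ties_merge(base, finetuned, weights, densities, lam=1.0, seed=0):
    """Illustrative DARE-TIES merge of a single parameter tensor (sketch only)."""
    rng = np.random.default_rng(seed)
    contributions = []
    for ft, w, d in zip(finetuned, weights, densities):
        delta = ft - base                        # task vector relative to the base model
        keep = rng.random(delta.shape) < d       # DARE: drop (1 - density) of the entries
        delta = np.where(keep, delta / d, 0.0)   # rescale survivors by 1/density
        contributions.append(w * delta)          # apply the per-model merge weight
    stacked = np.stack(contributions)
    # TIES-style sign election: keep only contributions whose sign agrees with
    # the sign of the weighted sum across models, then sum the survivors.
    elected_sign = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected_sign
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0)
    return base + lam * merged_delta             # lambda scales the merged delta
```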
### Models Merged
The following models were included in the merge:
- Kaoeiri/MS-Inky-2409-22B
- ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1
- qingy2024/MwM-22B-Instruct
- allura-org/MS-Meadowlark-22B
- Kaoeiri/MS_fujin-2409-22B
- Darkknight535/MS-Moonlight-22B-v3
- invisietch/MiS-Firefly-v0.2-22B
- concedo/Beepo-22B
- spow12/ChatWaifu_v2.0_22B
- InferenceIllusionist/SorcererLM-22B
- crestf411/MS-sunfall-v0.7.0
- Kaoeiri/MS_dampf-2409-22B
- Kaoeiri/Magnum-v4-Cydonia-vXXX-22B
- Kaoeiri/MS_Moingooistral-2409-22B
- Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-22B
- Envoid/Mistral-Small-NovusKyver
- Kaoeiri/MS_a-coolyte-2409-22B
- Gryphe/Pantheon-RP-1.6.2-22b-Small
- DigitalSouls/BlackSheep-DigitalSoul-22B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Kaoeiri/MS_Moingooistral-2409-22B
    parameters:
      weight: 0.10
      density: 0.72
  - model: Kaoeiri/Magnum-v4-Cydonia-vXXX-22B
    parameters:
      weight: 0.72
      density: 0.80
  - model: Kaoeiri/MS-Inky-2409-22B
    parameters:
      weight: 0.24
      density: 0.70
  - model: Gryphe/Pantheon-RP-1.6.2-22b-Small
    parameters:
      weight: 0.32
      density: 0.76
  - model: DigitalSouls/BlackSheep-DigitalSoul-22B
    parameters:
      weight: 0.12
      density: 0.68
  - model: InferenceIllusionist/SorcererLM-22B
    parameters:
      weight: 0.10
      density: 0.70
  - model: Envoid/Mistral-Small-NovusKyver
    parameters:
      weight: 0.08
      density: 0.70
  - model: concedo/Beepo-22B
    parameters:
      weight: 0.35
      density: 0.78
  - model: crestf411/MS-sunfall-v0.7.0
    parameters:
      weight: 0.14
      density: 0.68
  - model: Kaoeiri/MS_a-coolyte-2409-22B
    parameters:
      weight: 0.10
      density: 0.65
  - model: invisietch/MiS-Firefly-v0.2-22B
    parameters:
      weight: 0.12
      density: 0.68
  - model: Kaoeiri/MS_fujin-2409-22B
    parameters:
      weight: 0.08
      density: 0.62
  - model: Kaoeiri/MS_dampf-2409-22B
    parameters:
      weight: 0.10
      density: 0.62
  - model: qingy2024/MwM-22B-Instruct
    parameters:
      weight: 0.18
      density: 0.72
  - model: ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1
    parameters:
      weight: 0.12
      density: 0.65
  - model: Darkknight535/MS-Moonlight-22B-v3
    parameters:
      weight: 0.10
      density: 0.65
  - model: allura-org/MS-Meadowlark-22B
    parameters:
      weight: 0.16
      density: 0.70
  - model: Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V1-22B
    parameters:
      weight: 0.16
      density: 0.62
  - model: spow12/ChatWaifu_v2.0_22B
    parameters:
      weight: 0.20
      density: 0.65
merge_method: dare_ties
base_model: unsloth/Mistral-Small-Instruct-2409
parameters:
  density: 0.95
  epsilon: 0.12
  lambda: 1.22
dtype: bfloat16
```
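The merged model can be loaded like any other Mistral-Small checkpoint. The snippet below is a minimal sketch using the standard transformers API, assuming the repository id shown for this card; at 22B parameters in bfloat16 it needs roughly 45 GB of accelerator memory.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Kaoeiri/MS-MagpantheonselRP-22B-14.1-Recalculated"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",
)

# Build a chat-formatted prompt and generate a short reply.
messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```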