# Karasik 0.1

## Overview
A somewhat experimental merge of several Mistral Small 22B models.
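A minimal usage sketch with Hugging Face Transformers is shown below. The sampling settings are illustrative rather than recommended values, and the chat template is assumed to come from the base model's tokenizer.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nohobby/Karasik-22B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge was produced in bfloat16
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short scene set on a night train."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```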
## Quants

## Merge Details

### Merge Method
This model was merged using the della_linear merge method, with ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1 as the base model.
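For intuition: in della_linear, each contributing model's delta from the base (its "task vector") is stochastically pruned, with drop probabilities modulated by parameter magnitude, the surviving entries are rescaled, and a weighted sum of the pruned deltas is added back onto the base weights. The sketch below is a loose illustration of that idea, not mergekit's actual implementation; the helper name and details are hypothetical.

```python
import torch

def della_linear_sketch(base, deltas, weights, densities, epsilon=0.04, lam=1.05):
    """Loose illustration of della_linear-style merging (hypothetical helper,
    not mergekit's code). Each task vector is stochastically pruned with
    keep-probabilities centred on its `density` and shifted by up to
    +/- `epsilon` according to parameter magnitude; survivors are rescaled
    so the pruned delta is unbiased in expectation."""
    merged = base.clone()
    for delta, w, d in zip(deltas, weights, densities):
        # Rank entries by magnitude: larger deltas get higher keep-probability.
        flat = delta.abs().flatten()
        ranks = flat.argsort().argsort().float() / max(flat.numel() - 1, 1)
        keep_p = (d - epsilon + 2.0 * epsilon * ranks).clamp(1e-3, 1.0)
        mask = torch.bernoulli(keep_p).reshape(delta.shape)
        # Rescale survivors by 1/p_keep, then add the weighted, scaled delta.
        pruned = delta * mask / keep_p.reshape(delta.shape)
        merged += lam * w * pruned
    return merged
```

The `epsilon` and `lam` defaults here mirror the top-level `epsilon` and `lambda` parameters in the configuration below.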
### Models Merged
The following models were included in the merge:
- [rAIfle/Acolyte-22B](https://huggingface.co/rAIfle/Acolyte-22B)
- [byroneverson/Mistral-Small-Instruct-2409-abliterated](https://huggingface.co/byroneverson/Mistral-Small-Instruct-2409-abliterated)
- [DazzlingXeno/Cydonian-Gutenberg](https://huggingface.co/DazzlingXeno/Cydonian-Gutenberg)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1
parameters:
  epsilon: 0.04
  lambda: 1.05
  int8_mask: true
  rescale: true
  normalize: false
dtype: bfloat16
tokenizer_source: base
merge_method: della_linear
models:
  - model: ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1
    parameters:
      weight: [0.2, 0.3, 0.2, 0.3, 0.2]
      density: [0.45, 0.55, 0.45, 0.55, 0.45]
  - model: rAIfle/Acolyte-22B
    parameters:
      weight: [0.01768, -0.01675, 0.01285, -0.01696, 0.01421]
      density: [0.6, 0.4, 0.5, 0.4, 0.6]
  - model: byroneverson/Mistral-Small-Instruct-2409-abliterated
    parameters:
      weight: [0.208, 0.139, 0.139, 0.139, 0.208]
      density: [0.7]
  - model: DazzlingXeno/Cydonian-Gutenberg
    parameters:
      weight: [0.33]
      density: [0.45, 0.55, 0.45, 0.55, 0.45]
```
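A merge like this can be reproduced with mergekit, either via its CLI (`mergekit-yaml karasik.yml ./output`) or its Python API. The snippet below is a sketch following mergekit's documented Python interface; the file paths are placeholders.

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "karasik.yml"  # the configuration above, saved to disk
OUTPUT_PATH = "./Karasik-22B-v0.1"

# Parse and validate the merge configuration.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # GPU speeds the merge up considerably
        copy_tokenizer=True,             # write the tokenizer into the output
        lazy_unpickle=True,              # lower peak memory while loading shards
    ),
)
```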