---
base_model:
- unsloth/gemma-2-9b-it
- anthracite-org/magnum-v3-9b-customgemma2
- rtzr/ko-gemma-2-9b-it
- flammenai/Mahou-1.3-gemma2-9B
- AlicanKiraz0/SenecaLLM_x_gemma-2-9b-CyberSecurity
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [unsloth/gemma-2-9b-it](https://huggingface.co/unsloth/gemma-2-9b-it) as the base.
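For intuition, here is a simplified sketch of the TIES procedure (trim, elect sign, disjoint merge) on flat NumPy vectors. It illustrates the algorithm from the paper under the `normalize: true` convention used in the configuration below; it is not mergekit's implementation, and all names in it are invented for the example.

```python
# Simplified TIES merge on flat parameter vectors (illustration only).
import numpy as np

def ties_merge(base, tuned, densities, weights):
    """Trim each task vector, elect a per-parameter sign, then take a
    weighted mean over the deltas that agree with the elected sign."""
    deltas = []
    for ft, density in zip(tuned, densities):
        delta = ft - base                              # task vector
        k = max(int(round(density * delta.size)), 1)   # entries to keep
        thresh = np.sort(np.abs(delta))[-k]
        # Trim: zero out everything below the top-k magnitude threshold.
        deltas.append(np.where(np.abs(delta) >= thresh, delta, 0.0))
    deltas = np.stack(deltas)                          # (n_models, n_params)
    w = np.asarray(weights)[:, None]
    # Elect: sign of the weighted sum of trimmed deltas, per parameter.
    sign = np.sign((w * deltas).sum(axis=0))
    # Disjoint merge: keep only deltas whose sign matches the elected one,
    # normalizing by the total weight of the agreeing models.
    agree = (np.sign(deltas) == sign) & (deltas != 0)
    num = (w * np.where(agree, deltas, 0.0)).sum(axis=0)
    den = (w * agree).sum(axis=0)
    return base + np.where(den > 0, num / np.maximum(den, 1e-12), 0.0)

# Tiny demo on random vectors standing in for model weights.
rng = np.random.default_rng(0)
base = np.zeros(8)
tuned = [base + 0.1 * rng.standard_normal(8) for _ in range(3)]
print(ties_merge(base, tuned, densities=[0.5, 0.45, 0.4], weights=[0.3, 0.4, 0.1]))
```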
### Models Merged
The following models were included in the merge:
* [anthracite-org/magnum-v3-9b-customgemma2](https://huggingface.co/anthracite-org/magnum-v3-9b-customgemma2)
* [rtzr/ko-gemma-2-9b-it](https://huggingface.co/rtzr/ko-gemma-2-9b-it)
* [flammenai/Mahou-1.3-gemma2-9B](https://huggingface.co/flammenai/Mahou-1.3-gemma2-9B)
* [AlicanKiraz0/SenecaLLM_x_gemma-2-9b-CyberSecurity](https://huggingface.co/AlicanKiraz0/SenecaLLM_x_gemma-2-9b-CyberSecurity)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: unsloth/gemma-2-9b-it
merge_method: ties
slices:
  - sources:
      - model: unsloth/gemma-2-9b-it
        layer_range: [0, 42]
        parameters:
          density: 0.5
          weight: 0.3
      - model: rtzr/ko-gemma-2-9b-it
        layer_range: [0, 42]
        parameters:
          density: 0.45
          weight: 0.4
      - model: AlicanKiraz0/SenecaLLM_x_gemma-2-9b-CyberSecurity
        layer_range: [0, 42]
        parameters:
          density: 0.4
          weight: 0.1
      - model: anthracite-org/magnum-v3-9b-customgemma2
        layer_range: [0, 42]
        parameters:
          density: 0.45
          weight: 0.15
      - model: flammenai/Mahou-1.3-gemma2-9B
        layer_range: [0, 42]
        parameters:
          density: 0.5
          weight: 0.05
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
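In this configuration, `density` is the fraction of each model's task vector kept after magnitude trimming, and `weight` scales its contribution to the sign election and final merge; `normalize: true` rescales the merged deltas by the total weight of the contributing models. The merge can be reproduced by saving the configuration to a file and running mergekit's CLI, e.g. `mergekit-yaml config.yaml ./merged-model`. Below is a minimal sketch of loading the resulting checkpoint with transformers; the model path is a placeholder, and the prompt is only illustrative.

```python
# Usage sketch: load the merged Gemma-2 checkpoint with transformers.
# "./merged-model" is a placeholder for this repo's id or a local merge output.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged-model"  # placeholder path
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",
)

# Gemma-2 instruct models ship a chat template, so apply it for prompting.
messages = [{"role": "user", "content": "Summarize TIES merging in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```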