---
base_model:
- sometimesanotion/Qwenvergence-14B-v13-Prose-DS
- Sao10K/14B-Qwen2.5-Kunou-v1
- deepcogito/cogito-v1-preview-qwen-14B
library_name: transformers
tags:
- mergekit
- merge
---
|
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
|
|
|
## Merge Details

### Merge Method

This model was merged using the [Model Breadcrumbs](https://arxiv.org/abs/2312.06795) merge method, with [Sao10K/14B-Qwen2.5-Kunou-v1](https://huggingface.co/Sao10K/14B-Qwen2.5-Kunou-v1) as the base.
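For context, Model Breadcrumbs forms a task vector (fine-tuned minus base weights) for each model, then drops both the largest-magnitude outliers (controlled by `gamma`) and the smallest-magnitude entries, keeping roughly a `density` fraction in between, before the weighted result is scaled by `lambda` and added back onto the base. A rough NumPy sketch of the masking step, not mergekit's actual implementation:

```python
import numpy as np

def breadcrumbs_mask(task_vector, density=0.5, gamma=0.06):
    """Sketch of Model Breadcrumbs sparsification: zero out the
    top `gamma` fraction of entries by magnitude (outliers) and the
    smallest-magnitude entries, keeping a middle band of roughly
    `density` * n entries."""
    n = task_vector.size
    order = np.argsort(np.abs(task_vector))  # ascending by magnitude
    top_cut = int(n * gamma)                 # outliers dropped from the top
    num_keep = int(n * density)              # mid-magnitude entries kept
    hi = n - top_cut
    lo = max(hi - num_keep, 0)
    keep = np.zeros(n, dtype=bool)
    keep[order[lo:hi]] = True
    return task_vector * keep

tv = np.array([0.01, -5.0, 0.2, -0.3, 0.5, 0.05, -0.1, 1.0, -0.02, 0.15])
masked = breadcrumbs_mask(tv, density=0.5, gamma=0.1)  # -5.0 outlier zeroed
```

The merged weights are then roughly `base + lambda * sum(weight_i * masked_tau_i)` over the non-base models.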
|
|
|
### Models Merged

The following models were included in the merge:

* [sometimesanotion/Qwenvergence-14B-v13-Prose-DS](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v13-Prose-DS)
* [deepcogito/cogito-v1-preview-qwen-14B](https://huggingface.co/deepcogito/cogito-v1-preview-qwen-14B)
|
|
|
### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Sao10K/14B-Qwen2.5-Kunou-v1
  - model: sometimesanotion/Qwenvergence-14B-v13-Prose-DS
    parameters:
      density: [0.16, 0.26, 0.36, 0.46, 0.56, 0.46, 0.36, 0.26, 0.16]
      weight: [0.166, 0.496, 0.496, 0.166, 0.166, 0.496, 0.496, 0.166]
  - model: deepcogito/cogito-v1-preview-qwen-14B
    parameters:
      density: [0.56, 0.46, 0.36, 0.26, 0.16, 0.26, 0.36, 0.46, 0.56]
      weight: [0.496, 0.166, 0.166, 0.496, 0.496, 0.166, 0.166, 0.496]
merge_method: breadcrumbs
base_model: Sao10K/14B-Qwen2.5-Kunou-v1
parameters:
  gamma: 0.06
  lambda: 0.96
tokenizer_source: base
dtype: bfloat16
```
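One non-obvious detail in the configuration: `density` and `weight` are given as lists, which mergekit treats as gradients interpolated across the layer stack rather than literal per-layer values. A minimal sketch of that idea, assuming simple linear interpolation over normalized layer depth (the layer count of 48 is an assumption here, not taken from the config):

```python
import numpy as np

def interpolate_gradient(anchor_values, num_layers):
    """Spread a short list of anchor values (like the density/weight
    gradients above) across num_layers layers by linear interpolation."""
    anchor_pos = np.linspace(0.0, 1.0, num=len(anchor_values))
    layer_pos = np.linspace(0.0, 1.0, num=num_layers)
    return np.interp(layer_pos, anchor_pos, anchor_values)

# The Prose-DS density gradient: low at the ends, peaking mid-stack.
density = [0.16, 0.26, 0.36, 0.46, 0.56, 0.46, 0.36, 0.26, 0.16]
per_layer_density = interpolate_gradient(density, 48)
```

Note that the cogito model's gradients are the mirror image of the Prose-DS ones, so it contributes most at the ends of the layer stack where Prose-DS contributes least.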
|
|