---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---

# merged_model_output

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
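
To try the merged model, here is a minimal usage sketch with `transformers` (the `./merged_model_output` path is an assumption; point it at wherever the merged weights were saved, or at a Hub repo id):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged_model_output"  # assumption: local directory holding the merged weights

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the config below
    device_map="auto",
)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```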

## Merge Details

### Merge Method

This model was merged using the [DELLA](https://arxiv.org/abs/2406.11617) merge method, with /media/administrator/oiseauxai1data/modelout/Smart-base-v2 as the base model.
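
DELLA (Drop and rEscaLe via sampLing with mAgnitude) prunes each model's delta from the base using magnitude-ranked drop probabilities, rescales the surviving deltas, and then merges them. Below is an illustrative sketch of that drop-and-rescale step on a single tensor; it mirrors the idea from the paper, not mergekit's exact implementation:

```python
import torch

def della_prune(delta: torch.Tensor, density: float = 0.5, epsilon: float = 0.15) -> torch.Tensor:
    """Drop-and-rescale one delta tensor, DELLA-style (illustrative only)."""
    flat = delta.flatten()
    n = flat.numel()
    # Rank parameters by magnitude: 0 = smallest, 1 = largest.
    ranks = flat.abs().argsort().argsort().float() / max(n - 1, 1)
    # Higher-magnitude deltas get lower drop probabilities, spread by +/- epsilon/2
    # around the base drop rate (1 - density).
    drop_p = (1.0 - density + epsilon / 2 - epsilon * ranks).clamp(0.0, 1.0)
    keep = torch.rand_like(flat) >= drop_p
    # Rescale survivors by 1 / (1 - p) so the delta's expected value is preserved.
    pruned = torch.where(keep, flat / (1.0 - drop_p).clamp_min(1e-8), torch.zeros_like(flat))
    return pruned.reshape(delta.shape)

# Example: prune the difference between fine-tuned and base weights.
base_w = torch.randn(4, 4)
tuned_w = base_w + 0.1 * torch.randn(4, 4)
merged_delta = della_prune(tuned_w - base_w, density=0.5, epsilon=0.15)
```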

### Models Merged

The following models were included in the merge:

* /media/administrator/oiseauxai1data/modelout/Story-Base-V2
* /media/administrator/oiseauxai1data/modelout/Middle-Base-V2
* /media/administrator/oiseauxai1data/modelout/Dark-Base-V2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
# --- Mergekit example: della ---
# Method: DELLA (Drop and rEscaLe via sampLing with mAgnitude).
# Prunes each model's deltas with magnitude-based drop probabilities,
# rescales the survivors, and combines them into the base model.

base_model: /media/administrator/oiseauxai1data/modelout/Smart-base-v2 # The foundational model
models:
  - model: /media/administrator/oiseauxai1data/modelout/Smart-base-v2
    parameters:
      weight: [0.4, 0.2, 0.2, 0.2] # Layer-wise weight gradient; a list spreads values across layer groups (e.g. [0.1, 0.1, 0.1, 0.2, 0.5])
      density: 0.50 # Fraction of this model's delta parameters to retain
      epsilon: 0.15 # Maximum magnitude-based deviation of drop probability around 1 - density
  - model: /media/administrator/oiseauxai1data/modelout/Dark-Base-V2
    parameters:
      weight: [0.1, 0.2, 0.3, 0.4] # Layer-wise weight gradient across layer groups
      density: 0.50 # Fraction of this model's delta parameters to retain
      epsilon: 0.15 # Maximum magnitude-based deviation of drop probability around 1 - density
  - model: /media/administrator/oiseauxai1data/modelout/Story-Base-V2
    parameters:
      weight: [0.2, 0.3, 0.2, 0.3] # Layer-wise weight gradient across layer groups
      density: 0.50 # Fraction of this model's delta parameters to retain
      epsilon: 0.15 # Maximum magnitude-based deviation of drop probability around 1 - density
  - model: /media/administrator/oiseauxai1data/modelout/Middle-Base-V2
    parameters:
      weight: [0.3, 0.3, 0.3, 0.1] # Layer-wise weight gradient across layer groups
      density: 0.50 # Fraction of this model's delta parameters to retain
      epsilon: 0.15 # Maximum magnitude-based deviation of drop probability around 1 - density

model_name: L3.3-70B-Amalgamma-V8 # Name of the merge
dtype: float32 # Data type used during the merge computation (float32, float16, bfloat16)
out_dtype: bfloat16 # Data type of the saved output weights (float32, float16, bfloat16)
merge_method: della
parameters:
  normalize: false # If true (default), weights are normalized to sum to 1; if false, absolute weights are used
  lambda: 1.1 # Single lambda scaling the final merged deltas

tokenizer_source: base # 'base', 'union', or a model path; 'union' can mix vocabularies, so use with care
chat_template: llama3 # Chat template to embed (chatml, llama3, etc.)
license: apache-2.0 # License type
```
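
To reproduce the merge from this configuration, mergekit can be driven from Python (a sketch following mergekit's documented API; the config filename and output path are placeholders):

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (placeholder filename).
with open("della_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged_model_output",  # where the merged weights are written
    options=MergeOptions(
        cuda=False,           # set True to merge on GPU
        copy_tokenizer=True,  # honor tokenizer_source from the config
        lazy_unpickle=True,   # reduces peak memory usage
    ),
)
```

The `mergekit-yaml` command-line entry point runs the same pipeline directly from the config file.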