---
base_model:
- nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
- nothingiisreal/MN-12B-Starcannon-v3
- anthracite-org/magnum-v4-12b
- Fizzarolli/MN-12b-Sunrose
library_name: transformers
tags:
- mergekit
- merge
---

I have no idea what I’m doing… if this causes the apocalypse, someone please let me know.

MN-12B-Inferor-v0.0 8.0bpw h8 EXL2

Includes a [measurement.json](https://huggingface.co/FuturisticVibes/MN-12B-Inferor-v0.0-8.0bpw-h8-exl2/tree/measurement) file for further quantization.

I’ll be taking a break until mid-2025 to recoup some funds. I might still do small models occasionally, but anything big will have to wait. See you all next year! 😊

Original Model: https://huggingface.co/Infermatic/MN-12B-Inferor-v0.0

# Original Model Card

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64be962a682660170326a4850/hCDyD1wBJyqISzTCrRKgT.png)

# Inferor

My first merge, yay!

#### This was made thanks to [infermatic.ai](https://infermatic.ai/)

Thanks to everyone who is trying it and giving me feedback. ily - svak

---

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [anthracite-org/magnum-v4-12b](https://huggingface.co/anthracite-org/magnum-v4-12b) as the base.
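For intuition, here is a toy, pure-Python sketch of the Model Stock idea for a single flattened weight tensor. This is not mergekit's implementation; `model_stock_merge` and its list-of-floats weights are illustrative simplifications, and the interpolation-ratio formula is my reading of the paper:

```python
import math

def model_stock_merge(base, finetuned):
    """Toy sketch of Model Stock for one weight tensor (a list of floats).

    Each fine-tuned model defines a "task vector" (its delta from the base).
    The deltas are averaged, and an interpolation ratio t, derived from their
    average pairwise cosine similarity, pulls the merge back toward the base
    when the fine-tunes disagree with each other.
    """
    k = len(finetuned)
    deltas = [[w - b for w, b in zip(ft, base)] for ft in finetuned]

    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)

    pairs = [cos(deltas[i], deltas[j])
             for i in range(k) for j in range(i + 1, k)]
    c = sum(pairs) / len(pairs)      # average pairwise cosine similarity
    t = k * c / ((k - 1) * c + 1)    # interpolation ratio from the paper

    avg = [b + sum(d[i] for d in deltas) / k for i, b in enumerate(base)]
    return [t * a + (1 - t) * b for a, b in zip(avg, base)]
```

Note the two limiting cases: identical fine-tunes give t = 1 (the merge is just their average), while orthogonal task vectors give t = 0 (the merge falls back to the base).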

### Models Merged

The following models were included in the merge:

* [nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2](https://huggingface.co/nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2)
* [nothingiisreal/MN-12B-Starcannon-v3](https://huggingface.co/nothingiisreal/MN-12B-Starcannon-v3)
* [Fizzarolli/MN-12b-Sunrose](https://huggingface.co/Fizzarolli/MN-12b-Sunrose)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: anthracite-org/magnum-v4-12b
dtype: bfloat16
merge_method: model_stock
slices:
- sources:
  - layer_range: [0, 40]
    model: Fizzarolli/MN-12b-Sunrose
  - layer_range: [0, 40]
    model: nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
  - layer_range: [0, 40]
    model: nothingiisreal/MN-12B-Starcannon-v3
  - layer_range: [0, 40]
    model: anthracite-org/magnum-v4-12b
```
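Read literally, the config above merges all four models across one slice spanning the full layer stack: every output layer draws on the same layer index from each source. A small sketch of that mapping (variable names are illustrative; this assumes mergekit's `layer_range` is half-open, covering layers 0–39):

```python
# Illustrative expansion of the slices config into a per-layer plan:
# each output layer merges the corresponding layer from all four sources.
sources = [
    "Fizzarolli/MN-12b-Sunrose",
    "nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2",
    "nothingiisreal/MN-12B-Starcannon-v3",
    "anthracite-org/magnum-v4-12b",
]
layer_range = (0, 40)  # assumed half-open, i.e. layers 0..39
plan = {layer: list(sources) for layer in range(*layer_range)}
```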