---
base_model:
- EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
- Sao10K/70B-L3.3-Cirrus-x1
- TheDrummer/Anubis-70B-v1
- nbeerbower/Llama-3.1-Nemotron-lorablated-70B
- Sao10K/L3.3-70B-Euryale-v2.3
- SicariusSicariiStuff/Negative_LLAMA_70B
library_name: transformers
tags:
- mergekit
- merge
license: llama3.3
---
# Originia-Llama3.3-70B
**An experimental merged model that focuses on storytelling and role-playing capabilities.**
A GGUF Q4_K_M imatrix quant is available at [k4yt3x/Originia-Llama3.3-70B-GGUF](https://huggingface.co/k4yt3x/Originia-Llama3.3-70B-GGUF).
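If you want to try the quant directly, below is a minimal sketch using the llama-cpp-python bindings. The glob filename, context size, and generation settings are assumptions, not values taken from this card; check the quant repo's file list for the actual GGUF filename.

```python
# Minimal sketch: load the Q4_K_M GGUF quant with llama-cpp-python.
# The filename glob and sampling settings below are assumptions.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="k4yt3x/Originia-Llama3.3-70B-GGUF",
    filename="*Q4_K_M*.gguf",  # glob pattern; adjust to the actual file name in the repo
    n_ctx=8192,                # context window; raise if you have the memory
    n_gpu_layers=-1,           # offload all layers to the GPU if possible
)

out = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "Narrate the opening scene of a heist in a rain-soaked neon city."}
    ],
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```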
> Where did the name come from?

I asked the model, and it named itself "Originia."
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method, with [nbeerbower/Llama-3.1-Nemotron-lorablated-70B](https://huggingface.co/nbeerbower/Llama-3.1-Nemotron-lorablated-70B) as the base model.
### Models Merged
The following models were included in the merge:
- [EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1](https://huggingface.co/EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1)
- [Sao10K/70B-L3.3-Cirrus-x1](https://huggingface.co/Sao10K/70B-L3.3-Cirrus-x1)
- [TheDrummer/Anubis-70B-v1](https://huggingface.co/TheDrummer/Anubis-70B-v1)
- [Sao10K/L3.3-70B-Euryale-v2.3](https://huggingface.co/Sao10K/L3.3-70B-Euryale-v2.3)
- [SicariusSicariiStuff/Negative_LLAMA_70B](https://huggingface.co/SicariusSicariiStuff/Negative_LLAMA_70B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: nbeerbower/Llama-3.1-Nemotron-lorablated-70B
merge_method: sce
dtype: bfloat16
models:
- model: EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
- model: Sao10K/L3.3-70B-Euryale-v2.3
- model: Sao10K/70B-L3.3-Cirrus-x1
- model: TheDrummer/Anubis-70B-v1
- model: SicariusSicariiStuff/Negative_LLAMA_70B
```
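To reproduce the merge, the configuration above can be saved to a file and passed to mergekit's `mergekit-yaml` CLI (e.g. `mergekit-yaml config.yaml ./output-dir`). Below is a rough sketch of the equivalent call through mergekit's Python API; it assumes a recent mergekit release, and the output path and option values are placeholders rather than the settings actually used for this model.

```python
# Rough reproduction sketch using mergekit's Python API (assumes a recent
# mergekit release); config.yaml holds the SCE configuration shown above.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Originia-Llama3.3-70B",  # output directory (placeholder)
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use the GPU for tensor ops if available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
        lazy_unpickle=True,              # reduce peak RAM while loading shards
    ),
)
```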