---
base_model:
- mergekit-community/because_im_bored_nsfw1
- aifeifei798/llama3-8B-DarkIdol-2.3-Uncensored-32K
- jondurbin/bagel-8b-v1.0
- NeverSleep/Llama-3-Lumimaid-8B-v0.1
- Undi95/Llama-3-LewdPlay-8B-evo
- Deev124/hermes-llama3-roleplay-4000-v1
- Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
- unsloth/Meta-Llama-3.1-8B
- DevsDoCode/LLama-3-8b-Uncensored
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [unsloth/Meta-Llama-3.1-8B](https://huggingface.co/unsloth/Meta-Llama-3.1-8B) as the base model.

### Models Merged

The following models were included in the merge:

* [mergekit-community/because_im_bored_nsfw1](https://huggingface.co/mergekit-community/because_im_bored_nsfw1)
* [aifeifei798/llama3-8B-DarkIdol-2.3-Uncensored-32K](https://huggingface.co/aifeifei798/llama3-8B-DarkIdol-2.3-Uncensored-32K)
* [jondurbin/bagel-8b-v1.0](https://huggingface.co/jondurbin/bagel-8b-v1.0)
* [NeverSleep/Llama-3-Lumimaid-8B-v0.1](https://huggingface.co/NeverSleep/Llama-3-Lumimaid-8B-v0.1)
* [Undi95/Llama-3-LewdPlay-8B-evo](https://huggingface.co/Undi95/Llama-3-LewdPlay-8B-evo)
* [Deev124/hermes-llama3-roleplay-4000-v1](https://huggingface.co/Deev124/hermes-llama3-roleplay-4000-v1)
* [Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2](https://huggingface.co/Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2)
* [DevsDoCode/LLama-3-8b-Uncensored](https://huggingface.co/DevsDoCode/LLama-3-8b-Uncensored)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: aifeifei798/llama3-8B-DarkIdol-2.3-Uncensored-32K
    parameters:
      density: 0.53
      weight: 0.14
  - model: NeverSleep/Llama-3-Lumimaid-8B-v0.1
    parameters:
      density: 0.75
      weight: 0.25
  - model: mergekit-community/because_im_bored_nsfw1
    parameters:
      density: 0.53
      weight: 0.14
  - model: jondurbin/bagel-8b-v1.0
    parameters:
      density: 0.53
      weight: 0.14
  - model: Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
    parameters:
      density: 0.53
      weight: 0.14
  - model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.81
      weight: 0.30
  - model: Deev124/hermes-llama3-roleplay-4000-v1
    parameters:
      density: 0.53
      weight: 0.14
  - model: DevsDoCode/LLama-3-8b-Uncensored
    parameters:
      density: 0.53
      weight: 0.14
merge_method: dare_ties
base_model: unsloth/Meta-Llama-3.1-8B
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
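To give an intuition for what `dare_ties` does with the `density` and `weight` values above: for each model, DARE randomly drops a fraction `1 - density` of the entries in its delta (fine-tuned weights minus base weights) and rescales the survivors by `1 / density`, and TIES then elects a majority sign per parameter across the weighted deltas, discarding entries that disagree before summing onto the base. The sketch below is illustrative only, assuming simple per-tensor NumPy arrays; the function names `dare_prune` and `ties_merge` are made up for this example and are not mergekit's API.

```python
import numpy as np

def dare_prune(delta, density, rng):
    """DARE step: randomly keep ~`density` of the delta entries and
    rescale the survivors by 1/density, preserving the expected value."""
    mask = rng.random(delta.shape) < density
    return delta * mask / density

def ties_merge(base, deltas, weights):
    """Simplified TIES step: weight each (pruned) delta, elect a majority
    sign per entry, zero out entries that disagree with it, and sum the
    survivors onto the base weights."""
    stacked = np.stack([w * d for w, d in zip(weights, deltas)])
    elected = np.sign(stacked.sum(axis=0))   # majority sign per entry
    agree = np.sign(stacked) == elected      # keep only agreeing entries
    return base + (stacked * agree).sum(axis=0)

# Toy usage with two "models" and the densities/weights from the config:
rng = np.random.default_rng(0)
base = np.zeros(8)
delta_a = rng.normal(size=8)   # stand-in for DarkIdol minus base
delta_b = rng.normal(size=8)   # stand-in for LewdPlay minus base
merged = ties_merge(
    base,
    [dare_prune(delta_a, 0.53, rng), dare_prune(delta_b, 0.81, rng)],
    [0.14, 0.30],
)
```

With `normalize: true`, mergekit additionally rescales the listed weights so they sum to 1 (here the raw weights total 1.39), so the values above express relative rather than absolute contributions.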