---
license: cc-by-4.0
language:
  - en
base_model:
  - TeeZee/Orca-2-13b_flat
  - NeverSleep/X-NoroChronos-13B
  - NeverSleep/Noromaid-13b-v0.3
  - KatyTheCutie/EstopianMaid-13B
  - Undi95/MLewdBoros-L2-13B
  - KoboldAI/LLaMA2-13B-Psyfighter2
  - KoboldAI/LLaMA2-13B-Erebus-v3
library_name: transformers
tags:
  - storywriting
  - text adventure
  - creative
  - story
  - writing
  - fiction
  - roleplaying
  - rp
  - mergekit
  - merge
---

![pic](https://huggingface.co/FallenMerick/Bionic-Vaquita-13B/resolve/main/Bionic-Vaquita.jpg)

# Bionic-Vaquita-13B

In the same vein as the legendary [Psyonic-Cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B), I have attempted to create a 13B model that is equal parts creative and chaotic, while still remaining coherent enough for roleplaying purposes.
Seven different Llama-2 13B models were hand-picked and merged via TIES to create three separate components for the final stack.
Emotional intelligence and coherence were the primary focus of the late-stage manual testing that led to the selection of this model.
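
Judging by the component names, each of the three components appears to be a three-way TIES merge of Orca-2 with two of the other hand-picked models. As a minimal sketch, the XNoroChronos-Orca2-Noromaid component might have been built with a mergekit config along these lines; the base model, density/weight values, and normalize setting below are assumptions for illustration only, and the actual configurations live in the component repositories:

```yaml
# Illustrative TIES merge for one component (NOT the exact config used).
# Assumed: a Llama-2-13b base and roughly uniform density/weight values.
models:
  - model: TeeZee/Orca-2-13b_flat
    parameters:
      density: 0.5   # fraction of delta parameters retained (assumed value)
      weight: 0.34   # contribution to the merged task vector (assumed value)
  - model: NeverSleep/X-NoroChronos-13B
    parameters:
      density: 0.5
      weight: 0.33
  - model: NeverSleep/Noromaid-13b-v0.3
    parameters:
      density: 0.5
      weight: 0.33
merge_method: ties
base_model: meta-llama/Llama-2-13b-hf  # assumed common base for the task vectors
parameters:
  normalize: true
dtype: bfloat16
```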

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

GGUF quants: https://huggingface.co/backyardai/Bionic-Vaquita-13B-GGUF

## Merge Details

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:
* [FallenMerick/XNoroChronos-Orca2-Noromaid](https://huggingface.co/FallenMerick/XNoroChronos-Orca2-Noromaid-13B)
* [FallenMerick/Psyfighter2-Orca2-Erebus3](https://huggingface.co/FallenMerick/Psyfighter2-Orca2-Erebus3-13B)
* [FallenMerick/EstopianMaid-Orca2-MlewdBoros](https://huggingface.co/FallenMerick/EstopianMaid-Orca2-MlewdBoros-13B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: FallenMerick/XNoroChronos-Orca2-Noromaid
        layer_range: [0, 16]
  - sources:
      - model: FallenMerick/EstopianMaid-Orca2-MlewdBoros
        layer_range: [16, 24]
  - sources:
      - model: FallenMerick/Psyfighter2-Orca2-Erebus3
        layer_range: [24, 40]
merge_method: passthrough
dtype: bfloat16
```

The three slices stack back to the full 40 layers of a Llama-2 13B model: the first 16 layers come from the XNoroChronos component, the middle 8 from the EstopianMaid component, and the final 16 from the Psyfighter2 component.