---
base_model:
- xdrshjr/llama3.2_1b_uncensored_5000_8epoch_lora
- suayptalha/FastLlama-3.2-1B-Instruct
- Kanonenbombe/llama3.2-1B-Function-calling
- BarraHome/llama3.2-1b-mla
- Mostafa8Mehrabi/llama-3.2-1b-Insomnia-ChatBot-merged
- danieliuspodb/llama-3.2-1b-extremist4
- Weyaxi/Einstein-v8-Llama3.2-1B
- Grogros/Grogros-dmWM-llama-3.2-1B-Instruct-WOHealth-Al4-NH-WO-d4-a0.2-v4-learnability_adv
- artificialguybr/LLAMA-3.2-1B-OpenHermes2.5
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [xdrshjr/llama3.2_1b_uncensored_5000_8epoch_lora](https://huggingface.co/xdrshjr/llama3.2_1b_uncensored_5000_8epoch_lora) as the base model.

### Models Merged

The following models were included in the merge:

* [suayptalha/FastLlama-3.2-1B-Instruct](https://huggingface.co/suayptalha/FastLlama-3.2-1B-Instruct)
* [Kanonenbombe/llama3.2-1B-Function-calling](https://huggingface.co/Kanonenbombe/llama3.2-1B-Function-calling)
* [BarraHome/llama3.2-1b-mla](https://huggingface.co/BarraHome/llama3.2-1b-mla)
* [Mostafa8Mehrabi/llama-3.2-1b-Insomnia-ChatBot-merged](https://huggingface.co/Mostafa8Mehrabi/llama-3.2-1b-Insomnia-ChatBot-merged)
* [danieliuspodb/llama-3.2-1b-extremist4](https://huggingface.co/danieliuspodb/llama-3.2-1b-extremist4)
* [Weyaxi/Einstein-v8-Llama3.2-1B](https://huggingface.co/Weyaxi/Einstein-v8-Llama3.2-1B)
* [Grogros/Grogros-dmWM-llama-3.2-1B-Instruct-WOHealth-Al4-NH-WO-d4-a0.2-v4-learnability_adv](https://huggingface.co/Grogros/Grogros-dmWM-llama-3.2-1B-Instruct-WOHealth-Al4-NH-WO-d4-a0.2-v4-learnability_adv)
* [artificialguybr/LLAMA-3.2-1B-OpenHermes2.5](https://huggingface.co/artificialguybr/LLAMA-3.2-1B-OpenHermes2.5)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: xdrshjr/llama3.2_1b_uncensored_5000_8epoch_lora
merge_method: model_stock
dtype: bfloat16
parameters:
  t: [0, 0.5, 1, 0.5, 0]
models:
- model: suayptalha/FastLlama-3.2-1B-Instruct
- model: Kanonenbombe/llama3.2-1B-Function-calling
- model: Weyaxi/Einstein-v8-Llama3.2-1B
- model: Mostafa8Mehrabi/llama-3.2-1b-Insomnia-ChatBot-merged
- model: artificialguybr/LLAMA-3.2-1B-OpenHermes2.5
- model: danieliuspodb/llama-3.2-1b-extremist4
- model: BarraHome/llama3.2-1b-mla
- model: Grogros/Grogros-dmWM-llama-3.2-1B-Instruct-WOHealth-Al4-NH-WO-d4-a0.2-v4-learnability_adv
```
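For intuition, Model Stock chooses its interpolation ratio between the base weights and the average of the fine-tuned weights from the geometry of the checkpoints (the angle between task vectors) rather than treating it as a tuned hyperparameter. Below is a toy, pure-Python sketch of that idea under stated assumptions: function names are illustrative, weights are flat lists rather than per-layer tensors, and the ratio formula `t = k·cosθ / (1 + (k−1)·cosθ)` follows our reading of the paper. It is not the mergekit implementation.

```python
import math


def cosine(u, v):
    """Cosine similarity between two flat weight vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)


def model_stock_ratio(cos_theta, k):
    """Interpolation ratio t for k fine-tuned models (illustrative formula)."""
    return (k * cos_theta) / (1 + (k - 1) * cos_theta)


def merge_model_stock(base, fine_tuned):
    """Merge fine-tuned weight vectors toward the base (toy version).

    Real merges operate layer by layer on tensors; here everything is
    a single flat list of floats.
    """
    k = len(fine_tuned)
    # Task vectors: fine-tuned weights minus the base weights.
    task = [[w - b for w, b in zip(ft, base)] for ft in fine_tuned]
    # Average pairwise cosine between task vectors estimates cos(theta).
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    cos_theta = sum(cosine(task[i], task[j]) for i, j in pairs) / len(pairs)
    t = model_stock_ratio(cos_theta, k)
    # Interpolate between the fine-tuned average and the base.
    avg = [sum(col) / k for col in zip(*fine_tuned)]
    return [t * a + (1 - t) * b for a, b in zip(avg, base)]
```

Two extremes show the behavior: if the task vectors are orthogonal (cosθ = 0, i.e. the fine-tunes disagree), t = 0 and the merge falls back to the base; if they are perfectly aligned (cosθ = 1), t = 1 and the merge is the plain average of the fine-tuned models.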