---
base_model:
- mergekit-community/MN-Chthonia-12B
- mistralai/Mistral-Nemo-Instruct-2407
- mergekit-community/MN-Nyx-Chthonia-12B
- mistralai/Mistral-Nemo-Base-2407
- mergekit-community/MN-Hekate-Deichteira-12B
- PocketDoc/Dans-PersonalityEngine-V1.1.0-12b
- mergekit-community/MN-Hekate-Ekklesia-12B
- mergekit-community/MN-Hekate-Enodia-12B
- HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [PocketDoc/Dans-PersonalityEngine-V1.1.0-12b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-V1.1.0-12b) as the base model.

### Models Merged

The following models were included in the merge:
* [mergekit-community/MN-Chthonia-12B](https://huggingface.co/mergekit-community/MN-Chthonia-12B)
* [mistralai/Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407)
* [mergekit-community/MN-Nyx-Chthonia-12B](https://huggingface.co/mergekit-community/MN-Nyx-Chthonia-12B)
* [mistralai/Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407)
* [mergekit-community/MN-Hekate-Deichteira-12B](https://huggingface.co/mergekit-community/MN-Hekate-Deichteira-12B)
* [mergekit-community/MN-Hekate-Ekklesia-12B](https://huggingface.co/mergekit-community/MN-Hekate-Ekklesia-12B)
* [mergekit-community/MN-Hekate-Enodia-12B](https://huggingface.co/mergekit-community/MN-Hekate-Enodia-12B)
* [HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407](https://huggingface.co/HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float32
out_dtype: bfloat16
merge_method: model_stock
base_model: PocketDoc/Dans-PersonalityEngine-V1.1.0-12b
models:
  - model: HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407
    parameters:
      weight: [0.3, 0.6, 1, 1.2, 2.4]
  - model: mergekit-community/MN-Chthonia-12B
    parameters:
      weight: [1.2, 1, 1.2, 0.6, 0.8]
  - model: mergekit-community/MN-Hekate-Deichteira-12B
    parameters:
      weight: [0.8, 0.6, 1.2, 3, 1.2]
  - model: mergekit-community/MN-Hekate-Ekklesia-12B
    parameters:
      weight: [0.4, 0.8, 1.4, 1.4, 1]
  - model: mergekit-community/MN-Hekate-Enodia-12B
    parameters:
      weight: [1, 1.2, 1.6, 0.8, 0.8]
  - model: mergekit-community/MN-Nyx-Chthonia-12B
    parameters:
      weight: [0.6, 0.8, 1, 1.4, 1.2]
  - model: mistralai/Mistral-Nemo-Base-2407
    parameters:
      weight: [3, 1, 0.6, 0.4]
  - model: mistralai/Mistral-Nemo-Instruct-2407
    parameters:
      weight: [0.2, 2, 0.6, 0.2]

chat_template: chatml
tokenizer:
  source: union
  tokens:
    "[INST]":
      source: mergekit-community/MN-Hekate-Deichteira-12B
      force: true
    "[/INST]":
      source: mergekit-community/MN-Hekate-Deichteira-12B
      force: true
    "<|im_start|>":
      source: PocketDoc/Dans-PersonalityEngine-V1.1.0-12b
      force: true
    "<|im_end|>":
      source: PocketDoc/Dans-PersonalityEngine-V1.1.0-12b
      force: true
```
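To reproduce a merge like this, the YAML above can be saved to a file and passed to mergekit. The sketch below is a minimal example assuming mergekit's documented Python entry points (`MergeConfiguration`, `run_merge`, `MergeOptions`); the file paths are placeholders, and the exact option set may differ between mergekit versions.

```python
# Minimal reproduction sketch (assumes the YAML config above has been saved
# to merge-config.yaml; paths and options are illustrative placeholders).
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "merge-config.yaml"  # the configuration shown above
OUTPUT_PATH = "./merged-model"    # where the merged weights are written

with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=False,           # set to True to run the merge on GPU
        copy_tokenizer=True,  # emit the union tokenizer described above
    ),
)
```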