---
base_model:
- Nitral-AI/Captain_BMO-12B
- pot99rta/CaptainMaid-12B-VioletMell-V0.420
library_name: transformers
tags:
- mergekit
- merge
---
# BMO-CaptianMaid-12B

![image/png](https://cdn-uploads.huggingface.co/production/uploads/636ea389fd9751c3d081e88e/bRUq0aF5mcJXmTVgeqeI8.png)

**Models Merged:**
1. Nitral-AI/Captain_BMO-12B
2. pot99rta/CaptainMaid-12B-VioletMell-V0.420

**Preset:** Use ChatML or Mistral. Phi also works, for some unknown reason. Both Phi and Mistral give interesting results; I quite like it with my settings.

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method with [pot99rta/CaptainMaid-12B-VioletMell-V0.420](https://huggingface.co/pot99rta/CaptainMaid-12B-VioletMell-V0.420) as the base.

### Models Merged

The following models were included in the merge:
* [Nitral-AI/Captain_BMO-12B](https://huggingface.co/Nitral-AI/Captain_BMO-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: pot99rta/CaptainMaid-12B-VioletMell-V0.420
    # no parameters necessary for base model
  - model: pot99rta/CaptainMaid-12B-VioletMell-V0.420
    parameters:
      density: 0.5
      weight: 0.5
  - model: Nitral-AI/Captain_BMO-12B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: pot99rta/CaptainMaid-12B-VioletMell-V0.420
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
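Since the card recommends the ChatML preset, here is a minimal sketch of what a ChatML-formatted prompt looks like. The `format_chatml` helper and the example messages are illustrative assumptions, not part of this model card; most frontends (or `tokenizer.apply_chat_template` in transformers) build this string for you.

```python
# Minimal ChatML prompt builder -- illustrative sketch, not from the model card.
def format_chatml(messages):
    """Wrap each message in ChatML <|im_start|>role ... <|im_end|> delimiters."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Leave the assistant turn open so the model generates from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

If your frontend already has a ChatML (or Mistral) instruct preset, select that instead of formatting prompts by hand.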