---
base_model:
- eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO
- Nondzu/Mistral-7B-Instruct-v0.2-code-ft
- xingyaoww/CodeActAgent-Mistral-7b-v0.1
- beowolx/MistralHermes-CodePro-7B-v1
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
language:
- en
---
# Gonzo-Code-7B
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO) as the base.
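In DARE, each fine-tuned model is reduced to a task vector (its delta from the base), a random fraction of each delta is dropped, and the survivors are rescaled; TIES then resolves sign conflicts between the rescaled deltas before they are added back onto the base. The toy PyTorch sketch below (a hypothetical `dare_ties_merge` helper, not mergekit's actual implementation) illustrates what the `density` and `weight` parameters in the configuration further down control:

```python
import torch

def dare_ties_merge(base, deltas, weights, density=0.53):
    """Toy per-tensor DARE-TIES merge (illustrative only).

    base    -- one tensor from the base model
    deltas  -- list of task vectors (fine-tuned tensor minus base tensor)
    weights -- per-model mixing weights (the `weight` values below)
    density -- fraction of each task vector kept by DARE (`density` below)
    """
    pruned = []
    for d in deltas:
        # DARE: randomly drop (1 - density) of each task vector and
        # rescale the survivors so the expected update is unchanged.
        mask = torch.rand_like(d) < density
        pruned.append(d * mask / density)

    # TIES: elect a per-entry majority sign, keep only contributions
    # that agree with it, and average the survivors.
    stacked = torch.stack([w * d for w, d in zip(weights, pruned)])
    elected = stacked.sum(dim=0).sign()
    agree = (stacked.sign() == elected).float()
    return base + (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)

# Example on random stand-ins for a single weight matrix:
base = torch.randn(4, 4)
deltas = [torch.randn(4, 4) * 0.1 for _ in range(3)]
merged = dare_ties_merge(base, deltas, weights=[0.4, 0.3, 0.3])
```

Here `density: 0.53` means roughly half of each task vector survives the random drop, and `weight` scales each model's vote in the sign election and its share of the final mix.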
### Models Merged

The following models were included in the merge:
- [Nondzu/Mistral-7B-Instruct-v0.2-code-ft](https://huggingface.co/Nondzu/Mistral-7B-Instruct-v0.2-code-ft)
- [xingyaoww/CodeActAgent-Mistral-7b-v0.1](https://huggingface.co/xingyaoww/CodeActAgent-Mistral-7b-v0.1)
- [beowolx/MistralHermes-CodePro-7B-v1](https://huggingface.co/beowolx/MistralHermes-CodePro-7B-v1)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO
    # No parameters necessary for base model
  - model: xingyaoww/CodeActAgent-Mistral-7b-v0.1
    parameters:
      density: 0.53
      weight: 0.4
  - model: Nondzu/Mistral-7B-Instruct-v0.2-code-ft
    parameters:
      density: 0.53
      weight: 0.3
  - model: beowolx/MistralHermes-CodePro-7B-v1
    parameters:
      density: 0.53
      weight: 0.3
merge_method: dare_ties
base_model: eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO
parameters:
  int8_mask: true
dtype: bfloat16
```
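To reproduce the merge, save the configuration above as `config.yml` and pass it to mergekit's `mergekit-yaml` entry point (for example `mergekit-yaml config.yml ./Gonzo-Code-7B`). Below is a minimal loading sketch with transformers; the local path is an assumption, since the hub repo id for this model is not stated in the card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Path is an assumption: point it at the mergekit output directory
# or at this model's Hugging Face repo id once published.
model_path = "./Gonzo-Code-7B"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "Write a Python function that reverses a linked list."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```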