---
base_model:
- maywell/Qwen2-7B-Multilingual-RP
- mergekit-community/mergekit-slerp-zwbqvcb
- bunnycore/Qwen-2.5-7b-TitanFusion-v3
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the Passthrough merge method.

### Models Merged

The following models were included in the merge:

* [maywell/Qwen2-7B-Multilingual-RP](https://huggingface.co/maywell/Qwen2-7B-Multilingual-RP)
* [mergekit-community/mergekit-slerp-zwbqvcb](https://huggingface.co/mergekit-community/mergekit-slerp-zwbqvcb)
* [bunnycore/Qwen-2.5-7b-TitanFusion-v3](https://huggingface.co/bunnycore/Qwen-2.5-7b-TitanFusion-v3)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
- sources:
  - layer_range: [0, 8]
    model: bunnycore/Qwen-2.5-7b-TitanFusion-v3
- sources:
  - layer_range: [4, 12]
    model: mergekit-community/mergekit-slerp-zwbqvcb
- sources:
  - layer_range: [9, 16]
    model: maywell/Qwen2-7B-Multilingual-RP
- sources:
  - layer_range: [13, 20]
    model: bunnycore/Qwen-2.5-7b-TitanFusion-v3
- sources:
  - layer_range: [17, 24]
    model: mergekit-community/mergekit-slerp-zwbqvcb
- sources:
  - layer_range: [21, 28]
    model: maywell/Qwen2-7B-Multilingual-RP
merge_method: passthrough
dtype: bfloat16
```
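Unlike weight-averaging methods, passthrough concatenates the listed layer slices in order, so overlapping ranges duplicate layers and the merged model ends up deeper than any of its sources. A minimal Python sketch (slice ranges copied from the config above) illustrating the resulting depth:

```python
# Each entry mirrors one `sources` block from the YAML config:
# (model, (start_layer, end_layer)) with end exclusive.
slices = [
    ("bunnycore/Qwen-2.5-7b-TitanFusion-v3", (0, 8)),
    ("mergekit-community/mergekit-slerp-zwbqvcb", (4, 12)),
    ("maywell/Qwen2-7B-Multilingual-RP", (9, 16)),
    ("bunnycore/Qwen-2.5-7b-TitanFusion-v3", (13, 20)),
    ("mergekit-community/mergekit-slerp-zwbqvcb", (17, 24)),
    ("maywell/Qwen2-7B-Multilingual-RP", (21, 28)),
]

# Passthrough simply stacks the slices, so the merged depth is the
# sum of the slice lengths, not the max index seen.
total_layers = sum(end - start for _, (start, end) in slices)
print(total_layers)  # 44 layers, versus 28 in each source model
```

Note that adjacent slices overlap by a few layers (e.g. `[0, 8]` followed by `[4, 12]`), a common frankenmerge pattern intended to smooth the transition between donor models.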