---
base_model:
- nbeerbower/Yanfei-Qwen3-32B
- huihui-ai/Qwen3-32B-abliterated
- nbeerbower/Zhiming-Qwen3-32B-lora
- nbeerbower/Menghua-Qwen3-32B-lora
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
datasets:
- nbeerbower/YanfeiMix-DPO
---

![image/png](https://huggingface.co/nbeerbower/Yanfei-Qwen3-32B/resolve/main/yanfei_cover.png?download=true)

# Yanfei-v2-Qwen3-32B

A repair of Yanfei-Qwen3-32B by [TIES](https://arxiv.org/abs/2306.01708)-merging huihui-ai/Qwen3-32B-abliterated, Zhiming-Qwen3-32B, and Menghua-Qwen3-32B using [mergekit](https://github.com/cg123/mergekit).

## Sponsorship

This model was made possible with compute support from [Nectar AI](https://nectar.ai). Thank you! ❤️

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: ./Zhiming-Qwen3-32B-merged
    parameters:
      weight: 1
      density: 1
  - model: ./Menghua-Qwen3-32B-merged
    parameters:
      weight: 1
      density: 1
  - model: huihui-ai/Qwen3-32B-abliterated
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: nbeerbower/Yanfei-Qwen3-32B
parameters:
  weight: 1
  density: 1
  normalize: true
  int8_mask: true
dtype: bfloat16
```
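To reproduce a merge like this one, the configuration above can be saved to a file and passed to mergekit's `mergekit-yaml` CLI. This is a sketch, not the exact command used for this model: the output directory name is an assumption, and the two local `*-merged` paths referenced in the config must already exist on disk (they are the LoRA models merged into their base beforehand); Hub-hosted models are downloaded automatically.

```shell
# Install mergekit, then run the TIES merge described by config.yaml.
pip install mergekit

# config.yaml is the YAML shown above; ./Yanfei-v2-Qwen3-32B is a
# hypothetical output directory. --cuda runs the merge on GPU if available.
mergekit-yaml config.yaml ./Yanfei-v2-Qwen3-32B --cuda
```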