---
base_model:
- Qwen/Qwen2.5-Coder-14B-Instruct
- Gen-Verse/ReasonFlux-F1-14B
- agentica-org/DeepCoder-14B-Preview
- qihoo360/Light-R1-14B-DS
- Qwen/Qwen2.5-Coder-14B
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
language:
- en
- zh
pipeline_tag: text-generation
---
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64e174e202fa032de4143324/GNPQMcj3XRXlXy3ebsPRZ.jpeg)

# YOYO-O1-14B

*Combines top-tier open-source 14B **reasoning** and **code** models.*

### Merge Method

This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method, with [Qwen/Qwen2.5-Coder-14B](https://huggingface.co/Qwen/Qwen2.5-Coder-14B) as the base model.

### Models Merged

The following models were included in the merge:

* [Qwen/Qwen2.5-Coder-14B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-14B-Instruct)
* [Gen-Verse/ReasonFlux-F1-14B](https://huggingface.co/Gen-Verse/ReasonFlux-F1-14B)
* [agentica-org/DeepCoder-14B-Preview](https://huggingface.co/agentica-org/DeepCoder-14B-Preview)
* [qihoo360/Light-R1-14B-DS](https://huggingface.co/qihoo360/Light-R1-14B-DS)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: sce
models:
  # Pivot model
  - model: Qwen/Qwen2.5-Coder-14B
  # Target models
  - model: agentica-org/DeepCoder-14B-Preview
  - model: qihoo360/Light-R1-14B-DS
  - model: Gen-Verse/ReasonFlux-F1-14B
  - model: Qwen/Qwen2.5-Coder-14B-Instruct
base_model: Qwen/Qwen2.5-Coder-14B
parameters:
  select_topk: 1
dtype: float16
tokenizer_source: qihoo360/Light-R1-14B-DS
normalize: true
int8_mask: true
```
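
### Reproducing the Merge

A minimal sketch of running this configuration through mergekit's Python API. It assumes mergekit is installed (`pip install mergekit`) and that the YAML above has been saved as `yoyo-o1-14b.yaml`; the file name and output path are placeholders, not part of this repository.

```python
# Sketch: reproduce the merge from the YAML config above using mergekit.
# "yoyo-o1-14b.yaml" and "./YOYO-O1-14B" are placeholder paths.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the merge configuration from the saved YAML file.
with open("yoyo-o1-14b.yaml", "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the SCE merge and write the merged model to the output directory.
run_merge(
    config,
    out_path="./YOYO-O1-14B",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use GPU for tensor math if available
        copy_tokenizer=True,             # copy the tokenizer named in tokenizer_source
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```

Equivalently, the same config can be run with the `mergekit-yaml` command-line entry point.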
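
### Usage

A minimal inference sketch with `transformers`. The repo id below is a placeholder assumption; substitute the actual Hub id of this model.

```python
# Sketch: basic chat-style generation with transformers.
# "YOYO-AI/YOYO-O1-14B" is a placeholder repo id -- replace with the real one.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "YOYO-AI/YOYO-O1-14B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "user", "content": "Write a Python function that checks whether a number is prime."}
]

# Build the prompt with the chat template and generate a response.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```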