---
base_model:
- spacematt/Qwen2.5-CompositeFlow-Coder-14B-Instruct
- CultriX/Qwen2.5-14B-Wernicke
- deepseek-ai/DeepSeek-R1-Distill-Qwen-14B
- tanliboy/lambda-qwen2.5-14b-dpo-test
- djuna/Q2.5-Veltha-14B-0.5
- Triangle104/Herodotos-14B
- suayptalha/Lamarckvergence-14B
- wanlige/li-14b-v0.4
- sometimesanotion/Lamarck-14B-v0.7
- YOYO-AI/Qwen2.5-14B-YOYO-V5
- deepcogito/cogito-v1-preview-qwen-14B
- arcee-ai/Virtuoso-Small-v2
- CultriX/Qwen2.5-14B-Ultimav2
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---

# merge

This is a merge of pre-trained language models created using mergekit.
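
All of the merged checkpoints are Qwen2.5-14B derivatives, so the result can be loaded like any other `transformers` causal language model. Below is a minimal usage sketch; the repository id is a placeholder, not this model's actual Hub path:

```python
# Minimal sketch for loading the merged model with transformers.
# NOTE: "your-username/your-merged-model" is a placeholder repo id, not the real one.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/your-merged-model"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```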

## Merge Details

### Merge Method

This model was merged using the Model Stock merge method, with spacematt/Qwen2.5-CompositeFlow-Coder-14B-Instruct as the base.
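
Loosely, Model Stock averages the fine-tuned checkpoints and then interpolates that average back toward the base model, with the interpolation ratio derived from how similar the fine-tuned models' weight deltas are to each other. The sketch below illustrates the idea on a single weight tensor; it is a simplified approximation based on the Model Stock paper, not mergekit's actual implementation:

```python
# Toy illustration of the Model Stock idea (not mergekit's code).
import torch

def model_stock_merge(base: torch.Tensor, finetuned: list[torch.Tensor]) -> torch.Tensor:
    deltas = [w - base for w in finetuned]  # "task vectors" relative to the base weights
    # Average pairwise cosine similarity between task vectors (plays the role of cos(theta))
    sims = []
    for i in range(len(deltas)):
        for j in range(i + 1, len(deltas)):
            sims.append(torch.nn.functional.cosine_similarity(
                deltas[i].flatten(), deltas[j].flatten(), dim=0))
    cos_theta = torch.stack(sims).mean().clamp(min=0.0)
    n = len(finetuned)
    # Interpolation ratio from the Model Stock paper: t = n*cos(theta) / (1 + (n-1)*cos(theta))
    t = n * cos_theta / (1 + (n - 1) * cos_theta)
    w_avg = torch.stack(finetuned).mean(dim=0)
    return t * w_avg + (1 - t) * base  # pull the average back toward the base model

# Example with random toy tensors (three "fine-tuned" variants of one base tensor)
base = torch.randn(4, 4)
merged = model_stock_merge(base, [base + 0.1 * torch.randn(4, 4) for _ in range(3)])
```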

### Models Merged

The following models were included in the merge:
- CultriX/Qwen2.5-14B-Wernicke
- deepseek-ai/DeepSeek-R1-Distill-Qwen-14B
- tanliboy/lambda-qwen2.5-14b-dpo-test
- djuna/Q2.5-Veltha-14B-0.5
- Triangle104/Herodotos-14B
- suayptalha/Lamarckvergence-14B
- wanlige/li-14b-v0.4
- sometimesanotion/Lamarck-14B-v0.7
- YOYO-AI/Qwen2.5-14B-YOYO-V5
- deepcogito/cogito-v1-preview-qwen-14B
- arcee-ai/Virtuoso-Small-v2
- CultriX/Qwen2.5-14B-Ultimav2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: deepseek-ai/DeepSeek-R1-Distill-Qwen-14B
  - model: CultriX/Qwen2.5-14B-Wernicke
  - model: CultriX/Qwen2.5-14B-Ultimav2
  - model: wanlige/li-14b-v0.4
  - model: tanliboy/lambda-qwen2.5-14b-dpo-test
  - model: arcee-ai/Virtuoso-Small-v2
  - model: sometimesanotion/Lamarck-14B-v0.7
  - model: suayptalha/Lamarckvergence-14B
  - model: YOYO-AI/Qwen2.5-14B-YOYO-V5
  - model: deepcogito/cogito-v1-preview-qwen-14B
  - model: Triangle104/Herodotos-14B
  - model: djuna/Q2.5-Veltha-14B-0.5
merge_method: model_stock
base_model: spacematt/Qwen2.5-CompositeFlow-Coder-14B-Instruct
normalize: true
int8_mask: true
dtype: bfloat16
```
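
To reproduce the merge, save the configuration above to a file and run it through mergekit, either with the `mergekit-yaml` command-line tool or via the Python API. The sketch below follows the pattern shown in the mergekit README; module paths and option names are assumptions that may differ between mergekit versions:

```python
# Rough sketch of running the merge through mergekit's Python API
# (adapted from the example in the mergekit README; option names and module
# paths are assumptions and may vary across mergekit versions).
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("merge-config.yaml", encoding="utf-8") as f:  # the YAML shown above
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./merged-model",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
        lazy_unpickle=True,
        low_cpu_memory=True,
    ),
)
```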