Update README.md
---
library_name: transformers
tags:
- mergekit
- merge
---

# **Calcium-Opus-14B-Elite-Stock**

Calcium-Opus-14B-Elite is built on the Qwen 2.5 14B architecture and is designed to strengthen the reasoning capabilities of 14B-parameter models. It has proven effective at context understanding, reasoning, and mathematical problem-solving, and has been fine-tuned using a long chain-of-thought reasoning model and specialized datasets, with a focus on chain-of-thought (CoT) reasoning. The model is optimized for tasks that require logical reasoning, detailed explanations, and multi-step problem-solving, making it well suited to applications such as instruction following, text generation, and complex reasoning tasks.

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method with [prithivMLmods/Calcium-Opus-14B-Elite](https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite) as the base.
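Merges like this one are typically declared to mergekit in a small YAML config. A hypothetical sketch is shown below; the non-base model names are placeholders, since this card does not list the fine-tuned models that were merged:

```yaml
# Hypothetical mergekit config for a Model Stock merge.
# The two fine-tuned model names are placeholders, not the actual inputs.
models:
  - model: your-org/finetuned-variant-1
  - model: your-org/finetuned-variant-2
merge_method: model_stock
base_model: prithivMLmods/Calcium-Opus-14B-Elite
dtype: bfloat16
```

Such a config is normally run with mergekit's `mergekit-yaml` entry point, which writes the merged checkpoint to an output directory.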
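At a high level, Model Stock averages the fine-tuned checkpoints and then interpolates between that average and the base weights, with an interpolation ratio derived from the angle between the fine-tuned models' weight deltas. A toy per-layer sketch of that rule on plain Python lists (this is an illustration of the paper's idea, not the actual mergekit implementation):

```python
import math


def model_stock_layer(w_base, w_finetuned):
    """Toy Model Stock merge for one flattened weight tensor.

    A sketch of the paper's interpolation rule, not mergekit's code:
    t = k*cos / (1 + (k-1)*cos), merged = t*avg + (1-t)*base.
    """
    k = len(w_finetuned)
    # Deltas of each fine-tuned checkpoint relative to the base weights.
    deltas = [[wi - bi for wi, bi in zip(w, w_base)] for w in w_finetuned]

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    # Average pairwise cosine similarity between the deltas.
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    c = sum(cosine(deltas[i], deltas[j]) for i, j in pairs) / len(pairs)

    t = k * c / (1 + (k - 1) * c)  # interpolation ratio toward the average
    w_avg = [sum(col) / k for col in zip(*w_finetuned)]
    return [t * a + (1 - t) * b for a, b in zip(w_avg, w_base)]
```

When the fine-tuned deltas all point the same way (cosine near 1) the merge keeps their average; when they are nearly orthogonal (cosine near 0) it stays close to the base weights.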