|
--- |
|
base_model: |
|
- macadeliccc/WestLake-7B-v2-laser-truthy-dpo |
|
- ChaoticNeutrals/This_is_fine_7B |
|
library_name: transformers |
|
tags: |
|
- mistral |
|
- quantized |
|
- text-generation-inference |
|
- mergekit |
|
- merge |
|
pipeline_tag: text-generation |
|
inference: false |
|
--- |
|
# [Uploading Q3, Q4, Q5, Q6 and Q8.] |
|
|
|
# **GGUF-Imatrix quantizations for [ChaoticNeutrals/Prodigy_7B](https://huggingface.co/ChaoticNeutrals/Prodigy_7B/).** |
|
|
|
*If you want any specific quantization to be added, feel free to ask.* |
|
|
|
All credit belongs to the [creator](https://huggingface.co/ChaoticNeutrals/).
|
|
|
`Base⇢ GGUF(F16)⇢ Imatrix-Data(F16)⇢ GGUF(Imatrix-Quants)` |
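As a rough sketch of that pipeline, the steps below mirror the usual llama.cpp workflow for imatrix quants. The tool names and flags are those shipped with the llama.cpp build referenced further down; the model directory, calibration text, and output file names are placeholders, so treat this as an illustrative sketch rather than the exact commands used here.

```python
# Sketch of the Base -> GGUF(F16) -> Imatrix-Data(F16) -> Imatrix-quant pipeline,
# driving the llama.cpp tools via subprocess. Paths and the calibration file
# are placeholders; run this from a built llama.cpp checkout.
import subprocess

MODEL_DIR = "ChaoticNeutrals/Prodigy_7B"   # local HF checkout of the base model
F16_GGUF = "Prodigy_7B-F16.gguf"
IMATRIX = "imatrix-Prodigy_7B-F16.dat"
CALIB_TXT = "imatrix-calibration.txt"      # calibration text (placeholder)

# 1) Convert the HF model to an F16 GGUF.
subprocess.run(["python", "convert.py", MODEL_DIR,
                "--outtype", "f16", "--outfile", F16_GGUF], check=True)

# 2) Compute importance-matrix data from the F16 GGUF.
subprocess.run(["./imatrix", "-m", F16_GGUF, "-f", CALIB_TXT, "-o", IMATRIX],
               check=True)

# 3) Produce an imatrix-aware quant (e.g. IQ3_S).
subprocess.run(["./quantize", "--imatrix", IMATRIX,
                F16_GGUF, "Prodigy_7B-IQ3_S.gguf", "IQ3_S"], check=True)
```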
|
|
|
The new **IQ3_S** quant option has been shown to perform better than the old Q3_K_S, so it is included instead of the latter. It is only supported in `koboldcpp-1.59.1` or higher.
|
|
|
Quantized using [llama.cpp](https://github.com/ggerganov/llama.cpp/) release [b2277](https://github.com/ggerganov/llama.cpp/releases/tag/b2277).
|
|
|
For the `--imatrix` data, `imatrix-Prodigy_7B-F16.dat` was used.
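If you want to run one of the resulting quants outside of koboldcpp, a minimal llama-cpp-python example looks like the sketch below; the GGUF file name is a placeholder for whichever quant you download from this repository.

```python
# Minimal inference sketch with llama-cpp-python; the GGUF file name is a
# placeholder for whichever quant you download from this repository.
from llama_cpp import Llama

llm = Llama(model_path="Prodigy_7B-Q4_K_M.gguf", n_ctx=4096)
out = llm("Write a short greeting.", max_tokens=64)
print(out["choices"][0]["text"])
```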
|
|
|
# Original model information: |
|
|
|
# Wing |
|
|
|
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/626dfb8786671a29c715f8a9/S-E_CADzfAg3xaVX01rdx.jpeg) |
|
|
|
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). |
|
|
|
## Merge Details |
|
### Merge Method |
|
|
|
This model was merged using the SLERP merge method. |
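For reference, here is a minimal sketch of what spherical linear interpolation does to a pair of weight tensors. This shows the general SLERP formula, not mergekit's exact implementation, which additionally handles per-layer `t` schedules and some edge cases.

```python
# Minimal SLERP sketch: spherically interpolate between two weight tensors.
# Illustrates the general formula; mergekit's implementation also falls back
# to linear interpolation when the tensors are nearly parallel.
import numpy as np

def slerp(t: float, w0: np.ndarray, w1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    v0, v1 = w0.ravel(), w1.ravel()
    n0 = v0 / (np.linalg.norm(v0) + eps)
    n1 = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    omega = np.arccos(dot)                    # angle between the two tensors
    if np.sin(omega) < eps:                   # nearly parallel -> plain lerp
        return (1 - t) * w0 + t * w1
    s0 = np.sin((1 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * w0 + s1 * w1

# t = 0 returns the first model's tensor, t = 1 the second; values in between
# follow the great-circle path between them rather than a straight line.
```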
|
|
|
### Models Merged |
|
|
|
The following models were included in the merge: |
|
* [macadeliccc/WestLake-7B-v2-laser-truthy-dpo](https://huggingface.co/macadeliccc/WestLake-7B-v2-laser-truthy-dpo) |
|
* [ChaoticNeutrals/This_is_fine_7B](https://huggingface.co/ChaoticNeutrals/This_is_fine_7B) |
|
|
|
### Configuration |
|
|
|
The following YAML configuration was used to produce this model: |
|
|
|
```yaml |
|
slices:
  - sources:
      - model: ChaoticNeutrals/This_is_fine_7B
        layer_range: [0, 32]
      - model: macadeliccc/WestLake-7B-v2-laser-truthy-dpo
        layer_range: [0, 32]
merge_method: slerp
base_model: macadeliccc/WestLake-7B-v2-laser-truthy-dpo
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: float16
|
``` |
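The `t` lists above are gradients: mergekit interpolates the anchor values across the layer range, so self-attention tensors lean toward WestLake in the later layers while MLP tensors do the opposite. The sketch below illustrates that expansion, assuming simple piecewise-linear interpolation over the 32 layers (which is how mergekit treats gradient lists, to the best of my understanding).

```python
# Rough illustration of how a 5-point `t` gradient expands over 32 layers,
# assuming piecewise-linear interpolation between the anchor values.
import numpy as np

layers = 32
self_attn_anchors = [0, 0.5, 0.3, 0.7, 1]
mlp_anchors = [1, 0.5, 0.7, 0.3, 0]

x = np.linspace(0, 1, layers)                       # layer position, 0..1
anchor_x = np.linspace(0, 1, len(self_attn_anchors))

t_self_attn = np.interp(x, anchor_x, self_attn_anchors)
t_mlp = np.interp(x, anchor_x, mlp_anchors)

for i in range(layers):
    print(f"layer {i:2d}: t(self_attn)={t_self_attn[i]:.3f}  t(mlp)={t_mlp[i]:.3f}")
```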