Llamacpp quants
- .gitattributes +16 -0
- Hermes-2-Pro-Mistral-10.7B-IQ3_M.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-IQ3_S.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-IQ4_NL.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-IQ4_XS.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q2_K.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q3_K_L.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q3_K_M.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q3_K_S.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q4_0.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q4_K_M.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q4_K_S.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q5_0.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q5_K_M.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q5_K_S.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q6_K.gguf +3 -0
- Hermes-2-Pro-Mistral-10.7B-Q8_0.gguf +3 -0
- README.md +64 -0
.gitattributes CHANGED
@@ -33,3 +33,19 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q5_0.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+Hermes-2-Pro-Mistral-10.7B-Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
Hermes-2-Pro-Mistral-10.7B-IQ3_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1dc4ea6b52979b529e6ebe17700985bc93718d7a75de4cdbb0ad627b000c21b7
+size 4845215008

Hermes-2-Pro-Mistral-10.7B-IQ3_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6edd9e3109e5fde0007fe0a8c08742c6ca90e43d8fb4480a82fe997dd5b17330
+size 4691467552

Hermes-2-Pro-Mistral-10.7B-IQ4_NL.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:09381ce6d13ccdd6d8e1ea9d6e91c2428281a4910f3a8605e587d5b7db86517f
+size 6141772064

Hermes-2-Pro-Mistral-10.7B-IQ4_XS.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0b94cb8e182a12182cbb6274a39d6b95af4988ec9d1164574bec0e0bdd6d2a58
+size 5827817760

Hermes-2-Pro-Mistral-10.7B-Q2_K.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b761afceb9d71464bdda07f08d2e77f03801187913b4f632e1cfd44202458442
+size 4003383584

Hermes-2-Pro-Mistral-10.7B-Q3_K_L.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0662fc499f4886f199ead0973c892d468651e71cd05da5840660358dbb637b03
+size 5650914592

Hermes-2-Pro-Mistral-10.7B-Q3_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:85fe3405b9745a1c50290afcba02ae78943596219961f27988ae1c658091c218
+size 5195832608

Hermes-2-Pro-Mistral-10.7B-Q3_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:991f7b9b82cdfe524bfc9edf21e6d04fc23e0248d15953f221ec9b7526399ed0
+size 4664728864

Hermes-2-Pro-Mistral-10.7B-Q4_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:263a10338797c1cf0836be92c10551c5ca81a404850f17bc6e480027111b77d6
+size 6072566048

Hermes-2-Pro-Mistral-10.7B-Q4_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ccd67deed9cd444e39fb9d221444ed0bc3b1d46b4e09e90b03f88e4cfd2cfafc
+size 6461849888

Hermes-2-Pro-Mistral-10.7B-Q4_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4bcbf8cf7e13386f35324a30fce2e2e11e8cad8143cd046f1812d8788dfeb5b7
+size 6118703392

Hermes-2-Pro-Mistral-10.7B-Q5_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7c791f54039407171b4e069614a13b7721f85b879e31bd039772bd44681e7aba
+size 7397589280

Hermes-2-Pro-Mistral-10.7B-Q5_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f399cddb91416bedb109ed2e03f286fc63a4d6f7c908a03c2a2ea1e69c20831a
+size 7598129440

Hermes-2-Pro-Mistral-10.7B-Q5_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f9eb53c81b9ca4f3f2154251eb23f3d0e3bfe5d9883e3a5bdafd1853f0b349fe
+size 7397589280

Hermes-2-Pro-Mistral-10.7B-Q6_K.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:63b4dd7bed4b8649f5271600dba0fab5d8d79fa836b2f26254f9e63cd034fd58
+size 8805426464

Hermes-2-Pro-Mistral-10.7B-Q8_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b7b0a090def0131d1b0d69c3468b860e59bc10dc71fb8a152207363799ef4754
+size 11404434720
README.md ADDED
@@ -0,0 +1,64 @@
+---
+base_model: mistralai/Mistral-7B-v0.1
+library_name: transformers
+tags:
+- mergekit
+- merge
+- Mistral
+- instruct
+- finetune
+- chatml
+- DPO
+- RLHF
+- gpt4
+- synthetic data
+- distillation
+- function calling
+- json mode
+model-index:
+- name: Hermes-2-Pro-Mistral-10.7B
+  results: []
+license: apache-2.0
+language:
+- en
+datasets:
+- teknium/OpenHermes-2.5
+widget:
+- example_title: Hermes 2 Pro
+  messages:
+  - role: system
+    content: You are a sentient, superintelligent artificial general intelligence, here to teach and assist me.
+  - role: user
+    content: Write a short story about Goku discovering Kirby has teamed up with Majin Buu to destroy the world.
+quantized_by: bartowski
+pipeline_tag: text-generation
+---
+
+## Llamacpp Quantizations of Hermes-2-Pro-Mistral-10.7B
+
+Using <a href="https://github.com/ggerganov/llama.cpp/">llama.cpp</a> release <a href="https://github.com/ggerganov/llama.cpp/releases/tag/b2536">b2536</a> for quantization.
+
+Original model: https://huggingface.co/Joseph717171/Hermes-2-Pro-Mistral-10.7B
+
+Download a single file (not the whole branch) from below:
+
+| Filename | Quant type | File Size | Description |
+| -------- | ---------- | --------- | ----------- |
+| [Hermes-2-Pro-Mistral-10.7B-Q8_0.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q8_0.gguf) | Q8_0 | 11.40GB | Extremely high quality, generally unneeded but max available quant. |
+| [Hermes-2-Pro-Mistral-10.7B-Q6_K.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q6_K.gguf) | Q6_K | 8.80GB | Very high quality, near perfect, *recommended*. |
+| [Hermes-2-Pro-Mistral-10.7B-Q5_K_M.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q5_K_M.gguf) | Q5_K_M | 7.59GB | High quality, very usable. |
+| [Hermes-2-Pro-Mistral-10.7B-Q5_K_S.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q5_K_S.gguf) | Q5_K_S | 7.39GB | High quality, very usable. |
+| [Hermes-2-Pro-Mistral-10.7B-Q5_0.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q5_0.gguf) | Q5_0 | 7.39GB | High quality, older format, generally not recommended. |
+| [Hermes-2-Pro-Mistral-10.7B-Q4_K_M.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q4_K_M.gguf) | Q4_K_M | 6.46GB | Good quality, uses about 4.83 bits per weight. |
+| [Hermes-2-Pro-Mistral-10.7B-Q4_K_S.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q4_K_S.gguf) | Q4_K_S | 6.11GB | Slightly lower quality with small space savings. |
+| [Hermes-2-Pro-Mistral-10.7B-IQ4_NL.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-IQ4_NL.gguf) | IQ4_NL | 6.14GB | Decent quality, similar to Q4_K_S, newer quantization method. |
+| [Hermes-2-Pro-Mistral-10.7B-IQ4_XS.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-IQ4_XS.gguf) | IQ4_XS | 5.82GB | Decent quality, newer method with similar performance to Q4. |
+| [Hermes-2-Pro-Mistral-10.7B-Q4_0.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q4_0.gguf) | Q4_0 | 6.07GB | Decent quality, older format, generally not recommended. |
+| [Hermes-2-Pro-Mistral-10.7B-Q3_K_L.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q3_K_L.gguf) | Q3_K_L | 5.65GB | Lower quality but usable, good for low RAM availability. |
+| [Hermes-2-Pro-Mistral-10.7B-Q3_K_M.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q3_K_M.gguf) | Q3_K_M | 5.19GB | Even lower quality. |
+| [Hermes-2-Pro-Mistral-10.7B-IQ3_M.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-IQ3_M.gguf) | IQ3_M | 4.84GB | Medium-low quality, newer method with decent performance. |
+| [Hermes-2-Pro-Mistral-10.7B-IQ3_S.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-IQ3_S.gguf) | IQ3_S | 4.69GB | Lower quality, newer method with decent performance, recommended over Q3 quants. |
+| [Hermes-2-Pro-Mistral-10.7B-Q3_K_S.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q3_K_S.gguf) | Q3_K_S | 4.66GB | Low quality, not recommended. |
+| [Hermes-2-Pro-Mistral-10.7B-Q2_K.gguf](https://huggingface.co/bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF/blob/main/Hermes-2-Pro-Mistral-10.7B-Q2_K.gguf) | Q2_K | 4.00GB | Extremely low quality, *not* recommended. |
+
+Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
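Each per-file link in the README's table resolves to a direct-download URL of the form `https://huggingface.co/<repo>/resolve/<revision>/<filename>`. A minimal sketch in pure standard-library Python that builds such a URL for one file (the helper name `gguf_url` is illustrative, not part of any library):

```python
# Build the direct-download URL for a single GGUF file from this repo,
# following the standard Hugging Face "resolve" URL pattern.
def gguf_url(repo_id: str, filename: str, revision: str = "main") -> str:
    # Pattern: https://huggingface.co/<repo>/resolve/<revision>/<file>
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

url = gguf_url(
    "bartowski/Hermes-2-Pro-Mistral-10.7B-GGUF",
    "Hermes-2-Pro-Mistral-10.7B-Q4_K_M.gguf",
)
print(url)
```

In practice the `huggingface_hub` package's `hf_hub_download` does the same resolution with local caching and resumable transfers, which is preferable for multi-gigabyte files like these.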