Update README.md
README.md CHANGED
@@ -16,7 +16,7 @@ tags:
 - mergekit
 - merge
 - mistral
-quantized_by:
+quantized_by: tsss
 pipeline_tag: text-generation
 ---
 
@@ -26,11 +26,11 @@ Find the original model card [here](https://huggingface.co/ockerman0/MN-12B-Star
 
 ## Base repo only contains the measurement file, see revisions for the quants.
 
-- [measurement.json](https://huggingface.co/
-- [3.0bpw](https://huggingface.co/
-- [4.0bpw](https://huggingface.co/
-- [5.0bpw](https://huggingface.co/
-- [6.0bpw](https://huggingface.co/
+- [measurement.json](https://huggingface.co/tssst/MN-12B-Starcannon-v5.5-unofficial-EXL2/tree/main)
+- [3.0bpw](https://huggingface.co/tssst/MN-12B-Starcannon-v5.5-unofficial-EXL2/tree/3.0bpw)
+- [4.0bpw](https://huggingface.co/tssst/MN-12B-Starcannon-v5.5-unofficial-EXL2/tree/4.0bpw)
+- [5.0bpw](https://huggingface.co/tssst/MN-12B-Starcannon-v5.5-unofficial-EXL2/tree/5.0bpw)
+- [6.0bpw](https://huggingface.co/tssst/MN-12B-Starcannon-v5.5-unofficial-EXL2/tree/6.0bpw)
 
 ## Notes
 Making these was a lesson in pain and humility. It has been over two months since the day I decided "hm today i will learn how to make exl2 quants" <- (clueless). First my conda env
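Since each quant lives on its own revision branch of the repo, here is a minimal sketch of fetching one of them with `huggingface_hub`. The repo id and branch names ("3.0bpw" through "6.0bpw") are taken from the links above; everything else is illustrative.

```python
# Minimal sketch: download one EXL2 quant from its revision branch.
# Repo id and branch names come from the links in the README diff above;
# "4.0bpw" is just the example revision picked here.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="tssst/MN-12B-Starcannon-v5.5-unofficial-EXL2",
    revision="4.0bpw",  # choose the branch matching the bitrate you want
)
print(local_dir)  # path to the downloaded snapshot in the local HF cache
```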