auto-patch README.md
README.md CHANGED
@@ -23,7 +23,7 @@ tags:
 static quants of https://huggingface.co/summykai/gemma3-27b-abliterated-dpo
 
 <!-- provided-files -->
-weighted/imatrix quants
+weighted/imatrix quants are available at https://huggingface.co/mradermacher/gemma3-27b-abliterated-dpo-i1-GGUF
 ## Usage
 
 If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -65,6 +65,6 @@ questions you might have and/or if you want some other model quantized.
 
 I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
 me use its servers and providing upgrades to my workstation to enable
-this work in my free time.
+this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
 
 <!-- end -->
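For readers unsure how to use the GGUF files this diff links to (the README defers to TheBloke's guides for details), here is a minimal sketch using `huggingface_hub` and `llama-cpp-python`. The repo id is taken from the imatrix link in the diff above; the quant filename is an assumption, so substitute whichever `.gguf` file is actually listed in the repo.

```python
# Minimal sketch: download one imatrix quant and run a prompt locally.
# Assumes llama-cpp-python and huggingface_hub are installed.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="mradermacher/gemma3-27b-abliterated-dpo-i1-GGUF",  # repo from the diff
    filename="gemma3-27b-abliterated-dpo.i1-Q4_K_M.gguf",       # assumed filename; check the repo listing
)

# Load the quantized model; context size is an arbitrary example value.
llm = Llama(model_path=model_path, n_ctx=4096)

out = llm("Explain what a GGUF file is in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```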