mradermacher committed (verified)
Commit 9f75945 · Parent(s): 1e96434

auto-patch README.md

Files changed (1): README.md (+7 -1)
README.md CHANGED
@@ -30,7 +30,7 @@ tags:
 static quants of https://huggingface.co/AIgotahole/Glm4-9B-RP-brb
 
 <!-- provided-files -->
-weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
+weighted/imatrix quants are available at https://huggingface.co/mradermacher/Glm4-9B-RP-brb-i1-GGUF
 ## Usage
 
 If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -44,7 +44,13 @@ more details, including on how to concatenate multi-part files.
 | Link | Type | Size/GB | Notes |
 |:-----|:-----|--------:|:------|
 | [GGUF](https://huggingface.co/mradermacher/Glm4-9B-RP-brb-GGUF/resolve/main/Glm4-9B-RP-brb.Q2_K.gguf) | Q2_K | 4.1 | |
+| [GGUF](https://huggingface.co/mradermacher/Glm4-9B-RP-brb-GGUF/resolve/main/Glm4-9B-RP-brb.Q3_K_S.gguf) | Q3_K_S | 4.7 | |
+| [GGUF](https://huggingface.co/mradermacher/Glm4-9B-RP-brb-GGUF/resolve/main/Glm4-9B-RP-brb.Q3_K_M.gguf) | Q3_K_M | 5.1 | lower quality |
+| [GGUF](https://huggingface.co/mradermacher/Glm4-9B-RP-brb-GGUF/resolve/main/Glm4-9B-RP-brb.Q3_K_L.gguf) | Q3_K_L | 5.3 | |
 | [GGUF](https://huggingface.co/mradermacher/Glm4-9B-RP-brb-GGUF/resolve/main/Glm4-9B-RP-brb.Q4_K_S.gguf) | Q4_K_S | 5.9 | fast, recommended |
+| [GGUF](https://huggingface.co/mradermacher/Glm4-9B-RP-brb-GGUF/resolve/main/Glm4-9B-RP-brb.Q4_K_M.gguf) | Q4_K_M | 6.3 | fast, recommended |
+| [GGUF](https://huggingface.co/mradermacher/Glm4-9B-RP-brb-GGUF/resolve/main/Glm4-9B-RP-brb.Q6_K.gguf) | Q6_K | 8.4 | very good quality |
+| [GGUF](https://huggingface.co/mradermacher/Glm4-9B-RP-brb-GGUF/resolve/main/Glm4-9B-RP-brb.Q8_0.gguf) | Q8_0 | 10.1 | fast, best quality |
 
 Here is a handy graph by ikawrakow comparing some lower-quality quant
 types (lower is better):
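
For anyone grabbing the files added in this commit: each table row above gives the exact repo id and filename, so a quant can also be fetched programmatically. The following is a minimal sketch, not part of the commit itself, using the huggingface_hub Python client; the filename is copied from the Q4_K_S row.

```python
# Minimal sketch (not from this commit): download one of the quants
# listed in the table above via the huggingface_hub client.
from huggingface_hub import hf_hub_download

# Repo id and filename taken from the Q4_K_S row of the table.
gguf_path = hf_hub_download(
    repo_id="mradermacher/Glm4-9B-RP-brb-GGUF",
    filename="Glm4-9B-RP-brb.Q4_K_S.gguf",
)
print(gguf_path)  # local path of the cached GGUF file
```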