---
base_model: elinas/chronos007-70b
language:
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
tags:
- chat
- roleplay
- storywriting
- merge
---
## About
Weighted/imatrix quants of https://huggingface.co/elinas/chronos007-70b
<!-- provided-files -->
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
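As a rough sketch (not part of this card's official instructions), the two-part Q6_K file listed in the table below can be fetched with the `huggingface_hub` library and joined by plain byte concatenation, part 1 first, then part 2; the repo id and file names are taken from the table in the next section:

```python
# Sketch: download both parts of the split i1-Q6_K quant and join them by
# simple byte concatenation (part1 first, then part2).
# Assumes `pip install huggingface_hub`.
from huggingface_hub import hf_hub_download
import shutil

repo_id = "mradermacher/chronos007-70b-i1-GGUF"
parts = [
    "chronos007-70b.i1-Q6_K.gguf.part1of2",
    "chronos007-70b.i1-Q6_K.gguf.part2of2",
]

with open("chronos007-70b.i1-Q6_K.gguf", "wb") as out:
    for name in parts:
        path = hf_hub_download(repo_id=repo_id, filename=name)
        with open(path, "rb") as src:
            shutil.copyfileobj(src, out)  # append this part's bytes to the output
```

The same result can be obtained by concatenating the parts with your operating system's file tools; only the part order matters.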
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ1_S.gguf) | i1-IQ1_S | 15.0 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ1_M.gguf) | i1-IQ1_M | 16.0 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 18.7 | |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ2_XS.gguf) | i1-IQ2_XS | 20.8 | |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ2_S.gguf) | i1-IQ2_S | 21.8 | |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ2_M.gguf) | i1-IQ2_M | 23.7 | |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q2_K.gguf) | i1-Q2_K | 25.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 27.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ3_XS.gguf) | i1-IQ3_XS | 28.6 | |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ3_S.gguf) | i1-IQ3_S | 30.3 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q3_K_S.gguf) | i1-Q3_K_S | 30.3 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-IQ3_M.gguf) | i1-IQ3_M | 31.4 | |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q3_K_M.gguf) | i1-Q3_K_M | 33.7 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q3_K_L.gguf) | i1-Q3_K_L | 36.6 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q4_K_S.gguf) | i1-Q4_K_S | 39.7 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q4_K_M.gguf) | i1-Q4_K_M | 41.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q5_K_S.gguf) | i1-Q5_K_S | 47.9 | |
| [GGUF](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q5_K_M.gguf) | i1-Q5_K_M | 49.2 | |
| [PART 1](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q6_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/chronos007-70b-i1-GGUF/resolve/main/chronos007-70b.i1-Q6_K.gguf.part2of2) | i1-Q6_K | 56.7 | practically like static Q6_K |
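To sanity-check a downloaded single-file quant, here is a minimal, hedged sketch using the third-party `llama-cpp-python` bindings (one common way to run GGUF files, not something this card prescribes); the file name matches the Q4_K_M row above:

```python
# Sketch: load a single-file quant with llama-cpp-python and run a short
# completion. Assumes `pip install llama-cpp-python`.
from llama_cpp import Llama

llm = Llama(
    model_path="chronos007-70b.i1-Q4_K_M.gguf",  # any single-file quant from the table
    n_ctx=4096,        # context length; lower it if you run out of memory
    n_gpu_layers=-1,   # offload all layers when a GPU-enabled build is installed
)

result = llm("Write the opening line of a detective story.", max_tokens=64)
print(result["choices"][0]["text"])
```

Any single-file quant from the table can be substituted for `model_path`; larger quants need correspondingly more RAM or VRAM.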
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
common questions and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->