Update README.md
README.md
CHANGED
@@ -22,5 +22,7 @@ EXL2 quants of [Mistral-Large-Instruct-2407](https://huggingface.co/mistralai/Mi
[4.00 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/4.0bpw)
[4.25 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/4.25bpw)
[4.50 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/4.5bpw)
+ [4.75 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/4.75bpw)
+ [5.00 bits per weight](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/tree/5.0bpw)

[measurement.json](https://huggingface.co/turboderp/Mistral-Large-Instruct-2407-123B-exl2/blob/main/measurement.json)
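Each bits-per-weight link above points at a separate branch of the repo, so a single quant can be fetched by revision. Below is a minimal sketch using huggingface_hub's `snapshot_download`; the `local_dir` path is an illustrative choice, not something defined by the repo.

```python
# Minimal sketch: download one quant branch of this repo by revision.
# Branch names ("4.75bpw", "5.0bpw", ...) come from the links above;
# the local_dir value is illustrative, not part of the repo.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="turboderp/Mistral-Large-Instruct-2407-123B-exl2",
    revision="4.75bpw",  # or any other listed branch, e.g. "5.0bpw"
    local_dir="Mistral-Large-Instruct-2407-123B-exl2-4.75bpw",
)
```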