---
base_model: NousResearch/Hermes-4-70B
base_model_relation: quantized
quantized_by: ArtusDev
language:
  - en
library_name: transformers
license: llama3
pipeline_tag: text-generation
tags:
  - Llama-3.1
  - instruct
  - finetune
  - reasoning
  - hybrid-mode
  - chatml
  - function calling
  - tool use
  - json mode
  - structured outputs
  - atropos
  - dataforge
  - long context
  - roleplaying
  - chat
  - exl3
---

EXL3 Quants of NousResearch/Hermes-4-70B

EXL3 quants of NousResearch/Hermes-4-70B, quantized with exllamav3.

Quants

| Quant (Revision) | Bits per Weight | Head Bits |
| ---------------- | --------------- | --------- |
| 2.5_H6           | 2.5             | 6         |
| 3.0_H6           | 3.0             | 6         |
| 3.5_H6           | 3.5             | 6         |
| 4.0_H6           | 4.0             | 6         |
| 4.25_H6          | 4.25            | 6         |
| 5.0_H6           | 5.0             | 6         |
| 6.0_H6           | 6.0             | 6         |
| 8.0_H8           | 8.0             | 8         |

Downloading quants with huggingface-cli


Install huggingface-cli:

pip install -U "huggingface_hub[cli]"

Download quant by targeting the specific quant revision (branch):

huggingface-cli download ArtusDev/NousResearch_Hermes-4-70B-EXL3 --revision "5.0bpw_H6" --local-dir ./
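
If you prefer downloading from Python instead of the CLI, the same repository and revision can be fetched with huggingface_hub's snapshot_download. A minimal sketch is below; the local_dir path is illustrative, and the revision should be changed to whichever quant branch you want.

```python
# Sketch: download a specific EXL3 quant revision (branch) from Python.
# Assumes huggingface_hub is installed (pip install -U "huggingface_hub[cli]").
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="ArtusDev/NousResearch_Hermes-4-70B-EXL3",
    revision="5.0bpw_H6",                        # quant branch to fetch
    local_dir="./Hermes-4-70B-EXL3-5.0bpw_H6",   # illustrative target directory
)
```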