mikudev committed
Commit 9b64c43 · verified · 1 parent: bb206cf

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -1,7 +1,7 @@
 ## Description
 
 This is a GPTQ 4-bit quantized version of [Llama-3-Lumimaid-8B-v0.1](https://huggingface.co/NeverSleep/Llama-3-Lumimaid-8B-v0.1)
-This was quantized using for `4096 seqlen` using the [AutoGPTQ wikitext2 example](https://github.com/AutoGPTQ/AutoGPTQ/blob/main/examples/quantization/basic_usage_wikitext2.py)
+This was quantized for `8192 seqlen` using the [AutoGPTQ wikitext2 example](https://github.com/AutoGPTQ/AutoGPTQ/blob/main/examples/quantization/basic_usage_wikitext2.py)
 
 This is my first quant, so I could have messed up somewhere. However, I did some testing and it looks like it's working well.
 by mikudev
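For context, the quantization flow the README points to (the AutoGPTQ wikitext2 example) roughly follows the pattern below. This is a minimal sketch under stated assumptions, not the exact script from the commit: the model ID and 4-bit / 8192-seqlen settings come from the README, while `group_size`, `desc_act`, the number of calibration samples, the output directory name, and the `get_calibration_examples` helper are illustrative choices of this sketch.

```python
# Sketch of a 4-bit GPTQ quantization run with AutoGPTQ.
# Assumptions beyond the README are marked in comments.
import torch
from datasets import load_dataset
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig

model_id = "NeverSleep/Llama-3-Lumimaid-8B-v0.1"  # base model from the README
seqlen = 8192  # calibration sequence length stated in the README


def get_calibration_examples(tokenizer, seqlen, n_samples=128):
    """Tokenize wikitext-2 and slice it into seqlen-long calibration chunks.

    Hypothetical helper standing in for the example script's data loading;
    n_samples=128 is an illustrative choice.
    """
    data = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")
    enc = tokenizer("\n\n".join(data["text"]), return_tensors="pt")
    examples = []
    for i in range(n_samples):
        ids = enc.input_ids[:, i * seqlen:(i + 1) * seqlen]
        examples.append(
            {"input_ids": ids, "attention_mask": torch.ones_like(ids)}
        )
    return examples


quantize_config = BaseQuantizeConfig(
    bits=4,          # 4-bit quantization, as stated in the README
    group_size=128,  # illustrative default; the commit's actual value is unknown
    desc_act=False,  # illustrative; trades accuracy for inference speed
)

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)
model = AutoGPTQForCausalLM.from_pretrained(model_id, quantize_config)

model.quantize(get_calibration_examples(tokenizer, seqlen))
model.save_quantized("Llama-3-Lumimaid-8B-v0.1-GPTQ")  # illustrative output dir
```

Running this end to end requires a GPU and downloads the full-precision 8B model, so it is shown here only to illustrate the shape of the workflow.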