TheBloke committed · Commit edef355 · 1 Parent(s): b7264d5

Upload README.md

Files changed (1): README.md (+2 -2)
README.md CHANGED
@@ -99,7 +99,7 @@ Models are released as sharded safetensors files.
 
 | Branch | Bits | GS | AWQ Dataset | Seq Len | Size |
 | ------ | ---- | -- | ----------- | ------- | ---- |
-| [main](https://huggingface.co/TheBloke/Luban-13B-AWQ/tree/main) | 4 | 128 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 7.25 GB
+| [main](https://huggingface.co/TheBloke/Luban-13B-AWQ/tree/main) | 4 | 128 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | Processing, coming soon
 
 <!-- README_AWQ.md-provided-files end -->
 
@@ -111,7 +111,7 @@ Documentation on installing and using vLLM [can be found here](https://vllm.read
 - When using vLLM as a server, pass the `--quantization awq` parameter, for example:
 
 ```shell
-python3 python -m vllm.entrypoints.api_server --model TheBloke/Luban-13B-GPTQ --quantization awq
+python3 python -m vllm.entrypoints.api_server --model TheBloke/Luban-13B-AWQ --quantization awq
 ```
 
 When using vLLM from Python code, pass the `quantization=awq` parameter, for example:
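Worth noting: the server command in the second hunk reads `python3 python -m ...` on both sides of the diff, so the doubled interpreter token survives the commit; only one invocation is needed. A corrected form of the updated command would presumably be:

```shell
# Single interpreter invocation; model name taken from the "+" side of the diff above.
python3 -m vllm.entrypoints.api_server --model TheBloke/Luban-13B-AWQ --quantization awq
```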
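And for the Python-side usage the final context line refers to, a minimal sketch using vLLM's `LLM` class; the prompt and sampling values here are illustrative assumptions, not part of the diff:

```python
from vllm import LLM, SamplingParams

# Illustrative prompt and sampling settings (assumptions, not from the README).
prompts = ["Tell me about AI"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)

# Load the AWQ-quantized model named in the diff; quantization="awq" mirrors
# the --quantization awq server flag.
llm = LLM(model="TheBloke/Luban-13B-AWQ", quantization="awq")

outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(f"Prompt: {output.prompt!r}, Generated: {output.outputs[0].text!r}")
```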