updated readme
README.md CHANGED
````diff
@@ -67,7 +67,7 @@ BPW = 5.0
 
 # define variables
 model_name = model_id.split("/")[-1]
-
+
 
 ```
 
@@ -75,13 +75,13 @@ quant_name = model_id.split("/")[-1] + f"-{BPW:.1f}-bpw-exl2"
 ```shell
 !git-lfs install
 # download the model to local directory
-!git clone https://{username}:{HF_TOKEN}@huggingface.co/{model_id} {
+!git clone https://{username}:{HF_TOKEN}@huggingface.co/{model_id} {model_name}
 ```
 
 #### Run inference on the quantized model using ExLlamaV2
 ```shell
 # Run model
-!python exllamav2/test_inference.py -m {
+!python exllamav2/test_inference.py -m {model_name}/ -p "Tell me a funny joke about Large Language Models meeting a Blackhole in an intergalactic Bar."
 ```
 
 ## Uses
````
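For readers landing on this commit without the rest of the README: the snippets in the diff rely on `model_id`, `BPW`, `username`, and `HF_TOKEN` being defined earlier in the notebook. Below is a minimal sketch of how the pieces fit together; the repo id, username, and environment-variable lookup are hypothetical examples, not part of the README.

```python
import os

# Sketch only: the example values below are assumptions for illustration.
model_id = "org-name/some-model"           # hypothetical Hugging Face repo id
BPW = 5.0                                  # target bits per weight, as in the README
username = "your-hf-username"              # hypothetical account name
HF_TOKEN = os.environ.get("HF_TOKEN", "")  # read the access token from the environment

# Derived names used by the shell cells in the diff above
model_name = model_id.split("/")[-1]
quant_name = model_id.split("/")[-1] + f"-{BPW:.1f}-bpw-exl2"

# The shell cells then expand to commands along these lines
clone_cmd = (
    f"git clone https://{username}:{HF_TOKEN}@huggingface.co/{model_id} {model_name}"
)
infer_cmd = (
    f"python exllamav2/test_inference.py -m {model_name}/ "
    f'-p "Tell me a funny joke about Large Language Models meeting a Blackhole in an intergalactic Bar."'
)
print(clone_cmd)
print(infer_cmd)
```

Note that embedding the token directly in the clone URL, as the README does, leaves it in shell history and in the cloned repo's remote configuration; sourcing it from an environment variable, as sketched here, is the more cautious pattern.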