docs: fix vLLM installation guideline
README.md
@@ -65,7 +65,11 @@ Make sure to install the latest version of `transformers` or `vllm`, eventually
 pip install git+https://github.com/huggingface/transformers.git
 ```
 
-
+For vLLM, make sure to install `vllm>=0.9.0`:
+
+```bash
+pip install "vllm>=0.9.0"
+```
 
 ### 🤗 transformers
 
@@ -91,7 +95,7 @@ model = AutoModelForCausalLM.from_pretrained(
 For vLLM, simply start a server by executing the command below:
 
 ```
-# pip install vllm
+# pip install vllm>=0.9.0
 vllm serve tiiuae/Falcon-H1-1B-Instruct --tensor-parallel-size 2 --data-parallel-size 1
 ```
 
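Note: to confirm the pinned requirement is actually satisfied after installing, a minimal check is shown below. This is a sketch, not part of the patch; it assumes `vllm` exposes `__version__`, as recent releases do.

```python
# Sketch: verify the installed vLLM satisfies the README's new requirement.
from packaging.version import Version  # available in any pip-managed environment

import vllm  # assumes vllm.__version__ exists, as in recent releases

assert Version(vllm.__version__) >= Version("0.9.0"), vllm.__version__
```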
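Once `vllm serve` is running, the model can be queried over vLLM's OpenAI-compatible API. A minimal sketch, assuming the server defaults (localhost, port 8000, no authentication configured):

```python
# Sketch: query the server started by the `vllm serve` command above.
# Assumes the default endpoint http://localhost:8000/v1; the api_key is a
# placeholder since the server is started without authentication.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="tiiuae/Falcon-H1-1B-Instruct",
    messages=[{"role": "user", "content": "Hello, Falcon!"}],
)
print(response.choices[0].message.content)
```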