Commit dc42c36 (verified) by ybelkada · 1 parent: 62276d6

docs: fix VLLM installation guideline

Files changed (1): README.md (+6, -2)
README.md CHANGED
@@ -65,7 +65,11 @@ Make sure to install the latest version of `transformers` or `vllm`, eventually
 pip install git+https://github.com/huggingface/transformers.git
 ```
 
-Refer to [the official vLLM documentation for more details on building vLLM from source](https://docs.vllm.ai/en/latest/getting_started/installation/gpu.html#build-wheel-from-source).
+For vLLM, make sure to install `vllm>=0.9.0`:
+
+```bash
+pip install "vllm>=0.9.0"
+```
 
 ### 🤗 transformers
 
@@ -91,7 +95,7 @@ model = AutoModelForCausalLM.from_pretrained(
 For vLLM, simply start a server by executing the command below:
 
 ```
-# pip install vllm
+# pip install vllm>=0.9.0
 vllm serve tiiuae/Falcon-H1-1B-Instruct --tensor-parallel-size 2 --data-parallel-size 1
 ```
 
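
For context, the second hunk header truncates the README's 🤗 transformers snippet at `AutoModelForCausalLM.from_pretrained(`. Below is a minimal sketch of loading and generating with the checkpoint, not the README's exact code; the `torch_dtype="auto"`, `device_map="auto"` arguments and the prompt are illustrative assumptions.

```python
# Minimal sketch (not the README's exact snippet): load Falcon-H1-1B-Instruct with
# transformers installed from source, as shown in the diff above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/Falcon-H1-1B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # assumption: use the dtype stored in the checkpoint
    device_map="auto",    # assumption: place the model on available devices
)

# Generate a short completion from a plain prompt.
inputs = tokenizer("Hello, Falcon-H1!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```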
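
Once `vllm serve` is running, it exposes an OpenAI-compatible API (port 8000 by default). A hedged sketch of querying it with the `openai` client; the prompt, `max_tokens` value, and placeholder API key are illustrative assumptions.

```python
# Sketch: query the OpenAI-compatible endpoint started by
# `vllm serve tiiuae/Falcon-H1-1B-Instruct ...` (default port 8000 assumed).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")  # vLLM ignores the key

response = client.chat.completions.create(
    model="tiiuae/Falcon-H1-1B-Instruct",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```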