Adilzhan Ismailov committed
Commit fab1f2f · 1 Parent(s): 0db0c8c
Set `tie_word_embeddings` to False
Fixes the warning below when using multiple GPUs. The parameter is not utilised by Llava (and is set to `false` for Llama).
```
2023-12-14 19:03:27,491 - accelerate.utils.modeling - WARNING - The model weights are not tied. Please use the `tie_weights` method before using the `infer_auto_device` function.
```
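For context, a minimal sketch of the kind of multi-GPU load that surfaces this warning; the checkpoint id and the `device_map="auto"` call are illustrative assumptions, not part of this commit:

```python
# Minimal sketch (assumed usage): loading Llava with accelerate's automatic
# device mapping shards the weights across the available GPUs, which is
# where the tie_weights warning was emitted before this change.
import torch
from transformers import LlavaForConditionalGeneration

model = LlavaForConditionalGeneration.from_pretrained(
    "llava-hf/llava-1.5-7b-hf",   # assumed checkpoint, for illustration only
    torch_dtype=torch.float16,
    device_map="auto",            # lets accelerate spread the model over multiple GPUs
)
```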
config.json CHANGED (+1, -0)
```diff
@@ -18,6 +18,7 @@
     "torch_dtype": "float16",
     "vocab_size": 32064
   },
+  "tie_word_embeddings": false,
   "torch_dtype": "float16",
   "transformers_version": "4.36.0.dev0",
   "vision_config": {
```
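A quick way to confirm the new flag is picked up after the change (the repo id below is an assumption, for illustration only):

```python
from transformers import AutoConfig

# tie_word_embeddings lives on the top-level config, so it can be read back directly.
config = AutoConfig.from_pretrained("llava-hf/llava-1.5-7b-hf")  # assumed repo id
print(config.tie_word_embeddings)  # expected to print False after this commit
```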