fix broken links to llama-cpp-python (#2)
- fix broken links to llama-cpp-python (b49430b264ddb3a67d2124f7c717f5b2168212ca)
Co-authored-by: Kenneth Hamilton <[email protected]>
README.md CHANGED

```diff
@@ -146,7 +146,7 @@ For other parameters and how to use them, please refer to [the llama.cpp documen
 
 ## How to run in `text-generation-webui`
 
-Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20
+Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20-%20Model%20Tab.md#llamacpp).
 
 ## How to run from Python code
 
@@ -154,7 +154,7 @@ You can use GGUF models from Python using the [llama-cpp-python](https://github.
 
 ### How to load this model in Python code, using llama-cpp-python
 
-For full documentation, please see: [llama-cpp-python docs](https://
+For full documentation, please see: [llama-cpp-python docs](https://github.com/abetlen/llama-cpp-python).
 
 #### First install the package
```
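Both broken links were cut off at the first unescaped space in the URL, and the fix percent-encodes the spaces in the `text-generation-webui` doc filename (`04 - Model Tab.md` becomes `04%20-%20Model%20Tab.md`). A minimal sketch using only Python's standard library shows the encoding that produces the repaired path segment:

```python
from urllib.parse import quote, unquote

# The doc filename as it appears in the text-generation-webui repository.
filename = "04 - Model Tab.md"

# Percent-encode the path segment: spaces become %20, so the markdown
# link destination is no longer terminated at the first space.
encoded = quote(filename)
print(encoded)  # 04%20-%20Model%20Tab.md

# Decoding round-trips back to the on-disk filename.
print(unquote(encoded))  # 04 - Model Tab.md
```

`quote` leaves unreserved characters (letters, digits, `-`, `.`, `_`, `~`) and `/` untouched by default, which is why only the spaces change in this path.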