Update README.md
README.md CHANGED
@@ -97,7 +97,7 @@ The resulting 4bit version of the model is roughly 40GB in size. The model can r
 
 In addition to running the model in gradio, as sketched above, you can also deploy on-premise using the ollama-library (version: v0.6.7). After setting up a ollama-"modelfile" according to your use case (e.g. the preferred system prompt and some additional setups can be found in the config files of the model) you can add the model to ollama like this:
 
 ```
-ollama create ncos-
+ollama create ncos-q4_0 -f ./ncos-gguf/Modelfile
 ```
 
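The `-f` flag in the added line points `ollama create` at a Modelfile, which tells ollama where the GGUF weights live and how the model should be prompted. A minimal sketch of what such a file could look like follows — the GGUF filename, system prompt, and parameter value are placeholders, not taken from the repository (the README says the preferred system prompt is in the model's config files):

```
# Hypothetical Modelfile sketch. FROM must reference the actual GGUF
# weights file; the path below is an assumed placeholder.
FROM ./ncos-q4_0.gguf

# Substitute the preferred system prompt from the model's config files.
SYSTEM """You are a helpful assistant."""

# Optional sampling parameters (standard ollama Modelfile syntax).
PARAMETER temperature 0.7
```

With such a Modelfile in place, `ollama create ncos-q4_0 -f ./ncos-gguf/Modelfile` registers the model locally, and `ollama run ncos-q4_0` starts an interactive session with it.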