prithivMLmods committed: Update README.md

README.md CHANGED
@@ -212,6 +212,4 @@ Quantized models like **triangulum-10b-f16.gguf** are optimized for performance
 
 ## Conclusion
 
-Running the **Triangulum-10B** model with Ollama provides a robust way to leverage open-source LLMs locally for diverse use cases. By following these steps, you can explore the capabilities of other open-source models in the future.
-
-Happy experimenting!
+Running the **Triangulum-10B** model with Ollama provides a robust way to leverage open-source LLMs locally for diverse use cases. By following these steps, you can explore the capabilities of other open-source models in the future.
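The conclusion refers to running the model locally with Ollama, and the hunk header mentions the quantized **triangulum-10b-f16.gguf** file. A minimal sketch of that workflow, assuming the GGUF file has already been downloaded to the current directory (the local path and the model tag `triangulum-10b` are assumptions, not confirmed by this diff):

```shell
# Create a Modelfile pointing Ollama at the local quantized weights
# (path is hypothetical; adjust to where triangulum-10b-f16.gguf lives)
cat > Modelfile <<'EOF'
FROM ./triangulum-10b-f16.gguf
EOF

# Register the model with Ollama under a local tag, then run it interactively
ollama create triangulum-10b -f Modelfile
ollama run triangulum-10b
```

`ollama create` imports the GGUF weights once; after that, `ollama run triangulum-10b` starts an interactive session without re-reading the file.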