teknium committed on
Commit ba3b410 · verified · 1 Parent(s): 6545947

Update README.md

Files changed (1)
  1. README.md +9 -14
README.md CHANGED
@@ -171,6 +171,15 @@ You can also run this model with vLLM, by running the following in your terminal
 
 You may then use the model over API using the OpenAI library just like you would call OpenAI's API.
 
+## GGUF Llama.cpp Inference
+
+These models are compatible with [llama.cpp](https://github.com/ggerganov/llama.cpp) and similar frameworks.
+
+Example usage with llama.cpp:
+```bash
+./main -m /path/to/model.gguf -p "Hello, I am a language model" -n 128
+```
+
 ## Prompt Format for Function Calling
 
 Our model was trained on specific system prompts and structures for Function Calling.
@@ -223,7 +232,6 @@ The stock fundamentals data for Tesla (TSLA) are as follows:
 This information provides a snapshot of Tesla's financial position and performance based on the fundamental data obtained from the yfinance API. It shows that Tesla has a substantial market capitalization and a relatively high P/E and P/B ratio compared to other stocks in its industry. The company does not pay a dividend at the moment, which is reflected by a 'Dividend Yield' of 'None'. The Beta value indicates that Tesla's stock has a moderate level of volatility relative to the market. The 52-week high and low prices give an idea of the stock's range over the past year. This data can be useful when assessing investment opportunities and making investment decisions.<|eot_id|><|start_header_id|>user<|end_header_id|>
 ```
 
-
 ## Prompt Format for JSON Mode / Structured Outputs
 
 Our model was also trained on a specific system prompt for Structured Outputs, which should respond with **only** a json object response, in a specific json schema.
@@ -259,16 +267,3 @@ GGUF Quants: https://huggingface.co/NousResearch/DeepHermes-3-Llama-3-3B-Preview
   year={2025}
 }
 ```
-
-
-## Usage
-
-These models are compatible with [llama.cpp](https://github.com/ggerganov/llama.cpp) and similar frameworks.
-
-Example usage with llama.cpp:
-```bash
-./main -m /path/to/model.gguf -p "Hello, I am a language model" -n 128
-```
-
-## Upload Information
-Files were uploaded on Tue Mar 11 04:28:49 PDT 2025
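For context on the first hunk's line "use the model over API using the OpenAI library": below is a minimal sketch of such a call, assuming a locally running vLLM OpenAI-compatible server on its default port and the model id passed at serve time. The host, port, key, and model name are assumptions, not values taken from the README.

```python
# Minimal sketch (assumptions noted): query a vLLM OpenAI-compatible server
# started in the step referenced above, using the official OpenAI client.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # default vLLM server address (assumption)
    api_key="EMPTY",                      # vLLM accepts a dummy key by default
)

response = client.chat.completions.create(
    model="NousResearch/DeepHermes-3-Llama-3-3B-Preview",  # assumed id used when serving
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```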
 
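The second hunk touches the JSON Mode / Structured Outputs section, which says the model should reply with only a JSON object conforming to a given schema; the exact system prompt lives in the unchanged part of the README and is not shown in this diff. The sketch below is a hypothetical illustration of supplying a schema and validating the reply, not the README's documented prompt format.

```python
# Hypothetical sketch only: the real JSON-mode system prompt is defined in the
# README (not visible in this diff). This illustrates the general pattern of
# injecting a JSON schema into the system prompt and validating the reply.
import json
from pydantic import BaseModel

class StockFundamentals(BaseModel):
    symbol: str
    market_cap: float
    pe_ratio: float
    dividend_yield: float | None

schema = json.dumps(StockFundamentals.model_json_schema(), indent=2)

# Placeholder wording -- substitute the exact system prompt from the README.
system_prompt = (
    "You are a helpful assistant that answers in JSON. "
    f"Adhere to this json schema:\n<schema>\n{schema}\n</schema>"
)

# After sending system_prompt plus a user question to the model (for example
# through the vLLM endpoint sketched above), validate the raw JSON reply.
raw_reply = '{"symbol": "TSLA", "market_cap": 1.0, "pe_ratio": 1.0, "dividend_yield": null}'  # placeholder values
stock = StockFundamentals.model_validate_json(raw_reply)
print(stock)
```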