Mungert committed
Commit e2758f3 · verified · 1 Parent(s): b11759a

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md (+4 -4)
README.md CHANGED
@@ -200,7 +200,7 @@ These models are optimized for **extreme memory efficiency**, making them ideal
 # <span id="testllm" style="color: #7F7FFF;">🚀 If you find these models useful</span>
 ❤ **Please click "Like" if you find this useful!**
 Help me test my **AI-Powered Network Monitor Assistant** with **quantum-ready security checks**:
-👉 [Free Network Monitor](https://readyforquantum.com)
+👉 [Quantum Network Monitor](https://readyforquantum.com)
 
 💬 **How to test**:
 1. Click the **chat icon** (bottom right on any page)
@@ -226,7 +226,7 @@ I’m pushing the limits of **small open-source models for AI network monitoring
 🟢 **TurboLLM** – Uses **gpt-4-mini** for:
 - **Real-time network diagnostics**
 - **Automated penetration testing** (Nmap/Metasploit)
-- 🔑 Get more tokens by [downloading our Free Network Monitor Agent](https://readyforquantum.com/download/?utm_source=huggingface&utm_medium=referral&utm_campaign=huggingface_repo_readme)
+- 🔑 Get more tokens by [downloading our Quantum Network Monitor Agent](https://readyforquantum.com/download/?utm_source=huggingface&utm_medium=referral&utm_campaign=huggingface_repo_readme)
 
 🔵 **HugLLM** – Open-source models (≈8B params):
 - **2x more tokens** than TurboLLM
@@ -237,10 +237,10 @@ I’m pushing the limits of **small open-source models for AI network monitoring
 1. `"Give me info on my websites SSL certificate"`
 2. `"Check if my server is using quantum safe encyption for communication"`
 3. `"Run a quick Nmap vulnerability test"`
-4. '"Create a cmd processor to .. (what ever you want)" Note you need to install a Free Network Monitor Agent to run the .net code from. This is a very flexible and powerful feature. Use with caution!
+4. '"Create a cmd processor to .. (what ever you want)" Note you need to install a Quantum Network Monitor Agent to run the .net code from. This is a very flexible and powerful feature. Use with caution!
 
 ### Final word
-I fund the servers to create the models files, run the Free Network Monitor Service and Pay for Inference from Novita and OpenAI all from my own pocket. All of the code for creating the models and the work I have done with Free Network Monitor is [open source](https://github.com/Mungert69). Feel free to use what you find useful. Please support my work and consider [buying me a coffee](https://www.buymeacoffee.com/mahadeva) .
+I fund the servers to create the models files, run the Quantum Network Monitor Service and Pay for Inference from Novita and OpenAI all from my own pocket. All of the code for creating the models and the work I have done with Quantum Network Monitor is [open source](https://github.com/Mungert69). Feel free to use what you find useful. Please support my work and consider [buying me a coffee](https://www.buymeacoffee.com/mahadeva) .
 This will help me pay for the services and increase the token limits for everyone.
 
 Thank you :)
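
The commit message above says the README was uploaded with huggingface_hub. A minimal sketch of how such an upload is typically made with the `HfApi.upload_file` call; the `repo_id` below is a placeholder assumption, not taken from this commit:

```python
# Minimal sketch of a README upload via huggingface_hub (assumed workflow,
# not taken from this commit). Requires `pip install huggingface_hub` and a
# token from `huggingface-cli login` or the HF_TOKEN environment variable.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="README.md",      # local file to push
    path_in_repo="README.md",         # destination path inside the repo
    repo_id="Mungert/<model-repo>",   # placeholder: actual repo id not shown here
    repo_type="model",
    commit_message="Upload README.md with huggingface_hub",
)
```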