# THE PRIMÉTOILE ENGINE

*Visual Novel generation under starlight*
| Version | Type | Strengths | Weaknesses | Recommended Use |
|---|---|---|---|---|
| Secunda-0.1-GGUF / RAW | Instruction | Most precise; coherent code; perfected Modelfile | Smaller context / limited flexibility | Production / baseline |
| Secunda-0.3-F16-QA | QA-based input | Acceptable for question-based generation | Less accurate than 0.1; not as coherent | Prototyping (QA mode) |
| Secunda-0.3-F16-TEXT | Text-to-text | Flexible for freeform tasks | Slightly off; Modelfile-dependent | Experimental / text rewrite |
| Secunda-0.3-GGUF | GGUF build | Portable GGUF of 0.3 | Inherits 0.3 weaknesses | Lightweight local testing |
| Secunda-0.5-RAW | QA natural | Best QA understanding; long-form generation potential | Inconsistent output length; some instability | Research / testing LoRA |
| Secunda-0.5-GGUF | GGUF build | Portable, inference-ready version of 0.5 | Shares issues of 0.5 | Offline experimentation |
## Secunda-0.5-GGUF [EXPERIMENTAL]

> For more concise and controlled results, I recommend using **Secunda-0.1-GGUF** (best) or **Secunda-0.3-GGUF**.
Secunda-0.5-GGUF is the most verbose release of the Secunda model to date:

- Fine-tuned to generate fully structured Ren'Py `.rpy` scripts from natural-language prompts, and now available in GGUF format for fast local inference via `llama.cpp`, `ollama`, `llamafile`, or `LM Studio` (see the example invocation after this list).
- Yet it is also the most capricious and least reliable release.
- It needs further prompt engineering, though we believe our Modelfile might be enough to yield acceptable results.
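As a rough illustration only, running the GGUF build through `llama.cpp` might look like the sketch below; the binary path, file name, and sampling settings are assumptions, not part of this release:

```bash
# Illustrative llama.cpp invocation; the file name is taken from the variants table below,
# and the flag values shown here are only an example configuration.
./llama-cli -m secunda-0.5-q8_0.gguf \
  -p "Write a short Ren'Py scene set in a seaside cafe at dusk." \
  -n 1024 --temp 0.8
```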
## ✨ What it generates

Secunda-0.5-GGUF excels at (a short sketch follows this list):

- ✅ Creating very long scripts
- ✅ Backgrounds & sprite declarations
- ✅ Emotional, coherent dialogue
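As a rough sketch of that output shape (hand-written for illustration, not actual model output), a generated `.rpy` scene might look like:

```renpy
# Hypothetical example of the structures Secunda targets; not actual model output.
define m = Character("Mira", color="#c8a2c8")

image bg observatory = "bg_observatory.png"
image mira smile = "mira_smile.png"

label chapter_one:
    scene bg observatory with fade
    show mira smile at center
    m "The stars feel closer tonight, don't they?"
    m "Almost as if the sky were waiting to listen."
    return
```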
## Model Variants

| Variant | Quant Type | Filename | Notes |
|---|---|---|---|
| Main | `q8_0` | `secunda-0.5-q8_0.gguf` | Best quality. Use this for real content generation. |
| Experimental | `tq2_0` | `secunda-0.5-tq2_0.gguf` | Smaller, faster. May lose stylistic precision. |
| Tiny Deploy | `tq1_0` | `secunda-0.5-tq1_0.gguf` | Edge-case or testing deployments. |
## How to Run (Ollama)

```bash
ollama create secunda -f Modelfile
ollama run secunda
```

**Prompt example:** you can write naturally, without any special formatting.
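A hypothetical prompt (written for illustration, not taken from the release's own examples) might look like:

```text
Write a short Ren'Py scene in which two astronomy students meet on an
observatory rooftop at night. Include a background, two character sprites,
and about ten lines of dialogue.
```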
> **Important:** Always run with the `Modelfile` that defines the model's structured behavior and generation tags.
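For orientation only, a minimal Ollama Modelfile has the shape sketched below; the Modelfile shipped with this release is the one to use, since it defines the actual system prompt and generation tags (the path, parameter, and system text here are assumptions):

```
# Hypothetical sketch of an Ollama Modelfile for the q8_0 build; the release's own
# Modelfile is authoritative and defines the real structured behavior and tags.
FROM ./secunda-0.5-q8_0.gguf
PARAMETER temperature 0.8
SYSTEM "You write complete Ren'Py .rpy scripts from natural-language story prompts."
```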
## 🧪 Evaluation

- ✅ Over 450 test scripts generated
- ✅ 75%+ valid Ren'Py syntax
- ✅ Scene and tone fidelity preserved across chapters
## 🧼 Training Integrity

> ⚠️ **NO HUMAN-WRITTEN VISUAL NOVELS were used in training** ⚠️
> Secunda was trained on machine-generated, structured examples and dataset augmentations. We respect the hard work of VN creators.

If you love visual novels, support indie authors on itch.io!
## Citation

```bibtex
@misc{secunda2025gguf,
  title={Secunda-0.5-GGUF},
  author={Yaroster},
  year={2025},
  note={https://huggingface.co/Yaroster/Secunda-0.5-GGUF}
}
```
## Related Projects

- **Primétoile** – full pipeline for multi-chapter VN generation
- **Secunda-0.3-F16-QA** – QA-style narrative variant
- **Secunda-0.3-F16-TEXT** – unstructured story generation
- **Secunda-0.1-RAW** – original fine-tuned model
## 💫 Final Words

✧ Because stories can spark from a single phrase ✧

This is Secunda-0.5-GGUF, and it finally writes like a novelist.