β–„β–„β–„β–„β–„    β–„β–ˆβ–ˆβ–ˆβ–„   β–„β–ˆβ–„      β–„      β–„   β–ˆβ–ˆβ–„   β–ˆβ–ˆ      β–ˆβ–ˆβ–ˆβ–ˆ    β–„β–ˆβ–ˆβ–ˆβ–ˆβ–„
       β–ˆ    β–€β–„  β–ˆβ–€   β–€  β–ˆβ–€ β–€β–„     β–ˆ      β–ˆ  β–ˆ  β–ˆ  β–ˆ β–ˆ     β–ˆ   β–ˆ    β–ˆ 
    β–„   β–€β–€β–€β–€β–„   β–ˆβ–ˆβ–„β–„    β–ˆ   β–€  β–ˆ   β–ˆ β–ˆβ–ˆ   β–ˆ β–ˆ   β–ˆ β–ˆβ–„β–„β–ˆ     β–ˆ   β–ˆ    β–ˆβ–„β–„β–„β–„β–„
     β–€β–€β–„β–„β–„β–„β–€    β–ˆβ–„   β–„β–€ β–ˆβ–„  β–„β–€ β–ˆ   β–ˆ β–ˆ β–ˆ  β–ˆ β–ˆ  β–ˆ  β–ˆ  β–ˆ      β–ˆ   β–ˆ          β–ˆ 
               β–€β–ˆβ–ˆβ–ˆβ–€   β–€β–ˆβ–ˆβ–ˆβ–€  β–ˆβ–„ β–„β–ˆ β–ˆ  β–ˆ β–ˆ β–ˆβ–ˆβ–ˆβ–€      β–ˆ       β–ˆβ–ˆβ–ˆβ–ˆβ–€ β–β–ˆ  β–„β–„β–„β–„β–ˆ  
                               β–€β–€β–€  β–ˆ   β–ˆβ–ˆ         β–ˆ                          
                                                 β–€                           
   
                      β‹†β‹†ΰ­¨ΰ­§Λš THE PRIMΓ‰TOILE ENGINE Λšΰ­¨ΰ­§β‹†ο½‘Λšβ‹†
                  β€” Visual Novel generation under starlight β€”
| Version | Type | Strengths | Weaknesses | Recommended Use |
|---|---|---|---|---|
| Secunda-0.1-GGUF / RAW | Instruction | Most precise; coherent code; perfected Modelfile | Smaller context; limited flexibility | Production / baseline |
| Secunda-0.3-F16-QA | QA-based input | Acceptable for question-based generation | Less accurate than 0.1; less coherent | Prototyping (QA mode) |
| Secunda-0.3-F16-TEXT | Text-to-text | Flexible for freeform tasks | Slightly off; Modelfile-dependent | Experimental / text rewriting |
| Secunda-0.3-GGUF | GGUF build | Portable GGUF of 0.3 | Inherits 0.3 weaknesses | Lightweight local testing |
| Secunda-0.5-RAW | QA natural | Best QA understanding; long-form generation potential | Inconsistent output length; some instability | Research / testing LoRA |
| Secunda-0.5-GGUF | GGUF build | Portable, inference-ready version of 0.5 | Shares issues of 0.5 | Offline experimentation |

πŸŒ™ Secunda-0.5-GGUF [EXPERIMENTAL]

For more concise and controlled results, use Secunda-0.1-GGUF (best) or Secunda-0.3-GGUF instead.

Secunda-0.5-GGUF is the most verbose release of the Secunda model to date:

  • Fine-tuned to generate fully structured Ren'Py .rpy scripts from natural-language prompts, now available in GGUF format for fast local inference via llama.cpp, Ollama, llamafile, or LM Studio.
  • It is also, however, the most capricious and least reliable release.
  • It benefits from additional prompt engineering, though the bundled Modelfile should be enough to yield acceptable results.

✨ What it generates

Secunda-0.5-GGUF excels at:

  • βœ… Creating very long scripts
  • βœ… Backgrounds & sprite declarations
  • βœ… Emotional, coherent dialogue

πŸ“ Model Variants

| Variant | Quant Type | Filename | Notes |
|---|---|---|---|
| πŸ”₯ Main | q8_0 | secunda-0.5-q8_0.gguf | Best quality. Use this for real content generation. |
| πŸŒ’ Experimental | tq2_0 | secunda-0.5-tq2_0.gguf | Smaller and faster; may lose stylistic precision. |
| βš—οΈ Tiny Deploy | tq1_0 | secunda-0.5-tq1_0.gguf | Edge-case or testing deployments. |

πŸ›  How to Run (Ollama)

```
ollama create secunda -f Modelfile
ollama run secunda
```

Prompt example:

Prompts can be written in plain natural language; no special formatting is required.

πŸ”Ί Important: Always run the model with the Modelfile, which defines its structured behavior and generation tags.
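The Modelfile ships alongside the model; as a minimal sketch of its general shape (the base-file path, parameter value, and SYSTEM text below are illustrative assumptions, not the distributed file):

```
FROM ./secunda-0.5-q8_0.gguf

# Illustrative value only -- use the Modelfile distributed with the model.
PARAMETER temperature 0.7

SYSTEM """You generate structured Ren'Py .rpy scripts from natural-language prompts."""
```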


πŸ§ͺ Evaluation

  • βœ… Over 450 test scripts generated
  • βœ… 75%+ valid Ren'Py syntax
  • βœ… Scene and tone fidelity preserved across chapters
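A "valid Ren'Py syntax" check like the one above could be approximated with a line-level heuristic. This is only a sketch under assumed rules (the function name and the accepted statement shapes are mine, not the evaluation harness actually used):

```python
import re

def looks_like_renpy(script: str) -> bool:
    """Rough heuristic: does the text resemble a valid Ren'Py .rpy script?

    A crude stand-in for a real parser: requires at least one label block
    and tolerates only statement shapes Ren'Py commonly uses.
    """
    has_label = False
    for line in script.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # skip blank lines and comments
        if re.match(r"^label\s+\w+\s*:$", stripped):
            has_label = True
        elif re.match(r'^(scene|show|hide|play|stop|jump|return|menu\s*:|\$|".*"|\w+\s+".*")', stripped):
            continue  # common statements, narration, or character dialogue
        else:
            return False
    return has_label

sample = '''
label start:
    scene bg observatory
    show secunda smile
    "A single phrase can spark a story."
    return
'''
print(looks_like_renpy(sample))  # True
```

A real validator would instead feed the script through Ren'Py's own lint pass, which checks far more than line shapes.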

🧼 Training Integrity

⚠️ NO HUMAN-WRITTEN VISUAL NOVELS were used in training. ⚠️ Secunda was trained exclusively on machine-generated, structured examples and dataset augmentations. We respect the hard work of VN creators.

πŸ™ If you love visual novels, support indie authors on itch.io!


πŸ“š Citation

```
@misc{secunda2025gguf,
  title={Secunda-0.5-GGUF},
  author={Yaroster},
  year={2025},
  note={https://huggingface.co/Yaroster/Secunda-0.5-GGUF}
}
```

πŸ’« Final Words

✧ Because stories can spark from a single phrase ✧ This is Secunda-0.5-GGUF β€” and it finally writes like a novelist.

Model size: 8.03B params (GGUF)
Architecture: llama