---
license: apache-2.0
datasets:
- openerotica/mixed-rp
- kingbri/PIPPA-shareGPT
- flammenai/character-roleplay-DPO
language:
- en
base_model:
- N-Bot-Int/ZoraBetaA2
pipeline_tag: text-generation
tags:
- unsloth
- Uncensored
- text-generation-inference
- transformers
- llama
- trl
- roleplay
- conversational
---
## Support Us Through
- [Official Ko-Fi link!](https://ko-fi.com/nexusnetworkint)
## GGUF Version
GGUF builds with quants are available, allowing you to run the model in KoboldCPP and other AI environments!
Quantizations:

| Quant Type | Benefits | Cons |
|---|---|---|
| Q16_0 | ✅ Highest accuracy (closest to full model)<br>✅ Best for complex reasoning & detailed outputs<br>✅ Suitable for high-end GPUs & serious workloads | ❌ Requires significantly more VRAM/RAM<br>❌ Slower inference compared to Q4 & Q5<br>❌ Larger file size (takes more storage) |
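If you want to try a GGUF quant outside KoboldCPP, here is a minimal sketch using `llama-cpp-python`. The filename below is a placeholder, not an actual release asset; swap in whichever quant file you downloaded.

```python
# Minimal sketch: running a GGUF quant with llama-cpp-python.
# "ZoraBetaA2.Q16_0.gguf" is a hypothetical filename; use the file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="ZoraBetaA2.Q16_0.gguf",  # placeholder path to the downloaded GGUF
    n_ctx=4096,                          # context window; lower it if RAM/VRAM is tight
    n_gpu_layers=-1,                     # offload all layers to GPU if one is available
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Introduce yourself in character."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

Higher-accuracy quants like Q16_0 trade speed and memory for quality, so pick the quant that fits your hardware before worrying about settings.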
## Model Details:
Read the full model details on the Hugging Face model page Here
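For the full-precision weights, a minimal sketch with the Hugging Face `transformers` text-generation pipeline is shown below. The repo id uses the base model listed in the metadata (`N-Bot-Int/ZoraBetaA2`) as a stand-in; replace it with this model's actual repository path.

```python
# Minimal sketch: chatting with the model via the transformers pipeline.
# The repo id is an assumption (the base model); substitute this model's repo path.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="N-Bot-Int/ZoraBetaA2",   # placeholder repo id
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Stay in character as a friendly tavern keeper and greet me."}
]

result = generator(messages, max_new_tokens=128)
# generated_text holds the conversation, including the new assistant turn
print(result[0]["generated_text"][-1]["content"])
```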