Dans-PersonalityEngine-V1.3.0-24b
Dans-PersonalityEngine is a versatile model series fine-tuned on 50+ specialized datasets, designed to excel at both creative tasks (like roleplay and co-writing) and technical challenges (such as code generation, tool use, and complex reasoning).
V1.3.0 introduces multilingual capabilities with support for 10 languages and enhanced domain expertise across multiple fields. English remains the primary language, and it is where peak performance can be expected.
Multilingual Support
Arabic, Chinese, English, French, German, Hindi, Japanese, Korean, Portuguese, Spanish
Key Details
BASE MODEL: mistralai/Mistral-Small-3.1-24B-Base-2503
LICENSE: apache-2.0
LANGUAGE: Multilingual with 10 supported languages
CONTEXT LENGTH: 32768 tokens, 131072 with degraded recall
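For orientation, here is a minimal loading sketch using Hugging Face transformers. The repository id (the unquantized upstream model) and device settings are assumptions, not values taken from this card:

```python
# Hypothetical loading sketch; adjust the repository id, dtype, and device
# mapping to your hardware. Requires transformers and accelerate.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PocketDoc/Dans-PersonalityEngine-V1.3.0-24b"  # assumed upstream id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

# Per the card: recall is reliable up to 32768 tokens; contexts up to
# 131072 tokens work but with degraded recall.
MAX_RELIABLE_CONTEXT = 32768
```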
Recommended Settings
TEMPERATURE: 1.0
TOP_P: 0.9
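As an illustration, these settings map directly onto an OpenAI-compatible chat endpoint. The base_url, api_key, and served model name below are placeholders, not values from this card:

```python
# Hypothetical request against any OpenAI-compatible server hosting the model.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="Dans-PersonalityEngine-V1.3.0-24b",  # whatever name your server exposes
    messages=[{"role": "user", "content": "Hi there!"}],
    temperature=1.0,  # recommended setting
    top_p=0.9,        # recommended setting
)
print(response.choices[0].message.content)
```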
Prompting Format
The model uses the following prompt format, which I'll refer to as "DanChat-2":
<|system|>system prompt<|endoftext|><|user|>Hi there!<|endoftext|><|assistant|>Hey, how can I help?<|endoftext|>
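To make the format concrete, here is a small sketch that assembles a DanChat-2 prompt from a list of chat messages. The helper name is hypothetical, and the trailing open `<|assistant|>` turn (so the model generates the reply) is an assumption based on how chat templates are typically used at generation time:

```python
# Hypothetical helper: build a DanChat-2 prompt string from chat messages.
ROLE_TOKENS = {
    "system": "<|system|>",
    "user": "<|user|>",
    "assistant": "<|assistant|>",
}

def build_danchat2_prompt(messages, add_generation_prompt=True):
    """messages: list of {"role": ..., "content": ...} dicts."""
    prompt = "".join(
        f"{ROLE_TOKENS[m['role']]}{m['content']}<|endoftext|>" for m in messages
    )
    if add_generation_prompt:
        prompt += "<|assistant|>"  # open an assistant turn for the model to fill
    return prompt

print(build_danchat2_prompt([
    {"role": "system", "content": "system prompt"},
    {"role": "user", "content": "Hi there!"},
]))
# -> <|system|>system prompt<|endoftext|><|user|>Hi there!<|endoftext|><|assistant|>
```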
Why not ChatML?
While ChatML is a standard format for LLMs, it has limitations. DanChat-2 assigns a dedicated special token to each role, which reduces role-related biases and helps the model adapt to different tasks more readily.
Inference Provider
This model and others are available from ⚡Mancer AI for those interested in high-quality inference without owning or renting expensive hardware.
Training Process
The model was trained using Axolotl on 8x H100 GPUs for 50 hours. The resources to train this model were provided by Prime Intellect and Kalomaze.
Support Development
Development is limited by funding and resources. To help support it:
- Contact on HF
- Email: [email protected]