Mika (named after what my Claude 3 Opus chat called itself) is a model trained in a similar manner to Fett-uccine, with synthetic RP data created by Claude also included.

Format

I've had the best results with the ChatML context template and the Mistral Instruct template; however, YMMV.
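As a quick illustration, below is a minimal sketch of running a ChatML-formatted prompt through llama-cpp-python. The GGUF filename, system prompt, and sampling settings are placeholders of my own, not values taken from this card.

```python
# Minimal sketch: ChatML-style prompt via llama-cpp-python.
# The model filename below is an assumption; point it at whichever quant you downloaded.
from llama_cpp import Llama

llm = Llama(model_path="mika-7b.Q4_K_M.gguf", n_ctx=4096)

# Standard ChatML turns: system, user, then an open assistant turn for the model to complete.
prompt = (
    "<|im_start|>system\n"
    "You are Mika, a playful roleplay partner.<|im_end|>\n"
    "<|im_start|>user\n"
    "Describe the tavern we just walked into.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

out = llm(prompt, max_tokens=256, stop=["<|im_end|>"])
print(out["choices"][0]["text"])
```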

GGUF

Model size: 7.24B params
Architecture: llama
Quantizations: 3-bit, 4-bit, 5-bit, 6-bit, 8-bit

