Mika (named after what my Claude 3 Opus chat called itself) is a model trained in a similar manner to Fett-uccine, with synthetic RP data created by Claude also included.

Format

I've had the best results with the ChatML context template and the Mistral Instruct template; however, YMMV.
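
For reference, here is a minimal sketch of what those two prompt formats typically look like, assuming the standard ChatML and Mistral Instruct special tokens; the exact tokens and stop strings may vary with your frontend or tokenizer configuration, so treat these strings as illustrative rather than authoritative.

```python
# Illustrative prompt builders for the two formats mentioned above.
# Special tokens assumed here (<|im_start|>, <|im_end|>, [INST]) follow the
# common ChatML / Mistral Instruct conventions and may differ per setup.

def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-style prompt ending at the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

def mistral_instruct_prompt(user: str) -> str:
    """Build a Mistral Instruct-style prompt."""
    return f"[INST] {user} [/INST]"

if __name__ == "__main__":
    print(chatml_prompt("You are Mika, a helpful roleplay partner.", "Hello!"))
    print(mistral_instruct_prompt("Hello!"))
```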
