CollectiveCognition-v1.1-Mistral-7B and airoboros-mistral2.2-7b glued together, then finetuned with QLoRA on the PIPPA and LimaRPv3 datasets.

Description

This repo contains fp16 files of Mistral-11B-CC-Air-RP.
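Since the weights are shipped in fp16, the model can be loaded in half precision with the transformers library. Below is a minimal sketch, assuming a recent transformers install and enough GPU memory (roughly 22 GB or more) to hold the ~11B parameters in fp16.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Undi95/Mistral-11B-CC-Air-RP"

# Load the tokenizer and the fp16 weights; device_map="auto" places layers on available GPUs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)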

Models used

- teknium/CollectiveCognition-v1.1-Mistral-7B
- teknium/airoboros-mistral2.2-7b

Prompt template: Alpaca or default

Alpaca:

Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:

Default:

USER: <prompt>
ASSISTANT:
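A short sketch of applying the Alpaca template in code, continuing from the loading snippet in the Description (it reuses that snippet's tokenizer and model; the example instruction and sampling settings are only illustrative):

# Build a prompt in the Alpaca format shown above.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{prompt}\n\n### Response:\n"
)
prompt = ALPACA_TEMPLATE.format(prompt="Write a short, in-character greeting from a grumpy innkeeper.")

# Tokenize, generate, and decode only the newly generated tokens.
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))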

The secret sauce

slices:
  - sources:
    - model: teknium/CollectiveCognition-v1.1-Mistral-7B
      layer_range: [0, 24]
  - sources:
    - model: teknium/airoboros-mistral2.2-7b
      layer_range: [8, 32]
merge_method: passthrough
dtype: float16
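The passthrough merge stacks layers 0-23 of CollectiveCognition-v1.1-Mistral-7B on top of layers 8-31 of airoboros-mistral2.2-7b, giving 48 transformer layers instead of Mistral-7B's 32, which is where the ~11B size comes from. If you want to reproduce the base merge yourself (before the QLoRA finetune), the config can be fed to mergekit; the sketch below assumes mergekit is installed with its mergekit-yaml command on the PATH, and the file and output directory names are arbitrary.

import subprocess
from pathlib import Path

# Write the merge config above to disk.
config = """\
slices:
  - sources:
    - model: teknium/CollectiveCognition-v1.1-Mistral-7B
      layer_range: [0, 24]
  - sources:
    - model: teknium/airoboros-mistral2.2-7b
      layer_range: [8, 32]
merge_method: passthrough
dtype: float16
"""
Path("merge-config.yml").write_text(config)

# Run the mergekit CLI; the merged fp16 model is written to ./Mistral-11B-CC-Air.
subprocess.run(["mergekit-yaml", "merge-config.yml", "./Mistral-11B-CC-Air"], check=True)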

Special thanks to Sushi.

If you want to support me, you can here.
