This is a frankenmerge model.

# MidnightMoon-16B

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the Passthrough merge method.
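Passthrough performs no weight arithmetic: it copies the selected layer ranges verbatim and stacks them end to end, so the result is deeper and larger than any of its 12B sources. A minimal accounting sketch, assuming all three sources share Mistral Nemo's 40-layer architecture (an inference from the layer ranges in the configuration below, not stated elsewhere in this card):

```python
# Slice boundaries taken from the YAML configuration below.
# mergekit layer_range values are half-open: [0, 8] keeps layers 0-7.
slices = [(0, 8), (8, 24), (8, 24), (24, 40)]

total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 56 layers, up from 40 in each 12B source
```

With 56 transformer layers plus the embedding and output matrices, the merged checkpoint comes out at roughly 16.6B parameters.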

### Models Merged

The following models were included in the merge:

* [shisa-ai/shisa-v2-mistral-nemo-12b](https://huggingface.co/shisa-ai/shisa-v2-mistral-nemo-12b)
* [Elizezen/Himeyuri-v0.1-12B](https://huggingface.co/Elizezen/Himeyuri-v0.1-12B)
* [yamatazen/ForgottenMaid-12B](https://huggingface.co/yamatazen/ForgottenMaid-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
    - model: shisa-ai/shisa-v2-mistral-nemo-12b
      layer_range: [0, 8]
  - sources:
    - model: Elizezen/Himeyuri-v0.1-12B
      layer_range: [8, 24]
  - sources:
    - model: yamatazen/ForgottenMaid-12B
      layer_range: [8, 24]
  - sources:
    - model: shisa-ai/shisa-v2-mistral-nemo-12b
      layer_range: [24, 40]
merge_method: passthrough
dtype: bfloat16
out_dtype: bfloat16
tokenizer:
  source: shisa-ai/shisa-v2-mistral-nemo-12b
```
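To reproduce the merge, save the YAML above (e.g. as `config.yaml`) and run it through mergekit, either via the `mergekit-yaml config.yaml ./output` CLI or its Python API. Below is a minimal sketch of the latter; the output path and option values are illustrative assumptions, not taken from this card:

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML configuration shown above (assumed saved as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Stitch the slices together and write the merged model to disk.
# Source weights are fetched from the Hugging Face Hub as needed.
run_merge(
    merge_config,
    out_path="./MidnightMoon-16B",
    options=MergeOptions(
        cuda=False,           # set True to merge on a GPU
        copy_tokenizer=True,  # tokenizer comes from shisa-v2, per the config
        lazy_unpickle=True,   # reduce peak memory while reading shards
    ),
)
```

The resulting directory loads like any other causal LM checkpoint, e.g. via `transformers.AutoModelForCausalLM.from_pretrained`.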