# MN-Hekate-Damnomeneia-17B

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [mergekit-community/MN-Hekate-Nyktipolos-17B](https://huggingface.co/mergekit-community/MN-Hekate-Nyktipolos-17B) as the base.
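
For context on the method: Model Stock derives its interpolation ratio from the geometry of the fine-tuned checkpoints rather than from hand-tuned coefficients. A rough per-layer sketch, following the Model Stock paper (notation is illustrative, not taken from this card):

$$
w_{\text{merged}} = t\,\bar{w} + (1 - t)\,w_{0},
\qquad
t = \frac{k\cos\theta}{1 + (k - 1)\cos\theta}
$$

where $w_{0}$ is the base-model weight for a layer, $\bar{w}$ is the mean of the $k$ fine-tuned weights, and $\theta$ is the (assumed shared) angle between the fine-tuned deltas $w_{i} - w_{0}$.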

### Models Merged

The following models were included in the merge:

* [HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407](https://huggingface.co/HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407)
* [Khetterman/AbominationScience-12B-v4](https://huggingface.co/Khetterman/AbominationScience-12B-v4)
* [mistralai/Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407)
* [nbeerbower/mistral-nemo-bophades-12B](https://huggingface.co/nbeerbower/mistral-nemo-bophades-12B)
* [nbeerbower/mistral-nemo-wissenschaft-12B](https://huggingface.co/nbeerbower/mistral-nemo-wissenschaft-12B)
* [Nitral-AI/Captain-Eris_Violet-GRPO-v0.420](https://huggingface.co/Nitral-AI/Captain-Eris_Violet-GRPO-v0.420)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float32
out_dtype: bfloat16
merge_method: model_stock
base_model: mergekit-community/MN-Hekate-Nyktipolos-17B
slices:
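  # Layers 0-12 pass through from the base model unchanged.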
  - sources:
    - model: mergekit-community/MN-Hekate-Nyktipolos-17B
      layer_range: [0, 12]

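  # Layers 12-20: the base (weight 5) is blended with five 12B donors;
  # bracketed weights are per-layer gradients interpolated across the slice.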
  - sources:
    - model: mergekit-community/MN-Hekate-Nyktipolos-17B
      layer_range: [12, 20]
      parameters:
        weight: 5
    - model: mistralai/Mistral-Nemo-Base-2407
      layer_range: [12, 20]
      parameters:
        weight: 3
    - model: nbeerbower/mistral-nemo-bophades-12B
      layer_range: [12, 20]
      parameters:
        weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
    - model: Nitral-AI/Captain-Eris_Violet-GRPO-v0.420
      layer_range: [12, 20]
      parameters:
        weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
    - model: nbeerbower/mistral-nemo-wissenschaft-12B
      layer_range: [12, 20]
      parameters:
        weight: [1.0, 1.1, 1.2, 1.29, 1.37, 1.43, 1.48, 1.5, 1.5]
    - model: HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407
      layer_range: [12, 20]
      parameters:
        weight: [1.0, 1.1, 1.2, 1.29, 1.37, 1.43, 1.48, 1.5, 1.5]

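  # Base layers 20-36 blend with donor layers 16-32: the offset reflects the
  # deeper 17B base (56 layers) versus the 40-layer 12B donors.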
  - sources:
    - model: mergekit-community/MN-Hekate-Nyktipolos-17B
      layer_range: [20, 36]
      parameters:
        weight: 5
    - model: mistralai/Mistral-Nemo-Base-2407
      layer_range: [16, 32]
      parameters:
        weight: 2
    - model: Khetterman/AbominationScience-12B-v4
      layer_range: [16, 32]
    - model: nbeerbower/mistral-nemo-bophades-12B
      layer_range: [16, 32]
      parameters:
        weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
    - model: Nitral-AI/Captain-Eris_Violet-GRPO-v0.420
      layer_range: [16, 32]
      parameters:
        weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
    - model: HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407
      layer_range: [16, 32]
      parameters:
        weight: [1.0, 1.1, 1.2, 1.29, 1.37, 1.43, 1.48, 1.5, 1.5]
    - model: nbeerbower/mistral-nemo-wissenschaft-12B
      layer_range: [16, 32]
      parameters:
        weight: [1.0, 1.1, 1.2, 1.29, 1.37, 1.43, 1.48, 1.5, 1.5]

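  # Base layers 36-44 blend with donor layers 20-28.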
  - sources:
    - model: mergekit-community/MN-Hekate-Nyktipolos-17B
      layer_range: [36, 44]
      parameters:
        weight: 5
    - model: mistralai/Mistral-Nemo-Base-2407
      layer_range: [20, 28]
      parameters:
        weight: 2
    - model: Khetterman/AbominationScience-12B-v4
      layer_range: [20, 28]
    - model: nbeerbower/mistral-nemo-bophades-12B
      layer_range: [20, 28]
      parameters:
        weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
    - model: Nitral-AI/Captain-Eris_Violet-GRPO-v0.420
      layer_range: [20, 28]
      parameters:
        weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]

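  # The final layers (44-56) again pass through from the base unchanged.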
  - sources:
    - model: mergekit-community/MN-Hekate-Nyktipolos-17B
      layer_range: [44, 56]

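# Union all source vocabularies, but pin the chat-control tokens to the
# base model's embeddings.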
tokenizer:
  source: union
  tokens:
    "[INST]":
      source: mergekit-community/MN-Hekate-Nyktipolos-17B
      force: true
    "[/INST]":
      source: mergekit-community/MN-Hekate-Nyktipolos-17B
      force: true
    "<|im_start|>":
      source: mergekit-community/MN-Hekate-Nyktipolos-17B
      force: true
    "<|im_end|>":
      source: mergekit-community/MN-Hekate-Nyktipolos-17B
      force: true
```
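
To reproduce the merge, this configuration can be passed to the `mergekit-yaml` CLI or to mergekit's Python API. A minimal sketch of the latter, assuming the YAML above is saved as `config.yaml` (file and output paths are illustrative, and the API can shift between mergekit versions):

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML configuration shown above.
with open("config.yaml", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge; the output directory name is illustrative.
run_merge(
    config,
    out_path="./MN-Hekate-Damnomeneia-17B",
    options=MergeOptions(
        cuda=True,            # set False to merge on CPU
        copy_tokenizer=True,  # write the union tokenizer next to the weights
    ),
)
```

Because the tokenizer section force-includes both Mistral-instruct (`[INST]`) and ChatML (`<|im_start|>`) control tokens, either prompt style should tokenize cleanly. A minimal inference sketch with transformers; the prompt format is an assumption based on the forced tokens, not a documented chat template:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "mergekit-community/MN-Hekate-Damnomeneia-17B"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo, torch_dtype=torch.bfloat16, device_map="auto"
)

# Mistral-instruct style prompt; ChatML should also tokenize (see above).
prompt = "[INST] Who is Hekate Nyktipolos in Greek myth? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```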