---
base_model:
  - DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS
  - mergekit-community/MN-Sappho-j-12B
  - nbeerbower/mistral-nemo-wissenschaft-12B
  - mergekit-community/MN-Hekate-Nykhia-17B
  - mergekit-community/MN-Sappho-g3-12B
  - jtatman/mistral_nemo_12b_reasoning_psychology_lora
  - Nitral-Archive/Diogenes-12B
  - mistralai/Mistral-Nemo-Base-2407
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with mergekit-community/MN-Hekate-Nykhia-17B as the base.
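As background, Model Stock estimates an interpolation ratio from the geometry of the fine-tuned models' task vectors (their weight deltas relative to the base). A minimal sketch of the core idea for the two-model case follows; this is a simplified illustration, not mergekit's actual implementation, which generalizes to N models and operates per weight tensor:

```python
import numpy as np

def model_stock_2(base, ft_a, ft_b):
    """Model Stock merge of two fine-tuned weight vectors (k = 2 case)."""
    # Task vectors: fine-tuned weights relative to the base.
    ta, tb = ft_a - base, ft_b - base
    cos = float(np.dot(ta, tb) / (np.linalg.norm(ta) * np.linalg.norm(tb)))
    # Interpolation ratio t = k*cos / (1 + (k-1)*cos) with k = 2.
    t = 2 * cos / (1 + cos)
    # Move from the base toward the average of the fine-tuned models.
    w_avg = (ft_a + ft_b) / 2
    return t * w_avg + (1 - t) * base
```

When the two task vectors point the same way (cos = 1), the result is simply their average; when they are orthogonal (cos = 0), the method falls back to the base weights.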

### Models Merged

The following models were included in the merge:

- mistralai/Mistral-Nemo-Base-2407
- Nitral-Archive/Diogenes-12B
- mergekit-community/MN-Sappho-g3-12B + jtatman/mistral_nemo_12b_reasoning_psychology_lora
- nbeerbower/mistral-nemo-wissenschaft-12B
- mergekit-community/MN-Sappho-j-12B
- DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float32
out_dtype: bfloat16
merge_method: model_stock
base_model: mergekit-community/MN-Hekate-Nykhia-17B
slices:
  - sources:
    - model: mergekit-community/MN-Hekate-Nykhia-17B
      layer_range: [0, 20]

  - sources:
    - model: mergekit-community/MN-Hekate-Nykhia-17B
      layer_range: [20, 36]
      parameters:
        weight: [0.5, 0.25]
    - model: mistralai/Mistral-Nemo-Base-2407
      layer_range: [16, 32]
      parameters:
        weight: [2, 1]
    - model: Nitral-Archive/Diogenes-12B
      layer_range: [16, 32]
      parameters:
        weight: [1, 2]
    - model: mergekit-community/MN-Sappho-g3-12B+jtatman/mistral_nemo_12b_reasoning_psychology_lora
      layer_range: [16, 32]
      parameters:
        weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
    - model: nbeerbower/mistral-nemo-wissenschaft-12B
      layer_range: [16, 32]
      parameters:
        weight: [1.5, 1.49, 1.46, 1.4, 1.33, 1.25, 1.15, 1.05, 1]
    - model: mergekit-community/MN-Sappho-j-12B
      layer_range: [16, 32]
      parameters:
        weight: [1.0, 1.1, 1.2, 1.29, 1.37, 1.43, 1.48, 1.5, 1.5]
    - model: DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS
      layer_range: [16, 32]
      parameters:
        weight: [1.0, 1.1, 1.2, 1.29, 1.37, 1.43, 1.48, 1.5, 1.5]

  - sources:
    - model: mergekit-community/MN-Hekate-Nykhia-17B
      layer_range: [36, 56]

tokenizer:
  source: union
  tokens:
    "[INST]":
      source: mergekit-community/MN-Hekate-Nykhia-17B
      force: true
    "[/INST]":
      source: mergekit-community/MN-Hekate-Nykhia-17B
      force: true
    "<|im_start|>":
      source: mergekit-community/MN-Hekate-Nykhia-17B
      force: true
    "<|im_end|>":
      source: mergekit-community/MN-Hekate-Nykhia-17B
      force: true
```
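Note that the list-valued `weight` entries in the configuration are gradients: mergekit interpolates them linearly across the layers of the slice, so `weight: [2, 1]` ramps from 2 down to 1 over the 16-layer range. A rough sketch of that interpolation (a hypothetical helper for illustration, not mergekit's actual code):

```python
def interpolate_gradient(gradient, num_layers):
    """Linearly interpolate a gradient list to one weight per layer."""
    if len(gradient) == 1 or num_layers == 1:
        return [gradient[0]] * num_layers
    weights = []
    for i in range(num_layers):
        # Position of this layer along the gradient, in [0, len(gradient) - 1].
        pos = i / (num_layers - 1) * (len(gradient) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(gradient) - 1)
        frac = pos - lo
        weights.append(gradient[lo] * (1 - frac) + gradient[hi] * frac)
    return weights

# Per-layer weights for the base model's [20, 36] slice in the config above.
print(interpolate_gradient([0.5, 0.25], 16))
```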