my-merged-model

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the TIES merge method, with ClaudioItaly/Vangelus-Secundus as the base model.
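
For context, TIES operates on "task vectors" (each fine-tuned model's delta from the base): it trims each delta to its highest-magnitude entries (the density parameter), elects a per-parameter sign by summed magnitude, and averages only the deltas that agree with that sign. The Python sketch below is a minimal conceptual illustration of that idea for a single tensor, not mergekit's implementation; the function name and the simplified weighting scheme are assumptions made for this example.

# Conceptual sketch of TIES merging for one tensor (illustrative only, not mergekit's code).
import torch

def ties_merge_tensor(base, finetuned, weights, density=0.7):
    deltas = []
    for ft, w in zip(finetuned, weights):
        delta = (ft - base) * w                           # weighted task vector
        k = max(1, int(density * delta.numel()))          # keep top `density` fraction by magnitude
        thresh = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        deltas.append(torch.where(delta.abs() >= thresh, delta, torch.zeros_like(delta)))
    stacked = torch.stack(deltas)                         # (num_models, *tensor_shape)
    elected_sign = stacked.sum(dim=0).sign()              # per-parameter majority sign
    agree = (stacked.sign() == elected_sign) & (stacked != 0)
    merged = (stacked * agree).sum(dim=0)
    count = agree.sum(dim=0).clamp(min=1)                 # average over agreeing models only
    return base + merged / count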

Models Merged

The following models were included in the merge:

- Naphula/Jungle-Oasis-BRF-MPOA-9B
- ClaudioItaly/Exurbia-Delta9
- ClaudioItaly/Pullulation-2-9B
- sam-paech/Darkest-muse-v1
- sam-paech/Delirium-v1
- sam-paech/Quill-v1

Configuration

The following YAML configuration was used to produce this model:

merge_method: ties
base_model: ClaudioItaly/Vangelus-Secundus
dtype: bfloat16

# --- FINAL MODEL CONFIGURATION SECTION ---
config:
  max_position_embeddings: 32768
  rope_scaling:
    rope_type: yarn
    factor: 4.0
    original_max_position_embeddings: 8192
# ---------------------------------------------

parameters:
  density: 0.7
  mask_threshold: 0.02
  normalize: true
  int8_mask: true

models:
  - model: Naphula/Jungle-Oasis-BRF-MPOA-9B
    parameters:
      weight: 0.35

  - model: ClaudioItaly/Exurbia-Delta9
    parameters:
      weight: 0.25

  - model: ClaudioItaly/Pullulation-2-9B
    parameters:
      weight: 0.15

  - model: sam-paech/Darkest-muse-v1
    parameters:
      weight: 0.10

  - model: sam-paech/Delirium-v1
    parameters:
      weight: 0.10

  - model: sam-paech/Quill-v1
    parameters:
      weight: 0.05
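
To reproduce the merge, this configuration can be saved to a YAML file and passed to mergekit's mergekit-yaml command line tool; the file name and output directory below are placeholders, for example: mergekit-yaml merge-config.yaml ./output-model-directory --cuda. The --cuda flag runs the merge on GPU; omitting it keeps the merge on CPU.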