merged

This is a merge of pre-trained language models created using mergekit.

🎢 Nemo Nemo - Yena 💽

Hopefully an improved version of Wicked-Love-12B that focuses more on vocabulary diversity. I use a modified version of Mistral-V3-Tekken-E.

Configuration

The following models were included in the merge:

- neighbooo/Wicked-Love-12B (base)
- ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.2

The following YAML configuration was used to produce this model:

base_model: neighbooo/Wicked-Love-12B
dtype: bfloat16
merge_method: arcee_fusion
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 40]
        model: neighbooo/Wicked-Love-12B
      - layer_range: [0, 40]
        model: ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.2
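
The merge can be reproduced by saving the YAML above to a file and running mergekit's mergekit-yaml CLI on it. What follows is a minimal, untested sketch of loading the result for local inference with transformers; the repository id (neighbooo/NeMo-NeMo-12B), the prompt, and the generation settings are assumptions, not part of the original card.

# Sketch: local inference with the merged weights, assuming the repo id
# neighbooo/NeMo-NeMo-12B and that the tokenizer ships with the standard
# Mistral-Nemo chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "neighbooo/NeMo-NeMo-12B"  # assumed repository id; adjust if it differs

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short scene set on a night train."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))

At bfloat16, the roughly 12B parameters need on the order of 24 GB of memory to load, so quantized loading may be preferable on smaller GPUs.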