Psyfighter2-Orca2-Erebus3-13B / mergekit_config.yml
models:
  - model: KoboldAI/LLaMA2-13B-Psyfighter2
  - model: E:\Machine Learning\Models\Orca2-Flat-BF16
    parameters:
      weight: 1.0
      density: 0.4
  - model: KoboldAI/LLaMA2-13B-Erebus-v3
    parameters:
      weight: 1.0
      density: 0.2
merge_method: ties
base_model: KoboldAI/LLaMA2-13B-Psyfighter2
dtype: bfloat16
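
This is a standard mergekit TIES configuration: the Orca2 and Erebus-v3 checkpoints are folded into the Psyfighter2 base with the listed weights and densities, and the result is written in bfloat16. Below is a minimal sketch of running this config from Python rather than via the usual `mergekit-yaml mergekit_config.yml <output_dir>` CLI call; the output path and option values are illustrative assumptions, and the `MergeConfiguration`/`run_merge` entry points are assumed to match recent mergekit releases.

```python
# Sketch: apply mergekit_config.yml with mergekit's Python API.
# Paths and MergeOptions values are illustrative, not taken from the repo.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_PATH = "mergekit_config.yml"                 # this file
OUTPUT_PATH = "./Psyfighter2-Orca2-Erebus3-13B"     # hypothetical output directory

# Parse the YAML above into mergekit's config object.
with open(CONFIG_PATH, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the TIES merge and write the merged model to OUTPUT_PATH.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is present
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```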