yam-pastiche-7B-franken / mergekit_config.yml
slices:
  - sources:
      - model: yam-peleg/Experiment28-7B
        layer_range: [0, 10]
  - sources:
      - model: yam-peleg/Experiment26-7B
        layer_range: [10, 20]
  - sources:
      - model: yam-peleg/Experiment24-7B
        layer_range: [20, 30]
  - sources:
      - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
        layer_range: [30, 32]
merge_method: passthrough
dtype: bfloat16
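The `passthrough` method performs no weight averaging: it simply stacks the listed layer slices, in order, into one model. A minimal sketch of the resulting depth arithmetic (the ranges are half-open, as in the config above; the model list is copied from it):

```python
# Layer slices taken from the mergekit config above (half-open [start, end)).
slices = [
    ("yam-peleg/Experiment28-7B", (0, 10)),
    ("yam-peleg/Experiment26-7B", (10, 20)),
    ("yam-peleg/Experiment24-7B", (20, 30)),
    ("CorticalStack/pastiche-crown-clown-7b-dare-dpo", (30, 32)),
]

# Passthrough concatenates the slices, so the merged model's depth is
# the sum of the individual slice widths.
total_layers = sum(end - start for _, (start, end) in slices)
print(total_layers)  # 32
```

The slice boundaries tile `[0, 32)` without gaps or overlaps, so the merge preserves the 32-layer depth of a standard Mistral-7B-class model rather than producing a deeper "franken" stack.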