Llama-Mistral / mergekit_config.yml
```yaml
slices:
  - sources:
      - model: NousResearch/Meta-Llama-3-8B-Instruct
        layer_range: [0, 24]
  - sources:
      - model: David-Xu/Mistral-7B-Instruct-v0.2
        layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```
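With `merge_method: passthrough`, mergekit does no parameter averaging; it simply stacks the listed layer slices in order. Assuming mergekit's usual half-open `[start, end)` convention for `layer_range`, this config takes the first 24 layers of the Llama checkpoint followed by layers 8–31 of the Mistral checkpoint. A minimal sketch of the layer-count arithmetic (the ranges below are copied from the config; nothing else is run):

```python
# Layer ranges mirrored from the config above.
# Assumption: mergekit treats layer_range as half-open, i.e. [start, end).
slices = [
    ("NousResearch/Meta-Llama-3-8B-Instruct", (0, 24)),
    ("David-Xu/Mistral-7B-Instruct-v0.2", (8, 32)),
]

total = 0
for model, (start, end) in slices:
    n = end - start  # layers contributed by this slice
    total += n
    print(f"{model}: layers {start}..{end - 1} ({n} layers)")

# Passthrough concatenates the slices, so the merged model has the sum.
print(f"passthrough stack: {total} layers")
```

So the merged model ends up with 48 transformer layers, deeper than either 32-layer source. To actually execute the merge, mergekit's CLI takes the config and an output directory, e.g. `mergekit-yaml mergekit_config.yml ./output-model`.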