```yaml
slices:
  - sources:
      - model: NousResearch/Meta-Llama-3-8B-Instruct
        layer_range: [0, 24]
  - sources:
      - model: David-Xu/Mistral-7B-Instruct-v0.2
        layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```
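This configuration is a mergekit passthrough merge: it stacks a slice of layers from each source model into a single model, keeping weights in bfloat16. It can be applied with mergekit's CLI (e.g. `mergekit-yaml config.yaml ./merged-model`). Below is a minimal sketch of loading and sampling from the resulting checkpoint with 🤗 Transformers; the output directory name `merged-model` is an assumption, not part of the original config.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder path: wherever mergekit wrote the merged checkpoint
model_path = "merged-model"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype set in the merge config
    device_map="auto",
)

prompt = "Explain the passthrough merge method in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```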