Llama3-35B-lingyang-v1 / mergekit_config.yml
slices:
- sources:
- model: "wwe180/mergekit-passthrough-bklxjmn+wave-on-discord/llama-3-70b-no-robots-adapter"
layer_range: [0, 21]
- sources:
- model: "wwe180/mergekit-passthrough-bklxjmn+wave-on-discord/llama-3-70b-no-robots-adapter"
layer_range: [20, 38]
merge_method: passthrough
base_model: "wwe180/mergekit-passthrough-bklxjmn"
dtype: bfloat16
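A quick sanity check on what this passthrough config produces: assuming mergekit's usual end-exclusive `layer_range` convention, the two slices contribute 21 and 18 layers, and layer 20 is taken twice (once at the end of the first slice, once at the start of the second). The sketch below recomputes the stacked layer count by hand; it is an illustration of the arithmetic, not mergekit itself.

```python
# Layer slices copied from the config above.
# Ranges are assumed end-exclusive, as in typical mergekit configs.
slices = [(0, 21), (20, 38)]

# Size of each slice and the total depth of the merged model.
sizes = [end - start for start, end in slices]
print(sizes)       # per-slice layer counts
print(sum(sizes))  # total layers in the passthrough stack

# The overlap means layer 20 of the source model appears twice
# in the output, which is a common pattern in passthrough merges.
overlap = set(range(*slices[0])) & set(range(*slices[1]))
print(sorted(overlap))
```

The duplicated layer is deliberate in this kind of frankenmerge: passthrough does no weight averaging, so overlapping ranges simply repeat the source layers in sequence.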