Llama-3.1-8b-Pruned-6-Layers / mergekit_config.yml
dtype: bfloat16
merge_method: passthrough
slices:
  - sources:
      - layer_range: [0, 22]
        model: meta-llama/Meta-Llama-3.1-8B
  - sources:
      - layer_range: [29, 32]
        model: meta-llama/Meta-Llama-3.1-8B