Aplite-Instruct-4x8B-Llama-3
A GGUF quantization of Llama-3-Aplite-Instruct-4x8B. The <|eot_id|> stop-token bug has been fixed.
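For reference, a minimal sketch of running one of the GGUF files with llama-cpp-python. The filename, context size, and sampling settings below are assumptions, not part of this release; substitute the quant you actually downloaded.

```python
# Minimal usage sketch (llama-cpp-python). The GGUF filename is hypothetical --
# replace it with the quant level you downloaded from this repository.
from llama_cpp import Llama

llm = Llama(
    model_path="Llama-3-Aplite-Instruct-4x8B.Q4_K_M.gguf",  # hypothetical filename
    n_ctx=8192,       # Llama 3 context window
    n_gpu_layers=-1,  # offload all layers to GPU if available
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain what a mixture-of-experts model is."},
]

out = llm.create_chat_completion(messages=messages, max_tokens=256)
print(out["choices"][0]["message"]["content"])
```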
Disclaimer
This model is a research experiment and may generate incorrect or harmful content. The model's outputs should not be taken as factual or representative of the views of the model's creator or any other individual.
The model's creator is not responsible for any harm or damage caused by the model's outputs.
Join our Discord
If you'd like to discuss potential collaborations or applications, feel free to reach out on Discord: https://discord.gg/KugcbJX5
Meta Llama 3 is licensed under the Meta Llama 3 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved.
Quantizations
This repository provides 3-bit, 4-bit, 5-bit, and 6-bit GGUF quantizations.
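A minimal sketch of fetching one of the quantized files with huggingface_hub. The exact filename is an assumption; replace it with the file matching the quant level you want from this repository.

```python
# Download a single GGUF file from the repository (filename is hypothetical).
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="raincandy-u/Llama-3-Aplite-Instruct-4x8B-GGUF-MoE",
    filename="Llama-3-Aplite-Instruct-4x8B.Q4_K_M.gguf",  # pick your quant level
)
print(local_path)  # pass this path to llama.cpp / llama-cpp-python
```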
Model details
Repository: raincandy-u/Llama-3-Aplite-Instruct-4x8B-GGUF-MoE
Base model: meta-llama/Meta-Llama-3-8B-Instruct