---
license: apache-2.0
tags:
- mergekit
- lazymergekit
- jiayihao03/mistral-7b-instruct-Javascript-4bit
- akameswa/mistral-7b-instruct-java-4bit
- akameswa/mistral-7b-instruct-go-4bit
- jiayihao03/mistral-7b-instruct-python-4bit
---
# mixtral-4x7b-instruct-code
mixtral-4x7b-instruct-code is a Mixture of Experts (MoE) model built from the following expert models using [mergekit](https://github.com/cg123/mergekit):
* [jiayihao03/mistral-7b-instruct-Javascript-4bit](https://huggingface.co/jiayihao03/mistral-7b-instruct-Javascript-4bit)
* [akameswa/mistral-7b-instruct-java-4bit](https://huggingface.co/akameswa/mistral-7b-instruct-java-4bit)
* [akameswa/mistral-7b-instruct-go-4bit](https://huggingface.co/akameswa/mistral-7b-instruct-go-4bit)
* [jiayihao03/mistral-7b-instruct-python-4bit](https://huggingface.co/jiayihao03/mistral-7b-instruct-python-4bit)
## 🧩 Configuration
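The exact mergekit configuration was not preserved in this card, so the block below is a sketch of a typical `mergekit-moe` configuration for combining these four experts. The `base_model`, `gate_mode`, `dtype`, and `positive_prompts` values are illustrative assumptions, not the settings actually used.

```yaml
# Sketch of a mergekit MoE config for this merge.
# base_model, gate_mode, dtype, and positive_prompts are assumed values.
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: cheap_embed
dtype: bfloat16
experts:
  - source_model: jiayihao03/mistral-7b-instruct-Javascript-4bit
    positive_prompts: ["javascript", "write a function in JavaScript"]
  - source_model: akameswa/mistral-7b-instruct-java-4bit
    positive_prompts: ["java", "write a method in Java"]
  - source_model: akameswa/mistral-7b-instruct-go-4bit
    positive_prompts: ["go", "write a function in Go"]
  - source_model: jiayihao03/mistral-7b-instruct-python-4bit
    positive_prompts: ["python", "write a function in Python"]
```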
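## 💻 Usage

A minimal sketch of loading the merged model in 4-bit precision with Transformers and bitsandbytes is shown below; the repository id is an assumption and should be replaced with the actual Hub path of this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Assumed repository id; replace with the actual Hub path of this merge.
model_id = "mixtral-4x7b-instruct-code"

# Load the MoE in 4-bit precision via bitsandbytes.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Chat-style prompt using the tokenizer's chat template.
messages = [
    {"role": "user", "content": "Write a Python function that checks if a string is a palindrome."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```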