---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- jiayihao03/mistral-7b-instruct-Javascript-4bit
- jiayihao03/mistral-7b-instruct-python-4bit
- akameswa/mistral-7b-instruct-java-4bit
- akameswa/mistral-7b-instruct-go-4bit
---
# mistral-7b-instruct-code-ties

mistral-7b-instruct-code-ties is a TIES merge of the following models using [mergekit](https://github.com/cg123/mergekit):

* [jiayihao03/mistral-7b-instruct-Javascript-4bit](https://huggingface.co/jiayihao03/mistral-7b-instruct-Javascript-4bit)
* [jiayihao03/mistral-7b-instruct-python-4bit](https://huggingface.co/jiayihao03/mistral-7b-instruct-python-4bit)
* [akameswa/mistral-7b-instruct-java-4bit](https://huggingface.co/akameswa/mistral-7b-instruct-java-4bit)
* [akameswa/mistral-7b-instruct-go-4bit](https://huggingface.co/akameswa/mistral-7b-instruct-go-4bit)
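TIES merging works in three steps: trim each fine-tuned model's delta from the base down to its largest-magnitude entries (controlled by `density`), elect a sign per parameter by total magnitude, and combine only the deltas that agree with the elected sign. This is a toy sketch of that idea on plain Python lists, not the mergekit implementation (which operates on full weight tensors):

```python
def trim(delta, density):
    """Zero out all but the largest-magnitude fraction `density` of entries."""
    k = max(1, int(round(density * len(delta))))
    threshold = sorted((abs(d) for d in delta), reverse=True)[k - 1]
    return [d if abs(d) >= threshold else 0.0 for d in delta]

def ties_merge(base, tasks, density=0.85, weight=0.25, normalize=True):
    """Toy TIES merge: trim each task delta, elect a per-parameter sign,
    then combine only the deltas agreeing with that sign."""
    deltas = [trim([t - b for t, b in zip(task, base)], density) for task in tasks]
    merged = []
    for i, b in enumerate(base):
        column = [d[i] for d in deltas]
        pos = sum(c for c in column if c > 0.0)
        neg = -sum(c for c in column if c < 0.0)
        sign = 1.0 if pos >= neg else -1.0          # sign with larger total magnitude wins
        agree = [c for c in column if c * sign > 0.0]
        num = weight * sum(agree)
        den = weight * len(agree) if normalize else 1.0
        merged.append(b + (num / den if den else 0.0))
    return merged
```

With `normalize: true` (as in the configuration below), the agreeing deltas are averaged by their weights rather than summed, which keeps the merged parameters on the same scale as the base model.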
## 🧩 Configuration

```yaml
models:
  - model: unsloth/mistral-7b-instruct-v0.2-bnb-4bit
  - model: jiayihao03/mistral-7b-instruct-Javascript-4bit
    parameters:
      density: 0.85
      weight: 0.25
  - model: jiayihao03/mistral-7b-instruct-python-4bit
    parameters:
      density: 0.85
      weight: 0.25
  - model: akameswa/mistral-7b-instruct-java-4bit
    parameters:
      density: 0.85
      weight: 0.25
  - model: akameswa/mistral-7b-instruct-go-4bit
    parameters:
      density: 0.85
      weight: 0.25
merge_method: ties
base_model: unsloth/mistral-7b-instruct-v0.2-bnb-4bit
parameters:
  normalize: true
dtype: float16
```
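A minimal usage sketch with 🤗 Transformers. The repo id below is an assumption about where the merged weights would be published (it is not stated in this card), and the heavy imports are deferred so the prompt helper stays importable on its own:

```python
def build_prompt(instruction: str) -> str:
    """Wrap a single-turn instruction in the Mistral-Instruct template.
    The tokenizer adds the leading <s> (BOS) token itself."""
    return f"[INST] {instruction} [/INST]"

def generate_code(instruction: str,
                  repo_id: str = "akameswa/mistral-7b-instruct-code-ties") -> str:
    """Download the merged model and generate a completion.
    `repo_id` is hypothetical -- replace it with the actual published repo."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

For example, `generate_code("Write a Go function that reverses a string.")` would exercise the Go-specialized component of the merge.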