---
license: apache-2.0
language:
  - zh
  - en
---

# Chinese-Mixtral-LoRA

This repository contains Chinese-Mixtral-LoRA, a set of LoRA weights produced by further pre-training Mixtral-8x7B-v0.1 on Chinese data.

Note: You must merge the LoRA weights with the original Mixtral-8x7B-v0.1 to obtain the full model weights.
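The merge step above can be sketched with `transformers` + `peft`; this is a minimal illustration, not an official script, and the adapter repo id and output path are assumptions:

```python
# Hypothetical sketch: fold the LoRA adapter into the base Mixtral weights.
# Requires: pip install transformers peft torch (and enough RAM/disk for 8x7B).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the original base model (large download).
base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-v0.1", torch_dtype="auto"
)

# Attach the LoRA adapter from this repo (repo id assumed for illustration).
model = PeftModel.from_pretrained(base, "hfl/chinese-mixtral-lora")

# Merge the LoRA deltas into the base weights and save a standalone model.
merged = model.merge_and_unload()
merged.save_pretrained("chinese-mixtral-merged")

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-mixtral-lora")
tokenizer.save_pretrained("chinese-mixtral-merged")
```

If you only want a ready-to-use model, the pre-merged repositories linked below avoid this step entirely.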

For the full (merged) model, please see: https://huggingface.co/hfl/chinese-mixtral

For the GGUF model (llama.cpp compatible), please see: https://huggingface.co/hfl/chinese-mixtral-gguf

Please refer to https://github.com/ymcui/Chinese-Mixtral/ for more details.