wangrongsheng 's Collections

Aurora-Mixtral-8x7B

Updated Jan 16, 2024

  • wangrongsheng/Aurora

    Text Generation • Updated Jan 4, 2024 • 20

    Note: Aurora is a Chinese-language MoE model. It is a follow-up work built on Mixtral-8x7B that, through instruction tuning, activates the model's Chinese open-domain chat capability.


  • Aurora: Activating Chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning

    Paper • 2312.14557 • Published Dec 22, 2023

  • wangrongsheng/Aurora-Plus

    Text Generation • Updated Jan 8, 2024 • 3

  • wangrongsheng/Aurora-dpo

    Text Generation • Updated Jan 16, 2024 • 1