argilla/notux-8x7b-v1
Pipeline: Text Generation
Dataset: argilla/ultrafeedback-binarized-preferences-cleaned
Languages: 5 languages
License: apache-2.0
Tags: Transformers, TensorBoard, Safetensors, mixtral, dpo, rlaif, preference, ultrafeedback, Mixture of Experts, conversational, text-generation-inference
Add MoE tag (mixture of experts) #5
by davanstrien (HF Staff) · opened Jan 9, 2024
base: refs/heads/main ← from: refs/pr/5
Files changed: +1 -0

davanstrien · Jan 9, 2024:
No description provided.
Commit: Add MoE tag (mixture of experts) · a29f3670
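Since the diff adds exactly one line (+1 -0), the change is presumably a single new entry in the model card's YAML front matter. A minimal sketch of what that could look like, assuming the Hub's `moe` tag (which renders as "Mixture of Experts"); the surrounding tags are shown only for context and their order is illustrative:

```yaml
# README.md front matter (illustrative; other metadata fields omitted)
tags:
  - dpo
  - rlaif
  - preference
  - ultrafeedback
  - moe  # the single added line proposed by this PR (assumed tag name)
```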
alvarobartt · Jan 9, 2024:
Thanks for the suggestion @davanstrien! 🙌🏻
alvarobartt changed pull request status to merged · Jan 9, 2024
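For maintainers who prefer to script this kind of metadata change, a tag-addition PR like this one can also be opened with the `huggingface_hub` ModelCard API. A minimal sketch, assuming a write token is configured and using `moe` as the tag name (the exact tag string in the merged PR is not shown on this page):

```python
# Sketch: open a pull request that adds a tag to a model card,
# using the huggingface_hub ModelCard API (requires a write token).
from huggingface_hub import ModelCard

repo_id = "argilla/notux-8x7b-v1"

# Load the existing model card (README.md with YAML front matter).
card = ModelCard.load(repo_id)

# Append the tag only if it is not already present.
tags = card.data.tags or []
if "moe" not in tags:
    tags.append("moe")
    card.data.tags = tags
    # Push the edited card back as a pull request rather than a direct commit.
    card.push_to_hub(
        repo_id,
        create_pr=True,
        commit_message="Add MoE tag (mixture of experts)",
    )
```

Pushing with `create_pr=True` opens a reviewable pull request like this one instead of committing straight to `refs/heads/main`, which is why the discussion shows a `refs/pr/5` source branch.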