Uploaded model
- Developed by: sommerzen
- License: apache-2.0
- Finetuned from model: utter-project/EuroMoE-2.6B-A0.6B-Instruct-Preview
This Mixtral-style MoE model was trained 2x faster with Unsloth and Hugging Face's TRL library.
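The card ships no usage code, so here is a minimal inference sketch under the assumption that this checkpoint loads through the standard `transformers` causal-LM path and that its tokenizer provides a chat template (both are assumptions, not confirmed by the card):

```python
# Minimal inference sketch. Assumptions: the checkpoint loads via
# AutoModelForCausalLM and the tokenizer ships a chat template.
MODEL_ID = "sommerzen/EuroMoE-German-experiment2"

if __name__ == "__main__":
    # Heavy imports and the model download only happen when run directly.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )

    # Instruct-style prompt via the tokenizer's chat template.
    messages = [{"role": "user", "content": "Fasse diesen Satz kurz zusammen."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens.
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Since the base model is a preview release, the chat-template and dtype choices above may need adjusting to match the upstream EuroMoE configuration.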
Model tree for sommerzen/EuroMoE-German-experiment2
- Base model: utter-project/EuroMoE-2.6B-A0.6B-Preview