---
license: apache-2.0
pipeline_tag: text-generation
language:
- da
tags:
- pretrained
inference:
  parameters:
    temperature: 0.7
datasets:
- DDSC/partial-danish-gigaword-no-twitter
base_model: mistralai/Mistral-7B-v0.1
---
# Model Card for Munin 7B Alpha
The Munin 7B Alpha Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters, based on Mistral-7B-v0.1.
It has been trained on Danish Gigaword using continual pretraining.
For full details of this model, please read our release blog post.
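Since Munin 7B Alpha is a base model for text generation, it is typically used for plain text completion via the Hugging Face `transformers` library. Below is a minimal, hedged sketch: the repository id `danish-foundation-models/munin-7b-alpha` is an assumption inferred from the model name, and the sampling temperature mirrors the inference metadata above.

```python
# Assumed Hugging Face repository id for this model (not confirmed by this card).
MODEL_ID = "danish-foundation-models/munin-7b-alpha"

# Sampling settings; temperature 0.7 mirrors the inference metadata above.
GENERATION_KWARGS = {"max_new_tokens": 128, "do_sample": True, "temperature": 0.7}


def generate(prompt: str) -> str:
    """Return a completion of `prompt` from the base model.

    The first call downloads several GB of weights, so the heavy
    imports are done lazily inside the function.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, **GENERATION_KWARGS)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example (requires a GPU or ample RAM):
# print(generate("Danmark er et land i"))
```

Because this is a pretrained base model rather than an instruction-tuned one, prompts should be phrased as text to be continued, not as questions or instructions.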
## Notice
Munin 7B Alpha is, like Mistral 7B, a pretrained base model and therefore does not have any moderation mechanisms.
## The Danish Foundation Models Team

- From the Center for Humanities Computing at Aarhus University:
  - Kenneth Enevoldsen ([email protected])
  - Lasse Hansen ([email protected])
  - Kristoffer Laigaard Nielbo ([email protected])
- From the Alexandra Institute:
  - Peter Bjørn Jørgensen ([email protected])
  - Rasmus Larsen ([email protected])
  - Dan Saattrup Nielsen ([email protected])