# Danish-Swedish Merged Model

This is a merge of the following models, both based on mistralai/Mistral-7B-v0.1:

- danish-foundation-models/munin-7b-alpha, continued pretraining on Danish data
- timpal0l/Mistral-7B-v0.1-flashback-v2, continued pretraining on Swedish data
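The merge uses task arithmetic: each fine-tuned model contributes a "task vector" (its parameter delta from the shared base), and the merged model is the base plus a weighted sum of those deltas. A minimal NumPy sketch of the idea, with toy 1-D vectors standing in for model parameters (the function name and example values are illustrative, not mergekit's API):

```python
import numpy as np

def task_arithmetic_merge(base, finetuned, weights, normalize=True):
    """Merge fine-tuned checkpoints into a base model via task vectors:
    merged = base + sum_i w_i * (finetuned_i - base)."""
    weights = np.asarray(weights, dtype=np.float64)
    if normalize:
        # With normalization the weights are rescaled to sum to 1, so two
        # weight-1.0 models each contribute half of their task vector.
        weights = weights / weights.sum()
    merged = base.astype(np.float64).copy()
    for w, model in zip(weights, finetuned):
        merged += w * (model - base)
    return merged

# Toy example: two "models" whose task vectors point in different directions.
base = np.zeros(2)
danish = np.array([2.0, 0.0])   # stand-in for the Danish model's delta
swedish = np.array([0.0, 2.0])  # stand-in for the Swedish model's delta
merged = task_arithmetic_merge(base, [danish, swedish], weights=[1.0, 1.0])
print(merged)  # [1. 1.]
```

With `normalize=True` and equal weights, the merged parameters land halfway between the two fine-tuned models' contributions, which is the behavior the configuration below requests.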
## Model Details
- Merged by: Dan Saattrup Nielsen
- Model type: Decoder model, based on mistralai/Mistral-7B-v0.1
- Language(s): Danish and Swedish
- License: CC-BY-4.0
- Merge configuration:

```python
dict(
    models=[
        dict(
            model="danish-foundation-models/munin-7b-alpha",
            parameters=dict(weight=1.0),
        ),
        dict(
            model="timpal0l/Mistral-7B-v0.1-flashback-v2",
            parameters=dict(weight=1.0),
        ),
    ],
    merge_method="task_arithmetic",
    base_model="mistralai/Mistral-7B-v0.1",
    parameters=dict(
        int8_mask=True,
        normalize=True,
    ),
    dtype="bfloat16",
)
```
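The configuration above is a Python-dict view of a mergekit config. A sketch of the equivalent YAML (field names follow mergekit's documented schema; treat this as an assumed translation rather than the exact file used), which could be run with `mergekit-yaml config.yaml ./merged-model`:

```yaml
models:
  - model: danish-foundation-models/munin-7b-alpha
    parameters:
      weight: 1.0
  - model: timpal0l/Mistral-7B-v0.1-flashback-v2
    parameters:
      weight: 1.0
merge_method: task_arithmetic
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
  normalize: true
dtype: bfloat16
```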
Model repository: merge-crew/da-sv-task-arithmetic