---
tags:
  - moe
  - merge
license: apache-2.0
---


# mhm-7-3

This is a merge of pre-trained language models created using mergekit.

Merged model based on Mistral, created with the `dare_ties` merge method using models from the top of the Open LLM Leaderboard.

Seven models were mixed into one, across three rounds of merging.
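
The merge configuration isn't included in this card, but a `dare_ties` merge in mergekit is driven by a YAML config along these lines. The model names, densities, and weights below are placeholders for illustration, not the values actually used for mhm-7-3:

```yaml
# Hypothetical sketch of a dare_ties mergekit config.
# The actual source models and parameters for mhm-7-3 are not documented here.
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters needed for the base model
  - model: example-org/leaderboard-model-a   # placeholder name
    parameters:
      density: 0.5   # fraction of delta weights kept by DARE
      weight: 0.5    # contribution to the merged model
  - model: example-org/leaderboard-model-b   # placeholder name
    parameters:
      density: 0.5
      weight: 0.3
dtype: bfloat16
```

A config like this is run with `mergekit-yaml config.yml ./output-model-directory`; repeating the process on intermediate merges gives the multi-round merging described above.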

Just an experiment.