---
base_model:
  - flemmingmiguel/MBX-7B
  - flemmingmiguel/MBX-7B-v3
tags:
  - mergekit
  - merge
license: apache-2.0
---

# merge

This is a merge of pre-trained language models created using mergekit. Note: this model is broken.

## Merge Method

This model was merged using the passthrough merge method, which stacks the selected layer ranges from the source models into a single deeper model. The result only speaks German and is somewhat obsessed with football.

## Models Merged

The following models were included in the merge:

- flemmingmiguel/MBX-7B-v3
- flemmingmiguel/MBX-7B

## Configuration

The following YAML configuration was used to produce this model:


```yaml
slices:
  - sources:
    - model: flemmingmiguel/MBX-7B-v3
      layer_range: [0, 32]
  - sources:
    - model: flemmingmiguel/MBX-7B
      layer_range: [20, 32]
merge_method: passthrough
dtype: float16
```
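
Below is a minimal inference sketch using the standard `transformers` API. It assumes the merged weights are published under the repository id `NeuralNovel/Pelican-9b-v0.1` (adjust to the actual repo id) and load as a regular causal LM; given the note above that this model is broken, outputs may be unreliable.

```python
# Minimal loading/inference sketch. The repo id below is an assumption and
# may need to be changed to wherever the merged weights are actually hosted.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NeuralNovel/Pelican-9b-v0.1"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge dtype above
    device_map="auto",          # requires accelerate; or move to a device manually
)

# German prompt, since the card notes the model only speaks German
prompt = "Wer gewinnt die Bundesliga dieses Jahr?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```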