# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method

This model was merged using the passthrough merge method, which concatenates layer slices from the source models end to end without interpolating their weights; here it stacks five 8-layer slices into a single 40-layer model.
### Models Merged

The following models were included in the merge:

- [taozi555/deepRP-nemo-12b](https://huggingface.co/taozi555/deepRP-nemo-12b)
- [Delta-Vector/Rei-V2-12B](https://huggingface.co/Delta-Vector/Rei-V2-12B)
- [PygmalionAI/Pygmalion-3-12B](https://huggingface.co/PygmalionAI/Pygmalion-3-12B)
- [ReadyArt/The-Omega-Directive-M-12B-v1.0](https://huggingface.co/ReadyArt/The-Omega-Directive-M-12B-v1.0)
- [hardlyworking/Sapphire-12B](https://huggingface.co/hardlyworking/Sapphire-12B)
### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: passthrough
dtype: bfloat16
slices:
  - sources:
      - model: Delta-Vector/Rei-V2-12B
        layer_range: [0, 8]
  - sources:
      - model: PygmalionAI/Pygmalion-3-12B
        layer_range: [8, 16]
  - sources:
      - model: taozi555/deepRP-nemo-12b
        layer_range: [16, 24]
  - sources:
      - model: hardlyworking/Sapphire-12B
        layer_range: [24, 32]
  - sources:
      - model: ReadyArt/The-Omega-Directive-M-12B-v1.0
        layer_range: [32, 40]
```
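To reproduce the merge, save the configuration above to a file (e.g. `config.yml`) and run it through mergekit, most simply via the `mergekit-yaml` CLI. Below is a minimal Python sketch; it assumes mergekit's documented Python entry points (`MergeConfiguration`, `run_merge`, `MergeOptions`), so verify them against your installed version.

```python
# Reproduction sketch, assuming mergekit's Python API as shown in its README;
# the equivalent CLI call is: mergekit-yaml config.yml ./merged
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./merged",               # output directory for the merged checkpoint
    options=MergeOptions(
        copy_tokenizer=True,  # copy a source model's tokenizer to the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```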
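The merged checkpoint (published as `mergekit-community/mergekit-passthrough-hdctkvu`) should load like any other causal LM. A minimal sketch with Hugging Face `transformers`, assuming the repo hosts transformers-compatible weights:

```python
# Loading sketch; repo id taken from this model card's page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mergekit-community/mergekit-passthrough-hdctkvu"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",           # requires the accelerate package
)

inputs = tokenizer("Hello,", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```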