# Marcoro14-7B-slerp
Marcoro14-7B-slerp is a merge of the following models, produced with mergekit:
## 🧩 Configuration
```yaml
models:
  - model: pankajmathur/orca_mini_v2_7b
    parameters:
      weight: 1.0
  - model: WizardLM/WizardLM-7B-V1.0
    parameters:
      weight: 0.3
merge_method: linear
dtype: float16
```
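The `linear` merge method computes a weighted average of corresponding parameter tensors from each model (mergekit normalizes the weights by default, so `1.0` and `0.3` become roughly `0.77` and `0.23`). The sketch below illustrates that arithmetic on toy NumPy arrays; `linear_merge` is a hypothetical helper for illustration, not part of mergekit's API.

```python
import numpy as np

def linear_merge(tensors, weights, normalize=True):
    """Weighted average of parameter tensors, mirroring a linear merge.

    With normalize=True the weights are rescaled to sum to 1, which is
    mergekit's default behavior for the linear method.
    """
    w = np.asarray(weights, dtype=np.float64)
    if normalize:
        w = w / w.sum()
    return sum(wi * t for wi, t in zip(w, tensors))

# Toy stand-ins for one layer's weights from each source model.
a = np.array([1.0, 2.0, 3.0])  # e.g. orca_mini_v2_7b, weight 1.0
b = np.array([3.0, 2.0, 1.0])  # e.g. WizardLM-7B-V1.0, weight 0.3
merged = linear_merge([a, b], [1.0, 0.3])
```

In practice the merge is run from the config file itself, e.g. `mergekit-yaml config.yaml ./merged-model`, which applies this averaging to every tensor in the checkpoints.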