# llama3s-merged
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with starmpcc/Asclepius-Llama2-13B as the base model.
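As rough intuition (a simplified sketch, not mergekit's actual implementation): DARE sparsifies each model's task vector (its parameter delta from the base) by randomly dropping entries at a rate controlled by `density` and rescaling the survivors, and TIES then resolves sign conflicts among the sparsified deltas before summing them back onto the base weights. Per tensor, the idea looks roughly like this:

```python
import torch

def dare_ties_merge(base, deltas, densities, weights):
    """Toy per-tensor DARE-TIES merge, for illustration only.

    base:      base model weight tensor
    deltas:    list of task vectors (finetuned weights minus base)
    densities: per-model keep probability for DARE sparsification
    weights:   per-model mixing weights
    """
    sparse = []
    for delta, density in zip(deltas, densities):
        # DARE: randomly keep each delta entry with probability `density`,
        # rescaling survivors by 1/density to preserve the expected value.
        mask = torch.bernoulli(torch.full_like(delta, density))
        sparse.append(delta * mask / density)

    # TIES sign election: the dominant sign of the weighted deltas wins.
    weighted = [w * d for w, d in zip(weights, sparse)]
    elected_sign = torch.sign(sum(weighted))

    # Keep only delta entries whose sign agrees with the elected sign,
    # then sum and add back onto the base weights. (The real TIES also
    # normalizes by the total weight of the agreeing models.)
    merged_delta = sum(
        torch.where(torch.sign(d) == elected_sign, d, torch.zeros_like(d))
        for d in weighted
    )
    return base + merged_delta
```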
### Models Merged
The following models were included in the merge:

* [AdaptLLM/finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: starmpcc/Asclepius-Llama2-13B
dtype: bfloat16
merge_method: dare_ties
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 40]
            model: AdaptLLM/finance-LLM-13B
            parameters:
              density: 0.53
              weight: 0.6
          - layer_range: [0, 40]
            model: starmpcc/Asclepius-Llama2-13B
            parameters:
              density: 0.5
              weight: 0.4
parameters:
  int8_mask: 1.0
```
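To reproduce the merge, this configuration can be saved as `config.yaml` and passed to mergekit's `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yaml ./merged-model`). The result is a standard Llama-2-architecture checkpoint, so it loads with `transformers` like any other causal LM. A minimal sketch, assuming the merged weights live at the hypothetical repo id `your-username/llama3s-merged`:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id; replace with the actual location of the merged weights.
repo_id = "your-username/llama3s-merged"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the dtype the merge was run in
    device_map="auto",           # requires the `accelerate` package
)

prompt = "Summarize the key risks in this clinical note:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```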