silverline.conc
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the NuSLERP merge method.
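NuSLERP generalizes spherical linear interpolation (SLERP) to weighted, per-tensor combinations of model parameters. As a rough illustration only (not mergekit's actual implementation, which uses relative weights rather than a single interpolation factor), the sketch below shows plain SLERP between two weight tensors; the function name and the epsilon guard are illustrative assumptions.

```python
import numpy as np

def slerp(a: np.ndarray, b: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors (illustrative sketch)."""
    a_flat, b_flat = a.ravel(), b.ravel()
    a_dir = a_flat / (np.linalg.norm(a_flat) + eps)
    b_dir = b_flat / (np.linalg.norm(b_flat) + eps)
    # Angle between the two tensors treated as directions on the unit sphere.
    omega = np.arccos(np.clip(np.dot(a_dir, b_dir), -1.0, 1.0))
    if omega < eps:
        # Nearly parallel tensors: fall back to ordinary linear interpolation.
        return (1.0 - t) * a + t * b
    so = np.sin(omega)
    mixed = (np.sin((1.0 - t) * omega) / so) * a_flat + (np.sin(t * omega) / so) * b_flat
    return mixed.reshape(a.shape)
```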
Models Merged
The following models were included in the merge:
- kromcomp/L3.1-Silverlinev4-12B
- merge/silverline.sub
Configuration
The following YAML configuration was used to produce this model:
chat_template: llama3
dtype: float32
merge_method: nuslerp
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 50]
        model: merge/silverline.sub
        parameters:
          weight:
          - filter: self_attn
            value: 0.0002
          - filter: mlp
            value: 0.0004
          - value: 0.0003
      - layer_range: [0, 50]
        model: kromcomp/L3.1-Silverlinev4-12B
        parameters:
          weight: 1.0
parameters:
  normalize: 0.0
  nuslerp_flatten: 0.0
tokenizer:
  source: base
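In this configuration the weights are relative: merge/silverline.sub carries near-zero weight (0.0002 for self_attn, 0.0004 for mlp, 0.0003 elsewhere) against 1.0 for kromcomp/L3.1-Silverlinev4-12B, so the result stays very close to Silverlinev4 with only a slight pull toward the sub-merge. To reproduce the merge, save the configuration above to a YAML file and run it through mergekit; below is a minimal sketch using mergekit's documented Python entry points, with placeholder paths (check the mergekit README for the full set of MergeOptions).

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "./silverline-conc.yml"            # the YAML configuration above (placeholder path)
OUTPUT_PATH = "./L3.1-Silverline.Concv1-12B"    # directory to write the merged model to

# Parse and validate the merge configuration.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and copy the tokenizer from the source specified in the config.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(copy_tokenizer=True),
)
```

The same configuration can also be run from the command line with mergekit-yaml, e.g. `mergekit-yaml silverline-conc.yml ./L3.1-Silverline.Concv1-12B`.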