---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# flam-kit

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the SLERP merge method.

### Models Merged

The following models were included in the merge:
* schonsense/Llama-3.3-70B-Inst-Ablit-Flammades-SLERP
* schonsense/Flamlama_70B_della

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: schonsense/Llama-3.3-70B-Inst-Ablit-Flammades-SLERP
  - model: schonsense/Flamlama_70B_della
merge_method: slerp
base_model: schonsense/Llama-3.3-70B-Inst-Ablit-Flammades-SLERP
dtype: bfloat16
parameters:
  t: [0, 0.0, 0.0, 0.055, 0.109, 0.127, 0.145, 0.164, 0.182, 0.2,
      0.218, 0.236, 0.255, 0.273, 0.291, 0.309, 0.327, 0.345, 0.364, 0.382,
      0.4, 0.418, 0.436, 0.455, 0.473, 0.491, 0.509, 0.527, 0.545, 0.564,
      0.582, 0.6, 0.588, 0.576, 0.564, 0.552, 0.54, 0.527, 0.515, 0.503,
      0.491, 0.479, 0.467, 0.455, 0.442, 0.43, 0.418, 0.406, 0.394, 0.382,
      0.369, 0.357, 0.345, 0.333, 0.321, 0.309, 0.297, 0.285, 0.273, 0.26,
      0.248, 0.236, 0.224, 0.212, 0.2, 0.188, 0.176, 0.164, 0.151, 0.139,
      0.127, 0.115, 0.103, 0.091, 0.079, 0.067, 0.055, 0, 0, 0]
```
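For readers unfamiliar with the method: the `t` list above gives one interpolation weight per layer, ramping from 0 at the earliest layers up to a peak of 0.6 around the middle of the stack and back down to 0, so the first and last layers stay identical to the base model while the middle layers blend in the second model. Below is a minimal conceptual sketch of spherical linear interpolation between two weight tensors; this is not mergekit's actual implementation, and the assumption that `t = 0` keeps the base model's weights while `t = 1` would fully adopt the second model's weights is an illustration, not a statement about mergekit internals.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors of the same shape.

    t = 0 returns v0 (here, the base model's layer weights), t = 1 returns v1,
    and intermediate values follow the great-circle arc between the two vectors.
    """
    v0_f = v0.ravel().astype(np.float64)
    v1_f = v1.ravel().astype(np.float64)

    # Cosine of the angle between the two flattened weight vectors.
    dot = np.dot(v0_f, v1_f) / (np.linalg.norm(v0_f) * np.linalg.norm(v1_f) + eps)
    dot = np.clip(dot, -1.0, 1.0)
    theta = np.arccos(dot)

    # Nearly parallel vectors: fall back to ordinary linear interpolation.
    if np.sin(theta) < eps:
        blended = (1.0 - t) * v0_f + t * v1_f
    else:
        # SLERP weights: constant-speed interpolation along the arc.
        w0 = np.sin((1.0 - t) * theta) / np.sin(theta)
        w1 = np.sin(t * theta) / np.sin(theta)
        blended = w0 * v0_f + w1 * v1_f

    return blended.reshape(v0.shape).astype(v0.dtype)
```

In this reading, a layer with `t = 0.6` (the peak of the schedule) would sit a little past the midpoint of the arc toward `Flamlama_70B_della`, while the layers with `t = 0` are taken unchanged from `Llama-3.3-70B-Inst-Ablit-Flammades-SLERP`.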