---
base_model:
- sometimesanotion/Slerp-Lamarckvevergence
- sometimesanotion/Chocolatine-Fusion-Qwenvergence
library_name: transformers
tags:
- mergekit
- merge
---
# Lamarck-14B-v0.7-Fusion

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Arcee Fusion](https://arcee.ai) merge method, with sometimesanotion/Slerp-Lamarckvevergence as the base.

### Models Merged

The following models were included in the merge:
* sometimesanotion/Chocolatine-Fusion-Qwenvergence

### Configuration

The following YAML configuration was used to produce this model:

```yaml
name: Lamarck-14B-v0.7-Fusion
merge_method: arcee_fusion
base_model: sometimesanotion/Slerp-Lamarckvevergence
tokenizer_source: base
parameters:
  int8_mask: true
  normalize: true
  rescale: false
dtype: bfloat16
out_dtype: bfloat16
models:
  - model: sometimesanotion/Chocolatine-Fusion-Qwenvergence
```
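For reference, here is a minimal sketch of reproducing this merge with mergekit's Python API, assuming the YAML above is saved as `lamarck-fusion.yaml`. The output path is illustrative, and option names follow mergekit's README; they may differ across versions.

```python
# Sketch: run the merge defined by the YAML config above via mergekit's
# Python API. Paths and options here are assumptions, not part of the card.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the merge configuration from the YAML file.
with open("lamarck-fusion.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Lamarck-14B-v0.7-Fusion",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # consistent with tokenizer_source: base
    ),
)
```

The same merge can also be run from the command line with `mergekit-yaml lamarck-fusion.yaml ./Lamarck-14B-v0.7-Fusion`.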
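Since the card declares `library_name: transformers`, the merged model can be loaded with the standard transformers API. The sketch below assumes a hypothetical repository id derived from this card's `name:` field; substitute the actual repo id or a local path.

```python
# Sketch: load and sample from the merged model with transformers.
# The repo id is an assumption based on this card's config name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sometimesanotion/Lamarck-14B-v0.7-Fusion"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's out_dtype
    device_map="auto",
)

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```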