---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
![images.jpeg](https://cdn-uploads.huggingface.co/production/uploads/65f86e388fafa15fc080db6c/FuPnXXaanpToPvxnxBA6y.jpeg)

# output

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Linear DELLA](https://arxiv.org/abs/2406.11617) merge method, with /gghfez_SmartMaid-123b as the base model.

### Models Merged

The following models were included in the merge:

* /migtissera_Tess-3-Mistral-Large-2-123B
* /TheDrummer_Behemoth-123B-v1
* /gghfez_Writer-Large-2411-v2.1

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: gghfez_Writer-Large-2411-v2.1
    parameters:
      weight: 0.25
      density: 0.4
  - model: TheDrummer_Behemoth-123B-v1
    parameters:
      weight: 0.40
      density: 0.6
  - model: migtissera_Tess-3-Mistral-Large-2-123B
    parameters:
      weight: 0.35
      density: 0.5
merge_method: della_linear
base_model: gghfez_SmartMaid-123b
parameters:
  epsilon: 0.05
  lambda: 1
  int8_mask: true
dtype: bfloat16
```
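
Since the card is tagged with `library_name: transformers`, the merged weights load like any other causal LM. The snippet below is a minimal sketch, not part of the original card: the repository id `your-namespace/output` is a placeholder for wherever this merge is published (or the local output directory produced by mergekit).

```python
# Minimal sketch of loading the merged model with transformers.
# "your-namespace/output" is a placeholder: substitute the actual
# repository id or the local directory produced by mergekit.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "your-namespace/output"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",           # shard the 123B model across available GPUs
)

inputs = tokenizer("Once upon a time", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a 123B-parameter model in bfloat16 needs roughly 250 GB of accelerator memory, so `device_map="auto"` (or a quantized variant) is effectively required for inference.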