---
base_model:
- LatitudeGames/Wayfarer-Large-70B-Llama-3.3
- SentientAGI/Dobby-Unhinged-Llama-3.3-70B
- SicariusSicariiStuff/Negative_LLAMA_70B
library_name: transformers
tags:
- mergekit
- merge
---
# about

A dark-coloration L3.3 merge, meant to be included in my merges. It can also be tried as a standalone for a darker Llama experience, but I didn't take the time to test it.

Edit: I took the time, and it meets its purpose.

- It's average on the basic metrics (smarts, perplexity), but it's not woke and is indeed unhinged.
- The model is not abliterated, though: it still has refusals on the usual point-blank questions.
- I will play with it more, because it has potential.

My note: 3/5 as a standalone, 4/5 as a merge brick.

Warning: this model can be brutal and vulgar, more than most of my previous merges.

---
# benchs

- PPL512 WikiText Eng: 3.66 (average ++)
- ARC-C: 55.85 (average)
- ARC-E: 77.72 (average)

---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [SicariusSicariiStuff/Negative_LLAMA_70B](https://huggingface.co/SicariusSicariiStuff/Negative_LLAMA_70B) as the base.
### Models Merged

The following models were included in the merge:

* [LatitudeGames/Wayfarer-Large-70B-Llama-3.3](https://huggingface.co/LatitudeGames/Wayfarer-Large-70B-Llama-3.3)
* [SentientAGI/Dobby-Unhinged-Llama-3.3-70B](https://huggingface.co/SentientAGI/Dobby-Unhinged-Llama-3.3-70B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: model_stock
models:
  - model: SentientAGI/Dobby-Unhinged-Llama-3.3-70B
    parameters:
      weight: 1.0
  - model: LatitudeGames/Wayfarer-Large-70B-Llama-3.3
    parameters:
      weight: 1.0
base_model: SicariusSicariiStuff/Negative_LLAMA_70B
dtype: bfloat16
out_dtype: bfloat16
parameters:
  int8_mask: true
  normalize: true
  rescale: false
  filter_wise: false
  smooth: false
  allow_negative_weights: false
chat_template: auto
tokenizer:
  source: union
```
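For intuition, the core idea of the Model Stock method used above can be sketched in a few lines of pure Python. Per weight tensor, it interpolates between the base model and the average of the fine-tunes, with a ratio derived from the angle between their task vectors. This is a simplified two-model sketch of the paper's formula, not mergekit's actual implementation, and the function name is my own:

```python
import math

def model_stock_merge(base, ft1, ft2):
    """Simplified two-model Model Stock merge of one weight tensor,
    flattened to a list of floats. Illustrative only."""
    # Task vectors: each fine-tune's delta from the base model.
    d1 = [a - b for a, b in zip(ft1, base)]
    d2 = [a - b for a, b in zip(ft2, base)]
    # Cosine of the angle between the two task vectors.
    dot = sum(x * y for x, y in zip(d1, d2))
    n1 = math.sqrt(sum(x * x for x in d1))
    n2 = math.sqrt(sum(x * x for x in d2))
    cos = dot / (n1 * n2)
    # Interpolation ratio from the Model Stock paper: t = 2cos / (1 + cos).
    t = 2 * cos / (1 + cos)
    # Move from the base toward the average of the fine-tunes by ratio t.
    avg = [(a + b) / 2 for a, b in zip(ft1, ft2)]
    return [t * a + (1 - t) * b for a, b in zip(avg, base)]
```

When the two task vectors agree (cos close to 1), the merge lands near their average; when they are nearly orthogonal, t shrinks and the result stays close to the base — here, Negative_LLAMA_70B. mergekit generalizes this per-layer across N models.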