---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# The-Omega-Directive-12B-v1.0

6 layers removed (layers 25-30). It doesn't work. Don't download it.

I am testing out some fine-tuning on these "scooped" models. Using [arcee-ai/PruneMe](https://github.com/arcee-ai/PruneMe), I detected the least-used layers and removed them. I am then hoping to fine-tune the hell out of it, to rebalance the parameters. (Rough sketches of both steps are at the bottom of this card.)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the Passthrough merge method.

### Models Merged

The following models were included in the merge:
* /storage/bases/The-Omega-Directive-M-12B-v1.0

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: bfloat16
merge_method: passthrough
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 25]
            model: /storage/bases/The-Omega-Directive-M-12B-v1.0
      - sources:
          - layer_range: [31, 40]
            model: /storage/bases/The-Omega-Directive-M-12B-v1.0
```
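
For the curious, the layer-selection idea is roughly: run some sample text through the base model, measure how little each block of consecutive layers changes the hidden states, and drop the block that changes them least. The sketch below is my own minimal illustration of that idea, not PruneMe's actual code; the model path and calibration texts are placeholders.

```python
# Minimal sketch of block-similarity layer selection (the idea behind "scooping"
# the least-used layers). Not PruneMe's actual code; path and texts are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "/storage/bases/The-Omega-Directive-M-12B-v1.0"  # placeholder path
BLOCK = 6  # number of consecutive layers considered for removal

tok = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(
    MODEL, torch_dtype=torch.bfloat16, device_map="auto"
)
model.eval()

texts = ["Some representative sample text goes here."]  # placeholder calibration data

n_layers = model.config.num_hidden_layers
dist_sum = torch.zeros(n_layers - BLOCK + 1)

with torch.no_grad():
    for t in texts:
        ids = tok(t, return_tensors="pt").to(model.device)
        # hidden_states has n_layers + 1 entries: embeddings, then each layer's output
        hs = model(**ids, output_hidden_states=True).hidden_states
        for start in range(n_layers - BLOCK + 1):
            a = hs[start].float()          # input to layer `start`
            b = hs[start + BLOCK].float()  # output of layer `start + BLOCK - 1`
            cos = torch.nn.functional.cosine_similarity(a, b, dim=-1).mean()
            dist_sum[start] += 1.0 - cos   # small distance => block barely matters

best = int(dist_sum.argmin())
print(f"Least-used block: layers {best}..{best + BLOCK - 1}")
```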
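
And the "fine-tune the hell out of it" step would be ordinary continued training on the scooped model, e.g. LoRA adapters via `peft` plus the `transformers` Trainer. This is only a sketch under placeholder paths, data, and hyperparameters, not the actual recipe I'm using.

```python
# Sketch of the "rebalance by fine-tuning" step: LoRA training on the pruned model.
# Paths, data, and hyperparameters are placeholders, not the actual recipe.
import torch
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

PRUNED = "./The-Omega-Directive-12B-v1.0"  # placeholder: the merged/pruned output dir

tok = AutoTokenizer.from_pretrained(PRUNED)
tok.pad_token = tok.pad_token or tok.eos_token
model = AutoModelForCausalLM.from_pretrained(
    PRUNED, torch_dtype=torch.bfloat16, device_map="auto"
)

# Low-rank adapters on the attention projections; full fine-tuning would also work.
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
))

texts = ["placeholder training text"]  # placeholder corpus
ds = Dataset.from_dict({"text": texts}).map(
    lambda ex: tok(ex["text"], truncation=True, max_length=1024),
    batched=True, remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="./scooped-ft", per_device_train_batch_size=1,
        gradient_accumulation_steps=8, num_train_epochs=1,
        learning_rate=2e-5, bf16=True, logging_steps=10,
    ),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
)
trainer.train()
trainer.save_model("./scooped-ft")
```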