The-Omega-Directive-12B-v1.0

6 layers removed. It doesn't work. Don't download it. I am testing out some fine-tuning on these "scooped" models.

Using arcee-ai/PruneMe, I detected the least-used contiguous block of layers (25-30) and removed it, taking the model from 12B down to roughly 10.6B parameters.
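
For reference, the core of that detection step is just a similarity scan over hidden states. Here is a minimal sketch of the idea (not PruneMe's actual code): score every contiguous block of N layers by how little it changes the hidden states, and treat the most self-similar block as the safest to cut. The calibration text, block size N, and local model path are illustrative assumptions.

```python
# Minimal sketch of the layer-redundancy scan behind arcee-ai/PruneMe
# (not its actual code). MODEL_PATH, N, and the calibration text are
# illustrative assumptions.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "/storage/bases/The-Omega-Directive-M-12B-v1.0"
N = 6  # size of the contiguous block to consider removing

tok = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH, torch_dtype=torch.bfloat16, device_map="auto"
)

# One toy sample; in practice you average over a real calibration set.
inputs = tok("The quick brown fox jumps over the lazy dog.",
             return_tensors="pt").to(model.device)
with torch.no_grad():
    hs = model(**inputs, output_hidden_states=True).hidden_states

# hs[i] is the input to layer i, so comparing hs[i] with hs[i + N]
# measures what the block of layers [i, i + N) does to the representation.
scores = []
for i in range(len(hs) - N):
    sim = F.cosine_similarity(hs[i].float(), hs[i + N].float(), dim=-1).mean()
    scores.append((i, sim.item()))

start, sim = max(scores, key=lambda t: t[1])
print(f"Most redundant block: layers [{start}, {start + N}) (cos sim {sim:.4f})")
```

On deep decoder stacks, scans like this tend to flag a late-middle block, which is consistent with the 25-30 slice dropped in the config below.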

I am then hoping to fine-tune the hell out of it, to rebalance the remaining parameters.
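
For what it's worth, a minimal sketch of what such a healing pass could look like with trl + peft is below; the dataset, LoRA settings, and hyperparameters are placeholders, not the recipe actually used for this model.

```python
# Illustrative SFT "healing" pass over the pruned checkpoint.
# Dataset and hyperparameters are placeholders, not the real recipe.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# Placeholder training data; swap in whatever the tune actually targets.
dataset = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft[:1%]")

trainer = SFTTrainer(
    model="SuperbEmphasis/The-Omega-Directive-12B-EVISCERATED",
    train_dataset=dataset,
    args=SFTConfig(
        output_dir="./omega-healed",  # hypothetical output dir
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        learning_rate=2e-5,
        num_train_epochs=1,
    ),
    # LoRA keeps the healing pass cheap; a full fine-tune is the heavier option.
    peft_config=LoraConfig(r=64, lora_alpha=64, target_modules="all-linear"),
)
trainer.train()
```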

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the Passthrough merge method.

Models Merged

The following models were included in the merge:

  • /storage/bases/The-Omega-Directive-M-12B-v1.0

Configuration

The following YAML configuration was used to produce this model:

dtype: bfloat16
merge_method: passthrough
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 25]
        model: /storage/bases/The-Omega-Directive-M-12B-v1.0
    - sources:
      - layer_range: [31, 40]
        model: /storage/bases/The-Omega-Directive-M-12B-v1.0
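
Assuming mergekit is installed and the base model exists at that local path, saving the config as (say) config.yaml and running mergekit's CLI reproduces the slice:

mergekit-yaml config.yaml ./The-Omega-Directive-12B-EVISCERATED

Passthrough does no arithmetic on the weights: mergekit simply concatenates the two layer_range slices, so layers 25-30 of the base never make it into the output.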