---
license: llama3.1
---

Merged with mergekit using the following passthrough config:


```yaml
slices:
  - sources:
      - model: athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit
        layer_range: [0, 23]
  - sources:
      - model: athirdpath/Llama-3.1-Techne-RP-8b-v1
        layer_range: [9, 31]
merge_method: passthrough
dtype: float16
tokenizer_source: athirdpath/Llama-3.1-Techne-RP-8b-v1
```
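
To reproduce a merge like this, here is a minimal sketch using mergekit's Python API (assumes `pip install mergekit`; the config filename and output directory are placeholders, not part of this repo):

```python
# Sketch: run the passthrough merge above, saved as config.yaml.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML config into mergekit's configuration object.
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged-11b",  # hypothetical output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if available
        copy_tokenizer=True,  # copy the tokenizer named in tokenizer_source
    ),
)
```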


The merged model was then pretrained for 1 epoch on the Iambe dataset as an 11B model (the overlapping layer ranges in the passthrough merge stack the two 8B parents into a deeper, roughly 11B-parameter model).
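
The result loads like any other causal LM. A minimal inference sketch with the standard transformers API is below (the repo id is a placeholder; substitute this model's actual Hugging Face path):

```python
# Sketch: load the final merged-and-trained model for inference.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "athirdpath/<this-model>"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # the merge was produced in float16
    device_map="auto",
)

inputs = tokenizer("Hello,", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```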