This was made as a test to see if my PC can handle merging.

Alpacino-SuperCOT-13B Recipe

Alpacino-13B + LLaMa-SuperCOT-13B (50%/50%)

Original Models:

Alpacino-13B: https://huggingface.co/digitous/Alpacino13b

LLaMa-SuperCOT-13B: https://huggingface.co/ausboss/llama-13b-supercot
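
Below is a minimal sketch of how a 50%/50% parameter merge like this can be done, assuming both checkpoints share the LLaMA-13B architecture and identical tensor shapes. The model IDs come from the links above; the output directory name is only illustrative.

```python
# Minimal sketch of a 50%/50% weight merge of two LLaMA-13B checkpoints.
# Assumes both models share the same architecture and tensor shapes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "digitous/Alpacino13b"
donor_id = "ausboss/llama-13b-supercot"

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
donor = AutoModelForCausalLM.from_pretrained(donor_id, torch_dtype=torch.float16)

donor_sd = donor.state_dict()
merged_sd = base.state_dict()

# Average every shared tensor; skip anything missing or shape-mismatched.
for name, tensor in merged_sd.items():
    if name in donor_sd and donor_sd[name].shape == tensor.shape:
        merged_sd[name] = (
            (tensor.float() + donor_sd[name].float()).div_(2.0).to(torch.float16)
        )

base.load_state_dict(merged_sd)
base.save_pretrained("Alpacino-SuperCOT-13B")  # hypothetical output path
AutoTokenizer.from_pretrained(base_id).save_pretrained("Alpacino-SuperCOT-13B")
```

Merging two 13B FP16 checkpoints this way needs roughly 50 GB of free RAM (or swap), since both models are held in memory at once.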
