Plesio-32B

Created by
Delta-Vector
Model Information
Another series of merges! Since I could never beat Archaeo-32B-KTO, this time I'm starting off with a GLM merge between Rei and Neon (thanks, Auri!!!), using the oh-so-great 0.2 SLERP merge weight with Neon as the base.
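The exact config is linked below under "See Merging Config"; as a rough illustration, a mergekit-style SLERP merge with Neon as the base and a 0.2 interpolation weight would look something like this (the model paths here are placeholders, not the real repo names):

```yaml
# Sketch of a mergekit SLERP config; verify against the linked .yml.
# "path/to/Neon" and "path/to/Rei" are placeholders.
models:
  - model: path/to/Neon
  - model: path/to/Rei
merge_method: slerp
base_model: path/to/Neon
parameters:
  t: 0.2   # interpolation weight toward the non-base model
dtype: bfloat16
```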
Support me on Ko-Fi: https://ko-fi.com/deltavector
Quantized Versions
Available Downloads
- GGUF Format, for use with llama.cpp & forks (Coming Soon!)
- EXL2 Format, for use with TabbyAPI (Coming Soon!)
- EXL3 Format, for use with TabbyAPI (Slower on Ampere)
Prompting
The model has been tuned with the GLM-4 chat formatting.
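As a quick sketch, the GLM-4 chat template looks roughly like the helper below; verify the exact special tokens against the tokenizer's `chat_template` before relying on it:

```python
def format_glm4(system: str, user: str) -> str:
    """Build a GLM-4-style prompt string (sketch; check the tokenizer's
    chat_template for the authoritative format)."""
    return (
        "[gMASK]<sop>"
        f"<|system|>\n{system}"
        f"<|user|>\n{user}"
        "<|assistant|>\n"
    )

prompt = format_glm4("You are a helpful assistant.", "Hello!")
print(prompt)
```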
Samplers
For testing this model, I used Temp = 1 and Min-P = 0.1.
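For reference, Min-P filtering keeps only the tokens whose probability is at least `min_p` times the probability of the most likely token. A minimal sketch of that sampling step (not the actual backend implementation):

```python
import math
import random

def min_p_sample(logits, temperature=1.0, min_p=0.1):
    """Sample a token index using temperature + Min-P filtering (sketch)."""
    # Temperature scaling, then a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Min-P: discard tokens below min_p * (top token's probability).
    threshold = min_p * max(probs)
    kept = [(i, p) for i, p in enumerate(probs) if p >= threshold]

    # Sample from the renormalized surviving tokens.
    r = random.random() * sum(p for _, p in kept)
    acc = 0.0
    for i, p in kept:
        acc += p
        if acc >= r:
            return i
    return kept[-1][0]
```

With one token far more likely than the rest, Min-P prunes everything else, so sampling becomes deterministic.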
See Merging Config
https://files.catbox.moe/j9kyfy.yml
Credits
Thank you to Lucy Knada, Auri, Ateron, Alicat, Intervitens, Cgato, Kubernetes Bad and the rest of Anthracite.