Models

FP16 models and LoRAs that we've created:

- royallab/Pygmalion-2-13b-SuperCOT • Text Generation • 13B • Updated Sep 13, 2023 • 854 • 7
- royallab/PsyOrca2-13b-DARE • Text Generation • 13B • Updated Nov 28, 2023 • 790 • 1
- royallab/Aetheria-L2-70B • Text Generation • 69B • Updated Nov 27, 2023 • 8 • 9
- royallab/ZephRP-m7b • Text Generation • 7B • Updated Oct 12, 2023 • 12.1k • 6
Exl2

Models that we have quantized to exllamav2's quantization format:

- royallab/Buttercup-4x7B-exl2 • Updated Jan 26, 2024 • 1
- royallab/Norobara-ZLoss-8x7B-exl2 • Updated Jan 4, 2024 • 1
- royallab/Kimiko-10.7B-v3-exl2 • Updated Jan 3, 2024
- royallab/TinyLlama-1.1B-ckpt-2.5T-exl2 • Updated Dec 22, 2023 • 3