Details

EXL2 quant at 3.13 bits per weight (bpw) with an 8-bit head (hb), on the main branch.

Collection including R136a1/TimeMax-20B-exl2