2B_or_not_2B

2B or not 2B, that's the question!

The model's name is fully credited to invisietch and Shakespeare; without them, this model would not have existed.

Why_Tha_Name_Though

Regarding the question, I am happy to announce that it is, in fact, 2B, as stated on the original Google model card of the base model this one was finetuned from.

If there's one thing we can count on, it is Google to tell us what is true, and what is misinformation. You should always trust and listen to your elders, and especially to your big brother.

This model was finetuned on a whimsical whim, on my poor laptop. It's not really poor, the GPU is an RTX 4090 with 16GB, but... it is driver-locked to 80 watts because NVIDIA probably does not have the resources to make better drivers for Linux. I hope NVIDIA will manage to recover, as I have seen poor Jensen in the same old black leather jacket for years upon years. The stock is already down about 22% this month (as of August 11th, 2024).

Finetuning took about 4 hours while the laptop was on my lap and I was talking about books and stuff on Discord. Luckily, the laptop wasn't too hot, as 80 watts is not the 175W I was promised, which would surely have been hot enough to make an omelette. Always remain an optimist, fellas!

2B_or_not_2B is available at the following quants:

Censorship level:

  • Low - Very low
  • 7.9 / 10 (10 completely uncensored)
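For anyone who wants to try the weights directly, below is a minimal loading sketch using Hugging Face transformers. It is not an official recipe from this card: the full-precision repo id SicariusSicariiStuff/2B_or_not_2B is assumed from the collection name, and it further assumes a transformers release with Gemma-2 support plus hardware that can hold the model in bf16.

```python
# Minimal loading sketch, NOT an official recipe from this card.
# Assumptions: full-precision repo id "SicariusSicariiStuff/2B_or_not_2B",
# a transformers release with Gemma-2 support, and bf16-capable hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SicariusSicariiStuff/2B_or_not_2B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # change to float16/float32 if your hardware needs it
    device_map="auto",           # requires accelerate; remove to load on a single device
)

prompt = "2B or not 2B, that is the question:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```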

Support

GPUs too expensive
  • My Ko-fi page: ALL donations will go toward research resources and compute, every bit counts 🙏🏻
  • My Patreon: ALL donations will go toward research resources and compute, every bit counts 🙏🏻

Disclaimer

This model is pretty uncensored, use responsibly.

Other stuff
