🐱cat architecture gguf encoder plus safetensors tokenizer (paired)

  • cat-gemma2
  • cat-umt5xxl

how to use

  • drag both the gguf encoder and the safetensors tokenizer into ./ComfyUI/models/text_encoders (a download sketch follows this list)
  • select the gguf encoder in the gguf clip loader; the two files will pair up automatically
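
if you'd rather script it than drag the files, here is a minimal python sketch using huggingface_hub; the repo id and both filenames below are placeholders, not the actual entries, so swap in the files listed on this page

```python
# minimal sketch: fetch the paired files into the ComfyUI text_encoders folder
# note: repo_id and both filenames are placeholders, not the real entries on this page
from huggingface_hub import hf_hub_download

repo_id = "your-namespace/cat-encoder"   # placeholder repo id
files = [
    "cat-gemma2-q4_0.gguf",              # placeholder gguf encoder file
    "cat-gemma2-tokenizer.safetensors",  # placeholder safetensors tokenizer file
]

# both files must land in the same folder so the gguf clip loader can pair them up
for name in files:
    hf_hub_download(
        repo_id=repo_id,
        filename=name,
        local_dir="./ComfyUI/models/text_encoders",
    )
```

once both files are in place, refresh ComfyUI and the encoder should show up in the gguf clip loader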

reference

  • update gguf-node to the latest version for cat-encoder support (a quick version check is sketched after this list)
  • gguf-node (pypi|repo|pack)
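
to check whether the environment already has a recent enough build, a quick version query like the sketch below can help; it assumes the pypi distribution is literally named gguf-node

```python
# minimal sketch: report the installed gguf-node version
# assumption: the distribution on pypi is named "gguf-node"
from importlib.metadata import PackageNotFoundError, version

try:
    print("gguf-node", version("gguf-node"))
except PackageNotFoundError:
    print("gguf-node not installed; grab it from pypi, the repo, or the pack linked above")
```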

gguf details

  • architecture: cat
  • model size: 2.61B params
  • quantization options: 2-bit, 4-bit, 5-bit, 6-bit, 8-bit
