This repo contains the Gemma-3-1B model quantized to 8-bit in the RKLLM format.

Source model: Gemma-3-1B-it
The model was converted to the RKLLM format using rkllm-toolkit v1.2.0.
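
For reference, below is a minimal sketch of how such a conversion is typically done with the rkllm-toolkit Python API (RKLLM, load_huggingface, build, export_rkllm). The source path, quantization options, and target platform shown are assumptions for illustration, not the exact settings used to produce this model.

```python
# Sketch of an 8-bit RKLLM conversion with rkllm-toolkit.
# Paths, quantization settings, and the target platform are assumptions;
# they are not the exact options used for this repo.
from rkllm.api import RKLLM

llm = RKLLM()

# Load the source Hugging Face checkpoint (Gemma-3-1B-it).
ret = llm.load_huggingface(model="google/gemma-3-1b-it", device="cpu")
if ret != 0:
    raise RuntimeError("load_huggingface failed")

# Quantize to 8-bit and build for a Rockchip NPU target
# (the Axon SBC is assumed here to be an RK3588 board).
ret = llm.build(
    do_quantization=True,
    optimization_level=1,
    quantized_dtype="w8a8",
    target_platform="rk3588",
)
if ret != 0:
    raise RuntimeError("build failed")

# Export the .rkllm artifact.
ret = llm.export_rkllm("./Gemma-3-1B.rkllm")
if ret != 0:
    raise RuntimeError("export_rkllm failed")
```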

You can chat with this model on the Axon SBC using the vicharak-chat app.
