mlx-community/ShowUI-2B-bf16-bf16

This model was converted to MLX format from prince-canuma/ShowUI-2B-bf16 using mlx-vlm version 0.1.14. Refer to the original model card for more details on the model.

Use with mlx

pip install -U mlx-vlm
python -m mlx_vlm.generate --model mlx-community/ShowUI-2B-bf16-bf16 --max-tokens 100 --temp 0.0 --prompt "Describe this image." --image <path_to_image>
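For programmatic use, mlx-vlm also exposes a Python API. The sketch below is a minimal example assuming the load/generate helpers and the apply_chat_template prompt utility shipped with mlx-vlm 0.1.x; check the mlx-vlm documentation for the exact signatures in your installed version.

```python
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "mlx-community/ShowUI-2B-bf16-bf16"

# Load the MLX weights and the matching processor.
model, processor = load(model_path)
config = load_config(model_path)

# One image and a text prompt, mirroring the CLI example above.
# "path/to/screenshot.png" is a placeholder path.
images = ["path/to/screenshot.png"]
prompt = "Describe this image."

# Wrap the prompt in the model's chat template before generating.
formatted_prompt = apply_chat_template(processor, config, prompt, num_images=len(images))

output = generate(model, processor, formatted_prompt, images, verbose=False)
print(output)
```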
Model size: 2.21B parameters · Tensor type: BF16 (safetensors)

Base model: Qwen/Qwen2-VL-2B