float8-quantized Qwen/Qwen3-8B-Base model

  • Developed by: q10
  • License: apache-2.0
  • Quantized from model: Qwen/Qwen3-8B-Base

Model tree for q10/Qwen3-8B-Base-float8

  • Base model: Qwen/Qwen3-8B-Base
  • This model is one of 27 quantized versions derived from the base model.
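The card does not state how the float8 checkpoint should be loaded, so the snippet below is only a minimal sketch: it assumes the repository q10/Qwen3-8B-Base-float8 can be loaded through the standard transformers AutoModelForCausalLM / AutoTokenizer API, and that any quantization backend the checkpoint needs is already installed.

```python
# Minimal loading sketch (assumption: the checkpoint works with the
# standard transformers Auto* API; the required quantization backend,
# if any, is not specified in this card).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "q10/Qwen3-8B-Base-float8"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",    # place weights on the available GPU(s)
    torch_dtype="auto",   # keep the dtypes stored in the checkpoint
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```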