jerryzh168/Qwen3-8B-INT4

Tags: Text Generation · Transformers · PyTorch · English · qwen3 · torchao · conversational · text-generation-inference
Paper: arXiv:2507.16099
License: apache-2.0
Files and versions (branch: main)

1 contributor · History: 4 commits
Latest commit: 320a2cf (verified) · jerryzh168 · "Upload README.md with huggingface_hub" · 6 days ago
.gitattributes · Safe · 1.57 kB · Upload tokenizer · 6 days ago
README.md · 11.9 kB · Upload README.md with huggingface_hub · 6 days ago
added_tokens.json · Safe · 707 Bytes · Upload tokenizer · 6 days ago
chat_template.jinja · Safe · 4.17 kB · Upload tokenizer · 6 days ago
config.json · 1.63 kB · Upload Qwen3ForCausalLM · 6 days ago
generation_config.json · Safe · 219 Bytes · Upload Qwen3ForCausalLM · 6 days ago
merges.txt · Safe · 1.67 MB · Upload tokenizer · 6 days ago
pytorch_model-00001-of-00002.bin · pickle · 4.94 GB · Upload Qwen3ForCausalLM · 6 days ago
  Detected pickle imports (14): torch._tensor._rebuild_from_type_v2, torch.int32, torchao.dtypes.affine_quantized_tensor.AffineQuantizedTensor, torch._utils._rebuild_wrapper_subclass, torchao.dtypes.uintx.tensor_core_tiled_layout.TensorCoreTiledLayout, torch.serialization._get_layout, torchao.dtypes.uintx.tensor_core_tiled_layout.TensorCoreTiledAQTTensorImpl, torch._utils._rebuild_tensor_v2, torch.device, collections.OrderedDict, torch.BFloat16Storage, torch.bfloat16, torch.IntStorage, torchao.quantization.quant_primitives.ZeroPointDomain
pytorch_model-00002-of-00002.bin · pickle · 1.24 GB · Upload Qwen3ForCausalLM · 6 days ago
  Detected pickle imports (3): torch._utils._rebuild_tensor_v2, collections.OrderedDict, torch.BFloat16Storage
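The "detected pickle imports" annotations above come from scanning the pickle opcode stream of each .bin file for the module/class references it would import on load. A minimal sketch of such a scan using only the standard-library pickletools (a hypothetical helper, not Hugging Face's actual scanner; it handles the common GLOBAL/STACK_GLOBAL opcodes but not memoized string references):

```python
import collections
import pickle
import pickletools

def pickle_imports(data: bytes) -> set:
    """Return the (module, name) pairs a pickle stream would import.

    GLOBAL is the opcode emitted by pickle protocol 2, which is what
    torch.save uses by default; STACK_GLOBAL (protocol 4+) takes its
    module and name from the two most recent string arguments.
    """
    imports = set()
    recent_strings = []
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # arg is "module name", space-separated
            module, name = arg.split(" ", 1)
            imports.add((module, name))
        elif opcode.name == "STACK_GLOBAL":
            imports.add((recent_strings[-2], recent_strings[-1]))
        elif isinstance(arg, str):
            recent_strings.append(arg)
    return imports

# A state dict pickled the way torch.save does (protocol 2)
# pulls in collections.OrderedDict, as listed above:
blob = pickle.dumps(collections.OrderedDict(weight=[1.0, 2.0]), protocol=2)
print(pickle_imports(blob))  # {('collections', 'OrderedDict')}
```

Because unpickling executes these imports (and any `__reduce__` callables), loading a .bin checkpoint is only safe when every listed import is trusted; safetensors checkpoints avoid the issue entirely by storing raw tensors without code.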
pytorch_model.bin.index.json · Safe · 32.9 kB · Upload Qwen3ForCausalLM · 6 days ago
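The index file is what lets the two shards above be loaded as one model: in the standard Hugging Face sharded-checkpoint layout, it maps every tensor name to the shard file that contains it. A schematic fragment (tensor names and total size are illustrative, not read from this repo):

```json
{
  "metadata": { "total_size": 6180000000 },
  "weight_map": {
    "model.embed_tokens.weight": "pytorch_model-00001-of-00002.bin",
    "lm_head.weight": "pytorch_model-00002-of-00002.bin"
  }
}
```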
special_tokens_map.json · Safe · 613 Bytes · Upload tokenizer · 6 days ago
tokenizer.json · Safe · 11.4 MB · Upload tokenizer · 6 days ago
tokenizer_config.json · Safe · 5.4 kB · Upload tokenizer · 6 days ago
vocab.json · Safe · 2.78 MB · Upload tokenizer · 6 days ago