Mungert/KernelLLM-GGUF

Tags: Transformers · GGUF · imatrix · conversational
Datasets: ScalingIntelligence/KernelBench · GPUMODE/KernelBook
License: other
1 contributor (Mungert) · 20 commits · latest commit 630e935 (verified, 21 days ago): "Upload KernelLLM-q4_k_m.gguf with huggingface_hub"
Files and versions (all GGUF files are marked Safe, stored on Xet, and were uploaded with huggingface_hub 21 days ago):

| File | Size |
|------|------|
| .gitattributes | 2.52 kB |
| KernelLLM-bf16_q4_k.gguf | 9.7 GB |
| KernelLLM-bf16_q6_k.gguf | 10.8 GB |
| KernelLLM-bf16_q8_0.gguf | 11.9 GB |
| KernelLLM-f16.gguf | 16.1 GB |
| KernelLLM-f16_q4_k.gguf | 9.7 GB |
| KernelLLM-f16_q6_k.gguf | 10.8 GB |
| KernelLLM-f16_q8_0.gguf | 11.9 GB |
| KernelLLM-q2_k_l.gguf | 3.66 GB |
| KernelLLM-q2_k_m.gguf | 3.4 GB |
| KernelLLM-q2_k_s.gguf | 3.08 GB |
| KernelLLM-q3_k_l.gguf | 4.44 GB |
| KernelLLM-q3_k_m.gguf | 4.19 GB |
| KernelLLM-q3_k_s.gguf | 3.79 GB |
| KernelLLM-q4_k_l.gguf | 5.26 GB |
| KernelLLM-q4_k_m.gguf | 5.01 GB |
| KernelLLM-q5_k_l.gguf | 6.08 GB |
| KernelLLM-q6_k_l.gguf | 6.85 GB |
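Since every model file in this repo is a GGUF quantization, a typical workflow is to download a single quant with huggingface_hub and load it with a GGUF-capable runtime such as llama.cpp. The snippet below is a minimal sketch, assuming the llama-cpp-python package is installed and that the Q4_K_M quant (5.01 GB) is an acceptable size/quality trade-off for your hardware; the prompt text is purely illustrative and the chat-completion call assumes the GGUF metadata carries a usable chat template.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # assumes `pip install llama-cpp-python`

# Download one quantized file from this repo (~5 GB for Q4_K_M).
model_path = hf_hub_download(
    repo_id="Mungert/KernelLLM-GGUF",
    filename="KernelLLM-q4_k_m.gguf",
)

# Load the GGUF file; n_gpu_layers=-1 offloads all layers to the GPU if one is available.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

# Illustrative prompt; the KernelBench/KernelBook dataset tags suggest GPU-kernel generation tasks.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a Triton kernel for elementwise ReLU."}],
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```

To trade memory for accuracy, swap `filename` for any other entry in the table above, e.g. the smaller KernelLLM-q2_k_s.gguf (3.08 GB) or the full-precision KernelLLM-f16.gguf (16.1 GB).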