MaziyarPanahi / xLAM-8x22b-r-GGUF
Tags: Text Generation · GGUF · quantized · 2-bit · 3-bit · 4-bit precision · 5-bit · 6-bit · 8-bit precision · imatrix · conversational
1 contributor · History: 10 commits
Latest commit 811ee35 (verified, 11 months ago) by MaziyarPanahi: Upload xLAM-8x22b-r.Q4_K_M.gguf-00002-of-00005.gguf with huggingface_hub
| File | Size | LFS | Last commit | Age |
|---|---|---|---|---|
| .gitattributes | 3.23 kB | | Upload xLAM-8x22b-r.Q4_K_M.gguf-00002-of-00005.gguf with huggingface_hub | 11 months ago |
| README.md | 2.96 kB | | Update README.md (#4) | 11 months ago |
| xLAM-8x22b-r.IQ1_M.gguf | 32.7 GB | LFS | Upload folder using huggingface_hub (#2) | 11 months ago |
| xLAM-8x22b-r.IQ1_S.gguf | 29.7 GB | LFS | Upload xLAM-8x22b-r.IQ1_S.gguf with huggingface_hub | 11 months ago |
| xLAM-8x22b-r.IQ2_XS.gguf | 42 GB | LFS | Upload folder using huggingface_hub (#2) | 11 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00001-of-00005.gguf | 13.5 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00002-of-00005.gguf | 12.6 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00003-of-00005.gguf | 13.2 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00004-of-00005.gguf | 13.3 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00005-of-00005.gguf | 5.61 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00001-of-00005.gguf | 17.1 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00002-of-00005.gguf | 16.6 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00003-of-00005.gguf | 17.4 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00004-of-00005.gguf | 17.4 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00005-of-00005.gguf | 6.86 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00001-of-00005.gguf | 11.8 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00002-of-00005.gguf | 11.4 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00003-of-00005.gguf | 12 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00004-of-00005.gguf | 12 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00005-of-00005.gguf | 4.79 GB | LFS | Upload folder using huggingface_hub (#3) | 11 months ago |
| xLAM-8x22b-r.Q3_K_M.gguf-00005-of-00005.gguf | 6.15 GB | LFS | Upload xLAM-8x22b-r.Q3_K_M.gguf-00005-of-00005.gguf with huggingface_hub | 11 months ago |
| xLAM-8x22b-r.Q4_K_M.gguf-00002-of-00005.gguf | 18.4 GB | LFS | Upload xLAM-8x22b-r.Q4_K_M.gguf-00002-of-00005.gguf with huggingface_hub | 11 months ago |
| xLAM-8x22b-r.Q4_K_S.gguf-00002-of-00005.gguf | 17.6 GB | LFS | Upload xLAM-8x22b-r.Q4_K_S.gguf-00002-of-00005.gguf with huggingface_hub | 11 months ago |
| xLAM-8x22b-r.Q5_K_S.gguf-00005-of-00005.gguf | 8.77 GB | LFS | Upload xLAM-8x22b-r.Q5_K_S.gguf-00005-of-00005.gguf with huggingface_hub | 11 months ago |
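The larger quants above are split into five shards named `<base>.gguf-0000N-of-00005.gguf`, and the commit messages indicate they were uploaded with `huggingface_hub`. A minimal sketch of fetching all shards of one quant with the standard `hf_hub_download` API (the repo id and filenames are taken from this listing; the shard-name helper is my own, not part of any library):

```python
REPO_ID = "MaziyarPanahi/xLAM-8x22b-r-GGUF"  # this repo

def split_part_names(base: str, n_parts: int) -> list[str]:
    """Enumerate shard names in this repo's naming scheme, e.g.
    'xLAM-8x22b-r.IQ3_XS.gguf' -> '...IQ3_XS.gguf-00001-of-00005.gguf'."""
    return [f"{base}-{i:05d}-of-{n_parts:05d}.gguf" for i in range(1, n_parts + 1)]

if __name__ == "__main__":
    # Requires `pip install huggingface_hub`; downloads ~58 GB total
    # for the five IQ3_XS shards listed above.
    from huggingface_hub import hf_hub_download

    for name in split_part_names("xLAM-8x22b-r.IQ3_XS.gguf", 5):
        print(hf_hub_download(repo_id=REPO_ID, filename=name))
```

Note that only the IQ3_XS, IQ4_XS, and Q2_K splits have all five shards in this listing; for Q3_K_M, Q4_K_M, Q4_K_S, and Q5_K_S a single shard is present, so those quants cannot be assembled from this snapshot alone.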