MaziyarPanahi/xLAM-8x22b-r-GGUF
Tags: Text Generation · GGUF · quantized (2-, 3-, 4-, 5-, 6-, and 8-bit precision) · imatrix · conversational
1 contributor · 6 commits. Latest commit: "Update README.md" by MaziyarPanahi (eefb219, verified, 8 months ago)
| File | Size | Flags | Last commit | Age |
| --- | --- | --- | --- | --- |
| .gitattributes | 2.91 kB | | Upload folder using huggingface_hub (#3) | 8 months ago |
| README.md | 2.96 kB | Safe | Update README.md | 8 months ago |
| xLAM-8x22b-r.IQ1_M.gguf | 32.7 GB | Safe, LFS | Upload folder using huggingface_hub (#2) | 9 months ago |
| xLAM-8x22b-r.IQ1_S.gguf | 29.7 GB | Safe, LFS | Upload xLAM-8x22b-r.IQ1_S.gguf with huggingface_hub | 9 months ago |
| xLAM-8x22b-r.IQ2_XS.gguf | 42 GB | Safe, LFS | Upload folder using huggingface_hub (#2) | 9 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00001-of-00005.gguf | 13.5 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00002-of-00005.gguf | 12.6 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00003-of-00005.gguf | 13.2 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00004-of-00005.gguf | 13.3 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.IQ3_XS.gguf-00005-of-00005.gguf | 5.61 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00001-of-00005.gguf | 17.1 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00002-of-00005.gguf | 16.6 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00003-of-00005.gguf | 17.4 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00004-of-00005.gguf | 17.4 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.IQ4_XS.gguf-00005-of-00005.gguf | 6.86 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00001-of-00005.gguf | 11.8 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00002-of-00005.gguf | 11.4 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00003-of-00005.gguf | 12 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00004-of-00005.gguf | 12 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
| xLAM-8x22b-r.Q2_K.gguf-00005-of-00005.gguf | 4.79 GB | Safe, LFS | Upload folder using huggingface_hub (#3) | 8 months ago |
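The larger quants in this listing (IQ3_XS, IQ4_XS, Q2_K) are split into five shards following the repo's `*.gguf-0000N-of-00005.gguf` naming. A minimal sketch of fetching all shards of one quant with `huggingface_hub` — the helper names (`shard_filenames`, `download_quant`) are mine, not part of the repo, and whether llama.cpp can load the downloaded parts directly or needs them merged first depends on how they were split:

```python
def shard_filenames(base: str, quant: str, n_shards: int) -> list[str]:
    """Build the split-GGUF filenames used in this repo,
    e.g. xLAM-8x22b-r.Q2_K.gguf-00001-of-00005.gguf."""
    return [
        f"{base}.{quant}.gguf-{i:05d}-of-{n_shards:05d}.gguf"
        for i in range(1, n_shards + 1)
    ]


def download_quant(quant: str, n_shards: int = 5) -> list[str]:
    """Download every shard of one quant; returns local cache paths.

    Requires `pip install huggingface_hub`. Expect 50+ GB of traffic
    for the quants in this repo.
    """
    from huggingface_hub import hf_hub_download  # assumption: package installed

    repo_id = "MaziyarPanahi/xLAM-8x22b-r-GGUF"
    return [
        hf_hub_download(repo_id=repo_id, filename=name)
        for name in shard_filenames("xLAM-8x22b-r", quant, n_shards)
    ]
```

For example, `download_quant("Q2_K")` would pull the five Q2_K shards listed above into the local Hugging Face cache.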