unsloth/granite-4.0-350m-GGUF
4 likes · by Unsloth AI (11.3k followers)
Tags: Transformers · GGUF · language · unsloth · granite-4.0 · conversational
arxiv: 0000.00000
License: apache-2.0
granite-4.0-350m-GGUF · 4.36 GB · 1 contributor · History: 23 commits
Latest commit ef85b9c (verified) by danielhanchen · "Upload folder using huggingface_hub" · about 2 months ago
File                                Size
.gitattributes                      2.64 kB
README.md                           33.9 kB
granite-4.0-350m-IQ4_NL.gguf        229 MB
granite-4.0-350m-IQ4_XS.gguf        222 MB
granite-4.0-350m-Q2_K.gguf          181 MB
granite-4.0-350m-Q2_K_L.gguf        181 MB
granite-4.0-350m-Q3_K_M.gguf        208 MB
granite-4.0-350m-Q3_K_S.gguf        195 MB
granite-4.0-350m-Q4_0.gguf          229 MB
granite-4.0-350m-Q4_1.gguf          244 MB
granite-4.0-350m-Q4_K_M.gguf        237 MB
granite-4.0-350m-Q4_K_S.gguf        229 MB
granite-4.0-350m-Q5_K_M.gguf        264 MB
granite-4.0-350m-Q6_K.gguf          293 MB
granite-4.0-350m-UD-IQ1_M.gguf      136 MB
granite-4.0-350m-UD-IQ1_S.gguf      132 MB
granite-4.0-350m-UD-IQ2_M.gguf      163 MB
granite-4.0-350m-UD-IQ2_XXS.gguf    143 MB
granite-4.0-350m-UD-IQ3_XXS.gguf    172 MB
granite-4.0-350m-UD-Q2_K_XL.gguf    184 MB
granite-4.0-350m-UD-Q3_K_XL.gguf    213 MB
granite-4.0-350m-UD-Q4_K_XL.gguf    241 MB
granite-4.0-350m-UD-Q5_K_XL.gguf    265 MB

All files: "Upload folder using huggingface_hub" · about 2 months ago
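
Every file in this listing was pushed with huggingface_hub, and the same library can pull an individual quantization back down. The snippet below is only a sketch, assuming the Python huggingface_hub package is installed; the Q4_K_M file is an arbitrary pick, and any filename from the table above works in its place.

# Minimal sketch: fetch one GGUF quantization from this repo with huggingface_hub.
# The filename is an example; substitute any entry from the file table.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="unsloth/granite-4.0-350m-GGUF",
    filename="granite-4.0-350m-Q4_K_M.gguf",
)
print(f"GGUF saved to: {local_path}")

The returned path points into the local Hugging Face cache, so repeated calls reuse the already-downloaded file instead of fetching it again.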