# mradermacher/MediX-R1-30B-GGUF
## About
static quants of https://huggingface.co/MBZUAI/MediX-R1-30B
*For a convenient overview and download list, visit our model page for this model.*
weighted/imatrix quants are available at https://huggingface.co/mradermacher/MediX-R1-30B-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files.
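As a quick illustration (not part of this repository), the sketch below shows roughly how one of the static quants could be fetched and loaded with the `huggingface_hub` and `llama-cpp-python` packages; the chosen quant, context size, and prompt are only examples to adapt to your setup:

```python
# Illustrative sketch only: assumes huggingface_hub and llama-cpp-python
# are installed (pip install huggingface_hub llama-cpp-python).
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one of the single-file quants listed below (Q4_K_S as an example).
model_path = hf_hub_download(
    repo_id="mradermacher/MediX-R1-30B-GGUF",
    filename="MediX-R1-30B.Q4_K_S.gguf",
)

# Load the GGUF file; n_ctx and n_gpu_layers are placeholder values
# to tune for your hardware.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

out = llm("What are the first-line treatments for community-acquired pneumonia?",
          max_tokens=256)
print(out["choices"][0]["text"])
```

The quants in this repository are single files, so no concatenation step is needed.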
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| GGUF | mmproj-Q8_0 | 0.8 | multi-modal supplement |
| GGUF | mmproj-f16 | 1.2 | multi-modal supplement |
| GGUF | Q2_K | 11.4 | |
| GGUF | Q3_K_S | 13.4 | |
| GGUF | Q3_K_M | 14.8 | lower quality |
| GGUF | Q3_K_L | 16.0 | |
| GGUF | IQ4_XS | 16.7 | |
| GGUF | Q4_K_S | 17.6 | fast, recommended |
| GGUF | Q4_K_M | 18.7 | fast, recommended |
| GGUF | Q5_K_S | 21.2 | |
| GGUF | Q5_K_M | 21.8 | |
| GGUF | Q6_K | 25.2 | very good quality |
| GGUF | Q8_0 | 32.6 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better):

And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
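Purely as an illustration of the size/quality trade-off in the table above, a small hypothetical helper like the following could pick the largest quant whose file fits a given memory budget (sizes copied from the Size/GB column; a real setup also needs headroom for the KV cache and runtime overhead):

```python
# Hypothetical helper, not an official tool: sizes come from the table above.
QUANT_SIZES_GB = {
    "Q2_K": 11.4, "Q3_K_S": 13.4, "Q3_K_M": 14.8, "Q3_K_L": 16.0,
    "IQ4_XS": 16.7, "Q4_K_S": 17.6, "Q4_K_M": 18.7, "Q5_K_S": 21.2,
    "Q5_K_M": 21.8, "Q6_K": 25.2, "Q8_0": 32.6,
}

def largest_fitting_quant(budget_gb: float, headroom_gb: float = 2.0):
    """Return the biggest quant whose file fits into budget_gb minus headroom."""
    usable = budget_gb - headroom_gb
    fitting = [(size, name) for name, size in QUANT_SIZES_GB.items() if size <= usable]
    return max(fitting)[1] if fitting else None

print(largest_fitting_quant(24.0))  # -> "Q5_K_M" for a 24 GB budget
```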
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, nethype GmbH, for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.
## Files & Weights
| Filename | Size |
|---|---|
| MediX-R1-30B.IQ4_XS.gguf | 15.42 GB |
| MediX-R1-30B.Q2_K.gguf | 10.49 GB |
| MediX-R1-30B.Q3_K_L.gguf | 14.81 GB |
| MediX-R1-30B.Q3_K_M.gguf | 13.70 GB |
| MediX-R1-30B.Q3_K_S.gguf | 12.38 GB |
| MediX-R1-30B.Q4_K_M.gguf | 17.28 GB |
| MediX-R1-30B.Q4_K_S.gguf | 16.26 GB |
| MediX-R1-30B.Q5_K_M.gguf | 20.23 GB |
| MediX-R1-30B.Q5_K_S.gguf | 19.63 GB |
| MediX-R1-30B.Q6_K.gguf | 23.37 GB |
| MediX-R1-30B.Q8_0.gguf | 30.25 GB |
| MediX-R1-30B.mmproj-Q8_0.gguf | 0.66 GB |
| MediX-R1-30B.mmproj-f16.gguf | 1.01 GB |
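For the multi-modal supplement files above, here is a hedged sketch of fetching a main quant together with its mmproj projector (again using `huggingface_hub`); how the two files are passed to the runtime depends on which multimodal-capable llama.cpp frontend you use:

```python
# Illustrative only: downloads a main quant plus the mmproj projector file.
# Multimodal-capable llama.cpp frontends load the two files together; the
# exact option for doing so depends on the tool.
from huggingface_hub import hf_hub_download

repo = "mradermacher/MediX-R1-30B-GGUF"
model_path = hf_hub_download(repo_id=repo, filename="MediX-R1-30B.Q4_K_M.gguf")
mmproj_path = hf_hub_download(repo_id=repo, filename="MediX-R1-30B.mmproj-f16.gguf")

print("model:", model_path)
print("mmproj:", mmproj_path)
```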