1 parent 35be8fa commit 3a886bd
docs/features/quantization/bnb.md
@@ -15,7 +15,7 @@ pip install bitsandbytes>=0.45.3
vLLM reads the model's config file and supports both in-flight quantization and pre-quantized checkpoints.
-You can find bitsandbytes quantized models on <https://huggingface.co/models?search=bitsandbytes>.
+You can find bitsandbytes quantized models on [Hugging Face](https://huggingface.co/models?search=bitsandbytes).
Usually, these repositories include a config.json file with a quantization_config section.
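As a quick check, you can tell a pre-quantized bitsandbytes checkpoint apart from an unquantized one by looking for that section in its config.json. A minimal sketch (the config excerpt and the helper function below are illustrative, not part of vLLM's API):

```python
import json

def is_bnb_prequantized(config_text: str) -> bool:
    """Return True if a model config declares a bitsandbytes quantization_config."""
    config = json.loads(config_text)
    quant = config.get("quantization_config")
    return quant is not None and quant.get("quant_method") == "bitsandbytes"

# Illustrative config.json excerpt for a 4-bit bitsandbytes checkpoint.
sample_config = """
{
  "model_type": "llama",
  "quantization_config": {
    "quant_method": "bitsandbytes",
    "load_in_4bit": true
  }
}
"""

print(is_bnb_prequantized(sample_config))  # True for this sample
```

If the section is absent, the checkpoint is unquantized and in-flight quantization would apply instead.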
## Read quantized checkpoint