---
base_model: Qwen/Qwen2.5-0.5B
language:
- en
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct/blob/main/LICENSE
pipeline_tag: text-generation
tags:
- chat
- autoquant
- exl2
---

# Qwen2.5-0.5B-Instruct

## Introduction

This model is based on Qwen2.5-0.5B-Instruct and is quantized to 4 bits in the EXL2 format using the AutoQuant notebook: https://colab.research.google.com/drive/1b6nqC7UZVt8bx4MksX7s656GXPM-eWw4

You can learn more about the EXL2 format here: https://github.com/turboderp/exllamav2

Feel free to use it however you like.
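
## Usage

Because the weights are stored in EXL2 format, they are loaded with the exllamav2 library rather than plain `transformers`. Below is a minimal sketch following the exllamav2 example scripts; the local model directory is a placeholder, and exact class names or signatures may vary between library versions.

```python
# Minimal sketch: loading an EXL2 quant with the exllamav2 Python package.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./Qwen2.5-0.5B-Instruct-exl2"  # placeholder: path to the downloaded EXL2 weights
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # load the model, splitting layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

prompt = "Give me a short introduction to large language models."
output = generator.generate_simple(prompt, settings, num_tokens=200)
print(output)
```

For chat-style use, format the prompt with the Qwen ChatML template before passing it to the generator.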