Qwen3.5-9b-Sushi-Coder-GGUF

This repository contains GGUF exports for:

  • bigatuna/Qwen3.5-9b-Sushi-Coder

Files:

  • Qwen3.5-9b-Sushi-Coder.Q4_K_M.gguf
  • Qwen3.5-9b-Sushi-Coder.Q8_0.gguf
  • Qwen3.5-9b-Sushi-Coder.BF16-mmproj.gguf

Usage note:

  • This is a multimodal Qwen 3.5 export, so the text GGUF should be used together with the BF16-mmproj file.
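The pairing above can be expressed as a single command. This is a minimal sketch using llama.cpp's multimodal CLI; the tool name (`llama-mtmd-cli`) and flags are assumed from upstream llama.cpp, and `input.png` is a placeholder for your own image:

```shell
# Sketch, assuming llama.cpp is built and the GGUF files are in the
# current directory. --mmproj loads the vision projector alongside
# the text model, which is what makes the multimodal path work.
llama-mtmd-cli \
  -m Qwen3.5-9b-Sushi-Coder.Q4_K_M.gguf \
  --mmproj Qwen3.5-9b-Sushi-Coder.BF16-mmproj.gguf \
  --image input.png \
  -p "Describe this image."
```

For text-only use, the same model file can be served without the mmproj via the standard llama.cpp tools (e.g. `llama-cli -m Qwen3.5-9b-Sushi-Coder.Q4_K_M.gguf`).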
Model details:

  • Model size: 9B params
  • Architecture: qwen35