# mlx-community/Huihui-GLM-4.5V-abliterated-mxfp4

This model was converted to MLX format from huihui-ai/Huihui-GLM-4.5V-abliterated using mlx-vlm with MXFP4 quantization support.
Refer to the original model card for more details on the model.

## Use with mlx

```bash
pip install git+https://github.com/zhutao100/mlx-vlm.git
```

```bash
python -m mlx_vlm.generate --model mlx-community/Huihui-GLM-4.5V-abliterated-mxfp4 --max-tokens 100 --temperature 0.0 --prompt "Describe this image." --image <path_to_image>
```
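The same generation can be driven from Python instead of the CLI. A minimal sketch using mlx-vlm's Python API (`load`, `generate`, `apply_chat_template`); the image path is a placeholder, and running it downloads the full model on first use:

```python
# Sketch: Python-API equivalent of the mlx_vlm.generate CLI call above.
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "mlx-community/Huihui-GLM-4.5V-abliterated-mxfp4"
model, processor = load(model_path)       # downloads weights on first run
config = load_config(model_path)

images = ["path/to/image.jpg"]            # hypothetical image path
prompt = "Describe this image."

# Wrap the raw prompt in the model's chat template before generating.
formatted = apply_chat_template(processor, config, prompt, num_images=len(images))
output = generate(model, processor, formatted, images,
                  max_tokens=100, temperature=0.0, verbose=False)
print(output)
```

This mirrors the CLI flags (`--max-tokens 100 --temperature 0.0`) as keyword arguments.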
Model size: 21B params, Safetensors (MLX format). Tensor types: U8, U32, BF16, F32.

## Model tree for mlx-community/Huihui-GLM-4.5V-abliterated-mxfp4

- Base model: zai-org/GLM-4.5V (finetuned)
- This model is one of 12 quantized versions.