EXL3 quantization of Nanbeige4.1-3B, 8 bits per weight, including output layers.
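As a rough sanity check of what "8 bits per weight" implies for on-disk size, the sketch below estimates the weight-data footprint of a ~3B-parameter model at 8 bpw. The helper name `quant_size_gb` is illustrative, not part of any library; actual file size varies with metadata, tensor layout, and which layers are quantized at which width.

```python
def quant_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Estimate weight storage for a quantized model: params * bpw / 8 bytes."""
    return n_params * bits_per_weight / 8 / 1e9

# ~3B parameters at 8 bits per weight -> roughly 3 GB of weight data
print(round(quant_size_gb(3e9, 8.0), 1))  # 3.0
```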


Model tree for isogen/Nanbeige4.1-3B-exl3-8bpw-h8