This is an MXFP4_MOE quantization of the model DeepSeek-V3.1-Terminus.
Quantized from the BF16 GGUFs at: https://huggingface.co/unsloth/DeepSeek-V3.1-Terminus-GGUF
Original model: https://huggingface.co/deepseek-ai/DeepSeek-V3.1-Terminus
The GGUF files have been removed from this repo to conserve storage space.
If you want them, just message me and I will make them available on demand.
Model tree for noctrex/DeepSeek-V3.1-Terminus-MXFP4_MOE-GGUF
- Base model: deepseek-ai/DeepSeek-V3.1-Base
- Quantized from: deepseek-ai/DeepSeek-V3.1-Terminus
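
Once the files are available, a minimal sketch of fetching and loading them via the llama.cpp Python bindings might look like the following. The filename below is a placeholder (check the repo's file listing; a quant this size is normally split into several shards, in which case you download all of them and point llama.cpp at the first), and running an MXFP4 quant requires a recent llama.cpp build.

```python
# Sketch: download the GGUF from the Hub and load it with llama-cpp-python.
# The filename is hypothetical -- check the actual file list in the repo.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # pip install llama-cpp-python

model_path = hf_hub_download(
    repo_id="noctrex/DeepSeek-V3.1-Terminus-MXFP4_MOE-GGUF",
    filename="DeepSeek-V3.1-Terminus-MXFP4_MOE-00001-of-00008.gguf",  # placeholder name
)

# Load the model (for split GGUFs, pointing at the first shard is enough).
llm = Llama(model_path=model_path, n_ctx=4096)
out = llm("Hello, how are you?", max_tokens=64)
print(out["choices"][0]["text"])
```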