ModelCloud / MiniMax-M2-GPTQMODEL-W4A16
Text Generation · Safetensors · English
Tags: minimax, gptqmodel, modelcloud, chat, glm4.6, glm, instruct, int4, gptq, 4bit, w4a16, conversational, custom_code, 4-bit precision
License: modelcloud
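The tags above mark this repo as a 4-bit GPTQ (W4A16) quantization produced with GPTQModel. The following is a minimal loading sketch, assuming the checkpoint loads through the standard Transformers GPTQ path with the repo's custom code enabled; the model card's own usage instructions take precedence, and the chat prompt is only an illustration.

```python
# Sketch: load the W4A16 GPTQ checkpoint and run one chat turn.
# Assumes a transformers install with GPTQ kernel support (e.g. gptqmodel)
# and enough GPU memory; defer to the model card for the supported recipe.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ModelCloud/MiniMax-M2-GPTQMODEL-W4A16"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # shard the 4-bit weights across available GPUs
    trust_remote_code=True,  # the repo is tagged custom_code
)

messages = [{"role": "user", "content": "Say hello in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```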
Commit History (main branch)
Add files using upload-large-folder tool · 1e1c096 (verified) · Qubitium committed on Oct 28
Update README.md · 971e327 (verified) · Qubitium committed on Oct 28
Update README.md · c06b04e (verified) · Qubitium committed on Oct 28
Update README.md · b8e2e98 (verified) · Qubitium committed on Oct 28
Update README.md · b86ac22 (verified) · Qubitium committed on Oct 28
initial commit · 2b762b6 (verified) · Qubitium committed on Oct 28