Update README.md
README.md CHANGED
@@ -72,7 +72,7 @@ Today, we release and open source MiniMax-M2, a **Mini** model built for **Max**
 **MiniMax-M2** redefines efficiency for agents. It's a compact, fast, and cost-effective MoE model (230 billion total parameters with 10 billion active parameters) built for elite performance in coding and agentic tasks, all while maintaining powerful general intelligence. With just 10 billion activated parameters, MiniMax-M2 provides the sophisticated, end-to-end tool use performance expected from today's leading models, but in a streamlined form factor that makes deployment and scaling easier than ever.
 
 <p align="center">
-<img width="100%" src="figures/Bench.png">
+<img width="100%" src="https://huggingface.co/MiniMaxAI/MiniMax-M2/resolve/main/figures/Bench.png">
 </p>
 
 ---