danielhanchen committed · verified
Commit 56e88d0 · Parent: a74fd06

Upload folder using huggingface_hub

Files changed (50)
  1. .gitattributes +1 -0
  2. README.md +1369 -3
  3. chat_template.jinja +99 -0
  4. config.json +142 -0
  5. merges.txt +0 -0
  6. model-00001-of-00094.safetensors +3 -0
  7. model-00002-of-00094.safetensors +3 -0
  8. model-00003-of-00094.safetensors +3 -0
  9. model-00004-of-00094.safetensors +3 -0
  10. model-00005-of-00094.safetensors +3 -0
  11. model-00006-of-00094.safetensors +3 -0
  12. model-00007-of-00094.safetensors +3 -0
  13. model-00008-of-00094.safetensors +3 -0
  14. model-00009-of-00094.safetensors +3 -0
  15. model-00010-of-00094.safetensors +3 -0
  16. model-00011-of-00094.safetensors +3 -0
  17. model-00012-of-00094.safetensors +3 -0
  18. model-00013-of-00094.safetensors +3 -0
  19. model-00014-of-00094.safetensors +3 -0
  20. model-00015-of-00094.safetensors +3 -0
  21. model-00016-of-00094.safetensors +3 -0
  22. model-00017-of-00094.safetensors +3 -0
  23. model-00018-of-00094.safetensors +3 -0
  24. model-00019-of-00094.safetensors +3 -0
  25. model-00020-of-00094.safetensors +3 -0
  26. model-00021-of-00094.safetensors +3 -0
  27. model-00022-of-00094.safetensors +3 -0
  28. model-00023-of-00094.safetensors +3 -0
  29. model-00024-of-00094.safetensors +3 -0
  30. model-00025-of-00094.safetensors +3 -0
  31. model-00026-of-00094.safetensors +3 -0
  32. model-00027-of-00094.safetensors +3 -0
  33. model-00028-of-00094.safetensors +3 -0
  34. model-00029-of-00094.safetensors +3 -0
  35. model-00030-of-00094.safetensors +3 -0
  36. model-00031-of-00094.safetensors +3 -0
  37. model-00032-of-00094.safetensors +3 -0
  38. model-00033-of-00094.safetensors +3 -0
  39. model-00034-of-00094.safetensors +3 -0
  40. model-00035-of-00094.safetensors +3 -0
  41. model-00036-of-00094.safetensors +3 -0
  42. model-00037-of-00094.safetensors +3 -0
  43. model-00038-of-00094.safetensors +3 -0
  44. model-00039-of-00094.safetensors +3 -0
  45. model-00040-of-00094.safetensors +3 -0
  46. model-00041-of-00094.safetensors +3 -0
  47. model-00042-of-00094.safetensors +3 -0
  48. model-00043-of-00094.safetensors +3 -0
  49. model-00044-of-00094.safetensors +3 -0
  50. model-00045-of-00094.safetensors +3 -0
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -1,3 +1,1369 @@
- ---
- license: apache-2.0
- ---
---
tags:
- unsloth
base_model:
- Qwen/Qwen3.5-397B-A17B
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen3.5-397B-A17B/blob/main/LICENSE
pipeline_tag: image-text-to-text
---

# Qwen3.5-397B-A17B

<img width="400px" src="https://qianwen-res.oss-accelerate.aliyuncs.com/logo_qwen3.5.png">

[![Qwen Chat](https://img.shields.io/badge/%F0%9F%92%9C%EF%B8%8F%20Qwen%20Chat%20-536af5)](https://chat.qwen.ai)

> [!Note]
> This repository contains model weights and configuration files for the post-trained model in the Hugging Face Transformers format.
>
> These artifacts are compatible with Hugging Face Transformers, vLLM, SGLang, etc.

Over recent months, we have intensified our focus on developing foundation models that deliver exceptional utility and performance. Qwen3.5 represents a significant leap forward, integrating breakthroughs in multimodal learning, architectural efficiency, reinforcement learning scale, and global accessibility to empower developers and enterprises with unprecedented capability and efficiency.

> [!Tip]
> For users seeking managed, scalable inference without infrastructure maintenance, the official Qwen API service is provided by [Alibaba Cloud Model Studio](https://modelstudio.alibabacloud.com/).
>
> In particular, **Qwen3.5-Plus** is the hosted version corresponding to Qwen3.5-397B-A17B, with more production features, e.g., 1M context length by default, official built-in tools, and adaptive tool use.
> For more information, please refer to the [User Guide](https://www.alibabacloud.com/help/en/model-studio/text-generation).

## Qwen3.5 Highlights

Qwen3.5 features the following enhancements:

- **Unified Vision-Language Foundation**: Early fusion training on multimodal tokens achieves cross-generational parity with Qwen3 and outperforms Qwen3-VL models across reasoning, coding, agents, and visual understanding benchmarks.

- **Efficient Hybrid Architecture**: Gated Delta Networks combined with sparse Mixture-of-Experts deliver high-throughput inference with minimal latency and cost overhead.

- **Scalable RL Generalization**: Reinforcement learning scaled across million-agent environments with progressively complex task distributions for robust real-world adaptability.

- **Global Linguistic Coverage**: Expanded support to 201 languages and dialects, enabling inclusive, worldwide deployment with nuanced cultural and regional understanding.

- **Next-Generation Training Infrastructure**: Near-100% multimodal training efficiency compared to text-only training, and asynchronous RL frameworks supporting massive-scale agent scaffolds and environment orchestration.

![Benchmark Results](https://qianwen-res.oss-accelerate.aliyuncs.com/Qwen3.5/Figures/qwen3.5_397b_a17b_score.png)

For more details, please refer to our blog post [Qwen3.5](https://qwen.ai/blog?id=qwen3.5).

## Model Overview

- Type: Causal Language Model with Vision Encoder
- Training Stage: Pre-training & Post-training
- Language Model
  - Number of Parameters: 397B in total and 17B activated
  - Hidden Dimension: 4096
  - Token Embedding: 248,320 (padded)
  - Number of Layers: 60
  - Hidden Layout: 15 \* (3 \* (Gated DeltaNet -> MoE) -> 1 \* (Gated Attention -> MoE))
  - Gated DeltaNet:
    - Number of Linear Attention Heads: 64 for V and 16 for QK
    - Head Dimension: 128
  - Gated Attention:
    - Number of Attention Heads: 32 for Q and 2 for KV
    - Head Dimension: 256
    - Rotary Position Embedding Dimension: 64
  - Mixture of Experts:
    - Number of Experts: 512
    - Number of Activated Experts: 10 Routed + 1 Shared
    - Expert Intermediate Dimension: 1024
  - LM Output: 248,320 (padded)
  - MTP: trained with multiple steps
- Context Length: 262,144 tokens natively, extensible up to 1,010,000 tokens
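
The hybrid layout and activation figures above can be sanity-checked with a short sketch (all numbers are taken from the list above; the function name is ours, purely for illustration):

```python
# Sanity-check the hybrid layer layout: 15 blocks, each containing
# 3 x (Gated DeltaNet -> MoE) followed by 1 x (Gated Attention -> MoE).
def layer_schedule(num_blocks: int = 15) -> list[str]:
    layers = []
    for _ in range(num_blocks):
        layers += ["gated_deltanet"] * 3 + ["gated_attention"]
    return layers

schedule = layer_schedule()
assert len(schedule) == 60                      # matches "Number of Layers: 60"
assert schedule.count("gated_attention") == 15  # one full-attention layer per block

# Activated experts per MoE layer: 10 routed + 1 shared out of 512 total.
activated, total = 10 + 1, 512
print(f"{activated}/{total} experts active (~{activated / total:.1%})")  # -> 11/512 experts active (~2.1%)
```

Both the sparse expert activation and the 3:1 ratio of linear-attention to full-attention layers are what keep the active parameter count at 17B out of 397B.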

> [!Important]
> Qwen3.5 models operate in thinking mode by default, generating thinking content delimited by `<think>\n...</think>\n\n` before producing the final response.
> To disable thinking content and obtain direct responses, refer to the examples [here](#instruct-or-non-thinking-mode).
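
Because the thinking content is delimited by `<think>...</think>`, a response can be split into reasoning and final answer with a few lines of string handling. A minimal sketch (the helper name is ours; production code should prefer any parsing utilities shipped with the model's chat template):

```python
def split_thinking(text: str) -> tuple[str, str]:
    """Separate <think>...</think> reasoning content from the final response."""
    start, end = "<think>", "</think>"
    if start in text and end in text:
        head, _, rest = text.partition(start)
        thinking, _, answer = rest.partition(end)
        return thinking.strip(), (head + answer).strip()
    return "", text.strip()  # no thinking block: everything is the response

thinking, answer = split_thinking("<think>\nLet me check.\n</think>\n\nThe answer is 4.")
print(answer)  # -> The answer is 4.
```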

## Benchmark Results

### Language

|  | GPT5.2 | Claude 4.5 Opus | Gemini-3 Pro | Qwen3-Max-Thinking | K2.5-1T-A32B | Qwen3.5-397B-A17B |
|---|---|---|---|---|---|---|
| **Knowledge** | | | | | | |
| MMLU-Pro | 87.4 | 89.5 | 89.8 | 85.7 | 87.1 | 87.8 |
| MMLU-Redux | 95.0 | 95.6 | 95.9 | 92.8 | 94.5 | 94.9 |
| SuperGPQA | 67.9 | 70.6 | 74.0 | 67.3 | 69.2 | 70.4 |
| C-Eval | 90.5 | 92.2 | 93.4 | 93.7 | 94.0 | 93.0 |
| **Instruction Following** | | | | | | |
| IFEval | 94.8 | 90.9 | 93.5 | 93.4 | 93.9 | 92.6 |
| IFBench | 75.4 | 58.0 | 70.4 | 70.9 | 70.2 | 76.5 |
| MultiChallenge | 57.9 | 54.2 | 64.2 | 63.3 | 62.7 | 67.6 |
| **Long Context** | | | | | | |
| AA-LCR | 72.7 | 74.0 | 70.7 | 68.7 | 70.0 | 68.7 |
| LongBench v2 | 54.5 | 64.4 | 68.2 | 60.6 | 61.0 | 63.2 |
| **STEM** | | | | | | |
| GPQA | 92.4 | 87.0 | 91.9 | 87.4 | 87.6 | 88.4 |
| HLE | 35.5 | 30.8 | 37.5 | 30.2 | 30.1 | 28.7 |
| HLE-Verified¹ | 43.3 | 38.8 | 48 | 37.6 | -- | 37.6 |
| **Reasoning** | | | | | | |
| LiveCodeBench v6 | 87.7 | 84.8 | 90.7 | 85.9 | 85.0 | 83.6 |
| HMMT Feb 25 | 99.4 | 92.9 | 97.3 | 98.0 | 95.4 | 94.8 |
| HMMT Nov 25 | 100 | 93.3 | 93.3 | 94.7 | 91.1 | 92.7 |
| IMOAnswerBench | 86.3 | 84.0 | 83.3 | 83.9 | 81.8 | 80.9 |
| AIME26 | 96.7 | 93.3 | 90.6 | 93.3 | 93.3 | 91.3 |
| **General Agent** | | | | | | |
| BFCL-V4 | 63.1 | 77.5 | 72.5 | 67.7 | 68.3 | 72.9 |
| TAU2-Bench | 87.1 | 91.6 | 85.4 | 84.6 | 77.0 | 86.7 |
| VITA-Bench | 38.2 | 56.3 | 51.6 | 40.9 | 41.9 | 49.7 |
| DeepPlanning | 44.6 | 33.9 | 23.3 | 28.7 | 14.5 | 34.3 |
| Tool Decathlon | 43.8 | 43.5 | 36.4 | 18.8 | 27.8 | 38.3 |
| MCP-Mark | 57.5 | 42.3 | 53.9 | 33.5 | 29.5 | 46.1 |
| **Search Agent³** | | | | | | |
| HLE w/ tool | 45.5 | 43.4 | 45.8 | 49.8 | 50.2 | 48.3 |
| BrowseComp | 65.8 | 67.8 | 59.2 | 53.9 | --/74.9 | 69.0/78.6 |
| BrowseComp-zh | 76.1 | 62.4 | 66.8 | 60.9 | -- | 70.3 |
| WideSearch | 76.8 | 76.4 | 68.0 | 57.9 | 72.7 | 74.0 |
| Seal-0 | 45.0 | 47.7 | 45.5 | 46.9 | 57.4 | 46.9 |
| **Multilingualism** | | | | | | |
| MMMLU | 89.5 | 90.1 | 90.6 | 84.4 | 86.0 | 88.5 |
| MMLU-ProX | 83.7 | 85.7 | 87.7 | 78.5 | 82.3 | |
372
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.7</td>
373
+ </tr>
374
+ <tr>
375
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">NOVA-63</td>
376
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">54.6</td>
377
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">56.7</td>
378
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">56.7</td>
379
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">54.2</td>
380
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">56.0</td>
381
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">59.1</td>
382
+ </tr>
383
+ <tr>
384
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">INCLUDE</td>
385
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.5</td>
386
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86.2</td>
387
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">90.5</td>
388
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">82.3</td>
389
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.3</td>
390
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.6</td>
391
+ </tr>
392
+ <tr>
393
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">Global PIQA</td>
394
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">90.9</td>
395
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">91.6</td>
396
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">93.2</td>
397
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86.0</td>
398
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">89.3</td>
399
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">89.8</td>
400
+ </tr>
401
+ <tr>
402
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">PolyMATH</td>
403
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">62.5</td>
404
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.0</td>
405
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.6</td>
406
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">64.7</td>
407
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">43.1</td>
408
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">73.3</td>
409
+ </tr>
410
+ <tr>
411
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">WMT24++</td>
412
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">78.8</td>
413
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.7</td>
414
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.7</td>
415
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.6</td>
416
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.6</td>
417
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">78.9</td>
418
+ </tr>
419
+ <tr>
420
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MAXIFE</td>
421
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">88.4</td>
422
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.2</td>
423
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.5</td>
424
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.0</td>
425
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">72.8</td>
426
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">88.2</td>
427
+ </tr>
428
+ <tr><td colspan="7" style="padding:8px 12px;font-weight:600;color:#7c3aed;border-bottom:1px solid #e5e7eb;background:#faf5ff">Coding Agent</td></tr>
429
+ <tr>
430
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">SWE-bench Verified</td>
431
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.0</td>
432
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.9</td>
433
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">76.2</td>
434
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">75.3</td>
435
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">76.8</td>
436
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">76.4</td>
437
+ </tr>
438
+ <tr>
439
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">SWE-bench Multilingual</td>
440
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">72.0</td>
441
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.5</td>
442
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">65.0</td>
443
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">66.7</td>
444
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">73.0</td>
445
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">72.0</td>
446
+ </tr>
447
+ <tr>
448
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">SecCodeBench</td>
449
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">68.7</td>
450
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">68.6</td>
451
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">62.4</td>
452
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">57.5</td>
453
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">61.3</td>
454
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">68.3</td>
455
+ </tr>
456
+ <tr>
457
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">Terminal Bench 2</td>
458
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">54.0</td>
459
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">59.3</td>
460
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">54.2</td>
461
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">22.5</td>
462
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">50.8</td>
463
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">52.5</td>
464
+ </tr>
465
+ </tbody>
466
+ </table>
467
+
468
+ <p style="margin-top:12px;font-size:11px;color:#888">
469
+ * HLE-Verified: a verified and revised version of Humanity’s Last Exam (HLE), accompanied by a transparent, component-wise verification protocol and a fine-grained error taxonomy. We open-source the dataset at https://huggingface.co/datasets/skylenage/HLE-Verified.<br>
470
+ * TAU2-Bench: we follow the official setup except for the airline domain, where all models are evaluated with the fixes proposed in the Claude Opus 4.5 system card.<br>
471
+ * MCPMark: the GitHub MCP server uses v0.30.3 from api.githubcopilot.com; Playwright tool responses are truncated at 32k tokens.<br>
472
+ * Search Agent: most search agents built on our model adopt a simple context-folding strategy (256k): once the cumulative tool-response length reaches a preset threshold, earlier tool responses are pruned from the history to keep the context within limits.<br>
473
+ * BrowseComp: we tested two strategies: simple context-folding achieved a score of 69.0, while the same discard-all strategy used by DeepSeek-V3.2 and Kimi K2.5 achieved 78.6.<br>
474
+ * WideSearch: we use a 256k context window without any context management.<br>
475
+ * MMLU-ProX: we report the average accuracy across 29 languages.<br>
476
+ * WMT24++: a harder subset of WMT24 after difficulty labeling and rebalancing; we report the average scores across 55 languages using XCOMET-XXL.<br>
477
+ * MAXIFE: we report the accuracy on English + multilingual original prompts (23 settings in total).<br>
478
+ * Empty cells (--) indicate scores not yet available or not applicable.<br>
479
+ </p>
480
+
481
+ </div>
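The simple context-folding strategy described in the notes above can be sketched as follows. This is a minimal illustration with assumed message structure and field names (`role`, `content`, a `"[pruned]"` stub), not the actual serving implementation:

```python
def fold_context(messages, max_tool_chars=256_000):
    """Prune the oldest tool responses from a chat history until the
    cumulative tool-response length fits under max_tool_chars.

    Assumed message format: {"role": "user"|"assistant"|"tool", "content": str}.
    Earlier tool responses are replaced with a short stub rather than
    dropped, so the turn structure of the conversation is preserved.
    """
    stub = "[pruned]"
    total = sum(len(m["content"]) for m in messages if m["role"] == "tool")
    folded = [dict(m) for m in messages]  # do not mutate the caller's history
    for m in folded:
        if total <= max_tool_chars:
            break
        if m["role"] == "tool" and m["content"] != stub:
            total -= len(m["content"]) - len(stub)
            m["content"] = stub
    return folded
```

The threshold here stands in for the 256k-token budget mentioned above (measured in characters for simplicity); a real implementation would count tokens and fold at the serving layer.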
482
+
483
+ ### Vision Language
484
+
485
+ <div style="font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,sans-serif;color:#1a1a2e;max-width:900px;margin:0 auto;padding:16px 0">
486
+ <table style="width:100%;border-collapse:collapse;font-size:13px">
487
+ <thead><tr>
488
+ <th style="padding:10px 12px;text-align:left;font-weight:600;border-bottom:2px solid #7c3aed;color:#4c1d95"></th>
489
+ <th style="padding:10px 12px;text-align:center;font-weight:600;border-bottom:2px solid #7c3aed;color:#4c1d95">GPT5.2</th>
490
+ <th style="padding:10px 12px;text-align:center;font-weight:600;border-bottom:2px solid #7c3aed;color:#4c1d95">Claude 4.5 Opus</th>
491
+ <th style="padding:10px 12px;text-align:center;font-weight:600;border-bottom:2px solid #7c3aed;color:#4c1d95">Gemini-3 Pro</th>
492
+ <th style="padding:10px 12px;text-align:center;font-weight:600;border-bottom:2px solid #7c3aed;color:#4c1d95">Qwen3-VL-235B-A22B</th>
493
+ <th style="padding:10px 12px;text-align:center;font-weight:600;border-bottom:2px solid #7c3aed;color:#4c1d95">K2.5-1T-A32B</th>
494
+ <th style="padding:10px 12px;text-align:center;font-weight:600;border-bottom:2px solid #7c3aed;color:#4c1d95">Qwen3.5-397B-A17B</th>
495
+ </tr></thead>
496
+ <tbody>
497
+ <tr><td colspan="7" style="padding:8px 12px;font-weight:600;color:#7c3aed;border-bottom:1px solid #e5e7eb;background:#faf5ff">STEM and Puzzle</td></tr>
498
+ <tr>
499
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MMMU</td>
500
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86.7</td>
501
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.7</td>
502
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.2</td>
503
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.6</td>
504
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.3</td>
505
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.0</td>
506
+ </tr>
507
+ <tr>
508
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MMMU-Pro</td>
509
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.5</td>
510
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">70.6</td>
511
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.0</td>
512
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">69.3</td>
513
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">78.5</td>
514
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.0</td>
515
+ </tr>
516
+ <tr>
517
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MathVision</td>
518
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.0</td>
519
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">74.3</td>
520
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86.6</td>
521
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">74.6</td>
522
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.2</td>
523
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">88.6</td>
524
+ </tr>
525
+ <tr>
526
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MathVista (mini)</td>
527
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.1</td>
528
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.0</td>
529
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.9</td>
530
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.8</td>
531
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">90.1</td>
532
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">90.3</td>
533
+ </tr>
534
+ <tr>
535
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">We-Math</td>
536
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.0</td>
537
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">70.0</td>
538
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86.9</td>
539
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">74.8</td>
540
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.7</td>
541
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.9</td>
542
+ </tr>
543
+ <tr>
544
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">DynaMath</td>
545
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86.8</td>
546
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.7</td>
547
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.1</td>
548
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">82.8</td>
549
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.4</td>
550
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86.3</td>
551
+ </tr>
552
+ <tr>
553
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">ZEROBench</td>
554
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">9</td>
555
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">3</td>
556
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">10</td>
557
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">4</td>
558
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">9</td>
559
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">12</td>
560
+ </tr>
561
+ <tr>
562
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">ZEROBench_sub</td>
563
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">33.2</td>
564
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">28.4</td>
565
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">39.0</td>
566
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">28.4</td>
567
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">33.5</td>
568
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">41.0</td>
569
+ </tr>
570
+ <tr>
571
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">BabyVision</td>
572
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">34.4</td>
573
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">14.2</td>
574
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">49.7</td>
575
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">22.2</td>
576
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">36.5</td>
577
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">52.3/43.3</td>
578
+ </tr>
579
+ <tr><td colspan="7" style="padding:8px 12px;font-weight:600;color:#7c3aed;border-bottom:1px solid #e5e7eb;background:#faf5ff">General VQA</td></tr>
580
+ <tr>
581
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">RealWorldQA</td>
582
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.3</td>
583
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.0</td>
584
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.3</td>
585
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.3</td>
586
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.0</td>
587
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.9</td>
588
+ </tr>
589
+ <tr>
590
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MMStar</td>
591
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.1</td>
592
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">73.2</td>
593
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.1</td>
594
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">78.7</td>
595
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.5</td>
596
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.8</td>
597
+ </tr>
598
+ <tr>
599
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">HallusionBench</td>
600
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">65.2</td>
601
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">64.1</td>
602
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">68.6</td>
603
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">66.7</td>
604
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">69.8</td>
605
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">71.4</td>
606
+ </tr>
607
+ <tr>
608
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MMBench<sub>EN-DEV-v1.1</sub></td>
609
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">88.2</td>
610
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">89.2</td>
611
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">93.7</td>
612
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">89.7</td>
613
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">94.2</td>
614
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">93.7</td>
615
+ </tr>
616
+ <tr>
617
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">SimpleVQA</td>
618
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">55.8</td>
619
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">65.7</td>
620
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">73.2</td>
621
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">61.3</td>
622
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">71.2</td>
623
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">67.1</td>
624
+ </tr>
625
+ <tr><td colspan="7" style="padding:8px 12px;font-weight:600;color:#7c3aed;border-bottom:1px solid #e5e7eb;background:#faf5ff">Text Recognition and Document Understanding</td></tr>
626
+ <tr>
627
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">OmniDocBench1.5</td>
628
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.7</td>
629
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.7</td>
630
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">88.5</td>
631
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.5</td>
632
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">88.8</td>
633
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">90.8</td>
634
+ </tr>
635
+ <tr>
636
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">CharXiv(RQ)</td>
637
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">82.1</td>
638
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">68.5</td>
639
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.4</td>
640
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">66.1</td>
641
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.5</td>
642
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.8</td>
643
+ </tr>
644
+ <tr>
645
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MMLongBench-Doc</td>
646
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
647
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">61.9</td>
648
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">60.5</td>
649
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">56.2</td>
650
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">58.5</td>
651
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">61.5</td>
652
+ </tr>
653
+ <tr>
654
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">CC-OCR</td>
655
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">70.3</td>
656
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">76.9</td>
657
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.0</td>
658
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.5</td>
659
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.7</td>
660
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">82.0</td>
661
+ </tr>
662
+ <tr>
663
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">AI2D_TEST</td>
664
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">92.2</td>
665
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.7</td>
666
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">94.1</td>
667
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">89.2</td>
668
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">90.8</td>
669
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">93.9</td>
670
+ </tr>
671
+ <tr>
672
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">OCRBench</td>
673
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.7</td>
674
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.8</td>
675
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">90.4</td>
676
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.5</td>
677
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">92.3</td>
678
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">93.1</td>
679
+ </tr>
680
+ <tr><td colspan="7" style="padding:8px 12px;font-weight:600;color:#7c3aed;border-bottom:1px solid #e5e7eb;background:#faf5ff">Spatial Intelligence</td></tr>
681
+ <tr>
682
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">ERQA</td>
683
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">59.8</td>
684
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">46.8</td>
685
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">70.5</td>
686
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">52.5</td>
687
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
688
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">67.5</td>
689
+ </tr>
690
+ <tr>
691
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">CountBench</td>
692
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">91.9</td>
693
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">90.6</td>
694
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">97.3</td>
695
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">93.7</td>
696
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">94.1</td>
697
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">97.2</td>
698
+ </tr>
699
+ <tr>
700
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">RefCOCO(avg)</td>
701
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
702
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
703
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.1</td>
704
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">91.1</td>
705
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.8</td>
706
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">92.3</td>
707
+ </tr>
708
+ <tr>
709
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">ODInW13</td>
710
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
711
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
712
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">46.3</td>
713
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">43.2</td>
714
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
715
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">47.0</td>
716
+ </tr>
717
+ <tr>
718
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">EmbSpatialBench</td>
719
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.3</td>
720
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">75.7</td>
721
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">61.2</td>
722
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.3</td>
723
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.4</td>
724
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.5</td>
725
+ </tr>
726
+ <tr>
727
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">RefSpatialBench</td>
728
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
729
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
730
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">65.5</td>
731
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">69.9</td>
732
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
733
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">73.6</td>
734
+ </tr>
735
+ <tr>
736
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">LingoQA</td>
737
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">68.8</td>
738
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">78.8</td>
739
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">72.8</td>
740
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">66.8</td>
741
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">68.2</td>
742
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.6</td>
743
+ </tr>
744
+ <tr>
745
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">V*</td>
746
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">75.9</td>
747
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">67.0</td>
748
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">88.0</td>
749
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.9</td>
750
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.0</td>
751
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">95.8/91.1</td>
752
+ </tr>
753
+ <tr>
754
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">Hypersim</td>
755
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
756
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
757
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
758
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">11.0</td>
759
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
760
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">12.5</td>
761
+ </tr>
762
+ <tr>
763
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">SUNRGBD</td>
764
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
765
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
766
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
767
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">34.9</td>
768
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
769
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">38.3</td>
770
+ </tr>
771
+ <tr>
772
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">NuScenes</td>
773
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
774
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
775
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
776
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">13.9</td>
777
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
778
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">16.0</td>
779
+ </tr>
780
+ <tr><td colspan="7" style="padding:8px 12px;font-weight:600;color:#7c3aed;border-bottom:1px solid #e5e7eb;background:#faf5ff">Video Understanding</td></tr>
781
+ <tr>
782
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">VideoMME (w sub.)</td>
783
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86</td>
784
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.6</td>
785
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">88.4</td>
786
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.8</td>
787
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.4</td>
788
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.5</td>
789
+ </tr>
790
+ <tr>
791
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">VideoMME (w/o sub.)</td>
792
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.8</td>
793
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.4</td>
794
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.7</td>
795
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.0</td>
796
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.2</td>
797
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.7</td>
798
+ </tr>
799
+ <tr>
800
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">VideoMMMU</td>
801
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.9</td>
802
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.4</td>
803
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.6</td>
804
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.0</td>
805
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86.6</td>
806
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">84.7</td>
807
+ </tr>
808
+ <tr>
809
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MLVU (M-Avg)</td>
810
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.6</td>
811
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.7</td>
812
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.0</td>
813
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">83.8</td>
814
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.0</td>
815
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">86.7</td>
816
+ </tr>
817
+ <tr>
818
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MVBench</td>
819
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">78.1</td>
820
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">67.2</td>
821
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">74.1</td>
822
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">75.2</td>
823
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">73.5</td>
824
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.6</td>
825
+ </tr>
826
+ <tr>
827
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">LVBench</td>
828
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">73.7</td>
829
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">57.3</td>
830
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">76.2</td>
831
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">63.6</td>
832
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">75.9</td>
833
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">75.5</td>
834
+ </tr>
835
+ <tr>
836
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MMVU</td>
837
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.8</td>
838
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.3</td>
839
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">77.5</td>
840
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">71.1</td>
841
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.4</td>
842
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">75.4</td>
843
+ </tr>
844
+ <tr><td colspan="7" style="padding:8px 12px;font-weight:600;color:#7c3aed;border-bottom:1px solid #e5e7eb;background:#faf5ff">Visual Agent</td></tr>
845
+ <tr>
846
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">ScreenSpot Pro</td>
847
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
848
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">45.7</td>
849
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">72.7</td>
850
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">62.0</td>
851
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
852
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">65.6</td>
853
+ </tr>
854
+ <tr>
855
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">OSWorld-Verified</td>
856
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">38.2</td>
857
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">66.3</td>
858
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
859
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">38.1</td>
860
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">63.3</td>
861
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">62.2</td>
862
+ </tr>
863
+ <tr>
864
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">AndroidWorld</td>
865
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
866
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
867
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
868
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">63.7</td>
869
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">--</td>
870
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">66.8</td>
871
+ </tr>
872
+ <tr><td colspan="7" style="padding:8px 12px;font-weight:600;color:#7c3aed;border-bottom:1px solid #e5e7eb;background:#faf5ff">Medical</td></tr>
873
+ <tr>
874
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">VQA-RAD</td>
875
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">69.8</td>
876
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">65.6</td>
877
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">74.5</td>
878
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">65.4</td>
879
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.9</td>
880
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">76.3</td>
881
+ </tr>
882
+ <tr>
883
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">SLAKE</td>
884
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">76.9</td>
885
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">76.4</td>
886
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.3</td>
887
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">54.7</td>
888
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">81.6</td>
889
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">79.9</td>
890
+ </tr>
891
+ <tr>
892
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">OM-VQA</td>
893
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">72.9</td>
894
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">75.5</td>
895
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">80.3</td>
896
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">65.4</td>
897
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">87.4</td>
898
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">85.1</td>
899
+ </tr>
900
+ <tr>
901
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">PMC-VQA</td>
902
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">58.9</td>
903
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">59.9</td>
904
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">62.3</td>
905
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">41.2</td>
906
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">63.3</td>
907
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">64.2</td>
908
+ </tr>
909
+ <tr>
910
+ <td style="padding:7px 12px;padding-left:20px;border-bottom:1px solid #f0f0f0;color:#444">MedXpertQA-MM</td>
911
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">73.3</td>
912
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">63.6</td>
913
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">76.0</td>
914
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">47.6</td>
915
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">65.3</td>
916
+ <td style="padding:7px 12px;text-align:center;border-bottom:1px solid #f0f0f0">70.0</td>
917
+ </tr>
918
+ </tbody>
919
+ </table>
920
+
921
+ <p style="margin-top:12px;font-size:11px;color:#888">
922
+ * MathVision: our model’s score is evaluated using a fixed prompt, e.g., “Please reason step by step, and put your final answer within \boxed{}.” For other models, we report the higher score between runs with and without the \boxed{} formatting.<br>
923
+ * BabyVision: our model’s score is reported with CI (Code Interpreter) enabled; without CI, the result is 43.3.<br>
924
+ * V*: our model’s score is reported with CI (Code Interpreter) enabled; without CI, the result is 91.1.<br>
925
+ * Empty cells (--) indicate scores not yet available or not applicable.<br>
926
+ </p>
927
+
928
+ </div>
929
+
930
+
931
+ ## Quickstart
932
+
933
+ For streamlined integration, we recommend using Qwen3.5 via APIs. Below is a guide to using Qwen3.5 via an OpenAI-compatible API. For programmatic inference or offline batch processing, please consult our [documentation](https://qwenlm.github.io/Qwen3.5).
934
+
935
+ ### Serving Qwen3.5
936
+
937
+ Qwen3.5 can be served via APIs with popular inference frameworks.
938
+ In the following, we show example commands to launch OpenAI-Compatible API servers for Qwen3.5 models.
939
+
940
+
941
+ > [!Important]
942
+ > Inference efficiency and throughput vary significantly across frameworks.
943
+ > We recommend using the latest framework versions to ensure optimal performance and compatibility.
944
+ > For production workloads or high-throughput scenarios, dedicated serving engines such as SGLang or vLLM are strongly recommended.
945
+
946
+ > [!Important]
947
+ > The model has a default context length of 262,144 tokens.
948
+ > If you encounter out-of-memory (OOM) errors, consider reducing the context window.
949
+ > However, because Qwen3.5 leverages extended context for complex tasks, we advise maintaining a context length of at least 128K tokens to preserve thinking capabilities.
950
+
951
+ #### SGLang
952
+
953
+ [SGLang](https://github.com/sgl-project/sglang) is a fast serving framework for large language models and vision language models.
954
+ Qwen3.5 requires SGLang built from the main branch of the open-source repository, which can be installed in a fresh environment using the following command:
955
+ ```shell
956
+ uv pip install 'git+https://github.com/sgl-project/sglang.git#subdirectory=python&egg=sglang[all]'
957
+ ```
958
+ See [its documentation](https://docs.sglang.ai/get_started/install.html) for more details.
959
+
960
+ The following will create API endpoints at `http://localhost:8000/v1`:
961
+
962
+ - **Standard Version**: The following command can be used to create an API endpoint with maximum context length 262,144 tokens using tensor parallel on 8 GPUs.
963
+
964
+ ```shell
965
+ python -m sglang.launch_server --model-path Qwen/Qwen3.5-397B-A17B --port 8000 --tp-size 8 --mem-fraction-static 0.8 --context-length 262144 --mamba-ssm-dtype float32 --reasoning-parser qwen3
966
+ ```
967
+
968
+ - **Tool Use**: To support tool use, you can use the following command.
969
+
970
+ ```shell
971
+ python -m sglang.launch_server --model-path Qwen/Qwen3.5-397B-A17B --port 8000 --tp-size 8 --mem-fraction-static 0.8 --context-length 262144 --mamba-ssm-dtype float32 --reasoning-parser qwen3 --tool-call-parser qwen3_coder
972
+ ```
973
+
974
+ - **Multi-Token Prediction (MTP)**: The following command is recommended for MTP:
975
+
976
+ ```shell
977
+ python -m sglang.launch_server --model-path Qwen/Qwen3.5-397B-A17B --port 8000 --tp-size 8 --mem-fraction-static 0.8 --context-length 262144 --mamba-ssm-dtype float32 --reasoning-parser qwen3 --speculative-algo NEXTN --speculative-num-steps 3 --speculative-eagle-topk 1 --speculative-num-draft-tokens 4
978
+ ```
979
+
980
+ #### vLLM
981
+
982
+ [vLLM](https://github.com/vllm-project/vllm) is a high-throughput and memory-efficient inference and serving engine for LLMs.
983
+ Qwen3.5 requires vLLM built from the main branch of the open-source repository, which can be installed in a fresh environment using the following command:
984
+ ```shell
985
+ uv pip install vllm --torch-backend=auto --extra-index-url https://wheels.vllm.ai/nightly
986
+ ```
987
+ See [its documentation](https://docs.vllm.ai/en/stable/getting_started/installation/index.html) for more details.
988
+
989
+ For detailed Qwen3.5 usage guide, see the [vLLM Qwen3.5 recipe](https://docs.vllm.ai/projects/recipes/en/latest/Qwen/Qwen3.5.html).
990
+
991
+ The following will create API endpoints at `http://localhost:8000/v1`:
992
+
993
+ - **Standard Version**: The following command can be used to create an API endpoint with maximum context length 262,144 tokens using tensor parallel on 8 GPUs.
994
+
995
+ ```shell
996
+ vllm serve Qwen/Qwen3.5-397B-A17B --port 8000 --tensor-parallel-size 8 --max-model-len 262144 --mamba-ssm-cache-dtype float32 --reasoning-parser qwen3
997
+ ```
998
+
999
+ - **Tool Call**: To support tool use, you can use the following command.
1000
+
1001
+ ```shell
1002
+ vllm serve Qwen/Qwen3.5-397B-A17B --port 8000 --tensor-parallel-size 8 --max-model-len 262144 --mamba-ssm-cache-dtype float32 --reasoning-parser qwen3 --enable-auto-tool-choice --tool-call-parser qwen3_coder
1003
+ ```
1004
+
1005
+ - **Multi-Token Prediction (MTP)**: The following command is recommended for MTP:
1006
+
1007
+ ```shell
1008
+ vllm serve Qwen/Qwen3.5-397B-A17B --port 8000 --tensor-parallel-size 8 --max-model-len 262144 --mamba-ssm-cache-dtype float32 --reasoning-parser qwen3 --speculative-config '{"method":"qwen3_next_mtp","num_speculative_tokens":2}'
1009
+ ```
1010
+
1011
+ - **Text-Only**: The following command skips the vision encoder and multimodal profiling to free up memory for additional KV cache:
1012
+
1013
+ ```shell
1014
+ vllm serve Qwen/Qwen3.5-397B-A17B --port 8000 --tensor-parallel-size 8 --max-model-len 262144 --mamba-ssm-cache-dtype float32 --reasoning-parser qwen3 --limit-mm-per-prompt.video 0 --limit-mm-per-prompt.image 0
1015
+ ```
1016
+
1017
+ > [!Tip]
1018
+ > Because vLLM defaults to `--mamba-ssm-cache-dtype auto`, which resolves to `bfloat16` for Qwen3.5, the model's generation quality may suffer unless you explicitly set it to higher precision, i.e., `--mamba-ssm-cache-dtype float32`.
1019
+
1020
+ #### Hugging Face Transformers
1021
+
1022
+ Hugging Face Transformers contains a _lightweight_ server which can be used for quick testing and moderate load deployment.
1023
+ The latest `transformers` is required for Qwen3.5:
1024
+ ```shell
1025
+ pip install "transformers[serving] @ git+https://github.com/huggingface/transformers.git@main"
1026
+ ```
1027
+ See [its documentation](https://huggingface.co/docs/transformers/main/serving) for more details.
1028
+
1029
+ Then, run `transformers serve` to launch a server with API endpoints at `http://localhost:8000/v1`; it will place the model on accelerators if available:
1030
+ ```shell
1031
+ transformers serve --force-model Qwen/Qwen3.5-397B-A17B --port 8000 --continuous-batching
1032
+ ```
1033
+
1034
+ ### Using Qwen3.5 via the Chat Completions API
1035
+
1036
+ The chat completions API is accessible via standard HTTP requests or OpenAI SDKs.
1037
+ Here, we show examples using the OpenAI Python SDK.
1038
+
1039
+ Before starting, make sure the SDK is installed and that the API key and the API base URL are configured, e.g.:
1040
+ ```shell
1041
+ pip install -U openai
1042
+
1043
+ # Set the following accordingly
1044
+ export OPENAI_BASE_URL="http://localhost:8000/v1"
1045
+ export OPENAI_API_KEY="EMPTY"
1046
+ ```
1047
+
1048
+ > [!Tip]
1049
+ > We recommend using the following set of sampling parameters for generation:
1050
+ > - Thinking mode: `temperature=0.6, top_p=0.95, top_k=20, min_p=0.0, presence_penalty=0.0, repetition_penalty=1.0`
1051
+ > - Instruct (or non-thinking) mode: `temperature=0.7, top_p=0.8, top_k=20, min_p=0.0, presence_penalty=1.5, repetition_penalty=1.0`
1052
+
1053
+ #### Text-Only Input
1054
+
1055
+ ```python
1056
+ from openai import OpenAI
1057
+ # Configured by environment variables
1058
+ client = OpenAI()
1059
+
1060
+ messages = [
1061
+ {"role": "user", "content": "Type \"I love Qwen3.5\" backwards"},
1062
+ ]
1063
+
1064
+ chat_response = client.chat.completions.create(
1065
+ model="Qwen/Qwen3.5-397B-A17B",
1066
+ messages=messages,
1067
+ max_tokens=81920,
1068
+ temperature=0.6,
1069
+ top_p=0.95,
1070
+ extra_body={
1071
+ "top_k": 20,
1072
+ },
1073
+ )
1074
+ print("Chat response:", chat_response)
1075
+ ```
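When the server is launched without a reasoning parser (the serving commands above pass `--reasoning-parser qwen3`, which typically exposes the thinking as a separate `reasoning_content` field on the message), the thinking content arrives inline in `message.content`, wrapped in `<think>...</think>` tags. A minimal client-side sketch for separating the two parts, assuming that tag convention:

```python
import re

def split_thinking(content: str) -> tuple[str, str]:
    """Split a leading <think>...</think> block from the final answer.

    Assumes the Qwen-style convention of a single leading <think> block;
    returns (thinking, answer). If no block is present, thinking is empty.
    """
    match = re.match(r"\s*<think>(.*?)</think>\s*(.*)", content, flags=re.DOTALL)
    if match:
        return match.group(1).strip(), match.group(2).strip()
    return "", content.strip()

# Example raw content for the query above (illustrative, not a real model output)
thinking, answer = split_thinking("<think>Reverse each character.</think>5.3newQ evol I")
```

When the reasoning parser is enabled, no client-side splitting is needed; read `message.reasoning_content` and `message.content` directly.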
1076
+
1077
+
1078
+ #### Image Input
1079
+
1080
+ ```python
1081
+ from openai import OpenAI
1082
+ # Configured by environment variables
1083
+ client = OpenAI()
1084
+
1085
+ messages = [
1086
+ {
1087
+ "role": "user",
1088
+ "content": [
1089
+ {
1090
+ "type": "image_url",
1091
+ "image_url": {
1092
+ "url": "https://qianwen-res.oss-accelerate.aliyuncs.com/Qwen3.5/demo/CI_Demo/mathv-1327.jpg"
1093
+ }
1094
+ },
1095
+ {
1096
+ "type": "text",
1097
+ "text": "The centres of the four illustrated circles are in the corners of the square. The two big circles touch each other and also the two little circles. With which factor do you have to multiply the radii of the little circles to obtain the radius of the big circles?\nChoices:\n(A) $\\frac{2}{9}$\n(B) $\\sqrt{5}$\n(C) $0.8 \\cdot \\pi$\n(D) 2.5\n(E) $1+\\sqrt{2}$"
1098
+ }
1099
+ ]
1100
+ }
1101
+ ]
1102
+
1103
+ response = client.chat.completions.create(
1104
+ model="Qwen/Qwen3.5-397B-A17B",
1105
+ messages=messages,
1106
+ max_tokens=81920,
1107
+ temperature=0.6,
1108
+ top_p=0.95,
1109
+ extra_body={
1110
+ "top_k": 20,
1111
+ },
1112
+ )
1113
+ print("Chat response:", response)
1114
+ ```
1115
+
1116
+ #### Video Input
1117
+
1118
+ ```python
1119
+ from openai import OpenAI
1120
+ # Configured by environment variables
1121
+ client = OpenAI()
1122
+
1123
+ messages = [
1124
+ {
1125
+ "role": "user",
1126
+ "content": [
1127
+ {
1128
+ "type": "video_url",
1129
+ "video_url": {
1130
+ "url": "https://qianwen-res.oss-accelerate.aliyuncs.com/Qwen3.5/demo/video/N1cdUjctpG8.mp4"
1131
+ }
1132
+ },
1133
+ {
1134
+ "type": "text",
1135
+ "text": "How many porcelain jars were discovered in the niches located in the primary chamber of the tomb?"
1136
+ }
1137
+ ]
1138
+ }
1139
+ ]
1140
+
1141
+ # When vLLM is launched with `--media-io-kwargs '{"video": {"num_frames": -1}}'`,
1142
+ # video frame sampling can be configured via `extra_body` (e.g., by setting `fps`).
1143
+ # This feature is currently supported only in vLLM.
1144
+ #
1145
+ # By default, `fps=2` and `do_sample_frames=True`.
1146
+ # With `do_sample_frames=True`, you can customize the `fps` value to set your desired video sampling rate.
1147
+ response = client.chat.completions.create(
1148
+ model="Qwen/Qwen3.5-397B-A17B",
1149
+ messages=messages,
1150
+ max_tokens=81920,
1151
+ temperature=0.6,
1152
+ top_p=0.95,
1153
+ extra_body={
1154
+ "top_k": 20,
1155
+ "mm_processor_kwargs": {"fps": 2, "do_sample_frames": True},
1156
+ },
1157
+ )
1158
+
1159
+ print("Chat response:", response)
1160
+ ```
1161
+
1162
+ #### Instruct (or Non-Thinking) Mode
1163
+
1164
+ > [!Important]
1165
+ > Qwen3.5 does not officially support the soft switch of Qwen3, i.e., `/think` and `/nothink`.
1166
+
1167
+ Qwen3.5 thinks by default before responding.
1168
+ You can obtain a direct response from the model without thinking by configuring the API parameters.
1169
+ For example,
1170
+ ```python
1171
+ from openai import OpenAI
1172
+ # Configured by environment variables
1173
+ client = OpenAI()
1174
+
1175
+ messages = [
1176
+ {
1177
+ "role": "user",
1178
+ "content": [
1179
+ {
1180
+ "type": "image_url",
1181
+ "image_url": {
1182
+ "url": "https://qianwen-res.oss-accelerate.aliyuncs.com/Qwen3.5/demo/RealWorld/RealWorld-04.png"
1183
+ }
1184
+ },
1185
+ {
1186
+ "type": "text",
1187
+ "text": "Where is this?"
1188
+ }
1189
+ ]
1190
+ }
1191
+ ]
1192
+
1193
+ chat_response = client.chat.completions.create(
1194
+ model="Qwen/Qwen3.5-397B-A17B",
1195
+ messages=messages,
1196
+ max_tokens=32768,
1197
+ temperature=0.7,
1198
+ top_p=0.8,
1199
+ presence_penalty=1.5,
1200
+ extra_body={
1201
+ "top_k": 20,
1202
+ "chat_template_kwargs": {"enable_thinking": False},
1203
+ },
1204
+ )
1205
+ print("Chat response:", chat_response)
1206
+ ```
1207
+
1208
+ > [!Note]
1209
+ > If you are using APIs from Alibaba Cloud Model Studio, in addition to changing `model`, please use `"enable_thinking": False` instead of `"chat_template_kwargs": {"enable_thinking": False}`.
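The two parameter shapes differ only in where the flag lives. A small helper (hypothetical, for illustration only) that builds the right `extra_body` for each backend:

```python
def thinking_extra_body(enable: bool, backend: str = "local") -> dict:
    """Build the extra_body payload that toggles thinking mode.

    "local" covers vLLM/SGLang-style servers, which read the flag from
    chat_template_kwargs; "dashscope" covers Alibaba Cloud Model Studio,
    which expects a top-level enable_thinking field.
    """
    if backend == "dashscope":
        return {"enable_thinking": enable}
    return {"chat_template_kwargs": {"enable_thinking": enable}}
```

Pass the returned dict as `extra_body` in `client.chat.completions.create(...)`.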
1210
+
1211
+
1212
+ ## Agentic Usage
1213
+
1214
+ Qwen3.5 excels in tool calling capabilities.
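With a server launched using one of the tool-call parser commands above, tools are declared in the standard OpenAI Chat Completions format. A minimal sketch of one declaration follows; `get_weather` and its parameters are hypothetical examples, not part of any Qwen API:

```python
# Standard OpenAI-style tool declaration; pass this list as `tools=` to
# client.chat.completions.create(...) against the running server.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical example tool
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]
```

If the model decides to call a tool, the response carries it in `message.tool_calls`; execute the call and append a `role="tool"` message with the result before the next turn.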
1215
+
1216
+ ### Qwen-Agent
1217
+
1218
+ We recommend using [Qwen-Agent](https://github.com/QwenLM/Qwen-Agent) to quickly build Agent applications with Qwen3.5.
1219
+
1220
+ To define the available tools, you can use an MCP configuration file, use the integrated tools of Qwen-Agent, or integrate other tools yourself.
1221
+ ```python
1222
+ import os
1223
+ from qwen_agent.agents import Assistant
1224
+
1225
+ # Define LLM
1226
+ # Using Alibaba Cloud Model Studio
1227
+ llm_cfg = {
1228
+ # Use the OpenAI-compatible model service provided by DashScope:
1229
+ 'model': 'Qwen3.5-397B-A17B',
1230
+ 'model_type': 'qwenvl_oai',
1231
+ 'model_server': 'https://dashscope.aliyuncs.com/compatible-mode/v1',
1232
+ 'api_key': os.getenv('DASHSCOPE_API_KEY'),
1233
+
1234
+ 'generate_cfg': {
1235
+ 'use_raw_api': True,
1236
+ # When using the DashScope OAI API, pass the parameter that toggles thinking mode in this way
1237
+ 'extra_body': {
1238
+ 'enable_thinking': True
1239
+ },
1240
+ },
1241
+ }
1242
+
1243
+ # Alternatively, using an OpenAI-compatible API endpoint:
1244
+ # This relies on the tool-call functionality of the deployment frameworks and lets Qwen-Agent automate the related operations.
1245
+ #
1246
+ # llm_cfg = {
1247
+ # # Use your own model service compatible with OpenAI API by vLLM/SGLang:
1248
+ # 'model': 'Qwen/Qwen3.5-397B-A17B',
1249
+ # 'model_type': 'qwenvl_oai',
1250
+ # 'model_server': 'http://localhost:8000/v1', # api_base
1251
+ # 'api_key': 'EMPTY',
1252
+ #
1253
+ # 'generate_cfg': {
1254
+ # 'use_raw_api': True,
1255
+ # # When using vLLM/SGLang OAI API, pass the parameter of whether to enable thinking mode in this way
1256
+ # 'extra_body': {
1257
+ # 'chat_template_kwargs': {'enable_thinking': True}
1258
+ # },
1259
+ # },
1260
+ # }
1261
+
1262
+ # Define Tools
1263
+ tools = [
1264
+ {'mcpServers': { # You can specify the MCP configuration file
1265
+ "filesystem": {
1266
+ "command": "npx",
1267
+ "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/xxxx/Desktop"]
1268
+ }
1269
+ }
1270
+ }
1271
+ ]
1272
+
1273
+ # Define Agent
1274
+ bot = Assistant(llm=llm_cfg, function_list=tools)
1275
+
1276
+ # Streaming generation
1277
+ messages = [{'role': 'user', 'content': 'Help me organize my desktop.'}]
1278
+ for responses in bot.run(messages=messages):
1279
+ pass
1280
+ print(responses)
1281
+
1282
+ # Streaming generation
1283
+ messages = [{'role': 'user', 'content': 'Develop a dog website and save it on the desktop'}]
1284
+ for responses in bot.run(messages=messages):
1285
+ pass
1286
+ print(responses)
1287
+ ```
1288
+
1289
+ ### Qwen Code
1290
+
1291
+
1292
+ [Qwen Code](https://github.com/QwenLM/qwen-code) is an open-source AI agent for the terminal, optimized for Qwen models. It helps you understand large codebases, automate tedious work, and ship faster.
1293
+
1294
+ For more information, please refer to [Qwen Code](https://qwenlm.github.io/qwen-code-docs/).
1295
+
1296
+ ## Processing Ultra-Long Texts
1297
+
1298
+ Qwen3.5 natively supports context lengths of up to 262,144 tokens.
1299
+ For long-horizon tasks where the total length (including both input and output) exceeds this limit, we recommend using RoPE scaling techniques, e.g., YaRN, to handle long texts effectively.
1300
+
1301
+ YaRN is currently supported by several inference frameworks, e.g., `transformers`, `vllm` and `sglang`.
1302
+ In general, there are two approaches to enabling YaRN for supported frameworks:
1303
+
1304
+ - Modifying the model configuration file:
1305
+ In the `config.json` file, change the `rope_parameters` fields in `text_config` to:
1306
+ ```json
1307
+ {
1308
+ "mrope_interleaved": true,
1309
+ "mrope_section": [
1310
+ 11,
1311
+ 11,
1312
+ 10
1313
+ ],
1314
+ "rope_type": "yarn",
1315
+ "rope_theta": 10000000,
1316
+ "partial_rotary_factor": 0.25,
1317
+ "factor": 4.0,
1318
+ "original_max_position_embeddings": 262144
1319
+ }
1320
+ ```
1321
+
1322
+ - Passing command line arguments:
1323
+
1324
+ For `vllm`, you can use
1325
+ ```shell
1326
+ VLLM_ALLOW_LONG_MAX_MODEL_LEN=1 vllm serve ... --hf-overrides '{"text_config": {"rope_parameters": {"mrope_interleaved": true, "mrope_section": [11, 11, 10], "rope_type": "yarn", "rope_theta": 10000000, "partial_rotary_factor": 0.25, "factor": 4.0, "original_max_position_embeddings": 262144}}}' --max-model-len 1010000
1327
+ ```
1328
+
1329
+ For `sglang`, you can use
1330
+ ```shell
1331
+ SGLANG_ALLOW_OVERWRITE_LONGER_CONTEXT_LEN=1 python -m sglang.launch_server ... --json-model-override-args '{"text_config": {"rope_parameters": {"mrope_interleaved": true, "mrope_section": [11, 11, 10], "rope_type": "yarn", "rope_theta": 10000000, "partial_rotary_factor": 0.25, "factor": 4.0, "original_max_position_embeddings": 262144}}}' --context-length 1010000
1332
+ ```
1333
+
1334
+ > [!NOTE]
1335
+ > All notable open-source frameworks implement static YaRN, which means the scaling factor remains constant regardless of input length, **potentially impacting performance on shorter texts.**
1336
+ > We advise modifying the `rope_parameters` configuration only when processing long contexts is required.
1337
+ > It is also recommended to modify the `factor` as needed. For example, if the typical context length for your application is 524,288 tokens, it would be better to set `factor` to 2.0.
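Choosing `factor` per the note above is simple arithmetic against the native 262,144-token window; a quick sketch (the helper name is illustrative):

```python
import math

NATIVE_CONTEXT = 262_144  # Qwen3.5 native context length in tokens

def yarn_factor(target_context: int) -> float:
    """Smallest YaRN scaling factor that covers the target context length."""
    return float(max(1, math.ceil(target_context / NATIVE_CONTEXT)))

# 524,288-token workloads only need factor 2.0, as recommended above;
# the 1,010,000-token serving examples above use factor 4.0.
```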
1338
+
1339
+ ## Best Practices
+
+ To achieve optimal performance, we recommend the following settings:
+
+ 1. **Sampling Parameters**:
+ - We suggest using `Temperature=0.6`, `TopP=0.95`, `TopK=20`, and `MinP=0` for thinking mode, and `Temperature=0.7`, `TopP=0.8`, `TopK=20`, and `MinP=0` for non-thinking mode.
+ - For supported frameworks, you can adjust the `presence_penalty` parameter between 0 and 2 to reduce endless repetitions. However, a higher value may occasionally result in language mixing and a slight decrease in model performance.
+
+ 2. **Adequate Output Length**: We recommend an output length of 32,768 tokens for most queries. For benchmarking on highly complex problems, such as those found in math and programming competitions, we suggest setting the max output length to 81,920 tokens. This gives the model sufficient space to generate detailed and comprehensive responses, thereby enhancing its overall performance.
+
+ 3. **Standardize Output Format**: We recommend using prompts to standardize model outputs when benchmarking.
+ - **Math Problems**: Include "Please reason step by step, and put your final answer within \boxed{}." in the prompt.
+ - **Multiple-Choice Questions**: Add the following JSON structure to the prompt to standardize responses: "Please show your choice in the `answer` field with only the choice letter, e.g., `"answer": "C"`."
+
+ 4. **No Thinking Content in History**: In multi-turn conversations, the historical model output should include only the final output part, not the thinking content. This is implemented in the provided Jinja2 chat template. However, for frameworks that do not use the Jinja2 chat template directly, it is up to the developers to ensure that this best practice is followed.
+
+ 5. **Long Video Understanding**: We recommend setting the `longest_edge` parameter in the video_preprocessor_config file to 469,762,048 (corresponding to 224k video tokens) to enable higher frame-rate sampling for hour-scale videos and thereby achieve superior performance.
+
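The recommended sampling parameters above can be captured in a small helper. This is an illustrative sketch of our own (the function name is not part of any framework); the returned keys follow the common lowercase convention used by most inference APIs.

```python
def recommended_sampling(thinking: bool) -> dict:
    """Return the sampling parameters recommended in the model card
    for thinking vs. non-thinking mode."""
    if thinking:
        return {"temperature": 0.6, "top_p": 0.95, "top_k": 20, "min_p": 0.0}
    return {"temperature": 0.7, "top_p": 0.8, "top_k": 20, "min_p": 0.0}
```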
+ ### Citation
+
+ If you find our work helpful, feel free to cite us.
+
+ ```bibtex
+ @misc{qwen3.5,
+ title = {{Qwen3.5}: Towards Native Multimodal Agents},
+ author = {{Qwen Team}},
+ month = {February},
+ year = {2026},
+ url = {https://qwen.ai/blog?id=qwen3.5}
+ }
+ ```
chat_template.jinja ADDED
@@ -0,0 +1,99 @@
+ {%- if tools %}
+ {{- '<|im_start|>system\n' }}
+ {%- if messages[0].role == 'system' %}
+ {{- messages[0].content + '\n\n' }}
+ {%- endif %}
+ {{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
+ {%- for tool in tools %}
+ {{- "\n" }}
+ {{- tool | tojson }}
+ {%- endfor %}
+ {{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
+ {%- else %}
+ {%- if messages[0].role == 'system' %}
+ {{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
+ {%- endif %}
+ {%- endif %}
+ {%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
+ {%- for forward_message in messages %}
+ {%- set index = (messages|length - 1) - loop.index0 %}
+ {%- set message = messages[index] %}
+ {%- set current_content = message.content if message.content is defined and message.content is not none else '' %}
+ {%- set tool_start = '<tool_response>' %}
+ {%- set tool_start_length = tool_start|length %}
+ {%- set start_of_message = current_content[:tool_start_length] %}
+ {%- set tool_end = '</tool_response>' %}
+ {%- set tool_end_length = tool_end|length %}
+ {%- set start_pos = (current_content|length) - tool_end_length %}
+ {%- if start_pos < 0 %}
+ {%- set start_pos = 0 %}
+ {%- endif %}
+ {%- set end_of_message = current_content[start_pos:] %}
+ {%- if ns.multi_step_tool and message.role == "user" and not(start_of_message == tool_start and end_of_message == tool_end) %}
+ {%- set ns.multi_step_tool = false %}
+ {%- set ns.last_query_index = index %}
+ {%- endif %}
+ {%- endfor %}
+ {%- for message in messages %}
+ {%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
+ {{- '<|im_start|>' + message.role + '\n' + message.content + '<|im_end|>' + '\n' }}
+ {%- elif message.role == "assistant" %}
+ {%- set m_content = message.content if message.content is defined and message.content is not none else '' %}
+ {%- set content = m_content %}
+ {%- set reasoning_content = '' %}
+ {%- if message.reasoning_content is defined and message.reasoning_content is not none %}
+ {%- set reasoning_content = message.reasoning_content %}
+ {%- else %}
+ {%- if '</think>' in m_content %}
+ {%- set content = (m_content.split('</think>')[-1]).lstrip('\n') %}
+ {%- set reasoning_content = (m_content.split('</think>')[0]).rstrip('\n') %}
+ {%- set reasoning_content = (reasoning_content.split('<think>')[-1]).lstrip('\n') %}
+ {%- endif %}
+ {%- endif %}
+ {%- if loop.index0 > ns.last_query_index %}
+ {%- if loop.last or (not loop.last and (not reasoning_content.strip() == '')) %}
+ {{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
+ {%- else %}
+ {{- '<|im_start|>' + message.role + '\n' + content }}
+ {%- endif %}
+ {%- else %}
+ {{- '<|im_start|>' + message.role + '\n' + content }}
+ {%- endif %}
+ {%- if message.tool_calls %}
+ {%- for tool_call in message.tool_calls %}
+ {%- if (loop.first and content) or (not loop.first) %}
+ {{- '\n' }}
+ {%- endif %}
+ {%- if tool_call.function %}
+ {%- set tool_call = tool_call.function %}
+ {%- endif %}
+ {{- '<tool_call>\n{"name": "' }}
+ {{- tool_call.name }}
+ {{- '", "arguments": ' }}
+ {%- if tool_call.arguments is string %}
+ {{- tool_call.arguments }}
+ {%- else %}
+ {{- tool_call.arguments | tojson }}
+ {%- endif %}
+ {{- '}\n</tool_call>' }}
+ {%- endfor %}
+ {%- endif %}
+ {{- '<|im_end|>\n' }}
+ {%- elif message.role == "tool" %}
+ {%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
+ {{- '<|im_start|>user' }}
+ {%- endif %}
+ {{- '\n<tool_response>\n' }}
+ {{- message.content }}
+ {{- '\n</tool_response>' }}
+ {%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
+ {{- '<|im_end|>\n' }}
+ {%- endif %}
+ {%- endif %}
+ {%- endfor %}
+ {%- if add_generation_prompt %}
+ {{- '<|im_start|>assistant\n' }}
+ {%- if enable_thinking is defined and enable_thinking is false %}
+ {{- '<think>\n\n</think>\n\n' }}
+ {%- endif %}
+ {%- endif %}
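For frameworks that do not render this Jinja2 template directly, the template's `</think>` handling can be mirrored in plain Python. The sketch below is our own helper (not shipped with the model); it reproduces the same split logic the template uses to separate reasoning content from the final answer in an assistant message.

```python
def split_thinking(m_content: str) -> tuple[str, str]:
    """Mirror the chat template's </think> handling.

    Returns (reasoning_content, final_content), using the same
    split/strip sequence as the template.
    """
    if '</think>' not in m_content:
        return '', m_content
    final = m_content.split('</think>')[-1].lstrip('\n')
    reasoning = m_content.split('</think>')[0].rstrip('\n')
    reasoning = reasoning.split('<think>')[-1].lstrip('\n')
    return reasoning, final
```

Per best practice 4 in the model card, only the `final` part should be kept when re-sending assistant turns as history.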
config.json ADDED
@@ -0,0 +1,142 @@
+ {
+ "architectures": [
+ "Qwen3_5MoeForConditionalGeneration"
+ ],
+ "image_token_id": 248056,
+ "model_type": "qwen3_5_moe",
+ "pad_token_id": 248055,
+ "text_config": {
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "attn_output_gate": true,
+ "bos_token_id": null,
+ "torch_dtype": "bfloat16",
+ "eos_token_id": 248044,
+ "full_attention_interval": 4,
+ "head_dim": 256,
+ "hidden_act": "silu",
+ "hidden_size": 4096,
+ "initializer_range": 0.02,
+ "layer_types": [
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention",
+ "linear_attention",
+ "linear_attention",
+ "linear_attention",
+ "full_attention"
+ ],
+ "linear_conv_kernel_dim": 4,
+ "linear_key_head_dim": 128,
+ "linear_num_key_heads": 16,
+ "linear_num_value_heads": 64,
+ "linear_value_head_dim": 128,
+ "mamba_ssm_dtype": "float32",
+ "max_position_embeddings": 262144,
+ "mlp_only_layers": [],
+ "model_type": "qwen3_5_moe_text",
+ "moe_intermediate_size": 1024,
+ "mtp_num_hidden_layers": 1,
+ "mtp_use_dedicated_embeddings": false,
+ "num_attention_heads": 32,
+ "num_experts": 512,
+ "num_experts_per_tok": 10,
+ "num_hidden_layers": 60,
+ "num_key_value_heads": 2,
+ "output_router_logits": false,
+ "pad_token_id": null,
+ "partial_rotary_factor": 0.25,
+ "rms_norm_eps": 1e-06,
+ "rope_parameters": {
+ "mrope_interleaved": true,
+ "mrope_section": [
+ 11,
+ 11,
+ 10
+ ],
+ "partial_rotary_factor": 0.25,
+ "rope_theta": 10000000,
+ "rope_type": "default"
+ },
+ "router_aux_loss_coef": 0.001,
+ "shared_expert_intermediate_size": 1024,
+ "tie_word_embeddings": false,
+ "use_cache": true,
+ "vocab_size": 248320
+ },
+ "tie_word_embeddings": false,
+ "transformers_version": "5.2.0.dev0",
+ "unsloth_fixed": true,
+ "video_token_id": 248057,
+ "vision_config": {
+ "deepstack_visual_indexes": [],
+ "depth": 27,
+ "hidden_act": "gelu_pytorch_tanh",
+ "hidden_size": 1152,
+ "in_channels": 3,
+ "initializer_range": 0.02,
+ "intermediate_size": 4304,
+ "model_type": "qwen3_5_moe",
+ "num_heads": 16,
+ "num_position_embeddings": 2304,
+ "out_hidden_size": 4096,
+ "patch_size": 16,
+ "spatial_merge_size": 2,
+ "temporal_patch_size": 2
+ },
+ "vision_end_token_id": 248054,
+ "vision_start_token_id": 248053
+ }
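As a sanity check (our own sketch, not part of the repository), the 60-entry `layer_types` list in the config is exactly what `"full_attention_interval": 4` with `"num_hidden_layers": 60` implies: every fourth layer uses full attention, all others linear attention.

```python
def build_layer_types(num_layers: int, interval: int) -> list[str]:
    """Reconstruct the layer_types pattern implied by full_attention_interval:
    layer i (0-indexed) is full attention when (i + 1) % interval == 0."""
    return [
        "full_attention" if (i + 1) % interval == 0 else "linear_attention"
        for i in range(num_layers)
    ]
```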
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors-00001-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ef3af31ff138d1b4a4bdba1207ae727a8cf04f45e63805bfe0023b81019753d2
+ size 8589934736
model.safetensors-00002-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b57b19cbe857cf666625d41d06e8132a3f8c43cc61834f45ada9eca5b15e9d12
+ size 8589934736
model.safetensors-00003-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:30d5b86b581bfbee831f8fb0bd559f7c51841540c16ccfcb33b2c6e034da9189
+ size 8589934736
model.safetensors-00004-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7474662495601919a3a05e996efd43f9d48376e3af968ee16b37b7195713de94
+ size 8589934736
model.safetensors-00005-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:77d5a0dcfb4bdf9e4c17e36cbea0e9cf2cdf3425e23459d5861349313a294cfb
+ size 8589934736
model.safetensors-00006-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:029ea9778e597ec289d42ac8585557b4b9eb34a8280771a0ad05fdb6f4588808
+ size 8589934736
model.safetensors-00007-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5ea70f2e4f1dceb3ca110474d5e5c9c8358686ff9483766f2c44569c27988291
+ size 8589934736
model.safetensors-00008-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ef7d13b2a76bef45e53c91d333946db077c220a03f4cddb554967536635d2b93
+ size 8589934736
model.safetensors-00009-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:99275054b8efa27b0edce1c558bf574fdf9710901a1f26ced05e3e66f6ee0bd7
+ size 8589934736
model.safetensors-00010-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7901ad36e1a8df569acc5fcffb304bc052f36fa0922c77c991aa2562d0105c36
+ size 8589934736
model.safetensors-00011-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6de37420e3d78c30df0cc3f3a5f8b1b7a6ce7dc387715c20f429d4e74a23da71
+ size 8589934736
model.safetensors-00012-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:df24d49b01c11c2c2b935bb0221ed5f0c31f1efbb5a3499e2f1d0d257c6bbc5a
+ size 8589934736
model.safetensors-00013-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4050f0e009ce2555a586a62c89d45767cd1f248b51365ad5cdc28bc0c8c48324
+ size 8589934736
model.safetensors-00014-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5fbb1559e1b960c908ee74df7d925c54a46f7f76d914b3b79eb43e5b68a17f47
+ size 8589934736
model.safetensors-00015-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:893021b612b249746c1014fd0c22f571ade88f5deb14ffdc2210329598a68ed6
+ size 8589934736
model.safetensors-00016-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0106bb289e1a16e7aa1329376e866592c7d61af53b7c11a472b38fba50a7fb64
+ size 8589934736
model.safetensors-00017-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5ddb14834a14278ec9d3edb6a769cabd67b0318cc92dd4daf6ffc9fa090522e3
+ size 8589934736
model.safetensors-00018-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e9d42bce8cf3a69b845e1298236482a59763813a5f37a345b01a66c94fb085e5
+ size 8589934736
model.safetensors-00019-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aea6f596de60fc7bba9441da6c31b961f06edd3a23638e67c186683ee0627e94
+ size 8589934736
model.safetensors-00020-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e078c060b17934cd91dec1d90a8b88eef3edf6c96c8637705caf94d39053cfcc
+ size 8589934736
model.safetensors-00021-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0b294edfa9256f16c84ea7b22aeab8e266d80ebced07341735b9cca4055c0a40
+ size 8589934736
model.safetensors-00022-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f22caa1430d545017ed596ab399f92a66c40cddca31bba0d2a86e6b3b4c4c4d4
+ size 8589934736
model.safetensors-00023-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aff917f1549ce6960c11f9486fa9c139c5bd95f96b106f89f7e08cbe3a9bfd63
+ size 8589934736
model.safetensors-00024-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f22f29cbdb33447a10cccbad1a069705e99c3fd7c66d2a1d6cc019d195a97465
+ size 8589934736
model.safetensors-00025-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4495e64341b298bb5f04c36f0c4ce200a4d2f60d1c4cf8d5befaa60a89720d81
+ size 8589934736
model.safetensors-00026-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0bfda686c8e4b38bb325e7d41fb1f51ef37382f87b18619bbea93d7d18f83eae
+ size 8589934736
model.safetensors-00027-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b2bbe774d86bf517e948d5350d984952403500a840791384f047058f811b77c7
+ size 8589934736
model.safetensors-00028-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bc5224f3d310356962d13cdf0ab6bd4524aea1178b7f72295081b4e83fea89e4
+ size 8589934736
model.safetensors-00029-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:830a685a5ba653da5f456e7750987615b23310984f1dcbe80a9356954bfd0f93
+ size 8589934736
model.safetensors-00030-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:740234757240e6e1a50d3c842a929c4ba98d73ada9b10a5ab8409bd3caa92a69
+ size 8589934736
model.safetensors-00031-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e20add467014b484f3447a4d441d6b4920301b881a9329839b08b1491841eecc
+ size 8589934736
model.safetensors-00032-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:17c4e028a5b55553e5d3313f6aff158e3c075f41184f5c35f470f01b69229ce5
+ size 8589934736
model.safetensors-00033-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:20c5c321e6c87154d673d0169c94ce57c5498ebd06d42ee80be52d785923a8f0
+ size 8589934736
model.safetensors-00034-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:75e7d240200e82f9680968db70d85770422f87ff5632ad21e58aa9fa057f90cc
+ size 8589934736
model.safetensors-00035-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d72f01171636e891ab87e3466a10b447bd1b42d8f008859dc897cd2d089a5748
+ size 8589934736
model.safetensors-00036-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4cbce89cb5824105c2b6c97b5e9d3ef8e30d5aa93278c05d497413891ca850f0
+ size 8589934736
model.safetensors-00037-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:645592a274308e7671a1bb8870f13769e38eafcc010bfa54f65c1f6d4d5d1ca1
+ size 8589934736
model.safetensors-00038-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c253a56421c58daa85e8775901c43824b46e8143ba0dd0e938e2a7e6c0720115
+ size 8589934736
model.safetensors-00039-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:33656b5ddbe6dd702ffa4811acc118955aff4d68f08a0e4a7e1c330b1843dd64
+ size 8589934736
model.safetensors-00040-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:53c2e8a4eb74e14c86396059b3f20f7560c48244dcab4c972a9074ceee76889c
+ size 8589934736
model.safetensors-00041-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b5bbcf8e2b785daee4557cb8d3b47349a921c0911f3364b69a089470b6309ba6
+ size 8589934736
model.safetensors-00042-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d208d22e719e3e05ead6a700ea706efea9d5a1bba89da3024d1f4c41b6d3776e
+ size 8589934736
model.safetensors-00043-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b418deab92a9d6c2198e2535b034913368b00b5e2846f14d0accca78ee20f585
+ size 8589934736
model.safetensors-00044-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4fe93007beb18e68392f3fdb2df725d5224bc3e7147edb6ac7cceec4383f4fa8
+ size 8589934736
model.safetensors-00045-of-00094.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5708aff99f2ff608fd059c8a8d1661df415a74fd8ff1c543173fa14689b19957
+ size 8589934736
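Each `.safetensors` entry above is a Git LFS pointer file, not the weights themselves. A minimal parsing sketch (the helper name is ours, not a library API) for the `version`/`oid`/`size` stanzas shown:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a git-lfs pointer file into its fields.

    Each pointer line is 'key value'; oid is 'algorithm:hex-digest'.
    """
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],
        "oid_algo": algo,
        "oid": digest,
        "size": int(fields["size"]),
    }
```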