Nex-N1.1
Nex is a next-generation, full-stack agentic platform that brings foundation models, synthetic data pipelines, RL training, agent frameworks, and deployment tools together in one unified ecosystem. DeepSeek-V3.1-Nex-N1.1 is the latest flagship release of the Nex-N1 series — a post-trained model designed to highlight agent autonomy, tool use, and real-world productivity. We are committed to making it easier than ever to build and deploy AI agents by offering researchers and entrepreneurs a high-performance, reliable, and cost-effective "out-of-the-box" agent system.
What's New in N1.1
- Enhanced Claude Skills Integration: Significantly improved ability to work with Claude Skills, delivering a more seamless tool-calling experience
- Improved Instruction Following: Substantially enhanced instruction adherence with greater stability and accuracy in complex tasks
- Upgraded Frontend Capabilities: Specialized optimization for frontend development scenarios, excelling in HTML, CSS, and JavaScript generation
- Advanced Vibe Coding: Further enhanced general-purpose coding abilities to meet diverse development needs
- 200K Context Length: Extended the context window to 200K tokens, making it well suited to Claude Code and other long-context workflows
- MTP Support: Added Multi-Token Prediction capability for improved inference efficiency
Usage
Local Deployment
We recommend SGLang for serving Nex-series models locally:
python -m sglang.launch_server --model-path /path/to/your/model
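Once the server is up, it exposes an OpenAI-compatible API (on port 30000 by default for SGLang). The request below is a minimal sketch; the model name nex-n1.1 is a placeholder and should be replaced with the name your server reports:
curl http://localhost:30000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "nex-n1.1",
    "messages": [{"role": "user", "content": "Summarize what an agentic model is in one sentence."}]
  }'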
Function Calling
Nex-series models support robust function calling. To get the most out of it, we modified the qwen3_coder tool parser (see https://github.com/sgl-project/sglang/pull/13411). To enable this feature, add the --tool-call-parser qwen3_coder flag when launching the server:
python -m sglang.launch_server --model-path /path/to/your/model --tool-call-parser qwen3_coder
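With the parser enabled, tool definitions can be passed through the same OpenAI-compatible endpoint in the standard tools format. The sketch below is illustrative only: the get_weather tool and the model name are placeholders, not part of the model's built-in toolset.
curl http://localhost:30000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "nex-n1.1",
    "messages": [{"role": "user", "content": "What is the weather in Paris right now?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }]
  }'
If the model decides to call the tool, the returned message carries a tool_calls entry with the function name and JSON arguments instead of plain text content.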
Mini Program Development
Nex-N1.1 is optimized for mini program development with enhanced 200K context support. For best results, we recommend using Claude Code configured with both context7 and a search MCP server:
claude mcp add --transport http context7 https://mcp.context7.com/mcp --header "CONTEXT7_API_KEY: [CONTEXT7_API_KEY]"
claude mcp add --transport stdio serper-search --env SERPER_API_KEY=[SERPER_API_KEY] -- npx -y serper-search-scrape-mcp-server
Refer to https://github.com/upstash/context7 for more details on setting up context7.
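Claude Code itself is pointed at a model endpoint through environment variables. The sketch below assumes your Nex-N1.1 deployment is reachable through an Anthropic-compatible endpoint (for example behind an LLM gateway); the URL, key, and model name are placeholders to adjust to your setup.
# Placeholders: replace the URL, key, and model name with your deployment's values.
export ANTHROPIC_BASE_URL="https://your-gateway.example.com"
export ANTHROPIC_AUTH_TOKEN="[YOUR_API_KEY]"
export ANTHROPIC_MODEL="nex-n1.1"
# Verify the MCP servers registered above are visible to Claude Code:
claude mcp list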