Qwen3-Next Instruct blends hybrid attention, sparse MoE, and stability optimizations for ultra-long-context AI (see the usage sketch after this list).
80B-parameter AI model with hybrid reasoning, an MoE architecture, and support for 119 languages.
Excels at agentic coding and browser use, supports a 256K context window, and delivers top results.
Advanced MoE reasoning model excelling at reasoning, multilingual tasks, and instruction following.
Chinese and English LLM targeting language, coding, mathematics, reasoning, and more.
Advanced LLM for code generation, code reasoning, and code fixing across popular programming languages.
Powerful mid-size code model with a 32K context length, excelling at coding across multiple languages.
Chinese and English LLM targeting language, coding, mathematics, reasoning, and more.
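The entries above are instruct/chat models, so they are normally driven through a chat template rather than raw text prompts. Below is a minimal sketch of that pattern with Hugging Face transformers; the Hub ID `Qwen/Qwen3-Next-80B-A3B-Instruct` is an assumed stand-in for the first entry, and the prompt and generation settings are illustrative, not taken from this list.

```python
# Minimal sketch: chat-template generation with an instruct-tuned model.
# Model ID, prompt, and max_new_tokens are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-Next-80B-A3B-Instruct"  # assumed Hub ID for the first entry above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # shard/offload across available devices
)

# Build the prompt through the model's chat template.
messages = [{"role": "user", "content": "Summarize the trade-offs of sparse MoE models."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The same call pattern applies to the longer-context entries; only the prompt length and the memory footprint change.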