
Open reasoning model with a 256K context window, native INT4 quantization, and enhanced tool use

80B-parameter AI model with hybrid reasoning, MoE architecture, and support for 119 languages.

ByteDance open-source LLM with long-context, reasoning, and agentic capabilities.

High-efficiency LLM with hybrid Transformer-Mamba design, excelling in reasoning and agentic tasks.