A smaller Mixture-of-Experts (MoE) text-only LLM for efficient reasoning and math
Deploy this model on the endpoint provider of your choice