6 results — Models (6), Blueprints (0), Other (0). Sorted by Best Match.
Z.ai · glm5
GLM-5 744B MoE enables efficient reasoning for complex systems and long-horizon agentic tasks.
Model · MoE · +3 · 8.55M · 3w
Qwen · qwen3.5-397b-a17b
Next-gen Qwen 3.5 VLM (400B MoE) brings advanced vision, chat, RAG, and agentic capabilities.
Model · MoE · +4 · 6.9M · 3w
Qwen · qwen3-coder-480b-a35b-instruct
Excels at agentic coding and browser use, supports 256K context, and delivers top results.
Model · agentic coding · +4 · 3.92M · 6mo
Qwen · qwen3-next-80b-a3b-instruct
Qwen3-Next Instruct blends hybrid attention, sparse MoE, and stability boosts for ultra-long-context AI.
Model · chat · +2 · 11.63M · 5mo
Stepfun-ai · step-3.5-flash
200B open-source reasoning engine with sparse MoE powering frontier agentic AI.
Model · Agentic · +3 · 7.22M · 1mo
Mistral AI · mistral-large-3-675b-instruct-2512
A state-of-the-art general-purpose MoE VLM, ideal for chat, agentic, and instruction-based use cases.
Model · language generation · +4 · 6.36M · 3mo