5 results for
Filters (1) · Models (5) · Blueprints (0) · Other (0)
Sort: Best Match (score:DESC)
DeepSeek AI · Downloadable
deepseek-v4-pro
DeepSeek V4 scales to 1M-token context windows with an efficient MoE architecture for coding tasks.
Tags: Model, MoE, +3 · 1.23M downloads · 6d ago
Qwen · Free Endpoint
qwen3-coder-480b-a35b-instruct
Excels at agentic coding and browser use, supports a 256K-token context, and delivers top results.
Tags: Model, agentic coding, +3 · 3.21M downloads · 8mo ago
DeepSeek AI · Downloadable
deepseek-v4-flash
DeepSeek V4 Flash is a 284B MoE model with a 1M-token context, optimized for fast coding and agents.
Tags: Model, coding, +3 · 455K downloads · 6d ago
NVIDIA · Downloadable
nemotron-3-nano-30b-a3b
Open, efficient MoE model with a 1M-token context, excelling in coding, reasoning, instruction following, tool calling, and more.
Tags: Model, MoE, +3 · 9.28M downloads · 4mo ago
NVIDIA · Downloadable
nemotron-3-super-120b-a12b
Open, efficient hybrid Mamba-Transformer MoE model with a 1M-token context, excelling in agentic reasoning, coding, planning, tool calling, and more.
Tags: Model, MoE, +4 · 42.51M downloads · 1mo ago