6 results (Models: 6, Blueprints: 0, Other: 0), 2 filters applied, sorted by best match.
| Publisher | Model | Availability | Description | Tags | Downloads | Updated |
|---|---|---|---|---|---|---|
| OpenAI | gpt-oss-120b | Downloadable | Mixture of Experts (MoE) reasoning LLM (text-only) designed to fit within an 80 GB GPU. | reasoning, +3 | 28.78M | 9 mo |
| OpenAI | gpt-oss-20b | Downloadable | Smaller Mixture of Experts (MoE) text-only LLM for efficient AI reasoning and math. | reasoning, +3 | 11.98M | 9 mo |
| Moonshot AI | kimi-k2-instruct | Free endpoint (deprecation in 7 days) | State-of-the-art open mixture-of-experts model with strong reasoning, coding, and agentic capabilities. | coding, +3 | 11.05M | 9 mo |
| Mistral AI | mixtral-8x22b-instruct-v0.1 | Downloadable | An MoE LLM that follows instructions, completes requests, and generates creative text. | Advanced Reasoning, +4 | 2.2M | 9 mo |
| Mistral AI | mixtral-8x7b-instruct-v0.1 | Downloadable | An MoE LLM that follows instructions, completes requests, and generates creative text. | Advanced Reasoning, +4 | 576K | 9 mo |
| NVIDIA | nemotron-3-super-120b-a12b | Downloadable | Open, efficient hybrid Mamba-Transformer MoE with 1M context, excelling in agentic reasoning, coding, planning, tool calling, and more. | MoE, +4 | 46.68M | 1 mo |
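Most entries above are Mixture-of-Experts (MoE) models, which is how a model with a very large total parameter count can still run with modest active compute: a router selects only the top-k experts per token. The sketch below is illustrative only — it is not the implementation of any listed model, and all names (`moe_forward`, the toy experts, `top_k=2`) are assumptions chosen for the demo.

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Minimal top-k MoE routing sketch (illustrative, not any listed model).

    x       : (d,) token embedding
    gate_w  : (d, n_experts) router weight matrix
    experts : list of callables, each mapping (d,) -> (d,)

    Only the top_k experts with the highest router scores run for this
    token; their outputs are mixed with softmax-normalized gate weights.
    """
    logits = x @ gate_w                       # one router score per expert
    top = np.argsort(logits)[-top_k:]         # indices of the top-k experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                  # softmax over selected experts only
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy demo: 4 experts, each a fixed random linear map.
rng = np.random.default_rng(0)
d, n = 8, 4
gate_w = rng.normal(size=(d, n))
experts = [lambda x, W=rng.normal(size=(d, d)): W @ x for _ in range(n)]
y = moe_forward(rng.normal(size=d), gate_w, experts)
print(y.shape)
```

Because only `top_k` of the `n` experts execute per token, total parameters (e.g. 120B) and per-token active parameters (e.g. the "a12b" in nemotron-3-super-120b-a12b, suggesting ~12B active) can differ by an order of magnitude.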