2 results: Models (2), Blueprints (0), Other (0), sorted by best match.
OpenAI · gpt-oss-120b
Mixture of Experts (MoE) reasoning LLM (text-only) designed to fit within an 80 GB GPU.
Model · text-to-text (+3) · 34.11M · 7 months ago
OpenAI · gpt-oss-20b
Smaller Mixture of Experts (MoE) text-only LLM for efficient AI reasoning and math.
Model · text-to-text (+3) · 7.06M · 7 months ago