LM Studio on DGX Spark
Deploy LM Studio and serve LLMs on a Spark device
Basic idea
LM Studio is an application for discovering, running, and serving large language models entirely on your own hardware. You can run local LLMs like gpt-oss, Qwen3, Gemma3, DeepSeek, and many more models privately and for free.
This playbook shows you how to deploy LM Studio on an NVIDIA DGX Spark device to run LLMs locally with GPU acceleration. Running LM Studio on DGX Spark enables Spark to act as your own private, high-performance LLM server.
What you'll accomplish
You'll deploy LM Studio on an NVIDIA DGX Spark device to run gpt-oss 120B, and use the model from your laptop. More specifically, you will:
- Install llmster, a fully headless, terminal-native version of LM Studio, on the Spark
- Run LLM inference locally on DGX Spark via API
- Interact with models from your laptop using the LM Studio SDK
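Once the server is running on the Spark, it exposes an OpenAI-compatible REST API (LM Studio's server listens on port 1234 by default). As a minimal sketch of what the API interaction looks like, the snippet below sends a chat completion request from your laptop; the IP address and the `openai/gpt-oss-120b` model identifier are placeholders you would replace with your own values:

```python
import json
from urllib import request

# Hypothetical address of the DGX Spark on your local network (replace
# with your Spark's IP); 1234 is LM Studio's default server port.
SPARK_URL = "http://192.168.1.50:1234/v1/chat/completions"

def build_request(prompt: str, model: str = "openai/gpt-oss-120b") -> dict:
    """Build an OpenAI-compatible chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask(prompt: str) -> str:
    """POST the prompt to the Spark and return the model's reply text."""
    payload = json.dumps(build_request(prompt)).encode()
    req = request.Request(
        SPARK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

For example, `ask("Say hello from DGX Spark.")` would return the model's generated reply as a string. Because the endpoint follows the OpenAI chat completions schema, the same request shape works from the JavaScript and Bash sample scripts as well.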
What to know before starting
- How to set up local network access to your DGX Spark device
- Familiarity with terminal/command-line interfaces
- Understanding of REST API concepts
Prerequisites
Hardware Requirements:
- DGX Spark device with ARM64 processor and Blackwell GPU architecture
- Minimum 65 GB of GPU memory; 70 GB or more recommended
- At least 65 GB of available storage; 70 GB or more recommended
Software Requirements:
- NVIDIA DGX OS
- Client device (Mac, Windows, or Linux)
- Laptop and DGX Spark must be on the same local network
- Network access to download packages and models
Ancillary files
All required assets can be found below. These sample scripts can be used in Step 4 of the Instructions.
- run.js - JavaScript script for sending a test prompt to Spark
- run.py - Python script for sending a test prompt to Spark
- run.sh - Bash script for sending a test prompt to Spark
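As a rough sketch of what a run.py-style script looks like when it uses the LM Studio Python SDK instead of raw HTTP, the snippet below connects from the laptop to the Spark and requests a response. It assumes `pip install lmstudio` on the laptop; the host address is a placeholder, and the exact SDK calls shown here are an approximation of the SDK's client API, not a verbatim copy of the shipped script:

```python
# Sketch only: host and model identifier below are assumptions to
# replace with your Spark's IP and your loaded model's key.
SPARK_HOST = "192.168.1.50:1234"
PROMPT = "List three facts about the NVIDIA DGX Spark."

def main() -> None:
    # Imported inside main so the sketch can be read without the
    # `lmstudio` package installed.
    import lmstudio as lms

    # Connect to the remote LM Studio instance rather than localhost.
    with lms.Client(SPARK_HOST) as client:
        model = client.llm.model("openai/gpt-oss-120b")
        print(model.respond(PROMPT))

if __name__ == "__main__":
    main()
```

The SDK route is convenient when you want chat history, structured output, or model management from code; for a one-off smoke test, the plain REST scripts above are equally valid.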
Time & risk
- Estimated time: 15-30 minutes (including model download time, which may vary depending on your internet connection and the model size)
- Risk level: Low
- Large model downloads may take significant time depending on network speed
- Rollback:
- Remove downloaded models manually from the models directory
- Uninstall LM Studio or llmster
- Last Updated: 02/06/2026
- First Publication