Use DGX Spark as a local or remote Vibe Coding assistant with Ollama and Continue
Install the latest version of Ollama using the following command:
curl -fsSL https://ollama.com/install.sh | sh
Once the service is running, pull the desired model:
ollama pull gpt-oss:120b
To allow remote connections (e.g., from a workstation using VSCode and Continue), modify the Ollama systemd service:
sudo systemctl edit ollama
Add the following lines beneath the commented section:
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_ORIGINS=*"
Reload and restart the service:
sudo systemctl daemon-reload
sudo systemctl restart ollama
If using a firewall, open port 11434:
sudo ufw allow 11434/tcp
Verify that the workstation can connect to your DGX Spark's Ollama server:
curl -v http://YOUR_SPARK_IP:11434/api/version
Replace YOUR_SPARK_IP with your DGX Spark's IP address. If the connection fails, see the Troubleshooting tab.
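Beyond the version check, you can confirm that the server actually serves inference. The sketch below builds a minimal non-streaming request body for Ollama's /api/generate endpoint; SPARK_IP is a placeholder you must replace, and the curl line is commented out so the script can be inspected without a reachable server.

```shell
# Placeholder: replace with your DGX Spark's IP address.
SPARK_IP="YOUR_SPARK_IP"

# Minimal non-streaming request body for Ollama's /api/generate endpoint.
PAYLOAD='{"model": "gpt-oss:120b", "prompt": "Say hello", "stream": false}'
echo "$PAYLOAD"

# Uncomment once SPARK_IP is set:
# curl -s "http://${SPARK_IP}:11434/api/generate" -d "$PAYLOAD"
```

A successful request returns a JSON object whose response field contains the generated text.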
For DGX Spark (ARM-based), download and install VSCode: navigate to https://code.visualstudio.com/download and download the Linux ARM64 version of VSCode. After the download completes, note the downloaded package name and use it in place of DOWNLOADED_PACKAGE_NAME in the next command:
sudo dpkg -i DOWNLOADED_PACKAGE_NAME
If using a remote workstation, install VSCode appropriate for your system architecture.
Open VSCode and install Continue.dev from the Marketplace:
In the Continue setup pane, select "Or, configure your own models", then "Click here to view more providers". Choose Ollama as the Provider and Autodetect as the Model. Your downloaded model (e.g., gpt-oss:120b) will now be the default for inference.
To connect a workstation running VSCode to a remote DGX Spark instance, complete the following steps on that workstation:
Click the Continue icon in the left pane. Select "Or, configure your own models", then "Click here to view more providers". Choose Ollama as the Provider and Autodetect as the Model. Continue will fail to detect the model because it is attempting to connect to a locally hosted Ollama server.
Locate the gear icon in the upper right corner of the Continue window and click on it. config.yaml will open. Take note of your DGX Spark's IP address and edit config.yaml as follows:
name: Config
version: 1.0.0
schema: v1
assistants:
  - name: default
    model: OllamaSpark
models:
  - name: OllamaSpark
    provider: ollama
    model: gpt-oss:120b
    apiBase: http://YOUR_SPARK_IP:11434
    title: gpt-oss:120b
    roles:
      - chat
      - edit
      - autocomplete
Replace YOUR_SPARK_IP with the IP address of your DGX Spark.
Add additional model entries for any other Ollama models you wish to host remotely.
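For example, to also serve a smaller model for autocomplete, append another entry under models. The model name below (qwen2.5-coder:7b) is only an illustration; substitute any model you have pulled with ollama pull, and replace YOUR_SPARK_IP as before.
models:
  - name: OllamaSparkCoder
    provider: ollama
    model: qwen2.5-coder:7b
    apiBase: http://YOUR_SPARK_IP:11434
    title: qwen2.5-coder:7b
    roles:
      - autocomplete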