Vibe Coding in VS Code

30 MIN

Use DGX Spark as a local or remote Vibe Coding assistant with Ollama and Continue.dev

DGX Spark Vibe Coding

This playbook walks you through setting up DGX Spark as a Vibe Coding assistant, either locally or as a remote coding companion for VS Code with Continue.dev.
While NVIDIA NIM microservices are not yet widely supported in this workflow, this guide uses Ollama with GPT-OSS 120B to provide a high-performance local LLM environment.
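The model setup itself is two commands. A minimal sketch, assuming Ollama's official install script and that the `gpt-oss:120b` tag is available in the Ollama model library (the download is large, on the order of tens of gigabytes, and benefits from DGX Spark's unified memory):

```shell
# Install Ollama via the official install script
# (inspect the script first if piping to sh is a concern)
curl -fsSL https://ollama.com/install.sh | sh

# Pull GPT-OSS 120B into the local model store
ollama pull gpt-oss:120b

# Smoke test: generate a reply directly from the terminal
ollama run gpt-oss:120b "Write a Python function that reverses a string."
```

If the pull succeeds, `ollama list` should show `gpt-oss:120b` among the installed models.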

What You'll Accomplish

You'll have a fully configured DGX Spark system capable of:

  • Running local code assistance through Ollama.
  • Serving models remotely for Continue.dev and VS Code integration.
  • Hosting large LLMs like GPT-OSS 120B using unified memory.
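For the remote scenario, Ollama must listen on more than localhost. A sketch of the server-side change, assuming the systemd service installed by the official script (`OLLAMA_HOST` is Ollama's documented environment variable for the bind address, and 11434 is its default port):

```shell
# Option 1: add an environment override to the systemd service.
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl edit ollama
sudo systemctl restart ollama

# Option 2: run the server in the foreground with the override
OLLAMA_HOST=0.0.0.0 ollama serve

# Verify from another machine on the network
# (<spark-ip> is a placeholder for your DGX Spark's address)
curl http://<spark-ip>:11434/api/tags
```

Binding to 0.0.0.0 exposes the API to your whole network, which is why the prerequisites below mention firewall control: restrict port 11434 to trusted hosts.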

Prerequisites

  • DGX Spark (128GB unified memory recommended)
  • Internet access for model downloads
  • Basic familiarity with the terminal
  • Optional: firewall control for remote access configuration

Requirements

  • Ollama and an LLM of your choice (e.g., gpt-oss:120b)
  • VS Code
  • Continue.dev VS Code extension
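Once Ollama is serving the model, Continue.dev just needs to be pointed at it. A sketch of the relevant Continue.dev configuration (`~/.continue/config.yaml`); the `apiBase` line is only needed for the remote case, and `<spark-ip>` is a placeholder for your DGX Spark's address:

```yaml
models:
  - name: GPT-OSS 120B (DGX Spark)
    provider: ollama
    model: gpt-oss:120b
    apiBase: http://<spark-ip>:11434
```

With this in place, the model appears in the Continue.dev model picker inside VS Code; omit `apiBase` entirely when running Ollama on the same machine as the editor.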