To manage containers without sudo, your user must be in the docker group. If you skip this step, you will need to prefix every Docker command with sudo.
Open a new terminal and test Docker access by running:
docker ps
If you see a permission denied error (something like permission denied while trying to connect to the Docker daemon socket), add your user to the docker group so that you don't need to run the command with sudo.
sudo usermod -aG docker $USER
newgrp docker
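Note that group changes only take effect in new login sessions; newgrp docker applies the group to the current shell only. A quick check to see whether your current session already has the group (assumes a POSIX shell with the standard id and grep utilities, as on DGX OS):

```shell
# Check whether the current shell session has the docker group active
if id -nG | grep -qw docker; then
    echo "docker group active: docker commands should work without sudo"
else
    echo "docker group not active yet: log out and back in, or run 'newgrp docker'"
fi
```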
Pull the Open WebUI container image with integrated Ollama:
docker pull ghcr.io/open-webui/open-webui:ollama
Start the Open WebUI container by running:
docker run -d -p 8080:8080 --gpus=all \
-v open-webui:/app/backend/data \
-v open-webui-ollama:/root/.ollama \
--name open-webui ghcr.io/open-webui/open-webui:ollama
This starts the Open WebUI container and makes the interface accessible at http://localhost:8080 from your local web browser.
NOTE
Application data will be stored in the open-webui volume and model data will be stored in the open-webui-ollama volume.
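If you want to confirm that the volumes exist and see where Docker keeps them on disk, you can inspect them (assuming Docker's default local volume driver):

```shell
# List all volumes, then show the on-disk mountpoints of the two Open WebUI volumes
docker volume ls
docker volume inspect open-webui open-webui-ollama
```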
Set up the initial administrator account for Open WebUI. This is a local account that you will use to access the Open WebUI interface.
You'll then download a language model through Ollama and configure it for use in Open WebUI. This download happens on your DGX Spark device and may take several minutes.
For example, enter gpt-oss:20b in the search field to download that model. You can verify that the setup is working properly by testing model inference through the web interface.
Try downloading different models from the Ollama library at https://ollama.com/library.
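Models can also be pulled from the command line instead of the web interface. A sketch using the Ollama CLI bundled in this image, assuming the container is named open-webui as above (the model name is only an example; substitute any model from the Ollama library):

```shell
# Pull a model with the bundled Ollama CLI inside the running container
docker exec open-webui ollama pull llama3.2:3b

# List the models already downloaded to the open-webui-ollama volume
docker exec open-webui ollama list
```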
You can pair this setup with NVIDIA Sync to monitor GPU and memory usage through the DGX Dashboard as you try different models.
If Open WebUI reports an update is available, you can update the container image by running:
docker pull ghcr.io/open-webui/open-webui:ollama
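Pulling a newer image does not update a running container; you must recreate the container from the new image. A sketch reusing the run options from above (the named volumes preserve your application data and downloaded models across the recreate):

```shell
# Stop and remove the old container (data persists in the named volumes)
docker stop open-webui
docker rm open-webui

# Start a new container from the freshly pulled image
docker run -d -p 8080:8080 --gpus=all \
  -v open-webui:/app/backend/data \
  -v open-webui-ollama:/root/.ollama \
  --name open-webui ghcr.io/open-webui/open-webui:ollama
```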
Follow these steps to completely remove the Open WebUI installation and free up resources.
WARNING
These commands will permanently delete all Open WebUI data and downloaded models.
Stop and remove the Open WebUI container:
docker stop open-webui
docker rm open-webui
Remove the downloaded image:
docker rmi ghcr.io/open-webui/open-webui:ollama
Remove persistent data volumes:
docker volume rm open-webui open-webui-ollama