There are a few Dockerfiles that build from the nvidia/cuda image on Docker Hub.
These examples are built from the Ubuntu 20.04 NVIDIA CUDA image (specifically, nvidia/cuda:11.7.1-devel-ubuntu20.04) using the cuda-samples repo.
Dockerfile.base provides the common base image for all of the samples.
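For reference, a base Dockerfile along these lines (a sketch only; the actual Dockerfile.base may differ) would start from the CUDA devel image and fetch the cuda-samples sources:

```dockerfile
# Sketch only, assuming a layout like the actual Dockerfile.base; check the file itself.
FROM nvidia/cuda:11.7.1-devel-ubuntu20.04

ARG DEBIAN_FRONTEND=noninteractive

# Tools needed to fetch and build the CUDA samples.
RUN apt-get update && \
    apt-get install -y --no-install-recommends git build-essential && \
    rm -rf /var/lib/apt/lists/*

# Fetch the cuda-samples sources; the v11.7 tag is assumed to match the toolkit version.
RUN git clone --depth 1 --branch v11.7 https://github.com/NVIDIA/cuda-samples.git /opt/cuda-samples

WORKDIR /opt/cuda-samples
```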
Pre-built images:
- https://hub.docker.com/r/openiss/openiss-cuda-base/tags
- https://hub.docker.com/r/openiss/openiss-cuda-devicequery/tags
- https://hub.docker.com/r/openiss/openiss-cuda-simplegl/tags
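If you prefer not to build locally, the images listed above can be pulled directly, e.g.:

```bash
docker pull openiss/openiss-cuda-devicequery
```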
This will build the deviceQuery program from cuda-samples:
docker build -t device-query -f Dockerfile.deviceQuery .
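As a rough idea of what Dockerfile.deviceQuery does (sketch only; the base image name and sample path are assumptions), it builds just that sample on top of the base image and runs it by default:

```dockerfile
# Sketch only; base image name and sample path are assumptions.
FROM openiss/openiss-cuda-base:latest

# Build just the deviceQuery sample from the cuda-samples checkout.
WORKDIR /opt/cuda-samples/Samples/1_Utilities/deviceQuery
RUN make

# Run the sample when the container starts.
CMD ["./deviceQuery"]
```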
To run it, you need to use the --gpus flag or else it will fail:
docker run --name device-query --gpus all --rm device-query:latest
Warning: This will not work with WSL because it is not supported yet. See: https://docs.nvidia.com/cuda/wsl-user-guide/index.html#features-not-yet-supported
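If the run fails with a CUDA/driver error, it can help to first confirm that Docker's GPU support works at all, for example (image tag is only an example):

```bash
docker run --gpus all --rm nvidia/cuda:11.7.1-base-ubuntu20.04 nvidia-smi
```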
This will build the simpleGL sample:
docker build -t simple-gl -f Dockerfile.simpleGL .
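Unlike deviceQuery, simpleGL links against OpenGL, so its Dockerfile also needs the GL development packages. A sketch (base image name, package list, and sample path are assumptions):

```dockerfile
# Sketch only; base image name, package list, and sample path are assumptions.
FROM openiss/openiss-cuda-base:latest

# OpenGL/GLUT development libraries that simpleGL builds and links against.
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        freeglut3-dev libglu1-mesa-dev libgl1-mesa-dev libxi-dev libxmu-dev && \
    rm -rf /var/lib/apt/lists/*

# Build just the simpleGL sample.
WORKDIR /opt/cuda-samples/Samples/2_Concepts_and_Techniques/simpleGL
RUN make

CMD ["./simpleGL"]
```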
To run it, you need to use the --gpus flag and have an X server running (see the OpenGL examples):
docker run --name simple-gl --gpus all --rm -e DISPLAY=<your_ipv4_address>:0.0 simple-gl:latest
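For example, on a Linux host with a local X server, one common setup (not the only one) is to pass the host's X socket and DISPLAY instead of an IP address:

```bash
# Allow local containers to connect to the X server (undo later with "xhost -local:").
xhost +local:

# Reuse the host's DISPLAY and X socket instead of an IP address.
docker run --name simple-gl --gpus all --rm \
    -e DISPLAY=$DISPLAY \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    simple-gl:latest
```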