Having reviewed the fundamental Docker components and their relevance to machine learning, let's gain some practical experience. One of the significant advantages of Docker is the availability of pre-built images on registries like Docker Hub. These images often contain curated environments with common ML frameworks and libraries already installed, saving considerable setup time and ensuring consistency.
In this hands-on exercise, we will pull and run a popular pre-built image designed for machine learning tasks. We'll use an official TensorFlow image that includes Jupyter Notebook, providing an interactive environment ready for experimentation.
First, we need to download the image from Docker Hub to our local machine. Open your terminal or command prompt and use the `docker pull` command. We'll specify the image name and a tag. Tags often indicate versions or specific configurations (such as CPU vs. GPU builds, or images that include development tools).

```bash
docker pull tensorflow/tensorflow:latest-jupyter
```
This command instructs Docker to:

- Pull the image from the `tensorflow/tensorflow` repository on Docker Hub.
- Use the `latest-jupyter` tag. This specific tag provides a recent TensorFlow build along with a ready-to-use Jupyter Notebook server.

You will see Docker downloading the image layers. The time this takes depends on your internet connection and the image size.
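Once the download completes, you can check that the image is stored locally. This is an optional verification step; the image ID and size reported on your machine will differ:

```bash
# List local images belonging to the repository we just pulled
docker image ls tensorflow/tensorflow
```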
Once the image is downloaded, we can create and start a container from it using `docker run`. We will use several options:

- `-it`: Runs the container in interactive mode and allocates a pseudo-TTY, allowing us to interact with it.
- `--rm`: Automatically removes the container when it exits. This is useful for temporary experiments.
- `-p 8888:8888`: Maps port 8888 on your host machine to port 8888 inside the container. This is the default port used by Jupyter Notebook.
- `tensorflow/tensorflow:latest-jupyter`: Specifies the image to use.

Execute the following command:

```bash
docker run -it --rm -p 8888:8888 tensorflow/tensorflow:latest-jupyter
```
After running the command, the container will start, and the Jupyter Notebook server will launch inside it. You should see output in your terminal similar to this (the token will be different):
```
[...]
To access the notebook, open this file in a browser:
    file:///root/.local/share/jupyter/runtime/nbserver-6-open.html
Or copy and paste one of these URLs:
    http://127.0.0.1:8888/?token=a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4
 or http://<container_hostname>:8888/?token=a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4
[...]
```
Copy the URL containing `127.0.0.1` or `localhost` (including the `token` part) and paste it into your web browser.
You should now see the Jupyter Notebook interface running inside the Docker container.
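If you want to double-check from the host side, open a second terminal and list the running containers. This is purely optional; the container ID and the auto-generated name will be different on your machine:

```bash
# Show running containers along with their images and port mappings
docker ps
```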
Within the Jupyter interface, create a new notebook and run the following cell to confirm that TensorFlow is available:

```python
import tensorflow as tf
print(tf.__version__)
```
You now have a fully functional TensorFlow environment running inside an isolated container, accessible through your browser, without needing to install Python, TensorFlow, or Jupyter directly on your host machine.
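One caveat: we started the container with `--rm`, so anything saved inside it, including notebooks, is lost when the container is removed. A common way to keep your work is to bind-mount a host directory into the container. The sketch below assumes the Jupyter server in this image serves notebooks from `/tf`, which is the working directory used by recent `tensorflow/tensorflow` Jupyter images; adjust the container path if your image is set up differently:

```bash
# Mount the current host directory into the container's notebook directory
# (/tf/notebooks is an assumed path; verify it against the image documentation)
docker run -it --rm -p 8888:8888 \
  -v "$(pwd)":/tf/notebooks \
  tensorflow/tensorflow:latest-jupyter
```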
To stop the Jupyter server and the container, go back to the terminal where you ran the `docker run` command and press `Ctrl+C` twice.
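If that terminal is no longer available, you can also stop the container from any other terminal using its ID or name (as reported by `docker ps`); the value below is a placeholder:

```bash
# Stop the container gracefully (SIGTERM first, SIGKILL after a timeout)
docker stop <container_id_or_name>
```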
Because we used the `--rm` flag, Docker will automatically remove the container after it stops. If you had omitted `--rm`, the container would still exist in a stopped state. You could list it using `docker ps -a` and remove it manually using `docker rm <container_id_or_name>`.
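For reference, the cleanup sequence for that case looks like this; the container identifier is a placeholder, and `docker container prune` is an additional command (not mentioned above) that removes all stopped containers after asking for confirmation:

```bash
# List every container, including stopped ones
docker ps -a

# Remove a single stopped container by ID or name
docker rm <container_id_or_name>

# Alternatively, remove all stopped containers in one go
docker container prune
```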
Docker Hub hosts a vast collection of images. You can search for other ML-related images directly on the Docker Hub website. Many organizations provide official images for their tools (e.g., `pytorch/pytorch`, `nvidia/cuda`). There are also community-contributed images for various data science stacks. When choosing an image, pay attention to the publisher, download count, and recent updates to assess its reliability.
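The pull-and-run workflow from this exercise applies to those images as well. As an illustration, the commands below search Docker Hub from the command line and pull the official PyTorch image; available tags change over time, so check the image's Docker Hub page for the ones that match your needs (for example, CUDA-enabled variants):

```bash
# Search Docker Hub for images matching a keyword
docker search pytorch

# Pull the official PyTorch image (pick a specific tag for reproducibility)
docker pull pytorch/pytorch:latest
```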
This exercise demonstrates the convenience of using pre-built Docker images. By pulling an existing image, you obtained a standardized, ready-to-use ML environment quickly. This ability to leverage pre-configured environments is a significant time-saver and helps ensure that you and your collaborators are working with the same setup, reducing the common "it works on my machine" problem. In the next chapter, we will learn how to build our own custom images tailored to specific project requirements.