Drupal Development with Docker

Drupal developers have relied on local development environments like MAMP, WAMP, and Acquia Dev Desktop for years. While these are each effective solutions, they come with their share of problems. Do your PHP settings match your production environment? Are you using the same versions? What if you need to switch versions for different projects? How do you switch systems or on-board new team members quickly?

In this series we introduce Docker, a container runtime that allows you to run pre-packed, sandboxed Linux applications anywhere. We'll start by running a single container on the command line, build up to running Drupal in Docker, cover how to build your own containers, and how to add Docker to your Drupal development workflow.

To get started in this series, we'll assume you're familiar with the basics of using the command line, including how to enter and run commands. (You can review our Command Line Basics series if you need to brush up.) A familiarity with Drupal development practices and Linux is helpful but not required.

Author: Tess Flynn

Tess, also known as socketwench, is a devops engineer who has worked with Drupal for over a decade. She is the author of the Flag module for Drupal 8 and has given sessions at DrupalCons and DrupalCamps throughout the world on Drupal 8 development, Docker, and DevOps.

Tutorials in this course
Drupal 7, 8, 9, and 10
More information

Today's Drupal developer needs more than just a text editor and FTP. Best practice Drupal development involves a suite of tools, processes, and more than one server environment.

This tutorial is directed toward an audience that is not familiar with best practices in Drupal development, including version control with Git, IDEs, local development environments, and deployment environments (e.g. staging, live). Here we provide a high-level overview of these topics, with links to dive deeper if you need more information.

In this tutorial, we'll cover:

  • Introduce Version Control Systems such as Git
  • Discuss how Git can be used to deploy to remote web servers
  • Review programming-centric text editors and Integrated Development Environments
  • Identify the need for a local development environment
  • Explain shared deployment environments, including production and stage
More information

Docker often seems like an impenetrable product. Is it a VM system? A suite of development tools? A clustering product? A software distribution facility? When the answer is "yes" to each of these, it only becomes more confusing. For the Drupal developer, Docker is a way to provide a local development environment to run web server software.

In this tutorial, we'll:

  • Define the terms hypervisor, virtual machine (VM), and containers
  • List the advantages of containers over VMs
  • List the advantages of Docker for Drupal developers
More information

Installing Docker is easy, but there are some details you may want to consider before you download and run the installer.

In this tutorial, we'll focus on:

  • Why Linux is Docker’s native environment
  • The difference between the Docker edge and stable release channels
  • Why Docker for non-Linux requires a VM
More information

Now that we know what Docker is, what containers are, and how to install Docker, just how do we use containers? While graphical user interfaces (GUIs) exist for Docker, the primary way to interact with it is via the command line.

In this tutorial, we'll:

  • Start Docker for Mac or Docker for Windows
  • Use the docker run command to run a container interactively
  • Break down the arguments of the docker run command
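For example, a first interactive container might look like the following — a minimal sketch using the official alpine image:

```bash
# -i keeps STDIN open, -t allocates a pseudo-terminal, and --rm
# removes the container automatically when the shell exits.
docker run -it --rm alpine /bin/sh
```

Exiting the shell stops (and, with --rm, removes) the container.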
More information

Containers are sandboxed applications that can be run anywhere, no matter the underlying host OS. Docker containers are quite different from VMs in a number of ways that need to be understood before we can use them to develop Drupal sites.

In this tutorial, we'll:

  • Define a container
  • Explain how a container is a more limited environment than a VM
More information

When we use docker run to start a container, we download a compressed, ready-to-use container called an image. Images make containers easy to share via a registry like Docker Hub, but also affect how file storage works when using containers.

In this tutorial, we'll:

  • Discuss how file storage works in Docker containers
  • Describe images, base images, and the scratch image
  • Identify layers and show how layers make up Docker's filesystem
More information

Running a container interactively can be useful, but often it's not what we really need. A web server stack is made up of several components such as the Linux OS, the Apache web server, a PHP runtime, and a database such as MySQL. Collectively, we call this a LAMP stack. If we were to run these in Docker with what we now know, we'd have to keep open several terminal windows!

Obviously that's not what we want to do. Instead, we want to run the containers in the background. That way, we can use them like we would any web server. Fortunately, Docker makes running and managing a container in the background easy with just a few commands.

In this tutorial, we'll:

  • Start a container in the background
  • Use docker ps to list running Docker containers
  • Use docker exec to enter a container running in the background
  • Use docker kill to stop a container running in the background
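The background workflow can be sketched with a few commands; the httpd image and the container name here are examples only:

```bash
# Start an Apache httpd container in the background (-d, "detached"),
# giving it a memorable name.
docker run -d --name my-webserver httpd:2.4

# List running containers.
docker ps

# Open a shell inside the running container.
docker exec -it my-webserver /bin/bash

# Stop the background container immediately.
docker kill my-webserver
```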
More information

Often we don't want to run just one container at a time, but a set of containers that act together to provide a unit of functionality. Yet, docker run only starts one container at a time, with one command in each container at a time.

Docker Compose lets us overcome this limitation by allowing us to define a single file that describes multiple containers, their relationship to each other, and utilities to manage that set of containers as a single unit.

In this tutorial, we'll:

  • Introduce Docker Compose
  • Run multiple containers at once using Docker Compose
  • Identify the purpose of docker-compose.yml
  • Learn what resources a set of containers share
More information

Docker Compose allows us to manage several related containers as a single group. We define container sets by creating the Compose file, docker-compose.yml.

In this tutorial, we'll:

  • Create the basic structure of the Compose file
  • Define a container set using off-the-shelf containers
  • Describe where to place it in your project
  • See how directory names are significant in Compose
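A minimal Compose file along these lines might look like the following sketch — the service names and off-the-shelf images are only examples:

```yaml
# docker-compose.yml, placed in the project root. Compose uses the
# enclosing directory's name to namespace the containers it creates.
version: "3"
services:
  web:
    image: httpd:2.4
  db:
    image: mysql:5.7
```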
More information

Once we have the docker-compose.yml file created, we can use it to work with a set of containers. Instead of the docker command, Docker Compose has its own command to work with multiple containers at once: docker-compose.

In this tutorial, we'll:

  • Cover the basic usage of the Compose command
  • Describe how to start, stop, and list running container sets
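The day-to-day commands can be sketched as follows, run from the directory containing docker-compose.yml:

```bash
docker-compose up -d   # create and start the container set in the background
docker-compose ps      # list the containers in this set
docker-compose stop    # stop the set without removing the containers
docker-compose down    # stop and remove the containers
```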
More information

One of the biggest questions when first learning Docker is "How do we get data into and out of containers?" We can use docker-compose exec to interact with them on the command line, but that doesn't fulfill our needs as developers. Docker provides several mechanisms to share data with the container, each with specific purposes. Docker Compose lets us leverage each of those easily with just a few lines of YAML.

In this tutorial, we'll:

  • Identify the various ways we can get data into containers
  • Define volumes
  • Describe how to use environment variables in Docker
  • Describe how to expose network ports from a container set
More information

Bind volumes are essential to the Drupal developer when using Docker. They allow you to synchronize a directory on your laptop or workstation with a directory in the container. Changes can be replicated in either direction: from the container to the host OS, or from the host OS to the container. You can add a bind volume to a Compose file with just a single line of code.

In this tutorial, we'll:

  • Describe how to use the volumes key in your Compose file
  • Outline best practices for describing mount points
  • Introduce different synchronization strategies for volumes and which to use
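A bind volume might be declared like this — the host path ./src and the Apache document root shown here are assumptions for illustration:

```yaml
services:
  web:
    image: httpd:2.4
    volumes:
      # host_path:container_path — mount ./src from the project
      # directory at the container's document root.
      - ./src:/usr/local/apache2/htdocs
```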
More information

Docker's goal is to treat containers as reusable, off-the-shelf pieces of infrastructure. Often, however, we need to tailor a container to our specific needs. We may need to enable debugging facilities, enable key configuration options, create databases, and set logins. Many development-oriented containers rely on environment variables to configure containers at runtime.

In this tutorial, we'll:

  • Set environment variables using a static value in the Compose file
  • Use an environment file to pass multiple variables at once
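Both approaches might look like this in a Compose file; the variable values and the mysql.env filename are examples only:

```yaml
services:
  db:
    image: mysql:5.7
    environment:
      # A static value set directly in the Compose file.
      MYSQL_ROOT_PASSWORD: root
    env_file:
      # An environment file passing multiple variables at once,
      # one KEY=value pair per line.
      - ./mysql.env
```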
More information

Network communication is essential when developing for a multi-tier web application like Drupal. Docker automatically isolates each container it runs, only allowing explicit ports to be exposed to the host OS. Docker Compose takes this one step further.

In this tutorial, we'll:

  • Explain how Docker isolates containers
  • Expose a container's ports to the host OS
  • Re-map ports from the container to the host OS
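A port mapping takes the form "host:container". For example, a sketch that remaps a web container's port 80 to port 8080 on the host OS:

```yaml
services:
  web:
    image: httpd:2.4
    ports:
      # Browse to http://localhost:8080 on the host to reach
      # port 80 inside the container.
      - "8080:80"
```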
More information

A common task when developing a Drupal site is loading the database into your local development environment. When working with non-Docker local development environments, command line tools or a graphical application are used to load the database dump. These methods also work for Docker with a bit of container configuration. With Docker, you can include all the tooling in containers, reducing the need for utilities on the host OS.

In this tutorial, we'll:

  • Outline the challenges for loading a database into a container
  • Identify the methods by which a database can be loaded into a container
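One common method is to stream the dump through docker-compose exec — a sketch, assuming a service named db and example credentials:

```bash
# -T disables pseudo-terminal allocation so stdin redirection works.
docker-compose exec -T db mysql -u drupal -pdrupal drupal < dump.sql
```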
More information

One of Docker's goals is to make it as simple to deploy and update infrastructure as it is to pull a product off of a shelf. At the center of this goal is Docker Hub, a massive, public, and free-to-use library of Docker images for you to use. As a free service, additional care is required to select images that will provide you with updated, secure, and well-maintained infrastructure.

In this tutorial, we'll:

  • Show how Hub is an integral part of using Docker
  • Describe registries and private registries
  • Identify official images vs. contributed images
  • Outline best practices for selecting an image on Hub
More information

When we issue docker run, Docker will attempt to download any images it doesn't have cached locally. Locally cached images take up disk space and are not automatically managed. Furthermore, once downloaded, Docker never updates them for you. You can list, update, or delete locally cached images using the docker command.

In this tutorial, we'll:

  • Show how to list all images currently stored on your host OS
  • Identify the disk space occupied by cached images
  • Learn to explicitly download and update a locally cached image
  • Learn to delete images on your system
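These tasks map onto a handful of commands; the mysql image here is only an example:

```bash
docker images          # list locally cached images and their sizes
docker pull mysql:5.7  # explicitly download (or update) an image
docker rmi mysql:5.7   # delete a locally cached image
```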
More information

Often we need a particular version or variation of software in order to support our project. Our site might require Apache Solr 4.x, whereas the same project could be perfectly fine with the latest version of MySQL. Since image names need to be unique on Docker Hub, it'd be inconvenient to require a separate image name for each version of a particular piece of software. Docker solves this problem by using tags.

In this tutorial, we'll:

  • Define tags
  • Describe how tags are used
  • Explain the latest tag
  • Show how to find an image's tags on Docker Hub
  • Use a tag with docker commands, and in the Compose file
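The tag follows the image name after a colon; omitting it implies latest. For example, on the command line and in a Compose file:

```bash
# Pull a specific variant of the official PHP image.
docker pull php:7.2-apache
```

```yaml
services:
  web:
    image: php:7.2-apache
```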
More information

Creating a container set to support Drupal development requires some specialized knowledge. Now that we understand containers, images, Docker Compose (docker-compose), and how to select images on Docker Hub, we're ready to build such a set.

In this tutorial, we'll:

  • Select images of software that we'll need to run Drupal
  • Create a new Compose file
  • Configure bind volumes and environment variables to support the site
  • Test the configuration

See Dockerize an Existing Project if you already have Drupal installed.

More information

One of the primary goals of Docker is to make it as easy to try out and deploy technical infrastructure as it is to pull an item off of a store shelf. But how are these containers built in the first place? A Docker image is built from a Dockerfile, a kind of container source code. The Dockerfile describes how to build and configure a single container.

In this tutorial, we'll:

  • Introduce Dockerfiles and see how Docker uses them to build container images
  • Outline the general structure of a Dockerfile
  • Describe how to build a new image from a Dockerfile
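The general shape is a Dockerfile plus one build command — a sketch, with an example base image and paths:

```dockerfile
# Dockerfile — each directive adds a layer on top of the base image.
FROM httpd:2.4
COPY ./src/ /usr/local/apache2/htdocs/
```

```bash
# Build an image from the Dockerfile in the current directory,
# naming (tagging) it "my-webserver".
docker build -t my-webserver .
```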
More information

Creating a custom image only requires a Dockerfile with a FROM directive, but since this only renames the image, how do we actually change it? When building a custom image, we often need to add files. Whether they are config files, scripts, compressed archives, or even application binaries, Docker makes it easy to add a file to an image with just one directive.

In this tutorial, we'll:

  • Describe how to position files relative to your Dockerfile
  • Use the COPY directive to add local files
  • Download remote files using the ADD directive
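For example — the config filename and download URL here are hypothetical:

```dockerfile
FROM httpd:2.4

# COPY adds local files, resolved relative to the Dockerfile's directory.
COPY httpd.conf /usr/local/apache2/conf/httpd.conf

# ADD can also fetch remote files by URL at build time.
ADD https://example.com/scripts/healthcheck.sh /usr/local/bin/healthcheck.sh
```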
More information

The COPY and ADD directives make it easy to add configuration files or download archives to a container image. While we could install applications into a container using only those directives, it would be difficult and complex. Making matters worse, Docker provides no INSTALL directive. Instead, Docker provides a more general mechanism.

In this tutorial, we'll:

  • Introduce the RUN directive and how to run commands during a docker build
  • Use package managers to install applications
  • Describe best practices for installing software in images
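A common best-practice pattern is to chain the package manager's update, install, and cleanup steps into a single RUN, so they share one layer and no stale package cache is baked into the image. A sketch on a Debian base:

```dockerfile
FROM debian:stretch
RUN apt-get update \
 && apt-get install -y --no-install-recommends curl \
 && rm -rf /var/lib/apt/lists/*
```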
More information

Installation is only one part of setting up a custom Docker image. With few exceptions, we'll want to configure the application to better suit our use case. Docker does not provide a standardized way for applications to be configured. Instead, we rely on the same techniques as we would when configuring the application on a bare-metal server.

In this tutorial, we'll:

  • Extract default configuration files from a Docker image
  • Give strategies for adding configuration files to the image
  • Outline the complexities of using configuration commands in a Dockerfile
More information

The goal of Docker containers is to let you select pieces of technical infrastructure as if you were pulling items off of a shelf. Toward that end, each container image can be configured with a default application to invoke when started with a docker run or as part of a container set with docker-compose up.

In this tutorial, we'll:

  • Describe the difference between the build-time and runtime environment of a container
  • Use the CMD directive to specify a default command to execute
  • Introduce the ENTRYPOINT directive and set a default shell in which to run your CMD
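In a Dockerfile this might look like the following sketch, using the official httpd image's startup command as the example:

```dockerfile
FROM httpd:2.4

# CMD sets the default command run at container start; it can be
# overridden on the docker run command line.
CMD ["httpd-foreground"]

# ENTRYPOINT could instead fix the executable, with CMD supplying
# its default arguments:
# ENTRYPOINT ["httpd-foreground"]
```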
More information

The ENTRYPOINT directive allows us to specify a default shell to use in our custom image, but it can do more than that. Often, we will want to dynamically configure a container on startup by passing it environment variables or using Docker Secrets. By replacing the ENTRYPOINT with a custom script, we can perform this dynamic configuration prior to executing the default application.

In this tutorial, we'll:

  • Introduce custom entrypoint scripts
  • Describe several strategies for performing dynamic configuration in a container
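A custom entrypoint script might look like this sketch — the filename and the APACHE_SERVER_NAME variable are assumptions for illustration:

```bash
#!/bin/sh
# docker-entrypoint.sh — perform dynamic, environment-driven
# configuration, then hand off to the image's default command.
set -e

# Write a setting passed in via an environment variable.
if [ -n "$APACHE_SERVER_NAME" ]; then
  echo "ServerName $APACHE_SERVER_NAME" >> /usr/local/apache2/conf/httpd.conf
fi

# exec replaces this script with the CMD so signals reach the
# real application process.
exec "$@"
```

The Dockerfile would then set this script as the ENTRYPOINT, leaving CMD as the default command it executes.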
More information

Often you'll find an image on Docker Hub that almost fits your requirements. For Drupal developers, the off-the-shelf containers for PHP frequently just aren't enough. Drupal often requires additional PHP extensions such as mbstring or gd. We may also want to use a slightly different configuration, or bake in a utility like Drush or Drupal Console. Fortunately, Docker makes modifying existing containers easy with a Dockerfile.

In this tutorial, we'll:

  • List the reasons why you might modify an existing image
  • Describe the general process by which an existing image is modified
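For instance, a sketch that extends the official PHP image with extensions Drupal needs — docker-php-ext-install is a helper shipped in the official image, and the library prerequisites shown are an assumption that may vary by image version:

```dockerfile
FROM php:7.2-apache
RUN apt-get update \
 && apt-get install -y --no-install-recommends libpng-dev \
 && rm -rf /var/lib/apt/lists/* \
 && docker-php-ext-install gd pdo_mysql
```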
More information

When writing containers for a local development environment, security is often a lesser concern. This is fine as long as we never intend to put the containers we create in a production environment.

When we do want to make production-ready containers, however, our priorities change. While Docker tries to be secure by default, it can't protect us from badly configured or vulnerable applications. For that, we need to design our images to be more secure.

In this tutorial, we'll:

  • Outline the best practices for writing secure container images
  • Introduce the USER directive
  • Set file ownership using COPY and ADD
  • Use the RUN directive to set file permissions
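Put together, these directives might look like the following sketch; the paths and the www-data user are examples:

```dockerfile
FROM httpd:2.4

# --chown sets file ownership at copy time, avoiding an extra layer.
COPY --chown=www-data:www-data ./src/ /usr/local/apache2/htdocs/

# Tighten permissions with RUN.
RUN chmod -R a-w /usr/local/apache2/htdocs

# Run the container's process as a non-root user from here on.
USER www-data
```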
More information

Getting Drupal to run in Docker requires a lot of moving parts. After installing Docker and Docker Compose, we need to select a collection of containers from Docker Hub and create a new docker-compose.yml file. Once we have environment variables and volumes configured, this only gives us the capability of running a Drupal site in Docker.

What if we already have a Drupal site we want to develop using Docker? In this tutorial, we'll show how to modify an existing project to minimize the setup time necessary for switching to a Docker-based environment.

In this tutorial, we'll:

  • Describe the best practices for project organization.
  • Use an environment file to configure containers.
  • Add a Docker-specific settings.php file.
More information

Dockerizing a project helps to simplify setting up new developer workstations and on-boarding new team members. All the pieces of infrastructure necessary to get started are in the Compose file. Yet, it's not as easy as it could be.

We still need to create a settings.local.php file with all the necessary database connection information and any setting overrides. In this tutorial, we'll move those out of the local settings file and into a Docker-specific settings file that ships with all that information pre-configured out of the box.

In this tutorial, we'll:

  • Explain the motivation behind having a Docker-specific settings file.
  • Describe how to modify settings.php to detect when it's in a Docker environment.
  • Create a Docker-specific settings file with everything preconfigured.
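The detection step might be sketched like this in settings.php — the DOCKER_ENVIRONMENT marker variable (which you would set in the Compose file) is an assumption for illustration:

```php
<?php
// In settings.php: include the Docker-specific settings file only
// when the marker environment variable is present.
if (getenv('DOCKER_ENVIRONMENT')) {
  include $app_root . '/' . $site_path . '/settings.docker.php';
}
```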
More information

One of the problems with Dockerizing an existing project is that configuration information tends to proliferate everywhere. Not only do we have settings in docker-compose.yml, but also in our Docker-specific settings file, settings.docker.php. If we change a setting in one place, we need to copy and paste it everywhere else.

Fortunately, there's a way to centralize Docker configuration for our project by using an environment file.

In this tutorial, we'll:

  • Review what an environment file is and its format.
  • List the advantages of moving Docker configuration to an environment file.
  • Describe the .env file, and how it provides us further advantages.
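For example — the variable names and values here are illustrative:

```ini
# .env — read automatically by docker-compose from the project root.
MYSQL_USER=drupal
MYSQL_PASSWORD=drupal
```

```yaml
# docker-compose.yml can then substitute those values:
services:
  db:
    image: mysql:5.7
    environment:
      MYSQL_USER: ${MYSQL_USER}
      MYSQL_PASSWORD: ${MYSQL_PASSWORD}
```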
More information

Throughout this series we've been focused on working with a single set of containers and a single site. For most Drupal developers, however, we're expected to work with multiple client sites, sometimes several different ones in the same day.

When we add Docker into the mix, it can seem overwhelmingly complicated when you're used to working with other tools. Fortunately, there are several simple practices that not only work well with Docker, but also support your workflow.

In this tutorial, we'll:

  • Outline the best practices when building a local development environment in Docker.
  • Compare the differences in workflow when using Docker compared to other local development environments.
  • Discuss various strategies to reduce resource use on team workstations.
More information

A multisite Drupal installation allows you to host multiple, separate websites while relying on the same set of code. Large organizations often rely on a multisite installation to cut down on the operational upkeep of multiple sites. Hosting a multisite in Docker poses several additional challenges. Fortunately, the process is not dissimilar from configuring a bare-metal server to run a multisite.

In this tutorial, we'll:

  • Outline a multisite's additional requirements for Docker containers.
  • Configure alternate, local domain names to resolve each site.
  • Learn how to configure a multisite to use alternate domain names.
More information

Today's Drupal sites often rely upon external services in order to serve their visitors. When hosting such a site in Docker, we need to take special care to allow the site to access these essential components.

In this tutorial, we'll:

  • Outline the steps necessary for troubleshooting connectivity issues
  • Discuss steps to protect API keys and other sensitive pieces of information
  • Touch upon your options if you require external libraries or utilities
More information

One of the key advantages of Docker is that it makes it much easier to share your containers with your team members. For most of this series, we've been relying on containers hosted on Docker Hub. When we need to create a custom container for our team, we want to leverage that same sharing infrastructure.

In this tutorial, we'll:

  • Describe the various methods of sharing containers.
  • Outline the advantages of using Docker Hub.
  • Briefly describe why and when you should use a private container image registry.
More information

For many container images on Docker Hub, the preferred approach is to create an automatic build. An automatic build integrates Hub with a public Git repository, providing you an effective, open, and best-practice approach to sharing your containers with your team, and with the world.

In this tutorial, we'll:

  • List the advantages of creating an automatic build compared to other approaches on Docker Hub.
  • Describe the process of creating an automatic build.
  • Outline how to organize your git repository for your images.
  • Learn how to configure and trigger the build on Hub.
More information

Often it's useful for a container image to provide several variants of itself under the same name on Docker Hub. Docker uses tags to identify these variants. You can configure your own tags as part of your automatic build on Docker Hub.

In this tutorial, we'll:

  • Outline the uses for tags
  • Discuss the best practices for tag names
  • Learn how to add tags to your automatic build on Docker Hub
More information

Docker Hub provides a free, easy-to-use way of distributing your container images. However, there are situations where sharing your container images is either not ideal, bad practice, or against legal requirements. In those cases, you will want to use a private registry instead.

In this tutorial, we'll:

  • Describe what a private registry is.
  • Learn how to run Docker's own registry image.
  • Learn how to push and pull images from the self-hosted registry.
  • Outline the production concerns for the registry image.
  • List other options for self-hosting your own images.
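The core workflow can be sketched in a few commands, using Docker's own registry image and an example image name:

```bash
# Run the registry locally on port 5000.
docker run -d -p 5000:5000 --name registry registry:2

# Tag an image with the registry's host name, then push it.
docker tag my-webserver localhost:5000/my-webserver
docker push localhost:5000/my-webserver

# Pull it back from the self-hosted registry.
docker pull localhost:5000/my-webserver
```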
More information

Docker provides numerous advantages for us as Drupal developers. It simplifies the management of infrastructure for our projects while allowing customization to suit each project's needs. Running Docker in production also brings a number of advantages, but it also creates new concerns.

In this tutorial, we'll:

  • Outline the advantages Docker brings to the production environment.
  • Highlight the concerns when planning a production deployment of Docker.
  • Describe container orchestration and list several container orchestration platforms to choose from.