
Today's Drupal sites often rely on external services to serve their visitors. When hosting such a site in Docker, we need to take special care to ensure the site can still reach these essential components.
In this tutorial, we'll:
- Outline the steps necessary for troubleshooting connectivity issues
- Discuss steps to protect API keys and other sensitive pieces of information
- Touch upon your options if you require external libraries or utilities
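For example, a common way to protect an API key is to keep it out of the image entirely and hand it to the container at runtime. The following is a minimal sketch; the variable name, file, and image are placeholders rather than part of this tutorial's setup.

```bash
# Keep the secret in a git-ignored env file instead of baking it into the image.
echo "PAYMENT_API_KEY=replace-with-real-key" > .env

# Pass it to the Drupal container as an environment variable at runtime.
docker run -d --name drupal --env-file .env -p 8080:80 drupal:9-apache
```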
One of the key advantages of Docker is that it makes it much easier to share your containers with your team members. For most of this series, we've been relying on containers hosted on Docker Hub. When we need to create a custom container for our team, we want to leverage that same sharing infrastructure.
In this tutorial, we'll:
- Describe the various methods of sharing containers.
- Outline the advantages of using Docker Hub.
- Briefly describe why and when you should use a private container image registry.
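To make the contrast concrete, here is a hedged sketch of two common ways to share an image: handing off a tarball directly, or pushing to Docker Hub. The image and account names are placeholders.

```bash
# Option 1: share the image as a file, with no registry involved.
docker save my-drupal-image:latest -o my-drupal-image.tar
# A teammate imports it with: docker load -i my-drupal-image.tar

# Option 2: tag the image for your Docker Hub account and push it.
docker tag my-drupal-image:latest myaccount/my-drupal-image:latest
docker push myaccount/my-drupal-image:latest
```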
For many container images on Docker Hub, the preferred approach is to create an automatic build. An automatic build integrates Hub with a public Git repository, providing an effective, open, and best-practice way to share your containers with your team, and with the world.
In this tutorial, we'll:
- List the advantages of creating an automatic build compared to other approaches on Docker Hub.
- Describe the process of creating an automatic build.
- Outline how to organize your git repository for your images.
- Learn how to configure and trigger the build on Hub.
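As a rough sketch of the repository side of an automatic build (the names and Dockerfile contents are illustrative; the build rules themselves are configured in the Docker Hub UI):

```bash
# One common layout: the Dockerfile lives at the root of a dedicated Git repository,
# and Docker Hub's build rules map branches or tags in that repository to image tags.
mkdir drupal-image && cd drupal-image
cat <<'EOF' > Dockerfile
# An example customization of the official Drupal image.
FROM drupal:9-apache
RUN pecl install uploadprogress && docker-php-ext-enable uploadprogress
EOF
git init && git add Dockerfile && git commit -m "Add Dockerfile for automated build"
```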
Often it's useful for a container image to provide several variants of itself under the same name on Docker Hub. Docker uses tags to identify these variants. You can configure your own tags as part of your automatic build on Docker Hub.
In this tutorial, we'll:
- Outline the uses for tags
- Discuss the best practices for tag names
- Learn how to add tags to your automatic build on Docker Hub
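For illustration, here is what the same variants look like when built and pushed by hand; an automatic build achieves the same result through build rules that map branches or Git tags to image tags. The names and versions are examples.

```bash
# Build one image under two tags: a specific version and a moving "latest".
docker build -t myaccount/drupal-site:1.2.0 -t myaccount/drupal-site:latest .

# Both tags are pushed under the same image name on Docker Hub.
docker push myaccount/drupal-site:1.2.0
docker push myaccount/drupal-site:latest
```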
Docker Hub provides a free, easy-to-use way of distributing your container images. However, there are situations where sharing your container images publicly is not ideal, is bad practice, or is prohibited by legal requirements. In those cases, you will want to use a private registry instead.
In this tutorial, we'll:
- Describe what a private registry is.
- Learn how to run Docker's own registry image.
- Learn how to push and pull images from the self-hosted registry.
- Outline the production concerns for the registry image.
- List other options for self-hosting your own images.
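As a preview, here is a minimal sketch of running Docker's registry image and pushing to it. It skips TLS and authentication, which a production registry would need; the image name is a placeholder.

```bash
# Start Docker's own registry image on port 5000 (local testing only; no TLS, no auth).
docker run -d -p 5000:5000 --name registry registry:2

# Tag an existing image for the self-hosted registry, then push and pull it.
docker tag my-drupal-image:latest localhost:5000/my-drupal-image:latest
docker push localhost:5000/my-drupal-image:latest
docker pull localhost:5000/my-drupal-image:latest
```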
Docker provides numerous advantages for us as Drupal developers. It simplifies the management of infrastructure for our projects while allowing customization to suit each project's needs. Running Docker in production brings a number of advantages as well, but it also creates new concerns.
In this tutorial, we'll:
- Outline the advantages Docker brings to the production environment.
- Highlight the concerns when planning a production deployment of Docker.
- Describe container orchestration and list several container orchestration platforms to choose from.
As new major versions of Drupal are released, contributed modules need to be updated for compatibility. As of right now (October 2021) there are a lot of contributed modules with a Drupal 8 release and a patch in the queue to make them work with Drupal 9. Without an official Drupal 9 compatible release, however, such a module can't be installed with Composer. This creates a circular problem: you can't composer require the module without patching it, but you can't patch it until Composer has downloaded it.
To help solve this common issue, Drupal.org provides a lenient Composer endpoint that publishes all modules as compatible with Drupal 9, regardless of whether that's actually true. By using it, you can composer require the module and then use cweagans/composer-patches to apply any necessary patches.
In this tutorial we'll:
- Add the lenient Composer endpoint to our project's composer.json file
- composer require a non-Drupal 9 compatible module
- Use Composer to download and apply a patch that makes the module Drupal 9 compatible
By the end of this tutorial you should be able to use contributed modules that require a patch to be compatible with Drupal 9.
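As a rough sketch of the workflow (the module name and patch are placeholders, and the endpoint URL should be checked against Drupal.org's current documentation):

```bash
# Register the lenient endpoint alongside the standard Drupal.org package repository.
composer config repositories.lenient composer https://packages.drupal.org/lenient

# Install the patching plugin, then require the not-yet-compatible module.
composer require cweagans/composer-patches
composer require drupal/some_module

# Finally, list the Drupal 9 compatibility patch from the module's issue queue under
# "extra" > "patches" > "drupal/some_module" in composer.json, then re-run:
composer update drupal/some_module
```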
Upgrade to Drupal 10
There’s no one-size-fits-all path to upgrade from Drupal 9 to Drupal 10, but there is a set of common tasks that everyone will need to complete.
In this tutorial we’ll:
- Explain the differences between Drupal 9 and Drupal 10 that affect the upgrade path.
- Walk through the high-level steps required to upgrade from Drupal 9 to Drupal 10.
- Provide resources to help you create an upgrade checklist and start checking items off the list.
By the end of this tutorial you should be able to explain the major differences between Drupal 9 and 10, audit your existing Drupal 9 projects for Drupal 10 readiness, estimate the level of effort involved, and start the process of upgrading.
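As a hedged sketch of the kinds of commands involved (exact constraints depend on your project, and the Upgrade Status module is one common way to audit readiness):

```bash
# Audit contributed and custom code for Drupal 10 readiness.
composer require --dev drupal/upgrade_status
drush en upgrade_status
# Review the report at /admin/reports/upgrade-status and resolve what it flags.

# When the project is ready, update core and run database updates.
composer require 'drupal/core-recommended:^10' 'drupal/core-composer-scaffold:^10' --update-with-all-dependencies
drush updatedb
```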
Upgrade to Drupal 9
There’s no one-size-fits-all path to upgrade from Drupal 8 to Drupal 9, but there is a set of common tasks that everyone will need to complete.
In this tutorial we’ll:
- Explain the differences between Drupal 8 and Drupal 9 that affect the upgrade path.
- Walk through the high-level steps required to upgrade from Drupal 8 to Drupal 9.
- Provide resources to help you create an upgrade checklist and start checking items off the list.
By the end of this tutorial you should be able to explain the major differences between Drupal 8 and 9, audit your existing Drupal 8 projects for Drupal 9 readiness, estimate the level of effort involved, and start the process of upgrading.
Local task links are the tabs you see when logged in as an administrator viewing a node on a Drupal site. In this tutorial we'll take a look at how local tasks are added within a custom module. We'll also see how to alter local tasks provided by other modules via hook_menu_local_tasks_alter().
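As a hedged sketch of the declarative side (the module and route names are hypothetical), a module defines its tabs in a *.links.task.yml file; altering tabs added by other modules then happens in a hook_menu_local_tasks_alter() implementation in the module's .module file.

```bash
# Define two tabs for a hypothetical module's settings pages; the default tab's
# route_name matches the base_route the tabs are grouped under.
mkdir -p modules/custom/mymodule
cat <<'EOF' > modules/custom/mymodule/mymodule.links.task.yml
mymodule.settings_tab:
  route_name: mymodule.settings
  base_route: mymodule.settings
  title: 'Settings'
mymodule.advanced_tab:
  route_name: mymodule.advanced
  base_route: mymodule.settings
  title: 'Advanced'
  weight: 10
EOF
```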
Why Solr?
Drupal has long provided a built-in search mechanism, so why do we need anything more? In this tutorial, we introduce Apache Solr, a free and open source search service that has several advantages and features beyond Drupal’s built-in search.
In this tutorial, we'll:
- Define Apache Solr
- Identify Apache Lucene, the search library that Solr is built on
- List key features of Solr
- Identify the advantages of Solr compared to Drupal search
Apache Solr is not a Drupal module, but a server application like Varnish or MySQL. Before we can use Solr with Drupal, we must plan how we will deploy Solr to our production site.
In this tutorial, we'll:
- List the requirements for Solr installation
- Identify when to install Solr on new hardware
- Describe various installation methods
By the end of this tutorial you should be able to describe a typical Solr install, and begin to list out the various things you'll need to do to install Solr for your environments.
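For reference, a minimal sketch of a standalone install on a Linux host; the version number is illustrative, and the current release is available from solr.apache.org.

```bash
# Unpack a Solr release downloaded from solr.apache.org (version is an example).
tar xzf solr-8.11.2.tgz
cd solr-8.11.2

# Start Solr on the default port (8983) and confirm it is running.
bin/solr start
bin/solr status
```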
Use Solr Locally
Just as you would for Drupal, you should always test your search configuration prior to deploying it to production. In this tutorial, we examine the various ways to set up Apache Solr locally on your system. Then we'll walk through setting up DDEV with Solr for local development.
In this tutorial, we'll:
- Describe options for running Solr locally
- List which popular local development environments provide Solr
- Show how to set up a DDEV-based local development environment with Solr
By the end of this tutorial you should be able to list the different ways that Solr can be installed locally, choose an option that works for you, and see how to get started quickly with DDEV.
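As a quick sketch of the DDEV route (the add-on name is an example and varies by DDEV version):

```bash
# Inside an existing DDEV project, install a Solr add-on and restart.
# The add-on name is an example; run "ddev get --list" to see current options.
ddev get ddev/ddev-solr
ddev restart

# Drupal containers can then reach Solr at the hostname "solr" inside the project.
```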
Solr compartmentalizes itself into cores (or collections if you're using SolrCloud). Each Solr core has its own directory, configuration, and set of search data. While a core can be thought of as an “index”, it is much more than that.
In this tutorial, we'll:
- Identify the difference between a Solr core and an index.
- List the various ways a core can be created.
- Explain why Search API needs a custom core configuration.
By the end of this tutorial you should be able to explain what Solr cores are, and how to create a Solr core (or collection) that is compatible with Drupal's Search API module.
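For example, creating a basic core from the command line looks like the sketch below (the core name is an example); a core meant for Search API additionally needs the config set shipped with the Search API Solr module, which a later tutorial exports from Drupal.

```bash
# Create a core named "drupal" using Solr's bundled default config set.
bin/solr create -c drupal

# For use with Drupal's Search API, recreate the core with the module's config set:
# bin/solr create -c drupal -d /path/to/exported/search_api_solr/config
```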
In order for Drupal to work with Apache Solr, we need to add the Search API module. This module provides a generic interface for search backends, including Solr. Furthermore, it adds several features to search without the need for custom code.
In this tutorial, we'll:
- Describe why Search API is necessary to use Solr with Drupal
- Identify a Search API server
By the end of this tutorial you should be able to install the Search API and Search API Solr modules, and create the Search API server configuration required to connect Drupal and Solr.
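A hedged sketch of the setup steps (Drush is assumed, and the server ID is an example):

```bash
# Add and enable the modules.
composer require drupal/search_api drupal/search_api_solr
drush en search_api search_api_solr

# After creating the server at /admin/config/search/search-api, export the Solr
# config set that matches it, for use when creating the Solr core.
drush search-api-solr:get-server-config my_solr_server solr_config.zip
```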
When developing a Drupal site, it is best practice to maintain multiple environments: a production environment for your live site, a stage environment for “next version” development, and your local environment for debugging and creating new features. Solr adds further complexity, as we should have a separate Solr server for each environment.
In this tutorial, we'll:
- Describe why different Solr servers should be used for each environment
- Explain why Config Split is not a solution for multiple environments
- Describe how to use config overrides for each environment
By the end of this tutorial you should be able to override your Search API server configuration with environment-specific settings.
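As a hedged sketch of the override approach (the file path, server ID, connector keys, and hostname are all examples that depend on your project and connector):

```bash
# Append environment-specific overrides to settings.local.php instead of
# exporting different server config per environment.
cat <<'EOF' >> web/sites/default/settings.local.php
// Point the Search API server at this environment's Solr host and core.
$config['search_api.server.my_solr_server']['backend_config']['connector_config']['host'] = 'solr';
$config['search_api.server.my_solr_server']['backend_config']['connector_config']['core'] = 'drupal';
EOF
```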
While Solr compartmentalizes settings into cores, Search API organizes things into indexes. Each Search API index can have a unique set of settings and crawl a specified list of content types. Search API indexes can be created in the Search API admin interface.
In this tutorial, we'll:
- Identify a Search API index
- Describe how an index is related to a Solr core
By the end of this tutorial you should be able to create a new Search API Index connected to a Solr backend.
Creating an index alone is not enough. To populate the index, we need to specify which fields it should contain. Selecting the fields is accomplished in the Search API admin UI.
In this tutorial, we'll:
- Explain how to select what populates a search index
- Describe a field boost, and how it is used to customize results
By the end of this tutorial you should be able to add fields to a Search API Index so their content is available for searching, and then instruct Search API to index the content of your site.
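Once the fields are chosen, indexing can be run from the UI, on cron, or with Drush as sketched here (the index ID is an example):

```bash
# Check how many items are indexed, then index whatever is outstanding.
drush search-api:status
drush search-api:index my_solr_index

# If field settings changed, mark everything for reindexing first.
drush search-api:reset-tracker my_solr_index
```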
Reference field types, such as taxonomy term fields, paragraph fields, or plain entity reference fields, refer to a completely separate entity within the site. This makes search configuration complicated as the typical scope of a search crawl is on a per-node (really a per-entity) basis. Fortunately there are known strategies to index these fields with ease.
In this tutorial, we'll:
- Describe why reference types pose a particular challenge to indexing
- Discuss the importance of display modes in indexing
- Highlight how the Rendered HTML Output field can be used to index paragraphs
By the end of this tutorial you should be able to add reference fields to your Search API Index and allow users to search their contents in the correct context.
One of Search API’s key advantages is that custom search pages can be created using Views. This allows a high degree of customization, while relying on a familiar toolset.
In this tutorial, we'll:
- Describe how to use Views to create a search page
- Explain search page best practices, including requiring input and no-results text
By the end of this tutorial you should be able to create a page that users can use to search your site's content using the Solr search backend.