In the previous tutorials, we learned to install and configure the Simple OAuth module. We also learned how to generate authentication tokens using different grants. In this tutorial, we will learn how to use a token to authenticate a request for a given Drupal user, and:
- Check if a particular route supports a specific type of authentication, `oauth2` in particular
- Send an authentication token, like the ones we acquired in the previous tutorial, in order to prove to Drupal that the request is made by a specific user
By the end of this tutorial you should be able to make authenticated requests, as a specific user, to your API.
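As a preview, an authenticated request might look like the following sketch. It assumes a JSON:API endpoint; the domain and the token value are placeholders:

```bash
# Hypothetical request: example.com and $ACCESS_TOKEN are placeholders.
# The access token is sent in the Authorization header using the Bearer scheme.
curl https://example.com/jsonapi/node/article \
  -H "Accept: application/vnd.api+json" \
  -H "Authorization: Bearer $ACCESS_TOKEN"
```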
In order to authenticate a request against the API server, we need to send an authentication token along with the request. For that, we first need to obtain a token from the server. The various ways we can get a token from the server are called grants. Using one of them, we will obtain an access token and a refresh token.
In this tutorial we will:
- Learn how OAuth 2 grants work
- Learn how to generate and request authentication tokens
- Learn how to generate and request refresh tokens
By the end of this tutorial you should be able to exchange a user name and password for OAuth 2 authentication and refresh tokens so that your API client can make authenticated requests.
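For instance, with the password grant the exchange is a single POST request to the token endpoint that Simple OAuth exposes. This is only a sketch; the domain, client ID and secret, and user credentials are all placeholders:

```bash
# All values here are hypothetical; substitute your own client and user details.
curl -X POST https://example.com/oauth/token \
  -d "grant_type=password" \
  -d "client_id=CLIENT_ID" \
  -d "client_secret=CLIENT_SECRET" \
  -d "username=apiuser" \
  -d "password=apipassword"
```

A successful response is a JSON object containing at least an `access_token` and `expires_in` value, plus a `refresh_token` for grants that support refreshing.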
The Simple OAuth module can be used to configure Drupal as an OAuth 2 authentication provider. Doing so will allow third-party applications to authenticate users using any of the OAuth flows, and validate their roles and permissions.
If you're creating applications that access Drupal's data and need to act like a logged-in user you'll want to use OAuth for authentication. There are 2 steps to accomplishing this: first, you'll need to set up Drupal to act as an authentication provider (this tutorial). Second, you'll need to make the appropriate HTTP requests to obtain an access token, which is covered in the next tutorial, Make an Authenticated Request Using OAuth 2.
In this tutorial we will:
- Learn how to install the Simple OAuth Drupal module
- Configure the Simple OAuth module so we can generate tokens that can authenticate users in Drupal
- Demonstrate what the responses generated by the Simple OAuth module look like
By the end of this tutorial you should know how to install and configure the Simple OAuth module.
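If you want to peek ahead, installation typically looks like the following sketch. It assumes a Composer-managed Drupal site with Drush available; the key paths are just examples, and where you store the keys is up to you:

```bash
# Download and enable the module.
composer require drupal/simple_oauth
drush pm:enable simple_oauth

# Generate the public/private key pair the module uses to sign tokens.
# Store the keys outside the web root; these paths are illustrative.
openssl genrsa -out /var/www/keys/private.key 2048
openssl rsa -in /var/www/keys/private.key -pubout -out /var/www/keys/public.key
```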
The term web services has been around for quite a while. Given that web services is such a broad topic, let's define what web services are and how we are going to refer to them throughout this series so we are all on the same page.
This tutorial is an introduction to web services that will help you:
- Learn what a web service is.
- Understand that this series focuses on HTTP web services, and mostly on REST principles.
- Get some examples of APIs in the wild and what type of consumers they have.
By the end of this tutorial, you'll be able to define what web services are, and how we'll use the term in the context of these tutorials.
As a theme developer you can extend an existing asset library to include custom CSS and/or JavaScript from your theme. This is useful when you want to add styles or behaviors to components provided by Drupal core or another module.
Sometimes there are CSS or JavaScript asset libraries attached to the page by Drupal core, a contributed module, or another theme, that do something you don't like, and you want to change it or even exclude it altogether. There are a couple of different ways that themes can override, alter, or extend an existing asset library in order to modify the CSS and JavaScript that get attached to the page by code belonging to another theme or module.
In this tutorial we'll learn how to:
- Extend an existing asset library using `libraries-extend`, so that our custom CSS and JavaScript is included whenever that library is used.
- Override an existing asset library using `libraries-override`, to alter the definition of the library, and replace or exclude individual assets (or the entire library).
By the end of this tutorial you should be able to use your custom theme to override, extend, or alter any of the asset libraries added to the page by another theme or module.
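As a sketch of the syntax, both keys live in your theme's .info.yml file. All of the theme, library, and file names below are made up for illustration:

```yaml
# mytheme.info.yml (hypothetical names throughout)
libraries-extend:
  # Attach our library whenever core's dialog library is attached.
  core/drupal.dialog:
    - mytheme/dialog-styles

libraries-override:
  # Replace a single CSS file from another library with our own copy.
  some_module/some-library:
    css:
      theme:
        css/original.css: css/my-override.css
  # Or remove a library from the page entirely.
  other_module/other-library: false
```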
When running migrations you can use a highwater mark (the `high_water_property` source plugin configuration option) to influence which rows are considered for import on subsequent migration runs. This allows you to do things like only look at new rows added to a large dataset, or reimport records that have changed since the last time the migration was run. The term highwater mark comes from the water line marks found on structures in areas where water level changes are common. In running migrations, you can think of a highwater mark as a line that denotes how far the migration has progressed, saying, "from now on, we only care about data created after this line".
Another common use case for highwater marks is when you're importing a large dataset and the system runs out of resources. Usually this looks like a migration failing because it timed out, or because the process ran out of memory. A highwater mark should allow you to pick up from where you left off.
In this tutorial we'll:
- Define what a highwater mark is, and how you can use one to limit the rows considered for importing each time a migration is executed.
- Demonstrate how highwater marks can be used to reimport source records that have been modified since the previous time the migration was executed.
- Introduce the `track_changes` option.
By the end of this tutorial you should be able to define what a highwater mark is and how to use one to speed up the import of large datasets, or to force the migration to reimport records when the source data has changed.
You should also be aware of the `track_changes` feature, which is a slower, but more dynamic, method of checking for changes in source data and reimporting records when a change is found.
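As a minimal sketch, a highwater mark is configured on the migration's source plugin. This example assumes a Drupal 7 node source with a `changed` timestamp column; the migration ID and table alias are illustrative:

```yaml
# Hypothetical migration definition; only the source section matters here.
id: example_articles
source:
  plugin: d7_node
  node_type: article
  # The highwater mark: on each run, only rows whose "changed" value is
  # greater than the stored highwater value are considered for import.
  high_water_property:
    name: changed
    alias: n
```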
If you need to write a migration that is capable of being executed multiple times and picking up changes to previously imported data, you can use the `track_changes` configuration option of most source plugins. This tells the migration to keep a hash of each source row, and on subsequent runs of the same migration it will compare the previous hash to the current one. If they don't match, the source data has changed, and Drupal will reimport that record and update the existing destination record with the new data.
Using `track_changes` differs from calling `drush migrate:import --update` in that using `--update` will force every record to be re-imported regardless of whether the source data has changed or not.
In this tutorial we'll:
- Learn how change tracking works to detect changes in your source data
- Use the `track_changes` option in a migration
Note that the progress of a migration can also be tracked using highwater marks if the source data has something like a `last_updated` timestamp column. Using highwater marks in this case is likely more efficient than using `track_changes`. It's a good idea to understand how both features work, and then choose the appropriate one for each migration.
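A minimal sketch of enabling change tracking, assuming a CSV source provided by the contrib migrate_source_csv module (the migration ID, path, and column names are made up):

```yaml
# Hypothetical migration source with change tracking enabled.
id: example_people
source:
  plugin: csv
  path: public://import/people.csv
  ids: [id]
  # Keep a hash of each source row; rows whose hash changes on a later run
  # are re-imported automatically.
  track_changes: true
```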
Before you can migrate your Drupal 7 site to the latest version of Drupal you'll need to be able to build the features that make up the current site. Part of this is evaluating all the modules you've got installed, figuring out what you're using them for, and whether there's a version that's compatible with the latest version of Drupal, along with a migration path.
I usually make a spreadsheet for this. But any list of the modules you’re currently using that allows you to keep track of how you plan to update them will work. You also want to keep track of where you are in the process of figuring that all out. Because it’s likely you’ll have some modules for which the path is clear, and others where it’s pretty murky and requires more in-depth research to find a path forward. Having a list means you can break that up into tasks, and ensure you’re not missing something. It'll also help you define when your migration is done as well as any final quality assurance (QA) tasks.
In this tutorial we'll:
- Start a list of the modules that make up our current site.
- Point to some tools that can help speed up the process of evaluating a module's readiness.
- Provide a set of questions that you can ask about each module you're using as part of your planning process.
By the end of this tutorial you should have a list of all the modules you're currently using, and some tools you can use to help you figure out how to move forward with each one.
If you want to be able to say you're done with your Drupal-to-Drupal migration, first you have to be able to define "done". And part of that is doing a content analysis and inventory. Performing a content analysis and inventory will help you ensure that you don't miss any fields or important records. It also gives you an opportunity to spend some time thinking about the overall information architecture of your site. You're already going to be doing a lot of work to migrate your content, so deciding to shuffle things a bit now might not add any significant extra time. Additionally, the latest Drupal (as of Drupal 10) is a different platform than either Drupal 6 or 7, and as such, there are some new best practices and new ways of doing things that might not have been available before.
In this tutorial we'll:
- Provide a set of questions you can ask yourself about the content of your current site to kick-start your analysis.
- Give an example of how we create a content type inventory in a spreadsheet and use that to help define "done" for our projects.
By the end of this tutorial you should be able to get started analyzing the content and content types of your existing site in order to start planning for your Drupal-to-Drupal migration.
Because current versions of Drupal are so different from older versions like Drupal 7, upgrading to the latest version requires creating a new site using the latest version of Drupal and then migrating your old site's content and configuration into it.
There is no one right way to tackle a Drupal-to-Drupal migration. Instead, it's like walking down a path and coming to a fork in the trail and then choosing a direction over and over and over. Since every site is different, every path to a finished migration will be different, too. I know nobody wants to hear it, but every migration is its own unique adventure. Every successful migration will require its own custom code, weird shell scripts, and detailed lists of the exact order of things. But that doesn't mean there's no plan to follow.
A successful migration charts a course through the maze, while leveraging existing tools and experience to help find the shortest route and the right path at each fork, with minimal backtracking due to wrong choices.
In this tutorial we'll:
- Look at what makes a Drupal-to-Drupal migration (a major version upgrade) so tricky, and how to think about performing one
- Define 3 high-level approaches to performing a Drupal-to-Drupal migration
- Get a better idea of what the migration work entails, so you go into it with a proper mental model.
By the end of this tutorial you should be able to explain the different approaches to performing a Drupal-to-Drupal migration in broad strokes, and have a better picture of the work that will be involved.
The `migration_lookup` process plugin allows you to populate entity reference fields when the entity that you want to reference is created by another migration, and you don't yet know what its unique ID will be in the Drupal database. Think of this as being able to answer the question: I have a migration that imports new users from a CSV file, and another that imports articles from a different CSV file. The articles' CSV file has a column that indicates which row from the users' CSV file is the author of each article. But in order to populate the author field for the Drupal node, I need to know the ID of the imported user, which is an auto-incremented ID that I'm not setting. So what is the correct ID to use?
Another use case is circular dependencies, like hierarchical taxonomies with parent/child relationships. Or a content type with a field that references other nodes of the same type, like articles that reference other articles.
In this tutorial we'll:
- Learn about the `migration_lookup` process plugin and its configuration
- Provide examples of using the plugin to populate entity reference fields during a migration
- Look at alternatives
By the end of this tutorial you should be able to use the `migration_lookup` plugin to effectively manage entity relationships across multiple individual migrations.
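Sticking with the users-and-articles CSV example above, the process pipeline for the article migration might look like this sketch, where `example_users` is the made-up machine name of the user migration and `author_id` is the column in the articles CSV:

```yaml
# Hypothetical process pipeline for the article migration.
process:
  uid:
    plugin: migration_lookup
    # The migration whose map table should be consulted for the ID.
    migration: example_users
    # The source property holding the referenced user's row ID.
    source: author_id
```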
When a migration is run, the Migrate API creates a mapping table that keeps track of which source record was used to create which destination record. A record is automatically added for each successfully migrated row. This mapping table can be used later to:
- See if a row was previously imported
- Look up the ID of the entity created for the row
- Track whether a source record has changed and needs to be re-imported
In this tutorial we'll learn:
- What migration map tables are
- Why they exist, and what the Migrate API does with them
By the end of this tutorial you should be able to explain the use case for map tables and describe the data they contain.
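Map tables are plain database tables named after the migration, so you can inspect them directly. A sketch, assuming a migration with the made-up ID example_users:

```bash
# Map tables are named migrate_map_<migration id>.
drush sql:query "SELECT sourceid1, destid1, source_row_status, hash FROM migrate_map_example_users"
```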
Drupal has a robust Cache API, and various caching layers (both internal and external to Drupal), that work together to decrease application load and boost performance. Drupal's APIs allow developers to declare the cacheability of data. How long can this data be stored before it becomes stale? And under what conditions should it be invalidated? Drupal uses that information during the process of building a page to cache as much of the work it does as possible, so that it won't need to do it again. Additionally, Drupal bubbles up the cacheability data from everything required to build a page into HTTP response headers that caching layers external to Drupal can also use to cache the rendered HTML.
When these APIs are combined (and used appropriately), Drupal can be extremely fast for both anonymous and authenticated traffic. But doing so requires understanding the various caching layers, their roles, and their interconnections.
In this tutorial, we'll:
- Review the caching layers and systems behind them
- Learn about components of the Drupal cache system
By the end of this tutorial, you should have a broad understanding of the Drupal caching system, its layers, and a better understanding of where in the stack you should look to optimize for different scenarios.
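To make the idea of declaring cacheability concrete, here's a sketch of a render array using the Cache API's metadata keys; the specific values are illustrative:

```php
// A render array that declares its own cacheability (illustrative values).
$build = [
  '#markup' => t('Latest articles'),
  '#cache' => [
    // Seconds this output may be cached before it's considered stale.
    'max-age' => 3600,
    // Vary the cache by the viewing user's roles.
    'contexts' => ['user.roles'],
    // Invalidate whenever any node is added, updated, or deleted.
    'tags' => ['node_list'],
  ],
];
```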
Note: This tutorial is specific to Drupal sites hosted on the Acquia platform and covers integrating its features to improve performance.
The Acquia platform includes Memcache, Varnish, and Content Delivery Network (CDN) integration. In order for these to be as effective as possible, they should be configured and tuned for your specific use case. This tutorial provides an introduction to these utilities and common configuration. For more detail, you should consult the Acquia documentation.
In this tutorial, we'll:
- Learn what caching utilities are included in the Acquia platform
- Set up and tune different parts of Acquia's application caching level including Memcache and Varnish
By the end of this tutorial, you'll know what application-level caching options exist on Acquia's platform, and how to configure them, and your Drupal application, for better performance.
WebPageTest (webpagetest.org) is a free open source resource that runs performance tests on a site, provides educational reports about what it finds, and suggests optimizations you can make. The tests performed via the WebPageTest interface include Lighthouse tests, performance-specific tests, Core Web Vitals, visual comparisons, and traceroute tests. The tool also allows saving a history of tests if you sign up for a free account. This tool won't make your site faster on its own, but it will give you some good ideas about where to focus your efforts.
In this tutorial we'll:
- Learn how to run performance tests via the WebPageTest web interface
- Learn how to read and interpret the results
By the end of this tutorial, you should know how to use the WebPageTest online interface to analyze a Drupal site's performance.
Lighthouse is an open source, automated tool for analyzing your site's performance. Lighthouse is built into the Google Chrome browser. When auditing a page, Lighthouse runs various tests against the page and then reports how well the page did across a broad spectrum of metrics. While Lighthouse doesn't improve the performance of a Drupal site itself, it helps to establish a performance profile and point towards areas that could be improved.
Lighthouse requires the use of Google Chrome. Other browsers include their own performance auditing tools. While the exact usage of each tool varies, the end result is the same: a report that can be interpreted to suggest where to focus your performance-tuning efforts.
In this tutorial, we'll:
- Learn how to run Lighthouse tests against a Drupal site
- Interpret the results of the report generated by Lighthouse
- Provide guidance on next steps to take to address the performance issues Lighthouse finds in our Drupal site
By the end of this tutorial you should be able to use Lighthouse to profile a Drupal site, interpret the results, and know where to start on making improvements.
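Although this tutorial uses the version of Lighthouse built into Chrome, the same audits can be run from the command line via the Lighthouse npm package, which is handy if you want to script the process. A sketch, assuming Node.js and Chrome are installed (the URL is a placeholder):

```bash
# Run all Lighthouse audits against a page and save an HTML report.
npx lighthouse https://example.com/ --output=html --output-path=./report.html
```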
Content Delivery Networks (CDNs) play an important role in making a Drupal-powered site fast and secure. The distributed nature of CDNs allows serving web assets such as HTML files, JavaScript, CSS, and media assets through servers located in close geographical proximity to the users, thereby reducing the physical distance data has to travel between the user and the server, and improving performance.
In addition to providing a performance boost, CDNs may also act as a firewall and protect sites from common attacks such as DDoS. The popularity of CDNs has been growing over the past few years, and integrating with them has also gotten easier. Most Drupal web operation platforms, such as Acquia and Pantheon, offer integrations with CDNs out-of-the-box. Even if your hosting platform doesn't provide a CDN, you can always set up your Drupal site to use one.
In this tutorial we'll:
- Define what a CDN is and what it can offer for your site
- Learn about popular CDNs used with Drupal sites
- Review some contributed modules that you can use to help integrate your Drupal site with a CDN
By the end of this tutorial you should be able to define what a CDN is, list CDNs with Drupal integrations, and describe the steps you will need to take to set up your site to work with a CDN.
Drupal core is built with performance and scalability in mind. It is Fast by Default. But performance is often a by-product of your specific application, and depending on how you're using Drupal, you can further optimize your site using contributed modules. These modules range from debugging utilities to cache-related modules.
It's worthwhile to have a general idea of what's available in the contributed module space. And, when you need to address your site's unique performance needs, it helps if you already know about existing solutions.
In this tutorial we'll:
- Look at a few popular contributed modules that improve Drupal's performance
- Learn about the benefits these modules may provide to your site
- Provide tips on how to configure these modules
By the end of this tutorial you should be able to list some popular performance related Drupal modules and describe their use case.
Apache Bench (`ab`) is a tool that comes with the commonly used Apache HTTP server. It is designed to give you an impression of how your current Apache installation performs, but it will work for any HTTP server, not just Apache. Apache Bench shows you how many requests per second your server can serve. This metric is in part a measure of how long it takes Drupal (PHP) to process the request and create a response. While the HTTP server does other things too, executing PHP is by far the most expensive task when serving Drupal pages.
This makes Apache Bench useful for profiling your PHP code: when you add a new feature, apply a patch, or update a PHP library used on the site, you can quickly compare before-and-after metrics as an indicator of the scale of the change's impact.
In this tutorial, we'll:
- Learn how to run the Apache Bench tool on our local environment
- Learn to interpret the result of the tests
By the end of this tutorial, you should know how to benchmark and profile your local Drupal installation using Apache Bench (`ab`).
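As a preview, a basic benchmark run looks like this sketch (the URL is a placeholder):

```bash
# Send 100 total requests, 10 at a time, and report requests per second.
ab -n 100 -c 10 https://example.com/
```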
Sites evolve over time. We're constantly adding and removing modules, modifying content, authoring custom plugins, and changing design elements. All of these changes impact our application's performance, some more so than others. But if you're not measuring it, you can't know when your site inadvertently gets slower, or by how much.
If you are responsible for a site's performance, it might be good to look into benchmarking it and establishing a performance budget early on, then monitor it on an ongoing basis. Many tools, paid and free, allow measuring key web performance indicators and backend code and server performance.
One-time measurements can be useful for immediate debugging, or when figuring out if that big new feature is going to have a negative impact on performance. But for long-term projects, it's helpful to have known baseline values and an established performance budget to see whether your performance improves or declines over time with every new feature.
Establishing the baseline (a performance budget) and comparing future measurements against it is called site performance benchmarking.
In this tutorial, we'll:
- Learn the basic concepts of benchmarking
- Learn a benchmarking process and best practices
- List some commonly used tools for benchmarking Drupal
By the end of this tutorial, you should understand the concept of a performance budget, know when to benchmark your site, and list some tools available to help.