Consuming REST APIs with Drupal 8

One of our members recently asked this question in support:

Wonder if you have, or can suggest, a resource to learn how to access, authenticate (via OAuth preferably) and process JSON data from an external API?

In trying to answer the question I realized that I first needed to know more about what they were trying to accomplish. As with most things Drupal, there's more than one right way to accomplish a task. Choosing a solution requires understanding what options are available and the pros and cons of each. This got me thinking about the various ways one could consume data from an API and display it using Drupal 8.

The problem at a high level

You've got data in an external service, available via a REST API, that you need to display on one or more pages in a Drupal site. Perhaps accessing that data requires authentication via OAuth2 or an API token. There are numerous ways to go about it. Which one should you choose? And how should you get started?

Some questions to ask yourself before you start:

  • How much data are we talking about?
  • How frequently does the data you're consuming change, and how important is it that it's up to date? Are real-time updates required? Or is a short lag acceptable?
  • Does the data consumed from the API need to be incorporated into the HTML output of Drupal-generated pages? How does that impact SEO?
  • How much control does a Drupal site administrator need to have over how the data is displayed?

While I'm certain this list is not exhaustive, here are some of the approaches I'm aware of:

  • Use the Migrate API
  • Create a Views Query Plugin
  • Write a custom service that uses Guzzle or similar PHP SDK via Composer
  • Use JavaScript

I'll explain each one a little more, and provide some ideas about what you'll need to learn in order to implement them.

Option 1: Use the Migrate API

Use the Migrate API combined with the HTTP Fetchers in the Migrate Plus module to ingest data from an API and turn it into Drupal nodes (or any entity type).

In this scenario you're dealing with a data set that doesn't change frequently (a few times per day, maybe), and/or it's okay for the data displayed on the site to lag a little behind what's in the external data service. This approach is somewhat analogous to using a static site generator like Gatsby or Sculpin, which requires a build to occur in order for the site to get updated.

In this case that build step is running your migration(s). You'll end up with a Drupal entity for each imported record, no different than if a user had created a new node by filling out a form on your Drupal site. In addition, you get the complete extract, transform, load pipeline of the Migrate API to manipulate the ingested data as necessary.
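To make this concrete, here's a minimal sketch of what such a migration configuration might look like using the Migrate Plus `url` source plugin with its HTTP fetcher and JSON parser. The migration ID, endpoint URL, selectors, and field names are all placeholders for illustration:

```yaml
# Hypothetical migration config; machine names, the endpoint URL,
# and field selectors are placeholders.
id: example_articles
label: Import articles from a remote JSON API
source:
  plugin: url
  data_fetcher_plugin: http
  data_parser_plugin: json
  urls:
    - https://api.example.com/articles
  # Path within the JSON response where the records live.
  item_selector: data
  fields:
    - name: id
      selector: id
    - name: title
      selector: title
    - name: body
      selector: body
  ids:
    id:
      type: integer
process:
  title: title
  body/value: body
destination:
  plugin: 'entity:node'
  default_bundle: article
```

Running the migration (e.g. via `drush migrate:import example_articles`) is the "build step": each JSON record becomes an article node.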


Pros:

  • If you've worked with Migrate API before, this path likely provides the least friction
  • Data is persisted into Drupal entities, which opens up the ability to use Views, Layout Builder, Field Formatters, and all the other powerful features of Drupal's Entity & Field APIs
  • You can use Migrate API process plugins to transform data before it's used by Drupal
  • Migrate Plus can handle common forms of authentication like OAuth 2 and HTTP Basic Auth


Cons:

  • Requires a build step to make new or updated data available
  • Data duplication; you've now got an entity in Drupal that is a clone of some other existing data
  • Probably not the best approach for really large data sets


Option 2: Create a Views Query Plugin

Write a Views Query Plugin that teaches Views how to access data from a remote API. Then use Views to create various displays of that data on your site.

The biggest advantage of this approach is that you get the power of Views for building displays without needing to persist the data into Drupal as entities. This approach is also well suited for scenarios where there's an existing module that already integrates with the third-party API and provides a service you can use to communicate with the API.
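The heart of this approach is a query plugin class that fetches remote records and turns them into Views result rows. Here's a heavily simplified sketch; the module name, plugin ID, and the `example_remote.client` service (imagined as a wrapper around an HTTP client) are all placeholders, and a real plugin would also need matching field handler configuration:

```php
<?php

namespace Drupal\example_remote\Plugin\views\query;

use Drupal\views\Plugin\views\query\QueryPluginBase;
use Drupal\views\ResultRow;
use Drupal\views\ViewExecutable;

/**
 * Views query plugin that fetches rows from a remote API.
 *
 * @ViewsQuery(
 *   id = "example_remote_api",
 *   title = @Translation("Remote API"),
 *   help = @Translation("Query a remote REST API.")
 * )
 */
class RemoteApi extends QueryPluginBase {

  /**
   * {@inheritdoc}
   */
  public function execute(ViewExecutable $view) {
    // A service (e.g. wrapping Guzzle) that returns decoded records as
    // arrays; in real code it would be injected, not called statically.
    $records = \Drupal::service('example_remote.client')->fetchAll();

    $index = 0;
    foreach ($records as $record) {
      $row = new ResultRow($record);
      $row->index = $index++;
      $view->result[] = $row;
    }
    $view->total_rows = count($view->result);
  }

  // Views field handlers call these on the query object while the view is
  // built; for a non-SQL backend they can be no-ops.
  public function ensureTable($table, $relationship = NULL) {
    return '';
  }

  public function addField($table, $field, $alias = '', $params = []) {
    return $field;
  }

}
```

Once the plugin and its `hook_views_data()` definitions are in place, editors build displays in the Views UI exactly as they would for entity-backed data.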


Pros:

  • You, or perhaps more importantly your editorial team, can use Views to build a UI for displaying and filtering the data
  • Displays built with Views integrate well with Drupal's Layout Builder and Blocks systems
  • Data is not persisted in Drupal and is queried fresh for each page view
  • Can use Views caching to help improve performance and reduce the need to make API calls for every page load


Cons:

  • Requires a lot of custom code that is very specific to this one use-case
  • Requires in-depth understanding of the underpinnings of the Views API
  • Doesn't allow you to take advantage of other tools that interact with the Entity API


Option 3: Write a Service using Guzzle (or similar)

Write a Guzzle client, or use an existing PHP SDK to consume API data.

Guzzle is included in Drupal 8 as a dependency, which makes it an attractive and accessible utility for module developers. But you could also use another similar low-level PHP HTTP client library, and add it to your project as a dependency via Composer.

Guzzle is a PHP HTTP client that makes it easy to send HTTP requests and trivial to integrate with web services. --Guzzle Documentation

If you want the most control over how the data is consumed, and how it's displayed, you can use Guzzle to consume data from an API and then write one or more Controllers or Plugins for displaying that data in Drupal. Perhaps a page controller that provides a full page view of the data, and a block plugin that provides a summary view.

This approach could be combined with the Views Query Plugin approach above, especially if there's not an existing module that provides a means to communicate with the API. In this scenario, you could create a service that is a wrapper around Guzzle for accessing the API, then use that service to retrieve the data to expose to views.
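A minimal sketch of such a wrapper service follows. The class name and endpoint URL are placeholders; in a real module the client would be wired up via a service definition that injects Drupal's `http_client` service:

```php
<?php

use GuzzleHttp\ClientInterface;
use GuzzleHttp\Exception\GuzzleException;

/**
 * Hypothetical wrapper around Drupal's built-in Guzzle client.
 */
class ExampleApiClient {

  protected $httpClient;

  public function __construct(ClientInterface $http_client) {
    $this->httpClient = $http_client;
  }

  /**
   * Fetch and decode a list of records from the remote API.
   */
  public function fetchAll(): array {
    try {
      $response = $this->httpClient->request('GET', 'https://api.example.com/articles', [
        'headers' => ['Accept' => 'application/json'],
        'timeout' => 5,
      ]);
      return json_decode((string) $response->getBody(), TRUE) ?? [];
    }
    catch (GuzzleException $e) {
      // Fail soft so a remote outage doesn't take your pages down;
      // real code would also log the exception.
      return [];
    }
  }

}
```

A page controller or block plugin can then call `fetchAll()` and build a render array from the result.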

If you need to do anything other than GET (POST, PUT, etc.) from the API in question, you'll almost certainly need to use this approach. The above two methods deal only with consuming data from an API.


Pros:

  • Able to leverage any existing PHP SDK available for the external API
  • Some of the custom code you write could be reused outside of Drupal
  • Greatest level of control over what is consumed, and how the consumed data is handled
  • Large ecosystem of Guzzle middleware for handling common tasks like OAuth authentication


Cons:

  • Little to no integration with Drupal's existing tools like Views and others that are tailored to work with Entities


Option 4: JavaScript

Use client-side JavaScript to query the API and display the returned data.

Another approach would be to write JavaScript that does the work of obtaining and displaying data from the API, then integrate that JavaScript into Drupal as an asset library. A common example is a weather widget that displays the current weather for a user, or a Twitter widget that displays the most recent Tweets for a specific hashtag.

You could also create a corresponding Drupal module with an admin settings form that would allow a user the ability to configure various aspects of the JavaScript application. Then expose those configuration values using Drupal's JavaScript settings API.
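A minimal sketch of such a script is below. The endpoint URL, element ID, and `drupalSettings` keys are placeholders; the rendering is split into a small pure function so the markup logic stays testable apart from the network call:

```javascript
// Turn an array of API records into list-item markup.
// (The `title` property is an assumed field name.)
function renderItems(items) {
  return items
    .map((item) => `<li class="api-item">${item.title}</li>`)
    .join('');
}

// Placeholder endpoint; in a real module you might expose this via
// drupalSettings instead, e.g.: drupalSettings.myModule.endpoint
const endpoint = 'https://api.example.com/items';

// Fetch the data and inject it into the page. In Drupal this would
// typically run inside a Drupal.behaviors attach callback.
async function loadWidget() {
  const response = await fetch(endpoint);
  const data = await response.json();
  document.querySelector('#api-widget').innerHTML =
    `<ul>${renderItems(data.items)}</ul>`;
}
```

The asset library would attach this file to whatever page or block hosts the widget.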

While it's the least Drupal-y way of solving this problem, in many cases this might also be the easiest, especially if the content you're consuming from the API is for display purposes only and there is no reason that Drupal needs to be aware of it.


Pros:

  • Data is consumed and displayed entirely by the client, making it easier to keep up-to-date in real time.
  • Existing services often provide JavaScript widgets for displaying data from their system in real time that are virtually plug-and-play.
  • Code can be used independent of Drupal.


Cons:

  • No server-side rendering, so any part of the page populated with data from the external API will not be visible to clients that don't support JavaScript. This also has potential SEO ramifications.
  • You can't query the API directly if it requires an API key that you need to keep secret (e.g., because the key has access to POST/PUT/DELETE resources). In that case, you would need server-side code to act as a proxy between the API and the JavaScript frontend
  • Drupal has no knowledge of the data that's being consumed.
  • Drupal has little control over how the data is consumed, or how it's displayed.
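When a secret key is involved, the proxy mentioned in the cons above can be a small Drupal controller that forwards the request server-side. A rough sketch, where the module name, route, config key, and endpoint are all hypothetical:

```php
<?php

namespace Drupal\example_proxy\Controller;

use Drupal\Core\Controller\ControllerBase;
use Symfony\Component\HttpFoundation\JsonResponse;

/**
 * Proxies requests to the external API so the key never reaches the browser.
 */
class ProxyController extends ControllerBase {

  public function data(): JsonResponse {
    // The key lives in server-side configuration, never in JavaScript.
    $key = $this->config('example_proxy.settings')->get('api_key');

    // In real code the http_client service would be injected.
    $response = \Drupal::httpClient()->request('GET', 'https://api.example.com/items', [
      'headers' => ['Authorization' => 'Bearer ' . $key],
    ]);

    return new JsonResponse(json_decode((string) $response->getBody(), TRUE));
  }

}
```

The client-side script then fetches from the Drupal route (e.g. `/example-proxy/data`) instead of the third-party API.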


Honorary mention: Feeds module

The Feeds module is another popular method for consuming data from an API, and serves as an alternative to the Migrate API approach outlined above. I've not personally used it with Drupal 8 yet, and would likely choose the Migrate API because I have much more experience with it, but Feeds is probably worth at least taking a look at.


Recap

There are a lot of different ways to approach the problem of consuming data from an API with Drupal. Picking the right one requires first understanding your specific use case, your data, and the level of control site administrators are going to need over how it's consumed and displayed. Remember to keep in mind that turning the data into Drupal entities can open up a whole bunch of possibilities for integration with other aspects of the Drupal ecosystem.

What other ways can you think of that someone might go about solving the problem of consuming data from an API with Drupal?



Comments

What about the GraphQL and GraphQL Views modules? Aren't they able to consume, not just produce, endpoints?

You can also use Feeds; it's probably the simplest solution.

There's also the core Aggregator module for that, but it requires generating a few plugins with Drush.

An important con of the Javascript method is that you can't query the API directly if it requires an API key that you need to keep secret (e.g., because the key has access to POST/PUT/DELETE resources). In that case, you would need server-side code to act as a proxy between the API and the Javascript frontend.

That's a really good point. And an important thing to keep in mind. I'll add a note about that to the article. Thanks.

Option 5. Do it the Drupal way and use http client manager module. :-)
