We've built a site for John and Lisa that meets their needs and has them happily uploading photos and videos. In this lesson we'll tour the Band Wagon site, walking through how we addressed each of our implementation points, and then wrap up with a review of the modules we used and referenced during the series.
Additional resources
If you've completed the Band Wagon site to this point, you've built a solid foundation for a Drupal-based music fan site that provides the main features we wanted. In this lesson we'll look at a few more modules worth exploring and how they could enhance the site we've created.
Additional resources
Because we already have the Media module installed, it's easy to add support for posting YouTube videos on the Band Wagon site. In this lesson we'll enable the Media Internet Sources and Media: YouTube modules, and make sure everything is working properly so we can embed a YouTube video in our content.
Additional resources
The Media module for Drupal 7 not only supports adding and managing media uploaded from a user's computer, but also ships with a submodule, Media Internet Sources, that lets us use media assets from various locations on the internet. In this lesson we'll look at what Media Internet Sources can do for us, and quickly explain what stream wrappers are and how they relate to what we want to do on the site.
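To make stream wrappers a little more concrete, here's a small sketch using Drupal 7's file API. The public:// scheme comes from core; the youtube:// URI is only an illustration of the kind of scheme a module such as Media: YouTube registers, so treat the exact format as an assumption.

```php
// Core registers the public:// scheme for the site's local files directory.
$uri = 'public://photos/band-photo.jpg';
// file_create_url() asks the scheme's stream wrapper to turn the internal
// URI into something a browser can actually load.
$url = file_create_url($uri);

// Contributed wrappers work the same way. A remote video can be stored as a
// URI in the module's own scheme (the format shown here is illustrative):
$video_uri = 'youtube://v/dQw4w9WgXcQ';
if ($wrapper = file_stream_wrapper_get_instance_by_uri($video_uri)) {
  $external_url = $wrapper->getExternalUrl();
}
```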
Additional resources
Let's set up the WYSIWYG editor for the Band Wagon website, using the WYSIWYG module and CKEditor. Note that if you prefer a different WYSIWYG editor, such as TinyMCE or Aloha, the installation instructions are very similar. In this lesson we'll not only get WYSIWYG set up, but also enable the Media filter and integrate it with our WYSIWYG editor.
Additional resources
Text formats are an important security feature of Drupal, so it pays to understand them. A text format "scans" your content and makes HTML formatting changes to it before sending it to the browser for display. In this lesson we'll see what formats and filters are and how they relate to each other, walk through the filter workflow, and review the default formats that come with Drupal core.
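As a quick illustration of that workflow, the sketch below runs the same string through two of the default formats using check_markup(). The sample markup is made up; filtered_html and full_html are the machine names of the formats Drupal 7 core ships with.

```php
$raw = '<p>Hello <script>alert("nope");</script><strong>Band Wagon</strong> fans!</p>';

// check_markup() runs the text through every filter enabled on the given
// format, in the order the format defines. With core's Filtered HTML format
// the <script> tag is stripped while the allowed tags survive.
$safe = check_markup($raw, 'filtered_html');

// The same input run through Full HTML keeps far more markup, which is why
// that format should only be granted to roles you trust.
$full = check_markup($raw, 'full_html');
```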
Additional resources
We now have a site with most of the basic functionality the Band Wagon project needs to start. However, one important piece remains: streamlining the content editing process, and allowing easy image and video integration in posts. In this lesson we will compare content editing tools and discuss how to integrate our media directly into the body of our content.
Additional resources
Up to this point we've focused on creating nodes as the result of our migrations. The Migrate module, however, supports a number of different destinations that we can use when importing data. In this lesson we'll take a look at the destination classes the Migrate module provides, talk about what each one is used for, and point to more information and examples of using them. Then we'll implement a migration that imports data as vocabulary terms using the MigrateDestinationTerm class.
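As a preview, a term migration can look roughly like the sketch below. The vocabulary machine name, source connection key, table, and column names are placeholders, and the constructor assumes the registration style used by Migrate 7.x-2.6.

```php
<?php

/**
 * A minimal sketch of a migration that creates taxonomy terms.
 */
class PositionMigration extends Migration {
  public function __construct($arguments) {
    parent::__construct($arguments);
    $this->description = t('Import player positions as taxonomy terms.');

    $query = Database::getConnection('default', 'for_migration')
      ->select('positions', 'p')
      ->fields('p', array('position_id', 'position_name'));
    $this->source = new MigrateSourceSQL($query);

    // Each source row becomes a term in the 'position' vocabulary.
    $this->destination = new MigrateDestinationTerm('position');

    $this->map = new MigrateSQLMap($this->machineName,
      array('position_id' => array('type' => 'int', 'not null' => TRUE)),
      MigrateDestinationTerm::getKeySchema()
    );

    $this->addFieldMapping('name', 'position_name');
  }
}
```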
Additional resources
It's almost unheard of to write a data import that just works on the first try. Our examples have all used data that is known to be in good shape, and to be honest we've avoided even trying to import some of the data because it turned out to be problematic, and we wanted to focus on working with the Migrate module rather than on debugging problems in the source data. In the real world, though, you're going to end up with problematic rows in your source data, and you'll need to get them resolved.
In this lesson we'll run the complete player migration and end up with a couple of rows that fail to import because of an oversight in our code. We'll use the Migrate UI to get a sense of what is failing and why. Then we'll use a combination of options available to the Migrate module's drush commands and some strategically placed print_r() calls to debug and resolve the problem rows. Finally, we'll use a trick to get the Migrate module to re-import the problem rows without touching the rows that have already been imported.
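One low-tech trick that pairs well with those drush options is a temporary print_r() inside prepareRow(), run against a single problem row. The migration machine name, source column, and drush invocation below are illustrative; adjust them to your own migration.

```php
// Temporary debugging inside the player migration class.
public function prepareRow($row) {
  if (parent::prepareRow($row) === FALSE) {
    return FALSE;
  }

  // Dump the raw source row for the one ID we're chasing, then import just
  // that row with something like: drush migrate-import Player --idlist=aardsda01
  if ($row->playerID == 'aardsda01') {
    print_r($row);
  }

  return TRUE;
}
```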
So far all the data we've been migrating has come from a MySQL database, but Migrate supports a number of other data sources, which we can access by using a different source class. In this lesson we'll look at the source classes that come with the Migrate module and talk about what each one could be used for. Then we'll implement a migration that imports data from a CSV file, since that's another common way of receiving source data.
For your convenience, we've included a copy of the sample data in the companion files. This is a duplicate of the data from the lesson Set Up Migrate Demo Site and Source Data. If you've been following the lessons in order, you should not need to download the sample data set again.
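For reference, swapping in the CSV source class inside a migration's constructor looks roughly like this; the module name, file path, and column layout are placeholders rather than the exact lesson data.

```php
// Inside the migration class constructor.
$columns = array(
  // Position in the CSV row => array(field name, description).
  0 => array('teamID', 'Team ID'),
  1 => array('name', 'Team name'),
  2 => array('yearID', 'Year'),
);

$this->source = new MigrateSourceCSV(
  drupal_get_path('module', 'baseball_migrate') . '/data/teams.csv',
  $columns,
  // Tell the source class to skip the header row in the file.
  array('header_rows' => 1)
);
```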
Additional resources
Migrate Source Class documentation
This lesson is a short one, but it covers an important topic: multi-value fields. Almost any field in Drupal can be configured to accept more than one value. Our teams entity reference field is a good example of this: a player could have played for one or more teams over the course of their career. This lesson looks at two different ways to map multiple values to a single field.
First we'll look at doing it in a callback method, where we perform an additional query and use the values it returns. Second, we'll look at using the field mapping's separator method to take a source column that holds multiple comma-separated values and import them as individual field values.
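Sketched below is what each approach can look like in a migration class. The field names, source columns, connection key, and table are assumptions, and a given field would use one approach or the other, not both.

```php
// 1. The source column already holds several values in one string, for
//    example "SS,2B,OF". separator() splits it so each piece is saved as a
//    separate value on the multi-value field.
$this->addFieldMapping('field_positions', 'positions')
  ->separator(',');

// 2. Build the list yourself. The callback receives the original source
//    value, and whatever array it returns becomes the field's values.
$this->addFieldMapping('field_awards', 'playerID')
  ->callbacks(array($this, 'lookupAwards'));

// Elsewhere in the same class.
public function lookupAwards($player_id) {
  return Database::getConnection('default', 'for_migration')
    ->select('awards', 'a')
    ->fields('a', array('award_name'))
    ->condition('a.playerID', $player_id)
    ->execute()
    ->fetchCol();
}
```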
The infamous causality dilemma of the chicken and the egg asks which of the two came first, while constantly running into the fact that you need one to produce the other. It's a vicious circle. In this lesson we're going to explore this dilemma in the context of data migrations. Imagine a scenario where you've got an article node type with a reference field for similar articles, which you need to populate with the node IDs of those similar articles. During a migration, when an article is being imported, the articles it references may or may not exist yet. If they don't exist yet, how do we know what IDs to put into the reference field?
One option would be to solve this problem with multiple passes: a first pass that goes through and creates all the articles, and a second that comes back through and updates the similar articles field. But what happens if the similar articles field is required? You wouldn't be able to save the article without a value in that field the first time around. You can see how this quickly becomes another example of the chicken-or-the-egg problem.
Luckily for us, the Migrate module has a solution to this called stub migrations: a process that creates a stub, or shell, for the referenced-but-not-yet-created article so that we can use its unique ID. Then, when that article is encountered later in the migration, it updates the stub rather than creating a new article.
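In code, a stub setup might look something like the sketch below: the field mapping routes the reference through the article migration itself, and a createStub() method tells Migrate how to build the placeholder node. The names are made up, and the exact createStub() signature has varied between Migrate releases, so treat this purely as a sketch.

```php
// In the article migration's constructor: translate the referenced source
// IDs into node IDs using this same migration's map table.
$this->addFieldMapping('field_similar_articles', 'similar_ids')
  ->separator(',')
  ->sourceMigration('Article');

// When a referenced article hasn't been imported yet, Migrate asks the
// migration to create a stub and records its node ID in the map table.
// Later, when the real source row arrives, the stub is updated in place.
protected function createStub($migration) {
  $node = new stdClass();
  $node->type = 'article';
  $node->title = t('Stub article');
  $node->uid = 1;
  node_save($node);
  return isset($node->nid) ? array($node->nid) : FALSE;
}
```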
Additional resources
In this lesson we're going to take a look at creating relationships between two Drupal nodes during a migration. In our case we've got player and team nodes, and each player node has an entity reference to a team node which we need to populate during our migration. In order for this to work we need to ensure that the team node has already been created so that we know the unique node.nid to use in the entity reference field for the player.
To accomplish this we're going to write a migration for the team data and ensure that it runs prior to our player migration. Then we're going to make use of the mapping between source and destination rows that the Migrate module tracks for teams, so that during the player migration we can look up the corresponding team node's nid and make use of it.
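The relevant pieces of the player migration could look like this sketch; the migration machine name, field name, and source column are assumptions.

```php
// In the player migration's constructor. Declaring the dependency ensures
// the Team migration is run first.
$this->dependencies = array('Team');

// sourceMigration() looks in the Team migration's map table and swaps the
// source teamID for the nid of the team node it created.
$this->addFieldMapping('field_team', 'teamID')
  ->sourceMigration('Team');
```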
In this lesson we're going to finish mapping the fields for the player migration and learn how to deal with source data that needs some additional massaging before being saved to the destination. We'll learn about field mapping callback functions and the migration's prepareRow method as possible spots to perform additional logic during a migration. Then we'll use these techniques to combine a player's first and last names into the node title, build our birth and death date fields by concatenating three source columns into a single date string, and finally add some extra information to the notes field during import that will allow us to track imported records in the future.
Note: This lesson was recorded using the 7.x-2.6 version of the Date module; the 7.x-2.7 version is now out and contains some changes to that module's integration with the Migrate module. The biggest change is that the date_migrate module is no longer required and has been deprecated. You can read more about the changes here: https://drupal.org/node/2034231
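A prepareRow() implementation along the lines described above could handle both the name and the date massaging. The column names follow the Lahman player table, but treat the details as illustrative.

```php
public function prepareRow($row) {
  if (parent::prepareRow($row) === FALSE) {
    return FALSE;
  }

  // Combine first and last name so the constructor can simply map the
  // computed column: $this->addFieldMapping('title', 'full_name');
  $row->full_name = trim($row->nameFirst . ' ' . $row->nameLast);

  // Collapse the three date columns into one string for the date field.
  if (!empty($row->birthYear)) {
    $row->birth_date = sprintf('%04d-%02d-%02d',
      $row->birthYear, $row->birthMonth, $row->birthDay);
  }

  return TRUE;
}
```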
Additional resources
Although not required when writing your own migration, the Migrate module provides ways to decorate our migrations with additional information, making it easier to keep track of who is working on what, what needs to get done, and related issues. We've already seen some of this in previous lessons with the Do Not Migrate option for fields and the ability to provide field mapping descriptions. In this lesson we'll take a look at how we can use these additional tools to do things like show the name, contact info, and role of each person working on the migration; pose questions to other team members via the migration UI; and link individual field mappings to related tickets in our bug tracking software. This makes it easier for team members who don't want to be involved with the code to help move our migration along.
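Here's a hedged sketch of what that decoration looks like in a migration's constructor; the people, issue URL pattern, field names, and ticket number are placeholders.

```php
// People involved in the migration, shown in the Migrate UI.
$this->team = array(
  new MigrateTeamMember('Jane Developer', 'jane@example.com', t('Implementer')),
  new MigrateTeamMember('Joe Client', 'joe@example.com', t('Product owner')),
);

// :id: is replaced by the number passed to issueNumber() below, giving a
// clickable link to the ticket in the issue tracker.
$this->issuePattern = 'https://example.com/issues/:id:';

$this->addFieldMapping('field_team', 'teamID')
  ->description(t('Look up the team node created by the Team migration.'))
  ->issueGroup(t('Client questions'))
  ->issueNumber(42)
  ->issuePriority(MigrateFieldMapping::ISSUE_PRIORITY_LOW);
```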
Additional resources
In this lesson we'll continue to add field mappings to the basic migration class created in the previous lesson. We'll see how to add more information about the available source fields. We'll map more of the player source data fields to their equivalent destination fields and learn about some of the many ways that fields can be mapped. Finally, we'll cover mapping subfields, which allows us to import data for things like the text format of a node's body field or the alt column of an image field: information that lives alongside the field's primary value.
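Subfields are addressed with a colon after the field name; here's a short sketch with made-up field and source names.

```php
// Map the main body value, then set its text format via the :format subfield.
$this->addFieldMapping('body', 'bio');
$this->addFieldMapping('body:format')
  ->defaultValue('filtered_html');

// Image fields expose subfields too, such as the alt text.
$this->addFieldMapping('field_photo', 'photo_filename');
$this->addFieldMapping('field_photo:alt', 'photo_caption');
```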
Additional resources
Running your own migration requires a bit of setup and boilerplate code: Drupal needs to know where to find the source data, and the Migrate module needs some basic information about our custom code. In this lesson we'll look at setting up Drupal to connect to multiple databases, then create the skeleton of a simple module to house our custom migration code along with an implementation of hook_migrate_api(). Finally, we'll create a base class from which we can begin writing our own custom migrations, and talk about why this is a good way to start organizing them.
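As a rough sketch, those two pieces of wiring can look like this; the connection key, module name, group, and class names are placeholders, and the registration format shown assumes Migrate 7.x-2.6.

```php
<?php

// In settings.php: a second connection pointing at the source database.
$databases['for_migration']['default'] = array(
  'driver' => 'mysql',
  'database' => 'lahman',
  'username' => 'dbuser',
  'password' => 'dbpass',
  'host' => 'localhost',
);

// In baseball_migrate.migrate.inc: tell the Migrate module about our code.
function baseball_migrate_migrate_api() {
  return array(
    'api' => 2,
    'groups' => array(
      'baseball' => array('title' => t('Baseball imports')),
    ),
    'migrations' => array(
      'Player' => array(
        'class_name' => 'PlayerMigration',
        'group_name' => 'baseball',
      ),
    ),
  );
}
```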
Before we can write our own custom migration we need to construct the site that we're going to import data into, and of course we need some source data to import. In this lesson we'll obtain some source data to work with and configure our Drupal site by installing a couple of contributed modules and creating the content types and fields our information architecture calls for. Along the way we'll also take a look at the baseball player and team data that we'll be importing and familiarize ourselves a bit more with the tables and columns in our source database.
Note: The Lahman database structure has changed since this video was recorded, and the latest files provided by Lahman don't match what is used in the video. When following along with these examples you'll either need to use the 2012 source data, or make a few adjustments to field/table names throughout the source code as you follow along so that they match the current structure of the Lahman data.
Additional resources
In this lesson we're going to write our first custom data migration and start importing some of the player data into Drupal. The primary concepts covered are creating a migration class that follows the pattern the Migrate module needs in order to discover our code, and defining source and destination objects so the Migrate module knows where to read data from and where to write it to. Finally, we'll add a simple single-field mapping from the player's name to the node title, which lets us run the migration for the first time and import some real data.
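Pulling those pieces together, a first pass at the migration class might look like this condensed sketch; the class, content type, connection key, table, and column names are assumptions, and it presumes Migrate 7.x-2.6.

```php
<?php

/**
 * A condensed sketch of a first player migration.
 */
class PlayerMigration extends Migration {
  public function __construct($arguments) {
    parent::__construct($arguments);
    $this->description = t('Import baseball players as player nodes.');

    // Where the data comes from: a query against the source database.
    $query = Database::getConnection('default', 'for_migration')
      ->select('master', 'm')
      ->fields('m', array('playerID', 'nameFirst', 'nameLast'));
    $this->source = new MigrateSourceSQL($query, array(), NULL,
      array('map_joinable' => FALSE));

    // Where the data goes: nodes of the 'player' content type.
    $this->destination = new MigrateDestinationNode('player');

    // How source rows are tracked against the nodes they create.
    $this->map = new MigrateSQLMap($this->machineName,
      array('playerID' => array(
        'type' => 'varchar',
        'length' => 255,
        'not null' => TRUE,
      )),
      MigrateDestinationNode::getKeySchema()
    );

    // A single mapping, just enough to run the migration end to end.
    $this->addFieldMapping('title', 'nameLast');
  }
}
```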