April 23, 2020

Importing MySQL Databases to MySQL Backed Aurora PART 2

Backup your database so that RDS can import it for you.

As I discussed in my previous post, you can simply dump an SQL file generated by mysqldump into your Aurora instance. If your database is more than a couple of gigabytes in size, however, this can be time-consuming and costly, as you need an EC2 instance with decent bandwidth available to perform the import from.

Fortunately, Amazon allows you to import your database in a couple of different ways, one of which is to restore it from a backup located in S3.

There are a number of caveats to this, which are all listed here.

Some points to note are:

  • You can only import your data to a new DB instance, not an existing DB instance.
  • You must use Percona XtraBackup to create the backup of your on-premises database.
  • You can’t restore from an Amazon S3 bucket in a different AWS Region than your Amazon RDS DB instance.

There are other important points listed in that part of the RDS manual, so you should read them at the very least to ensure you don’t run into any unexpected trouble.
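Once the backup is in S3, the restore itself is a single call to the RDS API. As a sketch, assuming the AWS CLI is installed and configured, a new Aurora cluster can be created from the backup with restore-db-cluster-from-s3. Every identifier, version, bucket name, and ARN below is a placeholder:

```
$ aws rds restore-db-cluster-from-s3 \
    --db-cluster-identifier my-restored-cluster \
    --engine aurora \
    --master-username admin \
    --master-user-password 'choose-a-password' \
    --source-engine mysql \
    --source-engine-version 5.6.40 \
    --s3-bucket-name my-backup-bucket \
    --s3-prefix backups \
    --s3-ingestion-role-arn arn:aws:iam::123456789012:role/rds-s3-import
```

The IAM role you pass as --s3-ingestion-role-arn must grant RDS read access to the bucket holding your backup.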

Getting the Tools

As mentioned above, you need to back up your database using the Percona XtraBackup tool. Additionally, you may also want to install the innobackupex tool, a wrapper script, also from Percona, around the XtraBackup tool.

Details on how to install these tools, along with detailed instructions can be found on the Percona website:

xtrabackup

innobackupex

A Basic Backup

To create a full backup of your database server:

$ innobackupex --user=DBUSER --password=DBUSERPASS /path/to/BACKUP-DIR/

Once it’s complete, you can run your BACKUP-DIR through tar and gzip to create a tar.gz file, upload that to S3, and proceed.
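For illustration, the packaging step looks like this. The backup directory here is a throwaway stand-in created just for the demo (in practice it’s the BACKUP-DIR populated by innobackupex), and the bucket in the commented-out upload is hypothetical:

```shell
# Stand-in for the XtraBackup output directory; in practice this is
# the /path/to/BACKUP-DIR you passed to innobackupex above.
BACKUP_DIR=$(mktemp -d)
echo "demo" > "$BACKUP_DIR/ibdata1"

# Pack the whole backup directory into a single tar.gz archive.
tar -czf backup.tar.gz -C "$BACKUP_DIR" .

# Upload to S3 (commented out here; the bucket name is hypothetical
# and the AWS CLI needs credentials with s3:PutObject on it):
# aws s3 cp backup.tar.gz s3://my-backup-bucket/mysql/backup.tar.gz
```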

However, this can take a fair bit of time, especially if your database is a few hundred gigabytes in size.

A Better Backup

Fortunately, Percona has a better way to create backups with xtrabackup that allows for faster, and smaller backup files.

Percona have their own custom streaming format called xbstream. You can read more about it here.
But for this, all you need to know is that xbstream allows simultaneous compression and streaming, so we can perform the backup and the compression of our archive at the same time. Couple that with the --parallel option, which allows us to back up files in … well, parallel, and the --compress-threads option, which allows us to compress the files in parallel, and we can dramatically speed up our backup.

$ innobackupex --login-path=mysql_login \
  --parallel=10 \
  --compress \
  --compress-threads=4 \
  --stream=xbstream /path/to/BACKUP-DIR > backup.xbstream

This results in a single .xbstream file, which you can then upload to S3 ready for your import.
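In fact, because xbstream writes to standard output, you can skip the intermediate local file entirely and pipe the backup straight into S3. This is a sketch assuming the AWS CLI is installed (aws s3 cp accepts - to read from stdin); the login path and bucket name are example values:

```
$ innobackupex --login-path=mysql_login \
    --parallel=10 --compress --compress-threads=4 \
    --stream=xbstream /tmp \
  | aws s3 cp - s3://my-backup-bucket/mysql/backup.xbstream
```

The trade-off is that without a local copy you can’t inspect or verify the archive before it lands in S3, so for a first run you may prefer the two-step approach above.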