Automate Database Backups with MySqlDump and PowerShell

Wednesday May 16th 2012 by Rob Gravelle

One of the Database Administrator's (DBA's) most important duties is performing regular scheduled database backups. Equally important is the maintenance of the backup files and practicing the restoration procedure. All in all, it can be a time-intensive endeavor. That's why having an established, reliable backup and restore process is so important. To this end, tools like mysqldump are indispensable. Now, with the introduction of PowerShell, database backups can be more automated than ever.

Using mysqldump

The mysqldump client is a free command-line utility that comes with MySQL to perform database backups and restores. It usually creates *.sql files containing the DROP TABLE, CREATE TABLE, and INSERT INTO SQL statements of the source database, but it can also be used to generate files in CSV (or other delimited text) or XML format. To restore the database, simply execute the *.sql file against the destination database. Here's the basic syntax for both:

# mysqldump -u root -p[root_password] [database_name] > dumpfilename.sql

# mysql -u root -p[root_password] [database_name] < dumpfilename.sql

For instance:
mysqldump -u root -padm#1$$$ sonar > sonar.sql

You may also want the stored procedures, functions, and triggers, unless you don't mind spending weeks rebuilding them. By default, mysqldump will back up all the triggers but NOT the stored procedures/functions. There are two mysqldump parameters that control this behavior:

  • --routines – FALSE by default
  • --triggers – TRUE by default

This means that if you want to include the triggers and stored procedures, you only need to add the --routines command-line parameter:

mysqldump <other mysqldump options> --routines > dumpfilename.sql 

It can sometimes be useful to import stored procedures and triggers to another database that already contains the data. In that case, you could run something like:

mysqldump --routines --no-create-info --no-data --no-create-db --skip-opt <database> > dumpfilename.sql

Then to import them to another database server you would run something like:

mysql <database> < dumpfilename.sql

Backing up Multiple Databases

Mysqldump also supports the designation of multiple or all databases to back up.

mysqldump -u root -padm#1$$$ --databases db1 db2 > db1_and_2.sql

The following example backs up all the databases on the MySQL server:

mysqldump -u root -padm#1$$$ --all-databases > c:\backups\all-databases.sql

In order to decide which databases to back up, you can execute the SHOW DATABASES command:
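For example, from the command line (assuming the mysql client is on your PATH and the root password is entered at the prompt):

```shell
# List every database on the server to decide which ones need backing up.
mysql -u root -p -e "SHOW DATABASES;"
```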


Where PowerShell Fits In

PowerShell's usefulness in database backup jobs is not so much the automation part; mysqldump can be automated on its own using cron jobs or the Windows Task Scheduler. Rather, it's the ability to easily configure connection information for different environments, create dynamic backup file names incorporating information such as a timestamp, verify that the backup completed successfully, and provide many options for handling things when they go wrong.

There's no reason to create such a script from scratch, as there are many good ones to be found online. A popular example can be found on the Applied Consultancy knowledgebase.

It does all of the things mentioned above to fully automate the backup process. For instance, it uses a SELECT statement like the following to generate a timestamp, which is then used to create the backup file names:

                SELECT SCHEMA_NAME, NOW() as TIMESTAMP FROM information_schema.SCHEMATA;

...which results in:

Company            2012-05-01 13:02:44
information_schema 2012-05-01 13:02:44
mysql              2012-05-01 13:02:44
performance_schema 2012-05-01 13:02:44
sonar              2012-05-01 13:02:44
test               2012-05-01 13:02:44
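The timestamp can also be generated in PowerShell itself with Get-Date, which avoids a round trip to the server. A minimal sketch; the variable names here are illustrative, not taken from the script:

```powershell
# Build a backup file name such as sonar_2012-05-01_13-02-44.sql
$dbname     = 'sonar'
$timestamp  = Get-Date -Format 'yyyy-MM-dd_HH-mm-ss'
$backupfile = '{0}_{1}.sql' -f $dbname, $timestamp
$backupfile
```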

It then invokes the backup command. The /c switch tells cmd.exe to run the command and wait for it to complete before exiting:

cmd /c " `"$pathtomysqldump`" -h $mysql_server -u $mysql_user -p$mysql_password $dbname > $backuppathandfile "

Copying stored procedures would just be a matter of adding the --routines flag to the command:

cmd /c " `"$pathtomysqldump`" --routines -h $mysql_server -u $mysql_user -p$mysql_password $dbname > $backuppathandfile"

Redirecting Mysqldump Errors

One thing missing from the Applied Consultancy script is that mysqldump errors are not trapped. We can wrap the execution so that stdout is piped to our backup file, while stderr is redirected to stdout so that it can be assigned to a variable. We should ensure that the stderr output variable is an array, because it could contain one or more error records. Each error object has a TargetObject property containing the original stderr output. Here's some code that captures every error record by wrapping the original dump statement in an @() array constructor:

$err = @((mysqldump -uroot -p --databases nonexistent | out-file C:\backup.sql -enc ascii) 2>&1)
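To surface those errors, a short follow-up sketch (fed a synthetic $err record here so it runs stand-alone; in the script, $err comes from the mysqldump pipeline above):

```powershell
# Print each trapped stderr line; TargetObject holds the raw mysqldump output.
$err = @([pscustomobject]@{ TargetObject = 'mysqldump: Got error: 1049: Unknown database ''nonexistent''' })
foreach ($e in $err) {
    Write-Host $e.TargetObject
}
```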

Zipping Backup Files

Even in their backed-up state, the databases can be quite large. Luckily, there are a few options to reduce the size. One is the mysqldump --compress flag, which compresses the data sent between the client and the server, provided both support compression. Note that this reduces network traffic rather than the size of the dump file itself.

To shrink the files on disk, you can use a zip utility such as WinZip or Info-ZIP; both support command-line calls. Here is some PowerShell code that compresses the backup files using Info-ZIP:

#Zip the backup files (Info-ZIP expects the archive name as the first argument)
Set-Location C:\autobackup\databases
./zip backups.zip *.sql *.bak | Out-Null
Remove-Item *.bak
Remove-Item *.sql

The Set-Location cmdlet points the PowerShell prompt at the folder where the zip.exe program, as well as the backup (.sql) files, resides. After zipping them up, the script deletes the original backup files using the Remove-Item cmdlet.

Automating the Process using the Windows Task Scheduler

For those of you on Windows, setting up the Windows Task Scheduler to run PowerShell scripts is not as easy as you might expect. To run the script, you need to pass powershell.exe the name of the script you want to run, along with some supplementary arguments, using the 'Add Arguments' field. Here are the necessary arguments:

-NonInteractive -NoProfile -Command "& 'c:\scripts\serverback.ps1'"

Notice that before you pass the script name, you have to prepend an ampersand '&' symbol and enclose the entire script path in quotes. Omitting either will prevent your script from running. There is more information about configuring the Windows Task Scheduler to run PowerShell scripts on the TechHead site.
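If you'd rather script the scheduling step too, the same task can be registered from an elevated command prompt with schtasks; the task name, start time, and script path below are illustrative:

```shell
schtasks /Create /TN "MySQL Backup" /SC DAILY /ST 02:00 /TR "powershell.exe -NonInteractive -NoProfile -Command \"& 'c:\scripts\serverback.ps1'\""
```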

Finally, don't forget to run Set-ExecutionPolicy RemoteSigned once from your PowerShell prompt to set up the necessary permissions, or script execution will be blocked.


And Finally…

One thing I've noticed about backups is just how quickly they can take up a whole hard disk. There are two ways to avoid running out of drive space. The first is to buy a ridiculously large disk; unless you want to remap your drives every now and again, you need a lot of space. There was a time when that wasn't really feasible, but nowadays, spending a little extra can buy some serious storage. The second option is to schedule the deletion of archived databases after a certain allotted amount of time. Ideally, you would follow both practices.
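The second practice can be sketched in a few lines of PowerShell; the path and the 30-day window are illustrative:

```powershell
# Delete zipped backups older than 30 days.
$backupdir = 'C:\autobackup\databases'
Get-ChildItem -Path $backupdir -Filter '*.zip' -ErrorAction SilentlyContinue |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item
```

Schedule this alongside the backup script itself, and the archive folder becomes self-pruning.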

