One of the features I’m really looking forward to in SQL 2008 is backup compression. I have found that compressing a SQL backup (data or log file) reduces the file to 10-25% of its original size. If you have the time and the CPU to compress your backups, you can take a significant load off of your backup media.
Unfortunately, SQL 2005 has no native way to compress backups. In this post I’ll detail how to implement compression yourself. It’s not as clean a solution as SQL 2008 (where you just check a box), however it’s reliable, free, and meets the desired goal.
The first step is to produce the backup files as normal. Nothing changes here. Just create your backups with the native backup tools as you always have.
After the backup has been created we will use gzip to compress the files once they have been written to disk. Gzip has been in the Unix world forever, and a Windows port of it is available at http://www.gzip.org/. Grab the latest version and place it in your path (I put it in C:\windows\system32).
We want gzip to compress every file in the backup directory. Gzip skips files that are already compressed (anything with a .gz extension), so we just have to point it at the right directory and let it run:
gzip -r --fast d:\SQLBackup
The -r flag recursively follows sub-directories.
The --fast flag uses the fastest compression level. You can remove it to use normal compression, or use --best to have gzip compress the data as much as possible. There is a trade-off between the time spent compressing and how much space you save. You just need to try it with your data, your systems, and your backup windows to see which setting makes the most sense for you.
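The compress step is easy to try out on any machine with gzip installed. A minimal sketch, using a hypothetical /tmp directory in place of d:\SQLBackup:

```shell
# Hypothetical demo directory standing in for d:\SQLBackup
mkdir -p /tmp/sqlbackup_demo/FULL

# Create a sample "backup" file with highly compressible content
yes 'sample backup data' | head -n 10000 > /tmp/sqlbackup_demo/FULL/db.bak

# Recursively compress everything under the directory with fast compression;
# gzip skips files that already end in .gz, so re-running it later is safe
gzip -r --fast /tmp/sqlbackup_demo

ls /tmp/sqlbackup_demo/FULL   # db.bak.gz
```

Each file is replaced in place by a .gz version, which is why the cleanup step below has to change too.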
Unfortunately, the standard “Maintenance Cleanup Task” provided in maintenance plans will only remove standard SQL backup files. So once we compress the files, we are no longer able to remove old backups with the native tools.
The solution I use is a command-line utility called delold. Once again I place delold in the path (I put it in C:\windows\system32). To remove files older than 14 days from the folder d:\SQLBackup we run:
delold d:\SQLBackup -rd 14
The -r once again causes sub-directories to be followed recursively.
The “d 14” causes files with a timestamp older than 14 days to be deleted.
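delold is Windows-only, but the age-based delete it performs is easy to illustrate. Here is a runnable sketch of the same logic using the Unix find utility (the /tmp path is hypothetical; on Windows, the forfiles utility offers similar functionality if delold is not an option):

```shell
# Hypothetical backup directory standing in for d:\SQLBackup
mkdir -p /tmp/sqlbackup_age_demo
touch /tmp/sqlbackup_age_demo/recent.bak.gz

# Back-date one file so it looks 20 days old
touch -d '20 days ago' /tmp/sqlbackup_age_demo/old.bak.gz

# Delete files with a timestamp older than 14 days, recursing into
# sub-directories -- the same effect as "delold d:\SQLBackup -rd 14"
find /tmp/sqlbackup_age_demo -type f -mtime +14 -delete

ls /tmp/sqlbackup_age_demo   # recent.bak.gz
```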
Now that we have the steps to compress and clean up your backups, we now need to add them to your currently scheduled backup jobs.
We want to run compression after every successful backup job. To do this, open the SQL Agent job, add a step of type “Operating system (CmdExec)”, and give it the gzip command line from above:

gzip -r --fast d:\SQLBackup
Personally I have the removal of old backups run after successful backups; however, you may run it in a separate SQL Agent job. Either way, everywhere you had a “Maintenance Cleanup Task” you need to remove that step from your maintenance plan and add a new step to the SQL Agent job that runs delold.
Once again, in the SQL Agent job create a step of type “Operating system (CmdExec)” to run delold:

delold d:\SQLBackup -rd 14
Finally, we need to configure the steps to run in the correct order and to stop if a step fails. I’m going to assume that you have the Backup, Compression, and Cleanup steps all in the same SQL Agent job; if not, you will need to modify these steps to fit your needs.
On the backup step, set “On success action” to “Go to the next step”. This means that if the backup step succeeds, the job continues on to compress the backups.
However, if the backup fails there is no reason to run compression. Set “On failure action” to “Quit the job reporting failure” to end the SQL Agent job. Set the Compress step the same way (except on success it should go to the Cleanup step).
The only difference on the Cleanup step is to “Quit the job reporting success” when cleanup completes successfully.
If you’re looking to save significant disk space and you cannot move to SQL Server 2008 yet, this may be a good solution for you. Just remember that you’re trading disk space for CPU and backup time. You’ll also have to decompress your backups before you can restore, so that may increase your restore times.
However, you probably need to move the backup files across the network during your backup and during your restore. The decreased size may save you more time moving data across the network than the compression/decompression takes. Again, you will just have to run tests in your environment to see what compression does to your backup windows and your storage needs.
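The restore-side decompression step is just gzip in reverse. A minimal sketch, with a hypothetical /tmp path standing in for a real compressed backup:

```shell
# Hypothetical compressed backup standing in for a real .bak.gz file
mkdir -p /tmp/sqlrestore_demo
printf 'pretend backup contents\n' > /tmp/sqlrestore_demo/db.bak
gzip /tmp/sqlrestore_demo/db.bak

# Before RESTORE DATABASE can read the file it must be decompressed;
# gunzip replaces db.bak.gz with the original db.bak
gunzip /tmp/sqlrestore_demo/db.bak.gz

cat /tmp/sqlrestore_demo/db.bak   # pretend backup contents
```

Budget for this extra step (and the disk space for the uncompressed file) when estimating restore times.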
As you can see, doing compression without a 3rd-party product isn’t the prettiest configuration you’ll ever have, and there is some significant time spent configuring the solution up front. However, in my experience, once the solution is in place it is rock solid.