Easily back up your Dynamics GP SQL Server databases to Azure Storage within minutes!


By Steve Endow

At the GP Tech Conference 2017 in Fargo, ND, Windi Epperson from Advanced Integrators presented a session on Disaster Recovery. One topic she covered was utilizing the Dynamics GP Back Up Company function to store SQL backups in Azure.

Backing up SQL Server databases to Azure is a smart move. There are many stories of backups being mishandled, lost, or poorly maintained, and having a way to periodically send a copy to Azure is a great safety net.

However, this feature in the Dynamics GP client requires manual intervention: it has no scheduling, and it doesn't use the “Copy-only backup” option, so its backups become part of the SQL backup chain if the customer also has automated SQL backups. As Windi noted, it might be suitable for small businesses that can consistently handle the manual task.

What about creating a SQL Server backup job to send occasional backups to Azure?

The process is surprisingly simple, requiring only a few minutes to set up and execute your first backup to Azure Storage.

NOTE: SQL Server backup to URL appears to be supported on SQL Server 2012 SP1 CU2 and later. The backup command syntax may differ slightly between SQL 2012/2014 and SQL 2016.
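Before going further, it's worth confirming the version of your SQL Server instance. A quick check:

SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel') AS ProductLevel,
       SERVERPROPERTY('Edition') AS Edition;
-- Backup to URL requires SQL Server 2012 SP1 CU2 or later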

The most challenging part is setting up your Azure account and creating the required Azure Storage account. It took me a bit of effort to figure out the appropriate settings.

Firstly, an Azure account is required, which should be straightforward to set up. You can begin the process here: https://azure.microsoft.com/en-us/free/

After setting up your Azure account and logging into the Azure Portal (https://portal.azure.com), select the “More Services” button at the bottom of the services list on the left. Type “storage” in the search bar, and a few choices will appear.

The newer “Storage Accounts” option (not “classic”) was my preference. To keep this in your services list, click the star on the right.

When the Storage Accounts page loads, select the New button at the top.

This will open the Create Storage Account page.

To create a storage account, give it a unique name and choose General Purpose. SQL Server 2014 doesn't appear to work with the Blob storage account type, though that type might be compatible with SQL Server 2016.

Select a Replication type. The more robust replication options cost more, though I'm unsure of the specific pricing for each. Geo Redundant Storage is advisable if you want to be able to recover your data even if a single data center becomes inaccessible or is destroyed.

Pick an existing resource group or make a new one. Then choose where you want to locate your storage account. After configuring all settings, click Create. You can check the Pin to dashboard box to quickly access your account from the Azure dashboard.

The storage account creation will take a few moments.

When finished, it will display as Available.

Select the storage account to see its settings. Click the Containers item located under “Blob Service” on the left. Then click the + Container button to create a new storage container.

Name the new container and set the access level to Private.

Once the container is set up, click on “Access keys” in the storage account’s left-side menu.

Copy these keys and save them in a secure location.

Next, use your storage account name and one of the keys to create a “credential” on your SQL Server.

CREATE CREDENTIAL azuresqlbackup
WITH IDENTITY = 'mygpbackups'   /* your Azure storage account name */
, SECRET = 'yourazurestorageaccountkeyhere==';   /* one of your storage account access keys */
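
To confirm that the credential was created (SQL Server never exposes the secret itself), you can query sys.credentials:

SELECT name, credential_identity, create_date
FROM sys.credentials;
-- credential_identity should match your storage account name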

You can then execute a simple backup script that utilizes the created SQL credential to perform the backup.

This script uses the COPY_ONLY option so it won't disturb the backup chain of your local database backups. It also uses the COMPRESSION option, which significantly reduces the backup file size and speeds up the backup.

BACKUP DATABASE [DYNAMICS]
TO URL = 'https://mygpbackups.blob.core.windows.net/gpsqlbackups/2024-03-11T10:59:59ZDYNAMICS.bak'
/* the URL is the blob service endpoint, followed by the container name and the backup file name */
WITH CREDENTIAL = 'azuresqlbackup', COPY_ONLY, COMPRESSION;
/* the name of the credential you created in the previous step */
GO

BACKUP DATABASE [TWO]
TO URL = 'https://mygpbackups.blob.core.windows.net/gpsqlbackups/2024-03-22T00:00:41ZTWO.bak'
WITH CREDENTIAL = 'azuresqlbackup', COPY_ONLY, COMPRESSION;
GO

Execute one of these backup scripts, and if everything is configured correctly, SQL Server will transfer the .bak file to Azure.
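
If you want to verify the uploaded backup without restoring it, RESTORE VERIFYONLY also accepts a URL and the same credential. A quick sketch, using a hypothetical file name:

RESTORE VERIFYONLY
FROM URL = 'https://mygpbackups.blob.core.windows.net/gpsqlbackups/DYNAMICS.bak'
WITH CREDENTIAL = 'azuresqlbackup';
-- Confirms the backup in Azure is complete and readable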

Testing this backup on a GP 2016 virtual machine with a 100 Mbps internet connection yielded impressive results.

A 250MB DYNAMICS database was backed up in 13.6 seconds, while a 1.3GB TWO database took 96 seconds.

And that was without compression! Enabling compression significantly shrinks the backup file and drastically cuts the backup time: DYNAMICS dropped from 13.6 seconds to 4.5 seconds, and TWO from 96 seconds to 14 seconds.

During the backups, my internet upload speed peaked at around 120 Mbps.
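
If you'd like to measure the effect of compression yourself, the backup history in msdb records both raw and compressed sizes along with start and finish times:

SELECT TOP 10 database_name, backup_start_date, backup_finish_date,
       backup_size / 1048576.0 AS backup_size_mb,
       compressed_backup_size / 1048576.0 AS compressed_mb
FROM msdb.dbo.backupset
ORDER BY backup_finish_date DESC;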

Based on these tests, using Azure Storage for SQL Server backups seems to be a great choice. It doesn’t have to replace your primary daily backup solution, but I highly suggest customers send a couple of backups to Azure every week for safekeeping.
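
If you'd rather not paste a new file name into the script each time, it can build a timestamped URL itself, which makes it easy to run from a SQL Agent job on a weekly schedule. A minimal sketch, assuming the storage account, container, and credential names used above:

DECLARE @url nvarchar(400) =
    N'https://mygpbackups.blob.core.windows.net/gpsqlbackups/DYNAMICS_'
    + REPLACE(CONVERT(nvarchar(19), GETUTCDATE(), 126), ':', '-')
    + N'.bak';
-- Style 126 yields an ISO 8601 timestamp; colons are replaced for a cleaner URL

BACKUP DATABASE [DYNAMICS]
TO URL = @url
WITH CREDENTIAL = 'azuresqlbackup', COPY_ONLY, COMPRESSION;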

Steve Endow is a Microsoft MVP in Los Angeles. He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+.

http://www.precipioservices.com

Licensed under CC BY-NC-SA 4.0