Introduction
This article is a short "how to" that demonstrates how to back up databases running inside a SQL 2012 Virtual Machine on the Azure platform, directly from the Virtual Machine to a blob container.
Background
For various reasons, including security and privacy, we have had to host some recent multi-tenant applications in separate MS SQL databases. The databases themselves are hosted in multiple SQL 2012 data-servers, running on multiple different Virtual Machines. While there are a number of good database backup providers on the market, none had a pricing model that suited our particular configuration, so we had to put something together ourselves. This article is the result; hopefully it can be of benefit to someone else!
The method detailed here is based on SQL Server's "BACKUP TO URL" feature (introduced in SQL Server 2012 SP1 CU2).
The setup
We have one core database that contains a table listing each separate client database; this acts as our starting point. We get a list of these databases and back up each one in turn. You don't need this of course; you can simply enumerate all databases in the data-server if you wish (see the sketch below) - this is simply our approach. To do the actual backup, we use the enhanced "BACKUP" SQL command that allows the use of a URL as a target device.
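If you prefer to enumerate every database on the data-server instead, a query along these lines would feed the same process (a sketch only; database_id > 4 simply skips the four system databases master, tempdb, model and msdb):

SELECT name FROM sys.databases WHERE database_id > 4;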
The target information required for the remote backup "device" is as follows:
http[s]://ACCOUNTNAME.blob.core.windows.net/CONTAINERNAME/FILENAME.bak
The relevant parts are:
(1) You need to enter your ACCOUNTNAME on Azure
(2) You give a container name (we called ours "SQLBackup")
(3) You give a filename (we created one dynamically)
The full SQL is simple:
BACKUP DATABASE AdventureWorks2012 TO URL = 'https://ACCOUNTNAME.blob.core.windows.net/mycontainer/AdventureWorks2012_1.bak' WITH CREDENTIAL = 'mycredential', STATS = 5;
Apart from the items (1..3) above, you are providing a "CREDENTIAL" to the command. The credential holds your ACCOUNTNAME on Azure PLUS your storage account SHARED KEY.
To get both, go to your Azure dashboard and get the keys by selecting your storage account and clicking "manage keys" at the bottom of the screen.
Once you have the keys, you then need your blob endpoint name; this is normally the same as your storage account name, but if not, it can be found at the bottom of the blob dashboard.
To create a credential called, for example, “MyCred”, you need to run this piece of SQL:
IF NOT EXISTS (SELECT * FROM sys.credentials WHERE name = 'MyCred')
    CREATE CREDENTIAL MyCred WITH IDENTITY = 'MyAzureStorage', SECRET = 'YOUR SECRET KEY';

Note that the IDENTITY must be your storage account name (not the credential name), and the SECRET is the access key you copied above.
The credential “MyCred” is now available for use.
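If you want to confirm the credential exists (or check which identity it carries), you can query the sys.credentials catalog view; a quick sketch:

SELECT name, credential_identity, create_date FROM sys.credentials;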
To be able to run a test, we need to set up a container in your blob storage account; we called ours “SQLBackup”. Name yours whatever you wish, but note that container names must be lowercase, which is why it appears as “sqlbackup” in the URLs below.
We should have enough now to be able to run a test. Let's assume the following:
Database name = “MyDatabase”
Credential = “MyCred”
Storage account = “MyAzureStorage”
Blob container within the account = “SQLBackup”
Secret = “ABC123”
Our test command is therefore as follows:
BACKUP DATABASE MyDatabase TO URL = 'https://MyAzureStorage.blob.core.windows.net/sqlbackup/MyDatabase.bak'
WITH CREDENTIAL = 'MyCred', COMPRESSION, STATS = 5;
If we run that, we should see a result something like this:
Processed 384 pages for database 'MyDatabase', file 'MyDatabase' on file 1.
100 percent processed.
Processed 2 pages for database 'MyDatabase', file 'MyDatabase_log' on file 1.
BACKUP DATABASE successfully processed 386 pages in 0.173 seconds (17.397 MB/sec).
We can confirm the data has been transferred correctly by looking in our Azure blob container.
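You can also sanity-check the backup from within SQL Server itself. A minimal sketch, assuming the same URL and credential as the test above:

RESTORE VERIFYONLY
FROM URL = 'https://MyAzureStorage.blob.core.windows.net/sqlbackup/MyDatabase.bak'
WITH CREDENTIAL = 'MyCred';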
Wrapping it all together
OK, here’s the pain removal part… let’s put together a stored procedure that iterates through each database and backs it up. The logic is as follows:
(1) Create a temp table to store a list of databases (delete it first if it exists)
create procedure BackupToAzure as
begin
    IF EXISTS (SELECT * FROM tempdb.dbo.sysobjects
               WHERE ID = OBJECT_ID('tempdb.dbo.#TablesToBackup'))
    BEGIN
        DROP TABLE #TablesToBackup
    END

    create table #TablesToBackup (DBName nvarchar(100))
(2) Select the table data and insert it into the temp table. In our case, we are storing the database name along with the data-server name, separated by a colon, so we use the “CHARINDEX” function to split out the part we want first (a worked example follows the INSERT below)
    insert into #TablesToBackup
    select SUBSTRING(LinkName, CHARINDEX(':', LinkName) + 1, 100) as DBaseName
    from ClientDatabases
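As an aside, here is what that split produces for a hypothetical LinkName value of 'MyServer:ClientDB_001' (the sample value is purely for illustration):

-- returns 'ClientDB_001', i.e. everything after the first colon
SELECT SUBSTRING('MyServer:ClientDB_001', CHARINDEX(':', 'MyServer:ClientDB_001') + 1, 100);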
(3) We then use a table cursor to iterate through the temp table, building an execution string as we go, made up of the URL, filename, etc.
    declare @C cursor
    set @C = cursor for
        select * from #TablesToBackup

    declare @F nvarchar(100)
    declare @CMD nvarchar(300)
    declare @URL nvarchar(300)
    declare @Break nvarchar(5)
    declare @FileDate nvarchar(20)

    open @C
    while 0=0
    begin
        fetch next from @C into @F
        if not(@@FETCH_STATUS = 0)
            break

        set @URL = 'https://MyAzureStorage.blob.core.windows.net/sqlbackup/'
        set @Break = '_'
        set @FileDate = Convert(varchar(8), GETDATE(), 112) + '.bak'
        -- e.g. .../sqlbackup/MyDatabase_20140101.bak
        set @CMD = @URL + @F + @Break + @FileDate

        BACKUP DATABASE @F TO URL = @CMD
            WITH FORMAT, CREDENTIAL = 'MyCred', COMPRESSION, STATS = 5
    end
    close @C
    deallocate @C

    DROP TABLE #TablesToBackup
end
OK, almost there. If we run the script, it creates the stored procedure. When we execute the stored procedure, it iterates successfully through each database and backs them up in turn.
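Executing it is then a one-liner:

EXEC BackupToAzure;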
That’s great; the only thing is we don’t want to have to go in manually and run the stored procedure ourselves, so we create a scheduled task to carry it out for us. In MS Management Studio, select SQL Server Agent, right-click, and create a new job.
Give the job a name…
And add a new “Step”.
It is important to select the database in which the Stored Procedure is located, otherwise you may get an error. In the “Command” box, enter your “EXEC” command with the name of your stored procedure.
Finally, create a new SCHEDULE item and fill in the timing details you want…
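If you prefer scripting the job rather than clicking through the GUI, the msdb scheduling procedures can do the same thing. A minimal sketch, assuming a hypothetical job name of “NightlyAzureBackup” and that the procedure lives in a database called “MyCoreDatabase” (adjust both to suit):

USE msdb;
EXEC dbo.sp_add_job @job_name = N'NightlyAzureBackup';
EXEC dbo.sp_add_jobstep
    @job_name = N'NightlyAzureBackup',
    @step_name = N'Run BackupToAzure',
    @subsystem = N'TSQL',
    @database_name = N'MyCoreDatabase',  -- hypothetical: the database holding the procedure
    @command = N'EXEC BackupToAzure;';
EXEC dbo.sp_add_jobschedule
    @job_name = N'NightlyAzureBackup',
    @name = N'Nightly at 2am',
    @freq_type = 4,                      -- daily
    @freq_interval = 1,                  -- every day
    @active_start_time = 020000;         -- HHMMSS, i.e. 02:00:00
EXEC dbo.sp_add_jobserver @job_name = N'NightlyAzureBackup';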
The final thing is to be able to view what’s going on; for this, we need access to the job history, which is accessed with a right-click on the job task. The history gives information on the health of the process.
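If you would rather query the history than browse it, the msdb history tables hold the same data. A quick sketch, using the hypothetical job name from above:

SELECT TOP 20 j.name, h.run_date, h.run_time, h.run_status, h.message
FROM msdb.dbo.sysjobhistory h
JOIN msdb.dbo.sysjobs j ON j.job_id = h.job_id
WHERE j.name = N'NightlyAzureBackup'
ORDER BY h.run_date DESC, h.run_time DESC;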
The only thing remaining is to back up the backup! … for the moment we are testing out Redgate’s solution to back up the *blob container* that contains all of our client data; I may report on that as we see long-term results.
As with all my articles, I ask that you please take a second to rate it so I can track what people are interested in!
Many thanks and happy coding.