
Data in Azure – Part 2
Published date: Saturday, August 2, 2014
On: Moer and Éric Moreau's web site

This is the second (and last) part about storing databases in Azure.

Last month, we copied a local database to a feature of Microsoft Azure called SQL Databases (see http://emoreau.com/Entries/Articles/2014/07/Data-in-Azure--Part-1.aspx).

This month, I will explore how to use Microsoft SQL Server Data Files in Azure. This feature requires that you have Microsoft SQL Server 2014 installed in your enterprise (or at least on your PC for testing).

What are Data Files in Azure?

Last month, we focused on SQL Databases, which lets you use a database without having to install SQL Server in your enterprise.

This month's feature uses the engine in your enterprise but the storage of Azure. Your physical SQL Server files (.mdf and .ldf) are stored in your Azure storage account, while the local engine does all the work.

This means that you can have huge databases used locally while the files really live in the cloud. This is useful:

  • if you expect your data files to grow larger than the space available on your hard drives/SAN
  • to quickly transfer ownership of a database to another server

Downloadable demo

This is the very same application as last month. The sample is provided in both VB and C#. It was created using Visual Studio 2013 but surely can be used with earlier versions.

What you need

This feature is available only with MS SQL Server 2014 and an Azure account.

Create a storage account

You will need a storage account to store the files. From the Azure portal, create one much like what is shown in figure 1.

Figure 1: creating a storage

Once your storage has been created, click on it. A button titled “Manage Access Keys” should be displayed at the bottom of the portal; it opens the dialog shown in figure 2:

Figure 2: Managing your storage keys

You will need one of these 2 keys for the next step.

You also need to generate a secret

There is still one step (at the time of writing) that is a bit of a pain: before being able to use this feature, you need to generate a secret, and the way of generating it is not really friendly.

The easiest way is to download and install the free Azure Storage Explorer.

Once installed, run the Azure Storage Explorer and click the Add Account button. A dialog to create a connection to your storage will open, as shown in figure 3. Enter the same storage name you created in the portal and paste one of the keys you copied in the previous step into the account key field.

Figure 3: Connecting to your storage

Still in the Azure Storage Explorer, once connected to your storage, click the New button under Container and name the container as shown in figure 4. A storage is much like a disk, while a container is much like a folder.

Figure 4: Creating a container in the Storage

Now that the container is created, select it from the list of containers, click the Security button and open the Shared Access Policies tab. Enter the values as shown in figure 5, click the Save Policies button and then the Close button (bottom-left).

Figure 5: creating policies for the container

We are not done yet!

Still with your container selected, click the Security button again, but this time open the Shared Access Signature tab. Select the policy name you just created from the combo as shown in figure 6, click the Generate Signature button and then the Copy to Clipboard button.

Figure 6: Generating the signature (the secret!)

We are done with this tool. You can now close it.

You can get more details from http://blogs.msdn.com/b/igorpag/archive/2013/10/23/create-a-sql-server-2014-database-directly-on-azure-blob-storage-with-sqlxi.aspx.

Create a credential in SSMS

We now have everything required to go back to our MS SQL Server Management Studio (SSMS) – don’t forget that 2014 is required here – to create a credential.

From within SSMS, create a new query and paste the signature we generated in the previous step before you lose it.

Now, in the same query, paste this snippet:

USE master
GO
CREATE CREDENTIAL [https://YourStorageName.blob.core.windows.net/YourContainerName]
   WITH IDENTITY='Shared Access Signature', -- this is a mandatory string
   SECRET = 'sr=c&si=policy&sig=YourOwnSecret'
GO  

Because you pasted your signature at the top of this query, it is easy to fill in the personal parts. Everything to the left of the ? in the signature goes between the square brackets of the CREATE CREDENTIAL statement. Everything to the right of the ? goes into the SECRET parameter. You can now execute this statement to create the credential.
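
For example, with the storage (articlestorage) and container (datafiles) used later in this article, and a made-up sig value, the pasted signature would be split like this:

-- Signature copied from Azure Storage Explorer (the sig value here is a placeholder):
-- https://articlestorage.blob.core.windows.net/datafiles?sr=c&si=policy&sig=AbC123
USE master
GO
CREATE CREDENTIAL [https://articlestorage.blob.core.windows.net/datafiles] -- left of the ?
   WITH IDENTITY='Shared Access Signature',
   SECRET = 'sr=c&si=policy&sig=AbC123' -- right of the ?
GO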

You can check that the credential was properly created by looking under Security in SSMS as shown in figure 7 or by executing this query:

SELECT * FROM sys.credentials

Figure 7: Exploring the credentials

Finally ready to create a database

After all these steps, we are now ready to create a database with the files located in our storage in Azure.

It cannot be done through the UI yet. You need to write a simple CREATE DATABASE statement like this one:

CREATE DATABASE [TestAzureDF]
ON 
( NAME = foo_dat, FILENAME = 'https://articlestorage.blob.core.windows.net/datafiles/azuredf.mdf')
LOG ON 
( NAME = foo_log, FILENAME = 'https://articlestorage.blob.core.windows.net/datafiles/azuredf.ldf')
GO

In this query, you need to replace at least articlestorage with the name of your storage and datafiles with the name of your container. The other names (database, file names, …) can be kept as-is or freely modified to fit your needs.

After you have executed this query (which may take a good 30 seconds), you have a database recognized by your local server but with its files in the cloud.
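
If you want to confirm where the files really ended up, a quick look at the sys.master_files catalog view shows URLs instead of local paths:

USE master
GO
-- physical_name shows the blob URLs instead of local paths
SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('TestAzureDF')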

From now on, as long as you are connected to the Internet, you can use this database without even knowing that the data is miles away from you!

To complete our demo, we need the same table as we had for our other database. This script will create exactly the same table (but stored in Azure) and even insert some data:

USE [TestAzureDF]
GO
CREATE TABLE [dbo].[Clients](
	[ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
	[LastName] [varchar](50) NULL,
	[FirstName] [varchar](50) NULL
)
GO
INSERT INTO dbo.Clients ( LastName, FirstName )
VALUES  
  ( 'Dalton-DF', 'Joe' )
, ( 'Dalton-DF', 'Jack' )
, ( 'Dalton-DF', 'William' )
, ( 'Dalton-DF', 'Averell' )
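
You can then query the new table exactly as you would query any local table:

USE [TestAzureDF]
GO
-- Returns the 4 rows inserted above even though the files live in Azure
SELECT ID, LastName, FirstName FROM dbo.Clients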

What will the connection string look like?

To close the loop, we need to go to our application to add a button to use this new data.

You might wonder what the connection string will look like. The data is in the cloud, but the engine that queries the data is our local server.

From our application's point of view, the data is coming from our own server. The application doesn't know that the data is elsewhere. So our connection string is exactly the same as for a local server:

Return "Data Source=(local);Integrated Security=SSPI;Initial Catalog=TestAzureDF;"

If you did everything correctly, your application will work.

Some limitations

This technique might sound like magic. In some ways, it is. But it has some limitations.

Latency can be an issue the first time you access the data. SQL Server is able to cache some data, but the first hit can be rough with a large database.

You might now think that you could create data files in a storage and have multiple SQL Server engines connected to them. You cannot. Only one server at a time can have the files open (totally understandable, because all the transactions and locking are handled by the engine).

You might also be tempted to put your data files (.mdf) on your local drive and the log files (.ldf) in the cloud. This can't be done either. Both files need to be in the same location.

Moving an existing database

If you have an existing database that you would like to move to Azure storage, you can copy your files to your container and use sp_attach_db (or the FOR ATTACH clause of the CREATE DATABASE statement) to attach the files (you will need the very same credential to be able to do it).
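
As a sketch (assuming the azuredf files from the earlier example were copied to that container, and that the credential already exists on the server doing the attach), the FOR ATTACH variant would look like this:

-- Assumes the credential for this container was already created on this server
CREATE DATABASE [TestAzureDF]
ON 
( FILENAME = 'https://articlestorage.blob.core.windows.net/datafiles/azuredf.mdf'),
( FILENAME = 'https://articlestorage.blob.core.windows.net/datafiles/azuredf.ldf')
FOR ATTACH
GO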

Not everything is free

If you liked what you just tested, you need to know that this is not a free service if you decide to adopt it.

If you decide to go with SQL Databases, check the pricing at http://azure.microsoft.com/en-us/pricing/details/sql-database/.

If you decide to go with Data Files in Azure, you need to check the storage pricing at http://azure.microsoft.com/en-us/pricing/details/storage/.

Benefits of Azure

You are probably already aware of the benefits of Azure, so I won't list them all here. Considering only the 2 features explored in this two-part article, there are great advantages.

I know, Azure is not free, but depending on your needs, it can be a cheap, very cheap, option. When you consider the price, you also need to consider other benefits like:

  • Near-bottomless storage
  • Offsite
  • Geo-redundant
  • Remote accessibility
  • No device management
  • Less hardware to manage/update/upgrade

Storing databases in Azure is surely not suited to all applications, but it is a good contender for many of them.

There are also some facts to consider like:

  • No Internet = no data
  • Latency
  • Not all features are available (varies according to the mechanism you are using)

Other options

In these 2 articles, I have explored only 2 ways of having a database stored in Azure. Azure also offers a Virtual Machine (VM) feature. VMs hosted in Azure can have SQL Server installed on them, and your application can connect to it just like to any other server.

And many times, you don't need a database but only a file to store/persist data. Azure Storage would be a good option for that.

There are also other providers (like Google and Amazon) offering ways of storing data.

Conclusion

If you need or want a database stored in Azure, I have demonstrated 2 ways of achieving it over the last 2 articles. The first method (last month) was using Azure SQL Databases. This month, I have demonstrated a feature of SQL Server 2014 called Microsoft SQL Server Data Files in Azure.

