I often get asked this question when I am presenting technical information about Windows Azure or SQL Azure data storage to developer or DBA audiences. Of course the most accurate answer is ‘it depends’ (on bandwidth, latency, location, amount of data, etc.). However, I thought it would be useful to give a very specific answer for a specific case as well.
To that end, today I am working on a project to set up a new Azure account (for upcoming live technical presentations). While doing so, I’ll blog on the ‘how’ and the ‘how long’ so that you can get a specific answer.
Here’s the scenario.
I have multiple local servers and databases, all running SQL Server 2008 R2 SP1 or higher. I have multiple Azure accounts, all based in the US. I will be doing some work overseas soon, so I want to set up an Azure account based in Europe for those talks. To do that, I will be performing the following operations:
1) Create a new Windows Azure account (I’ll create a trial account) for demo purposes – the main link and page to do this are shown below.
2) Connect to the Windows Azure service using the portal (shown below). Set up a new SQL Azure database server in the Northern European region (I live in the US – Southern California).
3) Set up firewall access rule for the SQL Azure instance.
4) Create two new sample databases on the SQL Azure instance.
5) Populate those databases with sample data.
6) Create a Windows Azure storage container (for BLOB storage). Create a public container to hold an exported copy (*.bacpac file) of one of the SQL Azure databases (both schema and data).
7) Test importing the *.bacpac file back into the SQL Azure instance.
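For those who prefer scripts to the portal, steps 3 and 4 above can also be done in T-SQL once you connect to the new server’s master database. Here’s a minimal sketch, assuming placeholder names and IP address (the rule name, address, and database names below are made up; the MAXSIZE/EDITION options are the SQL Azure Web edition settings):

```sql
-- Run against the master database of the new SQL Azure server.
-- Step 3: allow a single client IP through the firewall
-- (replace 203.0.113.10 with your own public IP address).
EXEC sp_set_firewall_rule N'DemoClient', '203.0.113.10', '203.0.113.10';

-- Step 4: create two small demo databases (Web edition, 1 GB cap).
CREATE DATABASE DemoDb1 (MAXSIZE = 1 GB, EDITION = 'web');
CREATE DATABASE DemoDb2 (MAXSIZE = 1 GB, EDITION = 'web');
```

Note that firewall rules can take a minute or so to propagate before your client connection succeeds.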
I used the following tools:
1) Windows Azure portal – http://windows.azure.com – logged in with my Windows Live ID.
2) SQL Azure Migration Wizard – from Codeplex – to create the databases and database objects and to populate them with data (free and 3rd party).
3) Windows Azure Storage – from Codeplex – to create, view and test a BLOB storage container (FREE and 3rd party).
4) Windows Azure portal (again) – to export the contents of one database as BLOB, and also to try importing the BLOB back in as a copy of the database.
Here are the results:
1) I hadn’t set up a new trial account for Windows Azure in a couple of months. Previously it took DAYS to get started. I was pleasantly surprised to see that my new account was ready within 3 minutes. I just had to verify my request by entering a texted numeric code into the browser verification page, log in with my Windows Live ID, and then I could start. Total time: 10 minutes.
2) When I connected to my new account in the Northern European Azure data center using the Windows Azure portal, I saw no difference in latency compared with connecting to a Northern US data center (from Los Angeles) with the same tool. I was able to quickly create a new SQL Azure server, assign firewall rules, and create a new Windows Azure storage account (BLOB storage, to test the import/export functionality now included in SQL Azure). This took me about 5 minutes total.
3) Next I created and populated two sample databases. I wanted to test this two ways. The first was to migrate schema and data from an on-premises SQL Server instance to the cloud. The data transfer and application of scripts took about 4 minutes using the SQL Azure Migration Wizard. Shown below is output from the tool, with the total time to generate the source scripts for both schema (DDL) and data (BCP) and then to apply those scripts to the destination.
After that I wanted to test migrating a database (schema and data) from one Windows Azure data center (US Northwest) to another (Northern Europe). The application of the DDL and BCP scripts took 6 minutes using the Migration Wizard.
So the total time from start to finish to use the tool, generate the scripts for the 2 databases (one local, one already in the cloud), and apply those scripts was about 15 minutes. I could have stopped there (at 30 minutes total from account sign-up to deployment of databases), but I wanted to test out a couple of other features.
4) After that I wanted to test out the newly-added import/export functionality via the Azure portal. This is shown below (highlighted area).
To do this I had to create a named container in my Windows Azure storage account. I used the Windows Azure storage tool to do this. I then tested the new container out first by uploading a small blob file (screen shown below). That took less than 1 minute.
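Under the covers, the GUI storage tool is just making Blob service REST calls. Creating a public container and uploading a small test blob look roughly like the sketch below (the account and container names are made up, and I’ve omitted the Date and SharedKey Authorization headers that every real request also needs):

```
PUT https://myaccount.blob.core.windows.net/mycontainer?restype=container HTTP/1.1
x-ms-version: 2009-09-19
x-ms-blob-public-access: container

PUT https://myaccount.blob.core.windows.net/mycontainer/test.txt HTTP/1.1
x-ms-version: 2009-09-19
x-ms-blob-type: BlockBlob
Content-Length: 11

hello world
```

The `x-ms-blob-public-access: container` header is what makes the container (and its blobs) publicly readable, which is what the export/import feature needs.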
Then I tried out the export database functionality. It took me a couple of tries to get the URL formed correctly. Unfortunately, the ‘status’ reporting tool in the Windows Azure portal didn’t seem to work correctly: it reported failure even after I had entered the URL correctly (screen shown below). The export itself took less than 1 minute.
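In case it saves anyone else a retry or two: the URL the export dialog expects is simply the full blob URL where the *.bacpac file should land, i.e. the storage account host, then the container name, then the target file name. For example (account, container, and file names below are made up):

```
https://myaccount.blob.core.windows.net/mycontainer/DemoDb1.bacpac
```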
The Windows Azure storage tool was really useful in helping me to verify that the ‘database as BLOB’ exported correctly. I really think this tool is great!
5) Last, I tried importing the BLOB back into SQL Azure. In this case the status tool worked, reporting ‘completion’ (shown below). This import task took a whopping 1 minute.
I know that I am working with non-production-sized (i.e. small) databases for this sample; both of my sample databases are in the MB size range. What has been interesting for me is the lack of any difference in latency when working with the Windows Azure US and European data centers FROM THE US. As my ‘speaking tour’ gets underway this week, I’ll continue to test (and to blog) on the actual latency from the various locations I’ll be presenting from.
Also, I am a BIG fan of GUI tools. They are real time savers, and I will keep testing and pushing the limits of Microsoft tools, as well as sharing useful 3rd-party tools that I discover.
I am interested in hearing about your experiences with latency and tools with cloud database work. How’s it been going for you?