AWS RDS SQL Server vs. SQL Azure Smackdown – Importing Data

This is the first in a series of comparisons between Amazon Web Services RDS SQL Server and SQL Azure. It is useful for me to understand exactly which features and tools work with cloud-deployed instances of SQL Server. In this screencast I take a look at common methods to import data, including backup/restore, DACPAC and other tools such as the SQL Azure Migration Wizard (available from CodePlex).
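
For anyone who wants a concrete picture of the first of those methods: between on-premises servers, backup and restore is plain T-SQL; a minimal sketch (database and file names are placeholders) looks like this:

    -- Back up a local database to a file (hypothetical names)
    BACKUP DATABASE AdventureWorks
        TO DISK = N'C:\Backups\AdventureWorks.bak'
        WITH INIT;

    -- Restore the database from that file on another server
    RESTORE DATABASE AdventureWorks
        FROM DISK = N'C:\Backups\AdventureWorks.bak'
        WITH REPLACE;

Whether the cloud-hosted services accept commands like these at all is part of what the screencast tests.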

Do the tools work? How well? Watch the video and find out.

First Look – SQL Server on Amazon Web Services RDS

Of course I had to try it out!  Here’s the announcement from @Werner – and the documentation from the AWS site on the new support for SQL Server in AWS RDS.  There is a free usage tier as follows:

“If you are a new Amazon RDS customer, you can get started with Amazon RDS for SQL Server with a Free Usage Tier, which includes 750 hours per month of Amazon RDS micro instances with SQL Server Express Edition, 20GB of database storage and 10 million I/O requests per month.”

Of particular note is the announcement that, in addition to supporting all editions of SQL Server 2008 R2 (i.e. Express, Developer, Standard and Enterprise), AWS intends to support SQL Server 2012 on RDS later this year as well.

I recorded a short screencast to take you from start to connecting in SSMS with SQL Server on AWS RDS – enjoy!
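
Once connected in SSMS, a quick sanity check to run against the RDS endpoint is plain T-SQL (nothing AWS-specific here):

    -- Confirm which version and edition the cloud endpoint is running
    SELECT @@VERSION                        AS VersionString,
           SERVERPROPERTY('Edition')        AS Edition,
           SERVERPROPERTY('ProductVersion') AS ProductVersion;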

BTW…take a look at my new class 'No SQL for the SQL Server Pro' (May 22 in Anaheim, CA – reduced price too!)

TechEd Africa October 2011

Llewellyn and I are giving three presentations here in Durban.  I’ll link the slides and demo code below.

1) SQL Azure Tools

2) SQL Server 2008 R2 SP1 for Developers

The T-SQL demos from this talk are available here.

3) Test-Driven Development using Visual Studio 2010

Also, you may enjoy the Approval Tests presentation (we use the Approval Tests library during the TDD talk).

How fast is Microsoft’s cloud?

I often get asked this question when I am presenting technical information about Windows Azure or SQL Azure data storage to developer or DBA audiences.  Of course the most accurate answer is ‘it depends’ (on bandwidth, latency, location, amount of data, etc…).  However I thought it would be useful to give a very specific answer for a specific case as well.

To that end, today I am working on a project to set up a new Azure account (for upcoming live technical presentations). While doing so I'll blog on the 'how' and the 'how long' so that you can get a specific answer.

Here’s the scenario.

I have multiple local servers and databases, all running SQL Server 2008 R2 SP1 or higher. I have multiple Azure accounts, all based in the US. I will be doing some work overseas soon, so I want to set up an Azure account based in Europe for those talks. To do that, I will be performing the following operations:

1) Create a new Windows Azure account (I'll create a trial account) for demo purposes – the main link and page to do this are shown below.

[Image: Windows Azure trial sign-up page]

2) Connect to the Windows Azure service using the portal (shown below).  Set up a new SQL Azure database server in the Northern European region (I live in the US – Southern California).

[Image: Windows Azure portal – creating a new SQL Azure database server in the Northern European region]

3) Set up a firewall access rule for the SQL Azure instance (a T-SQL sketch of steps 3 and 4 follows this list).

4) Create two new sample databases on the SQL Azure instance.

5) Populate those databases with sample data.

6) Create a Windows Azure storage account with a public container (for BLOB storage) to hold an exported copy (*.bacpac file) of one of the SQL Azure databases (both schema and data).

7) Test importing the *.bacpac file back into the SQL Azure instance.
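
As noted in step 3, steps 3 and 4 can also be scripted in T-SQL against the master database of the new SQL Azure server (the portal offers the same options through its UI). A minimal sketch, with a placeholder rule name, IP address and database names, is below:

    -- Run against the master database of the new SQL Azure server
    -- Allow connections from my current public IP address (placeholder value)
    EXEC sp_set_firewall_rule N'AllowMyLaptop', '203.0.113.10', '203.0.113.10';
    GO

    -- Create the two sample databases (Web edition, 1 GB size cap)
    CREATE DATABASE DemoSalesDB (EDITION = 'web', MAXSIZE = 1 GB);
    GO
    CREATE DATABASE DemoAnalysisDB (EDITION = 'web', MAXSIZE = 1 GB);
    GO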

I used the following tools:

1) Windows Azure portal – http://windows.azure.com – logged in with my Windows Live ID.

2) SQL Azure Migration Wizard – from CodePlex – to create databases and database objects and to populate those objects with data (FREE and 3rd party).

3) Windows Azure Storage tool – from CodePlex – to create, view and test a BLOB storage container (FREE and 3rd party).

4) Windows Azure portal (again) – to export the contents of one database as a BLOB, and to try importing that BLOB back in as a copy of the database.

Here are the results:

1) I hadn't set up a new trial account for Windows Azure for a couple of months. Previously it took DAYS to get started. I was pleasantly surprised to see that my new account was ready within 3 minutes. I just had to complete verification of my request by entering a texted numeric code into the browser verification page and logging in with my Windows Live ID, and then I could start. Total time: 10 minutes.

2) When I connected to my new account in the Northern European Azure data center using the Windows Azure portal, I saw no difference in latency compared to connecting to a Northern US data center (from Los Angeles) with the same tool. I was able to quickly create a new SQL Azure server, assign firewall rules and create a new Windows Azure storage account (BLOB storage, to test the import/export functionality now included in SQL Azure). This took me about 5 minutes total.

3) Next I created and populated the two sample databases. I wanted to test this two ways. The first way was to migrate schema and data from an on-premises SQL Server instance to the cloud. The data transfer and application of scripts took about 4 minutes using the SQL Azure Migration Wizard. Shown below is output from the tool, with the total time to generate the source scripts for both schema (DDL) and data (BCP) and then to apply those scripts to the destination.

[Image: SQL Azure Migration Wizard output – total time migrating from the local (Southern California) instance]
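
Under the covers, the wizard's scripts boil down to DDL for the destination plus native-format bcp files for the data. A rough sketch of the pattern it automates (the table, server and file names are hypothetical, not actual wizard output):

    -- DDL applied to the SQL Azure destination (SQL Azure requires a clustered index on each table)
    CREATE TABLE dbo.Customers (
        CustomerID   INT           NOT NULL,
        CustomerName NVARCHAR(100) NOT NULL,
        CONSTRAINT PK_Customers PRIMARY KEY CLUSTERED (CustomerID)
    );

    -- The data itself moves with bcp from a command prompt, roughly:
    --   bcp SourceDB.dbo.Customers out Customers.dat -n -S localhost -T
    --   bcp DestDB.dbo.Customers in Customers.dat -n -S myserver.database.windows.net -U admin@myserver -P <password>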

After that I wanted to test migrating a database (schema and data) from one Windows Azure data center (US Northwest) to another Azure data center (Northern Europe). The application of the DDL and BCP scripts took 6 minutes using the wizard.

[Image: SQL Azure Migration Wizard output – scripting from the North American data center]

So the total time to run the tool, generate the scripts for the two databases (one local, one already in the cloud) and apply those scripts was about 15 minutes. I could have stopped there (at 30 minutes total from account sign-up to deployment of databases), but I wanted to test out a couple of other features.

4) After that I wanted to test out the newly-added import/export functionality via the Azure portal.  This is shown below (highlighted area).

[Image: Windows Azure portal with the new Import/Export functionality highlighted]

To do this I had to create a named container in my Windows Azure storage account, which I did with the Windows Azure storage tool. I then tested the new container by uploading a small BLOB file (screen shown below). That took less than 1 minute.

[Image: Test BLOB uploaded via the Windows Azure storage tool]

Then I tried out the database export functionality. It took me a couple of tries to get the URL formed correctly. Unfortunately the 'status' reporting tool in the Windows Azure portal didn't seem to work correctly, reporting failure even after the URL was entered correctly (screen shown below). Once the URL was right, the export itself took less than 1 minute.
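
In case it saves someone else a retry or two: the value the export dialog expects is simply the full address of the target *.bacpac BLOB, in the standard Windows Azure BLOB storage format (the account, container and file names below are placeholders):

    https://<storageaccount>.blob.core.windows.net/<container>/<databasename>.bacpac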

The Windows Azure storage tool was really useful in helping me to verify that the ‘database as BLOB’ exported correctly.  I really think this tool is great!

[Image: Exported database BLOB shown in the Windows Azure storage tool]

5) Last, I tried importing the BLOB back into SQL Azure. In this case the status tool worked, reporting 'completion' (shown below). This import task took a whopping 1 minute.

[Image: Import/Export status reported in the Windows Azure portal]

Conclusions

I know that I am working with non-production-sized (i.e. small) databases for this sample; both of my sample databases are in the MB size range. What has been interesting for me is the lack of difference in latency working with the Windows Azure US and European data centers FROM THE US. As my 'speaking tour' gets underway this week, I'll continue to test (and to blog) on the actual latency from the various locations that I'll be presenting from.

Also, I am a BIG fan of GUI tools. They are real time savers, and I will keep testing and pushing the limits of Microsoft's tools, as well as sharing useful 3rd-party tools that I discover.

I am interested in hearing about your experiences with latency and tools with cloud database work.  How’s it been going for you?

SQL Server and SQL Azure for Developers

Here are the decks from my presentations for TechEd Australia this week. The sessions will be recorded; I'll post the links afterwards – thanks.
http://channel9.msdn.com/Events/TechEd/Australia/Tech-Ed-Australia-2011/COS203/player?w=640&h=480
Slide deck

And here:
http://channel9.msdn.com/Events/TechEd/Australia/Tech-Ed-Australia-2011/DAT302/player?w=640&h=480

Here is the demo code

1) T-SQL – here
2) OData and SQL Azure – here
3) Entity Framework – here