AWS, Azure, Microsoft, SQL Azure

Introducing SQL Azure Premium

Microsoft recently previewed the SQL Azure Premium instance (beta).  This is set up on a per-server basis; that is, a server can host SQL Azure WEB, BUSINESS or PREMIUM (P1 or P2) instances, and a database can be upgraded from one type to another on the same server.  An important note from the Microsoft site: "We do not provide any SLA for SQL Database Premium during preview."

I am interested in your feedback on this offering – please comment below.

Also, to be complete, I am including a link to the Amazon Web Services RDS SLA (RDS includes full SQL Server 2008 or 2012); of particular note is the 'Provisioned IOPS' section of their SLA.

AWS, Azure, Cloud, Microsoft, SQL Azure, SQL Server 2008, SQL Server 2012

AWS RDS SQL Server vs. SQL Azure Smackdown – Importing Data

This is the first in a series of comparisons between Amazon Web Services RDS SQL Server and SQL Azure. It is useful for me to understand exactly which features and tools work with cloud-deployed instances of SQL Server. In this screencast I take a look at common methods to import data. These include backup/restore, DACPAC and other tools such as the SQL Azure Migration Wizard (available from CodePlex).

Do the tools work? How well? Watch the video and find out.

Microsoft, SQL Azure

Updated: SQL Azure web portal–does it work in Chrome?

The short answer would appear to be yes (after an update to the Silverlight installation that did require a browser restart). 

The Azure team had a number of announcements and releases today.  The most interesting of which for Azure in general is the support of Node.js.  Here’s a link to a 4 minute MSDN video demo.

There were also a number of interesting updates to SQL Azure.  These include the following:

1) Maximum database size now up from 50 GB to 150 GB

2) Price cap (for maximum size) at US $499 – previously the cap for 50 GB

3) Federations – support creation in web management portal

4) Updated (Metro-style) management portal
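The new 150 GB cap applies per database. As a hedged sketch, growing an existing database to the new limit comes down to an ALTER DATABASE statement like the one built below; "MyAppDb" is a placeholder name, and the statement itself has to be run against the server's master database.

```python
# A hedged sketch: building the T-SQL that grows an existing SQL Azure
# database to the new 150 GB cap. "MyAppDb" is a placeholder name; the
# generated statement must be run against the server's master database.

def resize_statement(db_name: str, max_size_gb: int) -> str:
    """Return an ALTER DATABASE command setting a new SQL Azure size cap."""
    if max_size_gb > 150:
        raise ValueError("150 GB is the current SQL Azure maximum")
    return (f"ALTER DATABASE [{db_name}] "
            f"MODIFY (EDITION = 'BUSINESS', MAXSIZE = {max_size_gb}GB)")

print(resize_statement("MyAppDb", 150))
```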

Here’s a link to the SQL Azure announcements and a link to the Windows Azure announcements. 

Screen Shots

Here are some screen shots of the updated management portal.

First the ‘Overview’ screen – with clickable (Metro-style) tiles.

[screenshot]

Next the Administration screen – it shows service usage.  Also you can create a new SQL Azure Federation by clicking the highlighted button (upper right).

[screenshot]

Next is a query screen – as shown below, you can now view both actual and estimated execution plans via GUI output in the web portal.  There is, however, still NO IntelliSense when you type T-SQL queries.

[screenshot]

Next is a nice addition – shown below is a dependency diagram.

[screenshot]

Overall, this is a good release, adding some features that I will use.  What do you think, have you tried it out yet? 

Big Data, Data Science, noSQL, SQL Azure

Relational Cloud Storage is 50X More Expensive than NoSQL

In my new (post-Microsoft) career, I am looking forward to building cloud-based data solutions for customers around the world.  To that end, I decided to take a look at the ‘state of cloud data pricing’, so that I could supply some kind of pricing estimates to potential customers. 

You may be interested in what I found.

NoSQL Monthly Storage costs for 100 GB of data

There are two sides to the story.  First, the positive: unstructured, non-relational or NoSQL storage is remarkably cheap.  I took a look across all major vendors and tried to hold 100 GB of storage and a monthly fee constant.  I also used 10 GB up / 100 GB down for usage.  For the vendors I looked at, monthly pricing baselines (some had additional charges for PUT, GET, etc.) ranged from $9 to $24.  Below is a chart summarizing the results.

[screenshot]

Of course the only thing I am really measuring here is pure storage cost.  I did NOT take into account any SLAs around availability (or lack thereof), ease of access via tools or APIs, or actual performance of any of these services.  Also, much to the probable consternation of the vendors, I 'mixed' cloud storage intended for personal use (such as DropBox or Amazon Cloud Drive) with cloud storage intended for business use (such as Microsoft Azure Tables or Amazon SimpleDB).  Also of note is Database.com (from SalesForce.com).  Although I was able to set up a test account, pricing seems to be quotable only in terms of transactions and not in terms of storage space, so I couldn't think of a way to include their offering in this comparison.

Relational Cloud Storage costs

Of course the full RDBMS-in-the-cloud services are a younger market; even so, I was startled at the dramatic difference in pricing.  I am fully aware of the advantages of RDBMS implementations (vs. NoSQL), having 'evangelized' SQL Azure for the past year when I was working as a Microsoft evangelist.  Those advantages include familiar tooling, programming models, transactional support, query-tuning support and built-in high availability, and they are not insignificant.  However, the cost difference for 100 GB / month between cloud-based RDBMS systems and NoSQL solutions is also significant – around 50x GREATER than for the same amount of non-relational storage.  Below is a comparison of the vendors who had pricing information that I could make sense of.

[screenshot]

There were far fewer vendors in this field.  Google has a beta offering hosting mySQL, but hasn't announced pricing that I could find on their site.  You'll note that I 'rolled up' the Amazon offerings for mySQL or Oracle to around $600/month as a baseline.  Actual inputs included many other options; I'll include screen shots at the end of this blog post so that you can see all of the input parameters I used.  In addition to its other offerings, Amazon also offers SQL Server on its EC2 instances.  One particularly difficult aspect of comparing cloud-hosted RDBMS systems is that Amazon's rates vary by storage, input/output, other factors AND by compute size (i.e. small, medium, large, etc.), whereas Microsoft prices by amount stored and in/out only.  Still, Amazon's offerings appear to be around half the price of Microsoft's SQL Azure. Also of note: the current size cap for a SQL Azure database is 50 GB, so the pricing here reflects purchasing two instances for a total of 100 GB of storage between them.

So, taking the high end of RDBMS (Microsoft SQL Azure) and dividing by the high end of NoSQL (Amazon S3) gives $1000 / $24, or about 41X more; on the low end, mySQL on Amazon RDS versus personal Cloud Drive storage on Amazon gives $600 / $9, or about 67X more.  Averaging the two, (41 + 67) / 2 ≈ 54, or roughly 50X greater.
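The back-of-the-envelope arithmetic can be checked in a few lines, using the rounded monthly prices (for 100 GB) read off the vendor calculators:

```python
# Re-deriving the rough cost ratios quoted above, using the rounded
# monthly prices (for 100 GB) read off the vendor calculators.
rdbms_high, nosql_high = 1000, 24   # SQL Azure vs. Amazon S3
rdbms_low, nosql_low = 600, 9       # Amazon RDS mySQL vs. Amazon Cloud Drive

high_ratio = rdbms_high / nosql_high          # about 41x
low_ratio = rdbms_low / nosql_low             # about 67x
average_ratio = (high_ratio + low_ratio) / 2  # about 54x, i.e. roughly 50x

print(round(high_ratio), round(low_ratio), round(average_ratio))
```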

I will also say that I did these comparisons as fairly as I could, however it was quite difficult to compare vendor-to-vendor as the service offerings differ.  To that end, as mentioned, I will include screen shots from the vendor’s own websites at the end of this blog post so that you can see exactly how I did these comparisons.

Conclusions

Clearly, moving data to the cloud has many elements of unpredictability.  Based on my quick survey, it seems obvious that anyone building a cloud data solution would want to consider non-relational storage for the price difference alone. 

Of course, when data is involved there are many other factors – these include SLAs, actual uptime, performance, method of querying, ability to performance-tune (i.e. index), security, backup/restore, etc.  As I move back into production work, I'll use this blog to document my journey into the cloud with data – I'll certainly be investigating these other aspects.

I am also wondering what your experience has been?  Do you have production data in the cloud?  How much has it really cost?

Reference Materials and Screen Shots of Vendor Pricing Calculators

Amazon – http://calculator.s3.amazonaws.com/calc5.html

Amazon_Cloud_Drive

Amazon_EC2_SQLServer

Amazon_RDS_mySQL

Amazon_RDS_Oracle

Amazon_Simple_DB

DropBox – https://www.dropbox.com/plans

DropBox_pricing

GoGrid – http://www.gogrid.com

Google – http://code.google.com/apis/storage/docs/pricingandterms.html

Google_Data

Google_Data_2

Microsoft  – http://www.microsoft.com/windowsazure/pricing/ 
Calculator – http://www.microsoft.com/windowsazure/pricing-calculator/

SQLAzure_pricing

RackSpace  – http://www.rackspace.com/cloud

Rackspace

SalesForce.com – http://devcenter.database.com/page/FAQ#What_is_the_pricing_for_Database.com.3F

Database_com_pricing

Agile, SQL Azure, SQL Server 2008, Technical Conference

TechEd Africa October 2011

Llewellyn and I are giving three presentations here in Durban.  I’ll link the slides and demo code below.

1) SQL Azure Tools

2) SQL Server 2008 R2 SP1 for Developers

Here’s a link to the T-SQL demos from this talk – here.

3) Test-Driven Development using Visual Studio 2010

Also you may enjoy the Approval Tests presentation (we use the Approval Tests library during the TDD talk)

SQL Azure, SQL Server 2008

How fast is Microsoft’s cloud?

I often get asked this question when I am presenting technical information about Windows Azure or SQL Azure data storage to developer or DBA audiences.  Of course the most accurate answer is ‘it depends’ (on bandwidth, latency, location, amount of data, etc…).  However I thought it would be useful to give a very specific answer for a specific case as well.

To that end, today I am working on a project to set up a new Azure account (for upcoming live technical presentations).  While doing so, I'll blog on the 'how' and the 'how long' so that you can get a specific answer.

Here’s the scenario.

I have multiple local servers and databases, all running SQL Server 2008 R2 SP1 or higher.  I have multiple Azure accounts, all based in the US.  I will be doing some work overseas soon, so I want to set up an Azure account based in Europe for those talks.  To do that, I will perform the following operations:

1) Create a new Windows Azure account (I’ll create a trial account) for demo purposes – main link and page to do this is shown below.

[screenshot]

2) Connect to the Windows Azure service using the portal (shown below).  Set up a new SQL Azure database server in the Northern European region (I live in the US – Southern California).

[screenshot]

3) Set up firewall access rule for the SQL Azure instance.

4) Create two new sample databases on the SQL Azure instance.

5) Populate those databases with sample data.

6) Create a Windows Azure storage container (for BLOB storage).  Create a public container to hold an exported copy (*.bacpac file) of one of the SQL Azure databases (both schema and data).

7) Test importing the *.bacpac file back into the SQL Azure instance.
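Once the server exists and a firewall rule is in place (steps 2 and 3), any client tool connects using a standard SQL Azure connection string. A minimal sketch is below; the server name and credentials are placeholders, not real values.

```python
# A minimal sketch of the connection string SQL Azure expects. SQL Azure
# listens on TCP port 1433, requires the "user@servername" login form,
# and encrypted connections are strongly recommended. The server name and
# credentials below are placeholders, not real values.

def sql_azure_connection_string(server: str, database: str,
                                user: str, password: str) -> str:
    return (f"Server=tcp:{server}.database.windows.net,1433;"
            f"Database={database};"
            f"User ID={user}@{server};"
            f"Password={password};"
            f"Encrypt=True;")

print(sql_azure_connection_string("mydemoserver", "SampleDb1",
                                  "demoadmin", "not-a-real-password"))
```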

I used the following tools:

1) Windows Azure portal – http://windows.azure.com – logged in with my Windows Live ID.

2) SQL Azure Migration Wizard – from Codeplex – to create databases, database objects and populate those objects with data (FREE and 3rd party).

3) Windows Azure Storage – from Codeplex – to create, view and test a BLOB storage container (FREE and 3rd party).

4) Windows Azure portal (again) – to export the contents of one database as BLOB, and also to try importing the BLOB back in as a copy of the database.

Here are the results:

1) I hadn't set up a new trial account for Windows Azure in a couple of months.  Previously it took DAYS to get started, so I was pleasantly surprised to see that my new account was ready within 3 minutes.  I just had to complete verification of my request by entering a texted numeric code into the browser verification page and logging in with my Windows Live ID, and then I could start.  Total time: 10 minutes.

2) When I connected to my new account in the Northern European Azure data center using the Windows Azure portal, I saw no difference in latency compared to connecting to a Northern US data center (from Los Angeles) with the same tool.  I was able to quickly create a new SQL Azure server, assign firewall rules and create a new Windows Azure storage account (BLOB storage, to test the import/export functionality now included in SQL Azure).  This took me about 5 minutes total.

3) Next was to create and populate two sample databases.  I wanted to test this two ways.  The first was to migrate schema and data from an on-premises SQL Server instance to the cloud. The data transfer and application of scripts took about 4 minutes using the SQL Azure Migration Wizard.  Shown below is output from the tool, with the total time to generate the source scripts for both schema (DDL) and data (BCP) and then to apply those scripts to the destination. 

TotalTimeFromSoCal

After that I wanted to test migrating a database (schema and data) from one Windows Azure data center (US Northwest) to another (Northern Europe).  The application of the DDL and BCP scripts took 6 minutes using the Migration Wizard.

ScriptingFromCloud_NorthAmerica

So the total time from start to finish to use the tool, generate the scripts for 2 databases (one local and one already in the cloud) and then apply those scripts was about 15 minutes.  I could have stopped there (at 30 minutes total from account sign-up to deployment of databases), but I wanted to test out a couple of other features.
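Under the hood, the Migration Wizard scripts schema as DDL and data as native-format BCP files, then replays both against the destination. As a rough sketch, the data-load half amounts to a bcp.exe command line like the one built below (server, table and credential names are made up, and the exact flags the wizard passes may differ).

```python
# A hedged sketch of the bcp.exe command line that the Migration Wizard's
# data-load step effectively issues against a SQL Azure destination.
# Server, table, file and credential names are illustrative only, and the
# wizard's actual flags may differ.

def bcp_import_command(table: str, data_file: str, server: str,
                       user: str, password: str) -> str:
    # -n: native data format; -S: target server; -U/-P: SQL credentials
    # (SQL Azure requires the user@servername login form)
    return (f'bcp {table} in "{data_file}" -n '
            f"-S {server}.database.windows.net "
            f"-U {user}@{server} -P {password}")

print(bcp_import_command("dbo.Customers", "Customers.dat",
                         "mydemoserver", "demoadmin", "not-a-real-password"))
```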

4) After that I wanted to test out the newly-added import/export functionality via the Azure portal.  This is shown below (highlighted area).

[screenshot]

To do this I had to create a named container in my Windows Azure storage account, which I did with the Windows Azure storage tool.  I then tested the new container by first uploading a small blob file (screen shown below).  That took less than 1 minute.

TestBlob

Then I tried out the export-database functionality.  It took me a couple of tries to get the URL formed correctly.  Unfortunately, the 'status' reporting tool in the Windows Azure portal didn't seem to work correctly, reporting failure even after I had entered the URL correctly (screen shown below).  The export itself, once the URL was right, took less than 1 minute.
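The URL the export dialog wants is the full address of the *.bacpac blob. Blob URLs follow the pattern https://&lt;storage account&gt;.blob.core.windows.net/&lt;container&gt;/&lt;blob&gt;; a small sketch of forming one is below, with placeholder account and container names.

```python
# Forming the *.bacpac blob URL that the import/export feature asks for.
# Blob URLs follow https://<storage account>.blob.core.windows.net/<container>/<blob>.
# The storage account and container names here are placeholders.

def bacpac_url(storage_account: str, container: str, database: str) -> str:
    return (f"https://{storage_account}.blob.core.windows.net/"
            f"{container}/{database}.bacpac")

print(bacpac_url("mystorageacct", "backups", "SampleDb1"))
```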

The Windows Azure storage tool was really useful in helping me to verify that the ‘database as BLOB’ exported correctly.  I really think this tool is great!

LoadingBLOG

5) Last, I tried importing the BLOB back into SQL Azure.  In this case the status tool worked, reporting 'completion' (shown below).  This import task took a whopping 1 minute.

ImportExportStatus

Conclusions

I know that I am working with non-production sized (i.e. small) databases for this sample with both of my sample databases being in the MB size range.  What has been interesting for me is the lack of difference in latency working with Windows Azure US and European data centers FROM THE US.  As my ‘speaking tour’ gets underway this week, I’ll continue to test (and to blog) on the actual latency from the various locations that I’ll be presenting from.

Also, I am a BIG fan of GUI tools.  They are real time savers and I will keep testing, pushing the limits of Microsoft tools as well as sharing useful 3rd party tools that I discover.

I am interested in hearing about your experiences with latency and tools with cloud database work.  How’s it been going for you?