Thursday, 19 March 2009

Changes to SDS Announced

Well, it is a bit of old news at this point:  the SDS team has broken the silence and announced the change from SOAP/REST to a full relational model over TDS.  First, I want to say I am super-excited about this change (and I really mean 'super', not in the usual Microsoft sense).  I believe that this will be a net-positive change for the majority of our customers.  I can't tell you how many times I have heard customers say, "Well, SDS is great, but I have this application over here that I want to move to the cloud and I don't want to re-architect it to the ACE model", or, "My DBAs understand SQL - why can't you just give me SQL?".  We listen, really.  The feedback was loud and clear.

It may be a tiny bit contentious that this change comes with the removal of the SOAP and REST interfaces.  However, if you really love that flex model, you can get it in Windows Azure tables.

I know I have been pinged a few times asking if the announcement is the reason for my change from SDS to Windows Azure.  The honest answer is: not really.  The real reason for my change is that my group had a small re-org and the former Windows Azure tech evangelist moved up and took on higher-level responsibilities (he is now my manager).  That, combined with the changes to SDS, made it a natural transition point.  Zach Owens has now moved into my group and is looking after SDS - as the former SQL Server evangelist, it makes perfect sense for Zach to take this role now that SDS is SQL Server in the cloud.

I would expect to see close collaboration between Windows Azure and SDS as this is the killer combination for so many applications.  If you want to know more about the changes and specific details, I would try to catch Nigel Ellis' talk at MIX09 this year or watch it online afterwards.  I will update this post with a specific link once Nigel gives his talk.

Updated:  Nigel's talk is here.

Thursday, 22 January 2009

Azure Issue Tracker Released

I am happy to announce the immediate release of the Azure Issue Tracker sample application.  I know, I know - the name is wildly creative.  This sample application is a simple issue-tracking service and website that pulls together a couple of the Azure services:  SQL Data Services and the .NET Access Control Service.

This demo is meant to show a realistic SaaS scenario.  As such, it features federation, claims-based authorization, and scalable data storage.

In this post, I will briefly walk through the application such that you can see how it is meant to work and how you might implement something similar.  Over the coming weeks, I will dig into features more deeply to demonstrate how they work.  I would also encourage you to follow Eugenio, who will be speaking more about this particular demo and the 'Enterprise' version over the coming weeks.

Let's get started:

  1. Make sure you have a .NET Services account (register for one here)
  2. Download and extract the 'Standard' version from the Codeplex project page
  3. Run the StartMe.bat file.  This file bootstraps the provisioning process necessary to get the .NET Services solution configured, as well as things like websites, certificates, and SDS storage.
  4. Open the 'Readme.txt' from the setup directory and follow along for the other bits

Once you have the sample installed and configured, navigate to your IssueTracker.Web site and you will see something like this:

[Screenshot: the Issue Tracker home page]

Click 'Join Now' and then select the 'Standard' edition.  You will be taken to the 'Create Standard Tenant' page.  This is how we register our initial tenant and get them provisioned in the system.  The Windows LiveID and company name entered on this page are what will be used for provisioning (the other information is not used right now).

[Screenshot: the 'Create Standard Tenant' page]

Once you click 'Save', the provisioning service will be called and rules will be inserted into the Access Control service.  You can view the rules in the .NET Services portal by opening the Access Control Service (use the Advanced View) and selecting the 'http://localhost/IssueTracker.Web' scope.

[Screenshot: the provisioned rules in the Access Control Service portal]

We make heavy use of the forward-chaining aspect of the rule transformation here.  Notice that Admin will be assigned the roles of both Reader and Contributor.  Those roles have many more operations assigned to them.  The net effect is that an Admin ends up with all of those operations once forward chaining is invoked.
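
To make the chaining concrete, here is a minimal sketch of forward chaining over role rules - illustrative only, since the real evaluation happens inside the Access Control service and these rule names are just examples:

using System;
using System.Collections.Generic;
using System.Linq;

static class ForwardChainingDemo
{
    // Each rule maps an input claim to an output claim (e.g. "Admin" -> "Reader").
    // Forward chaining keeps re-applying the rules until no new claims appear.
    public static HashSet<string> Expand(IEnumerable<string> inputClaims,
                                         ILookup<string, string> rules)
    {
        var claims = new HashSet<string>(inputClaims);
        bool changed = true;
        while (changed)
        {
            changed = false;
            foreach (var claim in claims.ToList())
                foreach (var derived in rules[claim])
                    changed |= claims.Add(derived);
        }
        return claims;
    }

    static void Main()
    {
        var rules = new[]
        {
            new { In = "Admin", Out = "Reader" },
            new { In = "Admin", Out = "Contributor" },
            new { In = "Reader", Out = "ReadIssue" },
            new { In = "Contributor", Out = "EditIssue" },
        }.ToLookup(r => r.In, r => r.Out);

        // Starting from just "Admin", chaining yields Reader, Contributor,
        // ReadIssue, and EditIssue as well.
        Console.WriteLine(string.Join(", ", Expand(new[] { "Admin" }, rules).ToArray()));
    }
}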

Notice as well that we have 3 new rules created in the Access Control service (ACS).  We have a claim mapping that sets the Windows LiveID to the Admin role output claim, another that sets an email mapping claim, and finally one that sets a tenant mapping claim.  Since we don't have many input claims to work with (only the LiveID, really), there is not too much we can do here in the Standard edition.  This is where the Enterprise edition, which can get claims from AD or any other identity provider, offers a much richer experience.

Once you have created the tenant and provisioned the rules, you will be able to navigate to a specific URL for this tenant:  http://localhost/IssueTracker.Web/{tenant}

You will be prompted to log in with your LiveID, and once you are authenticated with Windows LiveID, you will be redirected back to your project page.  Under the covers, we federated with LiveID, got a token from them, sent the token to ACS, transformed the claims, and sent back a token containing the claims (all signed by the ACS).  I will speak more about this later, but for now, we will stick to the visible effects.

[Screenshot: the tenant's project page]

From the Project page, you should click 'New Project' and create a new project.

  • Give it a name and invite another Windows LiveID user (one you can log in as later).  Make sure you invite a user with a different Windows LiveID than the one you used to initially provision the tenant (i.e. the one you are logged in as currently).
  • Choose the 'Reader' role for the invited user and click the Invite button.
  • Add any custom fields you feel like using.  We will likely expand this functionality later to give you more choices for types and UI representation.
  • Click Save (make sure you added at least one user to invite!)

Once you click Save, additional rules are provisioned out to the ACS to handle the invited users.  From the newly created project, create an issue.  Notice that any additional fields you specified are present now in the UI for you to use for your 'custom' issue.

[Screenshot: the new issue form, including the custom fields]

Once you have saved a new issue, go ahead and try the edit functionality for the Issue.  Add some comments, move it through the workflow by changing the status, etc.

Next, open a new instance of IE (not a new tab in the same IE window) and browse to the tenant home page (i.e. http://localhost/IssueTracker.Web/{tenant}).  This time, however, log in as the invited Windows LiveID user.  This user was assigned the 'Reader' role.  Notice that the new browser window can read the projects for this tenant and can read any issue, but it cannot use the Edit functionality.

[Screenshot: the 'Not Authorized' page shown to the Reader]

Now, I will make two comments about this.  First, so what?  We are checking claims here on the website and showing you a 'Not Authorized' screen.  While we could have simply hidden the 'Edit' functionality in the UI, we left it visible intentionally in the demo so you can see how the claim checking works.  In this case, we are checking claims at the website using an MVC route filter (called ClaimAuthorizationRouteFilterAttribute).
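
For reference, the basic shape of such a filter looks something like the sketch below.  This is a hypothetical reconstruction, not the sample's actual code - the claim-type URIs and the Geneva ("Zermatt") identity API usage are assumptions on my part:

using System;
using System.Linq;
using System.Web.Mvc;
using Microsoft.IdentityModel.Claims;  // Geneva framework (assumption)

[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
public class RequireClaimAttribute : ActionFilterAttribute
{
    private readonly string claimType;
    private readonly string claimValue;

    public RequireClaimAttribute(string claimType, string claimValue)
    {
        this.claimType = claimType;
        this.claimValue = claimValue;
    }

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        // Look for the required claim among the claims ACS handed back in the token.
        var principal = filterContext.HttpContext.User as IClaimsPrincipal;
        bool authorized = principal != null && principal.Identities
            .SelectMany(identity => identity.Claims)
            .Any(c => c.ClaimType == claimType && c.Value == claimValue);

        if (!authorized)
        {
            // Mirror the demo's behavior: render a 'Not Authorized' view rather
            // than silently hiding the Edit UI.
            filterContext.Result = new ViewResult { ViewName = "NotAuthorized" };
        }
    }
}

An Edit action would then be decorated with something like [RequireClaim("http://issuetracker/claims/operation", "EditIssue")] - again, a made-up claim URI for illustration.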

One key point of this demo is that the website is just a client to the actual Issue Tracker service.  What if we tried to hit the service directly?

Let's check.  I am going to intentionally neuter the website's claims-checking capabilities:

[Screenshot: the website's claim checks commented out]

By commenting out the required claims that the Reader won't have, I can get past the website's checks.  Here is what I get:

[Screenshot: the error returned when hitting the service directly]

The Issue Tracker service is checking claims too - whoops.  There is no way to trick this one from a client.  So, what are the implications of this design?  I will let you noodle on that one for a bit.  You should also be asking yourself: how did we get those claims from the client (the website) to the service?  That is actually not entirely obvious, so I will save it for another post.  Enjoy for now!

Wednesday, 17 December 2008

SQL Down Under Podcast

I recently had the opportunity to sit down (virtually) with Greg Low of SQL Down Under fame and record a podcast with him.  I have to thank Greg for inviting me to ramble on about cloud services, data services like SQL Data Services, and Microsoft's Azure services in particular.  It is about an hour's worth of content, but it seemed a lot faster than that to me.  I speak at a relatively high level about what we have done with SQL Services and Azure services and how to think about cloud services in general.  There are some interesting challenges with cloud services - both in the sense of what challenges they solve and the new challenges they introduce.

I am show 42, linked from the Previous Shows page.  Thanks Greg!

Friday, 14 November 2008

Fixing the SDS HOL from Azure Training Kit

If you downloaded the Azure Services Training Kit (which you should), you may have found a compilation error in some of the SQL Data Services HOLs.

[Screenshot: the compilation error - missing AjaxControlToolkit reference]

The error is somewhat self-explanatory:  the solution is missing the AjaxControlToolkit.  The reason this file is missing is not that we forgot it, but rather that our automated packaging tool was trying to be helpful.  You see, we have a tool that cleans up the solutions by deleting the 'bin' and 'obj' folders and any .pdb files before packaging.  In this case, it killed the bin directory where AjaxControlToolkit.dll was deployed.  The cleanup logic is roughly of the shape sketched below.
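
The real packaging tool is internal, so treat this as an illustrative sketch only:

using System;
using System.IO;

static class SolutionCleaner
{
    // Delete 'bin' and 'obj' folders plus any .pdb files under the solution root.
    public static void Clean(string solutionRoot)
    {
        foreach (var dir in Directory.GetDirectories(solutionRoot, "*", SearchOption.AllDirectories))
        {
            var name = Path.GetFileName(dir);
            bool isBuildOutput = name.Equals("bin", StringComparison.OrdinalIgnoreCase)
                              || name.Equals("obj", StringComparison.OrdinalIgnoreCase);

            // A parent 'bin' may already have been removed, so check existence first.
            if (isBuildOutput && Directory.Exists(dir))
                Directory.Delete(dir, true);  // true = recursive
        }

        foreach (var pdb in Directory.GetFiles(solutionRoot, "*.pdb", SearchOption.AllDirectories))
            File.Delete(pdb);
    }
}

You can see how a dependency like AjaxControlToolkit.dll that lives only in 'bin' gets swept away along with the build output.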

To fix this error, you just need to visit the AjaxControlToolkit project on CodePlex and download it again.  The easiest way is to download AjaxControlToolkit-Framework3.5Sp1-DllOnly.zip, extract the 'Bin' directory, and copy it into the root of the solution you are trying to use.

Sorry about that - we will fix it for our next release.

(updated: added link)

Thursday, 13 November 2008

Azure Services Training Kit – PDC Preview

The Azure Services Training Kit contains the hands-on labs used at PDC, along with presentations and demos.  We will continue to update this training kit with more demos, samples, and labs as they are built out.  This kit is a great way to try out the technologies in the Azure Services Platform at your own pace through the hands-on labs.

[Screenshot: the training kit's browser-based navigation UI]

You will need a token in order to run a few of the labs (specifically the .NET Services labs, the SQL Data Services labs, and one of the Live Services labs).  The Windows Azure labs use the local dev fabric so no token is necessary.  Once you install the kit, it will launch the browser with a navigation UI to find all the content within.  If you need an account, simply click the large blue box labeled 'Try it now' and follow the links to register at Microsoft Connect.

Happy Cloud Services.

Tuesday, 11 November 2008

Refreshed REST library for SQL Data Services

I finally got around to doing a quick refresh of the SSDS REST Library.  It should now be called the SDS REST library, of course, but I doubt I will change the name, as that would break the URL on Code Gallery.

I am calling this one a 'Refresh' release because I am not adding any features.  The purpose of this release was to fix the serialization such that it runs in partial trust.  Partial trust support is desirable because it means you can use this library in Windows Azure projects.

I found out an interesting fact about the XmlSerializer while working on this.  Serializing a generic type, in this case SsdsEntity&lt;T&gt;, works just fine in partial trust.  However, deserializing that exact same type will not work without full trust.  To fix it, I had to remove any and all code that tried to do so.  Instead, I deserialize the T in SsdsEntity&lt;T&gt; and manually create the SsdsEntity part.  You can see those updates in the SsdsEntitySerializer class, as well as in the setter of the SsdsEntity&lt;T&gt; Attributes property.  I don't see any problems with my solution, and in fact, it may end up being more efficient.
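
The pattern, shown here with stand-in types (the real code lives in SsdsEntitySerializer and uses the actual SsdsEntity&lt;T&gt; members), looks roughly like this:

using System;
using System.IO;
using System.Xml.Serialization;

public class Wrapper<T>   // stand-in for SsdsEntity<T>
{
    public string Id { get; set; }
    public T Payload { get; set; }
}

public static class PartialTrustSerializer
{
    // Deserializing Wrapper<T> itself demands full trust in this scenario;
    // deserializing just T does not, so we rebuild the wrapper by hand.
    public static Wrapper<T> Deserialize<T>(string id, string payloadXml)
    {
        var serializer = new XmlSerializer(typeof(T));
        using (var reader = new StringReader(payloadXml))
        {
            return new Wrapper<T>
            {
                Id = id,
                Payload = (T)serializer.Deserialize(reader)
            };
        }
    }
}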

Remaining work to do:  implement the JOIN, OrderBy, and TOP operations (if I find the time).

Get it here:  SDS REST Library

Tuesday, 04 November 2008

NeoGeo on Building with SDS

During PDC, I was lucky enough to film a short video with Marc Hoeppner, one of the Regional Directors from Germany and the Managing Director of NeoGeo New Media GmbH.  Marc was involved very early with SQL Data Services and has provided some valuable feedback to us on features and direction.

I managed to get Marc to show me his company's media asset management product, neoMediaCenter.NET.  What struck me during this interview was how his team took a hybrid approach to cloud services.  That is, their original product uses SQL Server on the backend for storage and querying.  Instead of forcing customers to make an either/or decision, they took the approach of offering both.  You can move your data seamlessly between the cloud and the on-premises database.

There are some real advantages to using SQL Data Services for this product: namely, with a click, you can move data to the cloud where it can essentially be archived forever, yet still be available for consumption.  We like to term this 'cold storage'.  Imagine the model where you have thousands and thousands of digital assets.  For the assets that are temporally relevant, you can store them in the local on-premises database for the lowest latency.  However, as the data ages, it tends to be used less and less frequently.  Today, companies either invest in a bunch of new storage, archive the content off to tape, or just delete it once it reaches a certain age.  Instead of forcing customers to make one of those choices, Marc has added the capability to move this data out of the on-premises store and into the cloud seamlessly.  It still appears in the application, but is served from the cloud.  This makes accessing the data simple (unlike tape or deletion) as well as relatively inexpensive (unlike buying more disk space yourself).
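
Boiled down, the tiering decision is something like this toy sketch (hypothetical types throughout - neoMediaCenter.NET's actual implementation is not public):

using System;

public enum StorageTier { OnPremises, Cloud }

public static class TieringPolicy
{
    // Recently used assets stay on-premises for the lowest latency; once an
    // asset goes cold, it moves out to SDS but remains consumable by the app.
    public static StorageTier ChooseTier(DateTime lastAccessedUtc, TimeSpan coldThreshold)
    {
        return (DateTime.UtcNow - lastAccessedUtc) > coldThreshold
            ? StorageTier.Cloud
            : StorageTier.OnPremises;
    }
}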

Once we have multiple datacenters up and operational, you also get the geo-location aspect of this for free.  It may be the case that, for certain sets of distributed customers, accessing the geo-located data is actually faster than accessing the data on-premises.

This is a very cool demo.  If you watch towards the end, Marc shows a CIFS provider for SDS that allows you to mount SDS just like a mapped network drive.  Marc mentions it in the video, but he managed to build all this functionality in just a week!  It is also interesting to note that Marc's team made use of the SSDS REST library, which provides the LINQ and strongly-typed abstraction for querying and working with SDS (the library predates the SDS rename, hence still SSDS).  I am happy to see that, of course, since I had a bit to do with that library. :)

Watch it here

Sunday, 02 November 2008

Using SDS with Azure Access Control Service

It might not be entirely obvious to some folks how the integration between SQL Data Services and the Azure Access Control Service works.  I thought I would walk you through a simple example.

First, let me set the stage:  at this point, the integration between the services is a bit nascent.  We are working towards a more fully featured authorization model with SDS and Access Control, but it is not there today.

Authentication

I will group authentication today into two forms:  the basic authentication used by SDS directly, and the authentication used by Access Control.  While the two may look similar in the username and password case, they are, in fact, not the same.  The idea is that direct authentication to SDS using basic authentication (username/pwd) will eventually go away; only authentication via Access Control will survive going forward.  For most folks, this is not a super big change to the application.  While we don't have the REST story baked yet in the current CTP, we have support today in SOAP to show you how this looks.

Preparing your Azure .NET Services Solution

In order to use any of these methods, you must of course have provisioned an account for the CTP of Azure Services Platform and .NET Services in particular.  To do this, you must register at http://www.azure.com and work through Microsoft Connect to get an invitation code.  PDC attendees likely already have this code if they registered on Connect (using the LiveID they registered for PDC with).  Other folks should still register, but won't get the code as fast as PDC attendees.  Once you have the invitation code and have created and provisioned a solution, you need to click the Solution Credentials link and associate a personal card for CardSpace or a certificate for certificate authentication.

 

[Screenshots: associating a personal card and a certificate on the Solution Credentials page]

Once you have credentials associated with your Azure Services Platform solution, you can prepare your code to use them.

Adding the Service Reference

Here is how you add a service reference to your project to get the SOAP proxy and endpoints necessary to use Access Control.  First, right-click your project and choose Add Service Reference.

[Screenshot: the Add Service Reference dialog]

In the address, use https://database.windows.net/soap/v1/.  Note the trailing '/' in the URL.  Also, notice it is not using 'data.database.windows.net', but just 'database.windows.net'.  Next, name the proxy - I called mine 'SdsProxy'.

Click the Advanced button and choose System.Collections.Generic.List from the collection type dropdown list.

[Screenshot: the Advanced settings with the collection type set to List]

Once you have clicked OK a few times, you will get an app.config in your project that contains a number of bindings and endpoints.  Take a moment to see the new endpoints:

[Screenshot: the bindings and endpoints added to app.config]

There are 3 bindings right now: basicHttpBinding (used for basic authentication directly against SDS), as well as customBinding and wsHttpBinding, which are used with the Access Control service.  There are also 4 endpoints added for the client:

  1. BasicAuthEndpoint - used for basic authentication directly against SDS.
  2. UsernameTokenEndpoint - used for authentication against Access Control (it happens to use the same username and password as #1, however).
  3. CertificateTokenEndpoint - used for authentication against Access Control via a certificate.
  4. CardSpaceTokenEndpoint - used for authentication against Access Control via CardSpace.

Use the Access Control Service

At this point, you just need to actually use the service in code.  Here is a simple example of how to do it.  I am going to write a simple client that does nothing but query my authority, and I will do it with all three supported Access Control authentication methods (2-4 above).

Solution and Password

To use the username/password combination you simply do it exactly like the basic authentication you are used to, but use the 'UsernameTokenEndpoint' for the SOAP proxy.  It looks like this:

var authority = "yourauthority";

//Username/Pwd
var proxy = new SitkaSoapServiceClient("UsernameTokenEndpoint");
proxy.ClientCredentials.UserName.UserName = "solutionname";
proxy.ClientCredentials.UserName.Password = "solutionpassword";

var scope = new Scope() { AuthorityId = authority };

//return first 500 containers
var results = proxy.Query(scope, "from e in entities select e");

Console.WriteLine("Containers via Username/Password:");
Console.WriteLine("================");
foreach (var item in results)
{
    Console.WriteLine(item.Id);
}
Console.WriteLine("================");
proxy.Close();

 

CardSpace

CardSpace has two tricks to get it working once you set the proxy to 'CardSpaceTokenEndpoint'.  First, you must call the DisplayInitializationUI method on the proxy to trigger the CardSpace prompt.  Next, you must explicitly open the proxy by calling Open.  It looks like this:

//CardSpace
//create a new one for CardSpace
proxy = new SitkaSoapServiceClient("CardSpaceTokenEndpoint");
proxy.DisplayInitializationUI(); //trigger the cardspace login


//need to explicitly open for CardSpace
proxy.Open();

//return first 500 containers
results = proxy.Query(scope, "from e in entities select e");

Console.WriteLine("Containers via CardSpace:");
Console.WriteLine("================");
foreach (var item in results)
{
    Console.WriteLine(item.Id);
}
Console.WriteLine("================");

proxy.Close();

 

Certificates

First, create a certificate (with a private key) and install it somewhere on your machine - I used the Personal (or 'My') container in the local machine store.  I also put the self-generated certificate's public key in the Trusted People store on the local machine so that it validates.  (If you need a test certificate, the makecert tool can generate a self-signed one; something like "makecert -r -pe -n CN=localhost -ss My -sr LocalMachine" should do, though double-check the flags for your environment.)

//Certificates
//create a new one for Certificates
proxy = new SitkaSoapServiceClient("CertificateTokenEndpoint");

//can also set in config
proxy.ClientCredentials.ClientCertificate.SetCertificate(
    "CN=localhost",
    StoreLocation.LocalMachine,
    StoreName.My
    );

//return first 500 containers
results = proxy.Query(scope, "from e in entities select e");

Console.WriteLine("Containers via Certificates:");
Console.WriteLine("================");
foreach (var item in results)
{
    Console.WriteLine(item.Id);
}
Console.WriteLine("================");

proxy.Close();

 

The code is very similar, but I am explicitly setting the certificate here in the proxy credentials.  You can also configure the certificate through config.

And there you have it:  using the Azure Access Control service with SQL Data Services.  As I mentioned earlier, while the integration is nascent today, you can expect that going forward the Access Control Service will be used to expose a very rich authorization model in conjunction with SDS.

Download the VS2008 Sample Project

Wednesday, 29 October 2008

Azure Services Platform Management Console

In case anyone was looking for a feature-rich management tool for their Azure Services (.NET Services and SQL Services), we have posted a version of a management console out to Code Gallery.  It is amazingly cool stuff, and I am proud to say I had a little input into the SDS pieces.  It manages your cloud-based workflows, identity and access control rules, and SQL Data Services data.

[Screenshot: the Azure Services Platform Management Console]

In later iterations, I can imagine we will beef up the editors and keep the tools in sync with the live services as they evolve.  However, for pure utility - this thing is better than the Azure portal today.

Download it here

Monday, 27 October 2008

Ruby on SQL Data Services SDK

A few months back, I embarked on a mission to get these Ruby samples produced, as I felt it was important to show the flexibility and open nature of SDS.  The problem was that I have no practical experience with Ruby.  With this in mind, I looked to my friend, former co-worker, and Ruby pro, James Avery, to get this done.  He did a terrific job as the developer, and I am happy to present this today.

With the announcement today at PDC of the Azure Services Platform, we are releasing a set of Ruby samples for SQL Data Services (SDS), formerly called SSDS.  We are putting the source on GitHub, and the samples will be available as gems from RubyForge.

The samples really consist of a number of moving parts:

[Diagram: the moving parts of the Ruby SDS samples]

At the core of the samples is a Ruby REST library for SDS.  It performs the main plumbing for using the service.  Next, we have two providers for building applications: an ActiveRecord provider and an ActiveResource provider.  These providers make use of the Ruby REST library for SDS.

Finally, we have two samples that make use of the providers.  We have built a version of RadiantCMS that uses the ActiveRecord provider and a simple task list sample that shows how to use the ActiveResource provider.

To get started:

  1. Visit GitHub and download sds-tasks or sds-radiant.
  2. View the Readme file in the sample, as it will direct you to download Rails and some of the other prerequisites.
  3. Use the gem installers to download the sds-rest library.
  4. Set some configuration with your SDS username/password (now called Azure Solution name and password when you provision).
  5. That's it!  Just run the Ruby script server and see how REST and Ruby work with SDS.

NOTE:  PDC folks can get provisioned for SQL Services and .NET Services by visiting the labs and getting a provisioning code.  This is exclusive to PDC attendees until the beta.

I will look to James to provide a few posts and details on how he built it.  Thanks again James!

What's new in SQL Data Services for Developers?

I dropped in on a couple of the developers on the SQL Data Services team (formerly SSDS) to chat and film a video about some of the new features being released with the PDC build of the service.  Jason Hunter and Jeff Currier are two of the senior developers on the SDS team, and with a few days' warning that I would be stopping by, they managed to put together a couple of cool demos for us.

This is a longer video, with lots of code and deep content - so set aside some time and watch Jason and Jeff walk us through the new relational features as well as blob support.

What's new in SQL Data Services for Developers?