tsJensen

A quest for software excellence...

Prevent Reflection and Disassembly of Silverlight Class Library Code

Over the weekend, I began to think about how to better protect proprietary algorithms and other sensitive code in a Silverlight application, keeping the assembly away from prying, snooping eyes. I decided the best way would be to keep the code in memory and never have it committed to the hard drive. A little research and a little coding and badda bing (er, badda google?).

The solution turns out to be rather simple. You need four projects: the Silverlight app, the web app, the contract (interface) and the implementation Silverlight class libraries. The Silverlight app references the contract library which pulls it into the XAP. The implementation library references the contract library to implement the interface, of course. And the web app does its thing, supplying the XAP file to the browser and most importantly supplying the protected bits via a stream that presumably is protected by SSL, authentication and authorization mechanisms, items I've conveniently left out of this post and the sample code for brevity.

Start with the AppManifest.xaml (in Dinorythm.xap)
Note that the manifest contains only the Silverlight app and the contract class library along with the other assemblies required for the Silverlight Navigation Application (find what you need at Scott Guthrie's informative Silverlight 4 Released blog post).

<Deployment 
    xmlns="http://schemas.microsoft.com/client/2007/deployment" 
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" 
    EntryPointAssembly="Dinorythm" 
    EntryPointType="Dinorythm.App" 
    RuntimeVersion="4.0.50401.0">
  <Deployment.Parts>
    <AssemblyPart x:Name="Dinorythm" Source="Dinorythm.dll" />
    <AssemblyPart x:Name="DinoContracts" Source="DinoContracts.dll" />
    <AssemblyPart x:Name="System.ComponentModel.DataAnnotations" Source="System.ComponentModel.DataAnnotations.dll" />
    <AssemblyPart x:Name="System.ServiceModel.DomainServices.Client" Source="System.ServiceModel.DomainServices.Client.dll" />
    <AssemblyPart x:Name="System.ServiceModel.DomainServices.Client.Web" Source="System.ServiceModel.DomainServices.Client.Web.dll" />
    <AssemblyPart x:Name="System.ServiceModel.Web.Extensions" Source="System.ServiceModel.Web.Extensions.dll" />
    <AssemblyPart x:Name="System.Windows.Controls" Source="System.Windows.Controls.dll" />
    <AssemblyPart x:Name="System.Windows.Controls.Navigation" Source="System.Windows.Controls.Navigation.dll" />
  </Deployment.Parts>
</Deployment>

 

DinoContracts a Silverlight 4 Class Library (in Dinorythm.xap).
To any nosy disassembler looking for the secret-sauce code, all that's exposed is the interface and perhaps a few domain classes, if you need them.

namespace DinoContracts
{
  public interface IMySecretCode
  {
    string DoSecretWork(string input);
  }
}

 

SecretAlgorithms a Silverlight 4 Class Library (NOT in Dinorythm.xap)
This library has a project reference to DinoContracts and copies its output to the Dinorythm.Web/App_Data folder.

namespace SecretAlgorithms
{
  public class MySecretCode : IMySecretCode
  {
    public string DoSecretWork(string input)
    {
      return "results of my secret code";
    }
  }
}

 

Dinorythm.Web an ASP.NET MVC 2 project
Obviously this code is not production ready. You need some security here, but what you see will get you the result you seek. Securing this action method might be a good topic for another blog post.

namespace Dinorythm.Web.Controllers
{
  [HandleError]
  public class HomeController : Controller
  {
    //...other code removed for brevity

    public FileContentResult Secret()
    {
      string ctype = "application/octet-stream";
      string fileName = "SecretAlgorithms.dll";
      byte[] dll = GetFile(fileName);
      return File(dll, ctype, fileName);
    }

    byte[] GetFile(string fileName)
    {
      string path = HostingEnvironment.MapPath(@"~/App_Data/" + fileName);
      byte[] bytes = System.IO.File.ReadAllBytes(path);
      return bytes;
    }
  }
}

 

Dinorythm a Silverlight 4 Navigation Application
This app has a project reference to DinoContracts but knows nothing about the SecretAlgorithms project. The secret code in this demo won't win any awards, but it might help you conceive (and me remember) how to get the job done with your real intellectual property.

namespace Dinorythm
{
  public partial class About : Page
  {
    public About()
    {
      InitializeComponent();
    }

    // Executes when the user navigates to this page.
    protected override void OnNavigatedTo(NavigationEventArgs e)
    {
      WebClient down = new WebClient();
      down.OpenReadCompleted += new OpenReadCompletedEventHandler(down_OpenReadCompleted);
      Uri location = new Uri(System.Windows.Application.Current.Host.Source, @"../Home/Secret");
      down.OpenReadAsync(location);
    }

    void down_OpenReadCompleted(object sender, OpenReadCompletedEventArgs e)
    {
      AssemblyPart part = new AssemblyPart();
      Assembly asm = part.Load(e.Result);
      IMySecretCode secret = (IMySecretCode)asm.CreateInstance("SecretAlgorithms.MySecretCode");

      if (secret != null)
        this.ContentText.Text = secret.DoSecretWork("help me");
      else
        this.ContentText.Text = "was null";
    }
  }
}

Credits go to Tim Heuer for some comparison with cached assemblies and to the practical help of a Silverlight Tip of the Day.

A Note About Security: I am not a self-proclaimed security expert. I'm sure there are ways to defeat this approach, but I suspect that doing so would be more trouble than it would be worth. Certainly this should be more effective than simple or even complex obfuscation. Then again, one could obfuscate and then dynamically download and instantiate in memory. That ought to throw a would-be intellectual property thief for a real loop. (Also note, I've never tried obfuscation on a Silverlight class library, so perhaps it's not even possible. Hmm... another research and blog topic.)

If you find this code useful, I'd love to hear from you. Download Dinorythm.zip (313.34 KB) here.

WCF Plugin Architecture for .NET Windows Service with Easy Debug Console Mode

I've been wanting a simplified WCF plugin architecture for a .NET Windows service, with an easy debug console mode, to make my life easier. Building services with Windows Communication Foundation (WCF) is both simple and hard. The service and data contract code is simple, like this:

namespace Test2Common
{
  [ServiceContract]
  public interface IMyOtherService
  {
    [OperationContract, FaultContract(typeof(WcfServiceFault))]
    string GetName(string seed);

    [OperationContract, FaultContract(typeof(WcfServiceFault))]
    Person GetPerson(int id);
  }

  [DataContract]
  public class Person
  {
    [DataMember]
    public string Name { get; set; }

    [DataMember]
    public string Title { get; set; }
  }
}

But that is where simple ends. The hosting and client proxy code and the configuration XML (or code) are not simple. Even when I use tools to generate these artifacts, I find myself having to tweak, fix, and delve deeper than time permits. And since I prefer simple, I decided to create a reusable framework for hosting my limited-scope services. Here's what I wanted (download PluggableWcfServiceHost.zip (38.05 KB)):

  • Fast net.tcp binding
  • Windows credentials and authentication
  • Encrypted and signed transport (no packet sniffing allowed)
  • Simplified configuration (hide everything I don't want to see)
  • Windows Service host that behaves like a Console app when I'm debugging
  • Dynamic loading of the service (no changes to the host code to add a new service)
  • Generic client so I don't have to write or generate proxy code
  • Client that is truly IDisposable (hide Abort vs Close for me)
  • Long timeout in DEBUG mode so I can really take my time while debugging
  • Inclusion of exception details in DEBUG mode only
  • Base service class with a simple Authorize method to support multiple Windows groups
  • Support for multiple Windows group authorization
  • Identical configuration for server and client
  • Cached resolution of service plugin service and contract types
  • Minimal number of assemblies (projects) in the solution
  • Keep the implementation of the service hidden from the client
  • (Probably some I've forgotten to mention)

Here's the simplified config with two services configured:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <configSections>
    <section name="wcfServices" 
          type="WcfServiceCommon.WcfServiceConfigurationSection, WcfServiceCommon" />
  </configSections>
  <appSettings/>
  <connectionStrings/>
  <wcfServices consoleMode="On">
    <services>
      <add key="test1"
          serviceAddressPort="localhost:2981"
          endpointName="Test1EndPoint"
          authorizedGroups="WcfServiceClients,someOtherGroup"
          hostType="Test1Service.ThatOneService, Test1Service"
          contractType="Test1Common.IThatOneService, Test1Common" />
      <add key="test2"
          serviceAddressPort="localhost:2981"
          endpointName="Test2EndPoint"
          authorizedGroups="WcfServiceClients,someOtherGroup"
          hostType="Test2Service.MyOtherService, Test2Service"
          contractType="Test2Common.IMyOtherService, Test2Common" />
    </services>
  </wcfServices>
</configuration>
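If you haven't built a custom configuration section before, the class behind the <wcfServices> element looks roughly like this. The property names mirror the attributes in the config above, but this is a sketch; the actual WcfServiceCommon implementation in the download may differ:

```csharp
using System.Configuration;

// Sketch of a custom section backing <wcfServices> (illustrative only).
public class WcfServiceConfigurationSection : ConfigurationSection
{
  [ConfigurationProperty("consoleMode", DefaultValue = "Off")]
  public string ConsoleMode
  {
    get { return (string)this["consoleMode"]; }
  }

  [ConfigurationProperty("services")]
  [ConfigurationCollection(typeof(WcfServiceElement), AddItemName = "add")]
  public WcfServiceElementCollection Services
  {
    get { return (WcfServiceElementCollection)this["services"]; }
  }
}

// One <add ... /> entry under <services>.
public class WcfServiceElement : ConfigurationElement
{
  [ConfigurationProperty("key", IsKey = true, IsRequired = true)]
  public string Key { get { return (string)this["key"]; } }

  [ConfigurationProperty("serviceAddressPort", IsRequired = true)]
  public string ServiceAddressPort { get { return (string)this["serviceAddressPort"]; } }

  [ConfigurationProperty("endpointName", IsRequired = true)]
  public string EndpointName { get { return (string)this["endpointName"]; } }

  [ConfigurationProperty("authorizedGroups")]
  public string AuthorizedGroups { get { return (string)this["authorizedGroups"]; } }

  [ConfigurationProperty("hostType", IsRequired = true)]
  public string HostType { get { return (string)this["hostType"]; } }

  [ConfigurationProperty("contractType", IsRequired = true)]
  public string ContractType { get { return (string)this["contractType"]; } }
}

// The <services> collection, keyed by the "key" attribute.
public class WcfServiceElementCollection : ConfigurationElementCollection
{
  protected override ConfigurationElement CreateNewElement()
  {
    return new WcfServiceElement();
  }

  protected override object GetElementKey(ConfigurationElement element)
  {
    return ((WcfServiceElement)element).Key;
  }
}
```

The host then reads the section once with ConfigurationManager.GetSection("wcfServices") and looks up each service by key.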

And here's the implementation of the service (It really is simple):

namespace Test2Service
{
  public class MyOtherService : WcfServiceBase, IMyOtherService
  {
    public string GetName(string seed)
    {
      base.Authorize();
      return "This is my name: " + seed.ToUpper();
    }

    public Person GetPerson(int id)
    {
      base.Authorize();
      return new Person { Name = "Donald Trumpet", Title = "King of the Hill" };
    }
  }
}

Now for the really fun part. The client. Hey, where's the proxy? Hidden away just like it ought to be.

namespace WcfSvcTest
{
  class Program
  {
    static void Main(string[] args)
    {
      using (var client1 = WcfServiceClient<IThatOneService>.Create("test1"))
      {
        Console.WriteLine(client1.Instance.GetName("seed"));
        var add = client1.Instance.GetAddress(8);
        Console.WriteLine("{0}", add.City);
      }

      using (var client2 = WcfServiceClient<IMyOtherService>.Create("test2"))
      {
        try
        {
          Console.WriteLine(client2.Instance.GetName("newseed"));
          var per = client2.Instance.GetPerson(7);
          Console.WriteLine("{0}, {1}", per.Name, per.Title);
        }
        catch (FaultException<WcfServiceFault> fault)
        {
          Console.WriteLine(fault.ToString());
        }
        catch (Exception e) //handles exceptions not in wcf communication
        {
          Console.WriteLine(e.ToString());
        }
      }

      Console.ReadLine();
    }
  }
}

To save space, I wrapped the client usage in a try/catch only for the second test service.
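The actual WcfServiceClient<T> ships in the download, but its core is a thin ChannelFactory<T> wrapper whose Dispose hides the Close-versus-Abort dance. A rough sketch, assuming the endpoint is resolved elsewhere (the real class reads the address and port from the wcfServices config section by key; the hard-coded address here is illustrative only):

```csharp
using System;
using System.ServiceModel;

// Sketch of a generic, truly IDisposable WCF client wrapper.
public sealed class WcfServiceClient<T> : IDisposable where T : class
{
  private readonly ChannelFactory<T> _factory;

  public T Instance { get; private set; }

  private WcfServiceClient(string address)
  {
    // net.tcp with transport security: encrypted and signed on the wire
    var binding = new NetTcpBinding(SecurityMode.Transport);
    _factory = new ChannelFactory<T>(binding, new EndpointAddress(address));
    Instance = _factory.CreateChannel();
  }

  public static WcfServiceClient<T> Create(string key)
  {
    // The real implementation looks up serviceAddressPort and
    // endpointName by key in the wcfServices section; hard-coded here.
    return new WcfServiceClient<T>("net.tcp://localhost:2981/" + key);
  }

  public void Dispose()
  {
    // Close on success, Abort on failure -- hidden from the caller.
    var channel = Instance as ICommunicationObject;
    try
    {
      if (channel != null && channel.State != CommunicationState.Faulted)
        channel.Close();
      else if (channel != null)
        channel.Abort();
      _factory.Close();
    }
    catch
    {
      if (channel != null) channel.Abort();
      _factory.Abort();
    }
  }
}
```

The using blocks in the test program above work because Dispose decides between Close and Abort based on the channel state, so a faulted channel never throws out of the using.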

Note: Remember that the WcfServiceHost requires the "common" and "service" assemblies of your dynamically loaded services in its bin folder. The client (see the WcfSvcTest project in the solution) will also need a copy of the "common" assemblies in its bin folder. You'll find I'm doing that for the test using post-build commands (copy $(TargetPath) $(SolutionDir)WcfServiceHost\bin\debug\). And of course, both need to have identical config sections as shown in the code.

I hope you find this WCF plugin architecture and the accompanying code useful. I will certainly be using it. If you do use it, please let me know how it goes.

Download PluggableWcfServiceHost.zip (38.05 KB)

Keyword Extraction in C# with Word Co-occurrence Algorithm

A few years ago I worked on a project called Atrax, which among other things included an implementation of the work of Yutaka Matsuo of the National Institute of Advanced Industrial Science and Technology in Tokyo and of Mitsuru Ishizuka of the University of Tokyo.

I decided to revisit the keyword extraction algorithm and update it a bit and isolate it from the overall Atrax code to make it easier for anyone to use. You can download the code Keyword.zip (17.87 KB).
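To give a flavor of the approach, here is a minimal sketch of the core idea behind the algorithm: terms whose co-occurrence with the document's frequent terms is strongly biased (measured with a chi-square-like score) tend to be keywords. This is not the Atrax implementation (no stemming, phrase detection, or clustering), just the scoring loop:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal sketch of co-occurrence keyword scoring (illustrative only).
public static class CooccurrenceSketch
{
  public static List<string> TopKeywords(string text, int top)
  {
    // Split into rough "sentences", then terms.
    var sentences = text.ToLowerInvariant()
      .Split(new[] { '.', '!', '?' }, StringSplitOptions.RemoveEmptyEntries)
      .Select(s => s.Split(new[] { ' ', ',', ';', '\n' },
                           StringSplitOptions.RemoveEmptyEntries))
      .ToList();

    var termFreq = sentences.SelectMany(s => s)
      .GroupBy(t => t).ToDictionary(g => g.Key, g => g.Count());

    // "Frequent" terms: roughly the top 30% by frequency.
    var frequent = new HashSet<string>(termFreq
      .OrderByDescending(kv => kv.Value)
      .Take(Math.Max(1, termFreq.Count * 3 / 10))
      .Select(kv => kv.Key));

    double totalTerms = sentences.Sum(s => s.Length);
    var scores = new Dictionary<string, double>();

    foreach (var w in termFreq.Keys)
    {
      double chi2 = 0;
      foreach (var g in frequent)
      {
        if (g == w) continue;
        // observed co-occurrences of w and g in the same sentence
        double observed = sentences.Count(s => s.Contains(w) && s.Contains(g));
        // expected co-occurrences if w were independent of g
        double expected = sentences.Count(s => s.Contains(w))
                          * (termFreq[g] / totalTerms);
        if (expected > 0)
          chi2 += Math.Pow(observed - expected, 2) / expected;
      }
      scores[w] = chi2;
    }

    return scores.OrderByDescending(kv => kv.Value)
                 .Take(top).Select(kv => kv.Key).ToList();
  }
}
```

The real implementation in Keyword.zip does considerably more (phrase handling is how entries like "ASP.NET MVC" come out as single keywords), but the biased-co-occurrence score is the heart of it.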

Here are the top ten keywords the code returns from the Gettysburg Address and from Scott Gu’s most recent blog post:

gettys
   dedicated
   nation
   great
   gave
   dead
   rather
   people
   devotion
   people people
   lives

gu
   ASP.NET MVC
   VS 2010 and Visual Web Developer
   ASP.NET 3.5
   ASP.NET
   MVC
   Web Developer 2008 Express
   VS 2010
   VS 2008
   release
   Improved Visual Studio

Let me know if you end up using the implementation of the algorithm and if you happen to make improvements to it.

Keyword.zip (17.9KB) - updated link

ASP.NET MVC 2 MinStringLength Custom Validation Attribute

I love ASP.NET MVC 2 validations for client and server via annotations. Steve Sanderson's xVal is great too, but in this post I want to focus on a custom validation for MVC 2 which is frustratingly missing from the out-of-the-box validations. There is a very nice StringLengthAttribute which allows you to specify a maximum length but does not provide for a minimum length.

At first I attacked this deficiency with a regex like this:

[RegularExpression("[\\S]{6,}", ErrorMessage = "Must be at least 6 characters.")]
public string Password { get; set; }

This approach just seems clunky. Happily, while reading Scott Allen's piece on MVC 2 validation in MSDN magazine, I came across a critical reference to one of Phil Haack's blog posts which I must have overlooked.

Thanks go to Phil's easy to follow instructions on writing a custom validator for MVC 2 that will work on both client and server sides.

So here's my custom MinStringLengthAttribute and supporting code, which lets me do something easier and cleaner like this:

[StringLength(128, ErrorMessage = "Must be under 128 characters.")]
[MinStringLength(3, ErrorMessage = "Must be at least 3 characters.")]
public string PasswordAnswer { get; set; }

If you really wanted to get creative, you could produce a combination of the two, but I'll leave that for another day.
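For what it's worth, the combined attribute would only take a few more lines. Here's a sketch (StringLengthRangeAttribute is a name I made up; it covers only the server side, and client-side support would need its own validator and registration, just like MinStringLength below):

```csharp
using System.ComponentModel.DataAnnotations;

// Hypothetical combination of minimum and maximum length in one
// attribute. Server-side validation only, as sketched here.
public class StringLengthRangeAttribute : ValidationAttribute
{
  public int MinLength { get; set; }
  public int MaxLength { get; set; }

  public StringLengthRangeAttribute(int minLength, int maxLength)
  {
    MinLength = minLength;
    MaxLength = maxLength;
  }

  public override bool IsValid(object value)
  {
    if (null == value) return true;   // not a required validator
    var len = value.ToString().Length;
    return len >= MinLength && len <= MaxLength;
  }
}
```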

Here's the code for the attribute and the required validator:

public class MinStringLengthAttribute : ValidationAttribute
{
  public int MinLength { get; set; }

  public MinStringLengthAttribute(int minLength)
  {
    MinLength = minLength;
  }

  public override bool IsValid(object value)
  {
    if (null == value) return true;     // not a required validator
    return value.ToString().Length >= MinLength;
  }
}

public class MinStringLengthValidator : DataAnnotationsModelValidator<MinStringLengthAttribute>
{
  int minLength;
  string message;

  public MinStringLengthValidator(ModelMetadata metadata, ControllerContext context, MinStringLengthAttribute attribute)
    : base(metadata, context, attribute)
  {
    minLength = attribute.MinLength;
    message = attribute.ErrorMessage;
  }

  public override IEnumerable<ModelClientValidationRule> GetClientValidationRules()
  {
    var rule = new ModelClientValidationRule
    {
      ErrorMessage = message,
      ValidationType = "minlen"
    };
    rule.ValidationParameters.Add("min", minLength);
    return new[] { rule };
  }
}

Here's the code in the Global.asax.cs that registers the validator:

protected void Application_Start()
{
  RegisterRoutes(RouteTable.Routes);
  DataAnnotationsModelValidatorProvider.RegisterAdapter(typeof(MinStringLengthAttribute), typeof(MinStringLengthValidator));
}

Here's the javascript to hook up the client side:

Sys.Mvc.ValidatorRegistry.validators["minlen"] = function(rule) {
  //initialization
  var minLen = rule.ValidationParameters["min"];

  //return validator function
  return function(value, context) {
    if (value.length < minLen) 
      return rule.ErrorMessage;
    else
      return true;  /* success */
  };
};

Now, in your view, add <% Html.EnableClientValidation(); %> and that's all there is to it.

EventLog WriteEntry Helper Class

This log helper class seems to get used over and over again in projects I've worked on. Generally I use it as a fallback, calling it from an exception block in an application's logging code. Let me know if you find it helpful.

public static class LogHelper
{
  public static bool WriteEventLogEntry(string source, EventLogEntryType entryType, int eventId, string message)
  {
    bool passFail = true;
    try
    {
      if (!EventLog.SourceExists(source))
      {
        EventLog.CreateEventSource(source, "Application");
      }
      using (EventLog elog = new EventLog())
      {
        elog.Source = source;
        elog.WriteEntry(message, entryType, eventId);
      }
    }
    catch
    {
      passFail = false; //this method is usually called as a last resort, so there can be no real exception handling
    }
    return passFail;
  }
}
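Typical usage is as a last-ditch fallback inside the catch block of your primary logging path. A sketch (the source name, event id, and the primary logger are all placeholders):

```csharp
using System;
using System.Diagnostics;

// Illustrative fallback logging. "MyAppSource", event id 1001, and the
// commented-out primary logger call are hypothetical placeholders.
public static class FallbackLoggingExample
{
  public static void LogError(string message)
  {
    try
    {
      // primaryLogger.Error(message);   // normal logging path (hypothetical)
    }
    catch
    {
      // last resort: write straight to the Windows Application event log
      LogHelper.WriteEventLogEntry("MyAppSource", EventLogEntryType.Error, 1001, message);
    }
  }
}
```

Note that EventLog.CreateEventSource requires administrative rights the first time a source is registered, which is one more reason the helper swallows its own exceptions.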

Windows Azure 1.0 CloudTableClient Minimal Configuration

It turns out that using table storage in Windows Azure 1.0 is quite easy, requiring only minimal configuration. In the new Azure cloud service Visual Studio project template, configuration is significantly simpler than what I ended up with in a previous blog post, where I got the new WCF RIA Services working with the Windows Azure 1.0 November 2009 release and the AspProviders code from the updated demo.

And while the project runs as expected on my own machine, I can’t get it to run in the cloud, and I have not yet solved that problem. Since the Azure control panel gives little real indication of the reason for the failure, I’ve decided to start at the beginning, taking each step to the cloud rather than building entirely in the local simulation environment first.

I started with a clean solution, with no changes except some static text on the web role’s Default.aspx page. Publish to the cloud, and twenty to thirty minutes later (too long, in my opinion) the page comes up: deployment is complete, and the equivalent of a “hello world” app is running in the Azure cloud.

The next step I want to experiment with is the simplest use possible of Azure table storage. In the previous project, there was a lot of configuration. In this new project, there was very little. I wondered how much of the configuration from the previous project, largely an import from the updated AspProviders demo project, was a partially unnecessary legacy of CTP bits. It turns out, nearly all of it.

Here’s the only configuration you need for accessing Azure storage:

<?xml version="1.0"?>
<ServiceConfiguration serviceName="MyFilesCloudService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MyFiles">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- Add your storage account information and uncomment this to target Windows Azure storage. 
      <Setting name="DataConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myacct;AccountKey=heygetyourownkey" />
      <Setting name="DiagnosticsConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myacct;AccountKey=heygetyourownkey" />
      -->

      <!-- local settings -->
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
      <Setting name="DiagnosticsConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>

The thumbnails sample does provide one important piece of code that does not come with the standard Visual Studio template. It goes in the WebRole.cs file and makes use of the CloudStorageAccount class’s static FromConfigurationSetting method, which returns the needed CloudStorageAccount instance. Before that method can be used, the SetConfigurationSettingPublisher method must have already been called. Hence this code, placed in the WebRole class:

public class WebRole : RoleEntryPoint
{
  public override bool OnStart()
  {
    DiagnosticMonitor.Start("DiagnosticsConnectionString");

    // For information on handling configuration changes
    // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
    RoleEnvironment.Changing += RoleEnvironmentChanging;

    #region Setup CloudStorageAccount Configuration Setting Publisher

    // This code sets up a handler to update CloudStorageAccount instances when their corresponding
    // configuration settings change in the service configuration file.
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
      // Provide the configSetter with the initial value
      configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));

      RoleEnvironment.Changed += (sender, arg) =>
      {
        if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
           .Any((change) => (change.ConfigurationSettingName == configName)))
        {
          // The corresponding configuration setting has changed, propagate the value
          if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
          {
            // In this case, the change to the storage account credentials in the
            // service configuration is significant enough that the role needs to be
            // recycled in order to use the latest settings. (for example, the 
            // endpoint has changed)
            RoleEnvironment.RequestRecycle();
          }
        }
      };
    });
    #endregion

    return base.OnStart();
  }

  private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
  {
    // If a configuration setting is changing
    if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
    {
      // Set e.Cancel to true to restart this role instance
      e.Cancel = true;
    }
  }
}

Once you have set the configuration setting publisher, you can use the FromConfigurationSetting method when creating your CloudTableClient, and then, in your repository code, check for the existence of a table, creating it if it does not already exist. Here’s a minimal example that I used and published successfully to my Azure account.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure;
using System.Data.Services.Client;

namespace MyDemo
{
  public class AdventureRow : TableServiceEntity
  {
    public string IpAddress { get; set; }
    public DateTime When { get; set; }
  }

  public class AdventureRepository
  {
    private const string _tableName = "Adventures";
    private CloudStorageAccount _account;
    private CloudTableClient _client;

    public AdventureRepository()
    {
      _account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
      _client = new CloudTableClient(_account.TableEndpoint.ToString(), _account.Credentials);
      _client.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(1));
      _client.CreateTableIfNotExist(_tableName);
    }

    public AdventureRow InsertAdventure(AdventureRow row)
    {
      row.PartitionKey = "AdventurePartitionKey";
      row.RowKey = Guid.NewGuid().ToString();
      var svc = _client.GetDataServiceContext();
      svc.AddObject(_tableName, row);
      svc.SaveChanges();
      return row;
    }

    public List<AdventureRow> GetAdventureList()
    {
      var svc = _client.GetDataServiceContext();
      DataServiceQuery<AdventureRow> queryService = svc.CreateQuery<AdventureRow>(_tableName);
      var query = (from adv in queryService select adv).AsTableServiceQuery();
      IEnumerable<AdventureRow> rows = query.Execute();
      List<AdventureRow> list = new List<AdventureRow>(rows);
      return list;
    }
  }
}

Just as the StorageClient has moved from “sample” to first-class library in the Azure SDK, I suspect that the AspProviders sample may already be on that same path to glory. In its current form, it represents something new and something old, the old not yet entirely cleaned up, lacking the elegance it deserves and will likely get, either from Microsoft or from some other enterprising Azure dev team.

As for me, I will continue trying to learn, one step at a time, why the Adventure code I blogged about previously runs locally but not in the cloud as expected. Who knows, it could be as simple as a single configuration gotcha, but figuring it out one step at a time will be a great learning opportunity, and as I get the time, I’ll share what I learn here.

Enterprise Silverlight 3 with WCF RIA Services on Windows Azure 1.0 Part 1 Redux

A month ago I posted the results of some experimentation with the preview bits of what was then called .NET RIA Services, Silverlight 3 and the Windows Azure with AspProvider sample code from the Azure July 2009 CTP. That was then and much has changed.

Windows Azure 1.0 and what is now called WCF RIA Services Beta have since been released. Lots of great changes make it easier to use these together with the new AspProviders sample in the “Additional C# Samples” download that some friendly readers shared with me. With Visual Studio 2008 SP1 and SQL Server 2008 (Express if you want) and these, you’re set up to play.

WARNING: They were not lying about the WCF part when they renamed it. The default Domain Service channel is binary, and they’re using Windows Activation Service (WAS). So make sure your Windows Features look something like this or you’re in for several hours of maddening error chasing:

(Screenshot: the required Windows Features settings.)

After some review of the previous effort using the July 2009 CTP and RIA preview bits, I decided starting over was the best course of action. Here are the steps to nirvana:

  1. Create a new Silverlight Business Application called Adventure which produces Adventure.Web application as well.
  2. Add new CloudService project with WebRole called AdventureCloudService and WebRole1.
  3. Copy WebRole.cs to Adventure.Web and rename namespace.
  4. Add 3 azure refs to Adventure.Web.
  5. In the cloud service project file, replace WebRole1 with Adventure.Web and project guid with that from Adventure.Web. (There is probably a better way to do this.)
  6. The node under Roles in the cloud service project shows an error. Right click it and choose “associate” and pick Adventure.Web.
  7. Copy system.diagnostics section from WebRole1 web.config to that of Adventure.Web.
  8. Remove WebRole1 from solution and set service project as startup.
  9. Copy in the new AspProviders project and the related sections from the demo web.config into Adventure.Web, changing DefaultProviderApplicationName and applicationName="Adventure".
  10. Repeat the previous steps to create/copy the UserProfile class and IUserProfile with its FriendlyName property, added to the Models directory this time. NOTE: Be sure to get the magic strings right in the UserProfile class or you will get unexpected results.
  11. Add Global.asax from the previous project, along with its AppInitializer class, but without the CreateTableFromModel calls, which as I understand it are no longer needed.
  12. Drop in the code to create the admin user if it does not already exist.
  13. Go to modify LoginStatus.xaml (previously LoginControl.xaml), only to find the needed modification is already there. Skip this step.
  14. Just hit F5, and an error is thrown.

After some lengthy research, I discovered a bug in the AspProviders sample's TableStorageRoleProvider.cs. When the RoleExists method is called and no roles have yet been created, an exception is thrown.

AspProvider Fix Found
Replace e.InnerException on line 484 of TableStorageRoleProvider.cs with e.InnerException.InnerException. The first inner exception is another InvalidOperationException; the second inner exception is the DataServiceClientException we're looking for.
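To illustrate the shape of the fix (this paraphrases the sample's error inspection; the actual code around line 484 and the method name here are not the literal sample code):

```csharp
using System;
using System.Data.Services.Client;

// Paraphrase of the error inspection inside TableStorageRoleProvider.
// With the extra exception wrapping in the 1.0 StorageClient, the
// DataServiceClientException now sits one level deeper.
static class RoleProviderFixSketch
{
  static bool IsStorageNotFound(InvalidOperationException e)
  {
    // was: e.InnerException as DataServiceClientException
    var dsce = e.InnerException.InnerException as DataServiceClientException;
    return dsce != null; // the sample then inspects this exception
  }
}
```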

Log in with admin/admin, and we see Administrator displayed in the newly renamed LoginStatus area.

And we’re back to par. Download the code here (566KB).

Windows Azure Tools SDK (November 2009) Refactor or Restart?

The 1.0 release of Windows Azure SDK is now available. Download and read about what’s new here. Sadly, the AspProviders sample project was not included. I spent a few hours trying to bandage it up, but it was a hopeless cause. If a new AspProviders sample is not soon forthcoming, it will be easier to write one from scratch using the new SDK and the new project templates.

Update (11/22/09) - AspProviders in Additional C# Samples: One comment and an additional private email from readers have pointed me to the Azure Code Samples on MSDN, where you can download the "Additional C# Samples" which contain the new AspProviders. I'll be checking out this and other samples with the new Azure SDK. Thanks to both Jason and Gerold for the links.

When I tried to run my old project, after modifying some references to match those of a fresh project created with the new Visual Studio templates, the first thing that popped up was a new local storage database script dialog:

(Screenshot: the development storage database script dialog.)

I’ve not explored them in detail but there are several new tables, some stored procedures and a few user defined functions in the new dev storage database.

I clicked okay, and before I got my next error, I noticed the new balloon dialog name for the dev fabric, Windows Azure Simulation Environment:

(Screenshot: the Windows Azure Simulation Environment balloon.)

I also compared the .cscfg and .csdef config files for the Azure cloud service project. The XML namespace is the same, and as near as I can tell there are no changes there.

The next thing I noticed was a new class in the web role of the new project template. It seems we now have a “Windows Service”-like set of startup and status-change event handlers: OnStart and RoleEnvironmentChanging. This will be highly useful for web role applications that may have resource cleanup needs.

public class WebRole : RoleEntryPoint
{
	public override bool OnStart()
	{
		DiagnosticMonitor.Start("DiagnosticsConnectionString");

		// For information on handling configuration changes
		// see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
		RoleEnvironment.Changing += RoleEnvironmentChanging;

		return base.OnStart();
	}

	private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
	{
		// If a configuration setting is changing
		if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
		{
			// Set e.Cancel to true to restart this role instance
			e.Cancel = true;
		}
	}
}

Another bit of great news is that the StorageClient goes from “sample” project to part of the release libraries. There were ample changes there, which appear to have broken the AspProviders sample project severely. While that explains the disappointment of the missing AspProviders project, I do hope it means the changes in the StorageClient namespace will make using Azure storage easier.

Much more to learn and blog about with the new release, but it looks as though I’ll need to start over on my “Adventure” sample. I’m glad that I don’t have loads of code to refactor to the new release. It makes starting over less painful. For those who have written large projects relying on the beta, all I can say is, “beta = subject to change.”