Dennis Burton's Develop Using .NET

Change is optional. Survival is not required.
Tags: Azure | knockoutjs | screencast | signalr

Customers today have high expectations of their websites. While they may not be able to tell you what a postback or a page refresh is, they know they do not want to see them. Customers also expect their websites to be alive with content. They have dynamic businesses, and they expect their web applications to reflect this.

If you are a web developer with the ASP.NET stack in your tool belt, the Knockout.js and SignalR libraries are available to make developing these dynamic applications easier. These libraries handle much of the DOM manipulation and low-level communication details so you can get on with adding business value instead of worrying about plumbing. This screencast series will give you an in-depth introduction to using Knockout.js, SignalR, and git deployment to the new Windows Azure Websites platform.

All of the code for the application developed during this screencast is available on GitHub.

Building an MVC 4 application with Knockout.js and Windows Azure Websites:

Adding real-time communication with SignalR:
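The screencast walks through the full setup; to give a rough flavor of the server side, a SignalR hub can be as small as the sketch below. This is a hedged illustration rather than code from the screencast: the names QuestionHub and addQuestion are hypothetical, and the Clients syntax shown follows the later Microsoft.AspNet.SignalR 1.x packages (earlier SignalR releases exposed a slightly different API).

using Microsoft.AspNet.SignalR;

// Broadcasts a submitted question to every connected client that registered
// an addQuestion handler on the JavaScript side.
public class QuestionHub : Hub
{
  public void Submit(string text)
  {
    Clients.All.addQuestion(text);
  }
}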


Tags: community

Several years ago Jay Harris (a friend and co-worker) and I decided to start attending some of the local .NET user groups both to increase our skills and make some connections. Of course, we both came up with excuses every month as to why this was inconvenient. But, we eventually pushed each other into going. I need to say “Thank You” to Jay for continually pushing me to be better at what I do.

After a few months of going to the Greater Lansing .NET user group and many conversations about what we worked on, Jeff McWherter started convincing me that I had something valuable to contribute. At first, I blew this off and sat happily listening to the presentations. Those of you who know Jeff know that he was not going to let this go. Jeff continued to encourage me to get into speaking. I need to say “Thank You” to Jeff for opening my mind to giving to the community instead of just taking. The impact this has had on me cannot be measured. Community involvement is what led me to a different job where I actually spend time with my family. It is really hard to overstate how important a turning point this was for me.

A year or so later, I started working at SRT Solutions. SRT is a unique organization that not only encourages keeping up with the latest technologies and sharing what you have learned with the broader development community, it also allocates time to accomplish this. As an example, this year I taught at four Windows Azure Boot Camps. Two of these events were in cities far enough away to require a day of travel. Combine that with the preparation time and you have about two weeks of time that SRT has invested in allowing me to teach others about something I am passionate about. I need to say “Thank You” to Bill Wagner and Dianne Marsh for building the company that they wanted to work for. I can think of very few places that would have allowed for spending that much time on community, much less encouraged it.

This year the Microsoft developer evangelists from my region, Jennifer Marsman and Brian Prince, mentioned that they wanted to support me being recognized for all of my community-related activities. Jennifer is always there to support events in our area. She has helped me put together two conferences, supports all of the user groups that I help out with, and has given me invaluable advice on many occasions. Brian put together the Windows Azure Boot Camps that gave me a platform to talk about what I love. I need to say “Thank You” to Jennifer and Brian for helping me broaden my community connections.

This morning I was greeted with an email that said:

Congratulations! We are pleased to present you with the 2011 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in Windows Azure technical communities during the past year.

I need to say “Thank You” to Microsoft for allowing me the opportunity to participate in this program with a whole bunch of people that are way smarter than me. I look forward to the learning and connections that this will allow me over the next year.

Tags: community | speaking

This weekend I had the pleasure of presenting at Cloud Dev Day Detroit on the topic of migrating an existing application from on-premises to fully hosted on Windows Azure/SQL Azure. I presented an hour-long session with a significant amount of its content dedicated to demonstrating the process of migration. The heavy demo content was important to me, as I did not want the audience to see yet another high-level overview that day. I wanted to show real code!

This could go really badly

I decided to check out the internet connection early in the day to see how it would perform with a couple hundred of my closest geek friends checking Twitter on their phones. It turns out the speed of the internet connection at the conference was exceptional. However, port 1433 was blocked. This is the port used to communicate with SQL Server (including SQL Azure). A good portion of my demos were based around migrating from a local SQL Server instance to SQL Azure. You might think this was going to go very badly.

The good news

The day prior to the event, while practicing my demos (as every speaker should), I fired up Camtasia and recorded all of the demos I planned to give. I originally did this as a way to see how they played back and how the pace of the talk felt. Having these recordings available at the conference saved my bacon! Not only did it save my bacon for the items I was planning on doing live, but I was also able to show more than one deployment to the Windows Azure environment. With the editing tools in Camtasia, I was able to reduce a 15-20 minute deployment process to just over a minute. I believe this made for a better experience for those attending the talk.

What do you think?

Does a recorded demo reduce the speaker’s street cred? Is a recorded demo better than the inevitable typos and demo demons that pop up during a presentation? How would you leverage this tool?

Tags:

Thank You to those who came to my session at Cloud Dev Day Detroit. I hope you were able to pick up a few tips, tricks, and guidance during the session.

This is the list of links I referred to during the session:

Tags: Azure

As an Azure developer, you likely followed the recent AWS outage news and were relieved that it happened to Amazon and not to your Windows Azure instances. However, you still need to learn from this landmark event in cloud computing history.

The reality is that failure can happen on any of the cloud offerings. Just because someone else is managing the physical machines does not mean that design for failure and disaster recovery planning are optional steps in delivering a solution. Chances are good that most deployments to the Windows Azure environment would not survive a similar event. Let’s take a look at why you may be at risk and actions you can take to safeguard your deployments.

Current Guidance

When you are setting up your hosted services and your storage accounts, current guidance is to put these together in an affinity group. This ensures that the hosted services and the storage accounts stay in the same physical location. If you are using SQL Azure, you will likely configure that instance to be in the same location as the hosted services that would consume its data. This guidance will continue due to the performance and cost advantages of being in the same data center. Ah, now you see the problem. Just like many of the services that were sitting solely in the Northern Virginia data center, your deployment now sits in a single data center.

Performance Advantage of close proximity

There are significant performance benefits to having your data in the same location as your compute instances. Your application would suffer significant latency penalties for having your web role in North Central US and data in North Europe. This latency adds at least 250ms to any call to fetch data. Contrast this with two machines on the same rack connected by a gigabit connection. Hosting your data far from your hosted services could well cause an unacceptable level of performance degradation. I have worked for clients where any database call that took more than 100ms in testing would cause uncomfortable meetings with the resident DBA. This latency is 2.5 times that at best and does not include the data-side processing. The takeaway is by no means a new concept, nor is it unique to Windows Azure: keep your data close and performance will improve.

Cost Advantage of close proximity

Under the Windows Azure pricing model, bandwidth between a web role and storage in the same data center is not charged. In the previous example, where the web role is in North Central US and the data is in North Europe, all bandwidth between the two is charged. Pay-by-resource computing requires that the ongoing cost of a system be kept in mind when designing a solution. The takeaway: keep your data in the same data center and charges can be significantly reduced.

Disadvantage of single datacenter

What happens if you put your compute, storage, and SQL Azure instances in the North Central datacenter, and that datacenter goes down much like the AWS Northern Virginia datacenter did? What if it is more like the blackout that occurred in the Northeast a few years ago? You can bet that the Windows Azure team is planning for just such an event, if they have not already. But until this scenario is covered in the SLA that you agree to when using Windows Azure (or any cloud vendor), you need to plan for this failure.

What can I do?

So, we have two opposing forces. We need our hosted service, storage, and SQL Azure instance located close together for the best performance and the most significant cost advantage. We also need hosted services in multiple physical locations for failover. It would seem that there is no resolution between those two goals, but the Windows Azure team has some offerings in the works that help resolve this conflict. Traffic Manager (in CTP) will allow you to put your hosted services in multiple datacenters and provide traffic routing based on geo-location or round robin. In the event of a datacenter failure, this allows traffic to your web role to be routed to a functioning datacenter. SQL Azure Data Sync (in CTP) will allow you to keep SQL Azure instances in the same location as each of your hosted services and keep the data in sync in the background. The connection string for each hosted instance would point to the nearby SQL Azure instance for the best performance.

What is missing

As of right now, you would need to write code to keep your storage accounts in sync. Many of your applications may be taking advantage of Windows Azure Table Storage for scalable, low-cost data services. There is currently no offering for keeping your Table Storage data in sync. I think this feature would complete the failover scenario on the Windows Azure platform, providing a significant advantage over other cloud offerings today. As such, I have put in a feature request on the site MyGreatWindowsAzureIdea. If you agree with me, please vote up the feature request, which helps raise its visibility with the Windows Azure team.

Tags: screencast | tdd | testing | tools

Test Driven Development has been in full force for quite a few years now. This has led us to volumes of tests that ensure we are building the system right. This has proven to be a valuable part of the development process. What is missing from the focus on TDD, however, is that test code is not something you can sit down and talk about with a non-technical client. Having test code that clients can read (and potentially even write) facilitates communication on a whole new level. Ultimately, this leads to building the right system.

Our friends in the Ruby community have been enjoying the benefits of a tool called Cucumber that allows for the creation of specifications in the Gherkin language. This language is a human-readable form that can then be translated into automated tests, creating a set of executable specifications that a client can read and understand.

SpecFlow is an implementation of a Gherkin based specification engine that runs on .NET and integrates with Visual Studio. In this screencast I show you how to use SpecFlow to create specifications and leverage Telerik’s WebAii Test Automation Framework for driving the browser through code.
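To make the shape of this concrete, here is a minimal, hypothetical SpecFlow binding. It is not code from the screencast: the feature text, the QuestionVotingSteps class, and the in-memory QuestionStub are all illustrative, and the browser-driving WebAii calls are deliberately left out to keep the sketch self-contained.

// Hypothetical feature file (Questions.feature): the Gherkin side of the specification.
//
//   Feature: Question voting
//     Scenario: An attendee votes for a question
//       Given a question "Does SQL Azure support full text search?" exists
//       When an attendee votes for that question
//       Then the question should have 1 vote

using NUnit.Framework;
using TechTalk.SpecFlow;

[Binding]
public class QuestionVotingSteps
{
  private QuestionStub question;

  [Given(@"a question ""(.*)"" exists")]
  public void GivenAQuestionExists(string text)
  {
    question = new QuestionStub { Text = text };
  }

  [When(@"an attendee votes for that question")]
  public void WhenAnAttendeeVotes()
  {
    question.Votes++;
  }

  [Then(@"the question should have (\d+) vote")]
  public void ThenTheQuestionShouldHaveVotes(int expected)
  {
    Assert.AreEqual(expected, question.Votes);
  }
}

// Stand-in for the real domain object; in the screencast the steps drive the browser instead.
public class QuestionStub
{
  public string Text { get; set; }
  public int Votes { get; set; }
}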

Download (52.6 MB) (27:52) (1440x900)

Update: Code used in this screencast is available on Bitbucket

This screencast was recorded and edited using Camtasia.

Tags: Azure

This is one of those cases where demo code and samples diverge from the code you use in a real application. Almost every example you see for loading configuration settings, like connection strings, uses the following code:

RoleEnvironment.GetConfigurationSettingValue(key)

This code will pull the value in the service configuration file associated with the given key. This would be sufficient if we had the same tools for the service configuration that we have for web.config files. For web.config files, we have a set of configuration transform files that can be used to apply different settings for different environments. One common place this occurs is in connection strings. In the real world, we set up the connection strings for the production, staging, and development environments once and use the config file generated for the current environment.

Adapting config in Azure

Until similar transform tooling is available for the service configuration, this technique and helper class can reduce the amount of work you have to perform on your service configuration when changing environments during the development cycle. First, add an entry in your service configuration to indicate your current environment. I use values of Development, Staging, and Production for this setting. Then, for your entries that have a different value between environments, create a setting with the original setting name prefixed with the environment name. The image below shows how this would look for the setting ConnString.

[Screenshot: environment-based configuration settings for ConnString]
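In ServiceConfiguration.cscfg terms, the arrangement might look something like the following sketch. The service and role names are placeholders, and the undecorated ConnString setting is deliberately omitted so the helper below falls back to the environment-prefixed value:

<ServiceConfiguration serviceName="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- Selects which of the prefixed settings below is active -->
      <Setting name="CurrentEnvironment" value="Development" />
      <!-- One entry per environment, prefixed with the environment name -->
      <Setting name="DevelopmentConnString" value="Data Source=.;Initial Catalog=AzureOverflow;Integrated Security=True;" />
      <Setting name="StagingConnString" value="..." />
      <Setting name="ProductionConnString" value="..." />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>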

The AzureConfigurationSettings class (shown below) uses the CurrentEnvironment setting to determine which of the ConnString values should be used for the current instance. Using this utility, all of the appropriate connection strings may be kept in the service configuration and only one setting needs to change when switching environments. To consume this in your code, use AzureConfigurationSettings instead of RoleEnvironment when reading configuration settings.

AzureConfigurationSettings.GetConfigurationSettingValue(key);

It is worth noting that Alex Lambert has found a way to run the configuration transforms on service configuration files. He documents the process in a blog post. A side effect is that you lose some ability to use the visual tools for editing configuration; there is, however, much less noise in the configuration file than with the approach in this blog post. Until support for configuration transforms is added to Visual Studio, you will have to pick your pain points.

How it works

The AzureConfigurationSettings class will first attempt to use the configuration setting without the decoration for current environment. This ensures that code you have today will function the same. If no value was found without decoration, the code will attempt to use the configuration setting prefixed with the string found in the CurrentEnvironment setting. I also handled the RoleEnvironment.Changing event so that if a new configuration were loaded through the portal with the CurrentEnvironment setting changed, the cache of that value would be invalidated.

The Source

The code you see below may also be found on Bitbucket with the rest of the solution.

using System.Linq;
using Microsoft.WindowsAzure.ServiceRuntime;

public class AzureConfigurationSettings
{
  // Name of the setting that identifies the current environment (Development, Staging, Production).
  private const string CurrentEnvironmentKey = "CurrentEnvironment";
  private readonly static object lockObj = new object();

  static AzureConfigurationSettings()
  {
    // Invalidate the cached environment name when a configuration change updates CurrentEnvironment.
    RoleEnvironment.Changing += (sender, e) => {
        var hasCurrentEnvironmentChanged = e.Changes
          .OfType<RoleEnvironmentConfigurationSettingChange>()
          .Any(change => change.ConfigurationSettingName == CurrentEnvironmentKey);

        if (hasCurrentEnvironmentChanged)
        {
          lock (lockObj)
          {
            isCurrentEnvironmentLoaded = false;
          }
        }
      };
  }

  public static string GetConfigurationSettingValue(string key)
  {
    string value;
    if (TryGetValue(key, out value))
      return value;

    return null;
  }

  private static bool isCurrentEnvironmentLoaded = false;
  private static string currentEnvironment;
  public static string CurrentEnvironment
  {
    get
    {
      if (!isCurrentEnvironmentLoaded)
      {
        lock (lockObj)
        {
          if (!isCurrentEnvironmentLoaded)
          {
            try
            {
              currentEnvironment = RoleEnvironment
                .GetConfigurationSettingValue(CurrentEnvironmentKey);
            }
            catch (RoleEnvironmentException)
            {
              // The CurrentEnvironment setting is not defined; leave currentEnvironment null.
            }
            isCurrentEnvironmentLoaded = true;
          }
        }
      }
      return currentEnvironment;
    }
  }

  // Try the undecorated key first so existing settings keep working, then fall back to the
  // environment-prefixed key (e.g. DevelopmentConnString) based on CurrentEnvironment.
  private static bool TryGetValue(string key, out string value)
  {
    value = null;
    try
    {
      value = RoleEnvironment.GetConfigurationSettingValue(key);
    }
    catch (RoleEnvironmentException)
    {
      try
      {
        if(!string.IsNullOrEmpty(CurrentEnvironment))
          value = RoleEnvironment.GetConfigurationSettingValue(CurrentEnvironment + key);
      }
      catch (RoleEnvironmentException)
      {
        return false;
      }
    }

    return true;
  }
}
Tags: Azure

In the last post I talked about the sequence I would take for this migration. The first step in this sequence will be to migrate the database from an on-premises SQL Server to SQL Azure.

Creating the on-premises database

The purpose of this database is to support an application where logged-in users may post questions to the instructors at the Windows Azure Boot Camps being hosted around the country. After a question has been posted, attendees can vote to indicate the general level of interest in the question. At the end of the event, a quick run through the list can be used to uncover any unanswered questions.

The database was created with all of the default options via SQL Server Management Studio with the name AzureOverflow. The relatively simple needs for this database are a table for questions, a table for votes, and support for user management. The scripts to create the tables and relationships that I added to the database can be found here. The ASP.NET Membership Provider was chosen to provide the user management support needed for this application. The command line used to create the membership objects was:

aspnet_regsql -A mr -d AzureOverflow -E

Migration Analysis

The primary tool we will use to perform this migration is the SQL Azure Migration Wizard. As of this writing, this tool can perform migrations from:

  • SQL Server to SQL Azure
  • SQL Azure to SQL Server
  • SQL Azure to SQL Azure

What I will focus on here is the analysis feature of this tool. It will give you advance warning about areas where issues are likely to arise when performing the migration, and it should be your first stop before you actually try to move your data to SQL Azure. To perform the analysis, start the Migration Wizard and select SQL Database under the Analyze Only section.

[Screenshot: SQL Azure Migration Wizard, Analyze Only option]

After this you will be prompted for the necessary information to connect to your SQL Server. Analysis results from the AzureOverflow database indicate that there are a few issues with some of the items created by the Membership Provider, but the items required to support questions and voting were fine.

[Screenshot: Migration Wizard analysis results without scripting options set]

Migrating Membership Data

As it turns out, using the ASP.NET Membership Provider on SQL Azure is a solved problem. An updated tool for creating the Membership database on SQL Azure can be found in KB2006191. The command line will be similar to the aspnet_regsql tool with a few modifications. Since Windows Authentication is not currently supported within SQL Azure, we will need to provide a user name and password for SQL Authentication. A server name will also be required since the tool will not be running against a local database.

aspnet_regsqlazure -s mydatabase.database.windows.net -d AzureOverflow -u username@mydatabase -p P@ssw0rd -a mr

After running this command, all of the required elements for the Membership Provider will be created in our SQL Azure database. That, however, is not the end of the story for the Membership Provider. No real business is going to be willing to throw away the user data already stored in their on-premises database, so next we need to migrate the data. The most efficient way to copy a mass of data from one database to another is to use BCP (note that the Migration Wizard uses BCP as well). Sure, you could use the Generate Scripts feature within Management Studio, but the result would be an insert statement for each row in the tables that you want to migrate. For a database of any reasonable size, that is not a viable option; BCP is by far the most efficient way to move our data. We will perform the BCP operation in two steps. First, we export the data from each table into a file:

bcp dbo.MyTableName out MyTableName.dat -n -S MySqlServerName -T

Next, we need to import the data from the file created above into the SQL Azure database. Again, note that we will need to provide the server name as well as user name and password for SQL Server authentication.

bcp dbo.MyTableName in MyTableName.dat -n -S myserver.database.windows.net -U username@myserver -P P@ssw0rd -E

A command file which performs this export and import on all of the Membership Provider tables used by this database can be found here.

Migration of the Application Data

The portion of the migration that relates to questions and voting is, as it should be, completely uninteresting. The SQL Azure Migration Wizard can handle the tables and data required to support the questions and votes without issue. When the option comes up to Choose Objects, I select all of the tables that have not already been migrated with the Membership Provider in the steps above.

[Screenshot: SQL Azure Migration Wizard, Choose Objects step]

After connecting the wizard to our SQL Azure database, the tables are created and the application data migrated without incident. In most cases, the Migration Wizard will be the only tool you need; be sure to perform the analysis first to determine whether this is true for your data. For more information on using the Migration Wizard, check out this post by Rich Dudley. It is an excellent walkthrough of using the wizard from start to finish.

Configuration Changes

Since SQL Azure is simply a TDS endpoint, the only change that needs to occur inside our web application is the connection string to the database. There are a couple of differences in connection strings for SQL Azure databases. First, the connection is to a database and not to a server, so the Initial Catalog parameter of the connection string is a required element. Second, as already mentioned, the current version of SQL Azure does not support Trusted Connections, so you will need to provide a user name and password in the connection string. Note also that the user name is required to be in the format user@server. I have heard rumblings that this will not be required in the near future, but nothing confirmed. This results in a connection string entry that looks like:

<add name="ApplicationServices" connectionString="Data Source=DATABASENAME.database.windows.net;Initial Catalog=AzureOverflow;User Id=USERNAME@DATABASENAME;Password=PASSWORD;" />

Testing the Application

After migrating the user data, application data, and updating the application configuration, our web application is now using a SQL Azure database rather than the on-premises SQL Server. Make sure you run through your usual battery of smoke tests to validate major pieces of functionality at this point.

Use the Force

If you have a database that the Migration Wizard simply will not move successfully, be sure to check out Roger Doherty’s blog post on Brute Force Migration of Existing SQL Server Databases to SQL Azure. For this method to work, you will need to change the default scripting option to script for a SQL Azure database before you generate your DDL scripts.

[Screenshot: SQL Server scripting options set to target SQL Azure]

There is very little that can go wrong using the brute force approach listed in the article, but it is certainly a fair bit more effort than the Migration Wizard. So if at all possible, stick with the Migration Wizard for a simple and painless migration. As with many things cloud related, a hybrid approach may be your best option. You can see that throughout this post in the use of the Migration Wizard, the updated aspnet_regsqlazure tool to perform the DDL operations, and BCP to migrate the user data. Carefully consider your cloud migration tasks; there are usually several ways to perform each one. Weigh the cost of these options in terms of your time, the time required to perform the task, and the cost of the cloud resources you will use during the migration. All of these costs will influence your choice of migration techniques.

Tags: Azure

There is no shortage of great material out there to help you build your next application on the Windows Azure platform. What I will do in this series is show the process of migrating an on-premises web application to Windows Azure. What you will see over the next few posts is an application that I built to help prioritize questions during the Azure Boot Camp events. The application is built using the tools and patterns that I use today to build web applications. You will see appearances from MVC2, Fluent NHibernate, MVCContrib, and the Windsor Container.

Many of these tools were added to the project with minimal effort using NuGet. One feature that saved time on this project was the ability of NuGet to add assembly binding redirects for projects that depend on multiple versions of a library. The Add-BindingRedirects call from the Package Manager Console took care of this problem for Castle.Core, which had bindings to both the 1.x and 2.x versions of the assembly.
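The result of that call is a binding redirect in web.config along these lines; the version numbers and public key token here are placeholders rather than the actual Castle.Core values:

<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <assemblyIdentity name="Castle.Core" publicKeyToken="PUBLIC_KEY_TOKEN" culture="neutral" />
      <!-- Requests for any older version are redirected to the single version actually deployed. -->
      <bindingRedirect oldVersion="1.0.0.0-2.5.2.0" newVersion="2.5.2.0" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>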

Patterns

I leveraged the repository pattern, with a repository whose only public methods are the data operations required to build the application. Additional code was not added to support CRUD operations on every single element of the model when there was no requirement for that functionality. As the migration progresses and the data access code is replaced, this repository tells us exactly which operations need to be supported.
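As an illustration of that shape (the names below are hypothetical, not the actual interface in the repository), the contract might look like this:

using System.Collections.Generic;

// Trimmed-down domain entity persisted by Fluent NHibernate in the real application.
public class Question
{
  public int Id { get; set; }
  public string Text { get; set; }
  public int EventId { get; set; }
}

// Only the operations the application actually needs are exposed; no generic CRUD surface.
public interface IQuestionRepository
{
  IEnumerable<Question> GetQuestionsForEvent(int eventId);
  void AddQuestion(Question question);
  void RecordVote(int questionId, string userName);
}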

You will also see the use of the view model concept. The controller fetches the required model data and performs the aggregation necessary to present exactly the information needed by the view. This approach allows for much thinner view code, letting the HTML of the page make up a higher percentage of the view. Check out the Spark view engine or the upcoming Razor view engine in MVC3 for even cleaner views.
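A hypothetical view model for the question list shows the idea: the controller does the aggregation up front so the view only renders values. The type and property names are illustrative, not the actual classes in the project.

public class QuestionListItemViewModel
{
  public int QuestionId { get; set; }
  public string Text { get; set; }
  public int VoteCount { get; set; }
  public bool CurrentUserHasVoted { get; set; }
}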

Planning the Migration

Just as with any application, the migration will not take place in one step. When performing a migration from on-premises to the cloud, it is even more important to consider where you will get value by migrating first, as well as to reduce the cost of intermediate steps intended only to assist the migration. This is the order I chose for the migration and my justification for that order.

Step 1: Migration to SQL Azure

Infrastructure – It is easier to point your on-premises app to the SQL Azure instance by changing configuration settings than it is to open up your database server, which is likely sitting behind one or more firewalls, invisible to the outside world. There is a non-trivial amount of infrastructure work necessary to make that happen. Since we know that our next step will be the migration of the web hosting to Windows Azure, I chose not to take on work that would be thrown away once the web instances are relocated.

Cost – Cloud pricing is very dynamic and changes for each scenario. I can only speak to the scenarios I have worked with in the past. Your scenario may be different; do your own math. In the larger web applications I have worked on, the database hardware was the most expensive hardware purchased, and the licensing cost for SQL Server in an Active/Passive configuration is non-trivial. When moving to SQL Azure, the licensing and the failover support are built into the cost. If you are looking at version upgrades in the near future, this kind of migration might just be a good deal for you.

Step 2: Migration of the Web Servers

Missing Pieces – This application does not have an app server doing a lot of data processing. If it did, I would choose to migrate those processes first. Processes that do the heavy lifting with data need to sit as physically close to the data as possible in order to have reasonable performance. Another important consideration is that this type of code usually has the fewest dependencies, which reduces the verification workload once this step of the migration has completed.

Infrastructure – Once this step is completed, the application will run entirely on Windows Azure, allowing us to decommission or repurpose the on-premises hardware. If you are already building applications to scale well with your own hardware, it is likely that few things will need to be modified to run in a web role.

Step 3: Migration to Cheaper Storage

Where it makes sense, we are going to look at migrating data to Table Storage where the charges are (as of this writing) $0.15 per GB/month vs. $9.99 per GB/month on SQL Azure. The planning for this stage will require us to look at our data consumption patterns. While SQL Azure is quite a bit more expensive for storing our bits, there are no charges for transactions. With Table Storage there is a charge for each request that leaves the data center. Since we have already migrated the web site in the previous step, we will make sure that our data stays inside the data center to avoid these charges. Be aware, changing the storage mechanism is a non-trivial change that includes restructuring our data as well as modifying our repository layer to talk to a different style of data storage. There is a different mindset around designing for Table Storage instead of a relational database.
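To make that different mindset concrete, here is a purely illustrative sketch of how a question row might be reshaped as a Table Storage entity using the StorageClient library of that era; the partitioning choice and property names are assumptions, not the project's actual design.

using Microsoft.WindowsAzure.StorageClient;

// Partitioning by boot camp event keeps each event's questions together, so the common
// query ("all questions for this event") is served from a single partition.
public class QuestionEntity : TableServiceEntity
{
  // Required by the table client when materializing query results.
  public QuestionEntity() { }

  public QuestionEntity(string eventName, string questionId)
    : base(eventName, questionId) { }

  public string Text { get; set; }
  public int VoteCount { get; set; }
}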

What is your migration story?

Have you done a similar migration to the Windows Azure platform? I would love to hear what the decision points were for your migration as well as your lessons learned. Feel free to comment on these posts or contact me directly with your feedback.

There are several Boot Camps that will be available in this area. Do you need some justification for yourself or your boss? Check out this feature list:

  • They are focused on the latest technology, so you can work on updating your skills.
  • They are hands-on. Bring your laptop with you to these events, work with your own environment, and keep the applications you were working on for reference in the future. I tend to remember new information much better with my hands on the keyboard than with my eyes forward to a presenter.
  • The cost cannot be beat. These events are free to get into, and they are local so travel expenses are minimal.

Azure Boot Camp

Windows Azure Boot Camp is a two-day deep dive class to get you up to speed on developing for Windows Azure. The class includes a trainer with deep real-world experience with Azure, as well as a series of labs so you can practice what you just learned. You can obtain a free 30-day pass for Azure at these events. Don’t forget, if you are an MSDN subscriber, your subscription comes with Azure benefits for 8 months!

Windows Dev Boot Camp

For the WIN: Windows Development Boot Camp is a one-day deep dive class on client development. The event covers developing for Windows 7, Internet Explorer 9, and Silverlight 4 out of browser. The class includes a trainer with deep real-world experience presenting the content, as well as a series of labs so you can write some code and practice what you just learned. Web and cloud may be all the rage, but many developers are still doing hardcore client development. If you are among them, this Boot Camp is for you!

Get Connected – Stay Informed

This is just a taste of what the development community has to offer in this area. This year’s regional conference season is just getting started, so stay tuned in to your local user group for more information on upcoming events and regularly scheduled presentations packed with great technical content.
