Dennis Burton's Develop Using .NET

Change is optional. Survival is not required.
Tags: Azure

This is one of those cases where demo code and samples diverge from the code you use in a real application. Almost every example you see for loading configuration settings, such as connection strings, uses the following code:

RoleEnvironment.GetConfigurationSettingValue(key)

This code pulls the value from the service configuration file associated with the given key. That would be sufficient if we had the same tooling for the service configuration that we have for web.config files. For web.config files, we have build-time transform files that can apply different settings for different environments; connection strings are a common case. In the real world, we set up the connection strings for the production, staging, and development environments once and use the config file generated for the current environment.
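For reference, a web.config transform for a connection string looks something like the following sketch (the connection string name and value here are placeholders, not from the sample application):

```xml
<!-- Web.Release.config: merged into web.config at build/publish time.
     The name "ConnString" and the server names are illustrative. -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Replace the connectionString attribute on the entry whose name matches -->
    <add name="ConnString"
         connectionString="Data Source=ProductionServer;Initial Catalog=MyDb;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>
```

No equivalent transform runs against the service configuration, which is the gap the helper class below works around.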

Adapting config in Azure

Until similar transform tooling is available for the service configuration, this technique and helper class can reduce the amount of work you have to perform on your service configuration when changing environments during the development cycle. First, add an entry to your service configuration to indicate your current environment; I use the values Development, Staging, and Production for this setting. Then, for each entry that has a different value between environments, create a setting with the original setting name prefixed with the environment name. The image below shows how this would look for the setting ConnString.

EnvironmentBasedConfigSettings
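In the service configuration itself, the result looks roughly like this fragment (a sketch: the role name and connection string values are placeholders):

```xml
<!-- ServiceConfiguration.cscfg (fragment) -->
<Role name="WebRole1">
  <ConfigurationSettings>
    <!-- Only this value changes when switching environments -->
    <Setting name="CurrentEnvironment" value="Staging" />
    <!-- One environment-prefixed copy of the setting per environment -->
    <Setting name="DevelopmentConnString" value="Data Source=.\SQLEXPRESS;Initial Catalog=AzureOverflow;Integrated Security=True" />
    <Setting name="StagingConnString" value="Server=tcp:staging.database.windows.net;Database=AzureOverflow;User ID=user@staging;Password=..." />
    <Setting name="ProductionConnString" value="Server=tcp:prod.database.windows.net;Database=AzureOverflow;User ID=user@prod;Password=..." />
  </ConfigurationSettings>
</Role>
```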

The AzureConfigurationSettings class (shown below) uses the CurrentEnvironment setting to determine which of the ConnString values should be used for the current instance. Using this utility, all of the appropriate connection strings may be kept in the service configuration, and only one setting needs to change when switching environments. To consume this in your code, use AzureConfigurationSettings instead of RoleEnvironment when reading configuration settings.

AzureConfigurationSettings.GetConfigurationSettingValue(key);

It is worth noting that Alex Lambert has found a way to run the configuration transforms on service configuration files; he documents the process in a blog post. A side effect of his approach is that you lose some of the visual tooling for editing configuration, but it leaves much less noise in the configuration file than the approach in this post. Until support for configuration transforms is added to Visual Studio, you will have to pick your pain points.

How it works

The AzureConfigurationSettings class first attempts to read the configuration setting without the environment decoration. This ensures that the code you have today functions the same. If no value is found without decoration, the code attempts to read the configuration setting prefixed with the string found in the CurrentEnvironment setting. I also handle the RoleEnvironment.Changing event so that if a new configuration is loaded through the portal with the CurrentEnvironment setting changed, the cached value is invalidated.

The Source

The code you see below may also be found on bitbucket with the rest of the solution.

using System.Linq;
using Microsoft.WindowsAzure.ServiceRuntime;

public class AzureConfigurationSettings
{
  private const string CurrentEnvironmentKey = "CurrentEnvironment";
  private readonly static object lockObj = new object();

  static AzureConfigurationSettings()
  {
    RoleEnvironment.Changing += (sender, e) => {
        var hasCurrentEnvironmentChanged = e.Changes
          .OfType<RoleEnvironmentConfigurationSettingChange>()
          .Any(change => change.ConfigurationSettingName == CurrentEnvironmentKey);

        if (hasCurrentEnvironmentChanged)
        {
          lock (lockObj)
          {
            isCurrentEnvironmentLoaded = false;
          }
        }
      };
  }

  public static string GetConfigurationSettingValue(string key)
  {
    string value;
    if (TryGetValue(key, out value))
      return value;

    return null;
  }

  private static bool isCurrentEnvironmentLoaded = false;
  private static string currentEnvironment;
  public static string CurrentEnvironment
  {
    get
    {
      if (!isCurrentEnvironmentLoaded)
      {
        lock (lockObj)
        {
          if (!isCurrentEnvironmentLoaded)
          {
            try
            {
              currentEnvironment = RoleEnvironment
                .GetConfigurationSettingValue(CurrentEnvironmentKey);
            }
            catch (RoleEnvironmentException)
            {}
            isCurrentEnvironmentLoaded = true;
          }
        }
      }
      return currentEnvironment;
    }
  }

  private static bool TryGetValue(string key, out string value)
  {
    value = null;
    try
    {
      value = RoleEnvironment.GetConfigurationSettingValue(key);
    }
    catch (RoleEnvironmentException)
    {
      try
      {
        if(!string.IsNullOrEmpty(CurrentEnvironment))
          value = RoleEnvironment.GetConfigurationSettingValue(CurrentEnvironment + key);
      }
      catch (RoleEnvironmentException)
      {
        return false;
      }
    }

    return true;
  }
}
Tags: Azure

In the last post I talked about sequence I would take for this migration. The first step in this sequence will be to migrate the database from an on-premises SQL Server to SQL Azure.

Creating the on-premises database

The purpose of this database is to support an application where logged in users may post questions to the instructors at the Windows Azure Boot Camps being hosted around the country. After a question has been posted the attendees can vote to indicate the general interest level in the questions. At the end of the event, a quick run through the list can be used to uncover any unanswered questions.

The database was created with all of the default options via the SQL Server Management Studio with the name of AzureOverflow. The relatively simple needs for this database are a table for questions, a table for votes, and support for user management. The scripts to create the tables and relationships that I added to the database can be found here. The ASP.NET Membership provider was chosen to provide the user management support needed for this application. The command line used to create the membership objects was:

aspnet_regsql -A mr -d AzureOverflow -E

Migration Analysis

The primary tool we will use to perform this migration is the SQL Azure Migration Wizard. As of this writing, the tool supports the following migration paths:

  • SQL Server to SQL Azure
  • SQL Azure to SQL Server
  • SQL Azure to SQL Azure

What I will focus on here are the analysis features of this tool. The analysis will give you some warning about areas where issues may arise when performing the migration, and it should be your first stop before you actually try to move your data to SQL Azure. To perform the analysis, start up the Migration Wizard and select SQL Database under the Analyze Only section.

SQLAzureMigrationWizardAnalyzeOnly

After this you will be prompted for the necessary information to connect to your SQL Server. Analysis results from the AzureOverflow database indicate that there are a few issues with some of the items created by the Membership Provider, but the items required to support questions and voting were fine.

SQLAzureMWWithoutScriptingOptionsSet

Migrating Membership Data

As it turns out, using the ASP.NET Membership Provider on SQL Azure is a solved problem. An updated tool for creating the Membership database on SQL Azure can be found in KB2006191. The command line will be similar to the aspnet_regsql tool with a few modifications. Since Windows Authentication is not currently supported within SQL Azure, we will need to provide a user name and password for SQL Authentication. A server name will also be required since the tool will not be running against a local database.

aspnet_regsqlazure -s mydatabase.database.windows.net -d AzureOverflow -u username@mydatabase -p P@ssw0rd -a mr

After running this command, all of the required elements for the Membership Provider will be created in our SQL Azure database. That, however, is not the end of the story with regard to the Membership Provider. No real business is going to be willing to throw away the user data already stored in its on-premises database, so next we need to migrate the data. The most efficient way to copy a mass of data from one database to another is BCP (note that the Migration Wizard uses BCP as well). Sure, you could use the Generate Scripts feature within Management Studio, but the result would be an insert statement for every row in the tables you want to migrate; for a database of any reasonable size, that is not a viable option. We will perform the BCP operation in two steps. First, export the data from each table into a file:

bcp dbo.MyTableName out MyTableName.dat -n -S MySqlServerName -T

Next, we need to import the data from the file created above into the SQL Azure database. Again, note that we will need to provide the server name as well as user name and password for SQL Server authentication.

bcp dbo.MyTableName in MyTableName.dat -n -S myserver.database.windows.net -U username@myserver -P P@ssw0rd -E

A command file which performs this export and import on all of the Membership Provider tables used by this database can be found here.
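That command file looks roughly like the following sketch (not the actual file from the solution: the server names and credentials are placeholders, and the table list assumes the standard aspnet_* tables, with parent tables imported before child tables so foreign key constraints are satisfied):

```
:: MigrateMembershipData.cmd (illustrative sketch)
:: Export each Membership Provider table from the local server,
:: then import it into the SQL Azure database.
for %%T in (aspnet_Applications aspnet_Users aspnet_Roles aspnet_Membership aspnet_UsersInRoles) do (
  bcp dbo.%%T out %%T.dat -n -S MySqlServerName -T
  bcp dbo.%%T in %%T.dat -n -S myserver.database.windows.net -U username@myserver -P P@ssw0rd -E
)
```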

Migration of the Application Data

The portion of the migration that relates to questions and voting is, as it should be, completely uninteresting. The SQL Azure Migration Wizard handles the tables and data required to support questions and votes without issue. When the option comes up to Choose Objects, I select all of the tables that were not already migrated with the Membership Provider in the steps above.

SQLAzureMigrationChooseObjects

After connecting the wizard to our SQL Azure database, the tables will be created and the application data migrated without incident. In most cases, the Migration Wizard will be the only tool you need. Be sure to perform the analysis first to determine if this is true for your data. For more information on using the migration wizard check out this post by Rich Dudley. It is an excellent example that walks through using the wizard from start to finish.

Configuration Changes

Since SQL Azure is simply a TDS endpoint, the only change that needs to occur inside our web application is the connection string to the database. There are a couple of differences in connection strings for SQL Azure databases. First, the connection is to a database and not to a server, so the Initial Catalog parameter of the connection string is required. Also, as already mentioned, the current version of SQL Azure does not support trusted connections, so you will need to provide a user name and password in the connection string, and the user name is required to be in the format user@server. I have heard rumblings that this format will not be required in the near future, but nothing confirmed. The result is a connection string entry that looks like:

<add name="ApplicationServices" connectionString="Data Source=DATABASENAME.database.windows.net;Initial Catalog=AzureOverflow;User Id=USERNAME@DATABASENAME;Password=PASSWORD;" />

Testing the Application

After migrating the user data, application data, and updating the application configuration, our web application is now using a SQL Azure database rather than the on-premises SQL Server. Make sure you run through your usual battery of smoke tests to validate major pieces of functionality at this point.

Use the Force

If you have a database that the Migration Wizard simply will not move successfully, be sure to check out Roger Doherty’s blog post on Brute Force Migration of Existing SQL Server Databases to SQL Azure. For this method to work, you will need to change the default scripting option to script for a SQL Azure database before you generate your DDL scripts.

SQLScriptingOptions

There is very little that can go wrong using the brute force approach listed in the article, but it is certainly a fair bit more effort than the Migration Wizard. So if at all possible, stick with the Migration Wizard for a simple and painless migration. As with many things cloud related, a hybrid approach may be your best option. You can see that throughout this post: the Migration Wizard for the application data, the updated aspnet_regsqlazure tool for the DDL operations, and BCP to migrate the user data. Carefully consider your cloud migration tasks; there are usually several ways to perform each one. Weigh the cost of these options in terms of your time, the time required to perform the task, and the cost of the cloud resources you will use during the migration. All of these costs will influence your choice of migration techniques.

Tags: Azure

There is no shortage of great material out there to help you build your next application on the Windows Azure platform. What I will do in this series is show the process of migrating an on-premises web application to Windows Azure. What you will see over the next few posts is an application that I built to help prioritize questions during the Azure Boot Camp events. The application is built using the tools and patterns that I use today to build web applications. You will see appearances from MVC2, Fluent NHibernate, MVCContrib, and the Windsor Container.

Many of these tools were added to the project with minimal effort using NuGet. One of the features that saved time on this project was the ability of NuGet to add assembly binding redirects for projects that have dependencies on multiple versions of a library. The Add-BindingRedirects call from the Package Manager Console took care of this problem for the Castle.Core assembly which had bindings to 1.x and 2.x versions of this assembly.

Patterns

I leveraged the repository pattern, with public methods only for the data operations required to build the application. I did not add code to support CRUD operations on every element of the model when there was no requirement for that functionality. As the migration progresses and the data access code is replaced, this repository tells us exactly which operations need to be supported.

You will also see the use of the view model concept. The controller fetches the required model data and performs the aggregation necessary to present exactly the information the view needs. This approach allows for much thinner view code, so the HTML makes up a higher percentage of the page. Check out the Spark view engine or the upcoming Razor view engine in MVC3 for even cleaner views.

Planning the Migration

Just as with any application, the migration will not take place in one step. When performing a migration from on-premises to the cloud, it is even more important to consider where you will get value in migrating first as well as reducing the cost of the intermediate steps intended only to assist in the process of migration. This is the order that I chose to do the migration and my justification for that order.

Step 1: Migration to SQL Azure

Infrastructure - It is easier to point your on-premises app at a SQL Azure instance by changing configuration settings than it is to open up your database server, which is likely sitting behind a firewall, invisible to the outside world. There is a non-trivial amount of infrastructure work necessary to make that happen. Since we know that our next step will be migrating the web hosting to Windows Azure, I chose not to take on work that would be thrown away once the web instances are relocated.

Cost – Cloud pricing is a very dynamic topic and one that changes for each scenario. I can only speak to the scenarios I have worked with in the past; your scenario may be different, so do your own math. In the larger web applications I have worked on, the database hardware was the most expensive hardware purchased, and the licensing cost for SQL Server in an Active/Passive configuration is non-trivial. With SQL Azure, the licensing and the failover support are built into the cost. If you are looking at version upgrades in the near future, this kind of migration might just be a good deal for you.

Step 2: Migration of the Web Servers

Missing Pieces – This application does not have an app server doing a lot of data processing. If it did, I would choose to migrate those processes first. Processes that do the heavy lifting with data need to sit as physically close to the data as possible in order to have reasonable performance. Another important consideration is that this type of code usually has the fewest dependencies, which reduces the verification workload once this step of the migration is complete.

Infrastructure – Once this step is completed, the application will run entirely on Windows Azure allowing us to decommission or repurpose the on-premises hardware. If you are already building applications to scale well with your own hardware, it is likely that there will be few things to be modified to run in a web role.

Step 3: Migration to Cheaper Storage

Where it makes sense, we will look at migrating data to Table Storage, where the charges (as of this writing) are $0.15 per GB/month vs. $9.99 per GB/month on SQL Azure. Planning for this stage requires us to look at our data consumption patterns. While SQL Azure is quite a bit more expensive for storing our bits, it has no per-transaction charges, whereas Table Storage charges for each request that leaves the data center. Since we have already migrated the web site in the previous step, we will make sure that our data stays inside the data center to avoid those charges. Be aware that changing the storage mechanism is a non-trivial change, involving restructuring our data as well as modifying our repository layer to talk to a different style of data storage. Designing for Table Storage requires a different mindset than designing for a relational database.

What is your migration story?

Have you done a similar migration to the Windows Azure platform? I would love to hear what the decision points were for your migration as well as your lessons learned. Feel free to comment on these posts or contact me directly with your feedback.

There are several Boot Camps that will be available in this area. Do you need some justification for yourself or your boss? Check out this feature list:

  • They are focused on the latest technology, so you can work on updating your skills.
  • They are hands-on. Bring your laptop with you to these events, work with your own environment, and keep the applications you were working on for reference in the future. I tend to remember new information much better with my hands on the keyboard than with my eyes forward to a presenter.
  • The cost cannot be beat. These events are free to get into, and they are local so travel expenses are minimal.

Azure Boot Camp

Windows Azure Boot Camp is a two day deep dive class to get you up to speed on developing for Windows Azure. The class includes a trainer with deep real world experience with Azure, as well as a series of labs so you can practice what you just learned. You can obtain a free 30-day pass for Azure at these events. Don’t forget, if you are a MSDN subscriber, your subscription comes with Azure benefits for 8 months!

Windows Dev Boot Camp

For the WIN: Windows Development Boot Camp is a one-day deep dive class on client development. The event covers developing for Windows 7, Internet Explorer 9, and Silverlight 4 out of browser. The class includes a trainer with deep real world experience presenting content, as well as a series of labs so you can write some code and practice what you just learned. Web and cloud may be all the rage, but many developers are still doing hardcore client development. If you are among them, this Boot Camp is for you!

Get Connected – Stay Informed

This is just a taste of what the development community has to offer in this area. This year’s regional conference season is just getting started, so stay tuned in to your local user group for more information on upcoming events and regularly scheduled presentations packed with great technical content.

How many times have you attended, or given, a presentation where the speaker threw out the phrase: "The UI looks horrible, I am not a front end person." If you don't want to be that guy, this book is a good start. Web Design for Developers by Brian P. Hogan helps anyone who has had to utter that phrase understand some of the fundamentals of design. The book has an easy-to-read quality that allowed even a slow reader like me to finish it on a flight from Detroit to Seattle.

Part 1 – The Basics of Design

Developers often perceive the design side of things as an artistic rather than tactical endeavor. This section shows that there are rules and principles to design that, if followed, lead to decent looking results. Exposure to this material alone would make most of the prototypes and presentations I have seen more effective. The initial chapter covers the purpose of the site that will be designed over the course of the book, presenting compelling arguments for pencil-and-paper sketching along the way. If you have ever been in one of those early design reviews where the client is focused on the color of the buttons instead of the flow of the application, you know exactly where the author is coming from.

That brings us to the topic of color. There is so much great information here, starting with the fundamentals of the color wheel as well as the differences in choosing color for web vs. print media. One thing I learned from this chapter was the concept of taking a picture of something in nature and using it as the basis for a color scheme. I now find myself looking at colors that occur naturally together and saving them for future use.

Typography is covered next. This is another area where the author calls out older practices and idioms that were based on the print medium and change a bit for the web. The mind-blowing topic for me in this chapter was establishing a layout grid based on the dimensions of the fonts chosen for the site. This is one of those things that seemed so obvious once the author explained it that I wondered why it had never occurred to me. It is a great idea that I plan on incorporating in future designs.

Part 2 – Adding Graphics

This section contained a bunch of good advice on choosing graphics as well as some of the mechanics of using Photoshop. I have to admit that I don’t use Photoshop simply because my talent level is covered by Expression, which comes with my MSDN subscription, and Paint.net which has a lot of community support. The concepts covered here applied directly to the tools that I use regularly and taught me to use them better. The section on layers and building graphics took another topic I knew and gave me new skills to work with. Layering is another one of those fundamental concepts that exponentially expands what you can do with a graphics tool.

Part 3 – Building the Site

This section covers what has become very popular material over the last couple of years. The importance of separating semantic HTML from the visual aspects in the CSS is explained well here. The concept of separation of concerns is not new to developers, but compromises do seem to come quickly when a developer is building HTML and CSS; I think this has more to do with exposure to good practices than anything else. The approach covered here leads to sites that are much easier to extend, and much easier to enhance with features such as dynamic content without postbacks. If you are a classic ASP.NET developer used to drag-and-drop design, please read this chapter; considerable flexibility has been added in ASP.NET 4, and this chapter will help you understand why that matters. Since reading this book, this is the section I find myself referencing most often.

Part 4 – Preparing for Launch

No book that covers web design would be complete without covering the 800-pound gorilla: the very important topic of dealing with IE in its various releases gets its own chapter. If you have done any web development at all, this is no shock to you. If you do not know the tricks and traps of dealing with IE, it will consume a good portion of your time during development. This chapter is worth knowing not only for its contents, but also for the references it mentions.

Accessibility is another topic that is popular in design/web development circles, but rarely discussed in the world of the developer. I was first exposed to it by my good friend's book, Testing ASP.NET Web Applications. If you have not been exposed to this topic, you will be amazed at the impact of your design choices on those with disabilities. I have yet to code for 508 compliance as a requirement, but these two books would be right by my side if I did.

Conclusions

Web Design for Developers by Brian P. Hogan is an exceptionally well thought out and timely book. With so many developers heading toward MVC-based web authoring, well-constructed HTML and CSS are at a premium. The tactical side of this book covers a lot of important ground, but more importantly, the material in Part 1 on theory is some of the best I have seen. Make no mistake; you will not change professions from back-end developer to designer based on this book alone. But you will have enough of a solid basis to create things that look professional. No longer will your UI skills be the focal point of your applications, demos, and prototypes; understanding the basics covered here will let your core competencies take center stage. If you can't tell by now, I highly recommend this book!
