Dennis Burton's Develop Using .NET

Change is optional. Survival is not required.
Tags: screencast | tdd | testing | tools

Test Driven Development has been in full force for quite a few years now. It has led us to volumes of tests that ensure we are building the system right, and it has proven to be a valuable part of the development process. What is missing from the focus on TDD, however, is that test code is not something you can sit down and discuss with a non-technical client. Test code that clients can read (and potentially even write) facilitates communication on a whole new level. Ultimately, that leads to building the right system.

Our friends in the Ruby community have been enjoying the benefits of a tool called Cucumber, which allows for the creation of specifications in the Gherkin language. Gherkin is a human-readable form that can be translated into automated tests, creating a set of executable specifications that a client can read and understand.

SpecFlow is a Gherkin-based specification engine that runs on .NET and integrates with Visual Studio. In this screencast I show you how to use SpecFlow to create specifications and how to leverage Telerik’s WebAii Test Automation Framework to drive the browser through code.

Download (52.6 MB) (27:52) (1440x900)

Update: Code used in this screencast is available on bitbucket

This screencast was recorded and edited using Camtasia.

Tags: community | patterns | tdd | testing

I will be presenting items from the PatternsInTesting series as well as some additional content in the Test Driven is Driving me Insane talk at the Great Lakes .net User Group on 3/18/2009 and at the Northwest Ohio .net User Group on 4/21/2009. This has been a really fun talk so far and I have enjoyed the conversation it generates. Stop by if you can make it.

Tags: community | patterns | tdd | testing

I will be presenting items from the PatternsInTesting series at the Greater Lansing .net User Group Flint meeting. I was compelled to put this blog series and presentation together to address the pain many organizations experience when trying to incorporate automated testing into their development process. The content is based on insight and lessons learned from experiencing the same transition in multiple organizations. Participants will walk away with tools for writing more effective tests and techniques for identifying issues in existing tests.

Tags: tdd

In my last post, I mentioned the new ValuesAttribute that can be used as a test factory to generate a series of tests with many permutations of parameters. Looking into that feature led me to look at the feature set in NUnit 2.5 which is currently in Alpha 4. Some of these features address scenarios that I have run into in my test code. I wanted to mention them here so others could start benefiting as well.

The source of the new feature set that I am pulling from is the current release notes.

Movin' on up

One of the items that I think is most important is not a new feature, just a change in location. The change in mindset is what stands out as important to me. The Is, Has, Text, and List constraints have been moved into the NUnit.Framework namespace; they were formerly off in the more obscure NUnit.Framework.SyntaxHelpers namespace. The assertions created using these constraints line up much better with BDD and have a more readable feel. I am happy to see them become part of the mainstream namespace. If you are not familiar with the constraint model, check out the code below.

// Classic model
Assert.AreEqual(expected, actual);
// Constraint model
Assert.That(actual, Is.EqualTo(expected));

I think the second reads much more like the English phrase that the constraint would represent. Check out the docs; you will find that almost everything you can do with the "classic" model can also be done in the constraint model.

Chained Setup and TearDown

Test code should be crafted with the same level of care as production-intent code. This means it should be DRY and carefully designed. Many times you will be testing a related set of classes, leading to base classes in your test code. Don't be afraid to capture common functionality in those base classes. Prior to this release, the SetUp and TearDown attributes could be applied in your base class, but if you also applied them in your derived class, the calls were not chained; you had to remember to call the base class version from the derived class. In 2.5, the framework calls the base SetUp method prior to the derived SetUp method. One more win for automation over developer discipline.
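To see what that chaining buys you, here is a rough sketch of the base-first behavior, using reflection and a stand-in SetUpAttribute of my own rather than NUnit's actual implementation:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

// Stand-in for NUnit's SetUpAttribute; illustration only.
[AttributeUsage(AttributeTargets.Method)]
class SetUpAttribute : Attribute { }

class BaseFixture
{
    public static List<string> Log = new List<string>();
    [SetUp] public void BaseSetup() { Log.Add("base"); }
}

class DerivedFixture : BaseFixture
{
    [SetUp] public void DerivedSetup() { Log.Add("derived"); }
}

static class Runner
{
    // Walk the inheritance chain root-first, invoking [SetUp] methods
    // declared on each class -- the chaining NUnit 2.5 now automates.
    public static void RunSetUps(object fixture)
    {
        var chain = new List<Type>();
        for (var t = fixture.GetType(); t != typeof(object); t = t.BaseType)
            chain.Insert(0, t); // prepend so base classes run first

        foreach (var type in chain)
            foreach (var method in type.GetMethods(
                BindingFlags.Instance | BindingFlags.Public | BindingFlags.DeclaredOnly))
                if (method.IsDefined(typeof(SetUpAttribute), false))
                    method.Invoke(fixture, null);
    }

    static void Main()
    {
        RunSetUps(new DerivedFixture());
        Console.WriteLine(string.Join(" -> ", BaseFixture.Log)); // base -> derived
    }
}
```

Running the sketch against DerivedFixture logs the base SetUp before the derived one, which is exactly the order you previously had to enforce by hand.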

New constraints

Chris Marinos recently blogged about disliking the [ExpectedException] style of tests and wrote a helper class to support the syntax of Throws.Exception. As of the 2.5 release, Throws.Exception is included. This allows for much more focused testing. Putting the ExpectedExceptionAttribute on a method simply says that somewhere in the method the exception will get thrown. The exception may well be thrown two lines prior to the call you were intending to test, yielding a false positive test result. Throws.Exception (and its related Throws variants) takes a delegate, so you can check a specific call for the presence of an exception.

[Test]
[ExpectedException]
public void PassesButShouldnt()
{
    int importantPreWork = int.Parse("abcd");
    DoStuff();
}

[Test]
public void FailsProperly()
{
    int importantPreWork = int.Parse("abcd"); 
    Assert.Throws<Exception>( DoStuff );
}
public void DoStuff()
{ throw new Exception("Something is terribly wrong"); }

In the example, the PassesButShouldnt test passes because the expected exception is thrown on the Parse call. Unfortunately, the test was intended to ensure that DoStuff threw an exception. The FailsProperly test accurately checks that only the DoStuff call throws an exception. In addition, it reads better, as it places the constraint on the same line as the call, just like any other Assert method.

Attribute testing

With the Has.Attribute constraint, you can verify an object (yes, object not class) is decorated with an attribute. A common example of where this could be used is validating that a class has been marked with the SerializableAttribute.

[Serializable]
public class testclass {}

[Test]
public void Test()
{ Assert.That(new testclass(), Has.Attribute<SerializableAttribute>()); }

I have to admit, when I first saw the Has.Attribute constraint, I thought for sure one of my test base classes was going to get a bit lighter. One of the applications I work on has a pluggable architecture and uses attributes to determine what should be exposed to the end user. I had to code up a method that used reflection to determine whether the items that were supposed to be exposed did in fact carry the attribute that exposed them. I was somewhat disappointed to see that the constraint only seems to work with instances of objects, not Types or MemberInfos. It did get me thinking, however, that putting this in a base class may not have been as elegant as learning to extend NUnit; perhaps that will be a post in the future. I should call out that this appeared in the NUnit blog as a feature added in Alpha 1, yet it is the only feature listed here that has not made it into the documentation. That could mean it just has not been documented yet, or it could indicate that it is not yet ready for prime time.
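For the plug-in scenario above, plain reflection can still check a Type (or MemberInfo) for an attribute directly, no instance required. A minimal sketch, with a hypothetical ExposedToUserAttribute standing in for the application's real marker attribute:

```csharp
using System;

// Hypothetical marker attribute for the plug-in scenario; not a real API.
[AttributeUsage(AttributeTargets.Class | AttributeTags.Method)]
class ExposedToUserAttribute : Attribute { }

[ExposedToUser]
class ReportPlugin { }

class HiddenPlugin { }

static class AttributeDemo
{
    // Attribute.IsDefined works on Types and MemberInfos alike.
    public static bool IsExposed(Type t)
        => Attribute.IsDefined(t, typeof(ExposedToUserAttribute));

    static void Main()
    {
        Console.WriteLine(IsExposed(typeof(ReportPlugin)));  // True
        Console.WriteLine(IsExposed(typeof(HiddenPlugin)));  // False
    }
}
```

Wrapping a check like this in a custom NUnit constraint, rather than a base-class helper, is the extension path hinted at above.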

Range Testing

Is.InRange provides a more concise syntax for assertions that previously required both a GreaterThanOrEqualTo and a LessThanOrEqualTo constraint to validate that an item is within bounds.

[Test]
public void RangeTesting()
{
    // old way
    Assert.That(3, Is.GreaterThanOrEqualTo(1) & Is.LessThanOrEqualTo(5));
    // new way
    Assert.That(3, Is.InRange(1, 5));

    // NOTE: InRange is inclusive
    Assert.That(5, Is.InRange(1, 5));
    Assert.That(1, Is.InRange(1, 5));
}

New Assertions

In some tests you can determine early on that the expected behavior has been met. Rather than throwing in a return statement or creating triangular code, the new Assert.Pass method is available for early termination with a passing result. There is also a new inconclusive result state that can be set by calling Assert.Inconclusive. I have no idea how this can be used; I cannot think of any test scenario where I was really looking for "Assert...oh...I don't know." It seems there is a failure in the system at that point, even if the failure is in not understanding the requirements. I would take this as an indicator to go clarify requirements with the client and write a better test.

Specialized Assertions

CollectionAssert has picked up a set of IsOrdered constraints. These use the IComparable interface to verify the increasing or decreasing order of all of the items in a collection. This is another feature I had implemented in a test utilities class; I am happy to remove those lines of code.
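The utility-class version being retired amounts to a few lines of IComparable; a sketch (my own helper, not NUnit's implementation) looks something like:

```csharp
using System;
using System.Collections.Generic;

static class OrderingCheck
{
    // True when each item compares >= its predecessor via IComparable,
    // i.e. the collection is in ascending order.
    public static bool IsOrderedAscending<T>(IList<T> items) where T : IComparable<T>
    {
        for (int i = 1; i < items.Count; i++)
            if (items[i - 1].CompareTo(items[i]) > 0)
                return false;
        return true;
    }

    static void Main()
    {
        Console.WriteLine(IsOrderedAscending(new[] { 1, 2, 3 })); // True
        Console.WriteLine(IsOrderedAscending(new[] { 3, 1, 2 })); // False
    }
}
```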

Previous versions of NUnit had a FileAssert class to verify that your application was generating an output file. DirectoryAssert has been added as a complement to FileAssert. The DirectoryAssert.IsWithin method (or IsNotWithin) will crawl from the directory specified, including all of its subdirectories, to verify that the expected directory is present. You can also use IsEmpty to verify whether a file was output. You are, of course, still responsible for determining that the correct output was written.
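The crawl that IsWithin performs can be approximated with Directory.EnumerateDirectories; a sketch using a throwaway temp directory (the "output/reports" layout is made up for the example):

```csharp
using System;
using System.IO;
using System.Linq;

static class DirectoryWithin
{
    // True when a directory with the expected name exists anywhere
    // under root, including nested subdirectories.
    public static bool IsWithin(string root, string expectedName)
        => Directory.EnumerateDirectories(root, expectedName,
               SearchOption.AllDirectories).Any();

    static void Main()
    {
        string root = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        Directory.CreateDirectory(Path.Combine(root, "output", "reports"));

        Console.WriteLine(IsWithin(root, "reports")); // True
        Console.WriteLine(IsWithin(root, "missing")); // False

        Directory.Delete(root, true); // clean up the temp tree
    }
}
```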

Generic Support

There are several type constraints in NUnit that take typeof(class) as a parameter. The following constraints have been updated to include a version that takes a generic parameter specifying the type.

Assert.IsInstanceOf<T>(object actual);
Assert.IsNotInstanceOf<T>(object actual);
Assert.IsAssignableFrom<T>(object actual);
Assert.IsNotAssignableFrom<T>(object actual);
Assert.Throws<T>(TestDelegate code);

The TestFixtureAttribute has also picked up a generic version. Use it to test a class that takes a generic type parameter: specify the attribute multiple times on a class, each time with a different type for the generic parameter, and for each type specification a new test instance is created and executed. This is useful when you are trying to exercise a class under test with a couple of different types. It does seem like the wrong level of abstraction for a test, but I am sure it is a reaction to some level of demand.

Try it out

Everyone has a different tolerance for using things that have not yet been marked as released. Early releases of NUnit have traditionally been very solid. Much like GMail still being in beta, the open source community seems to drop releases much more often but keep them flagged as alpha or beta. I have already started using these features to produce cleaner, more readable test code.

Tags: programming | tdd

RowTests are a great tool for consolidating tests that have the same logic but differ by the parameters and the expected values. I did a post a while back on using RowTest in MbUnit, and Michael Eaton recently posted on using RowTest with NUnit. Reading Eaton's post reminded me that I had just run into a new feature in the NUnit 2.5 drop. One of my favorite things about competitive open source tools is that they usually add at least one feature more than the other guy with each release. For the 2.5 release of NUnit, one of those features is the ValuesAttribute. Granted, MbUnit had this feature in concept with its CombinatorialTestAttribute, but the usage of the ValuesAttribute in NUnit seems much cleaner to me.

Tests with no [Values]

Looking at the previous posts on RowTests, you can see that passing many values into a specific test is not difficult. So why would we need anything else? In certain scenarios, the number of permutations you would have to write Row attributes for can be cumbersome and, if it is truly a permutation scenario, difficult to maintain. The test case for this post is that the test should be called with all possible combinations of 1, 2, 3 for one parameter and 10, 20, 30 for another. With this simple example, we have already blown up to nine rows; add one more variant to either parameter and we would have twelve rows to maintain.

Giving your tests [Values]

The Values attribute allows you to express all nine of these combinations in a very concise syntax. This lines up better with the expression of your requirements; instead of nine Row attributes, you would now have the following:

[Test]
public void Use_Value_Attributes([Values(1,2,3)] int param1, [Values(10,20,30)] int param2)
{
    // r is the class under test, created in the fixture's setup
    r.DoStuffWithParms(param1, param2);
}

This nicely wraps up in one line what would have been expressed with nine [Row] attributes in the past. Bring the tests up in the NUnit GUI to verify that the tests you were expecting were in fact generated.
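Under the hood, the two [Values] lists expand to a cross product. The nine generated cases can be sketched with a LINQ cross join (DoStuffWithParms stands in for the call under test):

```csharp
using System;
using System.Linq;

static class ValuesExpansion
{
    static void Main()
    {
        int[] param1 = { 1, 2, 3 };
        int[] param2 = { 10, 20, 30 };

        // Cross join: every param1 value paired with every param2 value,
        // mirroring the test cases NUnit generates from the two [Values] lists.
        var cases = (from a in param1
                     from b in param2
                     select (a, b)).ToList();

        foreach (var (a, b) in cases)
            Console.WriteLine($"DoStuffWithParms({a}, {b})");

        Console.WriteLine(cases.Count); // 9
    }
}
```

Adding a fourth value to either array grows the product to twelve cases, which is exactly the maintenance burden the attribute syntax absorbs for you.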


What do you use it for

Now that you have such a powerful tool in your toolset, start thinking about how it would be useful. The real-world case where I use this feature ensures that interaction with an external library is correct, both in the number and order of calls, when actual parameters from the system are applied. The test is purely focused on the behavior of the system, not the expectation of a mathematical result.

What wouldn't you use it for

It would be very difficult to tie an expected value to the generated permutations. So if you are going after a specific output from an algorithm, a RowTest will work much better for you. Keep in mind that the RowTest feature has been around a while, so it is likely that you will use it more often.

Tags: fundamentals | tdd | VS tricks

In the previous post, I walked through creating a web site project where C# and Visual Basic code live happily in the same project file. The call to action from that post was to ask, "What could this be used for?" If you recall, some of the distinguishing features of VB9 over C# 3 are XML literals and expanded support for LINQ query expressions, but where would I use these features? I can't think of a good reason to use them in the code-behind of a web site project (the user interface layer); a place where I would use LINQ expressions, for instance, is in a business logic layer.

The Process
When creating business logic functionality under TDD, we would first create a test to ensure the functionality does what it is supposed to. Using a web site project under this process, things start to fall apart. A web site project does not create an explicit assembly that you can reference from a unit test library project. I created a test assembly in the same solution as the MultiLanguage project; when I tried to add a reference, the project was not available either. Because we can't reference the web project assembly, we can't test it, which means there is no test coverage of the components inside App_Code. This is a deal breaker for me; I sure hope I have missed something along the way.

The Feature Request
What I really want is a class library project that acts like the website project when it comes to adding files in multiple languages. I want to be able to add a partial class implementation in the primary language of my choice. I also want to be able to add a partial class implementation that will allow me to implement methods on that class in a secondary language if that language is better suited for a particular task. The compiler that works with the website project is clearly able to handle this scenario. How about letting Visual Studio do this for me in class libraries?

Looking Forward
As of right now, the benefits of using both VB and C# do not seem to be worth the cost of the extra steps required to build a testable assembly. C# and VB are not that far apart; there are relatively few differences that make one more effective than the other, and that effectiveness is marginal. That said, there is a lot of language work going on on top of the CLR. There are significantly different approaches to problems enabled by a functional language like F#, and a totally different set of solutions in the dynamic language world enabled by the DLR in languages like IronRuby and IronPython. I want to be able to use the best-fit language for a given problem and package the solution in a way that makes sense. I don't want to have to create a new class library and add it to the deployment process every time I want to use a feature from a different language.

If you are only building web solutions and test coverage is not important to you, then the technique described would meet that need. There is still a hole for non-web solutions: there are plenty of apps written as WinForms apps, console apps, or even just a single library. It would be handy if Visual Studio provided for this technique in environments other than web site projects.


Tags: CodeRush | community | programming | tdd

One of the things I really enjoy about the local developer groups is the sharing of ideas and usage patterns for common tools. At tonight's Ann Arbor .net Developers Group, the topic of CodeRush templates came up. I commonly use a set of templates centered around the Rhino Mocks framework, and as I was describing them, Jay Wren claimed they should be posted. The irony is that the first time I saw someone using CodeRush was in a presentation Jay was giving on IoC. As a result of that presentation, we are using Castle Windsor in our application and CodeRush as a productivity tool.

Creating a Mock Instance

To use Rhino Mocks to create a mock instance, you write something like:

MyType myType = (MyType)mocks.DynamicMock(typeof(MyType));

This is an exceptionally redundant exercise that just screams for a template. The core part of the template takes a string from the Get string provider (named Type) for the type of the instance to create. It also names the variable with the same name as the type, in a slightly different format. Even though the variable name is a linked field, you can break the link by pressing Ctrl-Enter while the cursor is on the variable name. This is commonly required when creating multiple mock instances in the same scope.

#MockInstance#: Base template not intended to be called directly

«Caret»«Link(«?Get(Type)»)»«BlockAnchor»«Link(«?Get(Type)»,FormatLocalName,PropertyNameFromLocal)»= («Link(«?Get(Type)»)»)mocks.DynamicMock(typeof(«Link(«?Get(Type)»)»));

mk: the type name is initialized to MyType

«?Set(Type,MyType)»«:#MockInstance#»

mk\: the type name is initialized from the Clipboard

«?Set(Type,«?Paste»)»«:#MockInstance#»

Setting up the Tests

The mock instance templates are not much use without having the MockRepository set up and test methods to call. These templates come in handy as well.

mtf: the test fixture

[TestFixture]
public class «Caret»My«BlockAnchor»Test
{
 private MockRepository mocks;

 [SetUp]
 public void Setup()
 {
  mocks = new MockRepository();
 }
 
 «Marker»
}


mt: the test

[Test]
public void «Caret»Test«BlockAnchor»()
{
 «Marker»
 mocks.ReplayAll();
}

Hopefully these templates will prove useful and maybe even kick off some ideas for new ones.

UPDATE: Yea, it would be easier if I added the export file.

CSharp_Custom.xml (12.45 KB)
Tags: tdd

While working on the tests for the next post, I was reminded of how cool MbUnit can be. Most tests have the signature public void TestMethod(). In some cases, this causes a proliferation of tests in order to run essentially the same test with different data. MbUnit adds the RowTest attribute to the mix. With the RowTest and Row attributes, you can create one parameterized test "template," and the test will run for each data item provided. Running the following code will execute three different tests, one for each of the Row parameters provided.

[RowTest]
[Row(2,8)]
[Row(5,5)]
[Row(8,2)]
public void GreaterValueTest(int filterValue, int expectedCount)
{
  ClassicCollectionBase results = col.FilteredCollection(new GreaterValueFilter(filterValue));
  Assert.AreEqual(expectedCount, results.Count);
}

This cool feature can help reduce the maintenance time associated with test code.

What about NUnit?

Building on the new extensibility features of NUnit 2.4, Andreas Schlapsi has written an NUnit extension that allows for RowTests. I would expect this exceptional feature to get rolled into the core in an upcoming release.
