Dennis Burton's Develop Using .NET

Change is optional. Survival is not required.

What do you do when the limitations of SQL Management Studio prevent it from showing all of your results?

Our application works with an XML file defined by an external industry specification. We found that a significant portion of this file is not required by our application. So, when loading this file into our system, we remove the portions that we do not care about and stuff the rest into a database. What is left can still be a fairly large XML string; in typical usage the reduced file is still around 300-500 KB.

Recently, we needed to investigate the contents of one of these modified files as well as the history of what was loaded. Following the standard developer playbook, the first response is to quickly fire up SQL Management Studio and craft a query to get the ntext column. At that point, Management Studio's 65K display limit on a column yields only about a quarter of the file.

It would be pretty easy to pull up Visual Studio and put together a quick utility to extract the XML into a file. As a developer, I would likely keep this project around for an excessively long period of time. After all, it is now part of my code toolbox. So, rather than squirrel away a project file, source file, and all the related baggage of a VS project, I chose PowerShell as the hammer for this nail. The C# code that would have been written in Visual Studio is, in this case, simply one language for accessing functionality in the .NET Framework. Since PowerShell has access to the .NET Framework as well, the code is very similar; only the language differs. Additionally, the PowerShell script is a single file without the weight of a VS project. The only thing I keep in my toolbox is the code required to complete the task.

What is required is the fairly common code for opening a connection, executing a query, and using the results. Here is how this common C# task is represented in PowerShell. We are pulling back several rows that contain this large XML field and tagging the results by upload date.

param( $theId, $server, $db )

$connection = New-Object System.Data.SqlClient.SqlConnection("data source=$server;initial catalog=$db;Integrated Security=SSPI")
$command = New-Object System.Data.SqlClient.SqlCommand("select * from MyTable where theItemId=$theId", $connection)

$connection.Open()
$reader = $command.ExecuteReader()
while( $reader.Read() ) {
    $fileName = "Item_{0}_DateLoaded_{1:yyyy_MM_dd_HH_mm_ss}.xml" -f $theId,$reader['dStart']
    Out-File -FilePath $fileName -InputObject $reader['reallyBigXMLText']
}
$reader.Close()
$connection.Close()


Save that text into a .ps1 file and that is all that is necessary to extract the large text field. I am not someone who has completely cut over from the classic command line to PowerShell, but it is important to know when this tool can be helpful.
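Invoking the script is a one-liner from a PowerShell prompt. This is a hypothetical example; the script name, server, database, and id values are all placeholders for your own environment:

```
# ExtractXml.ps1 is whatever name you saved the script under
.\ExtractXml.ps1 -theId 42 -server "MYSERVER\SQLEXPRESS" -db "MyDatabase"
```

Each matching row lands in its own Item_42_DateLoaded_*.xml file in the current directory.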


The next topic in this series was supposed to be invariants. While that topic is still coming up, the coverage of nulls was not yet complete. In the last post, coverage was mostly related to parameters and method signatures. Since parameters are not the only variables in a system that can be null, it makes sense to talk a little about class fields. Consider the following class member from the Sentence class discussed in Part 1.

List<string> words = new List<string>();

Several methods and properties in the Sentence class perform operations on this variable by dereferencing it and calling its methods and properties. Spec# notices this and warns of a potential null dereference. At first this seemed like a bit of an inconvenience. But if you think about it for a second, this warning covers every location where "object reference not set to an instance of an object" can occur. Wow, removing that error from the development cycle would no doubt be a huge boost to productivity. A simple change to the declaration eliminates the warning.

List<string>! words = new List<string>();

So what if you are not instantiating the object the same way for every instance creation? Try removing the new from the declaration, and a whole different error shows up. This time the constructors all get flagged: Spec# notices that fields marked as non-null have not been assigned in the constructor. Assign the field with new in the constructor, and the validation engine is happy again.
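Putting those two pieces together, a minimal sketch of the Sentence class is shown below. The member names are assumed from Part 1; only the non-null mechanics are the point here:

```
class Sentence {
    // The ! marks the field as non-null; Spec# now requires that
    // every constructor assign it before the object is used.
    List<string>! words;

    public Sentence() {
        // Assigning here satisfies the constructor check, even though
        // the declaration itself no longer carries an initializer.
        words = new List<string>();
    }
}
```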

There is an impressive depth to the null checks that can lead to a much more solid code base. In the strongly typed world, where compile-time errors are preferred, Spec# strengthens the type checking, pushing potential errors earlier in the development cycle, to the point that some of them show up in the editor before compile time!

Tags: programming | spec# | SpecSharp

One of the key characteristics of a professional-quality application is defensive coding. While necessary, it can hide the meaningful portions of the logic. It can also lead to many tests that are tied more to the chosen implementation than to the business rules. Take this simple piece of code that adds a word to a sentence.

public void AddWord(string word)
{
     if (word == null) throw new ArgumentNullException("word");
     if (word.Length == 0) throw new ArgumentException("word");
     words.Add(word);
}

The validation code dominates this simple function. Additionally, it does not give the consumer of the method any hint about the parameter's constraints. Sure, XML docs can provide some documentation, but that still does not help at design time. A set of tests that check for the expected exceptions would also be common for this block of code. There is plenty to debate about these non-functional tests, but it is safe to assume that this kind of testing is common.

The Spec# team is doing some incredible work to make constraints like this either enforced by the language or more visible by placing them within the method signature.

The first parameter check relates to nullness. Spec# provides an operator for forcing non-null. The signature of AddWord would now look like:

public void AddWord(string! word)

This tells the compiler and Visual Studio that this parameter cannot be null. If you try to pass a string that could be null to this method, a warning appears at design time, and a compiler error results as well. This really cranks up the visibility of the constraint, and the user of the method can see it right in the signature.
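A sketch of what this looks like at a call site (the ReadInput helper is hypothetical, standing in for any source of an ordinary, possibly-null string):

```
Sentence s = new Sentence();

string maybeNull = ReadInput();      // ordinary string: could be null
s.AddWord(maybeNull);                // Spec# flags a possible-null argument

string! definitelyNotNull = "word";  // non-null string type
s.AddWord(definitelyNotNull);        // accepted without complaint
```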

The second parameter check relates to some data constraint on the string. In this case, it is required to be a non-empty string. What is really going on here is that this method has a pre-condition relating to the data it is going to work with. Preconditions in Spec# are specified with the requires keyword. So our method now looks like:

public void AddWord(string! word)
    requires word.Length > 0;

Now the requirements are part of the method signature and show up in IntelliSense. This alone is a big help to the development process. The precondition is validated at runtime, throwing an exception if the condition is not met. It is worth noting that you can map the stock RequiresException back to an ArgumentException via the otherwise keyword.

requires word.Length > 0 otherwise ArgumentException;
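Combining the non-null parameter with the precondition, the full method becomes a sketch like the following (the words field is assumed from the Sentence class discussed earlier):

```
public void AddWord(string! word)
    requires word.Length > 0 otherwise ArgumentException;
{
    // Only the real logic remains in the body; the null check and
    // empty-string check now live in the signature.
    words.Add(word);
}
```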

The big gain here is evident: the signature of the method contains all of the relevant constraints. With the constraints in the signature, even more assistance can be provided at design and compile time. Best of all, the body of the method is left with only the logic the method is responsible for.

There is plenty more to cover in the direction Spec# is taking. Next up: validating the internal state of an object as part of the class definition.
