Got a Minute?

I came across this One Minute How-to podcast site while browsing my FeedBurner stats. It’s a great idea, not unlike the theme behind DailyDevelopers but with an even larger range of topics. I love the idea of explaining something in just a minute!

Test Driven Development (TDD) in a Nutshell

Here’s a brief overview of TDD:

  1. Write a test that fails (it fails because the code it tests has not been written yet)
  2. Write just enough code to make the test pass
  3. Refactor the code
  4. Repeat from step 2 until the code is done, then start again at step 1 with the next test.

Brief guidelines for good unit tests:

  • Write simple tests (if a test requires complex configuration and setup, it will eventually fall into disrepair).
  • Test one scenario at a time.
  • Tests are your specifications. Well-written tests provide a working specification of the code they test.
  • Name your tests accurately and consistently.
  • Use data that makes tests easy to read and understand.
  • Write tests that run quickly.
  • Make each test independent. You should be able to run them in any order.

The initial result of TDD is that you end up with a suite of tests to rely upon when you refactor your code. It gives you the courage to refactor freely, knowing that you have a ‘safety net’ that will tell you if you have broken existing functionality (provided you have sufficient test coverage). But TDD also forces you to think like a consumer of your own code, with the positive side effect of producing cleaner, simpler designs.

The TDD sequence is often represented as follows (Note: Not the usual sequence for traffic lights!):

  • Red: Write a test (that fails)
  • Green: Write code to make test pass
  • Amber: Refactor until Green again, then repeat back at Red
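
To make the cycle concrete, here is a minimal sketch of a first ‘Red’ test, written NUnit-style. The Calculator class and its Add method are hypothetical names used purely for illustration; at the Red step they do not exist yet, which is exactly why the test fails.

[Test]
public void Add_TwoNumbers_ReturnsSum()
{
    // Red: this fails until Calculator.Add is written.
    Calculator calculator = new Calculator();

    int result = calculator.Add(2, 3);

    // Green: write just enough of Add to make this assertion pass,
    // then refactor (Amber) while keeping the test green.
    Assert.AreEqual(5, result);
}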

Here are a few resources:

Books:
Test Driven Development: Kent Beck. Addison-Wesley.
Refactoring: Martin Fowler. Addison-Wesley.

Web:
http://www.agiledata.org/essays/tdd.html
Test-Driven Development in .NET by Peter Provost. An easy-to-understand introduction to TDD and unit testing.

Book Review: Pro C# 2005 and the .NET 2.0 Platform, Third Edition by Andrew Troelsen



This book is clear and well written, and does not sacrifice depth despite its broad coverage. Although targeted at developers with a few years’ experience, it is one of those rare books that works well whatever your level of experience. The style is polished and the book’s topics flow easily into one another.

One notable and surprising omission from its coverage of the new .NET 2.0 features was the BackgroundWorker class, which is a very useful addition in .NET 2.0.
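
For anyone who hasn’t come across it, here is a rough sketch (using C# 2.0 anonymous delegates) of the sort of thing BackgroundWorker makes easy: running work on a background thread and reporting back on the UI thread. DownloadReport and DisplayReport are hypothetical placeholder methods, not anything from the book.

// Requires: using System.ComponentModel;
BackgroundWorker worker = new BackgroundWorker();

worker.DoWork += delegate(object sender, DoWorkEventArgs e)
{
    // Runs on a thread-pool thread, keeping the UI responsive.
    e.Result = DownloadReport();   // hypothetical long-running call
};

worker.RunWorkerCompleted += delegate(object sender, RunWorkerCompletedEventArgs e)
{
    // Runs back on the calling (UI) thread when the work has finished.
    DisplayReport((string)e.Result);   // hypothetical UI update
};

worker.RunWorkerAsync();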

This is one of two Apress books I purchased six months ago, after researching what books were on offer for version 2.0 of the .NET framework (the other was Pro ASP.NET 2.0, a short review of which I hope to post presently). As an aside, I’m not sure whether Apress still offers this, but when I purchased this book I was also able to download a free, fully searchable eBook.

Perth .NET User Group Session – Open Mic Night

We (myself and Alistair Waddell) have organised a User Group event along the lines suggested by Rob Farley (great advice and tips BTW, thanks Rob), namely a ‘have a go’ meeting where anyone is invited to stand up for 10 minutes or so and give a talk on just about anything (even non-technical, if they like). It’s happening on the 7th of December at the Excom facilities, level 2, 23 Barrack St.

The details are here; I’m really looking forward to it. Everyone is welcome.

This will be the last session before Xmas, so come along and have fun, learn something and network with like-minded people over a few after-session drinks.

What makes a code base hard to modify?

Kent Beck makes the following observations in Martin Fowler’s book, Refactoring:

  • Code that is hard to read is hard to understand, and consequently hard to modify.
  • Programs with duplicated logic are hard to modify.
  • Programs where adding new functionality requires changing existing code are hard to modify.
  • Code containing complex and convoluted conditional logic is hard to modify.
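
As a small, hypothetical illustration of the duplicated-logic and convoluted-conditional points (the Order and Customer types and the discount rules below are invented for the example, not taken from Fowler’s book):

// Before: the same rules are duplicated and buried inside conditionals.
decimal GetDiscount(Order order)
{
    if (order.Total > 1000 && order.Customer.YearsActive > 5)
        return order.Total * 0.10m;
    if (order.Total > 1000 && order.Customer.YearsActive <= 5)
        return order.Total * 0.05m;
    return 0m;
}

// After: Extract Method gives each condition a name, so the code reads like
// its specification and each rule only needs changing in one place.
decimal GetDiscount(Order order)
{
    if (!IsLargeOrder(order))
        return 0m;

    return IsLoyalCustomer(order.Customer) ? order.Total * 0.10m : order.Total * 0.05m;
}

bool IsLargeOrder(Order order) { return order.Total > 1000; }
bool IsLoyalCustomer(Customer customer) { return customer.YearsActive > 5; }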

When should you refactor?

  1. When you add functionality.
  2. When you need to fix a bug.
  3. When you perform code reviews.

When shouldn’t you refactor?

  1. When you really need to rewrite from scratch!
  2. When you are close to a release deadline.

SQL Server More Secure Than Oracle

My favourite product is getting great press.

Between December 2000 and November 2006, external researchers discovered 233 vulnerabilities in Oracle’s products compared with 59 in Microsoft’s SQL Server technology, according to NGSS, which has worked for Microsoft in the past to make its software products more secure.

The NGSS report comes at a time when security researchers, irked by what they consider to be Oracle’s glacial pace of fixing bugs, are increasingly turning their attention to its products.

New Features in C# 3.0 (part 2)

Implicitly Typed Local Variables

Implicit typing of local variables is a language feature that allows the type of a variable to be inferred at compile time from the type of the variable’s initialisation expression. LINQ query expressions can return types created dynamically by the compiler, containing the data resulting from a query, and implicit typing is the only way to declare a local variable of such a type.

var n = 3;
var s = "Twas brillig";
var d = 3.141592653;
var digits = new int[] { 1, 2, 3, 4, 5 };
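
The compiler-created types mentioned above are anonymous types, and they are the one case where var is not merely convenient but required, because the type has no name you could write in a declaration. A small illustrative example (the property names are arbitrary):

// The compiler generates the type; only 'var' can refer to it.
var customer = new { Name = "Jane", Age = 42 };
Console.WriteLine("{0} is {1}", customer.Name, customer.Age);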

Declaring Implicitly Typed Collections

Implicitly typed variables are useful when instantiating complex generic types:

var supplierProducts = new Dictionary<string, List<string>>();

var products = new List<string>();
products.Add("pears");
products.Add("apples");
products.Add("oranges");
supplierProducts.Add("grocer", products);

products = new List<string>();
products.Add("beef");
products.Add("lamb");
products.Add("chicken");
supplierProducts.Add("butcher", products);

int totalProducts = supplierProducts["grocer"].Count +
                    supplierProducts["butcher"].Count;

Console.WriteLine("Total products: {0}", totalProducts);

Implicitly typed variables should not be confused with untyped variables in scripting languages such as VBScript, or Variants in VB6, where a variable can hold values of different types over the course of its lifetime. Once the compiler infers an implicit variable’s type from the expression used to initialise it, its type is fixed just as if the variable had been explicitly declared with that type. Assigning a value of a different type will result in a compile-time error.

var x;                         // Error: type is not known.
var x = { 1, 2, 3 };           // Error: type cannot be inferred from a bare initializer.
var n = 8;                     // n is implicitly typed as int
n = "This will not compile";   // Error: n is an int, so a string cannot be assigned to it.

Extending Types with Extension Methods

Extension methods enable developers to extend the functionality of existing types by defining new methods that are invoked using the usual instance method syntax. An extension method is a static method declared in a static class, with the this modifier on its first parameter, which specifies the type being extended. Extension methods can be added to any type, including generic types such as List<T> and Dictionary<TKey, TValue>, as shown in this slightly contrived example:

public static class Extensions
{
    public static Dictionary<K, V> Combine<K, V>(this Dictionary<K, V> s, Dictionary<K, V> d)
    {
        var newDictionary = new Dictionary<K, V>(s);

        foreach (K key in d.Keys)
        {
            if (!newDictionary.ContainsKey(key))
            {
                newDictionary.Add(key, d[key]);
            }
        }

        return newDictionary;
    }
}

Initialise two dictionaries and then combine them using the extension method just defined:

var supplierProducts = new Dictionary<string, List<string>>();

var products = new List<string>();
products.Add("beef");
products.Add("lamb");
products.Add("chicken");
supplierProducts.Add("butcher", products);

var supplierProducts2 = new Dictionary<string, List<string>>();

products = new List<string>();
products.Add("pork");
products.Add("veal");
products.Add("venison");
supplierProducts2.Add("butcher", products);

var x = supplierProducts.Combine<string, List<string>>(supplierProducts2);

Lambda Expressions

C# 2.0 introduced anonymous methods, which allow code blocks to be “inlined” where delegate values are expected. For example, the List<T>.FindAll method requires a delegate parameter:

List<int> oddNumbers = list.FindAll(delegate(int i) { return (i % 2) != 0; });

Here, the delegate determines whether the input integer is an odd number. C# 3.0 takes this further and introduces lambda expressions, a functional programming syntax for writing anonymous methods.

A lambda expression is written as a parameter list, followed by =>, followed by an expression. The parameters of a lambda expression can be explicitly or implicitly typed. In an explicitly typed parameter list, the type of each parameter is explicitly stated:

(int x) => x + 1

In an implicitly typed parameter list, the types of the parameters are inferred from the context in which the lambda expression is used. In addition, if a lambda expression has a single, implicitly typed parameter, the parentheses may be omitted from the parameter list:

x => x + 1
(x, y) => x * x + y * y

Here is an example of using a single-parameter lambda expression:

var list = new List<string>();
list.Add("cat");
list.Add("dog");
list.Add("fish");
list.Add("mouse");
list.Add("catch");
list.Add("carrot");

var matchStartsWithCA = list.FindAll(s => s.StartsWith("ca"));

foreach (string matchString in matchStartsWithCA)
{
    Console.WriteLine(matchString);
}

References:
Hands-On Lab: Lab Manual: C# 3.0 Language Enhancements

Microsoft Best Practices Analyser

As usual, Scott Hanselman beat me to it (!) with his recent blog post Microsoft Best Practices Analyzer Tools, where he gives a round-up of the offerings from Microsoft and muses on whether we will see a combined tool soon, which seems likely given the CodePlex project called the Microsoft Best Practice Analyzer (BPA). It comes with a plugin for ASP.NET 2.0, and as Scott notes, you might be surprised if you run it on one of your projects: not only will it find potential problems, it will also suggest accurate solutions. His post also gives links to the various best practices tools currently available.

Lousy Random Number Generators

Over at Jeff Atwood’s blog, Coding Horror, I noticed a nice round-up of random number generation. I tried to post a quick comment but it kept being rejected so I’ve blogged it here instead.

Anyone interested in a small but high-quality pseudo-random number generator with few known points of failure should check out the Mersenne Twister algorithm. It is fairly recent work (1998). I believe it has been implemented for the .NET Framework.

The problem that plagues most generators shows up when you need a huge quantity of random numbers, as in large simulations (Monte Carlo simulations, for instance): weaknesses such as a short period or correlations between successive values only become apparent at that scale.
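
To give a feel for the scale involved, here is a rough sketch of a Monte Carlo estimate of pi in C#; it consumes two random values per sample and typical runs use millions of samples, which is where a weak generator’s flaws start to distort results. System.Random is used purely for illustration, not as a recommendation for serious simulation work.

Random random = new Random();   // one generator instance, reused for every sample
int samples = 10000000;         // large simulations consume enormous quantities of values
int insideCircle = 0;

for (int i = 0; i < samples; i++)
{
    // Pick a random point in the unit square and test whether it falls inside
    // the quarter circle of radius 1.
    double x = random.NextDouble();
    double y = random.NextDouble();
    if (x * x + y * y <= 1.0)
    {
        insideCircle++;
    }
}

// The proportion of points inside the quarter circle approximates pi / 4.
double piEstimate = 4.0 * insideCircle / samples;
Console.WriteLine("Estimate of pi: {0}", piEstimate);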

A word of advice: never, never attempt to roll your own generator or ‘improve’ an existing one. The results will be at best less than random, and at worst dangerous! It’s in the same sin category as using Bubble Sort. A most heinous crime!

http://www.bedaux.net/mtrand/ includes a reference to the original paper.
http://en.wikipedia.org/wiki/Mersenne_twister

Bruce Schneier’s book Applied Cryptography (John Wiley & Sons, 1994) is a great place to start if you are interested in random numbers from the point of view of cryptography.