Wednesday, June 22, 2016

FuncComparer

Often we need a means to compare collections of complex structures. Consider



public class Contact
{
    public Guid Id { get; private set; }
    public string Name { get; set; }
    public Contact() { Id = Guid.NewGuid(); }
}

[TestMethod]
public void Test()
{
    Contact[] expected = new[] 
    {
        new Contact { Name = "Bob", },
        new Contact { Name = "Abel", },
    };
    Contact[] actual = new[] 
    {
        new Contact { Name = "Abel", }, 
        new Contact { Name = "Charlie", },
    };
    // TODO: test for equivalence?
}


I often run into situations like the one presented above when testing functions over collections. I also see similar situations when comparing collections from a client (which may contain modified or net-new entities) against collections from a datastore.


There are many ways to resolve these sorts of scenarios: in-lining comparison logic, overriding the Equals method, or declaring a custom comparer class that implements IEqualityComparer<Contact>. The first method is brittle and not at all portable. Overriding Equals is a little more portable, but still fairly brittle; for instance, in one context I may require equality based on Name, in another equality based on Id, and in yet another equality based on both. The third method is portable and externalized from the class, so that it may be cleanly swapped depending on context (consider ContactNameEqualityComparer versus ContactIdEqualityComparer).


My only gripe with this last method is that it is always so tedious to implement the boiler-plate comparisons in every instance of a comparer (for instance, writing trivial accept and reject criteria for ContactNameEqualityComparer, and then re-writing that criteria again for ContactIdEqualityComparer, and then again for any other comparers that exist in our system for other types).
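To make the gripe concrete, here is roughly what one of those hand-rolled comparers looks like (a hypothetical sketch of ContactNameEqualityComparer; the real boiler-plate will vary, and ContactIdEqualityComparer would repeat nearly all of it against Id instead):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch - one of many nearly identical comparers,
// each repeating the same null-handling ceremony.
public class ContactNameEqualityComparer : IEqualityComparer<Contact>
{
    public bool Equals(Contact x, Contact y)
    {
        if (ReferenceEquals(x, y)) return true;
        if (x == null || y == null) return false;
        return string.Equals(
            x.Name,
            y.Name,
            StringComparison.InvariantCultureIgnoreCase);
    }

    public int GetHashCode(Contact obj)
    {
        // null and null-Name inputs telescope to a constant hash
        return obj == null || obj.Name == null
            ? 0
            : obj.Name.ToUpperInvariant().GetHashCode();
    }
}
```

Multiply this by every type and every criterion in the system, and the tedium adds up.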


FuncComparer



Really I want something that encapsulates basic comparison functionality while being able to supply comparison criteria as lambdas.


public class FuncComparer<T> : IComparer, IComparer<T>
{

    private readonly Comparison<T>[] comparisons = null;
    private readonly int xIsNotNullComparison = 0;
    private readonly int yIsNotNullComparison = 0;

    public FuncComparer(
        Comparison<T> comparison, 
        bool isNullLessThan = true) :
        this (new[] { comparison, }, isNullLessThan)
    {
    }

    public FuncComparer(
        Comparison<T>[] comparisons, 
        bool isNullLessThan = true)
    {
        this.comparisons = comparisons;
        this.xIsNotNullComparison = isNullLessThan ? 1 : -1;
        this.yIsNotNullComparison = isNullLessThan ? -1 : 1;
    }

    // interfaces

    #region IComparer Members

    public int Compare(object x, object y)
    {
        int comparison = 0;
        if (!ReferenceEquals(x, y))
        {
            bool xIsNull = true;
            bool yIsNull = true;
            T a = default(T);
            T b = default(T);
            if (x is T)
            {
                xIsNull = false;
                a = (T)(x);
            }
            if (y is T)
            {
                yIsNull = false;
                b = (T)(y);
            }
            comparison = Compare(a, xIsNull, b, yIsNull);
        }
        return comparison;
    }

    #endregion

    #region IComparer<T> Members

    public int Compare(T x, T y)
    {
        int comparison = 0;
        if (!ReferenceEquals(x, y))
        {
            comparison = Compare(x, x == null, y, y == null);
        }
        return comparison;
    }

    #endregion

    // private methods

    /// <summary>
    /// Compare two strong-typed values. Explicit 'is null' 
    /// parameters are required for scenarios where 
    /// <typeparamref name="T"/> is a valuetype and type-less 
    /// compare is used. In such cases, type-less method uses 
    /// default-type value which will telescope nulls to default
    /// values. For example, when comparing a series of integers,
    /// any nulls in our inputs will be interpreted as '0'.
    /// </summary>
    /// <param name="x"></param>
    /// <param name="isXNull"></param>
    /// <param name="y"></param>
    /// <param name="isYNull"></param>
    /// <returns></returns>
    private int Compare(T x, bool isXNull, T y, bool isYNull)
    {
        int comparison = 0;
        if (!isXNull && !isYNull)
        {
            for (
                int i = 0; 
                i < comparisons.Length && comparison == 0; 
                i++)
            {
                comparison = comparisons[i](x, y);
            }
        }
        else if (!isXNull)
        {
            comparison = xIsNotNullComparison;
        }
        else if (!isYNull)
        {
            comparison = yIsNotNullComparison;
        }
        // both x and y are null, implies equality, no-op
        //else
        //{
        //}
        return comparison;
    }

}

What we may now do is
public class Contact
{
    public Guid Id { get; private set; }
    public string Name { get; set; }
    public Contact() { Id = Guid.NewGuid(); }
}

[TestMethod]
public void Test()
{
    FuncComparer<Contact> contactNameComparer = 
        new FuncComparer<Contact>(
            (a,b) => string.Compare(
                a.Name,
                b.Name,
                StringComparison.InvariantCultureIgnoreCase));
    Contact[] expected = new[] 
    {
        new Contact { Name = "Bob", },
        new Contact { Name = "Abel", },
    };
    Contact[] actual = new[] 
    {
        new Contact { Name = "Abel", }, 
        new Contact { Name = "Charlie", },
    };
    // NOTE: successfully demonstrates ease-of-use, esp
    // when tests are repeated with minor variances in
    // expectation
    CollectionAssert.AreNotEqual(
        expected, 
        actual, 
        contactNameComparer);
    CollectionAssert.AreEqual(
        new[] 
        {
            new Contact { Name = "Abel", },
            new Contact { Name = "Charlie", },
        },
        actual,
        contactNameComparer);
}
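Since the constructor also accepts an array of comparisons, criteria compose without any new classes; FuncComparer applies each comparison in order until one reports a difference. A quick sketch, ordering by Name and breaking ties by Id:

```csharp
FuncComparer<Contact> byNameThenId =
    new FuncComparer<Contact>(
        new Comparison<Contact>[]
        {
            // primary criterion: case-insensitive name
            (a, b) => string.Compare(
                a.Name,
                b.Name,
                StringComparison.InvariantCultureIgnoreCase),
            // tie-breaker: deterministic ordering by Id
            (a, b) => a.Id.CompareTo(b.Id),
        });

Contact[] contacts =
{
    new Contact { Name = "bob", },
    new Contact { Name = "Abel", },
    new Contact { Name = "Bob", },
};
Array.Sort(contacts, byNameThenId);
// contacts is now Abel first, then the two Bobs ordered by their Ids
```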


Why 'isXNull' and 'isYNull'?



The oddest thing about FuncComparer is how it handles null checks. We perform null checks on the way in, before the private core implementation executes. This is because the private core implementation is strongly typed, coercing any null inputs into the type's default value. For reference types, such as any class, this is ok, since default(T) is still null. However, for value types like int or any struct, we end up telescoping nulls into default values, which is a lossy transformation, a reduction in fidelity.


We overcome this through explicit null tests on the way in, and pass that on as state to our core-implementation.



[TestMethod]
public void Test()
{
    IComparer typelessComparer = 
        new FuncComparer<int>(
            (a,b) => a.CompareTo(b),
            true);
    // we expect 'null is less-than', so x as null should
    // return less-than
    int expectedComparison = -1;
    int actualComparison = 
        typelessComparer.Compare(null, 0);
    Assert.AreEqual(expectedComparison, actualComparison);
}

One last minor detail is the ability to configure an ordinal interpretation of null relative to all other values. Our constructor accepts a boolean indicating whether null values should be considered less-than non-null values, or vice versa. This allows us to tweak things like list ordering.
[TestMethod]
public void Test()
{
    IComparer typelessComparer = 
        new FuncComparer<int>(
            (a,b) => a.CompareTo(b),
            false);
    object[] expected = new object[] { 0, 1, 2, 3, 4, null, };
    ArrayList actual = new ArrayList { 1, null, 4, 2, 3, 0, };
    actual.Sort(typelessComparer);
    CollectionAssert.AreEqual(expected, actual);
}

FuncEqualityComparer

Upon implementing a FuncComparer, the very next thing that I do is implement a FuncEqualityComparer; so it follows that I should post this next. An implementation of IEqualityComparer on top of an IComparer is relatively trivial and straightforward,


public class FuncEqualityComparer<T> : 
    FuncComparer<T>, 
    IEqualityComparer, 
    IEqualityComparer<T>
{

    private readonly Func<T, int> getHashCode = null;

    public FuncEqualityComparer(
        Comparison<T> comparison, 
        Func<T, int> getHashCode, 
        bool isNullLessThan = true) :
        this(new[] { comparison, }, getHashCode, isNullLessThan)
    {
    }

    public FuncEqualityComparer(
        Comparison<T>[] comparisons, 
        Func<T, int> getHashCode, 
        bool isNullLessThan = true) :
        base(comparisons, isNullLessThan)
    {
        this.getHashCode = getHashCode;
    }

    // interfaces

    #region IEqualityComparer Members

    public new bool Equals(object x, object y)
    {
        int comparison = Compare(x, y);
        bool isEqual = comparison == 0;
        return isEqual;
    }

    public int GetHashCode(object obj)
    {
        int hashCode = 0;
        T o = default(T);
        if (obj is T)
        {
            o = (T)(obj);
        }
        hashCode = getHashCode(o);
        return hashCode;
    }

    #endregion

    #region IEqualityComparer<T> Members

    public bool Equals(T x, T y)
    {
        int comparison = Compare(x, y);
        bool isEqual = comparison == 0;
        return isEqual;
    }

    public int GetHashCode(T obj)
    {
        int hashCode = getHashCode(obj);
        return hashCode;
    }

    #endregion

}



No real secret sauce here; merely account for GetHashCode and interpret an underlying comparison.
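For completeness, a quick sketch of FuncEqualityComparer in use. Note the hash delegate guards against null and null-Name inputs, since the type-less GetHashCode may hand it default(T):

```csharp
FuncEqualityComparer<Contact> byName =
    new FuncEqualityComparer<Contact>(
        (a, b) => string.Compare(
            a.Name,
            b.Name,
            StringComparison.InvariantCultureIgnoreCase),
        c => c == null || c.Name == null
            ? 0
            : c.Name.ToUpperInvariant().GetHashCode());

Contact[] contacts =
{
    new Contact { Name = "Abel", },
    new Contact { Name = "abel", },
    new Contact { Name = "Charlie", },
};
// LINQ set operations now respect our name-based equality;
// the case-insensitive duplicates collapse to a single entry
Contact[] distinct = contacts.Distinct(byName).ToArray();
```

The same comparer slots into Dictionary<Contact, T>, HashSet<Contact>, Intersect, Except, and friends.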

Thursday, January 3, 2013

Continuous Integration, how do I declare commonly derived MSBuild metadata?

So, following some recent posts on MSBuild-NUnit integration, I went back and reviewed some of the implementation and wondered if there were easier ways to reference commonly derived metadata.

The motivation for this post stems from issues I encountered with long paths and paths that contain spaces; eg "C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\Example.Continuous\..\Example.UnitTests\bin\debug". Both present particular logistic problems when commands require multiple paths as inputs.

So, what follows are a number of refinements I picked up in my exploration to address some of these issues.

Custom ItemGroup versus ProjectReference


In previous examples, I prescribed custom ItemGroups, like TestAssembly below

<ItemGroup>
  <TestAssembly Include="Example.UnitTests.dll">
    <WorkingDirectory>
      $(MSBuildProjectDirectory)\..\Example.UnitTests\$(OutputPath)
    </WorkingDirectory>
  </TestAssembly>
</ItemGroup>

I preferred this approach since it would not collide with nor compromise any other build metadata. However, it does produce some visual artifacts in Visual Studio, and modifying it requires a manual edit of the project file.

Figure 1, custom TestAssembly item appears as missing dll.

A better solution, which would resolve these issues and facilitate adding test projects, is to leverage the well-known ItemGroup ProjectReference.

Adding ProjectReference Items is as simple as using the "Add Reference" feature from Visual Studio. Adding custom metadata for these Items, however, requires a little finesse, and that is what follows below!

Extending ItemGroup metadata


MSBuild permits the extension of existing metadata through the use of ItemDefinitionGroups. This will work for custom Items, such as TestAssembly, or well-known Items, like ProjectReference.

When we add a project reference via Visual Studio, we end up with a ProjectReference Item

<ItemGroup>
  <ProjectReference Include="..\Example.UnitTests\Example.UnitTests.csproj">
    <Project>{eaac5f22-bfb8-4df7-a711-126907831a0f}</Project>
    <Name>Example.UnitTests</Name>
  </ProjectReference>
</ItemGroup>

To add custom metadata, we simply define an ItemDefinitionGroup.

For example, let's define both a WorkingDirectory, and an OutputFile for our test libraries,

<ItemDefinitionGroup>
  <ProjectReference>
    <WorkingDirectory>%(RootDir)%(Directory)$(OutputPath)</WorkingDirectory>
    <OutputFile>%(Filename).dll</OutputFile>
  </ProjectReference>
</ItemDefinitionGroup>

Essentially, for each ProjectReference Item, MSBuild will add custom metadata WorkingDirectory, composed of well-known per-Item metadata RootDir, Directory, and project defined OutputPath. Similarly, OutputFile is a simple concatenation of per-Item metadata Filename and literal ".dll".
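To make the evaluation concrete, here is roughly what that metadata expands to for the ProjectReference above (hypothetical values, assuming the solution sits under C:\Projects and OutputPath is bin\Debug\):

```xml
<!-- hypothetical evaluated metadata for ..\Example.UnitTests\Example.UnitTests.csproj,
     assuming the solution sits under C:\Projects with OutputPath bin\Debug\ -->
<WorkingDirectory>C:\Projects\Example.UnitTests\bin\Debug\</WorkingDirectory>
<OutputFile>Example.UnitTests.dll</OutputFile>
```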

For a full list of well-known Item metadata, take a peek at this MSDN reference page.

From relative to absolute paths


When a project exists in a sibling directory we will often end up with relative paths which, when squeezing the most out of a command line, chews up valuable character space. We can mitigate this by converting unevaluated relative paths to absolute paths.

MSBuild exposes a task for path conversion (ConvertToAbsolutePath), but we may also inline static method calls; eg $([System.IO.Path]::GetFullPath(C:\Temp\..\Some.Other.Path)). For our purposes, we'll use the latter, since we have only a handful of paths to define and the syntax is easy enough to manage.

Really, we only require a WorkingDirectory, a resources folder, and an artifact folder. We have already defined an absolute path WorkingDirectory in our ItemDefinitionGroup, so that leaves resources and artifacts.

Since our tools and output folders are not likely to vary from Item to Item, let's define some project properties to capture these paths.

<PropertyGroup>
  <TestResultsFolder>
    $([System.IO.Path]::GetFullPath($(MSBuildProjectDirectory)\..\TestResults\))
  </TestResultsFolder>
  <ResourcesFolder>
    $([System.IO.Path]::GetFullPath($(MSBuildProjectDirectory)\..\Resources\))
  </ResourcesFolder>
</PropertyGroup>

For example, these will evaluate from "C:\Projects\Example.Continuous\..\Resources\" to "C:\Projects\Resources\". With these roots in place, we can then define other properties and metadata like

  <PropertyGroup>
    <NUnitExe>$(ResourcesFolder)NUnit-2.6.0.12051\nunit-console.exe</NUnitExe>
    <NUnitExe-x86>$(ResourcesFolder)NUnit-2.6.0.12051\nunit-console-x86.exe</NUnitExe-x86>
    <NUnitTestResultsFolder>$(TestResultsFolder)NUnit\</NUnitTestResultsFolder>
  </PropertyGroup>

  <ItemDefinitionGroup>
    <ProjectReference>
      <NUnitTestResultsFile>$(NUnitTestResultsFolder)%(OutputFile).xml</NUnitTestResultsFile>
    </ProjectReference>
  </ItemDefinitionGroup>

Escaping paths with spaces


As cited earlier, paths with spaces in them may present particular problems depending on the form of our Exec task. For instance, consider

paths
  • $(NUnitExe), C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\Resources\nunit-console.exe
  • %(ProjectReference.OutputFile), Example.UnitTests.dll
  • %(ProjectReference.NUnitTestResultsFile), C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\TestResults\NUnit\Example.UnitTests.dll.xml
  • %(ProjectReference.WorkingDirectory), "C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\Example.UnitTests\bin\debug"
 and Exec task

<Exec
  Command="$(NUnitExe) %(ProjectReference.OutputFile) /nologo
    /result=%(ProjectReference.NUnitTestResultsFile)"
  WorkingDirectory="%(ProjectReference.WorkingDirectory)" />

this task will fail for two similar reasons. First, WorkingDirectory is escaped with quotation marks; this poses problems for Exec task's WorkingDirectory parameter, which expects a simple unenclosed path. Second, nunit-console.exe will fail because its result specification is not escaped with quotation marks; it will misinterpret any path spaces as new parameter specifications and blow up.

Since paths may be used in any number of ways during build (for instance arbitrary path concatenation, or various embedded parameters, or contexts that require or prohibit quotations marks), I have settled on simply defining raw paths in my properties and metadata, and explicitly escaping them where utilised.

To resolve our Exec task above, let's consider revising path
  •  %(ProjectReference.WorkingDirectory), C:\Users\johnny g\Documents\Visual Studio 2012\Projects\UnitTests\Example.UnitTests\bin\debug
 and our task

<Exec
   Command="$(NUnitExe) %(ProjectReference.OutputFile) /nologo
     /result=&quot;%(ProjectReference.NUnitTestResultsFile)&quot;"
   WorkingDirectory="%(ProjectReference.WorkingDirectory)" />

The only real takeaway is that we must HTML-encode our quotation marks, so " becomes &quot;.

Conclusion


Well, that about wraps up some small MSBuild tidbits. These little morsels should enable us to write reliable build tasks in a clear and concise manner.

Resources

StackOverflow, how to evaluate absolute paths
MSDN MSBuild Well-known Item Metadata
MSDN ConvertToAbsolutePath task





Thursday, December 27, 2012

Continuous Integration, how do I generate NUnit output files?

A quick addendum to my NUnit integration post. To support result file generation (for build server integration), we make a few simple modifications to our MSBuild project file.

Let's assume a pared down version of our previous example, we have

  1. Example.Continuous, our build project
  2. Example.UnitTests, a project containing tests
  3. NUnit in a Resources folder

Example.Continuous declares build target AdditionalTasks, and the following property and item groups.

  
<PropertyGroup>
  <NUnitExe>"$(MSBuildProjectDirectory)\..\Resources\NUnit-2.6.0.12051\nunit-console.exe"</NUnitExe>
</PropertyGroup>
<ItemGroup>
  <TestAssembly Include="Example.UnitTests.dll">
    <WorkingDirectory>
      $(MSBuildProjectDirectory)\..\Example.UnitTests\$(OutputPath)
    </WorkingDirectory>
  </TestAssembly>
</ItemGroup>
<Target Name="AdditionalTasks">
  <Exec
    Command="$(NUnitExe) %(TestAssembly.Identity) /nologo"
    WorkingDirectory="%(TestAssembly.WorkingDirectory)" />
</Target>
What we want to do is leverage NUnit's result switch and specify output files for each test run. To do so, we will introduce both a new property for the results path and additional item paths to specify output files. We are being very explicit here to better support spaces in folder paths (eg default project folders are located under "Visual Studio 2012" folder).

This is our new project file after our modifications!

  
  
<PropertyGroup>
  <NUnitExe>"$(MSBuildProjectDirectory)\..\Resources\NUnit-2.6.0.12051\nunit-console.exe"</NUnitExe>
  <TestResultsFolder>$(MSBuildProjectDirectory)\..\TestResults\NUnit\</TestResultsFolder>
</PropertyGroup>
<ItemGroup>
  <TestAssembly Include="Example.UnitTests.dll">
    <WorkingDirectory>
      $(MSBuildProjectDirectory)\..\Example.UnitTests\$(OutputPath)
    </WorkingDirectory>
    <NUnitTestResultsFile>"$(TestResultsFolder)Example.UnitTests.dll.xml"</NUnitTestResultsFile>
  </TestAssembly>
</ItemGroup>
<Target Name="AdditionalTasks">
  <Exec
    Command="$(NUnitExe) %(TestAssembly.Identity) /nologo
      /result=%(TestAssembly.NUnitTestResultsFile)"
    WorkingDirectory="%(TestAssembly.WorkingDirectory)" />
</Target>

Thursday, May 17, 2012

Continuous Integration, how do I integrate post build tasks?

Everyone these days understands the benefits of automated continuous integration builds. The ability to continually build a code base and execute various automated tasks to evaluate the integrity of each iteration is invaluable. Yet there is still some confusion as to how exactly to implement this, and it is something we are constantly re-inventing.

Case in point, today I find myself researching unit test integration into our automated build solution.

My friend Kent Boogaart posted a great article about continuous integration. His solution is to integrate these tasks into the solution, and have MSBuild execute them. This has a number of benefits, including portability (I have used CruiseControl.Net previously, and Jenkins currently) and local execution. My primary role is developer, so I am rather partial to this last point; if we are responsible for failures in the build process, it is paramount that we are able to reproduce the process locally.

Now, Kent's article provides a great overview of the process, but I always find myself rooting around for resources to help implement an MSBuild integration (and re-educating myself on Task declaration). So this article will detail some of these specifics.

Overview


As an overview then, this article will take a solution that has some passing tests and some failing tests. By the conclusion, this solution will build and execute unit tests.

Setup


For this article, I have created a simple solution;
  1. Create a solution, Example.UnitTests,
  2. Create a Class Library, call it Example.UnitTests,
  3. Create a Class Library, call it Example.IntegrationTests,
  4. Create a folder, Resources, under solution directory
With Example.UnitTests, add a code file Tests.cs

using NUnit.Framework;
namespace Example.UnitTests
{
    [TestFixture]
    public class Tests
    {
        // will always pass, should have at least one of these!
        [Test]
        public void Test_Pass() { }
    }
}

With Example.IntegrationTests, add similarly named code file Tests.cs

using NUnit.Framework;
namespace Example.IntegrationTests
{
    [TestFixture]
    public class Tests
    {
        // will always fail, there is always at least one of these!
        [Test]
        public void Test_Fail()
        {
            Assert.Fail();
        }
    }
}

Example.IntegrationTests should only build on Release mode.

Figure 1, Example.IntegrationTests will not build under Debug mode, Release mode only

This may be accomplished by opening ConfigurationManager, selecting Debug solution configuration, and deselecting the Build checkbox next to Example.IntegrationTests project (see Figure 1 above).

With the Resources folder, copy in a working set of the NUnit console runner. We will use this to execute our unit tests.

Strategy


With this solution, it should be plain to see we have a set of tests we would like to execute in Debug mode only (think quick in-proc tests that verify behaviour at a very fine and granular level for a QA environment) and a set of tests we would like to execute in Release mode only (think long-running system-integration tests that verify behaviour at a coarse use-case level for a UAT or Pre-Prod environment).

Ideally, we want to be in the practice of executing unit tests as often as possible. However, forcing a developer to run these tests on every build is less than ideal or may even be prohibitively expensive.

Our approach then will be to create additional build profiles that target our standard Debug and Release modes, and execute our test suites selectively. Developers will be held to an honour system of running tests prior to major commits, but our continuous integration environment will always run these tests.

When we are through, our build environment will be able to
  • Build Debug and execute unit tests,
  • Build Release and execute unit and integration tests,
Of course, these build profiles will also be available to our developers, so that they may verify the integrity of their commit, or at the very least reproduce functional-related test failures in their own environment post build-fail.


Adding Build Profiles


Build profiles are tricky things. Adding and maintaining profiles can be cumbersome and error prone (Visual Studio does not auto-magically add custom profiles to new projects that we add; it is a manual step). Fortunately, we do not need our existing libraries, or future ones, to implement our custom build configuration.

For our purposes an empty light-weight configuration is all that we need. To do so,

  1. Open ConfigurationManager,
  2. From Active solution configuration: dropdown, select New...,
  3. Enter a configuration name, one that starts with "Debug". For this example, I have chosen DebugContinuous,
  4. From Copy settings from: dropdown, select Empty,
  5. Uncheck Create new project configurations if it is not already in an unchecked state

Figure 2, a minimal Debug continuous integration build configuration

Now may be a good time to create the Release profile as well. Same steps, simply create a profile with a name that starts with "Release" - if you're stuck, try ReleaseContinuous!

This naming requirement may seem odd, but we will be depending on MSBuild's ability to detect similar profiles based on name to target the correct mode for our solution. Basically, when our build environment invokes DebugContinuous, any projects that implement this mode exactly will build in DebugContinuous (more on this in a bit), and projects that do not will build in a mode that most closely resembles this mode (ie all of our existing projects). For our QA builds, this means Debug mode. When a suitable match cannot be found, MSBuild defaults to Release - so it is not the end of the world.

Adding Continuous Integration Project


Now that we have a (solution-wide) build profile for our continuous integration environment, we now need a place to throw in our unit test task. We could simply use any existing project, but for very large solutions, it makes better sense to consolidate our optional continuous integration tasks into a single place that is separate from our test code, so let's add a new project.

Add a new Class Library project, Example.ContinuousIntegration. Delete the default Class1.cs file.

All other projects are fine as they are, implementing only Debug and Release build modes. This one specific project however, will contain conditional elements that require the continuous build mode. So let's add our continuous build modes to Example.ContinuousIntegration. It is very similar to adding a solution-wide profile,
  1. Open ConfigurationManager,
  2. From Configuration dropdown beside our project, select New...,
  3. Enter your debug continuous integration build mode name. As with the rest of this example, I have used DebugContinuous,
  4. From Copy settings from: dropdown, select Empty,
  5. Uncheck Create new solution configurations if it is not already in an unchecked state
Figure 3, a minimal Debug continuous integration build configuration

Once we have defined our build modes, it pays to review each build configuration. Back in ConfigurationManager, iterate through each solution configuration. Ensure, that when Debug is active, Example.ContinuousIntegration does not build and is set to Debug configuration. Ensure that when DebugContinuous is active, all projects are in Debug mode and Example.ContinuousIntegration is set to build with DebugContinuous. Do likewise for Release modes.

One final check before we modify our project file directly.

  1. Open Project Dependencies,
  2. From Projects: dropdown, select Example.ContinuousIntegration,
  3. Check all project boxes,

Figure 4, add dependencies to continuous integration project to ensure a convenient build order


This creates an artificial dependency between our integration project and every other project in the solution. This ensures it builds last. While not strictly necessary, it makes it easier to debug build issues that we may encounter later on.

Now let's crack this sucker open. To edit this project through Visual Studio, first unload the project, and then edit it. Alternatively, use an external program (like Notepad.exe) to modify Example.ContinuousIntegration.csproj; when Visual Studio regains focus, it will detect modifications and prompt to reload.

When you first open it up, it should look something like this,


  
    
<Project ToolsVersion="4.0" DefaultTargets="Build"
    xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Configuration Condition=" '$(Configuration)' == '' ">
      Debug
    </Configuration>
    ...
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
    ...
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
    ...
  </PropertyGroup>
  <ItemGroup>
    <Reference Include="System" />
    ...
  </ItemGroup>
  <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
</Project>

What we will do next is add another build target that will contain our unit test execution task. To tidy up the declaration, we will also define some build properties and metadata. Our project file should now look a little something like this,

  
<Project ToolsVersion="4.0" DefaultTargets="Build;AdditionalTasks"
    xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup Condition=" '$(Configuration)' == 'DebugContinuous' ">
    <OutputPath>bin\Debug\</OutputPath>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)' == 'ReleaseContinuous' ">
    <OutputPath>bin\Release\</OutputPath>
  </PropertyGroup>
  ...
  <PropertyGroup>
    <NUnitExe>"$(MSBuildProjectDirectory)\..\Resources\nunit-console.exe"</NUnitExe>
  </PropertyGroup>
  <ItemGroup>
    <TestAssembly Include="Example.UnitTests.dll">
      <WorkingDirectory>
        $(MSBuildProjectDirectory)\..\Example.UnitTests\$(OutputPath)
      </WorkingDirectory>
    </TestAssembly>
    <TestAssembly Include="Example.IntegrationTests.dll"
        Condition=" '$(Configuration)' == 'ReleaseContinuous' ">
      <WorkingDirectory>
        $(MSBuildProjectDirectory)\..\Example.IntegrationTests\$(OutputPath)
      </WorkingDirectory>
    </TestAssembly>
  </ItemGroup>
  <Target Name="AdditionalTasks">
    <Exec
      Command="$(NUnitExe) %(TestAssembly.Identity) /nologo"
      WorkingDirectory="%(TestAssembly.WorkingDirectory)" />
  </Target>
</Project>

A few things to note,
  1. Inclusion of AdditionalTasks as part of DefaultTargets attribute,
  2. Modification of OutputPath property, from default bin\DebugContinuous to bin\Debug, and
  3. Conditional inclusion of Example.IntegrationTests for ReleaseContinuous only
The rest of it is fairly straightforward. And I'm knackered.


Resources

MSDN MSBuild Reference
MSDN MSBuild Exec Task Reference
MSDN MSBuild Reserved Property Reference
Kent Boogaart's Blog, fail early with full builds
Peter Provost's Blog, custom metadata
Kevin Dente's Blog, run all tests, fail on at least one error

Monday, April 25, 2011

Oi, oi, oi ...

Been awhile. A few months back I started working on some material for some logging-based posts, but got side tracked by work and other stuff too. Still on the radar, but also ramping up on some Silverlight - especially the design aspects!

While searching for Silverlight related tutorials and sites, I found this gem: .toolbox. It's a clever site with loads of content that I am slowly working through. As a dev, the Expression Blend tutorials are a good way to get familiar with the product. I am especially interested in using Blend to prototype faster so that I may elicit user feedback sooner. I am also enjoying myself with the design sessions, and recommend these to any developer interested in User Interface design work.

.toolbox user avatar for johnny_g


Wednesday, October 13, 2010

I <3 palindromes ...

w00t! Stack Overflow!


Okay, so maybe not so unique, but pretty cool nonetheless!