
Dictionary Extensions: Define useful extensions to play safe January 18, 2012

Posted by codinglifestyle in C#, CodeProject.
   if (searchCriteria.ContainsKey(key) &&
       !string.IsNullOrEmpty(searchCriteria[key]))
       searchTerm = searchCriteria[key];

Ever have a dictionary or similar data structure and your code has many repeated checks to pull the value when in reality you’d be happy with a default value like null or string.Empty? Well, consider the following extension to Dictionary:

    public static class DictionaryExtensions
    {
        public static TValue GetSafeValue<TKey, TValue>(this Dictionary<TKey, TValue> dictionary, TKey key)
        {
            TValue result = default(TValue);
            dictionary.TryGetValue(key, out result);
            return result;
        }
    }

Lets you do:

    Dictionary<int, string> bob = new Dictionary<int, string>();
    string safe = bob.GetSafeValue(100);
    System.Diagnostics.Trace.WriteLine(safe);

where safe defaults to null (the default value for string) as the key hasn’t been added. Stop! I know what you’re going to say and I thought of that too. You can control the default value as well:

    public static class DictionaryExtensions
    {
        /// <summary>
        /// Gets the safe value associated with the specified key.
        /// </summary>
        /// <typeparam name="TKey">The type of the key.</typeparam>
        /// <typeparam name="TValue">The type of the value.</typeparam>
        /// <param name="dictionary">The dictionary.</param>
        /// <param name="key">The key of the value to get.</param>
        public static TValue GetSafeValue<TKey, TValue>(this Dictionary<TKey, TValue> dictionary, TKey key)
        {
            return dictionary.GetSafeValue(key, default(TValue));
        }

        /// <summary>
        /// Gets the safe value associated with the specified key.
        /// </summary>
        /// <typeparam name="TKey">The type of the key.</typeparam>
        /// <typeparam name="TValue">The type of the value.</typeparam>
        /// <param name="dictionary">The dictionary.</param>
        /// <param name="key">The key of the value to get.</param>
        /// <param name="defaultValue">The default value.</param>
        public static TValue GetSafeValue<TKey, TValue>(this Dictionary<TKey, TValue> dictionary, TKey key, TValue defaultValue)
        {
            TValue result;
            if (key == null || !dictionary.TryGetValue(key, out result))
                result = defaultValue;
            return result;
        }
    }

Lets you do:

   Dictionary<int, string> bob = new Dictionary<int, string>();
   string safe = bob.GetSafeValue(100, string.Empty);
   System.Diagnostics.Trace.WriteLine(safe);

where safe comes back as the empty string instead of null.

There’s obviously something wrong with me because I still think this stuff is cool.

I’m developing a nice little set of extensions at this point.  Often it seems like overkill to encapsulate handy functions like these in a helper class. I had started by deriving a class from Dictionary<TKey, TValue> but changed over to the extension methods above.
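
For the record, the derived-class approach I abandoned looked roughly like this (a sketch only; the class name is made up):

    //Subclassing works, but it only helps dictionaries you construct yourself;
    //the extension method works on any Dictionary<TKey, TValue> you are handed.
    public class SafeDictionary<TKey, TValue> : Dictionary<TKey, TValue>
    {
        public TValue GetSafeValue(TKey key)
        {
            TValue result;
            return TryGetValue(key, out result) ? result : default(TValue);
        }
    }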


FindControl: Recursive DFS, BFS, and Leaf to Root Search with Pruning October 24, 2011

Posted by codinglifestyle in ASP.NET, C#, CodeProject, jQuery.

I have a nefarious reason for posting this. It’s a prerequisite for another post I want to do on control mapping within JavaScript, when you have one control which affects another and there’s no good spaghetti-less way to hook them together. But first, I need to talk about my nifty FindControl extensions. Whether you turn this into an extension method or just place it in your page’s base class, you may find these handy.

We’ve all used FindControl and realized it’s a pretty lazy function that only searches its direct children and not the full control hierarchy. Let’s step back and consider what we’re searching before jumping to the code. What is the control hierarchy? It is a tree data structure whose root node is Page. The most common recursive FindControl extension starts at Page or a given parent node and performs a depth-first traversal over all the child nodes.

Depth-first search
Search order: a-b-d-h-e-i-j-c-f-k-g

/// <summary>
/// Recurse through the controls collection checking for the id
/// </summary>
/// <param name="control">The control we're checking</param>
/// <param name="id">The id to find</param>
/// <returns>The control, if found, or null</returns>
public static Control FindControlEx(this Control control, string id)
{
    //Check if this is the control we're looking for
    if (control.ID == id)
        return control;

    //Recurse through the child controls
    Control c = null;
    for (int i = 0; i < control.Controls.Count && c == null; i++)
        c = FindControlEx((Control)control.Controls[i], id);

    return c;
}

You will find many examples of the above code on the net. This is the “good enough” algorithm of choice. If you have ever wondered about its efficiency, read on. Close your eyes and picture the complexity of the seemingly innocent form… how every table begets rows begets cells begets the controls within the cell and so forth. Before long you realize there can be quite a complex control hierarchy, sometimes quite deep, even in a relatively simple page.

Now imagine a page with several top-level composite controls, some of them rendering deep control hierarchies (like tables). As the designer of the page you have inside knowledge about the layout and structure of the controls contained within. Therefore, you can pick the best method of searching that data structure. Look at the diagram above and imagine the b-branch is much more complex and deep. Now say what we’re trying to find is g. With depth-first you would have to search the entirety of the b-branch before moving on to the c-branch and ultimately finding the control in g. For this scenario, a breadth-first search would make more sense as we won’t waste time searching a complex and potentially deep branch when we know the control is close to our starting point, the root.

Breadth-first search

Search order: a-b-c-d-e-f-g-h-i-j-k

/// <summary>
/// Finds the control via a breadth first search.
/// </summary>
/// <param name="control">The control we're checking</param>
/// <param name="id">The id to find</param>
/// <returns>If found, the control.  Otherwise null</returns>
public static Control FindControlBFS(this Control control, string id)
{
    Queue<Control> queue = new Queue<Control>();
    //Enqueue the root control            
    queue.Enqueue(control);

    while (queue.Count > 0)
    {
        //Dequeue the next control to test
        Control ctrl = queue.Dequeue();
        foreach (Control child in ctrl.Controls)
        {
            //Check if this is the control we're looking for
            if (child.ID == id)
                return child;
            //Place the child control in the queue
            queue.Enqueue(child);
        }
    }

    return null;
}
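
Both flavours are used the same way; here is a quick usage sketch (the control IDs are invented for illustration):

//Somewhere inside a Page or UserControl; the IDs here are hypothetical
Control submit = this.FindControlEx("_ButtonSubmit");    //depth-first over the whole hierarchy
Control header = this.FindControlBFS("_LabelHeader");    //breadth-first, handy when the target is near the root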

Recently I had a scenario where I needed to link 2 controls together that coexisted in the ItemTemplate of a repeater. The controls existed in separate composite controls.

In this example I need _TextBoxPerformAction’s ClientID to enable/disable it via _CheckBoxEnable. Depending on the size of the data the repeater is bound to there may be hundreds of instances of the repeater’s ItemTemplate. How do I guarantee I get the right one? The above top-down FindControl algorithms would return the first match of _TextBoxPerformAction, not necessarily the right one. To solve this predicament, we need a bottom-up approach to find the control closest to us. By working our way up the control hierarchy we should be able to find the textbox within the same ItemTemplate instance, guaranteeing we have the right one. The problem is, as we work our way up we will be repeatedly searching an increasingly large branch we’ve already seen. We need to prune the child branch we’ve already seen so we don’t search it over and over again as we work our way up.

To start we are in node 5 and need to get to node 1 to find our control. We recursively search node 5 which yields no results.

Next we look at node 5’s parent. We’ve already searched node 5, so we will prune it. Now recursively search node 4, which includes node 3, yielding no results.

Next we look at node 4’s parent. We have already searched node 4 and its children so we prune it.

Last we recursively search node 2, which includes node 1, yielding a result!

So here we can see that pruning saved us searching an entire branch repeatedly. And the best part is we only need to keep track of one id to prune.

/// <summary>
/// Finds the control from the leaf node to root node.
/// </summary>
/// <param name="ctrlSource">The control we're checking</param>
/// <param name="id">The id to find</param>
/// <returns>If found, the control.  Otherwise null</returns>
public static Control FindControlLeafToRoot(this Control ctrlSource, string id)
{
    Control ctrlParent = ctrlSource.Parent;
    Control ctrlTarget = null;
    string pruneId = null;

    while (ctrlParent != null &&
           ctrlTarget == null)
    {
        ctrlTarget = ctrlParent.FindControlEx(id, pruneId);
        pruneId = ctrlParent.ClientID;
        ctrlParent = ctrlParent.Parent;
    }
    return ctrlTarget;
}

/// <summary>
/// Recurse through the controls collection checking for the id
/// </summary>
/// <param name="control">The control we're checking</param>
/// <param name="id">The id to find</param>
/// <param name="pruneClientID">The client ID to prune from the search.</param>
/// <returns>If found, the control.  Otherwise null</returns>
public static Control FindControlEx(this Control control, string id, string pruneClientID)
{
    //Check if this is the control we're looking for
    if (control.ID == id)
        return control;

    //Recurse through the child controls
    Control c = null;
    for (int i = 0; i < control.Controls.Count && c == null; i++)
    {
        if (control.Controls[i].ClientID != pruneClientID)
            c = FindControlEx((Control)control.Controls[i], id, pruneClientID);
    }

    return c;
}
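
To tie it back to the repeater scenario, here is roughly how I use it from inside the checkbox’s composite control (a sketch only; the field name and surrounding code-behind are assumed):

//_CheckBoxEnable is the CheckBox field in the composite control's code-behind.
//Walk up the control tree, pruning the branches we've already searched, until
//we find the textbox living in the same ItemTemplate instance.
Control ctrl = _CheckBoxEnable.FindControlLeafToRoot("_TextBoxPerformAction");
if (ctrl != null)
{
    string textBoxClientID = ctrl.ClientID;    //e.g. hand this to the client-side enable/disable script
}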

Now we have an efficient algorithm for searching leaf to root without wasting cycles searching the child branch we’ve come from. All this puts me in mind of jQuery’s powerful selection capabilities. I’ve never dreamed up a reason for it yet, but searching for a collection of controls would be easy to implement and, following jQuery’s lead, we could extend the above to search for far more than just an ID.

Custom Attributes with Extension Methods: Resource Key July 4, 2011

Posted by codinglifestyle in C#, CodeProject.

I picked up this technique in my last job: use a custom attribute to contain a resource key. The biggest benefit was that all the enums in the system used this attribute, which provided a way to translate any enum value to text. Take a look at a sample enum:

public enum Mode
{
    [AttributeResourceKey("lblInvalid")]
    Invalid,
    [AttributeResourceKey("lblReview")]
    Review,
    [AttributeResourceKey("lblCheckout")]
    Checkout,
    [AttributeResourceKey("lblOrdered")]
    Ordered
}

Each enum uses the AttributeResourceKey to specify the resource key defined in the resx file. Combined with an extension method we can extend the enum itself to allow us to execute the following:

public void DoOperation(Mode mode)
{
    Log.Info(GetResourceString(mode.ResourceKey()));
    ...
}

The C++ head in me thinks, “why are we using reflection when a static function in a helper class could contain a switch statement to convert the enum to the resource key?”.  Technically this is sufficient and faster.  However, the C# head in me loves the idea that the enum and the resource key are intimately tied together in the same file. There is no helper function to forget to update.  The penalty of reading an attribute is a small price to pay to keep the enum and resource key together in order to increase overall maintainability.
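
For comparison, the switch-based helper my C++ head is lobbying for would look something like this (sketched only to show the trade-off; it is not the approach used here):

    //The static helper alternative: faster than reflection, but it lives away
    //from the enum and is easy to forget to update when a new value is added.
    public static class ModeHelper
    {
        public static string ToResourceKey(Mode mode)
        {
            switch (mode)
            {
                case Mode.Review:   return "lblReview";
                case Mode.Checkout: return "lblCheckout";
                case Mode.Ordered:  return "lblOrdered";
                default:            return "lblInvalid";
            }
        }
    }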

So the first thing I am going to do is define a simple interface for my custom attributes.

public interface IAttributeValue<T>
{
    T Value { get; }
}

All this interface does is define that the custom attribute class itself will define a property called Value of type T. This will be useful when using the generic method, below, for pulling the attribute. Next we define the custom attribute class itself.

    public sealed class AttributeResourceKey : Attribute, IAttributeValue<string>
    {
        private string _resourceKey;
        public AttributeResourceKey(string resourceKey)
        {
            _resourceKey = resourceKey;
        }

        #region IAttributeValue<string> Members
        public string Value
        {
            get { return _resourceKey; }
        }
        #endregion
    }

Notice how simple the above class is. We have a constructor taking a string and a property called Value which returns said string. Now let’s look at the generic method for pulling the attribute.

    public static class AttributeHelper
    {
        /// <summary>
        /// Given an enum, pull out its attribute (if present)
        /// </summary>
        public static TReturn GetValue<TAttribute, TReturn>(object value)
        where TAttribute: IAttributeValue<TReturn>
        {
            FieldInfo fieldInfo = value.GetType().GetField(value.ToString());
            object[] attribs    = fieldInfo.GetCustomAttributes(typeof(TAttribute), false);
            TReturn returnValue = default(TReturn);

            if (attribs != null && attribs.Length > 0)
                returnValue = ((TAttribute)attribs[0]).Value;

            return returnValue;
        }
    }

The code above is the heart of the technique. It uses generics so you need only define this code once in a static class. By passing the attribute and return type we can extract our Value defined by IAttributeValue<TReturn>.  Using the where constraint on TAttribute allows the generic method to know this type defines a property called Value of type TReturn.  This exposes the true power of generics, as without this constraint the method could only presume TAttribute is nothing more than an object.  This might tempt you to wrongly cast TAttribute in order to access its properties, inviting an exception only seen at runtime.

Now to define our extension method, to be placed in a common namespace, to extend all enums with the ResourceKey() method.

    public static class EnumerationExtensions
    {
        /// <summary>
        /// Given an enum, pull out its resource key (if present)
        /// </summary>
        public static string ResourceKey(this Enum value)
        {
            return AttributeHelper.GetValue<AttributeResourceKey, string>(value);
        }
    }

Thanks to the generic attribute helper the above extension method looks trivial. We simply use the helper to return the resource key and now we’ve extended all our enums to have this useful property.
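
A quick sketch of it in action (GetResourceString stands in for whatever resx lookup your project uses, as in the earlier example):

    string key   = Mode.Checkout.ResourceKey();   //"lblCheckout"
    string label = GetResourceString(key);        //resolve the key against the resx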

Visual Studio 2008 JumpStart December 18, 2007

Posted by codinglifestyle in ASP.NET, C#, Javascript, jQuery, linq.

Yesterday I attended the Visual Studio 2008 Jumpstart at Microsoft in Dublin.  This was a one-day course, presented by David Ryan, introducing some of the new features in C# 3.0 and VS2008.

 

I approach every release with excitement and trepidation.  There are some fantastic new features like JavaScript debugging, nested MasterPages, improved IDE for both coding and web design, and multi-target runtime support.  With multi-targeting support we can use VS2008 to continue to code for or support .NET 2.  If you open a VS2005 project, you will be prompted to upgrade your project even if you continue to use .NET 2.  I asked whether there was any risk associated with this for those who need to continue to support .NET 2 and was told only the SLN file is changed.  So theoretically, there is no reason to keep VS2005 installed!  Do you believe it??  If only we could get rid of VS2003 as well.

 

Now, the reason I also approach each release with trepidation is because you know there is going to be some big, new, ghastly feature which will be force-fed to us like geese in a pâté factory.  This time that feature is LINQ.  Open wide, because there is a big push behind LINQ and you’ll notice a using System.Linq statement in every new class (which is a real pain if you change the target framework back to .NET 2).  But first, let’s review some of the changes made to C#:

 

  • Implicitly typed local variables (var)
    • var dt = DateTime.Today;
    • Anything can be assigned to a var, but once it’s assigned it’s strongly typed and that type can’t be changed.  So I can’t reuse the local variable dt and assign the string “Hello” like I could with object.
  • Automatic Properties
    • This:

      private int m_nID;

      public int ID
      {
          get
          {
              return m_nID;
          }
          set
          {
              m_nID = value;
          }
      }

 

    • becomes this:

public int ID { get; set; }

 

    • The problem is that in practice I prefer to use the private variables in code, leaving properties for external access only.  Many times the get/set is doing something interesting, like reading the value from a cache or performing some complex operation.  We don’t necessarily want/need this code to execute every time it is accessed from within the class, so I use the private variable.  Automatic properties have us using the public property directly everywhere in the class.  So, personally, this will cause inconsistency in my coding style.
    • You can also specify the get as public but keep the set accessible only from within the class, like this:

public int ID { get; private set; }

 

  • Object initializers
    • Ever have to instantiate your own class and have to initialize it with a dozen properties?  Do you add 13 lines of code or go overload the constructor?  Now you don’t have to; imagine a simple Person class with 3 properties:

Person person = new Person { FirstName = "Chris", LastName = "Green", Age = 33 };

 

    • Only initialize the properties you want:

Person person2 = new Person { FirstName = "Chris", LastName = "Green" };

 

  • Collection initializers

List<Person> people = new List<Person>
{
    new Person { FirstName = "Chris", LastName = "Green", Age = 33 },
    new Person { FirstName = "Bill", LastName = "Bob", Age = 46 },
    new Person { FirstName = "Foo", LastName = "Bar", Age = 26 }
};

 

  • Extension methods
    • This is a way of adding new inherent functionality to existing classes.  The objective is to add a new method to the DateTime type called LastDay() which will return the last day of the given date’s month.

public static class DateTimeUtils
{
    public static int LastDay(this DateTime dt)
    {
        return DateTime.DaysInMonth(dt.Year, dt.Month);
    }
}

 

    • Notice the this before the first parameter argument.  This tells the compiler that this is an extension method for the DateTime type.  We can now use this method as if it were built in to the DateTime class:
      • int nLastDay = dt.LastDay();
  • Lambda expressions
    • Too lazy or couldn’t be arsed to write a small 2 line function?  This is for you:

Func<string, bool> CheckName = sName => sName == "Bob";
bool bBob = CheckName(person.FirstName);

Func<int, int> Square = x => x * x;
int nSquare = Square(5);

 

The fact is there’s an ulterior reason var, lambda expressions, and many of the above additions have been added to C# 3.0.  C# has been bent in order to accommodate LINQ.  LINQ allows you to perform queries on objects, DataSets, XML, and databases.  It’s interesting in that it offers an independent layer of abstraction for performing these types of operations.  It has occurred to me before, looking at data layers chock full of embedded SQL, that this is not particularly ideal.  LINQ offers a generic and powerful alternative.  Let’s take a look at a simple example based on the people collection from above:

 

var matches = from person in people
              where person.FirstName == "Bob"
              select person;

 

The first thing that caught my attention was that the select comes last.  One reason they likely did this was to force us to state what we are querying first (the from clause) so that Intellisense could kick in for the rest of the expression.  Notice person.FirstName was fully supported by Intellisense, so the Person class was automatically inferred from the people collection.

 

You can create objects on-the-go from your expression.  For example:

 

var matches = from employee in people
              where employee.FirstName == "Bob"
              select new Person(employee.FirstName, employee.LastName);

 

 

Notice how var is inherently handy here (but bad practice for nearly everything else) as our LINQ expression returns a System.Collections.Generic.IEnumerable<Person>.  Lambda expressions also play a key part in LINQ:

 

var matchesBob = people.Select(CheckName => CheckName.LastName == "Bob");

matchesBob.ToArray()
{bool[3]}
    [0]: false
    [1]: true
    [2]: false

var matchesInitials = people.Select(ChopName => ChopName.FirstName.Remove(1) + ChopName.LastName.Remove(1));

matchesInitials.ToArray()
{string[3]}
    [0]: "CG"
    [1]: "BB"
    [2]: "FB"

 

There is so much more to LINQ that I won’t attempt to cover any more.  Rest assured you will hear much more about LINQ in the months to come (fatten up those livers).  One thing is obvious, C# took a heavy hit in order to support it.  Let’s hope it’s worth it as every release we lean more and more towards VB.  A colleague recently remarked, “C# is all VB under the covers anyhow”.  He might be right.

 

Some other interesting new additions:

  • XElement – for anyone who has programmatically built XML before, you will appreciate this quicker alternative (see the sketch after this list)
  • ASP.NET ListView – the singular new control this release.  It’s basically a Repeater but with design-time support
  • Script Manager Proxy – typically the Ajax ScriptManager will be placed in the MasterPage.  When the content page needs to access the ScriptManager a ScriptManagerProxy can be used to get around the restriction of only one ScriptManager allowed per page.
  • Javascript enhancements – built-in libraries which extend string, add StringBuilder, and a number of other enhancements to make writing JavaScript more .NET-like
  • CLR Add-In Framework: essentially a pattern for loading modules (only from a specified directory) vs. using reflection, iterating classes for a specified type, and using the activator to instantiate the object
  • Transparent Intellisense: In VS2005 Intellisense went into hyperdrive and our tab keys have never been the same.  However, I often found myself cursing it as it often got in the way.  So when VS is being too helpful and you can’t see what you’re doing, press CTRL rather than ESC to turn Intellisense transparent.
  • Right-click the code and you will see an Organize Usings menu below Refactor.  Here is a feature I’ve often dreamed of: Remove Unused Usings.  Pinch me!  Also, if you right-click the interface in a class’s definition (public class MyClass : Interface) there is an Implement Interface option.  Last, if you are coding away and use a class before adding a using statement, press Alt-Shift-F10 to automatically resolve the class and add the using statement to your file.
  • Improved support for debugging multithreaded applications
  • SQL Database Publishing Wizard is now integrated in the VS
  • Did I mention JavaScript debugging??!
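
To give a flavour of the XElement point above, building a fragment of XML now looks something like this (the element names are invented; it requires a reference to System.Xml.Linq):

XElement order = new XElement("Order",
    new XAttribute("Id", 42),
    new XElement("Customer", "Bob"),
    new XElement("Items",
        new XElement("Item", new XAttribute("Sku", "ABC123"), 2)));

string xml = order.ToString();   //nicely indented XML, no XmlDocument ceremony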

 

You may notice I’ve omitted a few big topics.  I didn’t mention Ajax because that’s old news now.  However, there are new versions of the Ajax Control Toolkit and web deployment projects for VS2008.

 

I also didn’t mention Silverlight although I may find some interesting applications for it in the future.  For example, if you really hate writing JavaScript you could use Silverlight’s built-in mini CLR to write C# code which executes in the browser.  Oh, I hear it does some UI stuff too.

 

References: ScottGu 19-11-2007, ScottGu 13-03-2007, ScottGu 08-03-2007, C# 3.0 In a Nutshell, Pro ASP.NET 3.5 in C# 2008, and CodeProject.