Automate, else Enforce, otherwise Path of Least Resistance

These are my Three Principles of Pragmatic Process Establishment. In my experience, these are the only ways to successfully implement a process, no matter how much value it may add or how simple it may be to follow. If these principles are not followed, neither will the process be when it comes to the crunch.

Automate

Automation is the most desired and beneficial option. The benefits are obvious: if a process is automated, its value is gained without extra effort from the team. Take, for example, the simple process of having developers run the unit tests. The value is obvious: errors are found earlier and higher quality code results. Although this is a simple instruction to follow, running unit tests has blockers for many developers. Firstly, the tests take a long time to run. Secondly, not all of them are relevant to you. Thirdly, they may have environmental requirements you cannot, or do not want to, set up. This simple process also has a simple automation solution: a continuous integration build server that runs all the tests for the developers on each check-in. All the benefits are gained, and everyone performs the process.

Another example is having developers log the time they spend against work items. Having developers do this is controversial (individual developers should never be measured, only the team as a whole), but the value is that the schedule becomes very realistic and accurate, based on actual burn-down rates. Time tracking is, however, time consuming and error-prone. Developers tend to forget to record the time started, the time finished and the interruptions to subtract against each work item. The numbers are then fudged, voiding all the value of the process. The automation solution we have utilized is TFS Working On. The tool allows developers to track their time very easily, without wasting it. The data is providing us with accurate reporting while dog-fooding our BI product.

Automated processes also have the benefit of accurate repeatability. If an installation package is fully created on each build, the risk involved in following a multi-step manual process is gone. A typical Standard Operating Procedure (SOP) is usually a long manual process that is error prone, especially under time pressure, and it reduces confidence in the end result compared to an automated solution. Of course, the negatives of this approach are that not everything can be automated, it may not work for all required cases, it may require ongoing maintenance, or it may take a long time to implement.

Enforce

When automation does not fit, enforcement is usually the next best option. The reason a process even needs enforcement is that it will most likely not be the easiest thing to do. There is much value in doing it, but not necessarily for the person doing it, or not at that time. The enforcement itself must be a physical enforcement, not an SOP document stating “thou shalt follow”. By physical I generally mean technical: a document cannot be uploaded to SharePoint until it has certain properties set. A physical example is a ballot box whose slot is too small for an unfolded ballot paper, so papers must be folded before they will fit.

In the development world, check-in policies are one of the best examples of enforcement. Here you can define, for example, coding standards, a code reviewer, check-in comments and work item association. Having these particular policies helps maintain higher code quality and readability, easier searching of history, and better reporting. The danger of enforcement is going too far: people become overwhelmed with the number of hoops they have to jump through to get things done, making them feel inefficient and ineffective. Each enforced process must be reviewed occasionally to ensure it is still adding value, and if it is found to add less value than the time it costs, it should be dropped immediately.

Path of least resistance

When a process cannot be automated or enforced, it must be the easiest way to get the task done. If it is not, it will require someone to play the sergeant and police the process, which is itself time consuming, lets errors through, is inconsistent and is demoralizing. Essentially, a path-of-least-resistance process is very simple and its value is clearly seen. It could, however, be artificially made the path of least resistance by making the other paths more difficult or less effective.

Using a wiki for team collaboration and information is a good thing and seems like a simple enough process to follow. However, most wikis require custom mark-up syntax, and uploading images is a pain. What I have seen happen is that email is used instead, since it allows simple rich text editing and inline images, and the information is sent directly to everyone right away. This is almost an acceptable alternative to the wiki, but distracting email threads tend to occur, important emails are overlooked and new employees cannot access the information. What we have implemented instead is Microsoft OneNote as our wiki. OneNote is a rich client application that automatically synchronizes all the content locally, is searchable (including text in images), can show a list of recent changes if you want to follow what is happening, and every section shows who modified it and when (it still needs proper versioning though). It has become the easiest way to share information among the team, and is a great resource for new employees.

General Tips

Begin a new process with as little as possible. Do not start a new form with a large number of mandatory fields; grow them as required. The process will have fewer blockers and less red tape with little perceived value, and will get more user buy-in.

Do not over-analyze. Analysis Paralysis is a state I now know I used to get stuck in. Do something. Keep it light, nimble and able to be adapted quickly as it matures.

Re-evaluate occasionally. Make sure you are getting value out of what the process is requiring people to do. If the cost is high and the value is low, either look at how the cost can be reduced through automation, or trim the fat from the parts of the process that are yielding low returns. If people are having trouble with the process, don’t blame the people. Focus on smoothing out the bumps that are causing them to fall off the path.

Do not provide much documentation of the process. The process needs to be simple to follow and self-documenting, i.e. each step points to the next. Documentation tends to be TAGRI (They Aren't Gonna Read It) and will either stop a process being nimble or quickly become out of date with the process. If documentation is required, the best option is to have it generated automatically.

A good process will perform under pressure. Do not abandon it to save a penny now, because it should save you hundreds later.

TFS Business Intelligence Reporting

In December, I spoke at the Queensland VSTS User Group about reporting on TFS using the Analysis Services cube that is included on the data tier. At the end of this post I have included links to my slides and some sample reports in Excel 2007 to get you started.

Reports are only as good as the data. Using TFS Working On to help accurately report the time spent, or at least the remaining hours, on work items has been working very well for my development team. Because of this, the next version should be released shortly, and hopefully it will keep improving quickly, as we are all using it in anger.

Before you can use the Excel reports you must change the TFS cube connection:

  1. Open the workbook
  2. Select the Data ribbon
  3. Select Connections, Properties… and then the Definition tab
  4. Update the Data Source in the Connection String to the Analysis Services instance containing the TFS cube (see the example connection string below)
  5. Click OK, then Close
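For reference, the resulting connection string should look something like the line below. The exact set of properties Excel generates will be longer and your server name will differ; on a default TFS 2008 data tier the Analysis Services database is typically named TfsWarehouse, so the Data Source is usually the only part that needs changing:

Provider=MSOLAP.3;Integrated Security=SSPI;Initial Catalog=TfsWarehouse;Data Source=YourAnalysisServicesInstance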

http://cid-ccb05a30bca0ff01.skydrive.live.com/embedrowdetail.aspx/Public/QVSTSUG-TFSBIReporting-Dec08-MatthewRowan.ppsx

http://cid-ccb05a30bca0ff01.skydrive.live.com/embedrowdetail.aspx/Public/TFSBIReports.xlsx

DevSta Entry – Mobile Memory Speed

Recently, I entered the DevSta Challenge 2008. The Challenge Brief was released at 0808 EST on Monday 29 September. This was the start of 200 hours and 8 minutes to develop a Windows Client, Mobile or Silverlight application, or a Vista Gadget, based on the theme Old School / New Cool. Not only was the time very limited, since I work full time and have a 3 month old son, but thinking of an idea to fit the theme that would be exciting and could be completed in time was difficult. Asking many people, the only ideas I got back were games. That is fine, and there were some good suggestions, but the time constraints meant I would never get those finished.

My idea ended up being the simple game of Memory on a Windows Mobile device. The "new cool" was to make it Speed, in which matched pairs turn back over after a time-out. Not a very cool idea, I understand, but it is addictive to play nonetheless. Below is the description of my submission:

Mobile Memory Speed brings the classic ‘old school’ card game Memory to the Windows Mobile Platform with a new twist. As you match the pairs, if you don’t finish quick enough, the pairs will start to flip back. This game comes complete with Difficulty options up to 40 pairs, which is enough to challenge anybody. Game high scores and statistics are also recorded to ensure you have a goal to beat.


I was able to complete the application within 10 hours and am very happy with the result. It feels like a completed product, although I have had many more suggestions for features to mix up the game further. The winners have been announced and unfortunately my entry is not among them. All in all it was a good experience putting something together so quickly. It has given me more motivation to just get in there and write more applications. Thanks also to Microsoft, the sponsors, the judges and the guys that organised the competition, and congratulations to the winners! I look forward to participating next year. Until then, I have made an installer for Mobile Memory Speed if you would like to give it a go. Note that the Microsoft .NET Compact Framework 3.5 is required.

http://cid-ccb05a30bca0ff01.skydrive.live.com/embedrowdetail.aspx/Public/MobileMemorySpeed|_Setup.CAB

Circle of Interest

Recently, Steve Nagy nominated me to continue the circle of interest started by Paul Stovell, stating the technology areas where I will be focusing my efforts. So here it is:

Core – Green
This area consists of things which I already know fairly well, but I know I can learn a whole lot more about. They also fall directly within what I am working on at the moment, so I have a good opportunity to make good on doing these things better.

Non-Core – Blue
I am quite interested in these and will get to use them this year, but I will not get the deep level of understanding, or the amount of experience, that I will with the core items.

No More Time – Red
I’m overly curious. I like technology. But the whole point of the circle is to ensure I’m not a jack of all trades and master of none. These items just get pushed out.

There are things that did not fit on my list, like Expression Studio, Windows Home Server and Windows Media Center. These are things I tinker with; I won’t get much time with them, but I like to be able to get by. For example, I used Expression Design to draw the circle. Other technologies, like Windows Server 2008, SQL Server 2008, IIS 7, and other environmental and system administration tasks, I am able to get familiar with due to my work in WiX, MSI and deployment. But I need to utilise others’ expertise in these areas, because I don’t have time to learn it all myself; again, I just like to be able to get by.

Another interesting thing is that putting the items in a graphic just made things so much easier to see. When I began drafting, I wrote the items in lists, and it just was not clear. I guess that is a testament to Mind Maps.


Finding the Assembly location

Finding the location of the running application is a task that comes up often. It is usually required to locate resources relative to the application, such as application settings or deployed files. There are many locations outside the assembly's own location that are often more appropriate, such as the user's Application Data directory, a temporary directory or a database.

One example I hit recently where those locations just didn’t fit the requirement was an installation-deployed configuration file, which is configured by the installer for application-wide settings and is not to be modified once deployed. The method of finding it had to be consistent, as it would be used in a shared library by WinForms applications, web applications and Office add-ins. Environment.CurrentDirectory is usually set to the directory the application was started in, so I skipped that and went straight to the method I’ve used before:

System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location);

From a WinForms application it returned, as expected:

C:\Users\Matthew\Documents\Visual Studio 2008\Projects\FindAssemblyLocation\Application1\bin\Debug

However, when I was running it in a web application it returned a directory that was not very helpful:

C:\Windows\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\website1\ecf8f52a\3c164603\assembly\dl3\7948a1a3\8eb04ec1_9e9fc801

And within an Excel Add-in it returned:

C:\Users\Matthew\AppData\Local\assembly\dl3\3LGRT984.R33\JL6TAD7L.4OV\1e7fb3b7\b3a2d91c_55a0c801

My colleague mentioned another method, so I gave that one a whirl (note that _Default is the name of a class in the assembly):

string assemblyDirectory = Directory.GetParent(typeof(_Default).Assembly.Location).FullName;

Unfortunately it returned the same as above in all instances.

After a little searching I found another:

Uri assemblyUri = new Uri(System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().GetName().CodeBase));
string assemblyDirectory = assemblyUri.LocalPath;

From a WinForm application it returned the same as above. From a Web application hosted by IIS it returned:

C:\Users\Matthew\Documents\Visual Studio 2008\Projects\WebSite1\PrecompiledWeb\WebSite1\bin

Note however, that this also returned a Local Temporary ASP.NET Files location when running under the Visual Studio ASP.NET Development Server, but this is to be expected. From an Excel Add-in it returned:

C:\Users\Matthew\Documents\Visual Studio 2008\Projects\FindAssemblyLocation\ExcelAddin1\bin\Debug

This method appears to work in WinForm applications, Console applications, Office Add-ins, and Web applications hosted in IIS.
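Since this is used from a shared library, it is worth wrapping the chain of calls in a small helper so it only appears once. A minimal sketch of how that could look (the class and method names here are just for illustration; it converts the full CodeBase to a Uri first and then takes the directory of the local path, which gives the same result):

using System;
using System.IO;
using System.Reflection;

public static class AssemblyLocator
{
    // Resolve the directory the assembly was deployed to. Using CodeBase
    // rather than Location means shadow-copied assemblies (ASP.NET, Office
    // add-ins) still report their original deployment directory.
    public static string GetAssemblyDirectory()
    {
        string codeBase = Assembly.GetExecutingAssembly().GetName().CodeBase;
        Uri codeBaseUri = new Uri(codeBase);
        return Path.GetDirectoryName(codeBaseUri.LocalPath);
    }
}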


WiX is Free… almost

Christopher Painter recently posted an interesting WiX article, WiX: Forced to use Beta Software, and now has a follow-up cleverly titled Pay No Attention To The Bugs Behind the Curtain. These are very good arguments that have been brought up, but I think there is another aspect being slightly overlooked: WiX is free open source software.

My experiences with open source projects have generally come with the issues Christopher has been raising: lack of support, unknown release quality, and unknown version compatibility. If Microsoft itself, and not just its employees in their own time, built WiX, many of these concerns might go away.

Being open source is also its strength. It is completely free, if you have the time. If WiX were a commercial product, it would have great difficulty competing against the current major MSI authoring tools. If you come across any showstopper bugs, you can fix them yourself, although this can be a time consuming, expensive process. If it does not suit your needs, you can make it suit them. You just need to look at what SharpDevelop did to the WiX MSBuild target for WiX v2 to support fragments in their IDE. The difference is that the developer remains in control. Whenever you use a third party product you are depending on the vendor to support it and fix any issues you find in a timely manner. This does not always happen and can leave you stranded.

Being forced to use beta software is a bit of an embellishment. I completely agree that as a setup developer you are stuck if you want to use Votive. How do I get around this? I do not use it. WiX v2 is great; Votive v2 is not. As long as you can hit a button and create a ready-to-ship MSI, that is all that matters. I simply have an empty C# project that contains the list of WiX files. In the MSBuild script for the project, I run the command line tools to generate my installer with WiX. The WiX release I used to develop that installer is checked into source control and referenced by the MSBuild task. You do not need to be concerned about which WiX release you are using as long as it creates the MSI you need for your project.

This only becomes an issue when you want to take advantage of new features or you come across a bug that you cannot work around. Here you have two options: see if it has been fixed in a later release, or fix it yourself. If you use a later release, then you have to put the MSI you have created through a complete quality assurance process. This is like any commercial product. If you fix it yourself, it will cost you time, but hopefully you will be able to estimate the timeframe required. This is not like any commercial product. For example, we have recently come across a showstopper bug in the .NET Framework 2.0 SP1 that is preventing us moving forward to .NET 3.5. The fix is known and has been raised with Microsoft, but we need to wait for their process to produce a fix. (A recent update informs us there will not be a hotfix release, because a workaround is available, which may not help us, because the code is in a 3rd party control.)

I am using WiX v2 and do not intend to move to WiX v3 until the schema has stabilised and it is given the go ahead. It would be too expensive for me to develop my installers in v3 and have to modify them heavily to work with later releases of v3. There are already plenty of modifications required to move from v2 to v3, so I only want to do this once. Having all the developers focused on v3 is an issue, since support for v2 is thin. The mailing list alleviates this, although many responses are "you can do this in v3", which is of no help. I would like to move to v3 for the great new features and the integration with Visual Studio, but that is not something I can do while the business depends on these installers. It was a risk moving to WiX v2 while it still was not finished. I had to choose a weekly release and work with that, until I hit a bug that was fixed in a later weekly release.

Now that WiX (Votive specifically) is being developed within the Visual Studio Rosario team, hopefully releases will be better supported, of higher quality and with better version compatibility. It is unlikely that I will move to WiX v3 for any commercial project before a release candidate of Rosario is available.


Performance Improvements

During initial development it is often fruitless to take performance into consideration. Focusing on good design, readability and maintainability is far more important. When performance improvements are looked at too early, time is usually spent in areas where little difference is actually made. Obviously inefficient code should always be avoided, but running performance testing at the end of the development cycle is a much more effective way of making gains.

Profiling tools are what make this approach to performance improvement possible. I was a little disappointed in the Visual Studio 2008 Profiler. It is an alright start, but just knowing which functions are taking the most time doesn’t always make it obvious what is slowing things down, especially when framework classes are included in the analysis. ANTS Profiler felt light, but had the features that made the task very simple. It can be set to only profile classes you have code for, and gives you the number of times each line is run and the time spent on each line. With this information it is very easy to see the offending code. Armed with the right tools, there were three performance improvement techniques that aided me in my last bottleneck hunt.

Firstly, combine loops. Avoid iterating over the same list twice. This one can sometimes be hard to see if the code is not well maintained. Surprisingly, it is also often hard to refactor without introducing errors, and does not give a great return on investment, so this is certainly an inefficiency that should be avoided during initial development.
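To illustrate with a contrived example (the Order type and its properties are made up for this sketch), two separate passes over the same list:

decimal totalPrice = 0;
foreach (Order order in orders)
{
    totalPrice += order.Price;
}

int shippedCount = 0;
foreach (Order order in orders)
{
    if (order.IsShipped)
    {
        shippedCount++;
    }
}

can be collapsed into a single pass that does both pieces of work:

decimal totalPrice = 0;
int shippedCount = 0;
foreach (Order order in orders)
{
    totalPrice += order.Price;
    if (order.IsShipped)
    {
        shippedCount++;
    }
}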

Secondly, avoid unnecessary exceptions. If you know a certain piece of code may throw an exception, test for that condition first if possible. Exception handling is far more expensive than a conditional check. This is one you should not be too concerned about while coding: exception handling is usually a more elegant way to write error handling code, especially if the exception is not expected to happen often. Do, however, avoid try/catch blocks within loops. The particular instance the profiler found for me was inside a nested loop over rows and columns, throwing and catching an exception on almost every iteration. Removing it improved the performance greatly.
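The actual code was different, but the shape of the change was along these lines (the column name and parsing here are invented for the sketch, and row is the current DataRow of the inner loop). The loop body went from something like:

int quantity;
try
{
    quantity = int.Parse(row["Quantity"].ToString());
}
catch (FormatException)
{
    quantity = 0;
}

to a conditional check that never throws:

int quantity;
if (!int.TryParse(row["Quantity"].ToString(), out quantity))
{
    // Default when the cell is not numeric, instead of catching an exception.
    quantity = 0;
}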

Thirdly, cache expensive calls. This is another improvement that is often not seen during development but, when it causes an issue, is highlighted by a profiler and easy to make. Calls to some functions or properties may do more work than you expect, like reading from the database. If these calls are made within a loop, make the call once outside the loop and store the result in a local variable. If the call is dependent on the index of the loop this can be more difficult; a simple way around it is to use a dictionary. For example, consider the code:

foreach (Student student in students)
{
    Console.WriteLine(RoomName(student.RoomId));
}

The function RoomName shows up in the profiler as taking a long time, since it actually performs a database query. Replacing it with a dictionary lookup results in:

Dictionary<int, string> roomsLookup = new Dictionary<int, string>();
foreach (Student student in students)
{
    if (!roomsLookup.ContainsKey(student.RoomId))
    {
        roomsLookup.Add(student.RoomId, RoomName(student.RoomId));
    }
    Console.WriteLine(roomsLookup[student.RoomId]);
}

This ensures that the expensive function is only called the minimum required number of times, and is otherwise replaced with a quick dictionary lookup.
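A small variation on the snippet above, which avoids the separate ContainsKey check and indexer lookup, is to use TryGetValue; the behaviour is otherwise the same:

Dictionary<int, string> roomsLookup = new Dictionary<int, string>();
foreach (Student student in students)
{
    string roomName;
    if (!roomsLookup.TryGetValue(student.RoomId, out roomName))
    {
        // Only make the expensive call the first time a room is seen.
        roomName = RoomName(student.RoomId);
        roomsLookup.Add(student.RoomId, roomName);
    }
    Console.WriteLine(roomName);
}

For more advanced performance improvements, read the Microsoft patterns & practices guides.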


WiX Installation for Excel Add-In

Using WiX for installation development provides a simple way to quickly build installers, while retaining the power to extend to the most difficult deployment scenarios. For the deployment requirements of an Excel add-in you should read Deploying Application-Level Add-ins on the MSDN site.

Firstly, you need to register your COM add-in. This can be done from the command line using the RegSvr32 executable; in WiX, all that is required is:

<File Id="Addin_Dll" Name="Addin.dll" Source="Addin.dll" KeyPath="yes" >
    <Class Id="{CLSID UUID}" Context="InprocServer32" Description="Addin" ThreadingModel="apartment" >
        <ProgId Id="Addin.Connect" Description="Connect Class" />
    </Class>
</File>

The Addin.dll here is a C++ COM component; a .NET assembly registered for COM interop can be handled similarly. The required registry keys are very simple to add with WiX:

<Registry Root="HKLM" Key="Software\Microsoft\Office\Excel\Addins\Addin.Connect" 
          Name="Description" Value="Description for Addin.Connect" Type="string" />
<Registry Root="HKLM" Key="Software\Microsoft\Office\Excel\Addins\Addin.Connect" 
          Name="FriendlyName" Value="Addin.Connect Friendly Name" Type="string" />
<Registry Root="HKLM" Key="Software\Microsoft\Office\Excel\Addins\Addin.Connect" 
          Name="LoadBehavior" Value="3" Type="integer" />

To register the add-in for just the current user on the computer, simply change the Root value to HKCU. It is also prudent to add a pre-installation condition that Excel is installed on the target machine:

<Property Id="P_EXCEL11INSTALLED">
    <RegistrySearch Id="SearchExcel11" Type="raw"
        Root="HKLM" Key="SOFTWARE\Microsoft\Office\11.0\Excel\InstallRoot" Name="Path" />
</Property>
<Condition Message="You must have Microsoft Office Excel 2003 installed to use this product.">
    P_EXCEL11INSTALLED
</Condition>

Adding these few blocks to your standard installation is all that is required in WiX to deploy an Office Excel Add-In.

VB.NET New Line Character in a String

Normally I work with C#, but recently I have had to do some VB.NET development. Today I came upon the issue of putting a new line character in a string. In C# this is no issue:

string.Format("Line 1: {0}\nLine 2: {1}", str1, str2);

In VB.NET this becomes a little awkward. In VB.NET you have many options for a new line character:

VB Constants: vbNewLine, vbCrLf
Character Function: Chr(13)
VB Control Chars: ControlChars.NewLine, ControlChars.CrLf
Environment Variable: Environment.NewLine

Using Environment.NewLine would be the recommended practice, as it returns an environment-specific string. In practice, however, it comes down to constants versus variables. The fix I saw suggested on various forums was to replace the \n in your string with one of the options above. This would result in either:

String.Format("Line 1: {0}" + Environment.NewLine + "Line 2: {1}", str1, str2)  
String.Format("Line 1: {0}{1}Line 2: {2}", str1, Environment.NewLine, str2)

Neither of these options satisfied me. Since I wanted to use the string in a resource file, that ruled out the first way, and the second is clumsy and error prone. There is a very simple solution to this issue, though: in the resource editor, while editing the string, press Shift+Enter to insert a new line.

If you don’t have a resource file set up in your project, make sure you have Refactor! for VB.NET installed. It is free, as I mentioned in my last post. Place your cursor on the string and press the Refactor! shortcut key, Ctrl+~, then select Extract String to Resource. This will move the string literal to the resource file and replace it in the code with a call to the resource. Name the resource string appropriately, and adjust your string in the resource editor. You can now use Shift+Enter to add a new line in your string, while also using the better programming practice of resource files.


Refactoring Bad Code

The code I have been refactoring has been causing me a bit of pain, as I hinted in my last post. I have refactored plenty of good and bad code before. This time however, I headed off in the wrong direction too quickly. Before long, I had myself tangled and had to revert to the original code and start again. Before I get into my approach, let’s review the tools.

Refactoring is not something that should be done by hand, as there are very good tools available. I usually make heavy use of the stock ones in Visual Studio Team System – Software Developer Edition. I am using VS 2008, but not much in the refactoring area seems to have changed since VS 2005. Unfortunately, these refactorings are not available for VB.NET, but Microsoft does recommend using Refactor! for VB.NET, which is available for free. I am fortunate to have access to the full versions of Refactor! Pro and ReSharper. I have actually uninstalled both of them, and just use the built-in VS refactorings. Whenever you see a smart tag, Shift+Alt+F10 is your friend.

Refactor! Pro, I find, has a very clean interface and interaction. Its principle of no modal dialogs works very well to avoid jumping to the mouse. ReSharper has dynamic compilation as you type and appears to have much smarter refactorings. On the negative side, Refactor! Pro has a bit of a delay before giving you the context menu of available refactorings, and nothing in it feels really advanced the way ReSharper does. ReSharper, though, is buggy, crashes often (both itself and VS), uses huge amounts of memory and slows VS down greatly. I feel it attempts to do a little too much, hijacking the IntelliSense, and it leaves VS crippled when uninstalled. Additionally, because of the dependency you build on these tools, I find myself feeling crippled when I have to work on another machine without them. I do have Refactor! for VB.NET installed, since there is no built-in VS option for VB and it is free, so I can install it on any machine I am working on. Although, if I had a choice, I would not opt for programming in VB. John Papa has an old but still relevant comparison if you want to read more.

Armed with Refactor! for VB.NET, my approach was what I would normally do, and what usually works well: find obvious blocks of code that can easily be pulled out using Extract Method. Doing this helps aid understanding of the code, and well-named methods become self-commenting code. In this function there were a few For Each loops, some repeated. Moving the contents of a loop into a method, so that the loop's start and finish can be seen on one screen, can often reveal possible optimizations that are not easily seen otherwise. While extracting a method from the initial state of this code I had seven parameters, half of them being refs and the others being outs. That makes for an extremely error prone function and leaves the code in more of a mess than it is already in. Extracting methods initially failed in this code due to the complete misuse of local variables.

The next Refactor! command to come to the rescue in this case was Move Declaration Near Reference. At the top of this function all the variables used (and not used) were declared. Not only that, but after a variable was used, instead of a new variable being created, it was simply reassigned a value that did not depend on its previous value. This creates an unnecessary dependency, requiring an Extract Method attempt to return the value even though it is not really used after the method. As in the example below, extracting the first loop requires iRow and iSum to be returned.

Dim iRow As DataRow
Dim iSum As Integer

For Each iRow In Table.Rows
    iSum += 1
Next
iSum = 0
For Each iRow In Table.Rows
    iSum += 2
Next

Reducing the scope of each variable to the smallest required segments these two tasks. Before performing an Extract Method, this code should be refactored to:

Dim iSum As Integer
For Each iRow As DataRow In Table.Rows
    iSum += 1
Next
Dim iSum As Integer = 0
For Each iRow As DataRow In Table.Rows
    iSum += 2
Next

This will raise a warning that the local variable ‘iSum’ is already declared in the current block. That will be fine once the code blocks have been extracted to methods. Declaring the variable again allows the refactoring tools to know that the variable is not required after the method. Meticulously performing this a great number of times, to reduce the scope of the variables and determine when a new declaration was required, enabled one particular method extraction to go from having a return value, 4 ref and 3 out parameters, to being a subroutine with no return value and no parameters.

In the example above you can also see the possibility of combining the two loops. This could also be done in the code I was working on, but it was not immediately obvious until the contents of the loops were safely extracted. When refactoring, be careful not to automatically assume the previous programmer did not know what they were doing. There must be a reason why certain things were done. Do not just delete a block of code because at first appearance it seems unnecessary; it is likely you do not fully understand what it is doing or attempting to do. Reviewing code after it is written can reveal a great deal of improvements that cannot easily be seen while you are in the process of writing it. It may just be an instance where the code built up over many iterations, building on technical debt, and was never reviewed as a whole, allowing for great improvements with a little refactoring.
