During initial development it is often fruitless to take performance into consideration. Focusing on good design, readability and maintainability is far more important. When performance improvements are attempted too early, time is usually spent in areas where they make little actual difference. Obviously inefficient code should always be avoided, but running performance testing at the end of the development cycle is a much more effective way of making gains.
Profiling tools are what make this approach to performance improvement possible. I was a little disappointed in the Visual Studio 2008 Profiler. It is an alright start, but just knowing which functions take the most time does not always make it obvious what is slowing things down, especially when framework classes are included in the analysis. ANTS Profiler felt lightweight, but had the features that made the task very simple. It can be set to only profile classes you have source code for, and it gives you the number of times each line is run and the time spent on each line. With this information it is very easy to spot the offending code. Armed with the right tools, there were three performance improvement techniques that aided me in my last bottleneck hunt.
Firstly, combine loops. Avoid iterating over the same list twice. This one can sometimes be hard to see if the code is not well maintained. Surprisingly, it is also often hard to refactor without introducing errors, and it does not give a great return on investment, so this is an inefficiency that is better avoided during initial development.
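A minimal sketch of the idea, using a hypothetical Student class with made-up Fees and Year properties (the method names are also just for illustration):

using System;
using System.Collections.Generic;

class Student
{
    public decimal Fees { get; set; }   // hypothetical property for illustration
    public int Year { get; set; }       // hypothetical property for illustration
}

class LoopExample
{
    // Two separate passes over the same list: each pass is cheap,
    // but the list is walked twice.
    static void TwoPasses(List<Student> students)
    {
        decimal totalFees = 0;
        foreach (Student student in students)
        {
            totalFees += student.Fees;
        }

        int seniorCount = 0;
        foreach (Student student in students)
        {
            if (student.Year >= 12)
            {
                seniorCount++;
            }
        }

        Console.WriteLine("{0} fees, {1} seniors", totalFees, seniorCount);
    }

    // Combined into a single pass: both results are gathered in one loop.
    static void OnePass(List<Student> students)
    {
        decimal totalFees = 0;
        int seniorCount = 0;
        foreach (Student student in students)
        {
            totalFees += student.Fees;
            if (student.Year >= 12)
            {
                seniorCount++;
            }
        }

        Console.WriteLine("{0} fees, {1} seniors", totalFees, seniorCount);
    }
}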
Secondly, avoid unnecessary exceptions. If you know a certain piece of code may throw an exception, test for that condition first where possible. Exception handling is far more expensive than a conditional check. This is not one you should be concerned about while coding, since exception handling is usually a more elegant way to write error handling code, especially if the exception is not expected to occur often. However, avoid try-catch blocks within loops. The particular instance I found with the profiler was inside a nested loop over rows and columns that was throwing and catching an exception on almost every iteration. Removing it improved the performance greatly.
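This is not the code from that project, just a small sketch of the pattern, using parsing as a stand-in for any operation that can fail (the class and method names are made up):

using System;
using System.Collections.Generic;

class ParseExample
{
    // Throwing and catching inside the loop: every bad value pays
    // the full cost of an exception.
    static int SumWithExceptions(List<string> values)
    {
        int total = 0;
        foreach (string value in values)
        {
            try
            {
                total += int.Parse(value);
            }
            catch (FormatException)
            {
                // ignore values that are not numbers
            }
        }
        return total;
    }

    // Testing the condition first: int.TryParse reports failure
    // without throwing, so bad values are cheap to skip.
    static int SumWithTryParse(List<string> values)
    {
        int total = 0;
        foreach (string value in values)
        {
            int parsed;
            if (int.TryParse(value, out parsed))
            {
                total += parsed;
            }
        }
        return total;
    }
}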
Thirdly, cache expensive calls. This is another improvement that is often not seen during development, but when it causes an issue it is highlighted by a profiler and easy to fix. Calls to some functions or properties may do more work than you expect, such as reading from the database. If these calls are made within a loop, make the call once outside the loop and store the result in a local variable. If the call is dependent on the loop variable this can be more difficult; a simple way around this is to use a dictionary. For example, consider the code:
foreach (Student student in students)
{
    Console.WriteLine(RoomName(student.RoomId));
}
The RoomName function shows up in the profiler as taking a long time, since it actually performs a database query. Replacing it with a dictionary lookup results in:
Dictionary<int, string> roomsLookup = new Dictionary<int, string>();
foreach (Student student in students)
{
    if (!roomsLookup.ContainsKey(student.RoomId))
    {
        roomsLookup.Add(student.RoomId, RoomName(student.RoomId));
    }
    Console.WriteLine(roomsLookup[student.RoomId]);
}
This ensures that the expensive function is called only the minimum required number of times, with repeat calls replaced by a quick dictionary lookup. For more advanced performance improvements, read the Microsoft patterns & practices guides.