Writing High Performance .NET Code
Threading Support in .NET and Tips for Avoiding Common Threading Mistakes
Threading support in .NET is implemented in the System.Threading namespace. It provides the classes and functions needed to write multithreaded code, such as creating and destroying threads and synchronization primitives for atomic access. This namespace also exposes a class called ThreadPool that lets us use a pool of system-provided threads.
The ThreadPool handles thread creation and cleanup, recycling threads to minimize that overhead. The ThreadPool is also aware of other threads running in the process, such as GC threads, so it can adjust its thread-creation logic. A developer may not consider how many threads should be used, which is critical to proper performance; the ThreadPool has built-in heuristics that adjust the number of threads for you. It is recommended to use the thread pool when you are thinking about threading your application. ASP.NET already uses the ThreadPool for processing web requests.
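As a minimal sketch of handing work to the pool instead of creating a thread yourself, using the ThreadPool.QueueUserWorkItem API (the ManualResetEvent here is only there so the console program waits for the work item to finish):

```csharp
using System;
using System.Threading;

class PoolExample
{
    static void Main()
    {
        using (ManualResetEvent done = new ManualResetEvent(false))
        {
            // Queue the work item to a pool thread; no explicit thread creation.
            ThreadPool.QueueUserWorkItem(delegate(object state)
            {
                Console.WriteLine("On a pool thread: " +
                                  Thread.CurrentThread.IsThreadPoolThread);
                ((ManualResetEvent)state).Set();
            }, done);

            done.WaitOne(); // block until the pool thread signals completion
        }
    }
}
```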
I mentioned earlier that the ThreadPool automatically decides how many threads are needed for optimal performance. For ASP.NET (web) applications, you can also tune the thread settings in the machine.config file to reduce contention. Tune them this way only when the following conditions are true (2):
- You have available CPU
- Your application performs I/O bound operations
- The ASP.NET Applications\Requests In Application Queue performance counter indicates that requests are being queued
<processModel autoConfig="true"/> -> the default; the thread settings below are adjusted automatically
minFreeThreads="32" -> requests are queued if the total # of available threads falls below this number
minLocalRequestFreeThreads="32" -> requests from the local host are queued if the total # of available threads falls below this number
maxWorkerThreads="12" -> maximum # of worker threads in the thread pool. This is per CPU.
maxIoThreads="12" -> maximum # of I/O threads in the thread pool. This is per CPU.
minWorkerThreads="40" -> minimum # of worker threads available in the system at any time. This is for the entire system.
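Putting the attributes above together, a machine.config fragment might look like the following. The values are the illustrative ones from the list, not recommendations; note that autoConfig must be set to "false" for explicit thread settings to take effect, and that minFreeThreads and minLocalRequestFreeThreads live on the httpRuntime element rather than processModel:

```xml
<!-- machine.config fragment (illustrative values only) -->
<system.web>
  <processModel
      autoConfig="false"
      maxWorkerThreads="12"
      maxIoThreads="12"
      minWorkerThreads="40" />
  <httpRuntime
      minFreeThreads="32"
      minLocalRequestFreeThreads="32" />
</system.web>
```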
So, how does the formula work?
The number of worker threads = maxWorkerThreads * # of CPUs (cores) in the system - minFreeThreads = 12 * 4 - 32 = 16 (assuming you are running a 4-core machine). So the total number of concurrent requests you can process is 16. But an interesting question arises: how do you know this actually worked? Look at the "Pipeline Instance Count" performance counter. Only one worker thread can run in each pipeline instance, so you should see a value of 16.
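Outside of the performance counters, you can also inspect the pool's limits directly from code. A small sketch using the real ThreadPool.GetMaxThreads, GetMinThreads, and GetAvailableThreads APIs (the printed numbers will vary by machine and runtime version):

```csharp
using System;
using System.Threading;

class ThreadPoolInfo
{
    static void Main()
    {
        int maxWorker, maxIo, minWorker, minIo, availWorker, availIo;

        ThreadPool.GetMaxThreads(out maxWorker, out maxIo);
        ThreadPool.GetMinThreads(out minWorker, out minIo);
        ThreadPool.GetAvailableThreads(out availWorker, out availIo);

        Console.WriteLine("Max:       " + maxWorker + " worker / " + maxIo + " I/O");
        Console.WriteLine("Min:       " + minWorker + " worker / " + minIo + " I/O");
        Console.WriteLine("Available: " + availWorker + " worker / " + availIo + " I/O");
    }
}
```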
You have to be very careful when doing this, as performance may degrade if you use arbitrary values.
.NET threading APIs and the ThreadPool make a developer’s life easier, but there are still many threading-related issues that can hurt performance and scalability.
- Creating more or fewer threads than required can hurt performance. Use the ThreadPool to help you in this instance. Ideally, the number of threads equals the number of cores, which yields the highest performance because each thread can run concurrently on its own processor.
- Threading the wrong portion of the application: this is by far the biggest problem in threading. Analyze your application completely before deciding where to thread. To get a significant performance gain, you have to thread the portion of your code where the most time is spent.
- Multithreading also complicates debugging and introduces problems such as deadlocks and race conditions. Keep a good debug log (one you can enable in debug mode) to help solve these complex bugs.
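As a small illustration of the race conditions mentioned in the last point, the sketch below has four threads bump two counters: a plain ++ (a non-atomic read-modify-write, so updates can be lost) and Interlocked.Increment (atomic, so the total is always correct):

```csharp
using System;
using System.Threading;

class RaceDemo
{
    static int unsafeCount = 0;
    static int safeCount = 0;

    static void Main()
    {
        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(delegate()
            {
                for (int j = 0; j < 100000; j++)
                {
                    unsafeCount++;                        // not atomic: updates can be lost
                    Interlocked.Increment(ref safeCount); // atomic
                }
            });
            threads[i].Start();
        }
        foreach (Thread t in threads) t.Join();

        Console.WriteLine("unsafe: " + unsafeCount); // often less than 400000
        Console.WriteLine("safe:   " + safeCount);   // always 400000
    }
}
```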