Get Started with Multithreading in .NET

Have you ever built an application where users had to wait while the application performed some lengthy calculation or operation? Learn how to improve your application's responsiveness by creating and controlling threads.




The concept of threads is central to the inner workings of most operating systems, including Windows, yet relatively few programmers know what threads are, let alone how to take advantage of them. Understanding threads, and using a modern operating system's multithreading capabilities properly, is a fundamental step toward creating fast, responsive applications. The .NET Framework makes creating multi-threaded applications easier than ever. In this article you'll see what threads are, how threading works, and how you can use threads in your applications.

Why Multithreading?
To understand the power of multithreading you need to know something about how the Windows operating system works under the hood. Windows is a preemptive multitasking operating system. The system CPU can do only one thing at a time, but to give the illusion that multiple processes are running simultaneously the operating system splits the CPU time between the various running processes.

The term preemptive means that the operating system determines when each task executes and for how long. Preemptive multitasking differs from cooperative multitasking where each task decides how long it will execute. Preemptive multitasking prevents one task from taking up all the processor's time. For example, you might think you can edit a word processing document, print a spreadsheet, and download an MP3 file from the Web all at the same time, but in actuality, the operating system allocates small "slices" of CPU time to each of these processes. Because the CPU is very fast, humans are very slow, and the time slices can be extremely small, all three processes appear to run simultaneously.

On a multi-processor system things are a bit more complicated but the basic idea is the same—the operating system divides the time of the CPUs among the processes that need it. Figure 1 illustrates the principle of multitasking.

Figure 1: A multitasking operating system divides the CPU's time between all running processes.
Each process has a priority that determines how the operating system allocates CPU time to it relative to other processes. Schemes for allocating process priorities differ among operating systems; Windows 2000 defines four priority levels. In descending order, these priorities are:


  • Real time: The highest priority. These processes preempt all processes with lower priority. Real time priority is reserved for processes that cannot suffer even minor interruptions, such as streaming video and games with complex graphics.
  • High priority: Used for time-critical processes that must execute immediately (or almost immediately) for proper operating system functionality. The Windows Task List is an example of a high-priority process. It is assigned high priority because it must display immediately when requested by the user regardless of anything else the operating system is doing. Only real-time priority threads can interrupt a high priority thread.
  • Normal priority: Used for processes that have no special CPU scheduling needs. Word processing and background printing are examples of normal priority processes.
  • Idle priority: Used for processes that run only when the system is otherwise idle. A screen saver is a good example of this priority level.
These priority levels are the ones defined by the Windows 2000 operating system. When you assign a priority to a thread in .NET the names are different, but the meanings map closely to the levels described above (the short example below shows the .NET names in use).

Here's where the concept of threads becomes important. A thread is the unit, or entity, to which the operating system assigns CPU time. Normally a program, or process, is associated with a single thread. The process must accomplish everything it needs to do (drawing the user interface, responding to user input, writing files, calculating results, everything) during the CPU time that Windows allocates to its single thread. For most applications this works just fine.
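Here is that example: a minimal C# sketch (the DoBackgroundWork method is simply an invented placeholder) that creates a thread and assigns it one of the values from the System.Threading.ThreadPriority enumeration:

using System;
using System.Threading;

class PriorityDemo
{
    static void Main()
    {
        // Create a thread that will run the worker method below.
        Thread worker = new Thread(new ThreadStart(DoBackgroundWork));

        // .NET names its priorities Highest, AboveNormal, Normal,
        // BelowNormal, and Lowest rather than using the Windows terms.
        worker.Priority = ThreadPriority.BelowNormal;

        worker.Start();
        worker.Join();   // wait for the worker thread to finish
    }

    // Invented placeholder for some background work.
    static void DoBackgroundWork()
    {
        Console.WriteLine("Worker running at priority: " +
                          Thread.CurrentThread.Priority);
    }
}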

Windows also supports multi-threaded processes. As the name suggests, a multi-threaded process is one in which the program divides its tasks among two or more separate threads. Because Windows allocates processor time to threads and not to processes, a multi-threaded process gets more than the usual share of CPU time. So is the purpose of multithreading simply to speed up a program? No, although it can have that effect, particularly on a multiprocessor system. The most important use of multithreading has to do with the program's responsiveness to users. There are few things more frustrating to users than a program that does not respond immediately to mouse and keyboard input. Yet when a single-threaded program is busy with lengthy calculations or I/O, that is exactly what happens. The program's one thread must handle both the user interaction and the calculations, which often causes the user interaction to become sluggish.

Therefore, when your program must respond to user input at (perceptually) the same time that it is performing one or more other tasks, you should consider multithreading. By assigning calculations or I/O to one thread, and user interaction to another thread, you can create a program that responds to user actions efficiently while also performing the required data processing, as the sketch below illustrates.
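To make the idea concrete, here is a minimal C# sketch, with the lengthy work simulated by a Sleep call, in which the calculation runs on its own thread while the main thread remains free for other work:

using System;
using System.Threading;

class ResponsiveDemo
{
    static void Main()
    {
        // Put the lengthy work on its own thread so the main thread
        // stays free to respond to the user.
        Thread calcThread = new Thread(new ThreadStart(LengthyCalculation));
        calcThread.Start();

        // The main thread can continue handling user interaction here.
        Console.WriteLine("Main thread is still responsive while the calculation runs.");

        calcThread.Join();   // wait for the calculation before exiting
    }

    // Stand-in for a long-running calculation or I/O operation.
    static void LengthyCalculation()
    {
        Thread.Sleep(3000);  // simulate three seconds of work
        Console.WriteLine("Calculation finished.");
    }
}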
There's Always a Downside
Using multiple threads is not all roses, however. As with almost everything else in life, a penalty accompanies the advantages. There are several factors you should consider.

First, keeping track of and switching between threads consumes memory resources and CPU time. Each time the CPU switches to another thread, the state of the current thread must be saved (so it can be resumed later) and the saved state of the new thread must be restored. With too many threads, any responsiveness advantages you hoped to gain may be partially nullified by the extra load placed on the system.

Second, programming with multiple threads can be complex. Creating a single extra thread to handle some background calculations is fairly straightforward, but implementing many threads is a demanding task and can be the source of many hard-to-find bugs. My approach to these potential problems is to use multithreading only when it provides a clear advantage, and then to use as few threads as possible.

Third is the question of shared resources. Because they're running in the same process, the threads of a multi-threaded program have access to that process's resources, including global, static, and instance fields. Threads may also need to share other resources such as communications ports and file handles. In most multi-threaded applications you must synchronize the threads to prevent conflicts over those shared resources, and careless synchronization can itself cause problems such as deadlock (where two threads each stop and wait for the other, so neither ever proceeds). For example, suppose that Thread A is responsible for obtaining data over a network, and Thread B is responsible for performing calculations with that data. It might seem like a good idea to require Thread A to wait for Thread B to complete (so the data is not updated in the middle of a calculation), and also to require Thread B to wait for Thread A to complete (so the latest data is used in the calculations). If coded improperly, the result is two threads that never execute. The .NET Framework provides classes to control thread synchronization, but even so, multithreading introduces another level of programming complexity.
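As a small illustration of such synchronization (the field and method names here are invented, and C#'s lock statement, which is built on the Framework's Monitor class, is only one of the synchronization tools available), the following sketch lets two threads share a field without either one reading it in a half-updated state:

using System;
using System.Threading;

class SharedDataDemo
{
    // Data shared by both threads; latestData stands in for whatever
    // the network thread produces.
    static int latestData;
    static readonly object dataLock = new object();

    static void Main()
    {
        Thread producer = new Thread(new ThreadStart(ReceiveData));    // "Thread A"
        Thread consumer = new Thread(new ThreadStart(CalculateData));  // "Thread B"
        producer.Start();
        consumer.Start();
        producer.Join();
        consumer.Join();
    }

    // Simulates Thread A obtaining new data periodically.
    static void ReceiveData()
    {
        for (int i = 1; i <= 5; i++)
        {
            lock (dataLock)          // only one thread touches the data at a time
            {
                latestData = i;
            }
            Thread.Sleep(100);
        }
    }

    // Simulates Thread B performing calculations with the data.
    static void CalculateData()
    {
        for (int i = 0; i < 5; i++)
        {
            int snapshot;
            lock (dataLock)          // read a consistent copy of the data
            {
                snapshot = latestData;
            }
            Console.WriteLine("Calculating with data: " + snapshot);
            Thread.Sleep(100);
        }
    }
}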


