Asynchronous Programming
- Part 1: How Processors Work
Published on 27 September 2023
Processors are capable of executing programs asynchronously. In this article, we will explain how this process works, why it is essential, and its implications for programming.
Asynchrony, multithreading, and parallelism are fundamental aspects of programming, playing a pivotal role in modern computing.
The C# programming language equips developers with numerous tools to create applications that efficiently utilize multiple threads, enabling parallel and asynchronous execution. This very capability is one of the driving forces behind the popularity of the C# language and the primary reason it was chosen as the focus of this article series.
Understanding these concepts may seem daunting at first, so we'll begin with an introductory look at how computer processors carry out operations.
How Processors Operate
In a given unit of time, known as a "tick," a processor can perform only one task, such as reading the value of a memory cell. Consequently, the following operations will require four ticks to complete:
· Read data from two memory cells.
· Perform an addition operation.
· Write the result to another memory cell.
This applies only to atomic operations: the smallest, indivisible operations a processor can perform. More complex tasks are composed of multiple atomic operations. For instance, multiplying number A by number B involves adding number A to itself B - 1 times:
5 * 5 = 5 + 5 + 5 + 5 + 5
Division requires even more ticks, and when working with floating-point numbers, the number of required ticks can be staggering.
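To make the idea concrete, here is a tiny C# sketch of multiplication built from repeated addition. Real processors use dedicated circuitry for this, so treat it purely as an illustration of the principle:

static int Multiply(int a, int b)
{
    int result = 0;
    for (int i = 0; i < b; i++) // add a to the running total b times
    {
        result += a;
    }
    return result; // Multiply(5, 5) == 25
}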
Currently, you can have numerous tabs open in your browser, run a media player, use a messaging app, edit code, and more, seemingly all at once. It's intriguing that none of these tasks genuinely occur simultaneously.
Despite the processor's ability to execute only one task at a time, engineers and programmers have devised techniques to allocate its processing time efficiently, allowing it to switch between tasks swiftly. The processor's capability to execute a vast number of operations in a second makes it imperceptible to users that these operations occur sequentially.
The number of ticks per second is measured in hertz (Hz), the unit of frequency for periodic processes. For instance, if a racing car passes by your house once every second, its frequency is 1 Hz. If it passes twice per second, the frequency is 2 Hz, and so on.
Processors operate at such a rapid pace that their frequencies are measured in gigahertz:
1 GHz = 1,000,000,000 Hz
Modern processors typically operate at frequencies of 2-3 GHz.
How Processors Execute Programs
Every program consists of a great many operations: additions, writes to memory cells, multiplications, divisions, and so on.
Executing these operations strictly one after another would make programs inconvenient to use. For instance, clicking the download button in your browser would lock up your computer until the download completes. Ideally, you should at least be able to monitor the download progress.
In this case, the program code might resemble the following:
Run the loop while there are packages to download:
    Download the package to temporary storage;
    Move it from temporary storage to a cell at address X;
    Calculate what percentage of the packages has been downloaded;
    Update the download bar;
End of the loop.
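In C#, a blocking version of that loop might look roughly like this (assuming using System and using System.Threading; the Thread.Sleep call stands in for the real network and disk work, so this is only a sketch of the structure):

// Everything happens on one thread: while this method runs, nothing else
// in the program (music, mouse tracking, interface updates) can execute.
static void DownloadAll(int packageCount)
{
    for (int i = 1; i <= packageCount; i++)
    {
        Thread.Sleep(500);                            // pretend to download one package
        int percent = i * 100 / packageCount;         // calculate the progress
        Console.WriteLine($"Downloaded {percent} %"); // update the download bar
    }
}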
While this loop is running, you won't be able to perform any tasks outside of it. As you can see, there are no lines for playing music, and not even a line for tracking mouse movements.
To enhance the usability of computers, we segregate all concurrently running programs into threads. For example, if we have 10 programs running and the processor operates at a rate of 100 ticks per second, each thread will be allocated 10 ticks. In essence, the processor executes instructions from one thread for 10 ticks and then switches to the instructions of another thread, and so on in a cyclical fashion. Furthermore, each thread possesses a priority; more critical programs receive a higher allocation of ticks.
Within your operating system, every active program is executed within its own thread. This occurs seamlessly because it is the inherent design of the operating system. However, when you develop a program, you also have the capability to create new threads. This empowers you to craft applications that offer a more user-friendly experience.
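In C#, for example, a new thread can be created with the System.Threading.Thread class, and its priority can be adjusted to tell the scheduler how urgent its work is. Here is a minimal sketch:

using System;
using System.Threading;

class ThreadDemo
{
    static void Main()
    {
        // Create a second thread that does some background work.
        Thread worker = new Thread(() =>
        {
            for (int i = 0; i < 5; i++)
            {
                Console.WriteLine($"Background step {i}");
                Thread.Sleep(200);
            }
        });

        worker.Priority = ThreadPriority.BelowNormal; // hint to the scheduler: this work is less urgent
        worker.Start();

        // The main thread keeps running at the same time.
        Console.WriteLine("Main thread is still responsive.");
        worker.Join(); // wait for the background thread to finish before exiting
    }
}

Here the main thread and the worker thread take turns on the processor, exactly as described above.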
When Asynchrony is Necessary
Asynchrony is most commonly needed in programs with a graphical interface. In such programs, the main logic and the rendering of the interface are separated into distinct threads, so even while the logic is busy, the application remains responsive.
If all these tasks were performed within a single thread, the application would become unresponsive during the execution of complex instructions. In Windows OS, it's common to observe that when an application is in the midst of a task, clicking on it may display the message "Not Responding" in the window's title bar.
This message doesn't always indicate that the application has frozen; it could simply be occupied with processing a complex task within the same thread.
Example of an Asynchronous Application
To illustrate this concept further, let's create an application that utilizes asynchrony. It will be divided into two threads: the first will update the download progress, and the second will handle the actual downloading process.
using System;
using System.Threading;
using System.Threading.Tasks;
using System.Text;

namespace Async
{
    class Program
    {
        static int full = 100;
        static int completed = 0;
        static int state = 0;
        static char[] cursors = new char[] { '-', '/', '|', '\\' };

        static void Main(string[] args)
        {
            LoadAsync();     // Start the download
            UpdateLoading(); // Start updating the progress bar
            Console.ReadKey();
            Console.WriteLine();
        }

        // Every 100 milliseconds, erase the old progress bar and print a new one
        static void UpdateLoading()
        {
            while (completed < full)
            {
                Console.Clear();
                state++;
                if (state == 4)
                {
                    state = 0;
                }
                string loadingBar = GetLoadingString();
                Console.WriteLine(loadingBar + " " + cursors[state]);
                Thread.Sleep(100);
            }
            Console.Clear();
            Console.WriteLine(GetLoadingString()); // Print the finished bar one last time
        }

        // Builds the text progress bar
        static string GetLoadingString()
        {
            StringBuilder loadingBar = new StringBuilder("[");
            for (int i = 0; i < full; i++)
            {
                if (i < completed)
                {
                    loadingBar.Append("#");
                }
                else
                {
                    loadingBar.Append(".");
                }
            }
            loadingBar.Append($"] {completed} %");
            return loadingBar.ToString();
        }

        // Asynchronous method that runs Load() on a separate thread
        static async void LoadAsync()
        {
            await Task.Run(() => Load()); // Wait for Load() without blocking the main thread
        }

        // The "download" itself: adds one to the completed value every 500 milliseconds
        static void Load()
        {
            for (int i = 0; i < full; i++)
            {
                completed++;
                Thread.Sleep(500);
            }
        }
    }
}
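When this program runs, Load() executes on a thread-pool thread started by Task.Run, while UpdateLoading() keeps redrawing the progress bar on the main thread, which is why the bar stays animated while the "download" is still in progress. One note on the design: async void is used here only to keep the example short; outside of event handlers it is generally better for an asynchronous method to return Task so that callers can await it and observe any exceptions.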
Conclusion
Asynchrony, multithreading, and parallelism are incredibly powerful and valuable tools that enhance our daily lives. Nevertheless, they come with their fair share of potential pitfalls, some of which can lead to catastrophic consequences. A separate article in this series will delve into these potential dangers.
Due to the intricacies of these tools, it will require more than a single article to provide a comprehensive understanding. My goal is not only to demonstrate how to effectively utilize them but also to foster a deeper comprehension of the underlying principles behind their functionality.