When we apply multithreading to increase the speed and efficiency of our applications, synchronization plays an essential role for it to be effective. It maintains order when multiple threads try accessing the same resource. In this article, we’ll explore different synchronization mechanisms. We’ll learn how they help us avoid problems like race conditions and deadlocks.

To download the source code for this article, you can visit our GitHub repository.

Let’s begin.


Multithreading in C#

Before diving into the synchronization mechanisms, let’s take a quick look at multithreading. 

In programming, multithreading allows us to execute multiple threads concurrently. Each of these threads can perform different tasks at the same time. This parallel execution of threads helps our program run faster and be more responsive.

It introduces concurrency into our applications, and we can utilize it to make an application asynchronous. However, it’s important to note that multithreading by default doesn’t make a program asynchronous. We need to explicitly use techniques such as async/await to run tasks independently and achieve asynchrony.
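As a quick illustration, we can offload independent pieces of work to thread-pool threads with Task.Run and await their results (this minimal console sketch is our own; the class name MultithreadingDemo is illustrative, not part of the article's sample project):

```csharp
using System;
using System.Threading.Tasks;

public static class MultithreadingDemo
{
    public static async Task Main()
    {
        // Two independent pieces of work running on thread-pool threads
        var sumTask = Task.Run(() =>
        {
            var sum = 0;
            for (var i = 1; i <= 100; i++) sum += i;
            return sum;
        });

        var lengthTask = Task.Run(() => "multithreading".Length);

        // await keeps the calling thread responsive while both tasks run
        Console.WriteLine($"Sum: {await sumTask}, Length: {await lengthTask}");
    }
}
```

Both tasks run concurrently, and awaiting them yields `Sum: 5050, Length: 14`.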

For efficient utilization, threads share resources such as memory, reducing the overall consumption. However, this sharing also opens us up to a set of challenges:

  • In a race condition, multiple threads access shared data simultaneously and interfere with each other, leading to unpredictable results
  • In a deadlock, threads wait indefinitely for each other to release resources, causing the application to freeze
  • With multiple threads interacting with shared resources, the application becomes harder to maintain and debug

Need for Synchronization in a Multithreaded Environment

Synchronization is essential in a multithreaded program. If we allow multiple threads to access and make changes in shared resources simultaneously, it’d result in a chaotic mess.

We might end up in a situation where while a thread is making modifications, another thread changes the value before the first thread can commit its change.

Let’s understand this with an Account class:

public class Account
{
    private int _balance;
    
    public int Balance
    {
        get => _balance;
        set => _balance = value;
    }

    public void Withdraw(int amount)
    {
        _balance -= amount;
    }
}

This class represents a bank account with a Balance property and a Withdraw() method to withdraw an amount from the balance.

Concurrency and Race Condition

Let’s simulate a scenario where we have multiple tasks trying to withdraw balance:

public static void WithdrawBalance(Action<int> withdrawalAction)
{
    using var synch = new ManualResetEventSlim(false);

    var tasks = new Task[1000];

    for (var i = 0; i < tasks.Length; i++)
    {
        tasks[i] = Task.Run(() =>
        {
            synch.Wait();
            Thread.Sleep(Random.Shared.Next(50, 300));
            withdrawalAction(100);
        });
    }

    synch.Set();
    Task.WaitAll(tasks);
}

Here, we have a WithdrawBalance() method that takes an Action delegate as a parameter. The random delay before invoking the referenced method ensures that threads take different times to execute it, hence introducing some interleaving.

We use the ManualResetEventSlim class to synchronize the starting point of all tasks: they all wait until we call synch.Set(). When we signal the tasks to start, all 1,000 tasks begin executing simultaneously.

Let’s use it to call the Withdraw() method of the Account class:

var account = new Account
{
    Balance = 100000
};

Console.WriteLine("Withdrawal without synchronization");
AccountService.WithdrawBalance(account.Withdraw);
Console.WriteLine($"Final balance: {account.Balance}");

We set the initial balance to 100,000 and withdraw 100 concurrently across the 1,000 tasks using the Withdraw() method, which should leave a final balance of 0.

This is where we might encounter a race condition. Since multiple tasks are trying to withdraw simultaneously, several of them might see a balance of 100,000 and subtract 100 from it at the same time. Suppose the first task reads the balance and subtracts 100, intending to update the balance to 99,900. However, before the first task can commit its update, the second task reads the balance, which is still 100,000 because the first task hasn’t updated it yet. Thus, the second task also subtracts 100 from 100,000 (instead of 99,900) and updates the balance to 99,900 (instead of 99,800).

Let’s execute the application and see the race condition in action:

Withdrawal without synchronization
Final balance: 300

Here, we are left with 300 as the final account balance instead of the expected 0. However, even this balance may keep changing on each run based on the execution order of the threads.

This leads to the final balance being unpredictable. Thus, we need synchronization mechanisms to effectively handle concurrent executions in a multithreaded environment.

Let’s look at these synchronization mechanisms.

Ensuring Data Visibility With the Volatile Keyword

The volatile keyword ensures the visibility of a variable to all threads. In simple terms, when we mark a field volatile, we declare to the compiler that multiple threads will access this specific field and that it therefore must not apply certain memory-reordering optimizations to it.

When a thread reads a volatile field, it reads the most recent value written to that field, as the compiler doesn’t move later memory operations to appear before the read. Similarly, when writing to a volatile field, the compiler ensures that earlier memory operations don’t get reordered to appear after the write.

This ensures that a change made by one thread is visible to all the threads at all times. 
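A typical scenario where this visibility guarantee is all we need is a stop flag polled by a worker thread. Here is a minimal sketch (the Worker class is our own illustration, not part of the article's Account example):

```csharp
using System;
using System.Threading;

public class Worker
{
    // Marked volatile so the polling loop always observes the latest value,
    // instead of a stale copy cached by the JIT or CPU
    private volatile bool _shouldStop;

    public void RequestStop() => _shouldStop = true;

    public void DoWork()
    {
        while (!_shouldStop)
        {
            Thread.Sleep(10); // simulate a unit of work
        }
        Console.WriteLine("Worker stopped.");
    }
}

public static class Program
{
    public static void Main()
    {
        var worker = new Worker();
        var thread = new Thread(worker.DoWork);
        thread.Start();

        Thread.Sleep(100);
        worker.RequestStop(); // the write becomes visible to the worker thread
        thread.Join();
    }
}
```

Here, only one thread writes the flag and the other only reads it, so visibility alone is enough and no lock is required.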

Let’s see how that can affect our Account class:

public class Account
{
    private volatile int _balanceVolatile;
    
    public int BalanceVolatile
    {
        get => _balanceVolatile;
        set => _balanceVolatile = value;
    }
    
    public void WithdrawVolatile(int amount)
    {
        _balanceVolatile -= amount;
    }    
}

As we can see, the class doesn’t change much apart from marking the field representing the account balance as volatile.

Synchronization With Volatile

We’ll continue to use the WithdrawBalance() method to simulate the withdrawal of balance. Now, we might wonder if the race condition we had previously gets resolved with volatile:

var account = new Account
{
    BalanceVolatile = 100000    
};

Console.WriteLine("Withdrawal with the 'volatile' keyword");
AccountService.WithdrawBalance(account.WithdrawVolatile);
Console.WriteLine($"Final balance: {account.BalanceVolatile}");

After all, the _balanceVolatile field value is immediately visible to all the threads now. So, a thread would know the field’s value as soon as another updates it. However, the answer is no:

Withdrawal with the 'volatile' keyword
Final balance: 200

Similar to the example without synchronization, the final balance remains unpredictable with the volatile keyword.

The volatile keyword guarantees visibility. However, it doesn’t inherently prevent race conditions as it doesn’t guarantee an uninterrupted read/write operation. Multiple tasks can still modify the balance concurrently and thus we are still left with our problem of the correct balance not getting updated.

Another limitation of volatile is that while it works fine for smaller data types like int, bool, or char, it doesn’t support larger data types like long, double, or decimal. That’s because these types take up more than 32 bits of memory, which exceeds the word size of a 32-bit CPU, so a single read or write may require multiple CPU instructions. This makes it difficult to guarantee atomicity for these types.
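In fact, the compiler rejects a volatile long field outright. A common workaround, sketched below with an illustrative Counter64 class of our own, is to keep the field non-volatile and go through the Interlocked class for atomic 64-bit access:

```csharp
using System;
using System.Threading;

public class Counter64
{
    // private volatile long _total;  // not allowed: long fields can't be volatile
    private long _total;

    public void Add(long value) => Interlocked.Add(ref _total, value);

    // Interlocked.Read guarantees an atomic 64-bit read, even on 32-bit platforms
    public long Total => Interlocked.Read(ref _total);
}

public static class Program
{
    public static void Main()
    {
        var counter = new Counter64();
        counter.Add(5_000_000_000); // larger than int.MaxValue
        Console.WriteLine(counter.Total);
    }
}
```

This way, a 64-bit value is never observed half-written, which a plain long field can’t promise on 32-bit platforms.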

Now, let’s see how to solve this problem.

Mutual Exclusion With Locks

In C#, the lock keyword provides us with a mechanism that allows only one thread to execute a particular section of code at a time. Once a thread acquires a lock on a shared resource, all other threads trying to access the resource must wait until the former thread releases the lock.

Let’s make the Account class thread-safe:

public class Account
{
    private int _balanceLock;
    private readonly object _lock = new object();
    private readonly int _delay = new Random().Next(0, 100);

    public int BalanceLock
    {
        get
        {
            lock (_lock)
            {
                return _balanceLock;
            }
        }
        set
        {
            lock (_lock)
            {
                _balanceLock = value;
            }
        }
    }

    public void WithdrawLock(int amount)
    {
        lock (_lock)
        {
            Thread.Sleep(_delay);
            _balanceLock -= amount;
        }
    }
}

The _lock object we introduce here acts as the synchronization object. It ensures that all code sections within the lock statement are accessible to only one thread at a time.

Synchronization With Lock

We’ve used the lock statement in the BalanceLock property as well as the WithdrawLock() method. This ensures that all thread executions related to the _balanceLock field are mutually exclusive.

Now, let’s try to perform the withdrawal with multiple threads concurrently:

var account = new Account
{
    BalanceLock = 100000
};

Console.WriteLine("Withdrawal with the 'lock' statement");
AccountService.WithdrawBalance(account.WithdrawLock);
Console.WriteLine($"Final balance: {account.BalanceLock}");

When the first task accesses the _balanceLock field with a value of 100,000, it acquires a lock over it. Hence, the second task can’t access it till the first task is done with the modification and releases the lock.

Once the second task gets hold of the field, the balance is already updated to 99,900. Thus, we avoid any race conditions here:

Withdrawal with the 'lock' statement
Final balance: 0

As expected, the final balance is the expected amount of 0. The lock statement guarantees the mutual exclusion of the threads. Hence, the final balance of the Account object remains consistent.

It is important to note that the lock statement is “Task agnostic”, i.e., it doesn’t distinguish between threads originating from multiple tasks or threads created with the Thread class. When a thread enters the code under the lock statement, it gets an exclusive lock meaning no other thread can enter the code till the former exits.

Now, while the lock statement is great for avoiding race conditions, it’s not always the most efficient solution. When dealing with simple operations on shared resources, we can use the Interlocked class and get rid of the overhead of acquiring and releasing locks.

Atomic Operations With the Interlocked Class

The Interlocked class provides us with a set of atomic operations for updating the shared resources in a multithreaded environment. With these operations, we no longer need to acquire explicit locks like with the lock statement.

Some of the commonly used atomic operations allow us to:

  • Increment or decrement value of the variable using the Interlocked.Increment() and Interlocked.Decrement() methods
  • Add or subtract a specified value from a variable using the Interlocked.Add() method 
  • Replace the value of a variable with a new value using the Interlocked.Exchange() method
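Before applying these to the Account class, here is a quick standalone sketch of the methods listed above (single-threaded, just to show what each call does and returns):

```csharp
using System;
using System.Threading;

public static class InterlockedDemo
{
    public static void Main()
    {
        var value = 10;

        Interlocked.Increment(ref value);   // value is now 11
        Interlocked.Decrement(ref value);   // back to 10
        Interlocked.Add(ref value, 15);     // value is now 25

        // Exchange stores the new value and returns the old one
        var previous = Interlocked.Exchange(ref value, 100);

        Console.WriteLine($"Previous: {previous}, Current: {value}");
    }
}
```

Each of these calls reads, modifies, and writes the variable as one indivisible operation, which is what makes them safe under concurrency.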

Let’s modify the Account class to use Interlocked operations:

public class Account
{
    private int _balanceInterlocked;
    private readonly int _delay = new Random().Next(0, 100);
    
    public int BalanceInterlocked
    {
        get => Interlocked.CompareExchange(ref _balanceInterlocked, 0, 0);
        set => Interlocked.Exchange(ref _balanceInterlocked, value);
    }

    public void WithdrawInterlocked(int amount)
    {
        Thread.Sleep(_delay);
        Interlocked.Add(ref _balanceInterlocked, -amount);
    }
}

Here, we no longer need to acquire explicit locks, which makes our code more concise. Avoiding the locking overhead also makes the code more efficient. Note the getter: Interlocked.CompareExchange(ref _balanceInterlocked, 0, 0) performs an atomic read, as it replaces the value with 0 only if it is already 0, so the field is never actually changed while its current value is returned.

Synchronization With Interlocked

In a multithreaded environment, the Interlocked class provides thread safety by making the operations atomic. Thus, if an Interlocked operation is in progress for a variable, other threads can’t modify its value.

In our example of concurrent withdrawal, this helps prevent race conditions:

var account = new Account
{
    BalanceInterlocked = 100000
};

Console.WriteLine("Withdrawal with the 'Interlocked' class");
AccountService.WithdrawBalance(account.WithdrawInterlocked);
Console.WriteLine($"Final balance: {account.BalanceInterlocked}");

When a task accesses the balance and performs an Interlocked.Add() operation, any concurrent task or thread can’t modify the balance.

Hence, we avoid the condition where multiple threads make concurrent changes to the balance:

Withdrawal with the 'Interlocked' class
Final balance: 0

Similar to the lock statement scenario, the final balance is 0. As the Interlocked operations are always atomic, the balance remains consistent in multiple executions.

Now that we’ve looked at different synchronization mechanisms, let’s explore how to select the correct mechanism for the task at hand.

Selecting the Correct Synchronization Mechanism

Out of volatile, lock, and Interlocked, each serves a unique purpose in achieving synchronization in a multithreaded environment. 

volatile ensures the visibility of a variable but doesn’t prevent concurrent modifications by multiple threads. Thus, it can’t prevent race conditions. It’s best suited for scenarios where we want a shared resource visible to all threads but don’t perform any compound read-modify-write operations on it.

The lock statement allows us to implement mutual exclusion on sections of the application to make it thread-safe. It’s best suited when we want only one thread at a time to access a shared resource. It’s also useful when we want to perform complex operations while still ensuring thread safety.

The Interlocked class is an efficient option for performing basic operations while providing thread safety and better performance. It is suitable for performing arithmetic operations, incrementing counters, or updating flags.

In practice, these techniques are not used in isolation from each other. We might very well use Interlocked for updating a flag and lock for more complex operations in the same application. Thus, we should make this choice based on factors such as performance considerations and the complexity of the operations involved.
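As an example of stretching Interlocked beyond simple arithmetic, Interlocked.CompareExchange can implement a lock-free conditional update. The sketch below (our own SafeAccount class, not part of the article's sample) implements a withdrawal that refuses to overdraw, retrying if another thread changes the balance in between:

```csharp
using System;
using System.Threading;

public class SafeAccount
{
    private int _balance = 500;

    // Atomic read: swaps in 0 only if the value is already 0, so nothing changes
    public int Balance => Interlocked.CompareExchange(ref _balance, 0, 0);

    // Lock-free conditional withdrawal: loops until it either commits
    // the new balance or finds there aren't enough funds
    public bool TryWithdraw(int amount)
    {
        while (true)
        {
            var current = Volatile.Read(ref _balance);
            if (current < amount)
                return false; // insufficient funds

            // Commit only if the balance hasn't changed since we read it
            if (Interlocked.CompareExchange(ref _balance, current - amount, current) == current)
                return true;
        }
    }
}

public static class Program
{
    public static void Main()
    {
        var account = new SafeAccount();
        Console.WriteLine(account.TryWithdraw(300));
        Console.WriteLine(account.TryWithdraw(300)); // only 200 left, refused
        Console.WriteLine(account.Balance);
    }
}
```

For anything more elaborate than a single-field compare-and-swap like this, the lock statement remains the simpler and safer choice.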

Conclusion

In this article, we learned about various synchronization mechanisms like volatile, lock, and Interlocked. We looked at their usage and how to make informed decisions on which technique to choose in our applications.
