In this article, we are going to learn how to execute multiple tasks in C#, both in sequence and in parallel. By using async/await, we avoid performance bottlenecks and enhance the scalability of our applications. Depending on the business logic, we may need to execute functional tasks either sequentially or in parallel.

To download the source code for this article, you can visit our GitHub repository.

Let’s dive into it.

Creating a Demo REST API

Let’s consider that we are developing an application for employee management in an organization. The client/UI application consumes a REST-based API that will return the employee details, salary, and appraisal rating for an employee from three different endpoints.


Here, we will use a simple ASP.NET Core Web API. The controller actions in this Web API return static data for this demo:

private readonly Random _random = new Random();

[HttpGet("details/{id}")]
public async Task<IActionResult> GetEmployeeDetails(Guid id)
{
    return await Task.FromResult(Ok(new
    {
        Id = id,
        Name = $"Sam_{id}",
        DateOfBirth = DateTime.Now.AddYears(-1 * _random.Next(20, 30)).Date,
        Address = "Employee Dummy Address"
    }));
}

[HttpGet("salary/{id}")]
public async Task<IActionResult> GetEmployeeSalary(Guid id)
{
    return await Task.FromResult(Ok(new
    {
        Id = id,
        SalaryInEuro = 25000
    }));
}

[HttpGet("rating/{id}")]
public async Task<IActionResult> GetEmployeeRating(Guid id)
{
    return await Task.FromResult(Ok(new
    {
        Id = id,
        Rating = 4
    }));
}

Now, let’s take a look at the three endpoints and their responses, as shown in the Swagger documentation for this Web API:

curl -X 'GET' \
  'https://localhost:7145/api/Employee/details/7119e779-3054-493c-8cf7-c617b4aa0f4e' \
  -H 'accept: */*'
{
  "id": "7119e779-3054-493c-8cf7-c617b4aa0f4e",
  "name": "Sam_7119e779-3054-493c-8cf7-c617b4aa0f4e",
  "dateOfBirth": "1999-07-03T00:00:00+05:30",
  "address": "Employee Dummy Address"
}

curl -X 'GET' \
  'https://localhost:7145/api/Employee/salary/7119e779-3054-493c-8cf7-c617b4aa0f4e' \
  -H 'accept: */*'
{
  "id": "7119e779-3054-493c-8cf7-c617b4aa0f4e",
  "salaryInEuro": 25000
}

curl -X 'GET' \
  'https://localhost:7145/api/Employee/rating/7119e779-3054-493c-8cf7-c617b4aa0f4e' \
  -H 'accept: */*'
{
  "id": "7119e779-3054-493c-8cf7-c617b4aa0f4e",
  "rating": 4
}

We have made the Web API simple on purpose as we don’t want to deviate from the topic of this article by diving into the intricate implementation details of an ASP.NET Core Web API.

Execute Multiple Tasks in Sequence using async and await

The client application has an EmployeeProfile class, which we use as a presentation model for the end-user:

public class EmployeeProfile
{
    public EmployeeDetails EmployeeDetails { get; }

    public decimal Salary { get; }

    public int Rating { get; }

    public EmployeeProfile(EmployeeDetails employeeDetails, decimal salary, int rating)
    {
        EmployeeDetails = employeeDetails;
        Salary = salary;
        Rating = rating;
    }

    public override string ToString()
    {
        var sb = new StringBuilder();
        sb.AppendLine($"Name: {EmployeeDetails.Name}");
        sb.AppendLine($"DOB: {EmployeeDetails.DateOfBirth.ToShortDateString()}");
        sb.AppendLine($"Address: {EmployeeDetails.Address}");
        sb.AppendLine($"Salary: {Salary}{HexToChar("20AC")}");
        sb.AppendLine($"Appraisal Rating: {Rating}");

        return sb.ToString();

        char HexToChar(string hex)
            => (char) ushort.Parse(hex, NumberStyles.HexNumber);
    }
}

Here, we override the ToString() method to append all the employee data to a single string for printing it to the console.

Since this class acts as a view model, its properties are constructed from the contract classes:

public class EmployeeDetails
{
    public Guid Id { get; set; }

    public string Name { get; set; }

    public DateTime DateOfBirth { get; set; }

    public string Address { get; set; }
}

public class Salary
{
    public Guid Id { get; set; }
    
    public decimal SalaryInEuro { get; set; }
}

public class AppraisalRating
{ 
    public Guid Id { get; set; }

    public int Rating { get; set; }
}

The Executor class in the client application consolidates employee data from all three REST API endpoints while constructing the EmployeeProfile instance:

public async Task<EmployeeProfile> ExecuteInSequence(Guid id)
{  
    var employeeDetails = await _employeeApiFacade.GetEmployeeDetails(id);
    var employeeSalary = await _employeeApiFacade.GetEmployeeSalary(id);
    var employeeRating = await _employeeApiFacade.GetEmployeeRating(id);    

    return new EmployeeProfile(employeeDetails, employeeSalary, employeeRating);
}

Here, we are executing the tasks for fetching employee details, salary, and rating in sequence using asynchronous programming.

First, we are awaiting the Task instance returned by the GetEmployeeDetails() method. Then, we do the same with GetEmployeeSalary() in the continuation of the GetEmployeeDetails() method. Finally, we repeat the action with the GetEmployeeRating() method in the continuation of the GetEmployeeSalary() method. Hence, in this case, each task execution step waits for the previous step to finish without blocking the thread.

In these three methods (GetEmployeeDetails(), GetEmployeeSalary(), and GetEmployeeRating()), all we do is send a simple HTTP request and process the response. We also simulate network latency in each of them using the Task.Delay() method. Such delays are very common in enterprise-grade applications due to factors like slow database calls, geographic distance, etc.

You can find the implementation inside the EmployeeApiFacade class in the client project.
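For reference, here is a minimal sketch of what such a facade method could look like; the actual implementation in the repository may differ in details such as the HttpClient configuration or the delay duration:

public async Task<EmployeeDetails> GetEmployeeDetails(Guid id)
{
    // Call the Web API endpoint and read the raw JSON response
    var response = await _httpClient
                         .GetStringAsync($"https://localhost:7145/api/employee/details/{id}");

    var employeeDetails = JsonSerializer.Deserialize<EmployeeDetails>(response, _serializerOptions);

    // Simulate additional network latency
    await Task.Delay(2000);

    return employeeDetails;
}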

Typically, we perform such sequential execution in workflows where each step depends on the result of the previous one, which is not the case here. However, we execute these independent tasks sequentially anyway so that, in a later section, we can demonstrate the performance improvement we achieve by executing them in parallel.
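For instance, a genuinely dependent workflow could look something like the following sketch, where the second call needs a value produced by the first. Note that GetTeamDetails() and TeamId are hypothetical and are not part of the demo project:

// Hypothetical dependent workflow: the second call requires a value from the first result
var employeeDetails = await _employeeApiFacade.GetEmployeeDetails(id);

// 'GetTeamDetails()' and 'TeamId' are made up purely for illustration
var teamDetails = await _employeeApiFacade.GetTeamDetails(employeeDetails.TeamId);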

Let’s now run the client application:

Name: Sam_7119e779-3054-493c-8cf7-c617b4aa0f4e
DOB: 7/3/1999
Address: Employee Dummy Address
Salary: 25000€
Appraisal Rating: 4

We get back the consolidated result. Let’s now see if we can improve the performance of this functionality.

Execute Multiple Tasks in Parallel using Task.WhenAll

Since the tasks for fetching employee details, salary and rating are independent of each other, it is easy to execute them in parallel to improve the overall performance of the workflow:

public async Task<EmployeeProfile> ExecuteInParallel(Guid id)
{ 
    var employeeDetailsTask = _employeeApiFacade.GetEmployeeDetails(id);
    var employeeSalaryTask = _employeeApiFacade.GetEmployeeSalary(id);
    var employeeRatingTask = _employeeApiFacade.GetEmployeeRating(id);

    await Task.WhenAll(employeeDetailsTask, employeeSalaryTask, employeeRatingTask);

    var employeeDetails = await employeeDetailsTask;
    var employeeSalary = await employeeSalaryTask;
    var employeeRating = await employeeRatingTask; 

    return new EmployeeProfile(employeeDetails, employeeSalary, employeeRating);
}

Here, we are not immediately awaiting the Task instances returned by the methods, which means all three tasks start and run concurrently. However, we need to wait for all three tasks to complete before consolidating the result. We achieve this by using the Task.WhenAll method from the Task Parallel Library.

The Task.WhenAll method creates a Task that will be complete when all the supplied tasks have been completed. Once all three Tasks are complete, we await the individual Task instances to derive the result from them.

Alternatively, we can use the Result property of the Task instances instead of awaiting them, which avoids some of the unnecessary state-machine code generated by the compiler:

...
...
var employeeDetails = employeeDetailsTask.Result;
var employeeSalary = employeeSalaryTask.Result;
var employeeRating = employeeRatingTask.Result;   

return new EmployeeProfile(employeeDetails, employeeSalary, employeeRating);
...

Though using blocking code like .Result is not considered good practice under normal circumstances, here it is perfectly safe because all the tasks have already finished by the time these lines execute. In fact, this approach is slightly more beneficial in this case as it reduces the size of the compiled code.

Now, when we execute the client code, we get back the same consolidated result, but with improved overall performance.

Improving the Exception Handling in Task.WhenAll

Handling exceptions plays a major part in real-life application development, and the above implementation has a problem in this regard. Let’s consider a scenario where both the GetEmployeeDetails() and GetEmployeeSalary() methods throw exceptions:

public async Task<EmployeeDetails> GetEmployeeDetails(Guid id)
{    
    var response = await _httpClient
                         .GetStringAsync($"https://localhost:7145/api/employee/details/{id}");
    var employeeDetails = JsonSerializer.Deserialize<EmployeeDetails>(response, _serializerOptions);
    await Task.Delay(2000);
    throw new Exception("Test Exception 1");    
}

public async Task<decimal> GetEmployeeSalary(Guid id)
{    
    var response = await _httpClient
                         .GetStringAsync($"https://localhost:7145/api/employee/salary/{id}");
    var salary = JsonSerializer.Deserialize<Salary>(response, _serializerOptions);
    await Task.Delay(1000);
    throw new Exception("Test Exception 2");    
}

In this case, we only get the first exception in the console, while the other one is lost:

System.Exception: Test Exception 1
   at Program.<>c__DisplayClass0_0.<<<Main>$>g__GetEmployeeDetails|5>d.MoveNext() in Program.cs:line 146
--- End of stack trace from previous location ---
   at Program.<>c__DisplayClass0_0.<<<Main>$>g__ExecuteInParallel|1>d.MoveNext() in Program.cs:line 71
--- End of stack trace from previous location ---
   at Program.<Main>$(String[] args) in Program.cs:line 26

So, if the other Task instances throw any exceptions, we will never know about them. If we want to get hold of all the exceptions for logging purposes, we need to handle Task.WhenAll a bit differently:

public static class TaskExtensions
{
    public static async Task<(T1, T2, T3)> WhenAll<T1, T2, T3>(Task<T1> task1, Task<T2> task2, Task<T3> task3)
    {
        var allTasks = Task.WhenAll(task1, task2, task3);
        try
        {
            await allTasks;
        }
        catch (Exception exp)
        {
            Console.WriteLine("Task Exception", exp);

            // allTasks.Exception is an AggregateException that contains
            // the exceptions from all the faulted tasks, so we rethrow it
            throw allTasks.Exception;
        }

        return (task1.Result, task2.Result, task3.Result);
    }
}

This method wraps Task.WhenAll and returns a ValueTuple with the results of the tasks. Most importantly, it also addresses the multiple-exception issue: when any of the tasks fail, it rethrows the AggregateException from the combined task, which contains the exceptions from all the faulted tasks.

Now, we can invoke the WhenAll method from the TaskExtensions class instead of the original Task.WhenAll in the calling method:

public async Task<EmployeeProfile> ExecuteInParallel(Guid id)
{
    var employeeDetailsTask = _employeeApiFacade.GetEmployeeDetails(id);
    var employeeSalaryTask =  _employeeApiFacade.GetEmployeeSalary(id);
    var employeeRatingTask =  _employeeApiFacade.GetEmployeeRating(id);   

    var (employeeDetails, employeeSalary, employeeRating) 
        = await TaskExtensions.WhenAll(employeeDetailsTask, employeeSalaryTask, employeeRatingTask); 
      
    return new EmployeeProfile(employeeDetails, employeeSalary, employeeRating);
}

If we execute the client code now, we can see an aggregate exception in the console showing both exceptions:

Task Exception
System.AggregateException: One or more errors occurred. (Test Exception 1) (Test Exception 2)
---> System.Exception: Test Exception 1
at Program.<>c__DisplayClass0_0.<<<Main>$>g__GetEmployeeDetails|5>d.MoveNext() in Program.cs:line 146
--- End of stack trace from previous location ---
...
--> (Inner Exception #1) System.Exception: Test Exception 2
at Program.<>c__DisplayClass0_0.<<<Main>$>g__GetEmployeeSalary|6>d.MoveNext() in Program.cs:line 156<---
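If we want to log each failure individually, the calling code can catch the rethrown AggregateException and iterate over its inner exceptions. Here is a minimal sketch of such a handler, where executor and employeeId are placeholder names and the actual top-level handling in the sample project may differ:

try
{
    var employeeProfile = await executor.ExecuteInParallel(employeeId);
    Console.WriteLine(employeeProfile);
}
catch (AggregateException aggregateException)
{
    // Each faulted task contributes one inner exception
    foreach (var innerException in aggregateException.InnerExceptions)
    {
        Console.WriteLine(innerException.Message);
    }
}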

Performance Comparison async/await vs. Task.WhenAll 

To compare the performance between the two approaches, we have run a benchmark with the default target count:

|            Method |     Mean |    Error |   StdDev | Ratio | RatioSD | Rank | Allocated |
|------------------ |---------:|---------:|---------:|------:|--------:|-----:|----------:|
| ExecuteInParallel | 31.71 ms | 0.618 ms | 0.782 ms |  0.49 |    0.04 |    1 |     13 KB |
| ExecuteInSequence | 65.04 ms | 1.292 ms | 3.144 ms |  1.00 |    0.00 |    2 |     13 KB |

The ExecuteInParallel() method that uses Task.WhenAll clearly exhibits better overall performance compared to the ExecuteInSequence() method.
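The table above is typical BenchmarkDotNet output. A minimal benchmark setup could look roughly like the following sketch, assuming the Executor and EmployeeApiFacade instances can be wired up as shown (the actual benchmark class in the repository may differ):

[MemoryDiagnoser]
[RankColumn]
public class SequentialVsParallelBenchmark
{
    // Assumed wiring; adjust the constructors to match the actual project
    private readonly Executor _executor = new(new EmployeeApiFacade(new HttpClient()));
    private readonly Guid _employeeId = Guid.NewGuid();

    [Benchmark(Baseline = true)]
    public Task<EmployeeProfile> ExecuteInSequence() => _executor.ExecuteInSequence(_employeeId);

    [Benchmark]
    public Task<EmployeeProfile> ExecuteInParallel() => _executor.ExecuteInParallel(_employeeId);
}

Running it via BenchmarkRunner.Run<SequentialVsParallelBenchmark>() produces a table like the one above; the exact numbers will vary by machine and network conditions.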

Sequential Task Execution using async, await, and foreach Loop

Let’s consider another business workflow where we need to invoke the same REST API endpoint multiple times. For example, invoking the /details/{id} endpoint to fetch details for three different employees. Once again, depending on the business case, we may have to do this either sequentially or in parallel.

To execute the tasks sequentially, we can use a foreach loop:

public async Task<IEnumerable<EmployeeDetails>> ExecuteUsingNormalForEach(IEnumerable<Guid> employeeIds)
{    
    List<EmployeeDetails> employeeDetails = new();
    foreach (var id in employeeIds)
    {
        var employeeDetail = await _employeeApiFacade.GetEmployeeDetails(id);

        employeeDetails.Add(employeeDetail);
    }                                                      
    
    return employeeDetails;
}

Here, for each employee id, we call the GetEmployeeDetails() method that invokes the /details/{id} endpoint in the REST API. We await the Task instance returned from the GetEmployeeDetails() method and then write the employee details to the console:

Id: 7119e779-3054-493c-8cf7-c617b4aa0f4e
Name: Sam_7119e779-3054-493c-8cf7-c617b4aa0f4e
DOB: 7/3/1999 12:00:00 AM

Id: cbb9c89f-3718-43d9-8763-b3fa3765bcea
Name: Sam_cbb9c89f-3718-43d9-8763-b3fa3765bcea
DOB: 6/2/1987 12:00:00 AM

Id: 165bcfa8-3a4f-4850-a681-bc496616f830
Name: Sam_165bcfa8-3a4f-4850-a681-bc496616f830
DOB: 1/1/1995 12:00:00 AM

We see that all the tasks are executed. However, since fetching details for different employees is independent work, these tasks can easily run in parallel, and doing so will improve performance. Let’s see how we can achieve such an improvement.

Parallel Task Execution using Parallel.ForEach

The Parallel.ForEach method executes a foreach operation in which the iterations may run in parallel.

Let’s re-write the example from the previous section using Parallel.ForEach:

public IEnumerable<EmployeeDetails> ExecuteUsingParallelForeach(IEnumerable<Guid> employeeIds)
{
    ParallelOptions parallelOptions = new()
    {
        MaxDegreeOfParallelism = 3
    };

    ConcurrentBag<EmployeeDetails> employeeDetails = new();

    Parallel.ForEach(employeeIds, parallelOptions, id =>
    {
        // Parallel.ForEach does not support async delegates, so we block until the task completes
        var employeeDetail = _employeeApiFacade.GetEmployeeDetails(id).GetAwaiter().GetResult();

        employeeDetails.Add(employeeDetail);
    });

    return employeeDetails;
}

We set the MaxDegreeOfParallelism property to 3. This property limits the number of concurrent operations run by the Parallel.ForEach loop to the set value. By default, Parallel.ForEach utilizes as many threads as the underlying scheduler provides. However, we limit it here for better utilization of CPU resources, as we only have three independent tasks.

Parallel.ForEach does not support asynchronous delegates. So, we are blocking the thread that executes the GetEmployeeDetails() method till it produces a result.

In many cases, Parallel.ForEach can provide significant performance improvements over ordinary sequential loops. However, the work of parallelizing the loop introduces complexity that can lead to problems that we don’t encounter in sequential code. In certain cases, a parallel loop might run slower than its sequential equivalent.

It is better to keep the following points in mind before using parallel loops:

  • Parallel loops are designed for CPU-intensive operations. It’s not always the best solution for I/O-bound operations
  • Parallel loops are not appropriate for asynchronous code flow
  • The degree of parallelism depends on the number of cores and CPU specifications, so performance can vary

In the previous code block, each thread blocks on an independent I/O operation because Parallel.ForEach lacks async support. This is a big issue for scalability and has the potential to cause thread pool starvation in an ASP.NET web application.

However, starting from .NET 6 we can use the Parallel.ForEachAsync method which is async aware.

Parallel Task Execution using Parallel.ForEachAsync

The Parallel.ForEachAsync method executes a foreach operation on an IEnumerable<T> in which iterations may run in parallel. However, unlike Parallel.ForEach, it returns a Task instance that represents the entire for-each operation, and it supports an asynchronous delegate. It was introduced as part of .NET 6.

Now, let’s re-write the example from the previous section using Parallel.ForEachAsync:

public async Task<IEnumerable<EmployeeDetails>> ExecuteUsingParallelForeachAsync(IEnumerable<Guid> employeeIds)
{
    ParallelOptions parallelOptions = new()
    {
        MaxDegreeOfParallelism = 3
    };

    ConcurrentBag<EmployeeDetails> employeeDetails = new();

    await Parallel.ForEachAsync(employeeIds, parallelOptions, async (id, _) =>
    {
        // The async delegate lets us await the I/O call without blocking the thread
        var employeeDetail = await _employeeApiFacade.GetEmployeeDetails(id);

        employeeDetails.Add(employeeDetail);
    });

    return employeeDetails;
}

Here, we set the MaxDegreeOfParallelism property to 3 like in the previous section for better optimization of CPU resources.

Now, on executing the client code, we see that all the tasks are executed, and the timing is approximately similar to what we got using Parallel.ForEach.

Performance Comparison foreach vs. Parallel.ForEach vs. Parallel.ForEachAsync

Here also, we have run a benchmark with the default target count:

|                           Method |     Mean |    Error |   StdDev |   Median | Ratio | RatioSD | Rank | Allocated |
|--------------------------------- |---------:|---------:|---------:|---------:|------:|--------:|-----:|----------:|
|      ExecuteUsingParallelForeach | 31.99 ms | 0.566 ms | 0.530 ms | 31.84 ms |  0.34 |    0.01 |    1 |     17 KB |
| ExecuteUsingParallelForeachAsync | 33.97 ms | 1.290 ms | 3.596 ms | 32.78 ms |  0.37 |    0.04 |    1 |     16 KB |
|        ExecuteUsingNormalForEach | 93.49 ms | 1.543 ms | 1.584 ms | 92.86 ms |  1.00 |    0.00 |    2 |     14 KB |

So, the Parallel.ForEach and Parallel.ForEachAsync approaches perform better than the normal foreach. We also see that there is not much difference in execution time between Parallel.ForEach and Parallel.ForEachAsync. However, Parallel.ForEachAsync scales much better because it no longer blocks individual threads during the I/O operations. Hence, it minimizes the chances of thread pool starvation.

Conclusion

In this article, we have demonstrated a few scenarios where we may need to execute tasks in sequence or in parallel. C# provides us with many approaches to achieve this, and we have covered the most common ones here. We must keep in mind the general pitfalls of asynchronous programming in C# while implementing such task execution.
