In this article, we’re going to discuss Rate Limiting in ASP.NET Core, specifically version 8, and explore some ways of implementing it.

To download the source code for this article, you can visit our GitHub repository.

So, let’s start.

What is Rate Limiting?

Rate limiting controls the rate at which requests or events are allowed within a given time frame. In today’s world, where information flows at unprecedented rates, ensuring the stability and security of online systems is paramount. Rate limiting often serves to prevent the overloading of endpoints, protect against security threats such as DDoS (Distributed Denial-of-Service) attacks, and ensure fair resource utilization.


Real-World Applications Using Rate Limiting

Online stores and social media sites are just some examples of applications where rate limiting is of great importance. In the world of e-commerce, it helps to manage high-traffic events, ensuring a smooth checkout process for all customers. Social media platforms implement it to control spam and regulate API usage. Cloud providers also implement rate limiting as a means of controlling access to shared resources in a multi-tenant environment, preventing a single tenant from monopolizing resources. 

Rate Limiting in Older Versions of ASP.NET Core

We could always achieve rate limiting through custom implementations or third-party libraries in any .NET version. However, ASP.NET Core 7.0 introduced built-in rate-limiting middleware, which offers significant advantages and simplifications compared to custom approaches. In this article, we will focus on the built-in middleware. For “older” implementations, visit our article Rate Limiting in ASP.NET Core Web API.

Implementing Rate Limiting

The Microsoft.AspNetCore.RateLimiting namespace provides built-in middleware with different rate-limiting capabilities. We set up policies and link them to endpoints. Let’s check out the built-in capabilities.

Every time we want to use rate limiting, we must first register the middleware:

app.UseRateLimiter();

Here, we register the rate-limiting middleware into the application’s request-processing pipeline. The middleware controls the rate of requests that can be sent to the server and blocks requests for a specified duration if the rate is exceeded. We must also be aware of the order in which we register our middleware components.

If we plan to use endpoint-specific rate limiting, we need to register the middleware after the call to UseRouting(). Conversely, when using global limiters, we can register it before. Either way, we should register it early in the pipeline to avoid unnecessary processing of requests that exceed the rate limit.
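For instance, here is a minimal sketch of a pipeline that uses endpoint-specific policies (the MapControllers() call is just an assumption for illustration):

var app = builder.Build();

app.UseRouting();

// Endpoint-specific policies require the limiter to run after routing
app.UseRateLimiter();

app.MapControllers();

app.Run();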

Now, we must define the rate-limiting policies, which specify the limits and rules for the middleware. With them, we specify how we want the middleware to work. In the following sections, we will go through the out-of-the-box capabilities provided by the framework.

ASP.NET Core 7.0+ Built-In Rate Limiting

Now, let’s examine the extension methods provided by the RateLimiterOptionsExtensions class, and see how we can use them to configure rate limiting.

Fixed Window Limiter

First, we add the rate limiter service. Then, we set up a fixed window limiter and give the policy a name, which we will later use to specify which endpoints it affects.

The AddFixedWindowLimiter method enforces the rate limit within a specified duration, known as the time window. During this window, the system allows up to a set number of requests. If the limit is set to 100 requests per minute, the system will either queue or reject any additional requests, depending on the configuration. The system resets the allowed number of requests as each window expires. This type of limiter is useful for managing scenarios with relatively uniform traffic:

builder.Services.AddRateLimiter(options => options
    .AddFixedWindowLimiter(policyName: "fixed", limiterOptions =>
    {
        limiterOptions.PermitLimit = 100;
        limiterOptions.Window = TimeSpan.FromMinutes(1);
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        limiterOptions.QueueLimit = 5;
    }));

Here the limiter permits up to 100 requests per minute and allows five requests to queue once the rate limit is reached. We also specify that we process the oldest requests first. Any additional request after 105 requests within one minute will receive a “429 – too many requests” error response.
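One caveat: out of the box, the middleware rejects requests with a 503 Service Unavailable status, not 429. To actually return 429 as described throughout this article, we set the RejectionStatusCode option when registering the service; here is a sketch based on the fixed window policy above:

builder.Services.AddRateLimiter(options =>
{
    // The default rejection status is 503; override it to signal "too many requests"
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;

    options.AddFixedWindowLimiter(policyName: "fixed", limiterOptions =>
    {
        limiterOptions.PermitLimit = 100;
        limiterOptions.Window = TimeSpan.FromMinutes(1);
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        limiterOptions.QueueLimit = 5;
    });
});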

Sliding Window Limiter

The sliding window algorithm provides a refinement to the fixed window limiter by dividing the time window into smaller segments. This, in turn, offers smoother traffic handling and a more evenly distributed load. This is especially beneficial in systems where traffic intensity fluctuates rapidly. This mechanism ensures a continuous evaluation of request counts, offering a more dynamic and responsive rate limiting compared to the fixed window approach:

builder.Services.AddRateLimiter(options => options
    .AddSlidingWindowLimiter(policyName: "sliding", limiterOptions =>
    {
        limiterOptions.PermitLimit = 100;
        limiterOptions.Window = TimeSpan.FromMinutes(1);
        limiterOptions.SegmentsPerWindow = 10;
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        limiterOptions.QueueLimit = 5;
    }));

Here, as with the fixed window, we set up a sliding window rate limiter allowing 100 requests per minute. But this time we have an additional property, SegmentsPerWindow, which divides the window into smaller segments, 10 to be exact. This means the one-minute Window we specified is split into 10 segments, each 6 seconds long. As each segment expires, the permits consumed in the oldest segment are recycled into the current window, so the limit slides smoothly over time instead of resetting all at once. As in the example above, we again allow up to 5 requests to queue and process the oldest requests first. Once 100 requests have been counted within the current sliding window and the queue is full, any additional request receives a “429 – too many requests” error response.

Token Bucket Limiter

The token bucket limiter manages request rates by maintaining a balance of tokens, adding them to the bucket at a fixed rate. When a request comes in, it consumes a token. As long as the bucket has tokens available, we allow the request. If there are no free tokens, we reject or queue the request, depending on our configuration. This method allows handling bursts of requests up to the bucket’s capacity without breaching the overall rate limit. Tokens are added to the bucket until the TokenLimit is reached. It’s worth mentioning that, unlike with the concurrency limiter, tokens are not returned to the bucket when a request completes; they come back only through replenishment. This ensures that the rate of requests doesn’t exceed the defined limits:

builder.Services.AddRateLimiter(options => options
    .AddTokenBucketLimiter(policyName: "token", limiterOptions =>
    {
        limiterOptions.TokenLimit = 100;
        limiterOptions.ReplenishmentPeriod = TimeSpan.FromMinutes(1);
        limiterOptions.TokensPerPeriod = 10;
        limiterOptions.AutoReplenishment = true;      
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        limiterOptions.QueueLimit = 5;
    }));

Here we configure a TokenLimit of 100, meaning the bucket can hold at most 100 tokens. The bucket starts full, so a burst of up to 100 requests can be served immediately, while the rate of requests across all clients still cannot exceed the replenishment rate over time.

TokensPerPeriod is set to 10, so 10 tokens are added back to the bucket every ReplenishmentPeriod, which is 1 minute in our case, and AutoReplenishment = true makes this happen automatically on an internal timer. The QueueLimit of 5 means up to 5 additional requests can wait for tokens to become available. For example, if clients consume all 100 tokens within the first few seconds, only 10 more requests can be served after the next replenishment; once the bucket is empty and the queue is full, any further request is rejected with a “429 – too many requests” error response.

Concurrency Limiter

Unlike token bucket or window-based limiters that regulate the rate of incoming requests over time, a concurrency limiter focuses on the number of simultaneous requests taking place. This approach is particularly useful in scenarios where we want to prevent system overload due to too many processes running at the same time, rather than managing the flow rate of requests entering the system. This method works well in environments with limited capacity for processing concurrent operations:

builder.Services.AddRateLimiter(options => options
    .AddConcurrencyLimiter(policyName: "concurrency", limiterOptions =>
    {
        limiterOptions.PermitLimit = 10;
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        limiterOptions.QueueLimit = 5;
    }));

Here, we only define the PermitLimit, which denotes how many concurrent requests we allow at any one time. This approach prevents the system from becoming overwhelmed with too many simultaneous requests. Once again, we allow up to 5 additional requests to queue. With 10 requests executing and 5 queued (15 in this example), any further request receives a “429 – too many requests” error response until an in-flight request completes and releases its permit.

Implementing Rate Limiting in an ASP.NET Core Minimal API

In .NET, Minimal APIs have become increasingly popular for their simplicity and efficiency in setting up lightweight web services. They simplify web service creation by reducing the amount of code required for basic routing and request handling. We ditch the conventional patterns of controllers and views and instead map endpoints directly to lambda expressions or methods in the Program.cs file. To learn more about Minimal APIs, please check out our article.

So let’s see how to integrate rate limiting into Minimal APIs. The configuration part is the same as for a controller-based API, as shown above, but when defining endpoints, there are some differences:

var app = builder.Build();

app.MapGet("/myendpoint", () => "This endpoint is rate-limited.")
   .RequireRateLimiting("concurrency");

app.Run();

Here, we protect the /myendpoint route with the previously defined “concurrency” rate-limiting policy. This is accomplished by calling the RequireRateLimiting() extension method.

Rate Limiting With Authorization

Integrating rate-limiting features with authentication and authorization enables us to control the rate of requests based on the identity of the requester. By following this approach, we can help to provide different service levels to different users or protect resources from abuse:

builder.Services.AddRateLimiter(limiterOptions =>
{
    limiterOptions.RejectionStatusCode = StatusCodes.Status429TooManyRequests;
    limiterOptions.AddPolicy(policyName: "jwt", partitioner: httpContext =>
    {
        var accessToken = httpContext.GetTokenAsync("access_token").Result;

        return !string.IsNullOrEmpty(accessToken)
            ? RateLimitPartition.GetFixedWindowLimiter(accessToken, _ =>
                new FixedWindowRateLimiterOptions
                {
                    QueueLimit = 5,
                    PermitLimit = 100,
                    Window = TimeSpan.FromMinutes(1),
                })
            : RateLimitPartition.GetFixedWindowLimiter("Anon", _ =>
                new FixedWindowRateLimiterOptions
                {
                    QueueLimit = 5,
                    PermitLimit = 10,
                    Window = TimeSpan.FromMinutes(1),
                });
    });
});

Here we set up rate limiting with different limits for authenticated and anonymous users. We allow authenticated users to fire 100 requests per minute, while anonymous users get only 10. Because we partition by access token, each authenticated user gets their own counter, whereas all anonymous users share the single “Anon” partition. When a client exceeds its rate limit, the server issues a 429 status code, signaling too many requests in a given time.
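To put the policy into effect, we attach it to an endpoint with RequireRateLimiting(), as before. Note that the rate limiter needs the authentication middleware to run first so the access token is available; the /profile endpoint here is just a hypothetical example:

app.UseAuthentication();
app.UseAuthorization();
app.UseRateLimiter();

// The "jwt" policy partitions requests by access token
app.MapGet("/profile", () => "Rate limited per access token.")
   .RequireRateLimiting("jwt");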

Enable or Disable Rate Limiting in ASP.NET Core Controllers

If we want to enable rate limiting for all of the endpoints, we can do it in one go by adding a one-liner in the Program.cs file. It should be added after app.UseRouting(), but before app.Run():

app.MapDefaultControllerRoute().RequireRateLimiting(nameOfPolicy);

We chain the RequireRateLimiting(nameOfPolicy) method onto MapDefaultControllerRoute(), since the latter sets up routing to our controllers automatically. By chaining them together, we specify that the default controller routes should enforce rate limiting according to our defined policy. Note that we must still create the policy and register the rate-limiting middleware.
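Putting it all together, a minimal sketch reusing the “fixed” policy from earlier could look like this:

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllersWithViews();
builder.Services.AddRateLimiter(options => options
    .AddFixedWindowLimiter(policyName: "fixed", limiterOptions =>
    {
        limiterOptions.PermitLimit = 100;
        limiterOptions.Window = TimeSpan.FromMinutes(1);
    }));

var app = builder.Build();

app.UseRouting();
app.UseRateLimiter();

// Every default controller route now enforces the "fixed" policy
app.MapDefaultControllerRoute().RequireRateLimiting("fixed");

app.Run();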

For a more granular approach, we can disable or enable rate limiting for specific controllers or actions by applying attributes. To disable it, we place the [DisableRateLimiting] attribute on top of the controller definition or an action method. Conversely, to enable it, we use [EnableRateLimiting]. We can also create a custom attribute to deal with rate limiting as we see fit:

[EnableRateLimiting("fixed")]
public class CustomerController : Controller
{
    public ActionResult Index()
    {
        return View();
    }

    [EnableRateLimiting("sliding")]
    public ActionResult Details()
    {
        return View();
    }

    [DisableRateLimiting]
    public ActionResult SpecialOffer()
    {
        return View();
    }
}

Here we apply the “fixed” rate-limiting policy to the entire controller through the [EnableRateLimiting("fixed")] attribute. For the Details() action method, we define a different policy, the sliding one, overriding the controller’s default. And for the SpecialOffer() action method, we turn off rate limiting altogether.

Advanced Rate Limiting Techniques in ASP.NET Core

We can layer (or chain) multiple rate-limiting strategies together for more complex scenarios using the PartitionedRateLimiter class:

builder.Services.AddRateLimiter(limiterOptions =>
{
    limiterOptions.GlobalLimiter = PartitionedRateLimiter.CreateChained(
        PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        {
            var userAgent = httpContext.Request.Headers.UserAgent.ToString();
            return RateLimitPartition.GetFixedWindowLimiter(
                userAgent, _ => new FixedWindowRateLimiterOptions
                {
                    AutoReplenishment = true,
                    PermitLimit = 4,
                    Window = TimeSpan.FromSeconds(2)
                });
        }),
        PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        {
            var clientIP = httpContext.Connection.RemoteIpAddress!.ToString();
            return RateLimitPartition.GetFixedWindowLimiter(
                clientIP, _ => new FixedWindowRateLimiterOptions
                {
                    AutoReplenishment = true,
                    PermitLimit = 20,
                    Window = TimeSpan.FromSeconds(30)
                });
        }));
});

Here, we use the CreateChained() method to chain two limiters together, causing incoming requests to pass through each limiter in sequence. A request must first acquire a permit from the userAgent limiter, which allows only 4 requests within 2 seconds from the same device or browser. If it passes, it proceeds to the clientIP limiter, which allows 20 requests within 30 seconds from the same IP address. Our app processes only requests that successfully pass both limiters.

This setup is especially useful in scenarios requiring a combination of different rate-limiting dimensions, such as per-user-agent and per-IP. It safeguards our app from excessive requests from multiple sources, which could otherwise lead to denial-of-service (DoS or DDoS) conditions. We can customize the limits for each limiter and even add more for finer control. Remember that each limiter acts independently.
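Finally, the middleware also lets us customize the rejection response through the OnRejected callback. Here is a minimal sketch that adds a Retry-After header whenever the underlying limiter exposes that metadata:

builder.Services.AddRateLimiter(limiterOptions =>
{
    limiterOptions.OnRejected = async (context, cancellationToken) =>
    {
        // Surface the retry hint when the limiter provides it
        if (context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter))
        {
            context.HttpContext.Response.Headers.RetryAfter =
                ((int)retryAfter.TotalSeconds).ToString();
        }

        context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;
        await context.HttpContext.Response
            .WriteAsync("Too many requests. Please try again later.", cancellationToken);
    };
});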

Conclusion

Rate limiting in .NET 8 helps us to control how often users can access our services. It protects our APIs from attacks, manages user quotas, and prevents misuse. We can use built-in limiters easily, and tweak them as we see fit. Remember we should test thoroughly and consider potential challenges like false positives and configuration complexity. By using rate limiting effectively, we can build secure, scalable, and user-friendly applications.
