EasyCaching is an open-source caching library that helps us manage caching more easily and flexibly. It's an alternative to ASP.NET Core's out-of-the-box caching support.
Caching is a prevalent and useful technique for improving the performance of web applications. By caching frequently accessed data, we can avoid repeatedly running the same SQL queries, API calls, or expensive calculations. The main objective of caching is to speed up data delivery to clients by removing the need to repeatedly retrieve the same data from a data source (e.g. a database).
Let’s dive in.
Installing EasyCaching
Throughout this article, we will work on an API that lists award-winning wines.
The main package to install is EasyCaching.Core, which contains all the caching basics. Additionally, we need to install one or more caching providers that determine where EasyCaching stores its cached data (e.g. in-memory, Redis, SQLite, Memcached, etc.).
We will create an ASP.NET Core Web API and see how to use two different cache providers (in-memory and SQLite) in the same project.
To begin, we will install the necessary NuGet packages. In addition to the EasyCaching.Core package, which allows us to implement cache management, we have to install EasyCaching.InMemory and EasyCaching.SQLite. These two libraries enable us to use the application server's RAM (commonly called local in-memory caching) and the SQLite database engine (for persisted cache items), giving us two kinds of storage locations.
At the end of the installation, our project file will contain the package references:
<PackageReference Include="EasyCaching.Core" Version="1.9.2" />
<PackageReference Include="EasyCaching.InMemory" Version="1.9.2" />
<PackageReference Include="EasyCaching.SQLite" Version="1.9.2" />
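Alternatively, we can add the same packages from the command line with the .NET CLI (the version numbers shown here simply match the project file above):

dotnet add package EasyCaching.Core --version 1.9.2
dotnet add package EasyCaching.InMemory --version 1.9.2
dotnet add package EasyCaching.SQLite --version 1.9.2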
Configuring the Project to Use EasyCaching
In our Program class, we must now configure caching with EasyCaching and the providers we want to use. We can configure the caching providers in two ways: in code, or by storing the configuration in the appsettings.json file:
builder.Services.AddEasyCaching(options =>
{
    options.UseInMemory(builder.Configuration, "InMemoryCache", "EasyCaching:InMemoryCache");

    options.UseSQLite(config =>
    {
        config.DBConfig = new EasyCaching.SQLite.SQLiteDBOptions
        {
            FileName = "cache.db"
        };
    }, "SQLiteCache");
});
Our Program class demonstrates both approaches: for in-memory caching, we provide a configuration object pointing to a section of the appsettings.json file, while we configure SQLite caching directly in code.
In the appsettings.json file, the EasyCaching section, with its InMemoryCache sub-section, contains some basic parameters:
"EasyCaching": { "InMemoryCache": { "MaxRdSecond": 120, // the max random second will be added to cache's expiration, default value is 120 "EnableLogging": false, // whether enable logging, default is false "LockMs": 5000, // mutex key's alive time(ms), default is 5000 "SleepMs": 300, // when mutex key alive, it will sleep some time, default is 300 "DBConfig": { "SizeLimit": 100, // total count of cache items, default value is 10000 "ExpirationScanFrequency": 60 // scan time, default value is 60s } } }
Now that the configuration is ready, we can call our two providers to handle caching in a controller.
Caching Implementation
Let’s create a controller and an endpoint to put caching into practice and compare the performance with and without it.
Implementing Caching in the Controller Class
We will use a primary constructor and pass the provider to it as a parameter.
When using a single provider, we inject an object implementing IEasyCachingProvider:
public class ValuesController(IEasyCachingProvider _provider) : ControllerBase { }
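With a single provider injected like this, a minimal action could rely on the GetAsync() overload that accepts a data retriever, which fetches and caches the value in one call. This is only a sketch, and the GetWinesFromSourceAsync() helper is a hypothetical stand-in for a real data source:

[HttpGet]
public async Task<ActionResult<List<string>>> GetValues()
{
    // Fetch "wines" from the cache; if it's missing, run the retriever and cache the result for five minutes
    var cachedWines = await _provider.GetAsync(
        "wines",
        () => GetWinesFromSourceAsync(),
        TimeSpan.FromMinutes(5));

    return Ok(cachedWines.Value);
}

// Hypothetical stand-in for a slow data source (database, remote API, etc.)
private static Task<List<string>> GetWinesFromSourceAsync() =>
    Task.FromResult(new List<string> { "Wine A", "Wine B" });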
In our example application, we decided to use two different providers. In this case, we use a provider factory implementing IEasyCachingProviderFactory and pass it as an argument to the primary constructor:
public class ValuesWithTwoProvidersController(IEasyCachingProviderFactory _factory) : ControllerBase { }
At runtime, the class receives this argument through the Dependency Injection container.
Utilizing Caching in the Endpoint Function
We can now use our factory object and call its GetCachingProvider() method, passing the name of the caching provider we want to work with:
public class ValuesWithTwoProvidersController(IEasyCachingProviderFactory _factory) : ControllerBase
{
    [HttpGet]
    public async Task<ActionResult<ApiResponse>> GetValues()
    {
        var inMemoryProvider = _factory.GetCachingProvider("InMemoryCache");

        //In memory
        if (await inMemoryProvider.ExistsAsync("today"))
        {
            var cachedTodayDate = await inMemoryProvider.GetAsync<DateTime>("today");
            // ...
        }
        else
        {
            // ...
            var today = DateTime.Now.Date;
            await inMemoryProvider.SetAsync("today", today, TimeSpan.FromMinutes(1));
        }

        // ...
    }
}
Here, we first check whether the data we want to retrieve already exists in the cache, using the ExistsAsync() method, so that we can serve it from there. If the requested data is cached, we call the provider's GetAsync() method with the data's key name as a parameter to retrieve it. Otherwise, we get the data from its original source and then cache it using the provider's SetAsync() method, which takes three parameters: a key name, the value, and an expiration time. Thus, the next request for the same data will hit the cache first, for better performance.
In the example, the target data is just a DateTime value, but let's assume it's frequently accessed data that is originally stored on a remote server, for example.
Let's note that EasyCaching offers many methods for managing and manipulating cached data. In addition to the methods presented, it is possible to flush all cached elements with the FlushAsync() method, or to delete one or a list of specified cache keys using the RemoveAsync() and RemoveAllAsync() methods. We can also retrieve the remaining lifetime of a cache key using GetExpirationAsync(). The API offered by EasyCaching covers most common caching needs.
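As a quick illustration (a sketch, not part of the sample project), a few of these maintenance calls on the inMemoryProvider from the previous snippet could look like this:

// Check how long the "today" entry will stay in the cache
var remainingLifetime = await inMemoryProvider.GetExpirationAsync("today");

// Remove a single key, or several keys at once ("some-other-key" is a hypothetical example)
await inMemoryProvider.RemoveAsync("today");
await inMemoryProvider.RemoveAllAsync(new[] { "today", "some-other-key" });

// Clear everything this provider has cached
await inMemoryProvider.FlushAsync();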
Caching Performance With EasyCaching
Now let's complete our GET endpoint, which returns a list of award-winning wines. We will add our second provider, which gives us access to two different caching stores: in-memory caching and a SQLite database. The complete code of our application is available in our git repository.
Having multiple caching providers can be useful in certain situations, since different cache providers have their own strengths and weaknesses. For example, frequently accessed data can be cached in memory to ensure optimal speed, while for data that needs to be stored persistently but is accessed less frequently, a persisted or distributed cache such as SQLite or Redis can be used:
private ApiResponse _response = new();

[HttpGet]
public async Task<ActionResult<ApiResponse>> GetValues()
{
    var sqliteProvider = _factory.GetCachingProvider("SQLiteCache");
    var prizeDto = new PrizeDto();
    var stopwatch = Stopwatch.StartNew();

    //SQLite
    if (await sqliteProvider.ExistsAsync("prizes"))
    {
        var cachedPrizes = await sqliteProvider.GetAsync<List<WinePrize>>("prizes");
        prizeDto.Prizes = cachedPrizes.Value;
    }
    else
    {
        await Task.Delay(50);
        var prizes = Data.GetWinePrizes();
        prizeDto.Prizes = prizes;
        await sqliteProvider.SetAsync("prizes", prizes, TimeSpan.FromMinutes(1));
    }

    stopwatch.Stop();

    _response.Result = prizeDto;
    _response.Duration = stopwatch.ElapsedMilliseconds;
    _response.StatusCode = System.Net.HttpStatusCode.OK;

    return Ok(_response);
}
In our endpoint, we simulate fifty milliseconds of processing with the Task.Delay(50) statement for each piece of data that is not yet cached.
We use the ApiResponse class to standardize the HTTP response. The retrieved data is mapped into a DTO class called PrizeDto.
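The exact definitions of these classes live in the sample repository; based on how they are used in the endpoint above, they could be sketched roughly like this (an assumption, not the repository's exact code):

public class ApiResponse
{
    public object? Result { get; set; }                        // the payload (here, a PrizeDto)
    public long Duration { get; set; }                         // elapsed milliseconds, used for our measurements
    public System.Net.HttpStatusCode StatusCode { get; set; }  // e.g. HttpStatusCode.OK
}

public class PrizeDto
{
    public List<WinePrize> Prizes { get; set; } = [];
}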
Now let's run several successive calls to our endpoint to measure the execution time without and with caching. In our test, the first run took 710 milliseconds. Since neither piece of requested data was cached yet, Task.Delay(50) was executed twice before the response was returned.
Subsequent executions take less than 50 milliseconds, and often around 1 millisecond! In these cases, the data is fetched from the cache for each request.
This simple example demonstrates the effectiveness of caching.
Conclusion
EasyCaching is a freely available library that simplifies caching functionality in .NET Core applications. It provides a unified interface for working with various caching providers, allowing you to switch between them without modifying your core caching logic. In this article, we presented how to set up a caching mechanism using the EasyCaching library.
Consult the official documentation for more information on the various providers offered by this library.