In-memory caching is a fundamental technique for improving the performance and scalability of applications. In the .NET context, caching allows data to be stored in memory for fast and efficient access, avoiding expensive operations such as database queries or external service calls.
What is In-Memory Caching?
In-memory caching is a mechanism that temporarily stores data to reduce the time required to access frequently used information. Instead of fetching data from a slower source (like a database or web service), the cache stores a local copy of the data in memory, allowing faster and more efficient access. Caching is particularly useful in systems with high read operations.
Benefits of In-Memory Caching
In-memory caching offers several advantages to improve application performance and scalability:
Reduced response time: Cache provides data quickly from memory, which is much faster than disk or network access.
Reduced database load: Prevents overload from frequent queries by reducing the number of operations needed.
Improved scalability: With fewer external calls, the system can support more simultaneous users.
Types of Cache
Caching can be implemented in various ways depending on application needs:
Local cache: Stored in the memory of the process running the application. Example: MemoryCache in .NET.
Distributed cache: Stored on a separate cache server, accessible by multiple application instances. Examples include Redis and Memcached.
In this article, we'll focus on local cache using MemoryCache.
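For comparison, a distributed cache is registered in much the same way as a local one. The sketch below is an assumption-laden illustration, not part of this article's example project: it presumes the Microsoft.Extensions.Caching.StackExchangeRedis NuGet package and a Redis instance, and the connection string is hypothetical:

```csharp
// Program.cs: register a Redis-backed IDistributedCache
// (assumes the Microsoft.Extensions.Caching.StackExchangeRedis package is installed)
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // hypothetical connection string
    options.InstanceName = "CacheDemo:";      // key prefix, shared by all app instances
});
```

Unlike MemoryCache, entries stored this way survive application restarts and are visible to every instance of the application.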
Considerations When Using Cache
Data consistency: Cached data can become outdated. To avoid consistency issues, it’s essential to implement appropriate expiration and invalidation policies. This means defining a time-to-live for the data and ensuring it is updated or removed when needed.
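As a minimal sketch of such policies, assuming an IMemoryCache instance named _memoryCache (it is obtained via dependency injection, as shown later in this article), expiration and explicit invalidation might look like this:

```csharp
// Combining expiration policies on a single cache entry:
var options = new MemoryCacheEntryOptions
{
    // Hard upper bound: the entry is evicted 10 minutes after being stored.
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
    // Refreshed on every read: evicted after 2 minutes without access.
    SlidingExpiration = TimeSpan.FromMinutes(2)
};
_memoryCache.Set("products", products, options);

// Explicit invalidation when the source data changes:
_memoryCache.Remove("products");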
Memory management: Excessive cache usage can lead to memory problems, especially in resource-constrained systems. It’s important to monitor memory usage and apply strategies like cache size limits and eviction of less-used items.
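A minimal sketch of a size limit, again assuming an injected IMemoryCache. Note that MemoryCache does not measure actual memory usage: each entry must declare a Size, and SizeLimit is expressed in those same arbitrary units:

```csharp
// Program.cs: cap the cache at 1024 "units"
builder.Services.AddMemoryCache(options => options.SizeLimit = 1024);

// When storing, every entry must declare its size, and may set a priority
// that influences which entries are evicted first under memory pressure:
var entryOptions = new MemoryCacheEntryOptions()
    .SetSize(1)                          // this entry counts as 1 unit toward the limit
    .SetPriority(CacheItemPriority.Low); // evicted before higher-priority entries
_memoryCache.Set("products", products, entryOptions);
```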
Security: The cache may store sensitive data, posing a security risk if not managed correctly. Ensure cached data is encrypted and access is restricted to authorized users and processes.
Implementing Caching
To illustrate in-memory caching, consider an API project built with ASP.NET Core. Suppose we have a controller that returns a list of products by always querying the database, like this:
[HttpGet]
public IActionResult Get()
{
    return Ok(_productsRepository.GetAllProducts());
}
If this product list is frequently requested and the data doesn’t change often, we may end up querying the database multiple times for the same data. This is a perfect scenario for caching.
By applying caching to this endpoint, the number of database queries will decrease, as the data will often be retrieved from memory.
To use in-memory caching in ASP.NET Core, the first step is to register the service in the Program (or Startup) class of the project. Add the following line after AddControllers():
builder.Services.AddMemoryCache();
Next, we need an instance of IMemoryCache in the controller, which can be obtained via dependency injection. Updating our previous example, the controller would look like this:
using CacheDemo.Models;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;

[ApiController]
[Route("[controller]")]
public class ProductsController : ControllerBase
{
    private readonly IMemoryCache _memoryCache;
    private readonly ProductsRepository _productsRepository;

    public ProductsController(IMemoryCache memoryCache, ProductsRepository productsRepository)
    {
        _memoryCache = memoryCache;
        _productsRepository = productsRepository;
    }

    [HttpGet]
    public IActionResult Get()
    {
        return Ok(_productsRepository.GetAllProducts());
    }
}
In the code above, we declare a readonly IMemoryCache field and inject it through the constructor, a common practice in ASP.NET projects. The same is done for the ProductsRepository, which in your project might be a service, a query handler, or even a DbContext.
Now let's modify the Get() action to check the cache before querying the repository:
[HttpGet]
public IActionResult Get()
{
    const string cacheKey = "products";

    if (!_memoryCache.TryGetValue(cacheKey, out List<Product> products))
    {
        products = _productsRepository.GetAllProducts();

        var cacheEntryOptions = new MemoryCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
        };

        _memoryCache.Set(cacheKey, products, cacheEntryOptions);
    }

    return Ok(products);
}
In the code above, we first define a key for the cache entry, "products" in this case. We'll use this key whenever we read, update, or remove the entry.
We then call _memoryCache.TryGetValue. It takes the key we want to read and an output parameter that receives the cached value if it exists. If the value is not in the cache, the if block runs; this is where we populate the cache with the product list so it can be served on the next request.
Inside the block, we call the repository to fill the products variable, then declare a MemoryCacheEntryOptions instance that holds the cache configuration, most importantly the expiration time. In the example, AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5) means the cached value expires 5 minutes after being stored, ensuring periodic refreshes.
Finally, _memoryCache.Set(cacheKey, products, cacheEntryOptions) stores the product list in the cache under the specified key with the given options. For the next 5 minutes, requests to this endpoint will be served from cache without querying the database.
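This read-check-store pattern is common enough that IMemoryCache also offers a GetOrCreate convenience extension. The version below is intended as an equivalent, more compact alternative to the explicit TryGetValue/Set code:

```csharp
[HttpGet]
public IActionResult Get()
{
    // The factory lambda runs only on a cache miss; its return value
    // is stored under the key and returned on subsequent hits.
    var products = _memoryCache.GetOrCreate("products", entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
        return _productsRepository.GetAllProducts();
    });

    return Ok(products);
}
```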
It’s important to highlight that the cache expiration time is one of the key settings for a successful caching strategy. Additionally, it’s sometimes necessary to ensure that the cache is invalidated when data changes at the source, so no requests return outdated data.
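As a sketch of that kind of invalidation, a write endpoint can remove the entry so the next read repopulates it from the database. The AddProduct method and Product model below are illustrative assumptions, not part of the example repository shown earlier:

```csharp
[HttpPost]
public IActionResult Post(Product product)
{
    _productsRepository.AddProduct(product); // hypothetical write method
    _memoryCache.Remove("products");         // next GET rebuilds the cache entry
    return CreatedAtAction(nameof(Get), product);
}
```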
Final Considerations
Efficiently implementing in-memory caching can significantly improve application performance. It’s important to choose the appropriate caching strategy based on the application’s requirements and monitor usage to avoid memory issues. With the tools and techniques discussed here, you can start applying caching to your .NET projects for faster and more scalable performance.