Caching Mechanism
Caching is the process of storing frequently used data so that it can be served much faster on future requests. The most frequently accessed data is copied into temporary storage so that subsequent calls from the client can be answered without regenerating it. Caching significantly improves the performance of an application by reducing the work needed to produce each response.
An in-memory cache is stored in the memory of the single server hosting the application; in other words, the data is cached within the application process. This is the easiest way to drastically improve application performance.
The main advantage of in-memory caching is speed: because it avoids communication over a network, it is much quicker than distributed caching, which makes it well suited for small-scale applications. The main disadvantage is maintaining cache consistency when the application is scaled out across multiple servers, for example when deployed to the cloud, since each instance holds its own copy of the cache.
There are two important terms used with caching: cache hit and cache miss. A cache hit occurs when the requested data is found in the cache; a cache miss occurs when it is not.
To use in-memory caching, we need to install the Microsoft.Extensions.Caching.Memory and Microsoft.Extensions.Caching.Abstractions packages from the NuGet Package Manager.
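To illustrate the hit/miss pattern at its most basic, the sketch below uses IMemoryCache directly with a standalone MemoryCache instance; the key name and cached value are illustrative only and are not taken from the application.
using Microsoft.Extensions.Caching.Memory;

// Illustrative only: a standalone MemoryCache showing a miss followed by a hit.
IMemoryCache cache = new MemoryCache(new MemoryCacheOptions());

if (!cache.TryGetValue("greeting", out string greeting))    // cache miss on the first lookup
{
    greeting = "Hello from the cache";                       // normally produced by an expensive call
    cache.Set("greeting", greeting);                         // store it for later requests
}

bool isHit = cache.TryGetValue("greeting", out greeting);    // cache hit on the second lookup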
To implement the in-memory cache, we create an ICacheService interface and its implementation class MemoryCacheService, as shown below:
public interface ICacheService
{
    bool TryGet<T>(string cacheKey, out T value);
    T Set<T>(string cacheKey, T value);
    void Remove(string cacheKey);
}
public class MemoryCacheService : ICacheService
{
    private readonly IMemoryCache _memoryCache;
    private readonly MemoryCacheEntryOptions _cacheOptions;

    public MemoryCacheService(IMemoryCache memoryCache, IOptions<CacheConfiguration> cacheConfig)
    {
        _memoryCache = memoryCache;
        var config = cacheConfig.Value;
        if (config != null)
        {
            // Entries expire after the configured absolute time, or earlier
            // if they are not accessed within the sliding expiration window.
            _cacheOptions = new MemoryCacheEntryOptions
            {
                AbsoluteExpiration = DateTime.Now.AddHours(config.AbsoluteExpirationInHours),
                Priority = CacheItemPriority.High,
                SlidingExpiration = TimeSpan.FromMinutes(config.SlidingExpirationInMinutes)
            };
        }
    }

    public bool TryGet<T>(string cacheKey, out T value)
    {
        // Returns true when the key is present in the cache (cache hit), false otherwise (cache miss).
        _memoryCache.TryGetValue(cacheKey, out value);
        return value != null;
    }

    public T Set<T>(string cacheKey, T value)
    {
        // Writes the value to the cache using the configured expiration options.
        return _memoryCache.Set(cacheKey, value, _cacheOptions);
    }

    public void Remove(string cacheKey)
    {
        _memoryCache.Remove(cacheKey);
    }
}
The code snippet above exposes the following methods of the in-memory cache service for reading and writing data in the cache:
- TryGet: returns a boolean indicating whether the item was found in the cache. The item itself is returned through an output parameter. If TryGet() returns false, Set() is used to add the entry.
- Set: writes data to the cache. Internally it calls IMemoryCache.Set with three arguments: the cache key, the data to be cached, and the expiration options.
- Remove: removes the entry for the given cache key.
In-memory caching is consumed as a service through dependency injection, so we register it in the AddInfrastructureServices method of the InfrastructureServiceRegistration class, as per the following code snippet.
public static class InfrastructureServiceRegistration
{
    public static IServiceCollection AddInfrastructureServices(this IServiceCollection services, IConfiguration configuration)
    {
        services.Configure<EmailSettings>(configuration.GetSection("EmailSettings"));
        services.AddTransient<ICsvExporter, CsvExporter>();
        services.AddTransient<IEmailService, EmailService>();
        services.Configure<CacheConfiguration>(configuration.GetSection("CacheConfiguration"));
        services.AddMemoryCache();
        services.AddTransient<ICacheService, MemoryCacheService>();
        return services;
    }
}
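The expiration values used by MemoryCacheService are bound from the CacheConfiguration section of the application settings. The options class itself is not shown above; a minimal sketch, assuming only the two properties read in the constructor and with example values chosen purely for illustration, could look like this:
// Bound from the "CacheConfiguration" section of appsettings.json, for example:
// "CacheConfiguration": {
//   "AbsoluteExpirationInHours": 6,
//   "SlidingExpirationInMinutes": 30
// }
public class CacheConfiguration
{
    public int AbsoluteExpirationInHours { get; set; }
    public int SlidingExpirationInMinutes { get; set; }
}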
We will use this caching policy to store the list of all categories. For that we have a CategoryRepository, which consumes the in-memory cache as shown below:
public class CategoryRepository : BaseRepository<Category>, ICategoryRepository
{
    private readonly ILogger _logger;
    private readonly string cacheKey = $"{typeof(Category)}";
    private readonly ICacheService _cacheService;

    public CategoryRepository(GloboTicketDbContext dbContext, ILogger<Category> logger, ICacheService cacheService) : base(dbContext, logger)
    {
        _logger = logger;
        _cacheService = cacheService;
    }

    public async Task<IReadOnlyList<Category>> GetAllCategories()
    {
        _logger.LogInformation("GetAllCategories Initiated");
        // Cache miss: load the categories from the database and store them in the cache.
        if (!_cacheService.TryGet(cacheKey, out IReadOnlyList<Category> cachedList))
        {
            cachedList = await _dbContext.Set<Category>().ToListAsync();
            _cacheService.Set(cacheKey, cachedList);
        }
        _logger.LogInformation("GetAllCategories Completed");
        return cachedList;
    }
}
The GetAllCategories() method checks the cache first and, on a cache miss, loads the categories from the database and stores them in the cache for subsequent calls.
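One thing to keep in mind is cache invalidation: if categories can be added or modified, the cached list stays stale until it expires. The hypothetical AddCategory method below (not part of the repository shown above) sketches how Remove() could be used to evict the entry whenever the underlying data changes:
// Hypothetical write method: evict the cached list so the next read reloads fresh data.
public async Task<Category> AddCategory(Category category)
{
    await _dbContext.Set<Category>().AddAsync(category);
    await _dbContext.SaveChangesAsync();
    _cacheService.Remove(cacheKey);   // invalidate the cached category list
    return category;
}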
With that done, let's run the application, make a request to the GetAllCategories endpoint, and see the results.
The first API call takes 2084 ms.
In the first call we query the database directly, which may be slow depending on traffic, connection status, the size of the response, and so on. Because the API also stores those records in the cache during that first call, the second call is served from memory and is expected to respond in far less time.
As you can see, the second API call takes only 36 ms.