Using LazyCache for clean and simple .NET Core in-memory caching
I'm continuing to use .NET Core 2.1 to power my Podcast Site, and I've done a series of posts on some of the experiments I've been doing. I also upgraded to the .NET Core 2.1 RC that came out this week. Here are some posts if you want to catch up:
- Eyes wide open - Correct Caching is always hard
- The Programmer's Hindsight - Caching with HttpClientFactory and Polly Part 2
- Adding Cross-Cutting Memory Caching to an HttpClientFactory in ASP.NET Core with Polly
- Adding Resilience and Transient Fault handling to your .NET Core HttpClient with Polly
- HttpClientFactory for typed HttpClient instances in ASP.NET Core 2.1
- Updating jQuery-based Lazy Image Loading to IntersectionObserver
- Automatic Unit Testing in .NET Core plus Code Coverage in Visual Studio Code
- Setting up Application Insights took 10 minutes. It created two days of work for me.
- Upgrading my podcast site to ASP.NET Core 2.1 in Azure plus some Best Practices
Having a blast, if I may say so.
I've been trying a number of ways to cache locally. I have an expensive call to a backend (7-8 seconds or more, without deserialization) so I want to cache it locally for a few hours until it expires. I have a way that works very well using a SemaphoreSlim. There are some issues to be aware of, but it has been rock solid. However, in the comments of the last caching post a number of people suggested I use "LazyCache."
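For reference, the SemaphoreSlim pattern from that earlier post boils down to double-checked locking around the expensive call. Here's a simplified sketch (not the exact code from that post; ManualCache and its members are my own illustrative names):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public class ManualCache<T>
{
    private readonly SemaphoreSlim _lock = new SemaphoreSlim(1, 1);
    private T _value;
    private DateTimeOffset _expires = DateTimeOffset.MinValue;

    // Only one caller at a time runs the expensive factory; everyone
    // else waits on the semaphore and then reads the cached value.
    public async Task<T> GetOrAddAsync(Func<Task<T>> factory, TimeSpan ttl)
    {
        if (DateTimeOffset.UtcNow < _expires) return _value;
        await _lock.WaitAsync();
        try
        {
            // Double-check: another caller may have refreshed while we waited.
            if (DateTimeOffset.UtcNow < _expires) return _value;
            _value = await factory();
            _expires = DateTimeOffset.UtcNow.Add(ttl);
            return _value;
        }
        finally
        {
            _lock.Release();
        }
    }
}
```

This works, but it's exactly the locking boilerplate that LazyCache is meant to subtract.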
Alastair from the LazyCache team said this in the comments:
LazyCache wraps your "build stuff I want to cache" func in a Lazy<> or an AsyncLazy<> before passing it into MemoryCache to ensure the delegate only gets executed once as you retrieve it from the cache. It also allows you to swap between sync and async for the same cached thing. It is just a very thin wrapper around MemoryCache to save you the hassle of doing the locking yourself. A netstandard 2 version is in pre-release.
Since you asked the implementation is in CachingService.cs#L119 and proof it works is in CachingServiceTests.cs#L343
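The trick Alastair describes - caching the Lazy wrapper rather than the result - can be illustrated in a few lines (my simplified sketch, not LazyCache's actual implementation):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class LazyCacheSketch
{
    // The cache stores the Lazy<Task<T>> wrapper, not the result itself.
    private static readonly ConcurrentDictionary<string, object> _cache =
        new ConcurrentDictionary<string, object>();

    public static Task<T> GetOrAddAsync<T>(string key, Func<Task<T>> factory)
    {
        // Even if two threads race on GetOrAdd, only one Lazy wins the slot,
        // and Lazy<T> guarantees its factory executes at most once.
        var lazy = (Lazy<Task<T>>)_cache.GetOrAdd(
            key, _ => new Lazy<Task<T>>(factory));
        return lazy.Value;
    }
}
```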
Nice! Sounds like it's worth trying out. Most importantly, it'll allow me to "refactor via subtraction."
I want to have my "GetShows()" method go off and call the backend "database" which is a REST API over HTTP living at SimpleCast.com. That backend call is expensive and doesn't change often. I publish new shows every Thursday, so ideally SimpleCast would have a standard WebHook and I'd cache the result forever until they called me back. For now I will just cache it for 8 hours - a long but mostly arbitrary number. Really want that WebHook as that's the correct model, IMHO.
LazyCache was added in ConfigureServices in my Startup.cs:
services.AddLazyCache();
Kind of anticlimactic. ;)
Then I just make a method that knows how to populate my cache. That's just a "Func" that returns a Task of List of Shows as you can see below. Then I call IAppCache's "GetOrAddAsync" from LazyCache that either GETS the List of Shows out of the Cache OR it calls my Func, does the actual work, then returns the results. The results are cached for 8 hours. Compare this to my previous code and it's a lot cleaner.
public class ShowDatabase : IShowDatabase
{
    private readonly IAppCache _cache;
    private readonly ILogger _logger;
    private SimpleCastClient _client;

    public ShowDatabase(IAppCache appCache,
            ILogger<ShowDatabase> logger,
            SimpleCastClient client)
    {
        _client = client;
        _logger = logger;
        _cache = appCache;
    }

    public async Task<List<Show>> GetShows()
    {
        Func<Task<List<Show>>> showObjectFactory = () => PopulateShowsCache();
        var retVal = await _cache.GetOrAddAsync("shows", showObjectFactory, DateTimeOffset.Now.AddHours(8));
        return retVal;
    }

    private async Task<List<Show>> PopulateShowsCache()
    {
        List<Show> shows = await _client.GetShows();
        _logger.LogInformation($"Loaded {shows.Count} shows");
        return shows.Where(c => c.PublishedAt < DateTime.UtcNow).ToList();
    }
}
It's always important to point out there's a dozen or more ways to do this. I'm not selling a prescription here or The One True Way, but rather exploring the options and edges and examining the trade-offs.
- As mentioned before, my use of "shows" as a magic string for the key makes no guarantee that a co-worker isn't also using "shows" as their key.
- Solution? Depends. I could have a function-specific unique key but that only ensures this function is fast twice. If someone else is calling the backend themselves I'm losing the benefits of a centralized (albeit process-local - not distributed like Redis) cache.
- I'm also caching the full list and then doing a where/filter every time.
- A little sloppiness on my part, but also because I'm still feeling this area out. Do I want to cache the whole thing and then let the callers filter? Or do I want to have GetShows() and GetActiveShows()? Dunno yet. But worth pointing out.
- There's layers to caching. Do I cache the HttpResponse but not the deserialization? Here I'm caching the List<Shows>, complete. I like caching List<T> because a caller can query it, although I'm sending back just active shows (see above).
- Another perspective is to use the <cache> TagHelper in Razor and cache Razor's resulting rendered HTML. There is value in caching the object graph, but I need to think about perhaps caching both List<T> AND the rendered HTML.
- I'll explore this next.
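On the magic-string point above, one lightweight mitigation is to centralize key construction so collisions become unlikely by design. A sketch (CacheKeys is my own hypothetical helper, not part of LazyCache):

```csharp
public static class CacheKeys
{
    // Namespacing keys by type plus operation name makes it unlikely
    // that two co-workers accidentally share a bare "shows" key.
    public static string For<T>(string operation) =>
        $"{typeof(T).FullName}:{operation}";
}
```

The call site then becomes something like `_cache.GetOrAddAsync(CacheKeys.For<Show>("all"), ...)` - still process-local, but at least unambiguous.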
I'm enjoying myself though. ;)
Go explore LazyCache! I'm using beta2, but there are a number of releases going back years and it's been quite stable so far.
LazyCache is a simple in-memory caching service. It has a developer-friendly, generics-based API, and provides a thread-safe cache implementation that guarantees to only execute your cacheable delegates once (it's lazy!). Under the hood it leverages ObjectCache and Lazy<T> to provide performance and reliability in heavy-load scenarios.
In ASP.NET Core it's quick to get LazyCache set up and experiment with it. Give it a try, and share your favorite caching techniques in the comments.
Sponsor: Check out JetBrains Rider: a cross-platform .NET IDE. Edit, refactor, test and debug ASP.NET, .NET Framework, .NET Core, Xamarin or Unity applications. Learn more and download a 30-day trial!
About Scott
Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.
Much easier than cache timeouts, and less complicated than using a third party caching mechanism.
I'm also caching the full list and then doing a where/filter every time.
Can you point to where that is occurring? My naive reading of the code suggests the Where is prior to caching.
var item = await cache.GetOrCreateAsync("mykeyfoo-" + id, async (c) =>
{
    c.SetAbsoluteExpiration(DateTimeOffset.Now.AddMinutes(1));
    c.SetSlidingExpiration(TimeSpan.FromMinutes(1));
    return await GetViewModelAsync(id, true);
});
Is there any reason to prefer LazyCache's implementation?
That way, you won't have an unlucky user who has a 7-8 second wait every 8 hours. You could also reduce that 8 hour interval to one hour, or even 30 or 15 minutes.
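The refresh-ahead idea in this comment could be sketched with a timer that repopulates the value in the background, so no request ever pays the 7-8 second cost (RefreshingCache is my own illustrative type, not LazyCache's API):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public class RefreshingCache<T>
{
    private readonly Func<Task<T>> _factory;
    private readonly Timer _timer;
    private T _value;

    public RefreshingCache(Func<Task<T>> factory, TimeSpan interval)
    {
        _factory = factory;
        // Prime the cache once up front so the first reader never waits...
        _value = factory().GetAwaiter().GetResult();
        // ...then refresh in the background on the chosen interval.
        _timer = new Timer(async _ => _value = await _factory(),
                           null, interval, interval);
    }

    public T Value => _value;
}
```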
Func<Task<List<Show>>> showObjectFactory = () => PopulateShowsCache();
var retVal = await _cache.GetOrAddAsync("shows", showObjectFactory, DateTimeOffset.Now.AddHours(8));
To this:
var retVal = await _cache.GetOrAddAsync("shows", this.PopulateShowsCache, DateTimeOffset.Now.AddHours(8));
List<Show> shows = shows = await _client.GetShows();
Should it not rather be:
List<Show> shows = await _client.GetShows();
Something to notice if LazyCache is using it behind the scenes...
Nice article, as usual.
What about Microsoft.Extensions.Caching.Memory? Isn't it supposed to replace ObjectCache?