Caching isn’t an architecture; it’s an optimisation.
Caching is an important part of many applications: it shortens response times and offers easy performance wins in many use cases. It is especially useful when computing a value is very expensive or when loading a resource is slow.
We recently ran into a situation where we needed to cache data that took around 600 ms to fetch from its source. Since the data set was small, we didn’t want to run Redis; the cost of maintaining a Redis server would be high for such a small amount of data. So we explored in-memory caching and came across several options:
- ConcurrentHashMap
- Google’s Guava
- Caffeine
The basic approach was to use a ConcurrentHashMap, but the disadvantage of using maps for caching is that you have to implement entry eviction yourself, e.g. to keep the size under a given limit. When you develop for a concurrent environment the task gets more complicated, and the code turns messy.
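Here is a minimal sketch of what that map-based cache looks like; the class, the key/value types, and the loader are all illustrative stand-ins, not our actual code:

```java
import java.util.Iterator;
import java.util.concurrent.ConcurrentHashMap;

// Hand-rolled cache on a ConcurrentHashMap: every read needs a null check,
// and every write needs a size check to keep the map at (about) 100 entries.
class NaiveMapCache {
    private static final int MAX_SIZE = 100;
    private final ConcurrentHashMap<String, String> map = new ConcurrentHashMap<>();

    String get(String key) {
        String value = map.get(key);
        if (value == null) {                         // null check on every lookup
            value = loadFromSource(key);
            if (map.size() >= MAX_SIZE) {            // eviction intercepts every add
                Iterator<String> it = map.keySet().iterator();
                if (it.hasNext()) {
                    it.next();
                    it.remove();                     // evict an arbitrary entry
                }
            }
            // Note the check-then-act race: two threads can load the same key
            // concurrently, and the size can briefly exceed the limit.
            map.put(key, value);
        }
        return value;
    }

    private String loadFromSource(String key) {
        return "value-for-" + key;                   // stands in for the ~600 ms fetch
    }
}
```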
The smelliest part of this code, apart from the null checks, is the eviction: to keep the map size below 100 we have to intercept every add operation.
Then we came across Google’s Guava Cache. Guava provides a powerful in-memory caching mechanism through the LoadingCache&lt;K, V&gt; interface: values are loaded into the cache automatically, and it offers many utility methods for common caching needs.
The above piece of code can then be rewritten as follows.
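A sketch of the Guava version (the key/value types, the size and expiry settings, and the loader body are illustrative assumptions):

```java
import java.util.concurrent.TimeUnit;

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

// Guava LoadingCache: eviction and thread-safe loading are handled internally.
class GuavaCacheExample {
    private final LoadingCache<String, String> cache = CacheBuilder.newBuilder()
            .maximumSize(100)                        // size-based eviction, no interception
            .expireAfterWrite(10, TimeUnit.MINUTES)  // optional time-based expiry
            .build(new CacheLoader<String, String>() {
                @Override
                public String load(String key) {
                    return loadFromSource(key);      // the expensive fetch goes here
                }
            });

    String get(String key) {
        return cache.getUnchecked(key);              // loads on miss, cached afterwards
    }

    private String loadFromSource(String key) {
        return "value-for-" + key;                   // stands in for the real source
    }
}
```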
The ugly code has now vanished: thread-safe storage and eviction are handled by Guava’s internal implementation, and Guava’s API makes the code much more readable. If you want to understand these caches in detail, the Guava User Guide does a great job.
In the end, though, we found an even better library. Caffeine achieves a noticeably higher cache hit rate than Guava, and it is effectively a rewrite of Guava’s cache with an API that returns CompletableFutures out of the box, allowing entries to be loaded into the cache asynchronously and automatically. So we ultimately went with Caffeine.
Another advantage we found is that Caffeine can be configured as the CacheManager. If in the future you want to switch to Redis or any other caching library, all you have to do is update the cache manager; the rest is taken care of by the Cacheable annotation.
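Wiring this up in Spring might look like the following sketch; the class name, bean setup, and cache name are illustrative assumptions, not our production config:

```java
import com.github.benmanes.caffeine.cache.Caffeine;
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.caffeine.CaffeineCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Caffeine exposed through Spring's CacheManager abstraction: swapping in
// Redis later only means replacing this bean, not the @Cacheable call sites.
@Configuration
@EnableCaching
class CacheConfig {
    @Bean
    public CacheManager cacheManager() {
        CaffeineCacheManager manager = new CaffeineCacheManager();
        manager.setCaffeine(Caffeine.newBuilder().maximumSize(100));
        return manager;
    }
}

// Elsewhere, a service method is cached declaratively, e.g.:
// @Cacheable("responses")
// public Response fetch(String id) { ... }
```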