In an application handling concurrent requests, there is a good chance that your cache appears not to be working: most requests end up hitting the DB to fetch the value, and the cache seems to be missing the value for all of them. This happens because the cache read is not thread safe and typically looks like this:
if (cache != null)
{
    // read from the cache
}
else
{
    // read from the DB and then add to cache
}
So when concurrent requests hit the if block for the first time, the cache is null and all of them go to the DB at the same time. This is expected, since the code is not thread safe. To make it thread safe, we use a lock:
static readonly object _object = new object();

if (cache != null)
{
    // return from the cache
    object value = (object) cache[key]; // RuntimeContext.Get(cachekey);
}
else
{
    lock (_object)
    {
        // Check the cache again: by the time we acquire the lock, the cache
        // may already have been set by the first thread that entered this
        // block. The later threads that were waiting outside the lock can
        // then read the value from the cache instead of the DB, once the
        // thread that first obtained the lock has read the value from the
        // DB and released the lock.
        if (cache != null)
        {
            object value = (object) cache[key]; // RuntimeContext.Get(cachekey);
        }
        else
        {
            // read from the DB
            // add the key & value to cache
        }
    }
}
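To make the pattern above concrete, here is a runnable sketch of the same double-checked locking idea, written in Java (a `synchronized` block plays the role of C#'s `lock`). The `readFromDb` method, the `dbHits` counter, and the key names are stand-ins invented for this demo, not part of the original code; the point it illustrates is that even when many threads request a missing key at once, only one of them performs the DB read.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class DoubleCheckedCache {
    private static final Object lockObj = new Object();
    private static final ConcurrentHashMap<String, Object> cache = new ConcurrentHashMap<>();
    static final AtomicInteger dbHits = new AtomicInteger(); // counts simulated DB reads

    // Stand-in for an expensive DB lookup (hypothetical, for the demo only)
    private static Object readFromDb(String key) {
        dbHits.incrementAndGet();
        try { Thread.sleep(50); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        return "value-for-" + key;
    }

    static Object get(String key) {
        // First (unlocked) check: the fast path once the cache is warm
        Object value = cache.get(key);
        if (value != null) return value;

        synchronized (lockObj) {
            // Second check: another thread may have populated the cache
            // while we were waiting to acquire the lock
            value = cache.get(key);
            if (value == null) {
                value = readFromDb(key);
                cache.put(key, value);
            }
        }
        return value;
    }

    public static void main(String[] args) throws InterruptedException {
        // Fire several concurrent requests for the same missing key
        Thread[] threads = new Thread[8];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(() -> get("user:42"));
            threads[i].start();
        }
        for (Thread t : threads) t.join();
        System.out.println("DB hits: " + dbHits.get()); // prints "DB hits: 1"
    }
}
```

The lock serializes only the cache-miss path; once the value is cached, readers take the unlocked fast path, which is what keeps the pattern cheap after warm-up.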
CAUTION: This approach can backfire and is not one I recommend outright. I have seen cases where, after implementing this change, the throughput of the application got worse than before: the DB hits stopped and the cache worked, but too many threads ended up waiting on the lock in parallel. So analyse the application's performance after making this change; if performance is good, keep it, otherwise look for a better approach. I am yet to find one, and will update this post once I do.