How to make a cache thread-safe in your Java application

This page describes how to develop a thread-safe cache using the synchronized block.

Introduction

A method is thread-safe if multiple threads can call it simultaneously without breaking its functionality. Achieving thread safety is a complex task, so general-purpose classes are usually not thread-safe. The most common way to achieve thread safety is to lock a resource for exclusive use by a single thread at any given time.
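
For example, here is a minimal sketch (unrelated to GroupDocs.Viewer) of how a synchronized block locks a shared object so that only one thread at a time can modify it:

class Counter {
    private final Object lock = new Object();
    private int value;

    public void increment() {
        // Only one thread at a time can enter this block, so the
        // read-modify-write of 'value' is never interleaved.
        synchronized (lock) {
            value++;
        }
    }

    public int getValue() {
        synchronized (lock) {
            return value;
        }
    }
}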

Issue

You need to develop a web application where multiple users can simultaneously view the same file. The web application uses GroupDocs.Viewer on the server side. You have to ensure that multiple threads can safely read from and write to the cache.

In GroupDocs.Viewer, you can use caching to improve performance when the same document is processed multiple times (read more about caching here). The FileCache class is an implementation of the Cache interface that stores cache files on a local disk. The FileCache class is not thread-safe, so you have to add the synchronization yourself.
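
For reference, this is roughly how a (not yet thread-safe) FileCache is plugged into the viewer. The sketch below is based on the GroupDocs.Viewer for Java API; the cache folder, document name, and output file name format are placeholders:

import com.groupdocs.viewer.Viewer;
import com.groupdocs.viewer.ViewerSettings;
import com.groupdocs.viewer.caching.FileCache;
import com.groupdocs.viewer.options.HtmlViewOptions;

public class CachedViewerExample {
    public static void main(String[] args) {
        // Rendered pages are stored in the "cache" folder on disk,
        // so rendering the same document again can reuse them.
        FileCache cache = new FileCache("cache");
        ViewerSettings settings = new ViewerSettings(cache);

        try (Viewer viewer = new Viewer("sample.docx", settings)) {
            HtmlViewOptions viewOptions = HtmlViewOptions.forEmbeddedResources("page_{0}.html");
            viewer.view(viewOptions);
        }
    }
}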

Solution

The FileCache class reads and writes output files on a local disk, so you need to make the disk access thread-safe. The simplest way is to use the synchronized statement. Implement a ThreadSafeCache class that wraps a non-thread-safe implementation of the Cache interface:

import java.util.List;

// Cache interface shipped with GroupDocs.Viewer for Java.
import com.groupdocs.viewer.caching.Cache;

/**
 * A thread-safe wrapper around any Cache implementation.
 * All access to the wrapped cache is serialized with a synchronized block,
 * so only one thread at a time can read or write the underlying storage.
 */
class ThreadSafeCache implements Cache {
    private final Cache _cache;

    public ThreadSafeCache(Cache cache) {
        _cache = cache;
    }

    @Override
    public void set(String key, Object value) {
        synchronized (_cache) {
            _cache.set(key, value);
        }
    }

    @Override
    public Object get(String key) {
        synchronized (_cache) {
            return _cache.get(key);
        }
    }

    @Override
    public List<String> getKeys(String filter) {
        synchronized (_cache) {
            return _cache.getKeys(filter);
        }
    }
}
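
Once the wrapper is in place, the only change to the setup shown earlier is wrapping the FileCache before passing it to ViewerSettings. As before, the folder and file names are placeholders:

// Wrap the disk-based cache so that concurrent requests can share it safely.
Cache cache = new ThreadSafeCache(new FileCache("cache"));
ViewerSettings settings = new ViewerSettings(cache);

try (Viewer viewer = new Viewer("sample.docx", settings)) {
    viewer.view(HtmlViewOptions.forEmbeddedResources("page_{0}.html"));
}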

Result

As this tutorial shows, the synchronized statement lets you achieve thread safety in your applications with just a few lines of code.