
Feature request: Enables @CacheEvict to delete a collection of keys

Open VeriahL opened this issue 2 years ago • 0 comments

Hello,

The Problem

I found it difficult to delete a collection of cached records from Redis after making a batch update to the database.

public int deleteBatch(Collection<Integer> ids) {
        if (ids == null || ids.isEmpty()) {
            return 0;
        }
        return update(c ->
                c.set(deleted).equalTo((byte) 1)
                        .where(xxx.id, isIn(ids))
                        .and(deleted, isEqualTo((byte) 0)));
    }

In this situation, if I use methods like these to evict the cache:

    public void deleteCache(Collection<Integer> ids) {
        for(Integer id : ids) {
            deleteCache(id);
        }
    }

    @CacheEvict(key = "'id^' + #id")
    default void deleteCache(Integer id) {
    }

A lot of time is wasted, because one eviction command is sent to Redis per key.

Using "allEntries=true" is not a good idea too. Not only too many keys will be deleted, but also command KEYS should never be called in PROD envirnoment.

My solution

To solve this problem, I made some changes in my project.

First, I noticed that a collection passed into the key expression is converted to a String joined by ",". For example, when I call

    @CacheEvict(key = "'id^' + #idCollection")
    default void batchDeleteCache(Collection<Integer> idCollection) {
    }

with idCollection being [1,2,3,4,5], the framework will try to delete a single record whose key is "id^1,2,3,4,5" in Redis.
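
For illustration, a hypothetical caller (the recordMapper bean is made up) ends up hitting that single composite key:

    // the SpEL key above evaluates to the single String "id^1,2,3,4,5"
    recordMapper.batchDeleteCache(List.of(1, 2, 3, 4, 5));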

So I added two fields and a new constructor to RedisCache:

	private final String batchKeyIdentifier; // "^" in this case; any String key containing the identifier is regarded as a key for a collection
	private final String batchKeySplitter;   // "," in this case

	/**
	 * Create a new {@link RedisCache} with a batch key identifier and a batch key splitter.
	 *
	 * @param name must not be {@literal null}.
	 * @param cacheWriter must not be {@literal null}.
	 * @param cacheConfig must not be {@literal null}.
	 * @param batchKeyIdentifier no key will be regarded as a key for a collection when this is {@literal null} or empty.
	 * @param batchKeySplitter no key will be regarded as a key for a collection when this is {@literal null} or empty.
	 */
	protected RedisCache(String name, RedisCacheWriter cacheWriter, RedisCacheConfiguration cacheConfig, String batchKeyIdentifier, String batchKeySplitter) {

		super(cacheConfig.getAllowCacheNullValues());

		Assert.notNull(name, "Name must not be null");
		Assert.notNull(cacheWriter, "CacheWriter must not be null");
		Assert.notNull(cacheConfig, "CacheConfig must not be null");

		this.name = name;
		this.cacheWriter = cacheWriter;
		this.cacheConfig = cacheConfig;
		this.conversionService = cacheConfig.getConversionService();
		this.batchKeyIdentifier = batchKeyIdentifier;
		this.batchKeySplitter = batchKeySplitter;
	}
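
To actually hand such a cache to Spring, a custom cache manager can create it. This is a minimal sketch, assuming the new constructor is reachable (for example by placing the subclass in the org.springframework.data.redis.cache package or widening the constructor's visibility); the class name BatchKeyRedisCacheManager and the hard-coded "^" / "," are my own choices:

    public class BatchKeyRedisCacheManager extends RedisCacheManager {

        private final RedisCacheWriter cacheWriter;
        private final RedisCacheConfiguration defaultConfig;

        public BatchKeyRedisCacheManager(RedisCacheWriter cacheWriter, RedisCacheConfiguration defaultConfig) {
            super(cacheWriter, defaultConfig);
            this.cacheWriter = cacheWriter;
            this.defaultConfig = defaultConfig;
        }

        @Override
        protected RedisCache createRedisCache(String name, RedisCacheConfiguration cacheConfig) {
            // every String cache key containing "^" and "," will be treated as a batch key
            return new RedisCache(name, cacheWriter,
                    cacheConfig != null ? cacheConfig : defaultConfig, "^", ",");
        }
    }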

Then I modified RedisCache#evict(Object key) so that it also handles those special batch keys containing a splitter such as ",":

    @Override
    public void evict(Object key) {
        if (isBatchKey(key)) {
            cacheWriter.removeBatch(name, createAndConvertCacheKeyBatch(key));
        } else {
            cacheWriter.remove(name, createAndConvertCacheKey(key));
        }
    }

RedisCache#isBatchKey(Object key) checks whether the key can be regarded as a key for a collection.

	private boolean isBatchKey(Object key) {
		return batchEnabled() &&
				key instanceof String
				&& ((String) key).contains(batchKeySplitter)
				&& ((String) key).contains(batchKeyIdentifier);
	}

	private boolean batchEnabled() {
		return !batchKeyIdentifier.isEmpty() && !batchKeySplitter.isEmpty();
	}

RedisCache#createAndConvertCacheKeyBatch(Object key) splits a batch key such as "id^1,2,3,4,5" into the individual keys "id^1", "id^2", "id^3", "id^4", "id^5" and then converts them into a byte[][].

    private byte[][] createAndConvertCacheKeyBatch(Object batchKey) {

        String strKey = (String) batchKey;
        int identifierIndex = strKey.indexOf(batchKeyIdentifier);

        // keep everything up to and including the identifier as the shared prefix,
        // e.g. "id^" for "id^1,2,3,4,5" (also works for identifiers longer than one character)
        String prefix = strKey.substring(0, identifierIndex + batchKeyIdentifier.length());
        String keys = strKey.substring(identifierIndex + batchKeyIdentifier.length());
        // quote the splitter so regex meta characters such as "|" are treated literally
        String[] keyArray = keys.split(java.util.regex.Pattern.quote(batchKeySplitter));

        byte[][] keysBytes = new byte[keyArray.length][];
        for (int i = 0; i < keyArray.length; i++) {
            // re-attach the prefix to each id and serialize it like any other cache key
            keysBytes[i] = serializeCacheKey(createCacheKey(prefix + keyArray[i]));
        }
        return keysBytes;
    }
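
As a quick standalone illustration of the splitting step (not part of the patch), assuming identifier "^" and splitter ",":

    public class BatchKeyDemo {

        public static void main(String[] args) {
            String batchKey = "id^1,2,3,4,5";
            String identifier = "^";
            String splitter = ",";

            int idx = batchKey.indexOf(identifier);
            String prefix = batchKey.substring(0, idx + identifier.length()); // "id^"
            String[] ids = batchKey.substring(idx + identifier.length())
                    .split(java.util.regex.Pattern.quote(splitter));

            for (String id : ids) {
                System.out.println(prefix + id); // prints id^1 ... id^5, one per line
            }
        }
    }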

RedisCacheWriter#removeBatch(String name, byte[][] keys) clears the given keys with a single Redis command. Thanks to the varargs of RedisConnection#del(byte[]... keys), removeBatch is easy to write:

    public void removeBatch(String name, byte[][] keys) {

        Assert.notNull(name, "Name must not be null!");
        Assert.notNull(keys, "Keys must not be null!");

        execute(name, connection -> connection.del(keys));

    }
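
Put together, one annotated method call then evicts the whole batch with a single DEL round trip. A hedged end-to-end sketch, assuming the batch-aware cache above is configured with identifier "^" and splitter ","; the cache name "records" and the interface name are illustrative only:

    public interface RecordMapper {

        // evicts "id^1", "id^2", ..., "id^5" from the "records" cache with one DEL
        @CacheEvict(cacheNames = "records", key = "'id^' + #idCollection")
        default void batchDeleteCache(Collection<Integer> idCollection) {
        }
    }

Evicting the list [1,2,3,4,5] now costs one command instead of five.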

My project has been running stably with these modifications.

In addition, the batch-key approach can also be used to cache scattered key-value records individually instead of one whole collection, whose single key is hard to reuse. (For example, a record cached under the key "id^1,2,3,4,5" cannot be hit by the key "id^2,3,4", even though it already contains the values for ids 2, 3 and 4.)

Is this a valuable feature for spring-data-redis?

VeriahL · Jun 30 '22 14:06