fast-memoize.js

configure cache size

Open leonardoanalista opened this issue 8 years ago • 13 comments

Is it possible to configure the max cache size? I took a look at the source and noticed that the simple cache (Object cache) applies no limits. I'm not comfortable using it knowing that memory could eventually grow until it blows up the browser.

leonardoanalista avatar Jul 03 '16 10:07 leonardoanalista

It is possible. We just have to be sure not to slow down the critical path (which, I assume, is the one without a cache limit). The memoized function without a cache limit should not have to check the cache size. Other than that, I see it as a nice addition.

I see the API as:

memoize(func, {
  cacheLimit: 256
})

Thoughts about it?

caiogondim avatar Jul 05 '16 14:07 caiogondim

Sounds good to me. cacheLimit is the size in number of entries, right?

Additional suggestion: if this second parameter becomes an options object with config capability, then, like reselect, it could also take a function to override the equality check. The default is obviously ===.

Leonardo

leonardoanalista avatar Jul 06 '16 00:07 leonardoanalista

About the default equality check: I see that as a responsibility of the cache itself. And you can already pass a custom cache, as in const memoized = memoize(fn, customCache).

There is no documentation for it yet. I will update the README.

caiogondim avatar Jul 06 '16 10:07 caiogondim
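For illustration, a minimal custom cache could look like the sketch below. This is a hedged example: expensiveFn is a placeholder, and the positional memoize(fn, customCache) form follows the comment above, while later versions of the library document an options object of the shape { cache: { create } } instead.

const memoize = require('fast-memoize')

// A custom cache backed by a Map; the memoizer only needs has/get/set.
const store = new Map()
const customCache = {
  has: key => store.has(key),
  get: key => store.get(key),
  set: (key, value) => { store.set(key, value) }
}

// expensiveFn is a hypothetical function being memoized.
const memoized = memoize(expensiveFn, customCache)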

If you could add some examples of a custom cache, that would be great.

leonardoanalista avatar Jul 06 '16 10:07 leonardoanalista

Documentation on usage of a custom cache and serializer: 187d14c9716f747f1267567dfcc969227be41d36

caiogondim avatar Jul 07 '16 22:07 caiogondim

Isn't the cacheLimit option worth mentioning in the docs?

almirfilho avatar Jun 29 '17 05:06 almirfilho

@almirfilho It's not in the source code. I'm not willing to implement a cacheLimit, since it would mean another check in the hot path. But an example with a customCache is worth doing.

caiogondim avatar Jun 29 '17 21:06 caiogondim

Another idea is to check for opts.cacheLimit and provide a different strategy if it's present, so we don't slow down the hot path when we are not using it.

caiogondim avatar Jun 29 '17 21:06 caiogondim
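A hedged sketch of that idea follows. The memoizeWithLimit wrapper and the cacheLimit option are hypothetical, not part of fast-memoize; the { cache: { create } } options shape is taken from the library's custom-cache documentation and may differ across versions, and sizedCache refers to the bounded cache sketched later in this thread.

const memoize = require('fast-memoize')

// Pick the cache strategy once, up front, so the unlimited case keeps
// the hot path untouched (no per-call size checks).
function memoizeWithLimit (fn, opts = {}) {
  if (!opts.cacheLimit) {
    return memoize(fn) // default unlimited cache
  }
  return memoize(fn, {
    cache: { create: () => sizedCache(opts.cacheLimit) } // bounded cache, e.g. the sizedCache below
  })
}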

A very nice feature would be to allow a cache size of 1 to replicate memoize-once (i.e. leveraging immutability without hogging memory)

Offirmo avatar Aug 22 '18 05:08 Offirmo
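A minimal sketch of that idea, using a hypothetical one-entry cache (not part of the library), could be:

// Remembers only the most recent key/value pair, replicating the
// memoize-once behaviour mentioned above.
function singleEntryCache () {
  let lastKey
  let lastValue
  let filled = false
  return {
    has (key) { return filled && lastKey === key },
    get (key) { return filled && lastKey === key ? lastValue : undefined },
    set (key, value) { lastKey = key; lastValue = value; filled = true }
  }
}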

This shouldn't be included in this package, because its API already allows a custom cache handler. A quick implementation of a sized cache could be:

// FIFO-bounded cache: holds at most `size` entries; each insertion beyond
// that evicts the entry written `size` sets ago.
function sizedCache(size)
{
    const track = new Array(size); // circular buffer of the last `size` keys, in insertion order
    const store = {};              // actual key -> value storage
    let idx = 0;                   // next slot in `track` to overwrite

    return {
        has(key)
        {
            return (key in store);
        },
        get(key)
        {
            return store[key];
        },
        set(key, value)
        {
            store[key] = value;

            // The slot being reused holds the oldest tracked key; evict it from the store.
            if(track[idx])
                delete store[track[idx]];

            track[idx] = key;

            ++idx;

            if(idx === size)
                idx = 0; // wrap around the circular buffer
        }
    }
}

Test suite:


describe(
    'sizedCache',
    function()
    {
        it('Works with integers',
            function()
            {
                const cache = sizedCache(3);
                cache.set(1, 'foo');
                expect(cache.has(1)).toBe(true);
                expect(cache.get(1)).toEqual('foo');
                cache.set(2, 'bar');
                cache.set(3, 'baz');
                cache.set(4, 'baa');
                expect(cache.has(1)).toBe(false); expect(cache.get(1)).toEqual(undefined);
                expect(cache.has(2)).toBe(true); expect(cache.get(2)).toEqual('bar');
                expect(cache.has(3)).toBe(true); expect(cache.get(3)).toEqual('baz');
                expect(cache.has(4)).toBe(true); expect(cache.get(4)).toEqual('baa');
            }
        );

        it('Works with strings',
            function()
            {
                const cache = sizedCache(3);
                cache.set('a', 1);
                expect(cache.has('a')).toBe(true); expect(cache.get('a')).toEqual(1);
                cache.set('b', 2);
                cache.set('c', 3);
                cache.set('d', 4);
                cache.set('e', 5);
                expect(cache.has('a')).toBe(false);
                expect(cache.has('b')).toBe(false); expect(cache.get('b')).toEqual(undefined);
                expect(cache.has('c')).toBe(true); expect(cache.get('c')).toEqual(3);
                expect(cache.has('d')).toBe(true); expect(cache.get('d')).toEqual(4);
                expect(cache.has('e')).toBe(true); expect(cache.get('e')).toEqual(5);
            })
    }
);

GlennMatthys avatar Oct 30 '18 07:10 GlennMatthys
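To plug such a cache into fast-memoize, a hedged usage sketch follows (expensiveFn is a placeholder, and the { cache: { create } } options shape follows the library's custom-cache documentation and may vary by version):

const memoize = require('fast-memoize')

const memoized = memoize(expensiveFn, {
  cache: {
    create: () => sizedCache(256) // keep at most 256 results; older entries are evicted FIFO
  }
})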

Here is a method to set the cache size limit by combining this package with two others, without needing any changes to fast-memoize.

ecerroni avatar Feb 05 '21 21:02 ecerroni

The main point is that the memoizer's cache must have a size limit to be usable in production; otherwise it can grow forever, which is obviously a problem. Making every developer write their own size-limited cache is unrealistic and inconvenient. This library should offer a size-limited cache in addition to the default unlimited one, and let the developer choose which to use.

peacechen avatar Aug 06 '22 04:08 peacechen

I agree with @peacechen. It doesn't seem reasonable to have an unlimited cache in the first place; this is a huge footgun. I can see having an optional cacheLimit and, if it's null (the default), selecting a cache strategy that has no limit.

aaronbeall avatar Mar 19 '23 23:03 aaronbeall