fast-memoize.js

serializing class instances for 2+ arguments

Open stryju opened this issue 9 years ago • 9 comments

reading through the code, I found this:

if (arguments.length === 1) {
  cacheKey = arguments[0]
} else {
  cacheKey = serializer(arguments)
}

given that the default serializer uses JSON.stringify()

function jsonStringify () {
  return JSON.stringify(arguments)
}

you'll potentially get wrong cache hits for any memoized function that accepts class instances as arguments, whenever the instance doesn't expose all of its properties as enumerable and more than one argument is provided.


example

// prep

class LotteryTicket {
  constructor(code) {
    Object.defineProperty(this, 'code', {
      enumerable: false,
      configurable: false,
      writable: false,
      value: code,
    });
  }

  check(code) {
    return this.code === code;
  }
}

function checkWinner(ticket, code) {
  return ticket.check(code);
}

var memoizedCheckWinner = memoize(checkWinner);

// test

memoizedCheckWinner(new LotteryTicket('xxx'), 'xxx'); // true, correct
// serialized cacheKey: {"0":{"0":{},"1":"xxx"}}

memoizedCheckWinner(new LotteryTicket('yyy'), 'xxx'); // true; incorrect
// serialized cacheKey: {"0":{"0":{},"1":"xxx"}}; wrong cache hit
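
For the record, the collision comes from `JSON.stringify` skipping non-enumerable own properties, so every ticket serializes to the same empty object — a minimal repro:

```javascript
class LotteryTicket {
  constructor(code) {
    // enumerable: false hides `code` from JSON.stringify
    Object.defineProperty(this, 'code', {
      enumerable: false,
      value: code,
    });
  }
}

// Both instances serialize identically, despite different codes.
console.log(JSON.stringify(new LotteryTicket('xxx'))); // {}
console.log(JSON.stringify(new LotteryTicket('yyy'))); // {}
```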

stryju avatar Jul 04 '16 12:07 stryju

Thanks for the well written bug report. Will make a test to cover this case and push a fix.

The trick here is how to make it as fast as possible. Will have to implement a couple different algorithms and add to the benchmark suite.

caiogondim avatar Jul 05 '16 14:07 caiogondim

happy to help.

I understand that the speed is the main priority, but I'd consider it a rather serious flaw - it should be at least documented :)

stryju avatar Jul 05 '16 22:07 stryju

Another potential issue: since you're using instances as cache keys, you may keep those instances from being GC'd, hogging memory - just something to think about :)

stryju avatar Jul 05 '16 22:07 stryju

Totally true. Maybe WeakMap can help with that last issue?
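
A minimal sketch of the WeakMap idea for the single-object-argument case (`memoizeOnObject` is a hypothetical name, not part of the library):

```javascript
// Cache keyed on the object argument itself via WeakMap, so the
// cache entry doesn't keep the instance from being GC'd.
function memoizeOnObject(fn) {
  const cache = new WeakMap();
  return function (obj) {
    if (!cache.has(obj)) {
      cache.set(obj, fn(obj));
    }
    return cache.get(obj);
  };
}

const square = memoizeOnObject(o => o.n * o.n);
const point = { n: 3 };
console.log(square(point)); // 9 (computed)
console.log(square(point)); // 9 (cache hit, same object identity)
```

Once the key object becomes unreachable, its entry is collectible along with it — though this only keys on identity of a single object, so the multi-argument serialization question remains open.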

caiogondim avatar Jul 06 '16 10:07 caiogondim

sure, but here's a twist: how will you serialize and compare the arguments?

stryju avatar Jul 06 '16 10:07 stryju

You could create a special cache for calls with objects in the arguments. That would be a serious perf hit, though... :/

stryju avatar Jul 06 '16 10:07 stryju

Since this lib is very focused on being fast, I'm starting to think the only way to maintain speed is to expose some complexity to the user.

Like, if you want to memoize a function that will receive non-primitive arguments, do something like:

const memoized = memoize(fn, {
  optimizeFor: 'object'
})

For primitive types:

const memoized = memoize(fn, {
  optimizeFor: 'primitive'
})

Or if you (as a client of the lib) don't want to think about details, just use a common solution:

const memoized = memoize(fn)

This way the memoize function works as a factory, returning the best solution for each case.

Thoughts?

caiogondim avatar Jul 06 '16 16:07 caiogondim

I'd just handle the one-argument scenario and document the limitation without implementing a fix. The current solution will work for most cases, so it should be good.

However, it needs to be documented/mentioned :)


You could provide the proposed solution as well, given that it's documented too :)

WDYT?

stryju avatar Jul 06 '16 17:07 stryju

Sorry for intruding on this, but I would love to have it fixed. So, what if we had something like:

const hash = Array.prototype.slice.call(arguments)
  .map(item => typeof item === 'function' ? item.toString() : JSON.stringify(item))
  .join('|');

Then, hash is how you index your result.

danilosterrapid7 avatar Apr 18 '18 08:04 danilosterrapid7