
Request for Object Serialization Support in cache() Function

Open ryo-manba opened this issue 1 year ago • 1 comments

Is there any plan to support object serialization in cache() function similar to how modern data fetching libraries handle caching?

Current Behavior

Currently, React's caching mechanism effectively supports only primitive values as cache keys. This limitation becomes inconvenient when handling multiple properties, necessitating a workaround that manually serializes objects.

React's cache uses a WeakMap keyed by object reference, so a cache hit requires passing the exact same object reference on each call.

In previous cache implementations, it was possible to pass a function that serializes keys into strings. Is there any plan to readopt this approach?
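To illustrate the manual-serialization workaround mentioned above, here is a minimal sketch. It uses a plain Map-based memoizer as a stand-in (React's cache() keys non-primitive arguments by reference via a WeakMap, so fresh object literals would always miss); the memoizer and the `getUser` function are hypothetical names for illustration only:

```javascript
// Stand-in Map-based memoizer, illustrating the pattern only; React's
// cache() would miss on every fresh object literal passed as an argument.
const memo = (fn) => {
  const store = new Map();
  return (key) => {
    if (!store.has(key)) store.set(key, fn(key));
    return store.get(key);
  };
};

let computations = 0;

// The workaround: serialize the object to a primitive string key before
// caching, then parse it back inside the cached function.
const getUser = memo((serialized) => {
  computations += 1;
  const { id, role } = JSON.parse(serialized);
  return `user:${id}:${role}`; // stand-in for an expensive lookup
});

getUser(JSON.stringify({ id: 1, role: 'admin' })); // computed
getUser(JSON.stringify({ id: 1, role: 'admin' })); // cache hit: same string key
```

Because both calls produce the identical string key, the second call hits the cache even though the two object literals are distinct references.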

Examples from Other Libraries

Looking forward to your thoughts!

ryo-manba avatar May 09 '24 01:05 ryo-manba

Thanks for raising this @ryo-manba. I was just about to request something similar as well! Being able to serialize non-primitives so they can be cached will be very helpful, especially since the fetch patch is being (temporarily) removed in https://github.com/facebook/react/pull/28896.

I gave this a little think to see what an API could look like, and I came up with the following:

Introduce a hash function that receives all function arguments and returns an array of argument hashes.

function cache(fn, argsHashFn?) {}

For example:

const x = cache((vec) => sum(vec), (vec) => [vec.join(',')])

// Example
x([1, 2]) // Cache miss
x([1, 2]) // Cache hit
x([1, 3]) // Cache miss

Interesting cases

  • Explicitly hashing some arguments but not others by passing an empty element or undefined. In that case, the default behavior of hashing by the argument itself applies.
const x = cache((s, vec) => scale(s, vec), (_s, vec) => [,JSON.stringify(vec)])
  • Specifying constant hashes to cache all invocations to the same result
const x = cache((s, vec) => scale(s, vec), () => [true, true])
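The proposal above could be sketched in user-land as follows. This is hypothetical (React's cache() takes no second argument today, and `cacheWithHash` is an illustrative name); the flat string key is a simplification, since a real implementation would need nested Maps/WeakMaps so that unhashed object arguments keep reference identity:

```javascript
// Hypothetical user-land sketch of the proposed cache(fn, argsHashFn) API.
// argsHashFn returns an array of hashes; a hole or undefined entry falls
// back to the argument itself (the proposed default behavior).
function cacheWithHash(fn, argsHashFn) {
  const store = new Map();
  return (...args) => {
    const hashes = argsHashFn ? argsHashFn(...args) : [];
    const key = args
      .map((arg, i) => (hashes[i] === undefined ? arg : hashes[i]))
      .join('\u0000'); // join per-argument hashes into one cache key
    if (!store.has(key)) store.set(key, fn(...args));
    return store.get(key);
  };
}

let calls = 0;
const sum = (vec) => {
  calls += 1;
  return vec.reduce((a, b) => a + b, 0);
};

const x = cacheWithHash(sum, (vec) => [vec.join(',')]);

x([1, 2]); // cache miss: key "1,2"
x([1, 2]); // cache hit: same key, sum is not recomputed
x([1, 3]); // cache miss: key "1,3"
```

With the array hashed to a comma-joined string, the two `[1, 2]` calls share a key despite being distinct array references, so `sum` runs only twice across the three calls.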

Thoughts

This may introduce some overhead to the function invocation, but if building the hash array is cheaper than invoking the function, then caching in this way makes sense. Otherwise, a different approach would be better suited.

Extra

Edit: could not figure this out. It looks like the best solution is from inside cache itself. ~~As I typed this, I thought of a more 'cursed' way of achieving memoized behavior without making changes to existing cache implementation. Potentially, you could introduce some Map managed in user-land that could be used to stabilize the references of non-primitive arguments to a function call. Not sure if I know enough to build this out, but I'll give it a go!~~

rexfordessilfie avatar May 10 '24 08:05 rexfordessilfie

This issue has been automatically marked as stale. If this issue is still affecting you, please leave any comment (for example, "bump"), and we'll keep it open. We are sorry that we haven't been able to prioritize it yet. If you have any new additional information, please include it with your comment!

github-actions[bot] avatar Aug 08 '24 10:08 github-actions[bot]

Closing this issue after a prolonged period of inactivity. If this issue is still present in the latest release, please create a new issue with up-to-date information. Thank you!

github-actions[bot] avatar Aug 15 '24 10:08 github-actions[bot]