
Question: Does undici.fetch support RFC 7234?

loynoir opened this issue on Dec 14 '21 · 13 comments

This would solve...

The implementation should look like...

https://www.npmjs.com/package/http-cache-semantics

RFC 7234 RFC 5861
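
For context, the linked http-cache-semantics package implements the RFC 7234 logic as a CachePolicy class that answers whether a response is storable, whether a later request can be served from cache, and for how long. A minimal usage sketch (the request/response shapes below are illustrative, not tied to undici):

import CachePolicy from 'http-cache-semantics'

// plain header/status objects, as expected by http-cache-semantics
const request = { url: '/data', method: 'GET', headers: { host: 'example.com' } }
const response = { status: 200, headers: { 'cache-control': 'public, max-age=300' } }

const policy = new CachePolicy(request, response)

if (policy.storable()) {
  // persist { policy: policy.toObject(), body } for policy.timeToLive() ms
}

// later, for a matching request:
if (policy.satisfiesWithoutRevalidation(request)) {
  // serve the cached body with policy.responseHeaders()
}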

I have also considered...

Additional context

loynoir avatar Dec 14 '21 16:12 loynoir

Not at this point. I think it would be great to support it - I would keep it optional as it'd be unclear where to store the cache.

mcollina avatar Dec 14 '21 17:12 mcollina

https://github.com/sindresorhus/got/blob/HEAD/documentation/cache.md

const storageAdapter = new Map();
await got('https://sindresorhus.com', {cache: storageAdapter});

Maybe something like this

const fetch2 = undici.sudoCreateFetch({
  // all super user options, outside whatwg fetch
  cache: storageAdapter
})

loynoir avatar Dec 14 '21 17:12 loynoir

https://github.com/sindresorhus/got/blob/HEAD/documentation/cache.md

const storageAdapter = new Map();
await got('https://sindresorhus.com', {cache: storageAdapter});

Maybe something like this

const fetch2 = undici.sudoCreateFetch({
  // all super user options, outside whatwg fetch
  cache: storageAdapter
})

I think if any caching layer is implemented, it should support an asynchronous API - it makes more sense to me

MoLow avatar Feb 18 '22 07:02 MoLow

cc @jasnell

mcollina avatar Feb 18 '22 08:02 mcollina

It would make sense for any caching layer to be implemented as a plugin system. I noticed that the RedirectHandler is the closest thing to a "plugin" in undici, and in fact caching could be done as a dispatch interceptor.

The interceptor receives the request opts and the DispatchHandler, and simply calls the events on the handler directly to fulfill the request.
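
As a rough sketch of that idea (the lookupSync helper and the cached-entry shape below are hypothetical; a real cache lookup would be asynchronous, which is discussed further down), such an interceptor could either forward to the inner dispatch or synthesize the response by invoking the handler callbacks itself:

import type { Dispatcher } from 'undici'

type Dispatch = Dispatcher['dispatch']

// hypothetical synchronous lookup, just to show the control flow
declare function lookupSync (opts: Dispatcher.DispatchOptions):
  { statusCode: number, rawHeaders: Buffer[], body: Buffer } | undefined

const cacheInterceptor = (dispatch: Dispatch): Dispatch => (opts, handler) => {
  const hit = lookupSync(opts)
  if (!hit) {
    // miss: let the wrapped dispatch perform the real request
    return dispatch(opts, handler)
  }
  // hit: fulfill the request by driving the handler callbacks directly
  handler.onConnect?.(() => {})
  handler.onHeaders?.(hit.statusCode, hit.rawHeaders, () => {}, 'OK')
  handler.onData?.(hit.body)
  handler.onComplete?.([])
  return true
}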

arontsang avatar Apr 04 '22 09:04 arontsang

After #1338 was checked in, I tried to integrate the npm package with a custom dispatcher/handler to implement caching.

Currently, I think there is still a gap that prevents implementing caching gracefully.

I want to hook the cache-checking operation right before the dispatch function (by using DispatchInterceptor[]):

(pseudo code)

function createCacheInterceptor(cache: CacheStorage) {
  return (dispatch) => (opts, handler) => {
    // check cache
    const cachedRequest = cache.get(serializeOptions(opts)) // we cannot await a promise here... but reading from a cache is normally an async op
    if (!cachedRequest || cachedRequest.isExpire()) {
      // no cache or expired cache, we should dispatch request
      return dispatch(opts, new CacheHandler(handler, cache))
    }
    // otherwise, we should mock the request result to the handler
    handler.onHeaders(cachedRequest.getHeader())
    handler.onData(cachedRequest.getData())
    handler.onComplete(cachedRequest.getTrailer())
    return false
  }
}
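
The CacheHandler referenced above isn't shown; a possible shape for it (a sketch only, assuming the callback names from undici's Dispatcher.DispatchHandlers and the async CacheStorage interface defined just below; the key argument is an addition so the handler knows where to store the entry) would forward every event to the wrapped handler while buffering the response so it can be written to the cache on completion:

import type { Dispatcher } from 'undici'

class CacheHandler implements Dispatcher.DispatchHandlers {
  private chunks: Buffer[] = []
  private statusCode = 0
  private rawHeaders: Buffer[] = []

  constructor (
    private handler: Dispatcher.DispatchHandlers,
    private cache: CacheStorage, // the async interface defined below
    private key: string
  ) {}

  onConnect (abort: () => void) { this.handler.onConnect?.(abort) }
  onError (err: Error) { this.handler.onError?.(err) }

  onHeaders (statusCode: number, headers: Buffer[], resume: () => void, statusText: string) {
    this.statusCode = statusCode
    this.rawHeaders = headers
    return this.handler.onHeaders?.(statusCode, headers, resume, statusText) ?? true
  }

  onData (chunk: Buffer) {
    this.chunks.push(chunk) // buffer the body as it streams through
    return this.handler.onData?.(chunk) ?? true
  }

  onComplete (trailers: string[] | null) {
    // writing to the cache can stay fire-and-forget
    this.cache.set(this.key, {
      statusCode: this.statusCode,
      rawHeaders: this.rawHeaders,
      body: Buffer.concat(this.chunks),
      trailers
    }).catch(() => { /* ignore cache write failures in this sketch */ })
    this.handler.onComplete?.(trailers)
  }
}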

In the current design, Dispatcher.dispatch must be synchronous, returning a boolean, but retrieving from a cache is normally an async operation:

interface CacheStorage {
  get(key: string): Promise<any>
  set(key: string, val: any): Promise<void>
}
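
For illustration, a trivial in-memory adapter matching that interface could look like this (with a plain Map the operations aren't truly asynchronous, which is exactly why the synchronous dispatch signature is the sticking point for real backends such as Redis or the filesystem):

class MapCacheStorage implements CacheStorage {
  private map = new Map<string, any>()
  async get (key: string) { return this.map.get(key) }
  async set (key: string, val: any) { this.map.set(key, val) }
}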

I wonder if there is any chance of making dispatch async? Or is there another design that could accommodate this?

ci010 avatar Sep 18 '22 14:09 ci010

The response type of dispatch is correct: no errors or promises should be propagated.

mcollina avatar Sep 18 '22 15:09 mcollina

The response type of dispatch is correct: no errors or promises should be propagated.

Yes, that's what I mean. Currently, we can only have a boolean return type for dispatch.

It's not possible to have async cache read ops in the dispatch function unless we support Promise<boolean> for dispatch.

ci010 avatar Sep 18 '22 15:09 ci010

That isn't true. dispatch is async. It just isn't async over Promise. dispatch uses old school callbacks.

arontsang avatar Sep 18 '22 15:09 arontsang

That isn't true. dispatch is async. It just isn't async over Promise. dispatch uses old school callbacks.

I agree that "dispatch is async with old-school callbacks", but the return value of dispatch (the boolean) is used to determine whether the Pool or Client needs to drain, which means I cannot just return a value without checking the cache first:

function createCacheInterceptor(cache: CacheStorage) {
  return (dispatch) => (opts, handler) => {
    cache.get(serializeOptions(opts)).then((cachedRequest) => {
      if (!cachedRequest || cachedRequest.isExpire()) {
        dispatch(opts, handler) // ignore drain status...
      } else {
        handler.onHeaders(cachedRequest.getHeader())
        handler.onData(cachedRequest.getData())
        handler.onComplete(cachedRequest.getTrailer())
      }
    }, (error) => {
      // handle error
    })

    // return false anyway... this will break the Pool and Agent, which rely on the return value
    return false
  }
}

Suppose instead we wrap a customized Dispatcher; then we can make a best-effort attempt like:

class CustomDispatcher extends Dispatcher {
  private cache: CacheStorage
  private dispatcher: Dispatcher // wrap another dispatcher

  dispatch(opts, handler) {
    this.cache.get(serializeOptions(opts)).then((cachedRequest) => {
      if (!cachedRequest || cachedRequest.isExpire()) {
        this.dispatcher.dispatch(opts, handler) // ignore drain status...
      } else {
        handler.onHeaders(cachedRequest.getHeader())
        handler.onData(cachedRequest.getData())
        handler.onComplete(cachedRequest.getTrailer())
      }
    }, (error) => {
      // handle error
    })

    const needDrain = this.dispatcher[kNeedDrainClient]
    if (needDrain !== undefined) {
      // see Client implementation
      return needDrain < 2
    }

    const needDrainP = this.dispatcher[kNeedDrainPool] // the pool needDrain symbol is different from the Client one (core/symbols.js)
    if (needDrainP !== undefined) {
      // see PoolBase implementation
      return needDrainP
    }

    // return false anyway... this will break the Pool and Agent, which rely on the return value
    return false
  }
}

Still, this solution is not ideal... it relies on the Client and Pool implementation details.

ci010 avatar Sep 18 '22 16:09 ci010

I don't think you'll need to return anything but false in the custom dispatcher:

class CustomDispatcher extends Dispatcher {
  private cache: CacheStorage
  private dispatcher: Dispatcher // wrap another dispatcher

  dispatch(opts, handler) {
    this.cache.get(serializeOptions(opts)).then((cachedRequest) => {
      if (!cachedRequest || cachedRequest.isExpire()) {
        this.dispatcher.dispatch(opts, handler) // ignore drain status...
      } else {
        handler.onHeaders(cachedRequest.getHeader())
        handler.onData(cachedRequest.getData())
        handler.onComplete(cachedRequest.getTrailer())
      }
    }, (error) => {
      // handle error
    })

    
    return false
  }
}

In order for this to work properly, the wrapped dispatcher must be a full Agent (as it handles the drain events correctly inside).
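
For example (a sketch; the CustomDispatcher constructor arguments shown here are hypothetical, and MapCacheStorage refers to the earlier in-memory sketch), the wrapped Agent and the caching dispatcher could be wired into fetch through undici's non-standard dispatcher option:

import { Agent, fetch } from 'undici'

// wrap a full Agent so connection/drain handling stays inside it,
// then route fetch through the caching dispatcher
const agent = new Agent()
const cachingDispatcher = new CustomDispatcher(agent, new MapCacheStorage()) // hypothetical constructor signature

const res = await fetch('https://example.com', { dispatcher: cachingDispatcher })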


FWIW I think this shows a significant issue in the interceptor design cc @ronag @kibertoad.

mcollina avatar Sep 18 '22 21:09 mcollina

@mcollina Should we mark custom dispatchers as experimental in the docs prior to release? Also, I can spend some time on interceptors if you have some ideas on how to improve the situation.

kibertoad avatar Sep 18 '22 22:09 kibertoad

I think we should mark custom interceptors as experimental prior to release. Custom dispatchers have been implemented for a while, and I don't think we can break them in any form.

mcollina avatar Sep 19 '22 05:09 mcollina