Improve fetching for large datasets

loichuder opened this issue 4 years ago · 14 comments

Having played around with large datasets (~10M points, though a slice is only ~100k), I find that the long data fetch breaks the flow...

Let's make use of this issue to gather possible improvements:

  • [x] Allow the user to cancel and retry requests (can be done with axios; see the sketch after this list) #635 #640 #643 #647 #652 #657 #658 #659
  • [x] Use binary instead of JSON (benefits: 1. smaller payload, 2. no need to stringify/parse JSON on the server/client, 3. no need to flatten the array on the client) #817
  • [ ] Implement specific strategies for fetching large datasets (e.g. use subsampling; warn the user that the fetching can take a long time and ask them if they want to proceed anyway...)
  • [ ] Request the domain separately (not strictly about data fetching, but avoiding this computation in the front-end could be a nice improvement for large datasets)
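
For illustration, here is a minimal cancellation sketch in TypeScript, assuming a recent axios version (≥ 0.22, which accepts an `AbortController` signal); the URL and helper names are hypothetical, not h5web's actual code:

```ts
import axios from "axios";

// Minimal sketch (names and URL are hypothetical): fetch a dataset value
// with support for user-initiated cancellation via AbortController.
function fetchValue(url: string) {
  const controller = new AbortController();
  const promise = axios
    .get(url, { signal: controller.signal })
    .then((response) => response.data);

  return {
    promise, // rejects with a cancellation error if `cancel` is called
    cancel: () => controller.abort(),
  };
}

// Retrying after a cancellation then amounts to evicting the cancellation
// error from the value cache (as in #640) and issuing a fresh request.
const request = fetchValue("https://example.com/datasets/foo/value");
request.cancel();
```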

loichuder avatar Apr 20 '21 15:04 loichuder

I'd add that, when performing long downloads and computations, the UI should:

  • be more informative (e.g. progress status, subsampling rate, etc.)
  • remain responsive and allow cancelling slow computations, not just requests (one possible approach is sketched below)
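
One way to get both properties is to run heavy computations (like the domain calculation) in a Web Worker, which keeps the main thread free and can be terminated to cancel. A sketch, where the worker file name is hypothetical:

```ts
// Sketch: run the domain computation in a Web Worker so the UI stays
// responsive, and terminate the worker to cancel the computation.
// "./domain.worker.ts" is hypothetical; it would post back [min, max].
function computeDomainInWorker(buffer: ArrayBuffer) {
  const worker = new Worker(new URL("./domain.worker.ts", import.meta.url));

  const promise = new Promise<[number, number]>((resolve, reject) => {
    worker.onmessage = (evt) => resolve(evt.data as [number, number]);
    worker.onerror = reject;
    // Transfer the buffer instead of copying it (the worker reads it as a
    // typed array and posts back the computed domain)
    worker.postMessage(buffer, [buffer]);
  });

  return { promise, cancel: () => worker.terminate() };
}
```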

axelboc avatar Apr 26 '21 12:04 axelboc

Our discussion on https://github.com/silx-kit/h5web/pull/632 also gave me an idea: we could make the flattening operation more consistent by encapsulating it in the getValue/useValue methods.

Edit: This was done in #661

It then becomes relevant to this issue, as it would be a stepping stone to requesting the flattening from the back-end, thus avoiding another expensive computation in h5web.
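
A rough sketch of what this encapsulation could look like (assumed shapes, not the actual provider code):

```ts
// The provider's getValue flattens nested JSON values itself, so that
// visualizations always receive a flat array regardless of dataset rank.
type NestedNumbers = number | NestedNumbers[];

function flatten(value: NestedNumbers): number[] {
  return Array.isArray(value) ? value.flatMap(flatten) : [value];
}

async function getValue(url: string): Promise<number[]> {
  const response = await fetch(url);
  const nested = (await response.json()) as NestedNumbers;
  return flatten(nested); // consumers no longer flatten themselves
}
```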

loichuder avatar Apr 28 '21 07:04 loichuder

In the providers' getValue() methods? Yeah, totally 👍

axelboc avatar Apr 28 '21 08:04 axelboc

#635 implements cancellation on the front-end, but it doesn't resolve crashes on Bosquet when attempting to fetch (and cancel the fetch of) extremely large datasets.

axelboc avatar Apr 28 '21 12:04 axelboc

#640 implements retrying after cancelling (including evicting cancellation errors from the value store's cache).

axelboc avatar Apr 29 '21 10:04 axelboc

Just curious: for HSDS, have you tried using HTTP compression? That should reduce the payload size considerably.
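
For what it's worth, browsers advertise `Accept-Encoding: gzip, deflate, br` automatically, so a quick way to check whether the server compresses its responses is to inspect the response header (URL hypothetical):

```ts
// Note: for cross-origin requests, Content-Encoding is only readable if the
// server lists it in Access-Control-Expose-Headers.
const response = await fetch("https://hsds.example.com/datasets/d-123/value");
console.log(response.headers.get("Content-Encoding")); // e.g. "gzip" if enabled
```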

jreadey avatar May 30 '21 20:05 jreadey

Unfortunately, the impact will be limited as most of our heavy datasets are not compatible with HSDS due to https://github.com/HDFGroup/hsds/issues/76 :confused:

But this is something we still need to try!

loichuder avatar May 31 '21 09:05 loichuder

Binary is now used with H5Grove when getting dataset values: #817
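
For reference, the binary path boils down to mapping the response bytes straight onto a typed array. A sketch, where the endpoint shape and dtype are assumptions rather than h5grove's exact API:

```ts
// No JSON parsing and no flattening: the bytes are the flat array.
async function getBinaryValue(url: string): Promise<Float32Array> {
  const response = await fetch(url, {
    headers: { Accept: "application/octet-stream" },
  });
  const buffer = await response.arrayBuffer();
  return new Float32Array(buffer); // assumes float32 values in C order
}
```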

axelboc avatar Oct 26 '21 06:10 axelboc

The auto-scale-off feature in the LineVis that forces us to fetch the whole dataset can be a real limiter for huge datasets (https://github.com/silx-kit/jupyterlab-h5web/issues/71).

Maybe it is time to review it? We could:

  • Disable it somehow for huge datasets and give an indication to the user?
  • Request the domain separately (as proposed originally in https://github.com/silx-kit/h5web/issues/616#issue-862920494) to avoid computing it in the front-end, and therefore the need for the full dataset values (a hypothetical sketch follows this list)
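
The sketch mentioned in the second point; the `/stats` endpoint and its payload shape are assumptions, not an existing back-end API:

```ts
// If the back-end exposed dataset statistics, the front-end could get the
// domain without downloading (or iterating over) the values.
async function getDomain(datasetUrl: string): Promise<[number, number]> {
  const response = await fetch(`${datasetUrl}/stats`);
  const { min, max } = (await response.json()) as { min: number; max: number };
  return [min, max];
}
```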

loichuder avatar Nov 09 '21 15:11 loichuder

> The auto-scale-off feature in the LineVis that forces us to fetch the whole dataset can be a real limiter for huge datasets (silx-kit/jupyterlab-h5web#71).

#877 implemented an intermediate solution:

  • When the auto-scale is on, only the relevant slice is fetched
  • Auto-scale is no longer persisted and is activated by default. That means that, by default, only slices are fetched.
  • Turning the auto-scale off fetches the full dataset. For now, it is up to the user not to trigger this for huge datasets. (See the sketch below.)
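
The behaviour above boils down to something like this sketch (names and endpoint shape are illustrative, not the actual implementation):

```ts
// With auto-scale on, fetch only the selected slice; with it off, fetch the
// full dataset so the domain stays fixed across slices.
async function fetchForLineVis(
  datasetUrl: string,
  selection: string,
  autoScale: boolean,
): Promise<ArrayBuffer> {
  const url = autoScale
    ? `${datasetUrl}/value?selection=${selection}` // slice only
    : `${datasetUrl}/value`; // full dataset: beware with huge datasets
  const response = await fetch(url);
  return response.arrayBuffer();
}
```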

loichuder avatar Nov 30 '21 15:11 loichuder

It seems that h5wasm now (as of v0.4.8) supports lazy loading of arrays. Is that beneficial for this issue as well (or in general for loading files >2GB)? I'm not really familiar with the internal workings though, so excuse me if this has nothing to do with it :)

For reference, see this discussion: https://github.com/usnistgov/h5wasm/issues/40
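
For context, a sketch of what that lazy access looks like with h5wasm (assuming a Node context where the file is on disk; in the browser, the file must first be written into Emscripten's virtual filesystem):

```ts
import h5wasm from "h5wasm";

await h5wasm.ready;

// Opening the file does not read any values
const file = new h5wasm.File("large.h5", "r");
const entity = file.get("data");

// `slice` reads only the requested region from the file
if (entity instanceof h5wasm.Dataset) {
  const firstPoints = entity.slice([[0, 100_000]]);
  console.log(firstPoints.length); // 100000
}

file.close();
```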

headtr1ck avatar Jan 16 '23 10:01 headtr1ck

Sure, that's also relevant for large datasets.

For h5wasm, we have a more specific issue tracking this at https://github.com/silx-kit/h5web/issues/1264

loichuder avatar Jan 16 '23 15:01 loichuder

Is it also planned to have streaming binary support for hsds? I could also try to implement it myself in the hsds api but I'm not a typescript expert, so I could use some guidance.

I ran into problems with this while experimenting with storing and loading large datasets via HSDS: I use h5web in a simple HSDS directory browser to view the stored data, but the HSDS server gets stuck on large datasets because h5web requests the data in JSON format.
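
For anyone picking this up, a sketch of what a binary value request could look like, assuming HSDS honours `Accept: application/octet-stream` on the `/value` endpoint as in the HDF REST API (endpoint, ids and dtype are illustrative):

```ts
async function getHsdsValueBinary(
  endpoint: string,
  datasetId: string,
  domain: string,
): Promise<Float64Array> {
  const url = `${endpoint}/datasets/${datasetId}/value?domain=${encodeURIComponent(domain)}`;
  const response = await fetch(url, {
    headers: { Accept: "application/octet-stream" },
  });
  return new Float64Array(await response.arrayBuffer()); // assumes float64
}
```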

domna avatar Mar 07 '23 12:03 domna

> Is it also planned to have streaming binary support for hsds? I could also try to implement it myself in the hsds api but I'm not a typescript expert, so I could use some guidance.

To be honest, we don't really plan to improve the HSDS part, since we mostly use h5grove and h5wasm. But you are welcome to contribute, and we will be happy to help you do so.

If you have some working code, feel free to open a draft PR to discuss. If something blocks you, you can drop us a line at [email protected].

loichuder avatar Mar 08 '23 10:03 loichuder