Support HDF5 reading/writing (with optional dependencies)
We can reduce friction by figuring out how to load data most efficiently into polars memory.
If there is a backend for this in Rust, I think we could work it out in arrow2. It is quite an important format imo.
That's a good idea too!
HDF5 has a very big specification: https://docs.hdfgroup.org/hdf5/develop/_f_m_t3.html and, as HDF5 is very similar to a filesystem, data can be stored in it in quite a lot of different ways.
Rust bindings to libhdf5 can be found at: https://github.com/aldanor/hdf5-rust
I think we should explore both: Rust-backed under a feature flag, and Python as an optional dependency. I can imagine that it increases binary size quite a bit.
I'm really excited about Polars. But almost all of my data is in large HDF5 files (actually, NetCDF). I can convert to Parquet files. But loading directly from HDF5 into Polars would be ideal 🙂
Are there any plans to add support to Polars to read HDF5 (ideally lazily)?
There is this rust crate for reading: https://github.com/georust/netcdf But as far as I understand, NetCDF contains mostly multidimensional data instead of 1D arrays like the arrow format, so I am not sure how useful it would be in general to even consider support for this.
Good point!
For my work, I use NetCDF for n-dimensional arrays, and 2d arrays.
But, if I'm reading this right, this comment suggests that tensor types might come to Polars at some point. It'd be absolutely amazing to be able to load n-dim NetCDF files directly into an n-dimensional in-memory array in Polars 🙂
(hdf5-rust author here)
Rust bindings to libhdf5 can be found at: https://github.com/aldanor/hdf5-rust
These are not just bindings to libhdf5 (that's the hdf5-sys crate, which is part of it and which the netcdf crate itself depends on); there's quite a bit more, like the whole #[derive(...)] shebang for struct-like types, thread-safe HDF5 handle management, etc.
Re: NetCDF, while it uses HDF5 as a storage medium, it's not the same thing, it's more like an opinionated meta-format on top of that that is very popular in some communities (e.g. geo).
I think we could make HDF5 work with polars, but it would be nice to have something like a spec, or at least a wishlist with some examples -- i.e. what do we want?
loading directly from HDF5 into Polars would be ideal
Problem is, polars/arrow/parquet etc. are column-based, whereas hdf5 is array-based. If you have a polars frame with columns a: i64, b: f64 and you plan on reading/writing that to HDF5, there are a few ways you can do this:
- create a struct type { a: i64, b: f64 } and store it as one structured dataset
- store each column separately, like /path/a, /path/b (I believe that's one of the possible ways to dump it to hdf5 from pandas?)
- (can come up with other options)
There's even more ambiguity when reading existing data: if you have a structured HDF5 dataset with fields "a" and "b", you may want
- to read it as a frame with two columns 'a' and 'b'
- as a series with a structured type
- etc.
One way to go would be to check what pandas does and do the same thing, so you can dump a dataframe from pandas and read it back from polars. Perhaps that's the easiest way to get started.
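To make the two write layouts concrete, here is a minimal h5py sketch (file, group and dataset names are made up for illustration):

import h5py
import numpy as np

a = np.arange(3, dtype='i8')
b = np.linspace(0.0, 1.0, 3)

with h5py.File('layouts.h5', 'w') as f:
    # option 1: a single structured dataset { a: i64, b: f64 }
    rec = np.rec.fromarrays([a, b], names=['a', 'b'])
    f.create_dataset('frame_struct', data=rec)
    # option 2: one dataset per column, /frame_cols/a and /frame_cols/b
    g = f.create_group('frame_cols')
    g.create_dataset('a', data=a)
    g.create_dataset('b', data=b)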
I believe that most of the hdf5 files that we are expected to be able to read are created by pandas. So, yes, I think we should start with supporting what they do.
I agree, copying Pandas' behaviour sounds like a great place to start!
Pandas' to_hdf() uses pytables to store its dataframes: https://www.pytables.org/
PyTables basic file format overview: https://www.pytables.org/usersguide/file_format.html
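For reference, the pandas round trip through that format looks like this (a minimal sketch; it assumes the optional 'tables' package is installed):

import pandas as pd

df = pd.DataFrame({'a': [1, 2, 3], 'b': [0.1, 0.2, 0.3]})
# format='table' writes the queryable pytables Table layout
df.to_hdf('pandas_demo.h5', key='frame', format='table')
print(pd.read_hdf('pandas_demo.h5', key='frame'))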
I have an application that uses h5py to read h5 data into pandas and I've started to convert this app from Pandas to Polars - the data is relatively big and I'm looking for more performance.
I use h5py to read the h5 datasets into numpy structured arrays (heterogeneous types) and the numpy structured arrays transfer very easily into pandas dataframes.
But getting that same data into a Polars dataframe is proving to be a problem - basically the structure of the numpy structured array is lost and I end up with a single column dataframe with an object dtype.
I suspect there are many users who get their h5 data via h5py and for these users, just supporting fast/easy construction of polars dataframes from numpy structured arrays would be ideal
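For concreteness, the flow I have today looks roughly like this (file and dataset names are made up):

import h5py
import pandas as pd

with h5py.File('data.h5', 'r') as f:
    arr = f['my_table'][...]  # compound dtype -> numpy structured array

print(arr.dtype)         # e.g. [('a', '<i8'), ('b', '<f8')]
pdf = pd.DataFrame(arr)  # pandas keeps one column per field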
Copying Pandas behavior for creating dataframes from numpy structured arrays would be great!!!
I use h5py to read the h5 datasets into numpy structured arrays (heterogeneous types) and the numpy structured arrays transfer very easily into pandas dataframes.
It seems the np structured array is only a helper here, which the pandas dataframe constructor knows how to handle. So the 'clean' way would be to access the hdf5 directly. But I must admit I have no idea if that is the easier option.
It may be worth considering leveraging Vaex for this. It can read/write hdf5 files natively, and they map directly to/from numpy arrays:
import vaex
import numpy as np

x = np.arange(1000)
y = np.random.rand(1000)
z = np.array(['dog', 'cat'] * 500)

# build a vaex dataframe directly from numpy arrays
df_numpy = vaex.from_arrays(x=x, y=y, z=z)
print(df_numpy)  # or display(df_numpy) in a notebook

# round-trip through hdf5
df_numpy.export("file.hdf5")
df_numpy_hdf5 = vaex.open("file.hdf5")

# columns map straight back to numpy arrays
x = df_numpy_hdf5.x.to_numpy()
print(x.dtype, x)
just supporting fast/easy construction of polars dataframes from numpy structured arrays would be ideal
Done... structured array support (both initialising frames and exporting from them) will be available in the upcoming 0.17.12 release.
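A quick sketch of what that enables (assuming polars >= 0.17.12; the dtype is illustrative):

import numpy as np
import polars as pl

arr = np.array([(1, 2.5), (2, 3.5)], dtype=[('a', '<i8'), ('b', '<f8')])
df = pl.DataFrame(arr)              # one column per field
out = df.to_numpy(structured=True)  # and back to a structured array
print(df)
print(out.dtype)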
Folks,
Tall towers like
dask, zarr ...
xarray
netCDF4
hdf5
have many users in geographic info -- but towers get shaky as you add more and more stories (medieval cathedrals got higher and higher, a few collapsed). See "numpy indexes small amounts of data 1000x faster than xarray" (2019) ("Not sure what's going on under the hood") and "Moving away from HDF5" (2016) -- excellent.
Fwiw, my use case: 100 or so 2 GB hdf5 files (24*30, 824, 848)
https://opendata.dwd.de/climate_environment/REA/COSMO_REA6/converted/hourly/2D/WS_100m.2D.201801.nc4
I want slices like wind_data[:, :100, :100].moveaxis(0, -1) in numpy. [:, :n, :n] takes 9 sec with xarray on my old iMac with 16 GB, even for n=1.
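For comparison, reading the same hyperslab directly with h5py touches only the chunks the slice needs (the dataset name below is hypothetical; .nc4 files are HDF5 containers underneath):

import h5py
import numpy as np

with h5py.File('WS_100m.2D.201801.nc4', 'r') as f:
    block = f['WS_100m'][:, :100, :100]  # only the touched chunks are read and decompressed
arr = np.moveaxis(block, 0, -1)
print(arr.shape)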
On the topic of the performance of Zarr (and xarray), we've started a concerted effort to significantly speed up Zarr, possibly by re-writing a bunch of it from scratch in Rust (partly inspired by Polars!): https://github.com/zarr-developers/zarr-python/discussions/1479
I agree 100% with your general point: A lot of "scientific Python" code is built on 30 year old foundations. And those foundations are starting to look a little shaky! (Because high-performance code today looks quite different to high-performance code from 30 years ago). So I do agree that there's an argument for thinking hard about re-building some of these things from the ground-up.
@JackKelly Seems that hdf5 files can be horribly complicated (which I guess everybody knows, but I didn't -- after too-long surfing, h5stat of my .nc4 is 100 lines, wow: 2 GB flat to 1.5 GB compressed tree => s l o w reads).
From the top floor of this tower you won't even SEE the ancient crud way down below, let alone be able to point a user to it. Is there a testbench which can track time and space through n layers of python, cython, dylibs ... in this simple case?
I'm really excited about Polars. But almost all of my data is in large HDF5 files (actually, NetCDF). I can convert to Parquet files. But loading directly from HDF5 into Polars would be ideal 🙂
Are there any plans to add support to Polars to read HDF5 (ideally lazily)?
Interesting. Are you working with weather data? I knocked together a quick project which uses ECMWF's eccodes library and exposes the data as arrow: https://github.com/hugopendlebury/gribtoarrow
Would be interested in doing something similar with HDF5 -- do you have any sample files?
Hi Hugo,
fwiw, a trivial test case is Randomcube(gb=), see the end of "Specify chunks in bytes"; also "numpy indexes small amounts of data 1000x faster than xarray".
On weather data, https://opendata.dwd.de/climate_environment/REA/COSMO_REA6/converted/hourly/2D/* has 2 GB files (compressed to 1.5 GB, so decompressing is slow and compressing very slow).
A big unknown in runtimes is cache performance, SSD, L1, L2 ...; I imagine they affect runtimes as much as CPU GHz, but I don't have even a 0-order model (sum of coefs * µs). Apple and Intel have surely put lots of $$$ and manpower into models of caches and filesystems -- they might have open test benches worth looking at, dunno. (Is there a form of Amdahl's law for CPU + multilevel caches?)
cheers -- denis
Pandas' to_hdf() uses pytables to store its dataframes: https://www.pytables.org/
PyTables basic file format overview: https://www.pytables.org/usersguide/file_format.html
In one of my projects, we use a python stack with pandas doing all the DataFrame stuff. We're currently depending on an SQL database but want to migrate to pytables. It would be amazing if polars offered an easy way to load/store data from/to pytables HDF5 files!
Looks like pytables may be the way to go to support this on the Python side. That would probably be a good first step. We can look into Rust support later.
Yes, we can start with pytables. For Rust support I first want to extend the plugin system with readers.
Hi, can I have a go at implementing this with pytables?
@galbwe Definitely! You can take inspiration from read/scan/write_delta, which should be comparable in the sense that it is I/O enabled by a third-party dependency on the Python side.
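In the meantime, a thin wrapper that delegates to pandas (and therefore pytables) could look like this -- read_hdf below is a hypothetical helper, not an existing polars API:

import pandas as pd
import polars as pl

def read_hdf(path: str, key: str) -> pl.DataFrame:
    # let pandas/pytables handle the container format (needs the 'tables' package),
    # then convert to polars
    return pl.from_pandas(pd.read_hdf(path, key=key))

df = read_hdf('data.h5', key='my_table')  # hypothetical file and key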
Python dev and newbie Rust dev here; I would like to try to implement that HDF5 crate and create a data-loading function: https://github.com/aldanor/hdf5-rust
I'm going to work on adding this client crate into polars in my fork; when I'm ready, I'll submit a PR.
@timothyhutz, I'd welcome a 1-page spec and a little testbench first. Is that realistic -- what do you think? (Pandas is deeply 2D; geopandas' size and complexity is pandas * 2. Afaik it's moving to PyArrow; see also nanoarrow -- "This entire extension is currently experimental and awaiting use-cases that will drive future development.")
Added: do you have a short list of max 5 use cases for people to vote on: must-have, would-be-nice, too-complex?
When could one expect the hdf5 implementation to be done?