
support for units

Open p3trus opened this issue 8 years ago • 62 comments

One thing that bothers me a lot is that pandas lacks good unit support (e.g., quantities, pint, ...).

Is there a chance that xray will support it?

p3trus avatar Aug 11 '15 11:08 p3trus

Eventually, I hope so!

Unfortunately, doing this in a feasible and maintainable way will probably require upstream fixes in NumPy. In particular, better support for duck-array types (https://github.com/numpy/numpy/issues/4164) and/or the ability to write units as a custom NumPy dtypes. Both of these are on the NumPy roadmap, though they don't have a timeframe for when that will happen.

shoyer avatar Aug 12 '15 02:08 shoyer

Astropy has pretty good units support: http://astropy.readthedocs.org/en/latest/units/ Would it be possible to copy what they do?

rabernat avatar Aug 17 '15 15:08 rabernat

Unfortunately, the astropy approach uses a numpy.ndarray subclass, which means it's mutually exclusive with dask.array. Otherwise, it does look very nice, though.

shoyer avatar Aug 17 '15 16:08 shoyer

@shoyer - as one who thinks unit support is probably the single best thing astropy has (and is a co-maintainer of astropy.units), I thought I'd pipe in: why would it be a problem that astropy's Quantity is an ndarray subclass? I must admit not having used dask arrays, but since they use ndarray internally for the pieces, shouldn't the fact that Quantity has the same interface/methods make it relatively easy to swap ndarray for Quantity internally? I'd be quite happy to help think about this (surely it cannot be as bad as it is for MaskedArray ;-).

Alternatively, maybe it is easier to tag on the outside rather than the inside. This would also not seem to be that hard, given that astropy's Quantity is really just a wrapper around ndarray that carries a Unit instance. I think the parts that truly wrap might be separated from those that override ndarray methods, and would be willing to implement that if there is a good reason (like making dask quantities possible...). It may be that in this case one would not use Quantity proper, but rather just the parts of units where the real magic happens: in the Unit class (which does the unit conversion) and in quantity_helpers.py (which tells what unit conversion is necessary for a given operation/function).

mhvk avatar Sep 19 '15 18:09 mhvk

@mhvk It would certainly be possible to extend dask.array to handle units, in either of the ways you suggest.

Although you could allow Quantity objects inside dask.arrays, I don't like that approach, because static checks like units really should be done only once when arrays are constructed (akin to dtype checks) rather than at evaluation time, and for every chunk. This suggests that tagging on the outside is the better approach.

So far, so good -- but with the current state of duck array typing in NumPy, it's really hard to be happy with this. Until __numpy_ufunc__ lands, we can't override operations like np.sqrt in a way that is remotely feasible for dask.arrays (we can't afford to load big arrays into memory). Likewise, we need overrides for standard numpy array utility functions like concatenate. But the worst part is that the lack of standard interfaces means that we lose the possibility of composing different array backends with your Quantity type -- it will only be able to wrap dask or numpy arrays, not sparse matrices or bolt arrays or some other type yet to be invented.

Once we have all that duck-array stuff, then yes, you certainly could write a duck-array Quantity type that can wrap generic duck arrays. But something like Quantity really only needs to override compute operations so that they can propagate dtypes -- there shouldn't be a need to override methods like concatenate. If you had an actual (parametric) dtype for units (e.g., Quantity[float64, 'meters']), then you would get all those dtype-agnostic methods for free, which would make your life as an implementer much easier. Hence why I think custom dtypes would really be the ideal solution.
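
As a toy sketch of the kind of override being discussed (written against __array_ufunc__, the protocol that __numpy_ufunc__ eventually became; the wrapper class below is purely illustrative, not any library's actual code):

import numpy as np

class UnitWrapper:
    # Toy wrapper carrying a unit string alongside any array-like payload
    # (a numpy array, a dask array, ...).
    def __init__(self, data, unit):
        self.data = data
        self.unit = unit

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        if ufunc is np.sqrt and method == "__call__":
            # Only the unit bookkeeping happens here; the sqrt itself is forwarded
            # to the wrapped array, so nothing extra is loaded into memory.
            return UnitWrapper(np.sqrt(inputs[0].data), f"{self.unit} ** 0.5")
        return NotImplemented

np.sqrt(UnitWrapper(np.array([1., 4., 9.]), "m"))
# -> UnitWrapper with data [1. 2. 3.] and unit "m ** 0.5"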

shoyer avatar Sep 21 '15 01:09 shoyer

@shoyer - fair enough, and sad we don't have __numpy_ufunc__ yet... I agree that with Quantity inside, one would end up duplicating work for every chunk, which makes it less than ideal even though it would probably be the easier approach to implement.

For the outside method, from the dask perspective, it would indeed be easiest if units were done as a dtype, since then you can punt all the decisions to helper routines. My guess, though, is that it will be a while before numpy will include what is required to tell, e.g., that if I add something in m to something in cm, the second argument has to be multiplied by 0.01. But astropy does provide something just like that: quantity_helpers exposes a dict keyed by operation, which holds functions that return the required converters given the units. E.g., in the above example, internally what happens is

import numpy as np
from astropy import units as u
from astropy.units.quantity_helper import UFUNC_HELPERS  # module named in the output below

units = (u.m, u.cm)  # units of the two operands in the m + cm example
converters, result_unit = UFUNC_HELPERS[np.add](np.add, *units)
result_unit
# Unit("m")
converters[0]
# None
converters[1]
# <function astropy.units.quantity_helper.get_converter.<locals>.<lambda>>
converters[1](1.)
# 0.01

In dask, you could run the converters on your individual chunks, though obviously I don't know how easy it is to add an extra step like this without slowing down other aspects too much.
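
A minimal sketch of running such converters chunk-by-chunk with dask (the 0.01 factor stands in for the cm-to-m converter above; this is not dask's or astropy's actual integration):

import dask.array as da

def convert_chunk(chunk, factor=0.01):
    # The unit decision (cm -> m, factor 0.01) was made once, up front;
    # only this cheap numeric scaling runs per chunk at compute time.
    return chunk * factor

x_cm = da.ones((10,), chunks=(5,))
x_m = da.map_blocks(convert_chunk, x_cm)  # still lazy
x_m.compute()
# array([0.01, 0.01, ..., 0.01])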

mhvk avatar Sep 21 '15 14:09 mhvk

p.s. For concatenate, you need unit conversion as well, so sadly Quantity does need to override that too (and currently cannot, which is rather annoying).

mhvk avatar Sep 21 '15 14:09 mhvk

I am one of the authors of Pint and I was just pointed here by @arsenovic

Pint does not subclass ndarray; it rather wraps any numerical type, dispatching to the wrapped value any attribute access that it does not understand. By defining __array_prepare__ and __array_wrap__, most numpy functions and array attributes work as expected without monkey patching or having a specialized math module. For example:

>>> import numpy as np
>>> import pint
>>> ureg = pint.UnitRegistry()
>>> [1., 4., 9.] * ureg.meter # a list is interpreted as an ndarray
<Quantity([1. 4. 9.], 'meter')>
>>> np.sqrt(_)
<Quantity([ 1.  2.  3.], 'meter ** 0.5')>
>>> _.sum()
<Quantity(6.0, 'meter ** 0.5')>

I think something similar can be done for xarray.

hgrecco avatar Feb 08 '16 23:02 hgrecco

@hgrecco - for astropy's Quantity, we currently also rely on __array_prepare__ and __array_wrap__. The main annoyances are (1) one cannot change the input before a numpy ufunc is called, and therefore often has no choice but to let a wrong calculation proceed; (2) proper recognition in non-ufunc functions is sparse (e.g., np.dot, etc.; see http://docs.astropy.org/en/latest/known_issues.html#quantity-issues)

Aside: at some point I'd hope to get the various implementations of units to talk to each other: it would be good to have an API that makes the different units packages interoperable.

mhvk avatar Feb 09 '16 15:02 mhvk

@hgrecco Are you suggesting that pint could wrap xarray objects, or that xarray could wrap pint? Either is certainly possible, though I'm a bit pessimistic that we can come up with a complete solution without the numpy fixes we've been discussing.

Also, just to note, xarray contains a Dataset type for representing collections of variables that often have different units (e.g., temperature and pressure). That suggests to me that it could make more sense to put pint and/or astropy.Quantity objects inside xarray arrays rather than the other way around.
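
For illustration, a rough sketch of the "pint inside xarray" direction (this assumes the duck-array support discussed in this thread, so it is aspirational rather than something that works today):

import numpy as np
import pint
import xarray as xr

ureg = pint.UnitRegistry()

# Each variable carries its own unit on the wrapped array, not in .attrs.
ds = xr.Dataset({
    "temperature": ("x", ureg.Quantity(np.array([280., 285.]), "kelvin")),
    "pressure": ("x", ureg.Quantity(np.array([1000., 990.]), "hPa")),
})
ds["temperature"].data.units
# <Unit('kelvin')>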

shoyer avatar Feb 09 '16 16:02 shoyer

I'd just like to chime in and say that this feature would really be sweet. I always find myself doing a lot of work to handle/convert different units. It seems that adding units to labeled axes does a lot to describe a set of data.

arsenovic avatar Feb 09 '16 16:02 arsenovic

@shoyer When we prototyped Pint we tried putting Quantity objects inside a numpy array. It was working fine, but the performance and memory hit was too large. We were convinced that our current design was right when we wrote the first code using it. The case might be different with xarray. It would be nice to see some code using xarray and units (as if this was an already implemented feature).

@mhvk I do agree with your views. We also mention these limitations in the Pint documentation. Wrapping (instead of subclassing) adds another issue: some NumPy functions do not recognize a Quantity object as an array. Therefore any function that calls numpy.asanyarray will erase the information that this is a quantity (see my issue here: numpy/numpy#4072).

In any case, as was mentioned before in the thread, custom dtypes and duck typing will be great for this.

In spite of these limitations, we chose wrapping because we want to support quantities even if NumPy is not installed. It has worked really well for us, covering most of the common cases even for numpy arrays.

Regarding interoperating, that would be great. It would be even better if we could move to one, blessed, solution under the pydata umbrella (or similar).

hgrecco avatar Feb 10 '16 02:02 hgrecco

Not to be pedantic, but just one more :+1: on ultimately implementing units support within xarray -- that would be huge.

spencerahill avatar Feb 10 '16 04:02 spencerahill

If anyone is excited about working at BIDS on the NumPy improvements we need to make this more feasible (namely, custom dtypes and duck typing), you should talk to @njsmith.

shoyer avatar Feb 11 '16 07:02 shoyer

I agree that custom dtypes is the right solution (and I'll go dig some more there). In the meantime, I'm not sure why you couldn't wrap an xarray DataArray in one of pint's Quantity instances. With the exception of also wanting units on coordinates, this seems like a straightforward way to get at least some unit functionality.

dopplershift avatar Jun 29 '16 17:06 dopplershift

#988 describes a possible approach for allowing third-party libraries to add units to xarray. It's not as comprehensive as a custom dtype, but might be enough to be useful.

shoyer avatar Aug 27 '16 19:08 shoyer

+1 for units support. I agree, parametrised dtypes would be the preferred solution, but I don't want to wait that long (I would be willing to contribute to that end, but I'm afraid that would exceed my knowledge of numpy).

I have never used dask. I understand that support for dask arrays is a central feature of xarray. However, the way I see it, if one were to put a (unit-aware) ndarray subclass into an xarray object, then units should work out of the box. As you discussed, this seems not so easy to make work together with dask (particularly in a generic way). However, shouldn't that be an issue that the dask community has to solve anyway (i.e., currently there is no way to use any units package together with dask, right)? In that sense, allowing such arrays inside xarray would force users to choose between dask and units, which is something they have to do anyway. But for a large fraction of users, that would be a very quick path to units!

Or am I missing something here? I'll just try to monkeypatch xarray to that end, and see how far I get...

burnpanck avatar Sep 19 '16 17:09 burnpanck

@burnpanck Take a look at the approach described in #988 and let me know if you think that sounds viable.

NumPy subclasses inside xarray objects would probably mostly work, if we changed some internal uses of np.asarray to np.asanyarray. But it's also a pretty big rabbit hole. I'm still not sure there are any good ways to do operations like concatenate.
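
The difference in question, shown with astropy's ndarray subclass (illustrative snippet, not xarray internals):

import numpy as np
import astropy.units as u

q = [1., 2.] * u.m      # Quantity, an ndarray subclass
type(np.asarray(q))     # numpy.ndarray -- subclass (and unit) stripped
type(np.asanyarray(q))  # astropy.units.quantity.Quantity -- subclass preserved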

shoyer avatar Sep 19 '16 18:09 shoyer

#988 would certainly allow me to implement unit functionality on xarray, probably by leveraging an existing units package. What I don't like about that approach is that I essentially end up with a separate, distinct implementation of units. I am afraid that I will have to re-implement many of the helpers that I wrote to work with physical quantities to be xarray aware.

Furthermore, one important aspect of units packages is that they prevent you from making conversion mistakes. But that only works as long as you don't forget to carry the units with you. Having units just as attributes on xarray makes losing them as simple as forgetting to read the attributes when accessing the data. The units-inside-xarray approach would have the advantage that whenever you end up accessing the data inside xarray, you automatically have the units with you. From a conceptual point of view, the units are really an integral part of the data, so they should sit right there with the data. Whenever you do something with the data, you have to deal with the units. That is true no matter whether it is implemented as an attribute handler or directly on the data array.

My fear is that attributes leave the impression of "optional" metadata which is too easily lost. E.g. xarray doesn't call its ufunc_hook for some operation where it should, and you silently lose units. My hope is that with nested arrays that carry units, you would instead fail verbosely. Of course, np.concatenate is precisely one of these cases where unit packages struggle to get their hook in (and where units on dtypes would help). So they fight the same problem. Nonetheless, these problems are known and solved as well as possible in the units packages, but in xarray one would have to deal with them all over again.

burnpanck avatar Sep 20 '16 09:09 burnpanck

Or another way to put it: While typical metadata/attributes are only relevant if you eventually read them (which is where you will notice if they were lost on the way), units are different: They work silently behind the scene at all times, even if you do not explicitly look for them. You want an addition to fail if units don't match, without having to explicitly first test if the operands have units. So what should the ufunc_hook do if it finds two Variables that don't seem to carry units, raise an exception? Most probably not, as that would prevent to use xarray at the same time without units. So if the units are lost on the way, you might never notice, but end up with wrong data. To me, that is just not unlikely enough to happen given the damage it can do (e.g. the time it takes to find out what's going on once you realise you get wrong data).
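
For comparison, this is the behaviour existing units packages give when the units stay attached to the data, e.g. with pint:

import pint
ureg = pint.UnitRegistry()

(1 * ureg.m) + (100 * ureg.cm)  # <Quantity(2.0, 'meter')> -- converted automatically
(1 * ureg.m) + (1 * ureg.s)     # raises pint.DimensionalityError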

burnpanck avatar Sep 20 '16 09:09 burnpanck

So for now, I'm hunting for np.asarray.

burnpanck avatar Sep 20 '16 09:09 burnpanck

@burnpanck - thanks for the very well-posed description of why units are so useful not as some meta-data, but as an integral property. Of course, this is also why making them part of a new dtype is a great idea! But failing that, I'd agree that it has to be part of something like an ndarray subclass; this is indeed what we do in astropy.units.Quantity (and concatenate does not work for us either...).

Now, off-topic but still: what is a little less wonderful is that there seem to be quite a few independent units implementations around (even just in astronomy, there is that of amuse; ours is based on things initially developed by pynbody). It may well be hard to merge them at this stage, but it would be good to think about how we could at least interoperate...

mhvk avatar Sep 20 '16 23:09 mhvk

In order to maintain a list of currently relevant issues, we mark issues as stale after a period of inactivity. If this issue remains relevant, please comment here; otherwise it will be marked as closed automatically.

stale[bot] avatar Jan 26 '19 13:01 stale[bot]

This is still relevant. Hopefully the advent of __array_function__ in NumPy will make this easier/possible.
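
A toy sketch of why __array_function__ (NEP 18, NumPy >= 1.17) changes the picture: a wrapper that is not an ndarray subclass can now intercept functions like np.concatenate, which the __array_wrap__-based approaches above could not (the class below is purely illustrative):

import numpy as np

class UnitArray:
    # Toy duck array carrying a unit string (illustration only).
    def __init__(self, values, unit):
        self.values = np.asarray(values)
        self.unit = unit

    def __array_function__(self, func, types, args, kwargs):
        if func is np.concatenate:
            arrays = [a.values for a in args[0]]
            # A real implementation would convert mismatched units here.
            return UnitArray(np.concatenate(arrays, **kwargs), args[0][0].unit)
        return NotImplemented

np.concatenate([UnitArray([1, 2], "m"), UnitArray([3], "m")]).unit
# 'm'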

shoyer avatar Jan 26 '19 22:01 shoyer

@rabernat's recent post inspired me to check out this issue. What would this issue entail now that __array_function__ is in numpy? Is there some reason this is more complicated than adding an appropriate __array_function__ to pint's Quantity class?

nbren12 avatar Apr 12 '19 16:04 nbren12

Three things will need to change internally in xarray:

  1. .data is currently required to return a NumPy or dask array. This will need to be relaxed to include "any duck array type". (For now, we can store an explicit list of these types.)
  2. We need to rewrite Xarray's internal array operations, found in xarray/core/duck_array_ops.py, to use NumPy's API when __array_function__ is enabled instead of our ad-hoc checks. Eventually (once our minimum required numpy version is 1.17), we should be able to delete most of duck_array_ops entirely!
  3. We should figure out what a minimal "units layer" would look like in xarray, exposing a few attributes or methods that call out to underlying unit implementations; e.g., DataArray.units should be redirected to pull out DataArray.data.units (a rough sketch of this idea follows below).
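
A sketch of what such a minimal units layer could look like on top of xarray's accessor registration (the accessor name and behaviour here are hypothetical, not a proposed API):

import xarray as xr

@xr.register_dataarray_accessor("q")  # hypothetical accessor name
class QuantityAccessor:
    def __init__(self, da):
        self._da = da

    @property
    def units(self):
        # Defer to whatever duck array (pint, astropy, ...) sits in .data.
        return getattr(self._da.data, "units", None)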

shoyer avatar Apr 12 '19 16:04 shoyer

Probably worth pinging @dopplershift again. He has wrestled with this a lot.

rabernat avatar Apr 12 '19 16:04 rabernat

2. once our minimum required numpy version is 1.17

@shoyer - what would be an approximate time frame for this?

rabernat avatar Apr 12 '19 16:04 rabernat

(I just added a third bullet to my list above)

  2. once our minimum required numpy version is 1.17

@shoyer - what would be an approximate time frame for this?

First, we'll need to wait for NumPy 1.17 to be released :). But more seriously, if we do a breaking release of xarray we can probably get away with bumping the required NumPy version significantly.

It's definitely a smoother experience for users if we allow at least slightly older versions of NumPy (e.g., so they can use newer xarray with a version of NumPy pre-packaged with their system), but if keeping existing things working with the current version of NumPy is a pain, then it may be worth upgrading the minimum required version.

shoyer avatar Apr 12 '19 16:04 shoyer

One additional issue. It seems like pint has some odd behavior with dask. Multiplication (and I assume addition) is not commutative:

In [42]: da.ones((10,)) * ureg.m
Out[42]: dask.array<mul, shape=(10,), dtype=float64, chunksize=(10,)>

In [43]: ureg.m * da.ones((10,))
Out[43]: dask.array<mul, shape=(10,), dtype=float64, chunksize=(10,)> <Unit('meter')>

nbren12 avatar Apr 12 '19 16:04 nbren12