
Public testing framework for duck array integration

Status: Open · TomNicholas opened this issue 3 years ago • 8 comments

What is your issue?

In #4972 @keewis started writing a public framework for testing the integration of any duck array class in xarray, inspired by the testing framework pandas has for ExtensionArrays. This is a meta-issue for what our version of that framework for wrapping numpy-like duck arrays should look like.

(Feel free to edit / add to this)

What behaviour should we test?

We have a lot of xarray methods to test with any type of duck array. Each of these bullets should correspond to one or more testing base classes that the duck array library author would inherit from (see the sketch after this list). In rough order of increasing complexity:

  • [x] Constructors - Including for Variable #6903
  • [x] Properties - checking that .shape, .dtype etc. exist on the wrapped array, see #4285 for example #6903
  • [x] Reductions - #4972 uses pytest parametrization to test many methods automatically, and hypothesis to test each method against many different array instances.
  • [ ] Unary ops
  • [ ] Binary ops
  • [ ] Selection
  • [ ] Computation
  • [ ] Combining
  • [ ] Groupby
  • [ ] Rolling
  • [ ] Coarsen
  • [ ] Weighted
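
As a rough illustration of the intended usage from the array library author's side, a test module could look something like the sketch below. Note that the module path and base class names are hypothetical (loosely modelled on pandas' ExtensionArray test base classes), not an existing xarray API.

```python
# Hypothetical sketch: `xarray.testing.duckarrays` and the base class names
# are invented here to illustrate the inheritance pattern, analogous to
# pandas' ExtensionArray test base classes.
from xarray.testing.duckarrays import (  # hypothetical module
    ConstructorTests,
    PropertyTests,
    ReductionTests,
)

from mylib import MyDuckArray  # the duck array class under test (hypothetical)


class TestMyDuckArrayConstructors(ConstructorTests):
    array_cls = MyDuckArray


class TestMyDuckArrayProperties(PropertyTests):
    array_cls = MyDuckArray


class TestMyDuckArrayReductions(ReductionTests):
    array_cls = MyDuckArray
```

Optional behaviours (groupby, rolling, ...) would get their own base classes, which a library only inherits from if it supports them.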

We don't need to test that the array class obeys everything else in the Array API Standard. (For instance .device is probably never going to be used by xarray directly.) We instead assume that if the array class doesn't implement something in the API standard but all the generated tests pass, then all is well.

How extensible does our testing framework need to be?

To be able to test any type of wrapped array, our testing framework needs to be quite flexible itself.

  • User-defined checking - For some arrays np.testing.assert_equal is not enough to guarantee correctness, so the user writing the tests needs to be able to specify additional checks (see the sketch after this list). #4972 shows how to do this for checking the units of resulting pint arrays.
  • User-created data? - Some array libraries might need to test array data that is invalid for numpy arrays. I'm thinking specifically of testing the wrapping of ragged arrays. #4285
  • Parallel computing frameworks? - Related to the last point are chunked arrays. Here the strategy requires an extra chunks argument when the array is created, and any result needs .compute() called on it before comparison. Testing arrays that are executed in parallel might also require fairly complicated setup and teardown in fixtures. (see also #6807)
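
A rough sketch of how those extension points might fit together for a chunked array library is below. The ReductionTests base class and the array_strategy / check hook names are hypothetical and only loosely follow the approach in #4972; only the dask, numpy, and hypothesis calls are real APIs.

```python
# Hypothetical sketch: the base class and hook names are invented for
# illustration; only the dask/numpy/hypothesis calls are real APIs.
import dask.array as da
import hypothesis.extra.numpy as npst
import numpy.testing as npt

from xarray.testing.duckarrays import ReductionTests  # hypothetical module


class TestDaskReductions(ReductionTests):
    # user-created data: wrap numpy arrays from a hypothesis strategy
    # into chunked dask arrays
    @staticmethod
    def array_strategy():
        return npst.arrays(dtype=float, shape=(4, 6)).map(
            lambda arr: da.from_array(arr, chunks=2)
        )

    # user-defined checking: compute the lazy result before comparing
    # it against the expected (eager) result
    @staticmethod
    def check(actual, expected):
        npt.assert_allclose(actual.compute(), expected)
```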

What documentation / examples do we need?

All of this content should really go on a dedicated page in the docs, perhaps grouped alongside other ways of extending xarray.

  • [ ] Motivation
  • [ ] What subset of the Array API standard we expect duck array classes to define (could point to a typing protocol?)
  • [ ] Explanation that the array type needs to return the same type from any numpy-like function which xarray might call on it (i.e. the set of duck array instances is closed under numpy operations; illustrated in the sketch after this list)
  • [ ] Explanation of the different base classes
  • [ ] Simple demo of testing a toy numpy-like array class
  • [ ] Point to code testing more advanced examples we actually use (e.g. sparse, pint)
  • [ ] Which advanced behaviours are optional (e.g. Constructors and Properties have to work, but Groupby is optional)
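
To make the "closed under numpy operations" point above concrete, here is a toy, non-xarray example (unit propagation is deliberately naive; the only point is that the same type comes back out of a numpy operation):

```python
import numpy as np


class UnitArray:
    """Toy duck array carrying a unit string; for illustration only."""

    def __init__(self, data, unit):
        self.data = np.asarray(data)
        self.unit = unit

    shape = property(lambda self: self.data.shape)
    dtype = property(lambda self: self.data.dtype)
    ndim = property(lambda self: self.data.ndim)

    # numpy ufuncs (np.add, np.sin, ...) must hand back a UnitArray again,
    # not a bare np.ndarray; unit handling here is deliberately naive.
    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        unwrapped = [x.data if isinstance(x, UnitArray) else x for x in inputs]
        result = getattr(ufunc, method)(*unwrapped, **kwargs)
        return UnitArray(result, self.unit)


x = UnitArray([1.0, 2.0], unit="m")
assert isinstance(np.add(x, 1.0), UnitArray)  # same duck array type comes back
```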

Where should duck array compatibility testing eventually live?

Right now the tests for sparse & pint are going into the xarray repo, but presumably we don't want tests for every duck array type living in this repository. I suggest that we want to work towards eventually having no array library-specific tests in this repository at all. (Except numpy I guess.) Thanks @crusaderky for the original suggestion.

Instead, all tests involving pint could live in pint-xarray, all tests involving sparse could live in the sparse repository (or a new sparse-xarray repo), etc. We would set those test jobs to re-run when xarray is released, and then xref any issues revealed here if need be.

We should probably also move some of our existing tests (see https://github.com/pydata/xarray/pull/7023#pullrequestreview-1104932752).

TomNicholas avatar Aug 08 '22 18:08 TomNicholas

with the implementation in #4972 you should already be able to specify a hypothesis strategy to create e.g. a random awkward array. Same with dask or other parallel computing frameworks: if you can construct a hypothesis strategy for them, the testing framework should be able to use it. check_reduce (or maybe it should be just check?) should allow customizing the comparison (actually, that's the entire test code at the moment), so inserting compute (or todense / get) calls should be easy.

For setup and teardown I think we could use pytest fixtures (and apply them automatically to each function). However, maybe we should just not use parametrize but instead define separate functions for each reduce operation? Then it would be possible to override them manually. As far as I remember I chose not to do that because tests that only delegate to super().test_function() are just not great design. If we can think of a way to do that while avoiding those kinds of test redefinitions I'd be happy with it (and then we could get rid of the apply_marks function, which is an ugly hack of pytest internals).
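
For illustration, an autouse fixture along those lines could look like this (the base class is again hypothetical, and the dask distributed client is just an example of an external resource that needs setup and teardown):

```python
import pytest


class TestChunkedReductions:  # would inherit from a hypothetical ReductionTests base
    # applied automatically to every test in the class: set up an external
    # resource before each test and tear it down afterwards
    @pytest.fixture(autouse=True)
    def dask_client(self):
        from distributed import Client

        client = Client(processes=False)  # setup
        yield client
        client.close()  # teardown
```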

I agree that moving the array library tests to dedicated repositories makes a lot of sense (for example, the pint tests use old versions of the conversion functions from pint-xarray), but note that at the moment the tests for pint seem to increase the total test coverage of xarray a bit. I guess that just means we'd have to improve the rest of the testsuite?

keewis avatar Aug 09 '22 09:08 keewis

> you should already be able to specify a hypothesis strategy to create e.g. a random awkward array

Sounds good!

> or maybe it should be just check?

Yes, probably just check.

> However, maybe we should just not use parametrize but instead define separate functions for each reduce operation?

But then the user writing the test code would have to write one of their own tests per xarray method, wouldn't they? I think we should avoid putting that much work on them if we can. Your current approach seems fine so far...

> the pint tests use old versions of the conversion functions from pint-xarray

That's basically technical debt, so we should make a point of removing them from xarray eventually.

> the tests for pint seem to increase the total test coverage of xarray (https://github.com/pydata/xarray/pull/5692#issuecomment-1040002844). I guess that just means we'd have to improve the rest of the testsuite?

So long as @benbovy (or someone) writes new tests to cover the bugs that were revealed, this is fine.

TomNicholas avatar Aug 09 '22 15:08 TomNicholas

Typing duck arrays is also a little challenging, I find; we pretty much only use Any at the moment. I found some nice references and discussions that might be interesting for this: https://github.com/pmeier/array-protocol and https://github.com/data-apis/array-api/issues/229
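
For reference, one possible shape for such a protocol (purely illustrative, not an agreed-upon xarray API; the attribute set roughly mirrors what xarray's internal duck array checks look for):

```python
from typing import Any, Protocol, Tuple, runtime_checkable

import numpy as np


@runtime_checkable
class DuckArray(Protocol):
    """Illustrative protocol for numpy-like duck arrays."""

    @property
    def shape(self) -> Tuple[int, ...]: ...

    @property
    def dtype(self) -> np.dtype: ...

    @property
    def ndim(self) -> int: ...

    def __array_ufunc__(self, ufunc: Any, method: str, *inputs: Any, **kwargs: Any) -> Any: ...

    def __array_function__(self, func: Any, types: Any, args: Any, kwargs: Any) -> Any: ...
```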

Illviljan avatar Aug 09 '22 18:08 Illviljan

Typing duck array is also a little challenging I find

Thanks @Illviljan - I was literally just thinking about that here.

TomNicholas avatar Aug 09 '22 18:08 TomNicholas

Another thing that might be useful is the hypothesis strategies in the test suite for the array API consortium standard (cc @keewis).

TomNicholas avatar Aug 10 '22 05:08 TomNicholas

there are also the experimental array API strategies built into hypothesis
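
For example (the hypothesis API is explicitly marked experimental, and numpy.array_api was numpy's provisional array API namespace at the time of this thread):

```python
import numpy.array_api as xp  # numpy's provisional array API namespace
from hypothesis import given
from hypothesis.extra.array_api import make_strategies_namespace

# build a set of strategies bound to a specific array API namespace
xps = make_strategies_namespace(xp)


@given(xps.arrays(dtype=xps.scalar_dtypes(), shape=xps.array_shapes()))
def test_shape_roundtrip(arr):
    # e.g. wrap `arr` in an xarray.Variable here and check its properties survive
    assert arr.ndim == len(arr.shape)
```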

keewis avatar Aug 16 '22 10:08 keewis

@asmeurer recently pointed me to https://data-apis.org/array-api-tests/. Would that be useful here?

jhamman avatar Sep 22 '22 22:09 jhamman

Looks like these

> https://data-apis.org/array-api-tests/

use these

> experimental array api strategies

> Would that be useful here?

I think they are complementary. In theory, if xarray supports the array API standard and a library passes all of the array API tests, then it should also pass all of xarray's tests (rendering the latter unnecessary). But in practice I think the tests here would still be useful, if only for the case of libraries that don't fully meet the API standard but would still work fine in xarray.

TomNicholas avatar Sep 22 '22 23:09 TomNicholas