Nick Hodgskin
### Todo

- [x] remove example benchmarks
- [x] Add integration test benchmarks (See `asv_bench/benchmarks/benchmarks_integration.py` for an example.)
  - [x] advection 2d
  - [x] ARGO float example
  - [x] tutorial_nemo_curvilinear.ipynb...
Zarr v3 was released yesterday, causing our unit tests to fail. See [migration guide](https://zarr.readthedocs.io/en/stable/user-guide/v3_migration.html#v3-migration-guide).
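Until the migration-guide changes land, one possible stopgap (an assumption on my part, not a decision recorded here) is to gate Zarr-dependent test paths on the installed major version:

```python
# Minimal version gate (sketch): parse the major component of a version
# string such as zarr.__version__, so v3-incompatible tests can be skipped.
def zarr_major(version: str) -> int:
    """Return the major component of a version string like '2.18.3'."""
    return int(version.split(".")[0])

# e.g. `if zarr_major(zarr.__version__) >= 3: pytest.skip("awaiting v3 migration")`
print(zarr_major("2.18.3"), zarr_major("3.0.1"))
```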
For some reason, reading multiple files into a fieldset fails when the fieldset has a depth dimension. I think this is something to do...
In v3 we used fieldtype and the mesh (flat or spherical) to determine the UnitConverter object that is associated with a field. It would be good to reconsider how this...
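For reference, the kind of conversion the mesh/fieldtype pairing drives can be sketched as below. This is an illustrative stand-in, not Parcels' actual `UnitConverter` API; the function name and the `fieldtype`/`mesh` string arguments are hypothetical. It uses the standard geographic factor that one degree of latitude spans 60 nautical miles (60 × 1852 m), with zonal distances scaled by cos(latitude):

```python
import math

def to_degrees_per_second(value_m_per_s: float, lat: float,
                          fieldtype: str, mesh: str) -> float:
    """Convert a velocity in m/s to deg/s (hypothetical converter sketch).

    On a flat mesh no conversion is applied; on a spherical mesh one degree
    of latitude is 60 * 1852 m, and zonal (U) distances shrink with cos(lat).
    """
    if mesh == "flat":
        return value_m_per_s
    deg_to_m = 1852.0 * 60.0
    if fieldtype == "U":  # zonal velocity
        return value_m_per_s / (deg_to_m * math.cos(math.radians(lat)))
    if fieldtype == "V":  # meridional velocity
        return value_m_per_s / deg_to_m
    return value_m_per_s

# at the equator U and V conversions coincide
print(to_degrees_per_second(1.0, 0.0, "V", "spherical"))
```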
`FieldSet.from_xarray_dataset()` is built on the assumption that a single dataset contains all field information, which does not always hold, since the field information can be scattered across multiple files...
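One possible workaround (a sketch, not an endorsed pattern) is to merge the scattered sources into a single dataset on the xarray side before constructing the FieldSet. The in-memory datasets below stand in for separate files; for files on disk, `xarray.open_mfdataset` does the same job:

```python
import numpy as np
import xarray as xr

# Stand-ins for per-variable files: U and V live in separate datasets
lon = np.linspace(0.0, 10.0, 5)
lat = np.linspace(0.0, 10.0, 5)
ds_u = xr.Dataset({"U": (("lat", "lon"), np.ones((5, 5)))},
                  coords={"lat": lat, "lon": lon})
ds_v = xr.Dataset({"V": (("lat", "lon"), np.zeros((5, 5)))},
                  coords={"lat": lat, "lon": lon})

# Merge on shared coordinates into the single dataset the method expects
ds = xr.merge([ds_u, ds_v])
# fieldset = FieldSet.from_xarray_dataset(ds, variables=..., dimensions=...)
print(sorted(ds.data_vars))
```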
- [x] Choose the correct base branch (`main` for v3 changes, `v4-dev` for v4 changes)

`numpy.testing.assert_allclose` has a better Pytest error message. Before:

```
> assert np.allclose(pset.u, lat, rtol=1e-6)
E...
```
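A small self-contained illustration of the difference (the array values here are invented): on failure, `assert_allclose` reports which elements mismatched and by how much, whereas a bare `assert np.allclose(...)` only reports `assert False`:

```python
import numpy as np

actual = np.array([1.0, 2.0, 3.0])
desired = np.array([1.0, 2.0, 3.0000001])

# Within rtol: passes silently, just like np.allclose would
np.testing.assert_allclose(actual, desired, rtol=1e-6)

# Outside rtol: the AssertionError lists the mismatched elements and tolerances
try:
    np.testing.assert_allclose(actual, np.array([1.0, 2.0, 4.0]), rtol=1e-6)
except AssertionError as e:
    print("Mismatched elements" in str(e))
```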
As discussed in person, and mentioned in #1844, in `v3` there is some coupling between FieldSet and ParticleFile to support writing analytical trajectories...
In `main` we have `check_fieldsets_in_kernels`, which defines restrictions on the FieldSet. I think these can be implemented differently in v4, directly via the kernel API. I propose...
We need to decide to what extent we want to support broken or xarray-incompatible metadata in Parcels (e.g., for units). I am of the mindset that we rely on...
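If we do go the strict route, a check could look something like the sketch below. The function name and policy are hypothetical, not an existing Parcels API; the point is to fail loudly on missing metadata rather than guess:

```python
import numpy as np
import xarray as xr

def require_units(da: xr.DataArray) -> str:
    """Hypothetical strict check: reject a field with no 'units' attribute."""
    units = da.attrs.get("units")
    if units is None:
        raise ValueError(
            f"variable {da.name!r} has no 'units' attribute; "
            "fix the source metadata rather than guessing"
        )
    return units

da = xr.DataArray(np.zeros(3), name="U", attrs={"units": "m s-1"})
print(require_units(da))
```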
To support time-varying depth we have `Field.set_depth_from_field()` and "not_yet_set" in the dimension dict (this is outlined in `tutorial_timevaryingdepthdimensions.ipynb`). This feature, however, is quite intertwined with the codebase (in `FieldSet._check_complete` as...