Tom White
**What happened**: calling `matmul` with incompatible array sizes doesn't always fail immediately

**What you expected to happen**: if the array sizes are incompatible, then the operation should (consistently) fail immediately,...
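For contrast, the eager baseline is NumPy, where `matmul` with mismatched inner dimensions raises a `ValueError` at call time. A minimal sketch (NumPy only; the Dask behavior described in the report is noted in a comment rather than executed):

```python
import numpy as np

# Eager NumPy: incompatible inner dimensions (3 vs 4) fail immediately.
a = np.ones((2, 3))
b = np.ones((4, 5))

try:
    np.matmul(a, b)
except ValueError as e:
    print("failed eagerly:", e)

# With Dask arrays, matmul instead builds a lazy task graph, and this issue
# reports that the shape mismatch is not always raised at graph-construction
# time -- it may only surface later, when .compute() runs.
```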
- [ ] Closes https://github.com/dask/community/issues/109
- [ ] Tests added / passed
- [ ] Passes `pre-commit run --all-files`

This is a Dask implementation of the [Python Array API standard](https://data-apis.org/array-api/latest/),...
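The point of implementing the standard is that consuming code can stay library-agnostic by retrieving the array's namespace via `__array_namespace__` instead of importing NumPy or Dask directly. A toy sketch of that dispatch pattern (the `ToyArray`/`ToyNamespace` classes are hypothetical stand-ins, not part of any real library):

```python
# Minimal stand-in for an Array-API-compliant namespace and array type.
class ToyNamespace:
    @staticmethod
    def asarray(obj):
        return ToyArray(list(obj))

class ToyArray:
    def __init__(self, data):
        self.data = data

    def __array_namespace__(self, api_version=None):
        # Standard entry point: return the namespace this array belongs to.
        return ToyNamespace

def double(x):
    # Library-agnostic code: works for any standard-compliant array type.
    xp = x.__array_namespace__()
    return xp.asarray([v * 2 for v in x.data])

print(double(ToyArray([1, 2, 3])).data)  # [2, 4, 6]
```

The same `double` function would work unchanged on NumPy or Dask arrays once both expose the standard namespace.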
For very large datasets it might be beneficial to run on Dask (although there is a lot of mileage in machines with lots of cores and RAM). Joblib supports Dask...
See https://github.com/pystatgen/sgkit/runs/7745453482?check_suite_focus=true. The problem is that cyvcf2 wheels for macOS Python 3.10 are only available from version [0.30.16](https://pypi.org/project/cyvcf2/0.30.16/#files), but those wheels are built against NumPy 1.23, which is incompatible with Numba....
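A common short-term mitigation for this kind of wheel/ABI conflict is to pin one side in a constraints file until the other catches up. A hedged sketch based on the versions in the report (illustrative, not a tested fix, and it trades away the macOS/Python 3.10 wheel):

```
# constraints.txt (illustrative)
# cyvcf2 0.30.16 wheels are built against NumPy 1.23, while Numba at the
# time required numpy<1.23; pinning cyvcf2 below 0.30.16 keeps NumPy at a
# Numba-compatible version, at the cost of the macOS/Python 3.10 wheel.
cyvcf2<0.30.16
numpy<1.23
```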
Hypothesis is finding cases where sgkit and scikit-allel differ. For example:

```
=================================== FAILURES ===================================
_______________________________ test_vs_skallel ________________________________

    @given(args=ld_prune_args())  # pylint: disable=no-value-for-parameter
>   @settings(max_examples=50, deadline=None, phases=PHASES_NO_SHRINK)

sgkit/tests/test_ld.py:158:
_ _ _...
```
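The technique Hypothesis automates here is differential testing: feed the same randomized inputs to two implementations of one spec and assert their outputs agree. A self-contained sketch with plain `random` inputs (the two `impl_*` functions are hypothetical stand-ins, not sgkit or scikit-allel code):

```python
import random

# Two hypothetical implementations of the same spec: sum of squares.
def impl_a(xs):
    return sum(x * x for x in xs)

def impl_b(xs):
    total = 0.0
    for x in xs:
        total += x ** 2
    return total

# Differential check: random inputs, outputs must agree; any divergence
# prints the counterexample, which is what Hypothesis reports (and shrinks).
random.seed(0)
for _ in range(100):
    xs = [random.randint(-10, 10) for _ in range(random.randint(0, 8))]
    assert impl_a(xs) == impl_b(xs), f"divergence on input {xs}"
print("no divergence found in 100 cases")
```

With Hypothesis, the loop becomes a `@given` decorator over an input strategy, and failing inputs are automatically shrunk to a minimal counterexample.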
I have a new Mac Mini with an Apple M1 chip. It would be good to be able to run (and develop) sgkit on this architecture.
From #785:

```python
import sgkit as sg
import sgkit.io.vcf as sgvcf

sgvcf.vcf_to_zarr("sgkit/tests/io/vcf/data/sample.vcf.gz", "sample.vcf.gz.zarr")
ds = sg.load_dataset("sample.vcf.gz.zarr")
sg.save_dataset(ds, "sample2.vcf.gz.zarr", mode="w")
```

prints the warning:

```
SerializationWarning: variable None has data in...
```
[Pyodide](https://github.com/pyodide/pyodide) uses WebAssembly to run Python in the browser. It has support for a lot of the PyData stack, so I wondered how easy it would be to get sgkit...