Use `shell=False` for `Popen` on Windows
Description
I'm having a hard time creating new environments with working compiler dependencies.
My debugging attempts suggest that `shell=False` fixes this, although that observation contradicts the comments next to that line.
Further testing may be required.
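For context, a minimal sketch of the kind of call this PR moves towards (hypothetical command, not the actual invocation built in `pytensor/link/c/cmodule.py`): pass the compiler command as an argument list with `shell=False` instead of routing it through `cmd.exe`.

```python
import subprocess

# Hypothetical compiler command; the real one is assembled by cmodule.py.
cmd = ["g++", "-shared", "-O3", "-o", "mod.pyd", "mod.cpp"]

# With shell=False, Popen starts g++ directly instead of going through
# cmd.exe, avoiding the shell's quoting and PATH quirks on Windows.
proc = subprocess.Popen(
    cmd,
    shell=False,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
stdout, stderr = proc.communicate()
print(proc.returncode, stderr.decode(errors="replace"))
```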
Related Issue
Hints?
Checklist
- [x] Checked that the pre-commit linting/style checks pass
- [ ] Included tests that prove the fix is effective or that the new feature works
- [ ] ~Added necessary documentation (docstrings and/or example notebooks)~
- [x] If you are a pro: each commit corresponds to a relevant logical change
Type of change
- [ ] New feature / enhancement
- [x] Bug fix
- [ ] Documentation
- [ ] Maintenance
- [ ] Other (please specify):
📚 Documentation preview 📚: https://pytensor--1324.org.readthedocs.build/en/1324/
Codecov Report
Attention: Patch coverage is 85.91549% with 10 lines in your changes missing coverage. Please review.
Project coverage is 82.08%. Comparing base (f1514eb) to head (fa88367). Report is 185 commits behind head on main.
| Files with missing lines | Patch % | Lines |
|---|---|---|
| pytensor/link/c/cmodule.py | 85.91% | 7 Missing and 3 partials :warning: |
Additional details and impacted files
@@ Coverage Diff @@
## main #1324 +/- ##
==========================================
+ Coverage 82.05% 82.08% +0.02%
==========================================
Files 203 206 +3
Lines 48863 49171 +308
Branches 8695 8718 +23
==========================================
+ Hits 40093 40360 +267
- Misses 6619 6654 +35
- Partials 2151 2157 +6
| Files with missing lines | Coverage Δ | |
|---|---|---|
| pytensor/link/c/cmodule.py | 61.10% <85.91%> (+0.22%) | :arrow_up: |
@Armavica you also worked on these components - what do you think about shell=False? Does it work on your machine?
I will have a look this weekend or next week
> I will have a look this weekend or next week
@Armavica did you already get a chance?
I don't think I worked specifically on this besides cosmetic changes, and I never tried to install PyTensor on Windows.
However, I think if we can make things work with shell=False it's probably for the best. Have you tried without the list to string conversion that follows, that is supposed to be only needed for shell=True?
> Have you tried without the list to string conversion that follows
Without the list-to-string conversion, the first test already failed.
Should we have a try/except where this is called that tries the shell trick if it fails without it, just in case this is actually doing something for someone out there?
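For illustration only, a hedged sketch of what such a fallback could look like (the helper name and retry policy are assumptions, not part of this PR):

```python
import subprocess

def run_compiler(cmd_list):
    """Illustrative fallback: try the direct call, then retry via the shell.

    Hypothetical helper -- not PyTensor's actual API.
    """
    try:
        return subprocess.run(
            cmd_list, shell=False, capture_output=True, check=True
        )
    except (OSError, subprocess.CalledProcessError):
        # Old behaviour: join into a single string and let the shell parse it.
        return subprocess.run(
            " ".join(cmd_list), shell=True, capture_output=True, check=True
        )
```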
I don't think that's needed. To me it looks like the recent changes to the compilation dependencies (switching to g++, no longer using mkl-toolchain, and so on) simply don't work with `shell=True`.
And this will only affect future versions, which will use that new configuration.
We didn't switch to g++ recently or anything like that.
> We didn't switch to g++ recently or anything like that.
I mean this: https://github.com/pymc-devs/pytensor/commit/f37380f5aa04c81b5f98a898395c3b358f8338b5
Related https://github.com/conda-forge/pytensor-suite-feedstock/pull/133#issuecomment-2673831166
What about non-conda users?
Do you have a specific scenario in mind?
~I can also introduce a subprocess_shell config variable.~
Can we get to a conclusion on this one?
From my point of view, this change fixes all Windows setups I can come up with, all of which were broken before. If there are other relevant setups we should test, please let me know.
It all seems good to me, but I'm gun-shy about approving since I haven't touched this part of the codebase before. You're saying the recent changes to the tooling make that comment about the conda `g++.bat` obsolete? @maresb might also weigh in, since he knows a lot about the conda tooling stuff.
I took this further because halfway workarounds are very difficult for me to reason about.
> Have you tried without the list to string conversion that follows
> Without the list-to-string conversion, the first test already failed.
Is this in reference to the following?
https://github.com/pymc-devs/pytensor/blob/2928211abbd4d55fcab25dde98a2048ec0ed9428/pytensor/utils.py#L140-L145
I'd love to get rid of that too, but I'm not quite feeling that motivated.
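(For anyone following along: the standard-library helper for that kind of conversion is `subprocess.list2cmdline`, which quotes arguments for the Windows command line. The snippet below is only a sketch of the general pattern, not necessarily what `utils.py` does.)

```python
import subprocess

cmd = ["g++", "-o", r"C:\path with spaces\mod.pyd", "mod.cpp"]

# With shell=False the argument list can be handed to Popen directly.
# With shell=True the command is parsed by cmd.exe, so callers typically
# pre-join the list into one properly quoted string first:
print(subprocess.list2cmdline(cmd))
# g++ -o "C:\path with spaces\mod.pyd" mod.cpp
```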
Thanks @maresb, your commits look good! I hesitated about that part of the code, but it looks like you brought the right tools to the party.
Running the tests now...
Hmm, this may be an important occasion to make codecov happy. This is a situation where the tests are basically the only indication we're not breaking stuff.
Just about to take off now though. ✈️
Unsurprisingly, I get failing tests when running things locally (even `pip install -e .` doesn't work because I can't compile the scan module...), but with this branch it's better than on main, so that's nice.
91 failed, 8397 passed, 455 skipped, 91 xfailed, 2704 warnings, 40 errors in 864.99s
List of failing tests
FAILED tests/compile/test_mode.py::test_get_target_language - AssertionError: assert ('py',) == ('c', 'py')
FAILED tests/compile/test_shared.py::TestSharedVariable::test_ctors - AssertionError: TensorType(int64, shape=())
FAILED tests/link/c/test_cmodule.py::test_compiler_error - pytensor.link.c.exceptions.MissingGXX: g++ not available! We can't compile c code.
FAILED tests/link/c/test_cmodule.py::test_cache_versioning - UserWarning: `pytensor.config.cxx` is not an identifiable `g++` compiler. PyTensor will disable compiler optimizations specific to `g++`. At worst, this c...
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_intel-Working_CXX-Linux] - AssertionError: assert {''} == {'-liomp5', '..., '-lpthread'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_intel-Working_CXX-Windows] - AssertionError: assert {''} == {'-liomp5', '..., '-lpthread'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_intel-Working_CXX-Darwin] - AssertionError: assert {''} == {'-liomp5', '..., '-lpthread'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_intel-Broken_CXX-Linux] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_intel-Broken_CXX-Windows] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_intel-Broken_CXX-Darwin] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags_conda_windows[mkl_intel] - assert {''} == {'-L"C:\\User..., '-lpthread'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_gnu-Working_CXX-Linux] - AssertionError: assert {''} == {'-lgomp', '-..., '-lpthread'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_gnu-Working_CXX-Windows] - AssertionError: assert {''} == {'-lgomp', '-..., '-lpthread'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_gnu-Working_CXX-Darwin] - AssertionError: assert {''} == {'-lgomp', '-..., '-lpthread'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_gnu-Broken_CXX-Linux] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_gnu-Broken_CXX-Windows] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[mkl_gnu-Broken_CXX-Darwin] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags_conda_windows[mkl_gnu] - assert {''} == {'-L"C:\\User..., '-lpthread'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[accelerate-Working_CXX-Darwin] - AssertionError: assert {''} == {'-framework', 'Accelerate'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[accelerate-Broken_CXX-Linux] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[accelerate-Broken_CXX-Windows] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[accelerate-Broken_CXX-Darwin] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[openblas-Working_CXX-Linux] - AssertionError: assert {''} == {'-fopenmp', ... '-lopenblas'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[openblas-Working_CXX-Windows] - AssertionError: assert {''} == {'-fopenmp', ... '-lopenblas'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[openblas-Working_CXX-Darwin] - AssertionError: assert {''} == {'-fopenmp', ... '-lopenblas'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[openblas-Broken_CXX-Linux] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[openblas-Broken_CXX-Windows] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[openblas-Broken_CXX-Darwin] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags_conda_windows[openblas] - assert {''} == {'-L"C:\\User... '-lopenblas'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[lapack-Working_CXX-Linux] - AssertionError: assert {''} == {'-lblas', '-...apack', '-lm'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[lapack-Working_CXX-Windows] - AssertionError: assert {''} == {'-lblas', '-...apack', '-lm'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[lapack-Working_CXX-Darwin] - AssertionError: assert {''} == {'-lblas', '-...apack', '-lm'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[lapack-Broken_CXX-Linux] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[lapack-Broken_CXX-Windows] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[lapack-Broken_CXX-Darwin] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags_conda_windows[lapack] - assert {''} == {'-L"C:\\User...apack', '-lm'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[blas-Working_CXX-Linux] - AssertionError: assert {''} == {'-lblas', '-lcblas'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[blas-Working_CXX-Windows] - AssertionError: assert {''} == {'-lblas', '-lcblas'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[blas-Working_CXX-Darwin] - AssertionError: assert {''} == {'-lblas', '-lcblas'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[blas-Broken_CXX-Linux] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[blas-Broken_CXX-Windows] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[blas-Broken_CXX-Darwin] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags_conda_windows[blas] - assert {''} == {'-L"C:\\User...s', '-lcblas'}
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[no_blas-Broken_CXX-Linux] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[no_blas-Broken_CXX-Windows] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_default_blas_ldflags[no_blas-Broken_CXX-Darwin] - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
FAILED tests/link/c/test_cmodule.py::test_cache_race_condition - AttributeError: Can't get local object 'test_cache_race_condition.<locals>.f_build'
FAILED tests/link/c/test_op.py::test_ExternalCOp_c_code_cache_version - assert 2 == 0
FAILED tests/link/c/test_type.py::TestEnumTypes::test_op_with_cenumtype_debug - NotImplementedError
FAILED tests/link/test_vm.py::test_partial_function[cvm] - pytensor.link.c.exceptions.MissingGXX: lazylinker will not be imported if pytensor.config.cxx is not set.
FAILED tests/link/test_vm.py::test_partial_function_with_output_keys[cvm] - TypeError: Loop.__call__() got an unexpected keyword argument 'output_subset'
FAILED tests/link/test_vm.py::test_partial_function_with_updates[cvm] - TypeError: Loop.__call__() got an unexpected keyword argument 'output_subset'
FAILED tests/scalar/test_math.py::test_gammainc_nan_c - pytensor.link.c.exceptions.MissingGXX: ("g++ not available! We can't compile c code.", 'FunctionGraph(gammainc(ScalarFromTensor(<Scalar(float64, shape=())...
FAILED tests/scalar/test_math.py::test_gammainc_inf_c - pytensor.link.c.exceptions.MissingGXX: ("g++ not available! We can't compile c code.", 'FunctionGraph(gammainc(ScalarFromTensor(<Scalar(float64, shape=())...
FAILED tests/scalar/test_math.py::test_gammaincc_nan_c - pytensor.link.c.exceptions.MissingGXX: ("g++ not available! We can't compile c code.", 'FunctionGraph(gammaincc(ScalarFromTensor(<Scalar(float64, shape=()...
FAILED tests/scalar/test_math.py::test_gammaincc_inf_c - pytensor.link.c.exceptions.MissingGXX: ("g++ not available! We can't compile c code.", 'FunctionGraph(gammaincc(ScalarFromTensor(<Scalar(float64, shape=()...
FAILED tests/scalar/test_math.py::test_gammal_nan_c - pytensor.link.c.exceptions.MissingGXX: ("g++ not available! We can't compile c code.", 'FunctionGraph(gammal(ScalarFromTensor(<Scalar(float64, shape=())>)...
FAILED tests/scalar/test_math.py::test_gammau_nan_c - pytensor.link.c.exceptions.MissingGXX: ("g++ not available! We can't compile c code.", 'FunctionGraph(gammau(ScalarFromTensor(<Scalar(float64, shape=())>)...
FAILED tests/scalar/test_math.py::test_betainc[c] - pytensor.link.c.exceptions.MissingGXX: ("g++ not available! We can't compile c code.", 'FunctionGraph(betainc(ScalarFromTensor(a), ScalarFromTensor(b), Sc...
FAILED tests/scan/test_basic.py::test_output_storage_reuse[cvm] - ModuleNotFoundError: No module named 'pytensor.scan.scan_perform'
FAILED tests/scan/test_rewriting.py::TestSaveMem::test_while_scan_taps - OverflowError: Python integer 1000 out of bounds for int8
FAILED tests/scan/test_rewriting.py::TestSaveMem::test_while_scan_taps_and_map - OverflowError: Python integer 200 out of bounds for int8
FAILED tests/tensor/rewriting/test_basic.py::TestLocalCanonicalizeAlloc::test_inconsistent_shared[True] - AssertionError: Regex pattern did not match.
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-None-floatX-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-None-int32-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-0-floatX-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-0-int32-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-1-floatX-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-1-int32-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-axis3-floatX-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-axis3-int32-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-axis4-floatX-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/rewriting/test_elemwise.py::TestFusion::test_CAReduce_single_input[sum-sum-axis4-int32-cvm] - ValueError: too many values to unpack (expected 1)
FAILED tests/tensor/test_basic.py::TestAlloc::test_alloc_of_view_linker - UserWarning: `pytensor.config.cxx` is not an identifiable `g++` compiler. PyTensor will disable compiler optimizations specific to `g++`. At worst, this c...
FAILED tests/tensor/test_basic.py::TestAlloc::test_runtime_broadcast[mode1] - UserWarning: `pytensor.config.cxx` is not an identifiable `g++` compiler. PyTensor will disable compiler optimizations specific to `g++`. At worst, this c...
FAILED tests/tensor/test_basic.py::TestJoinAndSplit::test_split_view[c] - UserWarning: `pytensor.config.cxx` is not an identifiable `g++` compiler. PyTensor will disable compiler optimizations specific to `g++`. At worst, this c...
FAILED tests/tensor/test_basic.py::TestLongTensor::test_fit_int64 - AssertionError: assert 'int64' == 'int32'
FAILED tests/tensor/test_blas.py::test_batched_dot_blas_flags - AssertionError: assert False
FAILED tests/tensor/test_elemwise.py::TestDimShuffle::test_c_views - pytensor.link.c.exceptions.MissingGXX: ("g++ not available! We can't compile c code.", 'FunctionGraph(ExpandDims{axis=0}(<Vector(float64, shape=(?,))>))')
FAILED tests/tensor/test_extra_ops.py::test_broadcast_shape_symbolic_one_symbolic - AssertionError: Could not broadcast dimensions. Broadcasting is only allowed along axes that have a statically known length 1. Use `specify_broadcastable`...
FAILED tests/tensor/test_interpolate.py::test_interpolate_scalar_extrapolate[linear] - TypeError: 'numpy.int64' object does not support item assignment
FAILED tests/tensor/test_interpolate.py::test_interpolate_scalar_extrapolate[nearest] - TypeError: 'numpy.int64' object does not support item assignment
FAILED tests/tensor/test_interpolate.py::test_interpolate_scalar_extrapolate[first] - TypeError: 'numpy.int64' object does not support item assignment
FAILED tests/tensor/test_interpolate.py::test_interpolate_scalar_extrapolate[last] - TypeError: 'numpy.int64' object does not support item assignment
FAILED tests/tensor/test_interpolate.py::test_interpolate_scalar_extrapolate[mean] - TypeError: 'numpy.int64' object does not support item assignment
FAILED tests/tensor/test_math.py::TestInvBroadcast::test_grad - pytensor.gradient.GradientError: GradientError: numeric gradient and analytic gradient exceed tolerance:
FAILED tests/tensor/test_math.py::TestMinMax::test_uint[uint64]
FAILED tests/tensor/test_math.py::TestMinMax::test_uint64_special_value
FAILED tests/tensor/test_math_scipy.py::TestHyp2F1Broadcast::test_grad - AttributeError: ("'Scratchpad' object has no attribute 'ufunc'\nApply node that caused the error: Scalarloop(Composite{...}.3, [[0.]], [[0.]], [[0.]], [[0.]], [[-...
FAILED tests/tensor/test_math_scipy.py::TestHyp2F1Grad::test_hyp2f1_grad_stan_cases - AttributeError: 'Scratchpad' object has no attribute 'ufunc'
FAILED tests/tensor/test_math_scipy.py::TestHyp2F1Grad::test_unused_grad_loop_opt[wrt6] - ValueError: not enough values to unpack (expected 2, got 1)
ERROR tests/compile/function/test_types.py::test_minimal_random_function_call_benchmark[True]
ERROR tests/compile/function/test_types.py::test_minimal_random_function_call_benchmark[False]
ERROR tests/graph/test_basic.py::TestTruncatedGraphInputs::test_single_pass_per_node
ERROR tests/scan/test_basic.py::TestExamples::test_reordering
ERROR tests/scan/test_basic.py::TestExamples::test_scan_as_tensor_on_gradients
ERROR tests/scan/test_basic.py::TestExamples::test_multiple_outs_taps
ERROR tests/scan/test_rewriting.py::TestPushOutAddScan::test_pregreedy_optimizer
ERROR tests/scan/test_rewriting.py::TestSaveMem::test_savemem_opt
ERROR tests/tensor/rewriting/test_elemwise.py::TestFusion::test_eval_benchmark
ERROR tests/tensor/test_blockwise.py::test_batched_mvnormal_logp_and_dlogp[cov:()-mu:()]
ERROR tests/tensor/test_blockwise.py::test_batched_mvnormal_logp_and_dlogp[cov:()-mu:(1000,)]
ERROR tests/tensor/test_blockwise.py::test_batched_mvnormal_logp_and_dlogp[cov:()-mu:(4, 1000)]
ERROR tests/tensor/test_blockwise.py::test_batched_mvnormal_logp_and_dlogp[cov:(1000,)-mu:()]
ERROR tests/tensor/test_blockwise.py::test_batched_mvnormal_logp_and_dlogp[cov:(1000,)-mu:(1000,)]
ERROR tests/tensor/test_blockwise.py::test_batched_mvnormal_logp_and_dlogp[cov:(1000,)-mu:(4, 1000)]
ERROR tests/tensor/test_blockwise.py::test_batched_mvnormal_logp_and_dlogp[cov:(4, 1000)-mu:()]
ERROR tests/tensor/test_blockwise.py::test_batched_mvnormal_logp_and_dlogp[cov:(4, 1000)-mu:(1000,)]
ERROR tests/tensor/test_blockwise.py::test_batched_mvnormal_logp_and_dlogp[cov:(4, 1000)-mu:(4, 1000)]
ERROR tests/tensor/test_elemwise.py::TestDimShuffle::test_benchmark[True]
ERROR tests/tensor/test_elemwise.py::TestDimShuffle::test_benchmark[False]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=True-axis=0]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=True-axis=1]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=True-axis=2]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=True-axis=(0, 1)]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=True-axis=(0, 2)]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=True-axis=(1, 2)]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=True-axis=None]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=False-axis=0]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=False-axis=1]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=False-axis=2]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=False-axis=(0, 1)]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=False-axis=(0, 2)]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=False-axis=(1, 2)]
ERROR tests/tensor/test_elemwise.py::test_c_careduce_benchmark[c_contiguous=False-axis=None]
ERROR tests/tensor/test_math_scipy.py::test_gammaincc_ddk_performance
ERROR tests/tensor/test_math_scipy.py::TestHyp2F1Grad::test_benchmark[a-case0]
ERROR tests/tensor/test_math_scipy.py::TestHyp2F1Grad::test_benchmark[a-case1]
ERROR tests/tensor/test_math_scipy.py::TestHyp2F1Grad::test_benchmark[all-case0]
ERROR tests/tensor/test_math_scipy.py::TestHyp2F1Grad::test_benchmark[all-case1]
ERROR tests/tensor/test_shape.py::TestReshape::test_benchmark
Ran a bunch of tests, did some more refactoring and fixed some tests for Windows.
Everything appears to be working fine.
Can we merge this and cut a release?
There is a slight chance that this might break things for some people, but I vote for releasing and then dealing with it in case anything goes wrong.