`Moran_Local` fails when `numba` installed – Spatial Autocorrelation notebook

Open · jGaboardi opened this issue 6 months ago • 1 comment

---------------------------------------------------------------------------
TypingError                               Traceback (most recent call last)
Cell In[41], line 1
----> 1 li = esda.moran.Moran_Local(y, wq)

File ~/github_repos/esda/esda/moran.py:1329, in Moran_Local.__init__(self, y, w, transformation, permutations, geoda_quads, n_jobs, keep_simulations, seed, island_weight)
   1327 self.__moments()
   1328 if permutations:
-> 1329     self.p_sim, self.rlisas = _crand_plus(
   1330         z,
   1331         w,
   1332         self.Is,
   1333         permutations,
   1334         keep_simulations,
   1335         n_jobs=n_jobs,
   1336         stat_func=_moran_local_crand,
   1337         seed=seed,
   1338     )
   1339     self.sim = np.transpose(self.rlisas)
   1340     if keep_simulations:

File ~/github_repos/esda/esda/crand.py:188, in crand(z, w, observed, permutations, keep, n_jobs, stat_func, scaling, seed, island_weight)
    185         n_jobs = 1
    187 if n_jobs == 1:
--> 188     larger, rlocals = compute_chunk(
    189         0,  # chunk start
    190         z,  # chunked z, for serial this is the entire data
    191         z,  # all z, for serial this is also the entire data
    192         observed,  # observed statistics
    193         cardinalities,  # cardinalities conforming to chunked z
    194         self_weights,  # n-length vector containing the self-weights.
    195         other_weights,  # flat weights buffer
    196         permuted_ids,  # permuted ids
    197         scaling,  # scaling applied to all statistics
    198         keep,  # whether or not to keep the local statistics
    199         stat_func,
    200         island_weight,
    201     )
    202 else:
    203     if n_jobs == -1:

File ~/miniforge3/envs/py313_esda-latest/lib/python3.13/site-packages/numba/core/dispatcher.py:424, in _DispatcherBase._compile_for_args(self, *args, **kws)
    420         msg = (f"{str(e).rstrip()} \n\nThis error may have been caused "
    421                f"by the following argument(s):\n{args_str}\n")
    422         e.patch_message(msg)
--> 424     error_rewrite(e, 'typing')
    425 except errors.UnsupportedError as e:
    426     # Something unsupported is present in the user code, add help info
    427     error_rewrite(e, 'unsupported_error')

File ~/miniforge3/envs/py313_esda-latest/lib/python3.13/site-packages/numba/core/dispatcher.py:365, in _DispatcherBase._compile_for_args.<locals>.error_rewrite(e, issue_type)
    363     raise e
    364 else:
--> 365     raise e.with_traceback(None)

TypingError: Failed in nopython mode pipeline (step: nopython frontend)
Failed in nopython mode pipeline (step: nopython frontend)
Failed in nopython mode pipeline (step: nopython frontend)
No implementation of function Function(<intrinsic _impl>) found for signature:
 
 >>> _impl(array(float32, 2d, C), array(float64, 1d, C))
 
There are 2 candidate implementations:
  - Of which 2 did not match due to:
  Intrinsic in function 'dot_2_impl.<locals>._impl': File: numba/np/linalg.py: Line 554.
    With argument(s): '(array(float32, 2d, C), array(float64, 1d, C))':
   Rejected as the implementation raised a specific error:
     TypingError: '@' arguments must all have the same dtype
  raised from ~/miniforge3/envs/py313_esda-latest/lib/python3.13/site-packages/numba/np/linalg.py:574

During: resolving callee type: Function(<intrinsic _impl>)
During: typing of call at ~/miniforge3/envs/py313_esda-latest/lib/python3.13/site-packages/numba/np/linalg.py (593)


File "../../../miniforge3/envs/py313_esda-latest/lib/python3.13/site-packages/numba/np/linalg.py", line 593:
            def _dot2_codegen(context, builder, sig, args):
                <source elided>

        return lambda left, right: _impl(left, right)
        ^

During: Pass nopython_type_inference
During: typing of intrinsic-call at ~/esda/esda/moran.py (2879)

File ".[./esda/moran.py", line 2879](http://localhost:8888/esda/moran.py#line=2878):
def _moran_local_crand(i, z, permuted_ids, weights_i, scaling):
    <source elided>
    zi, zrand = _prepare_univariate(i, z, permuted_ids, other_weights)
    return zi * (zrand @ other_weights + self_weight * zi) * scaling
    ^

During: Pass nopython_type_inference
During: resolving callee type: type(CPUDispatcher(<function _moran_local_crand at 0x14f1a7920>))
During: typing of call at ~/esda/esda/crand.py (325)

During: resolving callee type: type(CPUDispatcher(<function _moran_local_crand at 0x14f1a7920>))
During: typing of call at ~/esda/esda/crand.py (325)

During: resolving callee type: type(CPUDispatcher(<function _moran_local_crand at 0x14f1a7920>))
During: typing of call at ~/esda/esda/crand.py (325)


File ".[./esda/crand.py", line 325](http://localhost:8888/esda/crand.py#line=324):
def compute_chunk(
    <source elided>
        mask[chunk_start + i] = False
        rstats = stat_func(chunk_start + i, z, permuted_ids, weights_i, scaling)
        ^

During: Pass nopython_type_inference
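
For context, the rejected signature above pairs `array(float32, 2d, C)` with `array(float64, 1d, C)`: the conditional-randomization kernel ends up evaluating `zrand @ other_weights` with a `float32` attribute array against `float64` weights, and numba's nopython-mode `@` requires both operands to share a dtype. A minimal sketch of the kind of call that appears to reach this path (the dataset, column name, and weights construction below are placeholders, not the notebook's actual inputs):

```python
import numpy as np
import geopandas as gpd
from libpysal.weights import Queen

import esda

gdf = gpd.read_file("polygons.gpkg")             # placeholder dataset
wq = Queen.from_dataframe(gdf, use_index=True)   # queen contiguity weights
wq.transform = "r"                               # row-standardize

# a float32 attribute leaves z as float32, while the flat weights buffer
# (per the signature above) is float64, so the kernel's `zrand @ other_weights`
# mixes dtypes, which numba's nopython-mode matmul rejects
y = gdf["attribute"].to_numpy(dtype=np.float32)  # placeholder column name

li = esda.moran.Moran_Local(y, wq)               # raises the TypingError above
```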

jGaboardi · Jun 29 '25

Can replicate. Because of the scope of #281, this blocks that PR. I'm trying to fix it so we can merge #281 once and for all!
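
In the meantime, one possible stopgap (a sketch only, not the fix being prepared for #281) is to upcast the attribute vector before constructing the statistic, so both operands of the matmul inside the kernel share a dtype:

```python
import numpy as np
import esda

# upcasting y to float64 sidesteps the mixed-dtype `zrand @ other_weights`
# that numba's nopython-mode matmul rejects
li = esda.moran.Moran_Local(np.asarray(y, dtype=np.float64), wq)
```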

ljwolf · Sep 10 '25