noisereduce
NumPy 2.0 release
Since the numpy 2.0.0 release, I get this error when importing the module:
File "/usr/local/lib/python3.10/site-packages/noisereduce/__init__.py", line 1, in <module>
from noisereduce.noisereduce import reduce_noise
File "/usr/local/lib/python3.10/site-packages/noisereduce/noisereduce.py", line 1, in <module>
from noisereduce.spectralgate.stationary import SpectralGateStationary
File "/usr/local/lib/python3.10/site-packages/noisereduce/spectralgate/__init__.py", line 1, in <module>
from .nonstationary import SpectralGateNonStationary
File "/usr/local/lib/python3.10/site-packages/noisereduce/spectralgate/nonstationary.py", line 3, in <module>
from librosa import stft, istft
File "/usr/local/lib/python3.10/site-packages/lazy_loader/__init__.py", line 83, in __getattr__
attr = getattr(submod, name)
File "/usr/local/lib/python3.10/site-packages/lazy_loader/__init__.py", line 82, in __getattr__
submod = importlib.import_module(submod_path)
File "/usr/local/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/usr/local/lib/python3.10/site-packages/librosa/core/spectrum.py", line 17, in <module>
from .audio import resample
File "/usr/local/lib/python3.10/site-packages/librosa/core/audio.py", line 15, in <module>
import soxr
File "/usr/local/lib/python3.10/site-packages/soxr/__init__.py", line 10, in <module>
from . import cysoxr
File "src/soxr/cysoxr.pyx", line 1, in init soxr.cysoxr
ImportError: numpy.core.multiarray failed to import (auto-generated because you didn't call 'numpy.import_array()' after cimporting numpy; use '<void>numpy._import_array' to disable if you are certain you don't need it).
A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.0.0 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.
If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.
I think an easy fix would be to pin the numpy version in requirements.txt while waiting for the soxr update. (https://github.com/dofuuz/python-soxr/issues/28#issue-2314108483)
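For example, a hedged sketch of what that pin could look like in requirements.txt (the exact bound is an assumption on my part; it can be dropped once soxr and librosa ship NumPy 2 compatible wheels):

```
# requirements.txt (temporary upper bound until soxr/librosa support NumPy 2)
numpy<2.0
```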
If you are installing with pip then I think a workaround for now is to specify numpy <2.0 when installing, e.g. `pip install noisereduce "numpy<2.0"` (the quotes keep the shell from treating `<` as a redirect).
But yes, if you clone the repo and then set up an environment with the requirements.txt, you could change it to upper-bound numpy.
librosa will release a new version once python-soxr makes a new release that supports NumPy 2.0. There is a beta version of python-soxr 0.4.0 that supports NumPy 2.0, as stated in the linked comment, but I think librosa might still break on import with numpy >=2.0.
In case it helps, tracking issue for librosa supporting numpy 2.0 is here: https://github.com/librosa/librosa/issues/1831
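Until the dependency chain catches up, one way to surface the problem early is to check the installed NumPy major version before importing noisereduce, so users see a clear message instead of a deep `ImportError` from soxr. A minimal sketch (the helper name is illustrative, not part of noisereduce's API):

```python
# Hedged sketch: fail fast with a clear message when NumPy 2.x is installed
# but compiled dependencies (soxr, via librosa) may still be built against 1.x.
import numpy as np

def numpy_major_version(version: str = np.__version__) -> int:
    """Return the major component of a NumPy version string, e.g. '2.0.0' -> 2."""
    return int(version.split(".")[0])

if numpy_major_version() >= 2:
    # Warn rather than crash: some environments may already have rebuilt wheels.
    print("NumPy >= 2 detected: compiled deps like soxr may need rebuilding, "
          "or pin 'numpy<2' until they ship NumPy 2 compatible wheels.")
```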
Hey,
Thanks for pointing that out. Librosa isn't really necessary and it was causing issues with llvmlite. I've already created a branch without Librosa.
We should think about merging this into the master branch, @timsainb.
Hi @nuniz @timsainb two things:
- just a heads up that `stft` is about to get replaced with the `ShortTimeFFT` class: https://docs.scipy.org/doc/scipy/tutorial/signal.html#short-time-fourier-transform -- so you can get ahead of that deprecation before someone raises another issue 🙂
- not sure how worried you all are about this -- looks like you already cut a new release -- but if you're not testing for numerical differences between the librosa and scipy implementations of stft/istft, it might be worth checking just to see how switching to the scipy versions changes the output of noisereduce.
see e.g. https://dsp.stackexchange.com/questions/71410/difference-between-librosa-stft-and-scipy-signal-stft and also this 2018 post from the lead Librosa maintainer https://groups.google.com/g/librosa/c/Lbh_NT0n7Bo/m/hN045mbeCAAJ
If it's just some numerical error at the fifth decimal place, then most people probably won't care, of course. But I get nervous about that when I want results that rely on some pre-processing to be replicable.
Also not sure if the change to `ShortTimeFFT` will further change results -- from following the PR, it looked to me like a pretty big overhaul, and there were minor bugs that got fixed, e.g. https://github.com/scipy/scipy/commit/54b53022da2b3924f595b1699f7a9c66e62ff811
Hi @NickleDave
- Thanks. Good to know :-) I'll look into that.
- I've tested that change separately before for both stationary and non-stationary versions. It's not bit-wise exact, but close enough.
Maximum value: 0.030; median value: 0.00087; mean value: 0.0019 (for a ±0.9 wav file).
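For anyone reproducing this, a minimal sketch of how per-sample difference statistics like those above could be computed (`diff_stats` is illustrative, not noisereduce's actual test code):

```python
# Compute max / median / mean absolute difference between two equal-length
# signals, e.g. librosa-based vs scipy-based noisereduce outputs.
import numpy as np

def diff_stats(a, b):
    """Return summary stats of the elementwise absolute difference."""
    d = np.abs(np.asarray(a, dtype=float) - np.asarray(b, dtype=float))
    return {"max": float(d.max()),
            "median": float(np.median(d)),
            "mean": float(d.mean())}
```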
> I've tested that change separately before for both stationary and non-stationary versions. It's not bit-wise exact, but close enough.
Excellent, sounds close enough to me 🚀