LLVM ERROR: Symbol not found: __sync_fetch_and_add_4 running tests on armv7l (armel)
- [No ] I have tried using the latest released version of Numba (0.52 tested, not 0.54).
- [Yes] I have included a self contained code sample to reproduce the problem.
python3 -m numba.runtests --log -lv
numba 0.52 is crashing on armv7l (armel). The error message is
LLVM ERROR: Symbol not found: __sync_fetch_and_add_4
It's the same error message as reported in #3052, but that instance was supposed to have been fixed already.
This is on a debian unstable armel build with python3-numba 0.52.0-4, llvm-9 1:9.0.1-20.
A detailed test log reports
experimental_armel-dchroot$ python3 -m numba.runtests --log -lv
DEBUG:numba.core.byteflow:bytecode dump:
> 0 NOP(arg=None, lineno=38)
2 LOAD_FAST(arg=0, lineno=38)
4 LOAD_FAST(arg=1, lineno=38)
6 BINARY_ADD(arg=None, lineno=38)
8 RETURN_VALUE(arg=None, lineno=38)
DEBUG:numba.core.byteflow:pending: deque([State(pc_initial=0 nstack_initial=0)])
DEBUG:numba.core.byteflow:stack: []
DEBUG:numba.core.byteflow:dispatch pc=0, inst=NOP(arg=None, lineno=38)
DEBUG:numba.core.byteflow:stack []
DEBUG:numba.core.byteflow:dispatch pc=2, inst=LOAD_FAST(arg=0, lineno=38)
DEBUG:numba.core.byteflow:stack []
DEBUG:numba.core.byteflow:dispatch pc=4, inst=LOAD_FAST(arg=1, lineno=38)
DEBUG:numba.core.byteflow:stack ['$a2.0']
DEBUG:numba.core.byteflow:dispatch pc=6, inst=BINARY_ADD(arg=None, lineno=38)
DEBUG:numba.core.byteflow:stack ['$a2.0', '$b4.1']
DEBUG:numba.core.byteflow:dispatch pc=8, inst=RETURN_VALUE(arg=None, lineno=38)
DEBUG:numba.core.byteflow:stack ['$6binary_add.2']
DEBUG:numba.core.byteflow:end state. edges=[]
DEBUG:numba.core.byteflow:-------------------------Prune PHIs-------------------------
DEBUG:numba.core.byteflow:Used_phis: defaultdict(<class 'set'>, {State(pc_initial=0 nstack_initial=0): set()})
DEBUG:numba.core.byteflow:defmap: {}
DEBUG:numba.core.byteflow:phismap: defaultdict(<class 'set'>, {})
DEBUG:numba.core.byteflow:changing phismap: defaultdict(<class 'set'>, {})
DEBUG:numba.core.byteflow:keep phismap: {}
DEBUG:numba.core.byteflow:new_out: defaultdict(<class 'dict'>, {})
DEBUG:numba.core.byteflow:----------------------DONE Prune PHIs-----------------------
DEBUG:numba.core.byteflow:block_infos State(pc_initial=0 nstack_initial=0):
AdaptBlockInfo(insts=((0, {}), (2, {'res': '$a2.0'}), (4, {'res': '$b4.1'}), (6, {'lhs': '$a2.0', 'rhs': '$b4.1', 'res': '$6binary_add.2'}), (8, {'retval': '$6binary_add.2', 'castval': '$8return_value.3'})), outgoing_phis={}, blockstack=(), active_try_block=None, outgoing_edgepushed={})
DEBUG:numba.core.interpreter:label 0:
a = arg(0, name=a) ['a']
b = arg(1, name=b) ['b']
$6binary_add.2 = a + b ['$6binary_add.2', 'a', 'b']
$8return_value.3 = cast(value=$6binary_add.2) ['$6binary_add.2', '$8return_value.3']
return $8return_value.3 ['$8return_value.3']
DEBUG:numba.core.ssa:==== SSA block analysis pass on 0
DEBUG:numba.core.ssa:Running <numba.core.ssa._GatherDefsHandler object at 0xae297c70>
DEBUG:numba.core.ssa:on stmt: a = arg(0, name=a)
DEBUG:numba.core.ssa:on stmt: b = arg(1, name=b)
DEBUG:numba.core.ssa:on stmt: $6binary_add.2 = a + b
DEBUG:numba.core.ssa:on stmt: $8return_value.3 = cast(value=$6binary_add.2)
DEBUG:numba.core.ssa:on stmt: return $8return_value.3
DEBUG:numba.core.ssa:defs defaultdict(<class 'list'>,
{'$6binary_add.2': [<numba.core.ir.Assign object at 0xae29d0a0>],
'$8return_value.3': [<numba.core.ir.Assign object at 0xae29d118>],
'a': [<numba.core.ir.Assign object at 0xae29d298>],
'b': [<numba.core.ir.Assign object at 0xae29d6d0>]})
DEBUG:numba.core.ssa:SSA violators set()
LLVM ERROR: Symbol not found: __sync_fetch_and_add_4
So numba.core.ssa is the last stage logged before the crash.
Debian bug is: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=994294
I don't think it's going to be easy to move forward much with this until Debian picks up 0.54 (and a newer version of llvmlite based on 11).
(Adding the "more info needed" label pending the results of testing 0.54 on Debian ARM.)
Thanks. Hopefully we'll have numba 0.54 packaged with llvmlite/11 before too long.
This issue is marked as stale as it has had no activity in the past 30 days. Please close this issue if no further response or action is needed. Otherwise, please respond with any updates and confirm that this issue still needs to be addressed.
@drew-parsons are there any updates on this? Shall we keep the issue open?
We're still waiting for Debian to package 0.54, sorry! It's also slowed because packaging for llvmlite 0.37 (for llvm 11) needs to get done first. Please do keep the issue open a little longer.
OK, I removed the stale label.
llvmlite 0.37 is now packaged for debian. The next delay is support for Python 3.10, #7562
This issue is marked as stale as it has had no activity in the past 30 days. Please close this issue if no further response or action is needed. Otherwise, please respond with any updates and confirm that this issue still needs to be addressed.
This issue is still live. numba has only just resolved python3.10 support, and it's only available in RC 0.55.0rc1 not in stable release yet. Debian is working on the numba packaging, but apart from the python3.10 issue there are also some complications with tbb. The packaging trouble is discussed at Debian Bug#1000336
Many thanks for the update - I'm going to remove the "more info needed" label so this won't go stale again.
Hi, Debian has finally got its python3.10 sorted out, and has updated numba to 0.55.2 with llvm 11.1.0 and llvmlite 0.39.0.
Note that Debian intends to reduce llvm support to a single version (12 or 13) in the next stable release, so llvm-11 will be dropped, which will cause problems again in the future for testing numba updates. See Debian Bug#1000922. I see that llvmlite support for llvm 12 or 13 is being tracked at https://github.com/numba/llvmlite/issues/688
Unfortunately the __sync_fetch_and_add_4 error is still present in 0.55.2.
Current error output is more or less the same as before:
(sid_armel-dchroot)$ python3 -m numba.runtests --log -lv
/usr/lib/python3/dist-packages/llvmlite/llvmpy/__init__.py:3: UserWarning: The module `llvmlite.llvmpy` is deprecated and will be removed in the future.
warnings.warn(
/usr/lib/python3/dist-packages/llvmlite/llvmpy/core.py:8: UserWarning: The module `llvmlite.llvmpy.core` is deprecated and will be removed in the future. Equivalent functionality is provided by `llvmlite.ir`.
warnings.warn(
/usr/lib/python3/dist-packages/llvmlite/llvmpy/passes.py:17: UserWarning: The module `llvmlite.llvmpy.passes` is deprecated and will be removed in the future. If you are using this code, it should be inlined into your own project.
warnings.warn(
DEBUG:numba.core.byteflow:bytecode dump:
> 0 NOP(arg=None, lineno=38)
2 LOAD_FAST(arg=0, lineno=38)
4 LOAD_FAST(arg=1, lineno=38)
6 BINARY_ADD(arg=None, lineno=38)
8 RETURN_VALUE(arg=None, lineno=38)
DEBUG:numba.core.byteflow:pending: deque([State(pc_initial=0 nstack_initial=0)])
DEBUG:numba.core.byteflow:stack: []
DEBUG:numba.core.byteflow:dispatch pc=0, inst=NOP(arg=None, lineno=38)
DEBUG:numba.core.byteflow:stack []
DEBUG:numba.core.byteflow:dispatch pc=2, inst=LOAD_FAST(arg=0, lineno=38)
DEBUG:numba.core.byteflow:stack []
DEBUG:numba.core.byteflow:dispatch pc=4, inst=LOAD_FAST(arg=1, lineno=38)
DEBUG:numba.core.byteflow:stack ['$a2.0']
DEBUG:numba.core.byteflow:dispatch pc=6, inst=BINARY_ADD(arg=None, lineno=38)
DEBUG:numba.core.byteflow:stack ['$a2.0', '$b4.1']
DEBUG:numba.core.byteflow:dispatch pc=8, inst=RETURN_VALUE(arg=None, lineno=38)
DEBUG:numba.core.byteflow:stack ['$6binary_add.2']
DEBUG:numba.core.byteflow:end state. edges=[]
DEBUG:numba.core.byteflow:-------------------------Prune PHIs-------------------------
DEBUG:numba.core.byteflow:Used_phis: defaultdict(<class 'set'>, {State(pc_initial=0 nstack_initial=0): set()})
DEBUG:numba.core.byteflow:defmap: {}
DEBUG:numba.core.byteflow:phismap: defaultdict(<class 'set'>, {})
DEBUG:numba.core.byteflow:changing phismap: defaultdict(<class 'set'>, {})
DEBUG:numba.core.byteflow:keep phismap: {}
DEBUG:numba.core.byteflow:new_out: defaultdict(<class 'dict'>, {})
DEBUG:numba.core.byteflow:----------------------DONE Prune PHIs-----------------------
DEBUG:numba.core.byteflow:block_infos State(pc_initial=0 nstack_initial=0):
AdaptBlockInfo(insts=((0, {}), (2, {'res': '$a2.0'}), (4, {'res': '$b4.1'}), (6, {'lhs': '$a2.0', 'rhs': '$b4.1', 'res': '$6binary_add.2'}), (8, {'retval': '$6binary_add.2', 'castval': '$8return_value.3'})), outgoing_phis={}, blockstack=(), active_try_block=None, outgoing_edgepushed={})
DEBUG:numba.core.interpreter:label 0:
a = arg(0, name=a) ['a']
b = arg(1, name=b) ['b']
$6binary_add.2 = a + b ['$6binary_add.2', 'a', 'b']
$8return_value.3 = cast(value=$6binary_add.2) ['$6binary_add.2', '$8return_value.3']
return $8return_value.3 ['$8return_value.3']
DEBUG:numba.core.ssa:==== SSA block analysis pass on 0
DEBUG:numba.core.ssa:Running <numba.core.ssa._GatherDefsHandler object at 0xebc53cd0>
DEBUG:numba.core.ssa:on stmt: a = arg(0, name=a)
DEBUG:numba.core.ssa:on stmt: b = arg(1, name=b)
DEBUG:numba.core.ssa:on stmt: $6binary_add.2 = a + b
DEBUG:numba.core.ssa:on stmt: $8return_value.3 = cast(value=$6binary_add.2)
DEBUG:numba.core.ssa:on stmt: return $8return_value.3
DEBUG:numba.core.ssa:defs defaultdict(<class 'list'>,
{'$6binary_add.2': [<numba.core.ir.Assign object at 0xebc57520>],
'$8return_value.3': [<numba.core.ir.Assign object at 0xebc57040>],
'a': [<numba.core.ir.Assign object at 0xebc573e8>],
'b': [<numba.core.ir.Assign object at 0xebc572b0>]})
DEBUG:numba.core.ssa:SSA violators set()
LLVM ERROR: Symbol not found: __sync_fetch_and_add_4
Fatal Python error: Aborted
Current thread 0xf7883310 (most recent call first):
File "/usr/lib/python3/dist-packages/llvmlite/binding/ffi.py", line 151 in __call__
File "/usr/lib/python3/dist-packages/llvmlite/binding/executionengine.py", line 92 in finalize_object
File "/usr/lib/python3/dist-packages/numba/core/codegen.py", line 1061 in wrapper
File "/usr/lib/python3/dist-packages/numba/core/codegen.py", line 1000 in _finalize_specific
File "/usr/lib/python3/dist-packages/numba/core/codegen.py", line 798 in _finalize_final_module
File "/usr/lib/python3/dist-packages/numba/core/codegen.py", line 766 in finalize
File "/usr/lib/python3/dist-packages/numba/core/codegen.py", line 568 in _ensure_finalized
File "/usr/lib/python3/dist-packages/numba/core/codegen.py", line 990 in get_pointer_to_function
File "/usr/lib/python3/dist-packages/numba/core/cpu.py", line 230 in get_executable
File "/usr/lib/python3/dist-packages/numba/core/typed_passes.py", line 423 in run_pass
File "/usr/lib/python3/dist-packages/numba/core/compiler_machinery.py", line 269 in check
File "/usr/lib/python3/dist-packages/numba/core/compiler_machinery.py", line 296 in _runPass
File "/usr/lib/python3/dist-packages/numba/core/compiler_lock.py", line 35 in _acquire_compile_lock
File "/usr/lib/python3/dist-packages/numba/core/compiler_machinery.py", line 341 in run
File "/usr/lib/python3/dist-packages/numba/core/compiler.py", line 463 in _compile_core
File "/usr/lib/python3/dist-packages/numba/core/compiler.py", line 497 in _compile_bytecode
File "/usr/lib/python3/dist-packages/numba/core/compiler.py", line 429 in compile_extra
File "/usr/lib/python3/dist-packages/numba/core/compiler.py", line 693 in compile_extra
File "/usr/lib/python3/dist-packages/numba/np/ufunc/ufuncbuilder.py", line 153 in _compile_core
File "/usr/lib/python3/dist-packages/numba/np/ufunc/ufuncbuilder.py", line 120 in compile
File "/usr/lib/python3/dist-packages/numba/np/ufunc/ufuncbuilder.py", line 172 in _compile_element_wise_function
File "/usr/lib/python3/dist-packages/numba/np/ufunc/dufunc.py", line 219 in _compile_for_argtys
File "/usr/lib/python3/dist-packages/numba/np/ufunc/dufunc.py", line 171 in add
File "/usr/lib/python3/dist-packages/numba/np/ufunc/decorators.py", line 125 in wrap
File "/usr/lib/python3/dist-packages/numba/tests/npyufunc/test_ufuncbuilding.py", line 37 in <module>
File "<frozen importlib._bootstrap>", line 241 in _call_with_frames_removed
File "<frozen importlib._bootstrap_external>", line 883 in exec_module
File "<frozen importlib._bootstrap>", line 688 in _load_unlocked
File "<frozen importlib._bootstrap>", line 1006 in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 1027 in _find_and_load
File "/usr/lib/python3.10/unittest/loader.py", line 154 in loadTestsFromName
File "/usr/lib/python3/dist-packages/numba/testing/__init__.py", line 29 in load_testsuite
File "/usr/lib/python3/dist-packages/numba/tests/npyufunc/__init__.py", line 9 in load_tests
File "/usr/lib/python3.10/unittest/loader.py", line 130 in loadTestsFromModule
File "/usr/lib/python3/dist-packages/numba/testing/loader.py", line 18 in _find_tests
File "/usr/lib/python3.10/unittest/loader.py", line 349 in discover
File "/usr/lib/python3/dist-packages/numba/testing/__init__.py", line 23 in load_testsuite
File "/usr/lib/python3/dist-packages/numba/tests/__init__.py", line 27 in load_tests
File "/usr/lib/python3.10/unittest/loader.py", line 130 in loadTestsFromModule
File "/usr/lib/python3.10/unittest/loader.py", line 191 in loadTestsFromName
File "/usr/lib/python3.10/unittest/loader.py", line 220 in <listcomp>
File "/usr/lib/python3.10/unittest/loader.py", line 220 in loadTestsFromNames
File "/usr/lib/python3.10/unittest/main.py", line 158 in createTests
File "/usr/lib/python3/dist-packages/numba/testing/main.py", line 263 in parseArgs
File "/usr/lib/python3.10/unittest/main.py", line 100 in __init__
File "/usr/lib/python3/dist-packages/numba/testing/main.py", line 168 in __init__
File "/usr/lib/python3/dist-packages/numba/testing/__init__.py", line 54 in run_tests
File "/usr/lib/python3/dist-packages/numba/testing/_runtests.py", line 25 in _main
File "/usr/lib/python3/dist-packages/numba/runtests.py", line 9 in <module>
File "/usr/lib/python3.10/runpy.py", line 86 in _run_code
File "/usr/lib/python3.10/runpy.py", line 196 in _run_module_as_main
Extension modules: numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg.lapack_lite, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, numba.core.typeconv._typeconv, numba._helperlib, numba._dynfunc, numba._dispatcher, numba.core.runtime._nrt_python, numba.np.ufunc._internal, scipy._lib._ccallback_c, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg._cythonized_array_utils, scipy.linalg._flinalg, scipy.linalg._solve_toeplitz, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_blas, scipy.linalg.cython_lapack, scipy.linalg._decomp_update, scipy.sparse._sparsetools, scipy.sparse._csparsetools, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, numba.experimental.jitclass._box (total: 40)
Aborted
The context patch for llvm 13 in https://github.com/numba/llvmlite/pull/802 references the need to use libgcc_s.so.1 with armv7l.
However it doesn't help directly (the symbol is still not found). This is with libgcc_s.so provided by gcc 12.1.0.
(sid_armel-dchroot)dparsons@amdahl:~$ nm -D /lib/arm-linux-gnueabi/libgcc_s.so.1 | grep add
0000e4dc T __adddf3@@GCC_3.0
0000efc8 T __addsf3@@GCC_3.0
0000f978 T __addvdi3@@GCC_3.0
0000f968 T __addvsi3@@GCC_3.0
0000e4dc T __aeabi_dadd@@GCC_3.5
0000efc8 T __aeabi_fadd@@GCC_3.5
0001c060 T __emutls_get_address@@GCC_4.3.0
000125d8 T __gnu_addda3@@GCC_4.3.0
000125ac T __gnu_adddq3@@GCC_4.3.0
000125c8 T __gnu_addha3@@GCC_4.3.0
0001259c T __gnu_addhq3@@GCC_4.3.0
00012594 T __gnu_addqq3@@GCC_4.3.0
000125d0 T __gnu_addsa3@@GCC_4.3.0
000125a4 T __gnu_addsq3@@GCC_4.3.0
00012638 T __gnu_adduda3@@GCC_4.3.0
0001260c T __gnu_addudq3@@GCC_4.3.0
00012628 T __gnu_adduha3@@GCC_4.3.0
000125fc T __gnu_adduhq3@@GCC_4.3.0
000125f4 T __gnu_adduqq3@@GCC_4.3.0
00012630 T __gnu_addusa3@@GCC_4.3.0
00012604 T __gnu_addusq3@@GCC_4.3.0
0001447c T __gnu_ssaddda3@@GCC_4.3.0
000143b8 T __gnu_ssadddq3@@GCC_4.3.0
00014404 T __gnu_ssaddha3@@GCC_4.3.0
00014340 T __gnu_ssaddhq3@@GCC_4.3.0
00014308 T __gnu_ssaddqq3@@GCC_4.3.0
00014450 T __gnu_ssaddsa3@@GCC_4.3.0
0001438c T __gnu_ssaddsq3@@GCC_4.3.0
00014a68 T __gnu_usadduda3@@GCC_4.3.0
000149c0 T __gnu_usaddudq3@@GCC_4.3.0
00014a1c T __gnu_usadduha3@@GCC_4.3.0
00014974 T __gnu_usadduhq3@@GCC_4.3.0
00014950 T __gnu_usadduqq3@@GCC_4.3.0
00014a4c T __gnu_usaddusa3@@GCC_4.3.0
000149a4 T __gnu_usaddusq3@@GCC_4.3.0
@drew-parsons To help understand the setup a bit better, could you provide the output of numba -s please?
This GCC bug report seems to suggest that libgcc.a needs linking along with libgcc_s.so.1 - does adding that to the link help?
The output of numba -s in the armel chroot (standard debian build without patching for libgcc.a) is
/usr/lib/python3/dist-packages/llvmlite/llvmpy/__init__.py:3: UserWarning: The module `llvmlite.llvmpy` is deprecated and will be removed in the future.
warnings.warn(
/usr/lib/python3/dist-packages/llvmlite/llvmpy/core.py:8: UserWarning: The module `llvmlite.llvmpy.core` is deprecated and will be removed in the future. Equivalent functionality is provided by `llvmlite.ir`.
warnings.warn(
/usr/lib/python3/dist-packages/llvmlite/llvmpy/passes.py:17: UserWarning: The module `llvmlite.llvmpy.passes` is deprecated and will be removed in the future. If you are using this code, it should be inlined into your own project.
warnings.warn(
System info:
--------------------------------------------------------------------------------
__Time Stamp__
Report started (local time) : 2022-08-08 11:59:27.338957
UTC start time : 2022-08-08 11:59:27.338974
Running time (s) : 1.621224
__Hardware Information__
Machine : armv8l
CPU Name : generic
CPU Count : 4
Number of accessible CPUs : 4
List of accessible CPUs cores : 0-3
CFS Restrictions (CPUs worth of runtime) : None
CPU Features : fp16 hwdiv hwdiv-arm neon vfp3
vfp4
Memory Total (MB) : 11954
Memory Available (MB) : 11381
__OS Information__
Platform Name : Linux-5.10.0-16-arm64-armv8l-with-glibc2.34
Platform Release : 5.10.0-16-arm64
OS Name : Linux
OS Version : #1 SMP Debian 5.10.127-2 (2022-07-23)
OS Specific Version : ?
Libc Version : glibc 2.34
__Python Information__
Python Compiler : GCC 11.3.0
Python Implementation : CPython
Python Version : 3.10.5
Python Locale : en_US.UTF-8
__Numba Toolchain Versions__
Numba Version : 0.55.2
llvmlite Version : 0.39.0
__LLVM Information__
LLVM Version : 11.1.0
__CUDA Information__
CUDA Device Initialized : False
CUDA Driver Version : ?
CUDA Runtime Version : ?
CUDA NVIDIA Bindings Available : ?
CUDA NVIDIA Bindings In Use : ?
CUDA Detect Output:
None
CUDA Libraries Test Output:
None
__NumPy Information__
NumPy Version : 1.21.5
NumPy Supported SIMD features : ('NEON', 'NEON_FP16', 'NEON_VFPV4')
NumPy Supported SIMD dispatch : ('None found.',)
NumPy Supported SIMD baseline : ('None found.',)
NumPy AVX512_SKX support detected : False
__SVML Information__
SVML State, config.USING_SVML : False
SVML Library Loaded : False
llvmlite Using SVML Patched LLVM : False
SVML Operational : False
__Threading Layer Information__
TBB Threading Layer Available : True
+-->TBB imported successfully.
OpenMP Threading Layer Available : True
+-->Vendor: GNU
Workqueue Threading Layer Available : True
+-->Workqueue imported successfully.
__Numba Environment Variable Information__
None found.
__Conda Information__
Conda not available.
__Installed Packages__
Package Version
------------------ ------------
alabaster 0.7.12
attrs 21.2.0
Babel 2.8.0
beniget 0.4.1
certifi 2020.6.20
chardet 4.0.0
charset-normalizer 2.0.6
colorama 0.4.5
decorator 5.1.1
devscripts 2.22.2
docutils 0.17.1
execnet 1.9.0
gast 0.5.2
idna 3.3
imagesize 1.4.1
importlib-metadata 4.6.4
iniconfig 1.1.1
Jinja2 3.0.3
llvmlite 0.39.0
MarkupSafe 2.0.1
more-itertools 8.10.0
numba 0.55.2
numpy 1.21.5
numpydoc 1.3.1
packaging 21.3
pip 22.2
pluggy 1.0.0+repack
ply 3.11
py 1.10.0
Pygments 2.12.0
pyparsing 3.0.7
pytest 7.1.2
pytest-forked 1.4.0
pytest-xdist 2.5.0
python-debian 0.1.46
pythran 0.11.0
pytz 2022.1
requests 2.27.1
roman 3.3
scipy 1.8.1
setuptools 59.6.0
six 1.16.0
snowballstemmer 2.2.0
Sphinx 4.5.0
sphinx-rtd-theme 1.0.0
tomli 2.0.1
urllib3 1.26.9
wheel 0.37.1
zipp 1.0.0
No errors reported.
__Warning log__
Warning (cuda): CUDA is disabled or no CUDA enabled devices are present.
Exception class: <class 'numba.cuda.cudadrv.error.CudaSupportError'>
Warning: Conda not available.
Error was [Errno 2] No such file or directory: 'conda'
Warning (psutil): psutil cannot be imported. For more accuracy, consider installing it.
Warning (no file): /sys/fs/cgroup/cpuacct/cpu.cfs_quota_us
Warning (no file): /sys/fs/cgroup/cpuacct/cpu.cfs_period_us
--------------------------------------------------------------------------------
If requested, please copy and paste the information between
the dashed (----) lines, or from a given specific section as
appropriate.
=============================================================
IMPORTANT: Please ensure that you are happy with sharing the
contents of the information present, any information that you
wish to keep private you should remove before sharing.
=============================================================
It makes sense that linking to libgcc.a should help. It does contain the missing symbol:
(sid_armel-dchroot)$ nm --quiet /usr/lib/gcc/arm-linux-gnueabi/12/libgcc.a | grep sync_fetch
00000398 T __sync_fetch_and_add_1
00000154 T __sync_fetch_and_add_2
00000000 T __sync_fetch_and_add_4
000004ac T __sync_fetch_and_and_1
00000274 T __sync_fetch_and_and_2
000000a8 T __sync_fetch_and_and_4
00000564 T __sync_fetch_and_nand_1
00000334 T __sync_fetch_and_nand_2
00000118 T __sync_fetch_and_nand_4
00000450 T __sync_fetch_and_or_1
00000214 T __sync_fetch_and_or_2
00000070 T __sync_fetch_and_or_4
000003f4 T __sync_fetch_and_sub_1
000001b4 T __sync_fetch_and_sub_2
00000038 T __sync_fetch_and_sub_4
00000508 T __sync_fetch_and_xor_1
000002d4 T __sync_fetch_and_xor_2
000000e0 T __sync_fetch_and_xor_4
00000048 T __sync_fetch_and_add_8
00000170 T __sync_fetch_and_and_8
00000240 T __sync_fetch_and_nand_8
00000108 T __sync_fetch_and_or_8
000000a8 T __sync_fetch_and_sub_8
000001d8 T __sync_fetch_and_xor_8
I'm not certain how it should be linked in the context of numba and llvmlite however. I tried adding
ll.load_library_permanently('libgcc.a')
after the existing armv7l handling (ll.load_library_permanently('libgcc_s.so.1')) in numba/core/cpu.py l.52, but it still generated a "Symbol not found: __sync_fetch_and_add_4" error.
I also tried export LDFLAGS="-lgcc" before running the test python3 -m numba.runtests --log -lv, but that still gives the error.
Likely I haven't provided the extra link instruction at the right place.
The error is still present in the debian build of numba 0.56.2+dfsg-1 (with llvmlite 0.39.1 and llvm 11.1.0). How should the numba configuration be edited in order to link with libgcc.a?
I am trying to look into the correct way to add this but so far I haven't found the right way to replicate the issue. I've just installed Raspberry Pi OS on a Raspberry Pi 2 (based on Debian Bullseye), installed llvm-dev (to get LLVM 11), built llvmlite and numba from the main branches, and the tests seem to be proceeding without issue.
@drew-parsons Is there something I've missed or something that's obviously different I should do to try and replicate the issue?
I also installed the Debian-packaged Numba 0.52.0 and I haven't so far replicated the issue with that either.
I just noticed this in the numba -s output from above:
__Hardware Information__
Machine : armv8l
whereas I have armv7l.
I just tried the SD card in a Raspberry Pi 3 (as it's ARMv8, a Cortex A53 IIUC) and didn't replicate the issue in that either (though I did also still see the machine listed as armv7l).
I'm not certain how it should be linked in the context of numba and llvmlite however. I tried adding ll.load_library_permanently('libgcc.a') after the existing armv7l handling (ll.load_library_permanently('libgcc_s.so.1')) in numba/core/cpu.py l.52, but it still generated a "Symbol not found: __sync_fetch_and_add_4" error.
Thanks for the hint, @drew-parsons ; adding ll.load_library_permanently('libatomic.so') fixed the issue for me on armhf; I'm guessing this will help armel (armv7l) as well.
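For anyone following along, here's a minimal sketch of what that workaround looks like. The placement inside numba/core/cpu.py's init() is per the discussion above; the 'libatomic.so.1' soname is an assumption about the Debian target, and the function is written defensively so it can be tried anywhere:

```python
def preload_libatomic():
    """Try to make libatomic's __sync_* symbols visible to the JIT.

    Returns True if the library was loaded, False if llvmlite is not
    installed or the library could not be found on this system.
    """
    try:
        import llvmlite.binding as ll
    except ImportError:
        return False
    try:
        # 'libatomic.so.1' is the versioned soname in the standard
        # library path on Debian armel/armhf (an assumption here);
        # load_library_permanently raises RuntimeError if it can't
        # be resolved by the dynamic loader.
        ll.load_library_permanently('libatomic.so.1')
        return True
    except RuntimeError:
        return False
```

Calling this before any JIT compilation happens is the idea; in numba itself that corresponds to the init() path in cpu.py.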
Confirming my armel chroot is armv8l not armv7l (as returned by platform.machine()). That partly explains why loading 'libgcc.a' didn't help, since the loading in cpu.py is done in an armv7l block.
But if I invoke ll.load_library_permanently directly in the init() function in cpu.py, I get an error message
RuntimeError: libgcc.a: cannot open shared object file: No such file or directory
for both libgcc.a and libatomic.so.
Both those files are located where they are "expected" to be, in /usr/lib/gcc/arm-linux-gnueabi/12/, the same as on other arches (apart from the details of the arch triplet). It's behaving as if gcc on armel is not adding /usr/lib/gcc/arm-linux-gnueabi/12/ to the search path, which would explain the original error. Strange that this one architecture would behave differently.
It's also a little strange that the patch worked for you on armhf, in the sense that armhf must be using its /usr/lib/gcc/ dir but then not using the libatomic.so inside it.
I wonder if there is some other gcc flag we should be adding on armel (for armv8l).
edit:
On the other hand, looking more closely at the existing armv7l block, it loads 'libgcc_s.so.1' not 'libgcc_s.so'
The versioned library is in the standard library path (/usr/lib/arm-linux-gnueabi/), so I can load libatomic with ll.load_library_permanently('libatomic.so.1'). But libgcc.a is not located there (it's in the gcc dir, not the standard dir), so I still get either the __sync_fetch_and_add_4 error or the "libgcc.a: No such file" error.
Loading an archive is not yet supported, but there's a PR to add it in llvmlite: https://github.com/numba/llvmlite/pull/902 - you could try adding the code from that branch and using the add_archive() function. I don't know where to place the call without trying, but one possibility is to add ee.add_archive('libgcc.a') after this line: https://github.com/numba/numba/blob/main/numba/core/codegen.py#L1070 - let me know if this doesn't help and I can search a bit more for an appropriate place to put it (basically I'm uncertain because I don't know whether LLVM will behave like the linker on the command line, where adding the archive first doesn't help to resolve relocations in objects later on in the link line).
Looks like something more is needed. I rebuilt llvmlite 0.39.1 with PR#902 applied, then injected ee.add_archive('libgcc.a') at codegen.py l.1070.
But when running the tests, it gets "No such file or directory", presumably referring to libgcc.a. It looks like it mightn't have identified the required path (/usr/lib/gcc/arm-linux-gnueabi/12/).
The error message and python trace was
debian/rules override_dh_auto_test-arch
make[1]: Entering directory '/build/numba/numba-0.56.4+dfsg'
PYBUILD_SYSTEM=custom \
PYTHONPATH=/build/numba/numba-0.56.4+dfsg/../llvmlite-0.39.1/debian/python3-llvmlite/usr/lib/python3/dist-packages/:/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build \
PYBUILD_TEST_ARGS="ls {build_dir} && cd {build_dir} && MPLBACKEND=Agg {interpreter} -Wd runtests.py --exclude-tags='long-running,gdb,compiled_caching' -v -m --random 0.2 -- numba.tests" dh_auto_test || true
I: pybuild base:240: ls /build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build && cd /build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build && MPLBACKEND=Agg python3.11 -Wd runtests.py --exclude-tags='long-running,gdb,compiled_caching' -v -m --random 0.2 -- numba.tests
numba
runtests.py
/build/numba/llvmlite-0.39.1/debian/python3-llvmlite/usr/lib/python3/dist-packages/llvmlite/binding/ffi.py:159: DeprecationWarning: path is deprecated. Use files() instead. Refer to https://importlib-resources.readthedocs.io/en/latest/using.html#migrating-from-legacy for migration advice.
_lib_handle = importlib.resources.path(pkgname, _lib_name)
/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/pycc/platform.py:6: DeprecationWarning:
`numpy.distutils` is deprecated since NumPy 1.23.0, as a result
of the deprecation of `distutils` itself. It will be removed for
Python >= 3.12. For older Python versions it will remain present.
It is recommended to use `setuptools < 60.0` for those Python versions.
For more details, see:
https://numpy.org/devdocs/reference/distutils_status_migration.html
import numpy.distutils.misc_util as np_misc
/usr/lib/python3/dist-packages/setuptools/_distutils/msvccompiler.py:66: DeprecationWarning: msvccompiler is deprecated and slated to be removed in the future. Please discontinue use or file an issue with pypa/distutils describing your use case.
warnings.warn(
Traceback (most recent call last):
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/testing/__init__.py", line 29, in load_testsuite
suite.addTests(loader.loadTestsFromName(f))
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/unittest/loader.py", line 154, in loadTestsFromName
module = __import__(module_name)
^^^^^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/tests/npyufunc/test_ufuncbuilding.py", line 32, in <module>
@vectorize(nopython=True)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/np/ufunc/decorators.py", line 123, in wrap
vec = Vectorize(func, **kws)
^^^^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/np/ufunc/decorators.py", line 38, in __new__
return imp(func, identity=identity, cache=cache, targetoptions=kws)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/np/ufunc/dufunc.py", line 85, in __init__
self._initialize(dispatcher, identity)
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/np/ufunc/dufunc.py", line 94, in _initialize
self._install_cg()
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/np/ufunc/dufunc.py", line 313, in _install_cg
targetctx = self._dispatcher.targetdescr.target_context
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/np/ufunc/ufuncbuilder.py", line 65, in target_context
return cpu_target.target_context
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/core/registry.py", line 47, in target_context
return self._toplevel_target_context
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/functools.py", line 1001, in __get__
val = self.func(instance)
^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/core/registry.py", line 31, in _toplevel_target_context
return cpu.CPUContext(self.typing_context, self._target_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/core/cpu.py", line 40, in __init__
super().__init__(typingctx, target)
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/core/base.py", line 257, in __init__
self.init()
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/core/compiler_lock.py", line 35, in _acquire_compile_lock
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/core/cpu.py", line 49, in init
self._internal_codegen = codegen.JITCPUCodegen("numba.exec")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/core/codegen.py", line 1172, in __init__
self._init(self._llvm_module)
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/core/codegen.py", line 1188, in _init
self._engine = JitEngine(engine)
^^^^^^^^^^^^^^^^^
File "/build/numba/numba-0.56.4+dfsg/.pybuild/cpython3_3.11_numba/build/numba/core/codegen.py", line 1071, in __init__
ee.add_archive('libgcc.a')
File "/build/numba/llvmlite-0.39.1/debian/python3-llvmlite/usr/lib/python3/dist-packages/llvmlite/binding/executionengine.py", line 168, in add_archive
raise RuntimeError(str(outerr))
RuntimeError: No such file or directory
Ah, I don't think it will identify the required path - does it get further if you give the full path?
I should have thought of using the full path! Using
ee.add_archive('/usr/lib/gcc/arm-linux-gnueabi/12/libgcc.a')
at codegen.py l.1070 does get it going: the tests start running successfully on armel, without the Symbol not found: __sync_fetch_and_add_4 error.
It's not so stable, of course. The 12 in the path refers to gcc-12, so it will break with gcc-13 (the same is true for the existing ll.load_library_permanently('libgcc_s.so.1') line in cpu.py, which will break if an ABI bump changes the soname).
I looked into the documentation of libgcc.a with respect to link options. There's some discussion at https://gcc.gnu.org/onlinedocs/gcc/Link-Options.html, in particular around the -nostdlib flag; there they suggest using -lgcc when needed. There are also -static-libgcc and -shared-libgcc flags, but the documentation says they may be ignored, so -lgcc would be more robust.
There are related forum discussions at https://askubuntu.com/questions/1166961/gcc-static-linking-tries-fails-to-find-libgcc-s-a and https://stackoverflow.com/questions/23410221/linking-libgcc-into-a-nostdlib-compilation
llvmlite's load_library_permanently and the new add_archive need a full file reference. Is there a way to instead have them work with the equivalent of -lgcc (or -lgcc_s), allowing the linker to find the path and file?
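For the shared-library side of this (the -lgcc_s case), one possible approximation, assuming Python's standard ctypes.util.find_library behaves like the linker's -lNAME lookup on your platform; note it does not help with static archives such as libgcc.a:

```python
from ctypes.util import find_library

# find_library('gcc_s') asks the system toolchain which library '-lgcc_s'
# would resolve to, e.g. 'libgcc_s.so.1' on glibc systems; it returns
# None when the toolchain cannot find one.
name = find_library('gcc_s')
print(name)
```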
llvmlite's load_library_permanently and the new add_archive need a full file reference. Is there a way to instead have them work with the equivalent of -lgcc (or -lgcc_s), allowing the linker to find the path and file?
Sorry for the delay in this reply, I missed your comment. I think it's not easy to emulate the behaviour of GCC's -lgcc / -lgcc_s flags, because it's GCC that knows where to look for its libraries. One thing you could do, though, is leverage GCC itself to locate the library, using its -print-search-dirs option. On my system this gives:
$ gcc -print-search-dirs
install: /opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/
programs: =/opt/gcc/11.2.0/libexec/gcc/x86_64-pc-linux-gnu/11.2.0/:/opt/gcc/11.2.0/libexec/gcc/x86_64-pc-linux-gnu/11.2.0/:/opt/gcc/11.2.0/libexec/gcc/x86_64-pc-linux-gnu/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../../x86_64-pc-linux-gnu/bin/x86_64-pc-linux-gnu/11.2.0/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../../x86_64-pc-linux-gnu/bin/x86_64-linux-gnu/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../../x86_64-pc-linux-gnu/bin/
libraries: =/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../../x86_64-pc-linux-gnu/lib/x86_64-pc-linux-gnu/11.2.0/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../../x86_64-pc-linux-gnu/lib/x86_64-linux-gnu/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../../x86_64-pc-linux-gnu/lib/../lib64/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../x86_64-pc-linux-gnu/11.2.0/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../x86_64-linux-gnu/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../../lib64/:/lib/x86_64-pc-linux-gnu/11.2.0/:/lib/x86_64-linux-gnu/:/lib/../lib64/:/usr/lib/x86_64-pc-linux-gnu/11.2.0/:/usr/lib/x86_64-linux-gnu/:/usr/lib/../lib64/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../../x86_64-pc-linux-gnu/lib/:/opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/../../../:/lib/:/usr/lib/
The relevant line begins with libraries. The following extracts the search dirs from GCC and attempts to locate libgcc.a:
import subprocess
from pathlib import Path

def get_libgcc_a_location():
    cp = subprocess.run(['gcc', '-print-search-dirs'], capture_output=True)
    for line in cp.stdout.decode().splitlines():
        if line.startswith('libraries'):
            search_dirs = line.split('=')[1]
            for search_dir in search_dirs.split(':'):
                candidate_path = Path(search_dir) / 'libgcc.a'
                print(f'Trying {candidate_path}...')
                if candidate_path.exists():
                    return candidate_path
    raise FileNotFoundError('Could not locate libgcc.a')

print(f'libgcc.a path is: {get_libgcc_a_location()}')
which on my system gives:
$ python find_libgcc_a.py
Trying /opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/libgcc.a...
libgcc.a path is: /opt/gcc/11.2.0/lib/gcc/x86_64-pc-linux-gnu/11.2.0/libgcc.a
Could something along these lines be built in, using GCC to locate the path to pass to add_archive() in your setup / scenario?
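As an aside, GCC also has a -print-libgcc-file-name option that prints the path to libgcc.a directly, which might avoid parsing the search dirs at all. A sketch (the function name here is made up; clang accepts the same option but may report its compiler-rt builtins archive instead of libgcc.a):

```python
import shutil
import subprocess

def libgcc_file_name(compiler='gcc'):
    """Ask the compiler driver directly for the path to its libgcc.a."""
    if shutil.which(compiler) is None:
        raise FileNotFoundError(f'{compiler} not found on PATH')
    cp = subprocess.run([compiler, '-print-libgcc-file-name'],
                        capture_output=True, check=True)
    return cp.stdout.decode().strip()
```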
In the armel chroot, gcc returns
(sid_armel-dchroot):~$ gcc -print-search-dirs
install: /usr/lib/gcc/arm-linux-gnueabi/12/
programs: =/usr/lib/gcc/arm-linux-gnueabi/12/:/usr/lib/gcc/arm-linux-gnueabi/12/:/usr/lib/gcc/arm-linux-gnueabi/:/usr/lib/gcc/arm-linux-gnueabi/12/:/usr/lib/gcc/arm-linux-gnueabi/:/usr/lib/gcc/arm-linux-gnueabi/12/../../../../arm-linux-gnueabi/bin/arm-linux-gnueabi/12/:/usr/lib/gcc/arm-linux-gnueabi/12/../../../../arm-linux-gnueabi/bin/arm-linux-gnueabi/:/usr/lib/gcc/arm-linux-gnueabi/12/../../../../arm-linux-gnueabi/bin/
libraries: =/usr/lib/gcc/arm-linux-gnueabi/12/:/usr/lib/gcc/arm-linux-gnueabi/12/../../../../arm-linux-gnueabi/lib/arm-linux-gnueabi/12/:/usr/lib/gcc/arm-linux-gnueabi/12/../../../../arm-linux-gnueabi/lib/arm-linux-gnueabi/:/usr/lib/gcc/arm-linux-gnueabi/12/../../../../arm-linux-gnueabi/lib/:/usr/lib/gcc/arm-linux-gnueabi/12/../../../arm-linux-gnueabi/12/:/usr/lib/gcc/arm-linux-gnueabi/12/../../../arm-linux-gnueabi/:/usr/lib/gcc/arm-linux-gnueabi/12/../../../:/lib/arm-linux-gnueabi/12/:/lib/arm-linux-gnueabi/:/lib/:/usr/lib/arm-linux-gnueabi/12/:/usr/lib/arm-linux-gnueabi/:/usr/lib/
So the directory in question is the first entry in libraries.
Not sure if we also need to keep clang in mind. It returns
(sid_armel-dchroot):~$ clang -print-search-dirs
programs: =/usr/bin:/usr/lib/llvm-14/bin:/usr/bin/../lib/gcc/arm-linux-gnueabi/12/../../../../arm-linux-gnueabi/bin
libraries: =/usr/lib/llvm-14/lib/clang/14.0.6:/usr/bin/../lib/gcc/arm-linux-gnueabi/12:/usr/bin/../lib/gcc/arm-linux-gnueabi/12/../../../../lib:/lib/arm-linux-gnueabi:/lib/../lib:/usr/lib/arm-linux-gnueabi:/usr/lib/../lib:/lib:/usr/lib
in which case the directory for libgcc.a is the second entry in libraries.
So if add_archive() iterates over the entries in libraries returned by the -print-search-dirs flag, that should resolve the problem.
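That parsing could be factored out so it works with either driver's output, since gcc and clang both print the same 'libraries: =dir1:dir2:...' shape. A sketch (the function name is made up):

```python
def library_search_dirs(output):
    """Parse '-print-search-dirs' output (from gcc or clang) and return
    the colon-separated library search directories as a list."""
    for line in output.splitlines():
        if line.startswith('libraries'):
            # The line looks like 'libraries: =/dir1:/dir2:...';
            # split on the first '=' to drop the prefix.
            return line.split('=', 1)[1].split(':')
    return []
```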
Your code snippet returns
Trying /usr/lib/gcc/arm-linux-gnueabi/12/libgcc.a...
libgcc.a path is: /usr/lib/gcc/arm-linux-gnueabi/12/libgcc.a
Adapted for clang, it returns
Trying /usr/lib/llvm-14/lib/clang/14.0.6/libgcc.a...
Trying /usr/bin/../lib/gcc/arm-linux-gnueabi/12/libgcc.a...
libgcc.a path for clang is: /usr/bin/../lib/gcc/arm-linux-gnueabi/12/libgcc.a
(arguably the path should be canonicalised to remove the relative .. components)
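Canonicalising could be done lexically with the standard os.path.normpath (or with Path.resolve(), which additionally follows symlinks); a sketch:

```python
import os.path

raw = '/usr/bin/../lib/gcc/arm-linux-gnueabi/12/libgcc.a'
# normpath collapses the '..' component purely lexically; it does not
# consult the filesystem, so it can mis-resolve across symlinked dirs,
# in which case Path(raw).resolve() would be the safer choice.
clean = os.path.normpath(raw)
print(clean)  # /usr/lib/gcc/arm-linux-gnueabi/12/libgcc.a
```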