MAVProxy
Difficulty building on Alpine Linux
Hi,
I have just tried to build MAVProxy on Alpine, e.g.:
apk update && apk add python-pip python-dev gcc libxml2 libxml2-dev py-lxml libxslt libxslt-dev musl-dev
pip install MAVproxy
But I run into errors about conflicting function declarations:
mavnative/mavnative.c:322:13: error: conflicting types for '__assert_fail'
extern void __assert_fail(const char *__assertion, const char *__file, unsigned int __line, const char *__function)
^~~~~~~~~~~~~
In file included from mavnative/mavnative.c:14:0:
/usr/include/assert.h:19:16: note: previous declaration of '__assert_fail' was here
_Noreturn void __assert_fail (const char *, const char *, int, const char *);
^~~~~~~~~~~~~
error: command 'gcc' failed with exit status 1
Any ideas what I should do to build this?
Many thanks, Ed
On Wed, 20 Dec 2017, Ed Simmons wrote:
I have just tried to build MAVProxy on Alpine, e.g.:
Which compiler version is this?
I have tried to build this now with several versions of alpine. My latest attempt gets very similar results.
This time I tried using python 3 instead of 2.7.
mavnative/mavnative.c: In function 'py_mavlink_parse_char':
mavnative/mavnative.c:155:5: warning: enumeration value 'MAVLINK_PARSE_STATE_GOT_BAD_CRC1' not handled in switch [-Wswitch]
switch (status->parse_state)
^~~~~~
mavnative/mavnative.c: At top level:
mavnative/mavnative.c:322:13: error: conflicting types for '__assert_fail'
extern void __assert_fail(const char *__assertion, const char *__file, unsigned int __line, const char *__function)
^~~~~~~~~~~~~
In file included from mavnative/mavnative.c:14:0:
/usr/include/assert.h:19:16: note: previous declaration of '__assert_fail' was here
_Noreturn void __assert_fail (const char *, const char *, int, const char *);
^~~~~~~~~~~~~
mavnative/mavnative.c: In function 'msg_to_py':
mavnative/mavnative.c:646:24: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
for(fnum = 0; fnum < info->num_fields && objValid; fnum++) {
^
error: command 'gcc' failed with exit status 1
/ # uname -a
Linux cdeffd623252 4.10.0-42-generic #46~16.04.1-Ubuntu SMP Mon Dec 4 15:57:59 UTC 2017 x86_64 Linux
/ # gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/libexec/gcc/x86_64-alpine-linux-musl/6.4.0/lto-wrapper
Target: x86_64-alpine-linux-musl
Configured with: /home/buildozer/aports/main/gcc/src/gcc-6.4.0/configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --build=x86_64-alpine-linux-musl --host=x86_64-alpine-linux-musl --target=x86_64-alpine-linux-musl --with-pkgversion='Alpine 6.4.0' --enable-checking=release --disable-fixed-point --disable-libstdcxx-pch --disable-multilib --disable-nls --disable-werror --disable-symvers --enable-__cxa_atexit --enable-default-pie --enable-cloog-backend --enable-languages=c,c++,objc,java,fortran,ada --disable-libssp --disable-libmpx --disable-libmudflap --disable-libsanitizer --enable-shared --enable-threads --enable-tls --with-system-zlib --with-linker-hash-style=gnu
Thread model: posix
gcc version 6.4.0 (Alpine 6.4.0)
I think my previous error was just operator error, so I went right back to the start... Here's some more info, using this basic Dockerfile:
FROM python:2.7-alpine3.7
RUN apk update && apk add build-base libxml2-dev libxslt-dev
RUN pip install lxml
RUN pip install MAVproxy
I now get as far as pymavlink failing to build.
running bdist_wheel
running build
running build_py
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-build-meDpBw/pymavlink/setup.py", line 160, in <module>
ext_modules = extensions
File "/usr/local/lib/python2.7/site-packages/setuptools/__init__.py", line 129, in setup
return distutils.core.setup(**attrs)
File "/usr/local/lib/python2.7/distutils/core.py", line 151, in setup
dist.run_commands()
File "/usr/local/lib/python2.7/distutils/dist.py", line 953, in run_commands
self.run_command(cmd)
File "/usr/local/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/usr/local/lib/python2.7/site-packages/wheel/bdist_wheel.py", line 204, in run
self.run_command('build')
File "/usr/local/lib/python2.7/distutils/cmd.py", line 326, in run_command
self.distribution.run_command(command)
File "/usr/local/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/usr/local/lib/python2.7/distutils/command/build.py", line 127, in run
self.run_command(cmd_name)
File "/usr/local/lib/python2.7/distutils/cmd.py", line 326, in run_command
self.distribution.run_command(command)
File "/usr/local/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/tmp/pip-build-meDpBw/pymavlink/setup.py", line 93, in run
generate_content()
File "/tmp/pip-build-meDpBw/pymavlink/setup.py", line 21, in generate_content
from generator import mavgen, mavparse
File "generator/mavgen.py", line 12, in <module>
from future import standard_library
ImportError: No module named future
----------------------------------------
Failed building wheel for pymavlink
And digging through all the logs it looks like the module future HAS been downloaded...
Collecting MAVproxy
Downloading MAVProxy-1.6.2.tar.gz (6.6MB)
Collecting pymavlink>=1.1.73 (from MAVproxy)
Downloading pymavlink-2.2.7.tar.gz (2.9MB)
Collecting pyserial>=3.0 (from MAVproxy)
Downloading pyserial-3.4-py2.py3-none-any.whl (193kB)
Collecting future (from pymavlink>=1.1.73->MAVproxy)
Downloading future-0.16.0.tar.gz (824kB)
/ # python --version
Python 2.7.14
/ # gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/libexec/gcc/x86_64-alpine-linux-musl/6.4.0/lto-wrapper
Target: x86_64-alpine-linux-musl
Configured with: /home/buildozer/aports/main/gcc/src/gcc-6.4.0/configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --build=x86_64-alpine-linux-musl --host=x86_64-alpine-linux-musl --target=x86_64-alpine-linux-musl --with-pkgversion='Alpine 6.4.0' --enable-checking=release --disable-fixed-point --disable-libstdcxx-pch --disable-multilib --disable-nls --disable-werror --disable-symvers --enable-__cxa_atexit --enable-default-pie --enable-cloog-backend --enable-languages=c,c++,objc,java,fortran,ada --disable-libssp --disable-libmpx --disable-libmudflap --disable-libsanitizer --enable-shared --enable-threads --enable-tls --with-system-zlib --with-linker-hash-style=gnu
Thread model: posix
gcc version 6.4.0 (Alpine 6.4.0)
/ # uname -a
Linux 0ff821c41d67 4.10.0-42-generic #46~16.04.1-Ubuntu SMP Mon Dec 4 15:57:59 UTC 2017 x86_64 Linux
Does the version >=1.1.73 here
Collecting future (from pymavlink>=1.1.73->MAVproxy)
Downloading future-0.16.0.tar.gz (824kB)
refer to the pymavlink version or to future? I'm not familiar with this, but to me it looks like the version of future is too old?
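For what it's worth, in pip's "Collecting future (from pymavlink>=1.1.73->MAVproxy)" line the specifier binds to the name immediately before it, so ">=1.1.73" constrains pymavlink, not future; the trailing "->MAVproxy" just shows which package asked for it. A quick hedged check with the packaging library (the parser pip itself builds on) makes this explicit:

```python
# The specifier in a pip requirement string belongs to the named package.
from packaging.requirements import Requirement

req = Requirement("pymavlink>=1.1.73")
print(req.name)             # the constrained package: pymavlink
print(str(req.specifier))   # the version constraint: >=1.1.73
```

So future-0.16.0 is simply the newest release pip found, not an outdated pin.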
On Thu, 21 Dec 2017, Ed Simmons wrote:
Ok, I went right back to the start... Here's some more info, using this basic Dockerfile:
OK, the future thing is weird. The only thing I can currently think of there is some sort of Python 2 vs Python 3 issue.
The __assert_fail error is happening because the function declarations are different.
The GCC you're using is more recent than any I have here; I'm seeing if artful happens to have the right one installed now....
Peter
I also tried this dockerfile:
FROM python:2.7-alpine3.7
RUN apk update && apk add build-base libxml2-dev libxslt-dev
RUN apk add git
RUN cd /tmp && git clone https://github.com/ArduPilot/MAVProxy.git && cd MAVProxy && python setup.py install
also resulting in an error about future:
ImportError: No module named future
On Thu, 21 Dec 2017, Ed Simmons wrote:
FROM python:2.7-alpine3.7
RUN apk update && apk add build-base libxml2-dev libxslt-dev
RUN apk add git
RUN cd /tmp && git clone https://github.com/ArduPilot/MAVProxy.git && cd MAVProxy && python setup.py install
also resulting in an error about future:
ImportError: No module named future
Throw a "pip install future" and "pip install lxml" into that mix.
I was trying that as your message arrived :)
FROM python:2.7-alpine3.6
RUN apk update && apk add build-base libxml2-dev libxslt-dev
RUN pip install future
RUN apk add git
RUN cd /tmp && git clone https://github.com/ArduPilot/MAVProxy.git && cd MAVProxy && python setup.py install
Gives this (the XML validation message repeats a lot!)
WARNING: Unable to load XML validator libraries. XML validation will not be performed
WARNING: Unable to load XML validator libraries. XML validation will not be performed
WARNING: Unable to load XML validator libraries. XML validation will not be performed
mavnative/mavnative.c: In function 'py_mavlink_parse_char':
mavnative/mavnative.c:155:5: warning: enumeration value 'MAVLINK_PARSE_STATE_GOT_BAD_CRC1' not handled in switch [-Wswitch]
switch (status->parse_state)
^~~~~~
mavnative/mavnative.c: At top level:
mavnative/mavnative.c:322:13: error: conflicting types for '__assert_fail'
extern void __assert_fail(const char *__assertion, const char *__file, unsigned int __line, const char *__function)
^~~~~~~~~~~~~
In file included from mavnative/mavnative.c:14:0:
/usr/include/assert.h:19:16: note: previous declaration of '__assert_fail' was here
_Noreturn void __assert_fail (const char *, const char *, int, const char *);
^~~~~~~~~~~~~
error: Setup script exited with error: command 'gcc' failed with exit status 1
Can't be far off now; I will also try jumping back a version with the Alpine base image.
I think I'm stuck here, because of this:
mavnative/mavnative.c: In function 'py_mavlink_parse_char':
mavnative/mavnative.c:155:5: warning: enumeration value 'MAVLINK_PARSE_STATE_GOT_BAD_CRC1' not handled in switch [-Wswitch]
switch (status->parse_state)
^~~~~~
mavnative/mavnative.c: At top level:
mavnative/mavnative.c:322:13: error: conflicting types for '__assert_fail'
extern void __assert_fail(const char *__assertion, const char *__file, unsigned int __line, const char *__function)
^~~~~~~~~~~~~
In file included from mavnative/mavnative.c:14:0:
/usr/include/assert.h:19:16: note: previous declaration of '__assert_fail' was here
_Noreturn void __assert_fail (const char *, const char *, int, const char *);
^~~~~~~~~~~~~
error: Setup script exited with error: command 'gcc' failed with exit status 1
The difference in the __assert_fail declarations is puzzling...
I should clarify, just in case it wasn't clear above, that all my recent testing has been using python 2.7.
Thanks for your help :)
It turns out it is possible to build... no idea if this is a good idea, but I found the inspiration here: https://github.com/gmyoungblood-parc/docker-alpine-ardupilot/blob/master/Dockerfile#L81
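For anyone landing here later, a hedged sketch of what a working image might look like: it combines the packages from this thread with pymavlink's DISABLE_MAVNATIVE environment variable, which its setup.py checks in order to skip compiling the mavnative C extension. Treat this as an untested assumption, not the exact recipe from the linked Dockerfile:

```dockerfile
# Untested sketch, assuming pymavlink honours DISABLE_MAVNATIVE.
FROM python:2.7-alpine3.7
RUN apk update && apk add build-base libxml2-dev libxslt-dev
# build-time imports must be available before pip builds pymavlink
RUN pip install future lxml
# skip the mavnative C extension whose __assert_fail declaration
# conflicts with musl's assert.h
RUN DISABLE_MAVNATIVE=True pip install MAVProxy
```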
I ran into this as well. This appears to be an incompatibility between musl libc and glibc. Anything prefixed with underscores is considered internal/private and not meant to be used as an external interface, or so I am led to believe.
Without knowing anything about this call specifically: is there an actual documented call that can be used rather than this private one?
See https://alex-robenko.gitbook.io/bare_metal_cpp/basic_needs/assertion.