Package request: Tensorflow for mobile
Hoping you could build this and add it to the packages (#tensorflow). Check out the build process: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/makefile/README.md
@fornwall what is the update on the building process of mobile Tensorflow?
I've had a brief go at it... Getting a compiled library is easy, since the build does it for you, but I'm guessing you want Python bindings?
@its-pointless I would be glad if you could compile it for Termux and provide Python bindings.
I'm doing a compile on a test device to see if I can get it to build for arm. Once that is done I can work out what is required from there.
@its-pointless I am counting on your endless support and the help you give to the community. Thank you!
@its-pointless Any progress on build for arm?
I'm trying a few things... I think I might have a method down that works for both arm and aarch64, but it's complicated.
@its-pointless hit me with the method that works and let me dive into the ocean with you, or let's start with whatever works first.
First issue: TensorFlow NEEDS Bazel to build properly. Bazel is written in Java, and Bazel is not a simple thing that can be worked around, it seems. We don't have javac to build Bazel... we do on arm only, and that build is kind of broken at this point. OpenJDK 9 does work for arm, but it's broken when loading the VM, so it's in disabled-packages.
How do we work around this? I was thinking of a method of mounting a Linux chroot (a la Linux Deploy), bind-mounting the Android dirs /system and /data, and then using that to load the Termux/Android compilers, protobuf libs, and Python binaries. I have done something similar before... I don't even know if that will work.
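For what it's worth, the bind-mount part of that idea can be sketched as below. This is only a dry run that prints the commands; CHROOT is a made-up example path (a Linux Deploy rootfs would live elsewhere), and actually running the printed commands requires root:

```shell
#!/bin/sh
# Dry-run sketch of the chroot idea above: bind the Android dirs into a Linux
# rootfs so its toolchain can reach the Termux/Android compilers, protobuf
# libs, and Python binaries. Pipe the output to a root shell to actually do it.
CHROOT=/data/local/linux   # assumed rootfs location

print_chroot_setup() {
  for d in /system /data /proc /dev; do
    echo "mount --bind $d $CHROOT$d"
  done
  echo "chroot $CHROOT /bin/sh"
}

print_chroot_setup
```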
The whole thing at this point is a clusterfuck of pain trying to get this to work.
As a starting point: https://github.com/samjabrahams/tensorflow-on-raspberry-pi/blob/master/GUIDE.md
We haven't even gotten to TensorFlow yet. TensorFlow will not compile with clang; you have to use gcc. We can cross-compile the lib, but we need gcc. Fortunately, I've got that done already.
@its-pointless can you try the method from this link? https://zhiyisun.github.io/2017/02/15/Running-Google-Machine-Learning-Library-Tensorflow-On-ARM-64-bit-Platform.html
It sounds really cool, but I haven't been able to pull it off.
Any progress yet?
I tried compiling tensorflow, but no success.
We could take this to the TensorFlow community: could they take TensorFlow Lite and bring it to Android in a way that we could program against?
I managed to get TensorFlow 1.9.0 installed under Termux natively, but the process is unpleasant. Before you can install it you MUST install numpy like this:

LDFLAGS="-lm -lcompiler_rt" pip install --no-cache-dir numpy
I posted the WHL file(s) here: https://github.com/dilbrent/tensorflow/releases
Once it's installed, there are a significant number of missing libraries to track down. I pulled most of them out of the last build of Kali nethunter. There is almost certainly an easier way to do that.
I installed the Python 3.6 version. My screenshot is of the 2.7 version running in JupyterLab. Ignore that, it's from a different install because I can't get JupyterLab to install under Termux.
@dilbrent just fyi, you can install numpy, scipy and more from @its-pointless repo: https://wiki.termux.com/wiki/Package_Management#By_its-pointless_.28live_the_dream.29
If you try compiling Bazel 0.14.1, it actually gets somewhat into the build process using the local javac from disabled-packages. It takes a long time, though, so it's best to use an x86 CPU and VirtualBox.
Here is the error message:

ERROR: No default_toolchain found for cpu 'unknown'. Valid cpus are: [k8, local, armeabi-v7a, x64_windows, x64_windows_msvc, x64_windows_msys, s390x, ios_x86_64]
INFO: Elapsed time: 61.269s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (3 packages loaded)
I recently built Bazel 0.15.0 for arm as well, so if you want to take a stab at running the actual build process for tensorflow under Termux you can start with that: https://github.com/dilbrent/bazel/releases/
But since I already built tensorflow you can skip Bazel entirely, although I agree that it would be nice to be able to build it under Termux. It takes about 9 hours to build on my arm device. It's even slower in an emulated environment.
I found that Bazel already supports armv7 (32-bit) and aarch64 (64-bit). Compiling Bazel on arm is an easy thing now. @its-pointless @dilbrent
Could you please post a guide on the steps it takes to install tensorflow? I saw your result, but I don't really understand how you got it working.
@brehimpxlin: Install numpy as I mentioned in my previous comment, and then install the 32-bit tensorflow WHL file with pip like you would install any other WHL file. I used the WHL file I built and released on GitHub, but in theory the official ARM builds should work too, although I did not confirm that.
Then you will start to get complaints about missing libraries, one at a time, until you have resolved all of them. As I mentioned, there is probably an easier way to do this.
The way I resolved them was to pull the equivalent versions out of the 32-bit nethunter build. You can download it here: https://build.nethunter.com/nightly/2017.11-18-1618/
Specifically I used: nethunter-generic-armhf-kalifs-full-rolling-2017.11-18-1618.zip
Basically, for each missing file it complains about, pull the matching file out of the nethunter image and put it in your Termux path. Once all of the files are resolved, you should be able to create a tensorflow object in Python 3.6 under Termux.
I am certain that there is an easier way to do this, but I was in "slash and burn" mode just trying to prove it would run under Termux. If we could determine a full list of the required files for a fresh Termux install, ideally in a fresh Android environment, we could make a solution that works easily for everyone.
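The resolve-one-library-at-a-time loop described above could be semi-automated along these lines. This is a sketch under assumptions: $KALIFS is a made-up variable for wherever the nethunter rootfs was extracted, missing_lib and resolve_once are made-up helper names, and the import errors are assumed to look like a typical dlopen failure:

```shell
#!/bin/sh
# Sketch: pull each library that 'import tensorflow' complains about out of an
# extracted nethunter rootfs ($KALIFS, assumed) into Termux's lib dir.
KALIFS=${KALIFS:-$HOME/kalifs}

# Extract the first lib*.so* name from an error message on stdin, e.g.
# "ImportError: libatlas.so.3: cannot open shared object file".
missing_lib() {
  grep -oE 'lib[A-Za-z0-9_+-]*\.so[.0-9]*' | head -n 1
}

resolve_once() {
  lib=$(python -c 'import tensorflow' 2>&1 | missing_lib)
  [ -n "$lib" ] || return 1                       # nothing missing: done
  find "$KALIFS" -name "$lib" -exec cp {} "$PREFIX/lib/" \;
}

# Repeat until the import stops complaining:
# while resolve_once; do :; done
```

This only illustrates the loop; a library copied this way still has to be ABI-compatible with the Termux environment, as described above.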
Thanks! I tried the WHL files from your GitHub, but they did not work. Is it because of my 64-bit phone? Do you have 64-bit versions, or do you know where to download them?
It's possible that 64-bit arm could be the problem. My builds are specifically optimized for armv7l (on Kali 32-bit arm), which is why I use the Kali image to extract the missing libraries under Termux.
You can try the official nightly ARM build for Raspberry Pi 3: https://storage.googleapis.com/tensorflow-nightly/tensorflow-1.10.0-cp34-none-linux_armv7l.whl
The Pi 3 B+ is 64-bit and the official nightlies run on it, so it might be worth a shot since it would not take long to try, although it appears a few Pi-specific adjustments are made when building for the Pi, so YMMV.
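One quick way to see why a given wheel gets accepted or rejected is to look at its platform tag, the last field of the wheel filename (the convention is name-version-python-abi-platform). wheel_platform below is a made-up helper, not a pip feature:

```shell
#!/bin/sh
# Print the platform tag of a wheel filename (last dash-separated field).
wheel_platform() {
  basename "$1" .whl | awk -F- '{print $NF}'
}

wheel_platform tensorflow-1.10.0-cp34-none-linux_armv7l.whl   # prints: linux_armv7l
```

A 64-bit Termux Python wants a linux_aarch64 (or manylinux aarch64) tag, so pip will refuse a linux_armv7l wheel unless the Python interpreter itself is 32-bit.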
I have additional thoughts about other ways to accomplish this, but each one gets progressively more complicated and time-consuming.
@its-pointless said: "Getting a compiled library is easy, since the build does it for you, but I'm guessing you want Python bindings?"
How did you compile the library? I'm trying to install the C API (I don't care about the Python bindings). They provide x86 versions, but it looks like I'll have to build from source to get it to work on my aarch64 device. Building without Bazel seems complex...
I have a few Bazel binaries for aarch64 built on a Pi 3B+ running Kali. They also work properly on my Galaxy S8. You can get the binaries here: https://github.com/dilbrent/bazel/releases
I have a single binary of Tensorflow 1.13.1 for aarch64 here: https://github.com/dilbrent/tensorflow-1.13.1/releases
I was eventually able to get this to instantiate in Python under Termux but getting all of the dependencies in place was so complicated I didn't share it here. I don't think it's practical to assume we can easily package this for Termux.
1.13.1 is the newest version I built. If you want one of the newer versions you definitely want to use Bazel. I can't imagine how difficult it would be to build Tensorflow without it.
I am attempting to build TensorFlow via the TF Lite guide, and have had to track down a few missing libraries and header files. I'm stuck at:
tensorflow/lite/nnapi/nnapi_implementation.cc:81:12: error: use of undeclared identifier 'shm_open'
It seems that there is no definition of the 'shm_open' function on my device. I have researched it, and I think the function should be declared in /usr/include/sys/mman.h, but alas it is not. Discussion online mentions this function causing errors in Android builds and says it is deprecated there.
Any thoughts?
I successfully built TensorFlow 1.14 for aarch64 and installed it on my ASUS ROG Phone II (Termux / Ubuntu). I created a blog post describing all the steps I took here:
https://medium.com/@twilightdema_14017/developing-tensorflow-on-android-phone-cfc4297b676e
(The link to download the built wheel file is also in my blog post.)
2024 update: installing it on proot-ed Debian was fairly trivial.
On Termux, however, the trickiest part was disabling some sub-sub-packages in the setup.py files.
E.g., to get this output:
Successfully built tensorflow-io
Installing collected packages: tensorflow-io
Successfully installed tensorflow-io-0.37.0
~/downloads/io $
one needs to change setup.py like this:
'''
subpackages = ["tensorflow-io-gcs-filesystem"]
'''

# Instead, we define subpackages as an empty list, to disable these:
subpackages = []

etc.
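That edit can also be scripted, e.g. with GNU sed. This is a sketch; disable_subpackages is a made-up helper name, and the exact line in tensorflow-io's setup.py may differ between releases:

```shell
#!/bin/sh
# Replace any 'subpackages = [...]' assignment in the given setup.py with an
# empty list, mirroring the manual edit above (GNU sed in-place syntax).
disable_subpackages() {
  sed -i 's/^subpackages = \[.*\]/subpackages = []/' "$1"
}
```

After running disable_subpackages on the relevant setup.py, rerun the pip build as before.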
The most involved part was compiling it without Bazel. [Update: Bazel also runs in Termux, see below.]
A series of 'make, cmake, rm CMakeCache.txt' runs, juggling the flags below, was needed:
ARMCC_FLAGS=" -pthread -g -march=armv8-a -mtune=cortex-a53 -Wall -Wextra -funsafe-math-optimizations"
cmake -DCMAKE_C_COMPILER=cc -DCMAKE_CXX_COMPILER=c++ -DCMAKE_C_FLAGS="${ARMCC_FLAGS}" -DCMAKE_CXX_FLAGS="${ARMCC_FLAGS}" -DCMAKE_VERBOSE_MAKEFILE:BOOL=ON -DCMAKE_SYSTEM_NAME=Linux -DCMAKE_SYSTEM_PROCESSOR=armv7 -DTFLITE_ENABLE_XNNPACK=OFF .
(one can also compile it with XNNPACK, a separate git clone, with e.g.:
cmake -DCMAKE_C_COMPILER=cc \
-DCMAKE_CXX_COMPILER=c++ \
-DCMAKE_C_FLAGS="${ARMCC_FLAGS}" \
-DCMAKE_CXX_FLAGS="${ARMCC_FLAGS}" \
-DCMAKE_VERBOSE_MAKEFILE:BOOL=ON \
-DCMAKE_SYSTEM_NAME=Linux \
-DCMAKE_SYSTEM_PROCESSOR=armv7 \
-DTFLITE_ENABLE_XNNPACK=OFF \
-DCMAKE_INSTALL_PREFIX=$PREFIX \
-DCMAKE_SYSROOT=$ANDROID_NDK_ROOT/sysroot \
-DCMAKE_ANDROID_API=26 . )
My box:
System: Linux localhost 4.14.186+ #1 SMP PREEMPT Thu Mar 17 16:28:22 CST 2022 aarch64 Android
Make version: GNU Make 4.4.1
LD_LIBRARY_PATH: /data/data/com.termux/files/usr/lib/jvm/java-17-openjdk/lib/
CFLAGS:
CXXFLAGS:
LDFLAGS: -lpython3.11 {not needed}
CPPFLAGS:
ANDROID_NDK: /storage/emulated/0/Download/android-ndk-r26b
How did you disable the tensorflow-io-gcs-filesystem subpackage? Can you share the steps you took to build it?
- I do not use TensorFlow on Android at all nowadays (torch etc. covers most of my needs), so I have uninstalled it.
- Bazel can be run in plain Termux (I forget how I installed it), so it may help you as a first step.
- Install it in proot-ed Debian (proot-distro login debian) if the regular installation (using our tips) in Termux fails.