HowTo: Get a minimal tensorflow environment for your RaspberryPi
Sometimes you have this deep desire to run your fancy neural network while you're on the go with just a battery pack.
This means no fancy x86 server with oh-so fancy graphics cards. My normal go-to platform for mobile processing is an Arduino or ESP32, but this time we need something more powerful - enter RaspberryPi.
Although there are many reports about how people run Tensorflow on their RaspberryPi, most tutorials are outdated
and missing some pieces.
Let's start with some common prerequisites: libhdf5 to load the HDF5 file with our pre-trained model and libatlas for the numeric stuff.
sudo apt-get install libhdf5-dev
sudo apt-get install libatlas-base-dev
I couldn't find any recent manual for running tensorflow on my RaspberryPi, so I first hunted down an already cross-compiled tensorflow wheel to avoid waiting ages for compilation on the Pi or setting up a cross-compilation toolchain on my local x86 machine.
The easy solution:
Use [tensorflow-on-arm](https://github.com/lhelontra/tensorflow-on-arm/releases), cross-compiled by a friendly user on GitHub,
and directly install the provided wheel with your global python3
installation (you first need to install python3-pip).
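The steps look roughly like this (the release tag and wheel filename are just examples, pick the one from the releases page that matches your Pi and Python version):
sudo apt-get install python3-pip
wget https://github.com/lhelontra/tensorflow-on-arm/releases/download/v2.0.0/tensorflow-2.0.0-cp35-none-linux_armv7l.whl
pip3 install tensorflow-2.0.0-cp35-none-linux_armv7l.whl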
As I'm constantly switching servers and machines at work, I decided to not just pin every dependency but also take care of harmonized python versions. I started two years ago with [miniconda]() but it didn't feel like the standard python way and I also hated the speed of the dependency manager baked into conda.
Nowadays I use pyenv (internal link):
pyenv install 3.5.3
pyenv local 3.5.3
python -m venv venv
source venv/bin/activate
pip install *.whl
pip install poetry
poetry update
AAAAAANND we receive an error. But it worked with the same pre-installed python version....?
Checking and comparing the compile flags of the different python installations, we learn that the pyenv build differs from my ubuntu server installation. Instead of hunting for the minimal flag (I suspect --with-fpectl) I used all the flags :)
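For reference, this is one way to see how an interpreter was configured (run it with both the system python and the pyenv one and compare):
python3 -c "import sysconfig; print(sysconfig.get_config_var('CONFIG_ARGS'))"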
export PYTHON_CONFIGURE_OPTS="--enable-ipv6 \
--enable-unicode=ucs4 \
--with-dbmliborder=bdb:gdbm \
--with-system-expat \
--with-system-ffi \
--with-fpectl"
pyenv install 3.5.3
Last step, to make my document-everything heart happy: pyproject.toml
is the new way to document python dependencies instead of the old, dreadful setup.py.
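If you start a project from scratch, something like this records the wheel in pyproject.toml (the project name and wheel filename are placeholders, use whatever you downloaded above):
poetry init --no-interaction --name pi-tf-demo
poetry add ./tensorflow-2.0.0-cp35-none-linux_armv7l.whl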
show export
And now let's run poetry update.
import tensorflow in python is happy.
Finally, let's do some predictions.
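For a quick local smoke test with the pre-trained HDF5 model from the beginning, something like this works (model.h5 is a placeholder, and I'm assuming a single-input Keras model):
python - <<'EOF'
import numpy as np
import tensorflow as tf

# model.h5 is a placeholder for your pre-trained Keras model
model = tf.keras.models.load_model("model.h5")
# feed a dummy all-zeros input just to check that inference runs end to end
dummy = np.zeros((1,) + model.input_shape[1:])
print(model.predict(dummy))
EOF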
My requirements are:
1. Serving: https://github.com/emacski/tensorflow-serving-arm/releases
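A sketch of how that could look with the project's Docker images, assuming they follow the upstream tensorflow/serving conventions (MODEL_NAME environment variable, a SavedModel mounted under /models/<name>, REST API on port 8501); my_model is a placeholder and the right image tag for your Pi is on the releases page:
docker run --rm -p 8501:8501 \
  -v "$PWD/my_model:/models/my_model" \
  -e MODEL_NAME=my_model \
  emacski/tensorflow-serving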