Add Python quickstart using the new tflite_runtime package

PiperOrigin-RevId: 260810074
This commit is contained in:
A. Unique TensorFlower 2019-07-30 15:36:58 -07:00 committed by TensorFlower Gardener
parent dcb9e62c6c
commit 93cd6c35c4
5 changed files with 175 additions and 31 deletions


@@ -24,6 +2,8 @@ upper_tabs:
    path: /lite/guide/android
  - title: "iOS quickstart"
    path: /lite/guide/ios
  - title: "Python quickstart"
    path: /lite/guide/python
  - title: "FAQ"
    path: /lite/guide/faq
  - title: "Roadmap"


@@ -1,23 +1,37 @@
# Build TensorFlow Lite for ARM64 boards

This page describes how to build the TensorFlow Lite static library for
ARM64-based computers. If you just want to start using TensorFlow Lite to
execute your models, the fastest option is to install the TensorFlow Lite
runtime package as shown in the [Python quickstart](python.md).

Note: This page shows how to compile only the C++ static library for
TensorFlow Lite. Alternative install options include: [install just the Python
interpreter API](python.md) (for inferencing only); [install the full
TensorFlow package from pip](https://www.tensorflow.org/install/pip);
or [build the full TensorFlow package](
https://www.tensorflow.org/install/source).

## Cross-compile for ARM64

To ensure the proper build environment, we recommend using one of our TensorFlow
Docker images such as [tensorflow/tensorflow:nightly-devel](
https://hub.docker.com/r/tensorflow/tensorflow/tags/).

To get started, install the toolchain and libs:

```bash
sudo apt-get update
sudo apt-get install crossbuild-essential-arm64
```

If you are using Docker, you may not use `sudo`.

Now git-clone the TensorFlow repository
(`https://github.com/tensorflow/tensorflow`); if you're using the TensorFlow
Docker image, the repo is already provided in `/tensorflow_src/`. Then run
this script at the root of the TensorFlow repository to download all the
build dependencies:

```bash
./tensorflow/lite/tools/make/download_dependencies.sh
```
@@ -25,7 +39,7 @@ to download all the dependencies:

Note that you only need to do this once.

Then compile:

```bash
./tensorflow/lite/tools/make/build_aarch64_lib.sh
```
@@ -34,17 +48,19 @@ Compile:

This should compile a static library in:
`tensorflow/lite/tools/make/gen/aarch64_armv8-a/lib/libtensorflow-lite.a`.

## Compile natively on ARM64

These steps were tested on HardKernel Odroid C2, gcc version 5.4.0.

Log in to your board and install the toolchain:

```bash
sudo apt-get install build-essential
```

Now git-clone the TensorFlow repository
(`https://github.com/tensorflow/tensorflow`) and run this at the root of
the repository:

```bash
./tensorflow/lite/tools/make/download_dependencies.sh
```
@@ -52,7 +68,7 @@ First, clone the TensorFlow repository. Run this at the root of the repository:

Note that you only need to do this once.

Then compile:

```bash
./tensorflow/lite/tools/make/build_aarch64_lib.sh
```


@@ -1,30 +1,42 @@
# Build TensorFlow Lite for Raspberry Pi

This page describes how to build the TensorFlow Lite static library for
Raspberry Pi. If you just want to start using TensorFlow Lite to execute your
models, the fastest option is to install the TensorFlow Lite runtime package as
shown in the [Python quickstart](python.md).

Note: This page shows how to compile only the C++ static library for
TensorFlow Lite. Alternative install options include: [install just the Python
interpreter API](python.md) (for inferencing only); [install the full
TensorFlow package from pip](https://www.tensorflow.org/install/pip);
or [build the full TensorFlow package](
https://www.tensorflow.org/install/source_rpi).

## Cross-compile for Raspberry Pi

This has been tested on Ubuntu 16.04.3 64-bit and the TensorFlow devel Docker image
[tensorflow/tensorflow:nightly-devel](https://hub.docker.com/r/tensorflow/tensorflow/tags/).

To cross-compile TensorFlow Lite, first install the toolchain and libs:

```bash
sudo apt-get update
sudo apt-get install crossbuild-essential-armhf
```

If you are using Docker, you may not use `sudo`.

Now git-clone the TensorFlow repository
(`https://github.com/tensorflow/tensorflow`); if you're using the TensorFlow
Docker image, the repo is already provided in `/tensorflow_src/`. Then run
this script at the root of the TensorFlow repository to download all the
build dependencies:

```bash
./tensorflow/lite/tools/make/download_dependencies.sh
```

Note that you only need to do this once.

You should then be able to compile:
@@ -36,23 +48,29 @@ You should then be able to compile:

This should compile a static library in:
`tensorflow/lite/tools/make/gen/rpi_armv7l/lib/libtensorflow-lite.a`.

## Compile natively on Raspberry Pi

This has been tested on Raspberry Pi 3b, Raspbian GNU/Linux 9.1 (stretch), gcc version 6.3.0 20170516 (Raspbian 6.3.0-18+rpi1).

Log in to your Raspberry Pi and install the toolchain:

```bash
sudo apt-get install build-essential
```

Now git-clone the TensorFlow repository
(`https://github.com/tensorflow/tensorflow`) and run this at the root of
the repository:

```bash
./tensorflow/lite/tools/make/download_dependencies.sh
```

Note that you only need to do this once.

You should then be able to compile:

```bash
./tensorflow/lite/tools/make/build_rpi_lib.sh
```


@@ -211,9 +211,15 @@ developers should use the

### Linux

Embedded Linux is an important platform for deploying machine learning. To get
started using Python to perform inference with your TensorFlow Lite models,
follow the [Python quickstart](python.md).

To instead install the C++ library, see the
build instructions for [Raspberry Pi](build_rpi.md) or
[Arm64-based boards](build_arm64.md) (for boards such as Odroid C2, Pine64, and
NanoPi).

### Microcontrollers
@@ -289,5 +295,8 @@ resources:

* If you're a mobile developer, visit the [Android quickstart](android.md) or
  [iOS quickstart](ios.md).
* If you're building Linux embedded devices, see the [Python quickstart](
  python.md) or C++ build instructions for [Raspberry Pi](build_rpi.md) and
  [Arm64-based boards](build_arm64.md).
* Explore our [pre-trained models](../models).
* Try our [example apps](https://www.tensorflow.org/lite/examples).


@@ -0,0 +1,99 @@
# Python quickstart
Using TensorFlow Lite with Python is great for embedded devices based on Linux,
such as [Raspberry Pi](https://www.raspberrypi.org/){:.external} and
[Coral devices with Edge TPU](https://coral.withgoogle.com/){:.external},
among many others.
This page shows how you can start running TensorFlow Lite models with Python in
just a few minutes. All you need is a TensorFlow model [converted to TensorFlow
Lite](../convert/). (If you don't have a model converted yet, you can experiment
using the model provided with the example linked below.)
## Install just the TensorFlow Lite interpreter
To quickly start executing TensorFlow Lite models with Python, you can install
just the TensorFlow Lite interpreter, instead of all TensorFlow packages.
This interpreter-only package is a fraction the size of the full TensorFlow
package and includes the bare minimum code required to run inferences with
TensorFlow Lite—it includes only the [`tf.lite.Interpreter`](
https://www.tensorflow.org/api_docs/python/tf/lite/Interpreter) Python class.
This small package is ideal when all you want to do is execute `.tflite` models
and avoid wasting disk space with the large TensorFlow library.
Note: If you need access to other Python APIs, such as the [TensorFlow Lite
Converter](../convert/python_api.md), you must install the [full TensorFlow
package](https://www.tensorflow.org/install/).
To install, download the appropriate Python wheel for your system from the
following table, and then install it with the `pip install` command.
For example, if you're setting up a Raspberry Pi (using Raspbian Buster, which
has Python 3.7), install the Python wheel as follows (after you click to
download the `.whl` file below):
<pre class="devsite-terminal devsite-click-to-copy">
pip3 install tflite_runtime-1.14.0-cp37-cp37m-linux_armv7l.whl
</pre>
<table>
<tr><th></th><th>ARM 32</th><th>ARM 64</th><th>x86-64</th></tr>
<tr><th style="white-space:nowrap">Python 3.5</th>
<td><a href="https://dl.google.com/coral/python/tflite_runtime-1.14.0-cp35-cp35m-linux_armv7l.whl"
>tflite_runtime-1.14.0-cp35-cp35m-linux_armv7l.whl</a></td>
<td><a href="https://dl.google.com/coral/python/tflite_runtime-1.14.0-cp35-cp35m-linux_aarch64.whl"
>tflite_runtime-1.14.0-cp35-cp35m-linux_aarch64.whl</a></td>
<td><a href="https://dl.google.com/coral/python/tflite_runtime-1.14.0-cp35-cp35m-linux_x86_64.whl"
>tflite_runtime-1.14.0-cp35-cp35m-linux_x86_64.whl</a></td>
</tr>
<tr><th>Python 3.6</th>
<td>N/A</td>
<td>N/A</td>
<td><a href="https://dl.google.com/coral/python/tflite_runtime-1.14.0-cp36-cp36m-linux_x86_64.whl"
>tflite_runtime-1.14.0-cp36-cp36m-linux_x86_64.whl</a></td>
</tr>
<tr><th>Python 3.7</th>
<td><a href="https://dl.google.com/coral/python/tflite_runtime-1.14.0-cp37-cp37m-linux_armv7l.whl"
>tflite_runtime-1.14.0-cp37-cp37m-linux_armv7l.whl</a></td>
<td><a href="https://dl.google.com/coral/python/tflite_runtime-1.14.0-cp37-cp37m-linux_aarch64.whl"
>tflite_runtime-1.14.0-cp37-cp37m-linux_aarch64.whl</a></td>
<td><a href="https://dl.google.com/coral/python/tflite_runtime-1.14.0-cp37-cp37m-linux_x86_64.whl"
>tflite_runtime-1.14.0-cp37-cp37m-linux_x86_64.whl</a></td>
</tr>
</table>
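The wheel filenames above follow the standard Python wheel tag convention: a CPython version tag (such as `cp37-cp37m`) plus a platform tag (such as `linux_armv7l`). As an illustration only, this hypothetical helper assembles a filename from that pattern so you can see which parts vary per system:

```python
def tflite_wheel_name(py_version, platform_tag):
    """Build a tflite_runtime 1.14.0 wheel filename from a CPython
    version string (e.g. "3.7") and a platform tag (e.g. "linux_armv7l")."""
    abi = "cp" + py_version.replace(".", "")  # "3.7" -> "cp37"
    return "tflite_runtime-1.14.0-{0}-{0}m-{1}.whl".format(abi, platform_tag)

print(tflite_wheel_name("3.7", "linux_armv7l"))
# tflite_runtime-1.14.0-cp37-cp37m-linux_armv7l.whl
```

This is just a naming sketch; always download the wheel itself from the links in the table.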
## Run an inference using tflite_runtime
To distinguish this interpreter-only package from the full TensorFlow package
(allowing both to be installed, if you choose), the Python module provided in
the above wheel is named `tflite_runtime`.
So instead of importing `Interpreter` from the `tensorflow` module, you need to
import it from `tflite_runtime`.
For example, after you install the package above, copy and run the
[`label_image.py`](
https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/examples/python/)
file. It will (probably) fail because you don't have the `tensorflow` library
installed. To fix it, simply edit this line of the file:
```python
from tensorflow.lite.python.interpreter import Interpreter
```
So it instead reads:
```python
from tflite_runtime.interpreter import Interpreter
```
Now run `label_image.py` again. That's it! You're now executing TensorFlow Lite
models.
For more details about the `Interpreter` API, read [Load and run a model
in Python](inference.md#load-and-run-a-model-in-python).
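As a minimal sketch of that API (assuming the `tflite_runtime` wheel above is installed; the model filename `mobilenet_v1.tflite` and the use of numpy for a random input are illustrative placeholders, not part of the package):

```python
import numpy as np
from tflite_runtime.interpreter import Interpreter

# Load the model and allocate its input/output tensors.
interpreter = Interpreter(model_path="mobilenet_v1.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a random input with the shape and dtype the model expects.
shape = input_details[0]["shape"]
data = np.random.random_sample(shape).astype(input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], data)

# Run inference and read back the first output tensor.
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
print(result)
```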
To convert other TensorFlow models to TensorFlow Lite, read about
the [TensorFlow Lite Converter](../convert/).