diff --git a/tensorflow/lite/g3doc/_book.yaml b/tensorflow/lite/g3doc/_book.yaml
index b7b954e2db5..df004b12680 100644
--- a/tensorflow/lite/g3doc/_book.yaml
+++ b/tensorflow/lite/g3doc/_book.yaml
@@ -24,6 +24,8 @@ upper_tabs:
          path: /lite/guide/android
        - title: "iOS quickstart"
          path: /lite/guide/ios
+       - title: "Python quickstart"
+         path: /lite/guide/python
        - title: "FAQ"
          path: /lite/guide/faq
        - title: "Roadmap"
diff --git a/tensorflow/lite/g3doc/guide/build_arm64.md b/tensorflow/lite/g3doc/guide/build_arm64.md
index 304b7217e52..825e235d058 100644
--- a/tensorflow/lite/g3doc/guide/build_arm64.md
+++ b/tensorflow/lite/g3doc/guide/build_arm64.md
@@ -1,23 +1,37 @@
# Build TensorFlow Lite for ARM64 boards

-## Cross compiling
+This page describes how to build the TensorFlow Lite static library for
+ARM64-based computers. If you just want to start using TensorFlow Lite to
+execute your models, the fastest option is to install the TensorFlow Lite
+runtime package as shown in the [Python quickstart](python.md).

-### Installing the toolchain
+Note: This page shows how to compile only the C++ static library for
+TensorFlow Lite. Alternative install options include: [install just the Python
+interpreter API](python.md) (for inferencing only); [install the full
+TensorFlow package from pip](https://www.tensorflow.org/install/pip);
+or [build the full TensorFlow package](
+https://www.tensorflow.org/install/source).
+
+## Cross-compile for ARM64
+
+To ensure the proper build environment, we recommend using one of our TensorFlow
+Docker images such as [tensorflow/tensorflow:nightly-devel](
+https://hub.docker.com/r/tensorflow/tensorflow/tags/).
+
+To get started, install the toolchain and libs:

```bash
sudo apt-get update
sudo apt-get install crossbuild-essential-arm64
```

-> If you are using Docker, you may not use `sudo`.
+If you are using Docker, you do not need to use `sudo`.

-### Building
-
-Clone this Tensorflow repository. Run this script at the root of the repository
-to download all the dependencies:
-
-> The Tensorflow repository is in `/tensorflow` if you are using
-> `tensorflow/tensorflow:nightly-devel` docker image, just try it.
+Now git-clone the TensorFlow repository
+(`https://github.com/tensorflow/tensorflow`)—if you're using the TensorFlow
+Docker image, the repo is already provided in `/tensorflow_src/`—and then run
+this script at the root of the TensorFlow repository to download all the
+build dependencies:

```bash
./tensorflow/lite/tools/make/download_dependencies.sh
```
@@ -25,7 +39,7 @@ to download all the dependencies:

Note that you only need to do this once.

-Compile:
+Then compile:

```bash
./tensorflow/lite/tools/make/build_aarch64_lib.sh
```
@@ -34,17 +48,19 @@ Compile:

This should compile a static library in:
`tensorflow/lite/tools/make/gen/aarch64_armv8-a/lib/libtensorflow-lite.a`.

-## Native compiling
+## Compile natively on ARM64

These steps were tested on HardKernel Odroid C2, gcc version 5.4.0.

-Log in to your board, install the toolchain.
+Log in to your board and install the toolchain:

```bash
sudo apt-get install build-essential
```

-First, clone the TensorFlow repository. Run this at the root of the repository:
+Now git-clone the TensorFlow repository
+(`https://github.com/tensorflow/tensorflow`) and run this at the root of
+the repository:

```bash
./tensorflow/lite/tools/make/download_dependencies.sh
```
@@ -52,7 +68,7 @@ First, clone the TensorFlow repository. Run this at the root of the repository:

Note that you only need to do this once.

-Compile:
+Then compile:

```bash
./tensorflow/lite/tools/make/build_aarch64_lib.sh
```
diff --git a/tensorflow/lite/g3doc/guide/build_rpi.md b/tensorflow/lite/g3doc/guide/build_rpi.md
index 1a438ab50e1..7ab4b434e4f 100644
--- a/tensorflow/lite/g3doc/guide/build_rpi.md
+++ b/tensorflow/lite/g3doc/guide/build_rpi.md
@@ -1,30 +1,42 @@
# Build TensorFlow Lite for Raspberry Pi

-## Cross compiling
+This page describes how to build the TensorFlow Lite static library for
+Raspberry Pi. If you just want to start using TensorFlow Lite to execute your
+models, the fastest option is to install the TensorFlow Lite runtime package as
+shown in the [Python quickstart](python.md).

-### Installing the toolchain
+Note: This page shows how to compile only the C++ static library for
+TensorFlow Lite. Alternative install options include: [install just the Python
+interpreter API](python.md) (for inferencing only); [install the full
+TensorFlow package from pip](https://www.tensorflow.org/install/pip);
+or [build the full TensorFlow package](
+https://www.tensorflow.org/install/source_rpi).

-This has been tested on Ubuntu 16.04.3 64bit and Tensorflow devel docker image
+
+## Cross-compile for Raspberry Pi
+
+This has been tested on Ubuntu 16.04.3 64bit and TensorFlow devel docker image
[tensorflow/tensorflow:nightly-devel](https://hub.docker.com/r/tensorflow/tensorflow/tags/).

-To cross compile TensorFlow Lite, first install the toolchain and libs.
+To cross compile TensorFlow Lite, first install the toolchain and libs:

```bash
sudo apt-get update
sudo apt-get install crossbuild-essential-armhf
```

-> If you are using Docker, you may not use `sudo`.
+If you are using Docker, you do not need to use `sudo`.

-### Building
-
-Clone this Tensorflow repository, Run this script at the root of the repository to download all the dependencies:
-
-> The Tensorflow repository is in `/tensorflow` if you are using `tensorflow/tensorflow:nightly-devel` docker image, just try it.
+Now git-clone the TensorFlow repository
+(`https://github.com/tensorflow/tensorflow`)—if you're using the TensorFlow
+Docker image, the repo is already provided in `/tensorflow_src/`—and then run
+this script at the root of the TensorFlow repository to download all the
+build dependencies:

```bash
./tensorflow/lite/tools/make/download_dependencies.sh
```
+
Note that you only need to do this once.

You should then be able to compile:
@@ -36,23 +48,29 @@ You should then be able to compile:

This should compile a static library in:
`tensorflow/lite/tools/make/gen/rpi_armv7l/lib/libtensorflow-lite.a`.

-## Native compiling
+
+## Compile natively on Raspberry Pi
+
This has been tested on Raspberry Pi 3b, Raspbian GNU/Linux 9.1 (stretch),
gcc version 6.3.0 20170516 (Raspbian 6.3.0-18+rpi1).

-Log in to you Raspberry Pi, install the toolchain.
+Log in to your Raspberry Pi and install the toolchain:

```bash
sudo apt-get install build-essential
```

-First, clone the TensorFlow repository. Run this at the root of the repository:
+Now git-clone the TensorFlow repository
+(`https://github.com/tensorflow/tensorflow`) and run this at the root of
+the repository:

```bash
./tensorflow/lite/tools/make/download_dependencies.sh
```
+
Note that you only need to do this once.

You should then be able to compile:
+
```bash
./tensorflow/lite/tools/make/build_rpi_lib.sh
```
diff --git a/tensorflow/lite/g3doc/guide/get_started.md b/tensorflow/lite/g3doc/guide/get_started.md
index 72ddff4a8f0..a8f5daae9df 100644
--- a/tensorflow/lite/g3doc/guide/get_started.md
+++ b/tensorflow/lite/g3doc/guide/get_started.md
@@ -211,9 +211,15 @@ developers should use the

### Linux

-Embedded Linux is an important platform for deploying machine learning. We
-provide build instructions for both [Raspberry Pi](build_rpi.md) and
-[Arm64-based boards](build_arm64.md) such as Odroid C2, Pine64, and NanoPi.
+Embedded Linux is an important platform for deploying machine learning. To get
+started using Python to perform inference with your TensorFlow Lite models,
+follow the [Python quickstart](python.md).
+
+To instead install the C++ library, see the
+build instructions for [Raspberry Pi](build_rpi.md) or
+[Arm64-based boards](build_arm64.md) (for boards such as Odroid C2, Pine64, and
+NanoPi).
+

### Microcontrollers

@@ -289,5 +295,8 @@ resources:

* If you're a mobile developer, visit [Android quickstart](android.md) or
  [iOS quickstart](ios.md).
+* If you're building Linux embedded devices, see the [Python quickstart](
+  python.md) or C++ build instructions for [Raspberry Pi](build_rpi.md) and
+  [Arm64-based boards](build_arm64.md).
* Explore our [pre-trained models](../models).
* Try our [example apps](https://www.tensorflow.org/lite/examples).
diff --git a/tensorflow/lite/g3doc/guide/python.md b/tensorflow/lite/g3doc/guide/python.md
new file mode 100644
index 00000000000..cba5b2f6f3e
--- /dev/null
+++ b/tensorflow/lite/g3doc/guide/python.md
@@ -0,0 +1,99 @@
+# Python quickstart
+
+Using TensorFlow Lite with Python is great for embedded devices based on Linux,
+such as [Raspberry Pi](https://www.raspberrypi.org/){:.external} and
+[Coral devices with Edge TPU](https://coral.withgoogle.com/){:.external},
+among many others.
+
+This page shows how you can start running TensorFlow Lite models with Python in
+just a few minutes. All you need is a TensorFlow model [converted to TensorFlow
+Lite](../convert/). (If you don't have a model converted yet, you can experiment
+using the model provided with the example linked below.)
+
+## Install just the TensorFlow Lite interpreter
+
+To quickly start executing TensorFlow Lite models with Python, you can install
+just the TensorFlow Lite interpreter, instead of all TensorFlow packages.
+
+This interpreter-only package is a fraction of the size of the full TensorFlow
+package and includes the bare minimum code required to run inferences with
+TensorFlow Lite—it includes only the [`tf.lite.Interpreter`](
+https://www.tensorflow.org/api_docs/python/tf/lite/Interpreter) Python class.
+This small package is ideal when all you want to do is execute `.tflite` models
+and avoid wasting disk space with the large TensorFlow library.
+
+Note: If you need access to other Python APIs, such as the [TensorFlow Lite
+Converter](../convert/python_api.md), you must install the [full TensorFlow
+package](https://www.tensorflow.org/install/).
+
+To install, download the appropriate Python wheel for your system from the
+following table, and then install it with the `pip install` command.
+
+For example, if you're setting up a Raspberry Pi (using Raspbian Buster, which
+has Python 3.7), install the Python wheel as follows (after you click to
+download the `.whl` file below):
+
+```bash
+pip3 install tflite_runtime-1.14.0-cp37-cp37m-linux_armv7l.whl
+```
+<table>
+<tr><th>Python version</th><th>ARM 32</th><th>ARM 64</th><th>x86-64</th></tr>
+<tr><td>Python 3.5</td><td>tflite_runtime-1.14.0-cp35-cp35m-linux_armv7l.whl</td><td>tflite_runtime-1.14.0-cp35-cp35m-linux_aarch64.whl</td><td>tflite_runtime-1.14.0-cp35-cp35m-linux_x86_64.whl</td></tr>
+<tr><td>Python 3.6</td><td>N/A</td><td>N/A</td><td>tflite_runtime-1.14.0-cp36-cp36m-linux_x86_64.whl</td></tr>
+<tr><td>Python 3.7</td><td>tflite_runtime-1.14.0-cp37-cp37m-linux_armv7l.whl</td><td>tflite_runtime-1.14.0-cp37-cp37m-linux_aarch64.whl</td><td>tflite_runtime-1.14.0-cp37-cp37m-linux_x86_64.whl</td></tr>
+</table>
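The wheel filenames above follow the standard pattern of version, CPython tag, and platform tag. As an illustration only (this helper and its name are hypothetical, not part of TensorFlow), the table can be reproduced mechanically:

```python
# Illustrative sketch: build a tflite_runtime wheel filename from the
# table above. "wheel_name" and its arguments are hypothetical names.
VERSION = "1.14.0"
ARCH_TAGS = {
    "arm32": "linux_armv7l",
    "arm64": "linux_aarch64",
    "x86_64": "linux_x86_64",
}
# Per the table, no ARM builds are listed for Python 3.6.
UNAVAILABLE = {("3.6", "arm32"), ("3.6", "arm64")}

def wheel_name(python_version, arch):
    """Return the wheel filename from the table, or None if not listed."""
    if (python_version, arch) in UNAVAILABLE:
        return None
    cp = "cp" + python_version.replace(".", "")  # e.g. "3.7" -> "cp37"
    return f"tflite_runtime-{VERSION}-{cp}-{cp}m-{ARCH_TAGS[arch]}.whl"
```

For example, `wheel_name("3.7", "arm32")` yields the Raspberry Pi wheel shown in the install command above.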
+
+
+## Run an inference using tflite_runtime
+
+To distinguish this interpreter-only package from the full TensorFlow package
+(allowing both to be installed, if you choose), the Python module provided in
+the above wheel is named `tflite_runtime`.
+
+So instead of importing `Interpreter` from the `tensorflow` module, you need to
+import it from the `tflite_runtime` package.
+
+For example, after you install the package above, copy and run the
+[`label_image.py`](
+https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/examples/python/)
+file. It will (probably) fail because you don't have the `tensorflow` library
+installed. To fix it, edit this line of the file:
+
+```python
+from tensorflow.lite.python.interpreter import Interpreter
+```
+
+So it instead reads:
+
+```python
+from tflite_runtime.interpreter import Interpreter
+```
+
+Now run `label_image.py` again. That's it! You're now executing TensorFlow Lite
+models.
+
+For more details about the `Interpreter` API, read [Load and run a model
+in Python](inference.md#load-and-run-a-model-in-python).
+
+To convert other TensorFlow models to TensorFlow Lite, read about
+the [TensorFlow Lite Converter](../convert/).
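Beyond `label_image.py`, the interpreter workflow is the same with either package: load the model, allocate tensors, set the input, invoke, and read the output. The sketch below illustrates this under some assumptions: the model path and the `run_inference` helper name are made up for this example, and the fallback import exists only so the snippet works whether you installed `tflite_runtime` or the full `tensorflow` package.

```python
# Minimal inference sketch. "model.tflite" and "run_inference" are
# illustrative; either tflite_runtime or tensorflow must be installed
# to actually run a model.
try:
    from tflite_runtime.interpreter import Interpreter
except ImportError:
    try:
        import tensorflow as tf
        Interpreter = tf.lite.Interpreter  # same API in the full package
    except ImportError:
        Interpreter = None  # neither package is available

def run_inference(model_path, input_array):
    """Run a single inference and return the first output tensor."""
    if Interpreter is None:
        raise RuntimeError("Install tflite_runtime (or tensorflow) first.")
    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()  # allocate input/output buffers
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    # input_array is a numpy array matching the model's input shape
    interpreter.set_tensor(inp['index'], input_array.astype(inp['dtype']))
    interpreter.invoke()  # run the model
    return interpreter.get_tensor(out['index'])
```

A call would then look like `scores = run_inference("model.tflite", batch)`, where `batch` is a numpy array shaped like the model's input (for image models, typically `(1, height, width, 3)`).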