From 08ab10f127bf7ef566c3b9930e729a6ba4cdd596 Mon Sep 17 00:00:00 2001
From: Terry Heo
Date: Thu, 19 Mar 2020 18:02:53 -0700
Subject: [PATCH] Fix formatting of lite/tools/pip_package/README.md

PiperOrigin-RevId: 301934203
Change-Id: I9146b9d9afca2f95576edbb2588dff8d8759e79e
---
 tensorflow/lite/tools/pip_package/README.md | 21 ++++++++++++++-------
 1 file changed, 14 insertions(+), 7 deletions(-)

diff --git a/tensorflow/lite/tools/pip_package/README.md b/tensorflow/lite/tools/pip_package/README.md
index 88906ee2bb0..849bbf57813 100644
--- a/tensorflow/lite/tools/pip_package/README.md
+++ b/tensorflow/lite/tools/pip_package/README.md
@@ -6,44 +6,51 @@ Python without requiring the rest of TensorFlow.
 ## Steps
 
 To build a binary wheel run this script:
-```
+
+```sh
 sudo apt install swig libjpeg-dev zlib1g-dev python3-dev python3-numpy
 sh tensorflow/lite/tools/pip_package/build_pip_package.sh
 ```
 
 That will print out some output and a .whl file. You can then install that
-```
+
+```sh
 pip install --upgrade
 ```
 
 You can also build a wheel inside docker container using make tool. For example
 the following command will cross-compile tflite-runtime package for python2.7
 and python3.7 (from Debian Buster) on Raspberry Pi:
-```
+
+```sh
 make BASE_IMAGE=debian:buster PYTHON=python TENSORFLOW_TARGET=rpi docker-build
 make BASE_IMAGE=debian:buster PYTHON=python3 TENSORFLOW_TARGET=rpi docker-build
 ```
 
 Another option is to cross-compile for python3.5 (from Debian Stretch) on ARM64 board:
-```
+
+```sh
 make BASE_IMAGE=debian:stretch PYTHON=python3 TENSORFLOW_TARGET=aarch64 docker-build
 ```
 
 To build for python3.6 (from Ubuntu 18.04) on x86_64 (native to the docker
 image) run:
-```
+
+```sh
 make BASE_IMAGE=ubuntu:18.04 PYTHON=python3 TENSORFLOW_TARGET=native docker-build
 ```
 
 In addition to the wheel there is a way to build Debian package by adding
 BUILD_DEB=y to the make command (only for python3):
-```
+
+```sh
 make BASE_IMAGE=debian:buster PYTHON=python3 TENSORFLOW_TARGET=rpi BUILD_DEB=y docker-build
 ```
 
 Note, unlike tensorflow this will be installed to a tflite_runtime namespace.
 You can then use the Tensorflow Lite interpreter as.
-```
+
+```python
 from tflite_runtime.interpreter import Interpreter
 interpreter = Interpreter(model_path="foo.tflite")
 ```
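
For context, the two-line interpreter snippet at the end of the reformatted README can be grown into a complete single-inference run. The sketch below is illustrative only, not part of the patch: it assumes a local float-input model named `foo.tflite` (the placeholder used in the README snippet), feeds it random numpy data, and uses the standard `tflite_runtime` Interpreter methods (`allocate_tensors`, `get_input_details`, `get_output_details`, `set_tensor`, `invoke`, `get_tensor`).

```python
import numpy as np
from tflite_runtime.interpreter import Interpreter

# Load the model and allocate its tensors ("foo.tflite" is a placeholder path).
interpreter = Interpreter(model_path="foo.tflite")
interpreter.allocate_tensors()

# Inspect the input/output tensors so the input can be shaped correctly.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed random data matching the first input's shape and dtype (float model assumed).
input_data = np.random.random_sample(input_details[0]["shape"]).astype(
    input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)

# Run inference and fetch the result.
interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]["index"])
print(output_data)
```

Because only the tflite_runtime wheel is required, this runs on a device without a full TensorFlow installation.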