# Android TensorFlow support

This directory defines components (a native `.so` library and a Java JAR) geared towards supporting TensorFlow on Android. This includes:

- The TensorFlow Java API
- A `TensorFlowInferenceInterface` class that provides a smaller API surface suitable for inference and for summarizing the performance of model execution
For example usage, see `TensorFlowImageClassifier.java` in the TensorFlow Android Demo.
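As a rough sketch of how the inference interface is typically driven, the example below feeds a single input tensor, runs the graph, and fetches one output. The model path, node names, and tensor shapes are placeholders that depend on your own frozen graph.

```java
import android.content.res.AssetManager;

import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

/** Minimal sketch of using TensorFlowInferenceInterface; names below are illustrative. */
public class SketchClassifier {
  // Placeholder model file and node names; substitute those of your frozen graph.
  private static final String MODEL_FILE = "file:///android_asset/model.pb";
  private static final String INPUT_NODE = "input";
  private static final String OUTPUT_NODE = "output";

  private final TensorFlowInferenceInterface inferenceInterface;

  public SketchClassifier(AssetManager assetManager) {
    // Loads the frozen graph from the APK's assets.
    inferenceInterface = new TensorFlowInferenceInterface(assetManager, MODEL_FILE);
  }

  public float[] classify(float[] input, int numOutputs) {
    // Copy the input into the graph, run up to the output node, then copy the result back out.
    inferenceInterface.feed(INPUT_NODE, input, 1, input.length);
    inferenceInterface.run(new String[] {OUTPUT_NODE});
    float[] output = new float[numOutputs];
    inferenceInterface.fetch(OUTPUT_NODE, output);
    return output;
  }
}
```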
For prebuilt libraries, see the nightly Android build artifacts page for a recent build.
The TensorFlow Inference Interface is also available as a JCenter package (see the `tensorflow-android` directory) and can be included quite simply in your Android project with a couple of lines in the project's `build.gradle` file:

```groovy
allprojects {
    repositories {
        jcenter()
    }
}

dependencies {
    compile 'org.tensorflow:tensorflow-android:+'
}
```
This will tell Gradle to use the latest version of the TensorFlow AAR that has been released to JCenter. You may replace the `+` with an explicit version label if you wish to use a specific release of TensorFlow in your app.
To build the libraries yourself (if, for example, you want to support custom TensorFlow operators), pick your preferred approach below:
## Bazel

First follow the Bazel setup instructions described in `tensorflow/examples/android/README.md`.

Then, to build the native TF library:

```sh
bazel build -c opt //tensorflow/tools/android/inference_interface:libtensorflow_inference.so \
    --crosstool_top=//external:android/crosstool \
    --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
    --cxxopt=-std=c++11 \
    --cpu=armeabi-v7a
```
Replace `armeabi-v7a` with your desired target architecture.
The library will be located at:

```
bazel-bin/tensorflow/tools/android/inference_interface/libtensorflow_inference.so
```
To build the Java counterpart:

```sh
bazel build //tensorflow/tools/android/inference_interface:android_tensorflow_inference_java
```
You will find the JAR file at:

```
bazel-bin/tensorflow/tools/android/inference_interface/libandroid_tensorflow_inference_java.jar
```
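If you package the `.so` and JAR above into an app yourself (rather than depending on the AAR), the native library must be loadable at runtime, typically by placing it under the app's `jniLibs/<abi>/` directory. Below is a minimal, hedged sketch of forcing the load explicitly; the helper class is hypothetical, and the inference interface may also load the library on your behalf.

```java
/** Hypothetical helper; "tensorflow_inference" matches the libtensorflow_inference.so built above. */
public final class TensorFlowNativeLoader {
  static {
    // Throws UnsatisfiedLinkError if no .so is bundled for the device's ABI.
    System.loadLibrary("tensorflow_inference");
  }

  private TensorFlowNativeLoader() {}

  /** Touching the class triggers the static initializer above. */
  public static void ensureLoaded() {}
}
```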
## CMake

For documentation on building a self-contained AAR file with CMake, see `tensorflow/tools/android/inference_interface/cmake`.
## Makefile

For documentation on building native TF libraries with make, including a CUDA-enabled variant for devices like the Nvidia Shield TV, see `tensorflow/contrib/makefile/README.md`.
## AssetManagerFileSystem
This directory also contains a TensorFlow filesystem supporting the Android asset manager. This may be useful when writing native (C++) code that is tightly coupled with TensorFlow. For typical usage, the library above will be sufficient.