# Android Neural Networks API

The Android Neural Networks API (NNAPI) is an Android C API designed for
running computationally intensive operations for machine learning on mobile
devices. TensorFlow Lite is designed to use the NNAPI to perform
hardware-accelerated inference operations on supported devices.

Based on the app's requirements and the hardware capabilities on a device, the
NNAPI can distribute the computation workload across available on-device
processors, including dedicated neural network hardware, graphics processing
units (GPUs), and digital signal processors (DSPs).

For devices that lack a specialized vendor driver, the NNAPI runtime relies on
optimized code to execute requests on the CPU. For more information about the
NNAPI, please refer to the
[NNAPI documentation](https://developer.android.com/ndk/guides/neuralnetworks/index.html).
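
The sketch below shows one way an application might route TensorFlow Lite
inference through NNAPI using the NNAPI delegate
(`tflite::StatefulNnApiDelegate`). It is a minimal illustration, not the
canonical setup: the `RunWithNnapi` helper name and the `model.tflite` path are
placeholders, and error handling is reduced to bare status checks.

```c++
#include <memory>

#include "tensorflow/lite/delegates/nnapi/nnapi_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

// Hypothetical helper: run a .tflite model, letting NNAPI accelerate the
// parts of the graph it supports. "model.tflite" is a placeholder path.
int RunWithNnapi() {
  // Keep the delegate alive for as long as the interpreter uses it.
  tflite::StatefulNnApiDelegate nnapi_delegate;

  std::unique_ptr<tflite::FlatBufferModel> model =
      tflite::FlatBufferModel::BuildFromFile("model.tflite");
  if (!model) return 1;

  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  if (tflite::InterpreterBuilder(*model, resolver)(&interpreter) != kTfLiteOk) {
    return 1;
  }

  // Hand supported operations to NNAPI; anything NNAPI cannot handle keeps
  // running on the default TensorFlow Lite CPU kernels.
  if (interpreter->ModifyGraphWithDelegate(&nnapi_delegate) != kTfLiteOk) {
    // Delegation failed; the model still runs entirely on the CPU.
  }

  if (interpreter->AllocateTensors() != kTfLiteOk) return 1;

  // ... fill input tensors here ...

  return interpreter->Invoke() == kTfLiteOk ? 0 : 1;
}
```

Whether delegated operations actually run on an accelerator depends on the
NNAPI drivers present on the device; without them, execution falls back to the
CPU path described above.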