# Android Neural Network API

The Android Neural Networks API (NNAPI) is an Android C API designed for running computationally intensive machine learning operations on mobile devices. TensorFlow Lite is designed to use the NNAPI to perform hardware-accelerated inference operations on supported devices. Based on the app's requirements and the hardware capabilities on a device, the NNAPI can distribute the computation workload across available on-device processors, including dedicated neural network hardware, graphics processing units (GPUs), and digital signal processors (DSPs). For devices that lack a specialized vendor driver, the NNAPI runtime relies on optimized code to execute requests on the CPU. For more information about the NNAPI, please refer to the NNAPI documentation.
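
As a rough sketch of how an application opts a TensorFlow Lite interpreter into NNAPI acceleration: the `StatefulNnApiDelegate` from `tensorflow/lite/delegates/nnapi` (a sibling directory that builds on the wrappers in this one) is attached to the interpreter, and any operations it cannot hand to NNAPI fall back to the CPU. The model path, option values, and error handling below are illustrative assumptions, not prescribed usage.

```cpp
#include <memory>

#include "tensorflow/lite/delegates/nnapi/nnapi_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Ask NNAPI to prefer sustained throughput over single-shot latency
  // (kSustainedSpeed is one of several execution preferences).
  tflite::StatefulNnApiDelegate::Options options;
  options.execution_preference =
      tflite::StatefulNnApiDelegate::Options::kSustainedSpeed;
  tflite::StatefulNnApiDelegate nnapi_delegate(options);

  // Load a TFLite flatbuffer model (path is illustrative).
  auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
  if (!model) return 1;

  // Build an interpreter with the builtin op resolver.
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  if (!interpreter) return 1;

  // Hand the supported parts of the graph to NNAPI; unsupported
  // operations remain on the CPU kernels.
  if (interpreter->ModifyGraphWithDelegate(&nnapi_delegate) != kTfLiteOk) {
    // NNAPI unavailable or rejected the graph; the interpreter can
    // still run entirely on the CPU.
  }

  interpreter->AllocateTensors();
  interpreter->Invoke();
  return 0;
}
```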