Introduce Task library in the modeling page of object detection

## Get started
To learn how to use object detection in a mobile app, explore the
<a href="#example_applications_and_guides">Example applications and guides</a>.
If you are using a platform other than Android or iOS, or if you are already
familiar with the
<a href="https://www.tensorflow.org/api_docs/python/tf/lite">TensorFlow Lite
APIs</a>, you can download our starter object detection model and the
accompanying labels.
<a class="button button-primary" href="https://tfhub.dev/tensorflow/lite-model/ssd_mobilenet_v1/1/metadata/1?lite-format=tflite">Download
starter model with Metadata</a>
For the following use cases, you should use a different type of model:
<ul>
<li>Predicting which single label the image most likely represents (see <a href="../image_classification/overview.md">image classification</a>)</li>
<li>Predicting the composition of an image, for example subject versus background (see <a href="../segmentation/overview.md">segmentation</a>)</li>
</ul>
### Example applications and guides
If you are new to TensorFlow Lite and are working with Android or iOS, we
recommend exploring the following example applications to help you get
started.
#### Android
You can leverage the out-of-the-box API from the TensorFlow Lite Task Library to
[integrate object detection models](../../inference_with_metadata/task_library/object_detector)
in just a few lines of code. You can also
[build your own custom inference pipeline](../../guide/inference#load_and_run_a_model_in_java)
using the TensorFlow Lite Interpreter Java API.
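
For orientation, here is a minimal sketch of the Task Library path. The
filename `model.tflite`, the score threshold, and the result count are
placeholders for this example, not requirements:

```java
import android.content.Context;
import android.graphics.Bitmap;
import java.io.IOException;
import java.util.List;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.task.vision.detector.Detection;
import org.tensorflow.lite.task.vision.detector.ObjectDetector;
import org.tensorflow.lite.task.vision.detector.ObjectDetector.ObjectDetectorOptions;

/** Runs a bundled detection model on a single frame. */
static List<Detection> detectObjects(Context context, Bitmap bitmap) throws IOException {
  // Keep the top 5 results scoring at least 0.5; both values are illustrative.
  ObjectDetectorOptions options =
      ObjectDetectorOptions.builder().setMaxResults(5).setScoreThreshold(0.5f).build();

  // "model.tflite" is a placeholder for a detection model shipped in app assets.
  ObjectDetector detector =
      ObjectDetector.createFromFileAndOptions(context, "model.tflite", options);

  // Wrap the Android bitmap and run inference.
  return detector.detect(TensorImage.fromBitmap(bitmap));
}
```

Each returned `Detection` exposes a bounding box and scored labels, so
rendering results is a straightforward loop over the list.
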
The Android example below demonstrates both methods, implemented as
[lib_task_api](https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/android/lib_task_api)
and
[lib_interpreter](https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/android/lib_interpreter)
respectively.
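
For comparison, here is a minimal sketch of the Interpreter path, assuming the
starter SSD MobileNet model's signature (one 300x300 RGB input and four output
arrays capped at 10 detections); `modelBuffer` and `inputBuffer` are assumed to
be prepared by the caller:

```java
import java.nio.ByteBuffer;
import java.util.HashMap;
import java.util.Map;
import org.tensorflow.lite.Interpreter;

/** Runs one frame through an SSD-style model and returns the raw outputs. */
static Map<Integer, Object> runDetection(ByteBuffer modelBuffer, ByteBuffer inputBuffer) {
  // Output shapes assume the starter model: at most 10 detections per image.
  float[][][] locations = new float[1][10][4]; // boxes as [top, left, bottom, right]
  float[][] classes = new float[1][10];        // indices into the label file
  float[][] scores = new float[1][10];         // confidence values in [0, 1]
  float[] numDetections = new float[1];        // how many of the 10 slots are valid

  Map<Integer, Object> outputs = new HashMap<>();
  outputs.put(0, locations);
  outputs.put(1, classes);
  outputs.put(2, scores);
  outputs.put(3, numDetections);

  try (Interpreter interpreter = new Interpreter(modelBuffer)) {
    // inputBuffer must already hold one preprocessed 300x300 RGB image.
    interpreter.runForMultipleInputsOutputs(new Object[] {inputBuffer}, outputs);
  }
  return outputs;
}
```

This route leaves image preprocessing and output parsing to you, which is the
extra work that lib_interpreter in the example handles.
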
<a class="button button-primary" href="https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/android">View
Android example</a>
#### iOS
You can integrate the model using the
[TensorFlow Lite Interpreter Swift API](../../guide/inference#load_and_run_a_model_in_swift).
See the iOS example below.
<a class="button button-primary" href="https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/ios">View
iOS example</a>
## Model description
This section describes the signature for