# Using TensorRT in TensorFlow (TF-TRT)

This module provides the necessary bindings and introduces the `TRTEngineOp` operator, which wraps a subgraph in TensorRT. The module is under active development.
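For illustration, here is a minimal conversion sketch using the converter defined in `trt_convert.py`; the input and output paths are hypothetical placeholders, and the exact converter parameters may vary between versions:

```python
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Hypothetical paths; point these at your own SavedModel directories.
params = trt.TrtConversionParams(precision_mode=trt.TrtPrecisionMode.FP16)
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir="/path/to/saved_model",
    conversion_params=params)
converter.convert()  # replaces supported subgraphs with TRTEngineOp nodes
converter.save("/path/to/trt_saved_model")
```

Subgraphs that TensorRT cannot handle stay in native TensorFlow, so conversion can succeed even for models that are only partially supported.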

## Installing TF-TRT

Currently, TensorFlow nightly builds include TF-TRT by default, so you don't need to install TF-TRT separately. You can pull the latest TF containers from Docker Hub or install the latest TF pip package to get access to the latest TF-TRT.
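As a quick sanity check (a sketch, assuming a recent TF pip package or container), you can import the bindings and list the precision modes they expose:

```python
import tensorflow as tf

# The TF-TRT Python bindings live in this module (see trt_convert.py).
from tensorflow.python.compiler.tensorrt import trt_convert

print(tf.__version__)
print(trt_convert.TrtPrecisionMode.supported_precision_modes())
```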

If you want to use TF-TRT on the NVIDIA Jetson platform, you can find the download links for the relevant TensorFlow pip packages here: https://docs.nvidia.com/deeplearning/dgx/index.html#installing-frameworks-for-jetson

## Installing TensorRT

In order to make use of TF-TRT, you will need a local installation of TensorRT. Installation instructions for compatibility with TensorFlow are provided in the TensorFlow GPU support guide.
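Since TF-TRT loads TensorRT dynamically, a simple way to check the local installation (a sketch; the library name may carry a version suffix on your system) is to ask the dynamic loader for `libnvinfer.so`:

```python
import ctypes

# libnvinfer.so is the TensorRT runtime library mentioned below; on some
# systems only a versioned name such as libnvinfer.so.7 is installed.
try:
    ctypes.CDLL("libnvinfer.so")
    print("TensorRT runtime found by the dynamic loader")
except OSError as err:
    print("libnvinfer.so not found; check your TensorRT installation:", err)
```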

## Examples

You can find example scripts for running inference on deep learning models in this repository: https://github.com/tensorflow/tensorrt

We have used these examples to verify the accuracy and performance of TF-TRT. For more information, see Verified Models.
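To give an idea of what such a script does, here is a minimal inference sketch over a converted SavedModel; the model path, signature key, and input shape are hypothetical and depend on the model you converted:

```python
import tensorflow as tf

# Hypothetical path to a TF-TRT converted SavedModel.
loaded = tf.saved_model.load("/path/to/trt_saved_model")
infer = loaded.signatures["serving_default"]

# Hypothetical input shape for an image model; use your model's real shape.
x = tf.random.uniform([1, 224, 224, 3])
print(infer(x))
```

Note that the first call builds the TensorRT engines for the observed input shapes, so it is typically much slower than the steady-state calls that follow.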

## Documentation

The TF-TRT documentation gives an overview of the supported functionality, provides tutorials and verified models, and explains best practices along with troubleshooting guides.

## Tests

TF-TRT includes both Python tests and C++ unit tests. Most of the Python tests are located in the `test` directory, and they can be executed using `bazel test` or directly with the Python command. Most of the C++ unit tests exercise the conversion functions that convert each TF op to a number of TensorRT layers.
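For orientation, the sketch below shows the general shape of a Python test that can be run directly with the Python command; the test class and assertion are hypothetical illustrations, not taken from the `test` directory:

```python
import tensorflow as tf


class TrtSmokeTest(tf.test.TestCase):
  """Hypothetical smoke test illustrating the usual tf.test structure."""

  def test_trt_converter_is_available(self):
    from tensorflow.python.compiler.tensorrt import trt_convert
    self.assertTrue(hasattr(trt_convert, "TrtGraphConverterV2"))


if __name__ == "__main__":
  tf.test.main()
```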

## Compilation

In order to compile the module, you need to have a local TensorRT installation (`libnvinfer.so` and the respective include files). During the configuration step, TensorRT should be enabled and its installation path should be set. If TensorRT was installed through a package manager (deb, rpm), the configure script should find the necessary components on the system automatically. If it was installed from a tar package, you have to set the path to the installation location during configuration.

```
bazel build --config=cuda --config=opt //tensorflow/tools/pip_package:build_pip_package
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/
```