As part of ongoing refactoring, `tflite::GetInput`, `tflite::GetOutput`, `tflite::GetTemporary` and `tflite::GetIntermediates` will return `nullptr` in some cases. Hence, we insert `nullptr` checks on all usages.
We also insert `nullptr` checks on usages of `tflite::GetVariableInput` and `tflite::GetOptionalInputTensor`, but only where there is no obvious handling of a `nullptr` result (that is, we only insert the check when the returned tensor is accessed as if it were never `nullptr`).
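For illustration, a minimal sketch of the resulting pattern in a kernel's Prepare; the kernel, tensor indices, and resize logic here are hypothetical, not taken from any specific op:

```c++
#include "tensorflow/lite/c/common.h"
#include "tensorflow/lite/kernels/kernel_util.h"

namespace tflite {

constexpr int kInputTensor = 0;   // illustrative input index
constexpr int kOutputTensor = 0;  // illustrative output index

TfLiteStatus Prepare(TfLiteContext* context, TfLiteNode* node) {
  // GetInput/GetOutput may now return nullptr, so every usage is guarded.
  const TfLiteTensor* input = GetInput(context, node, kInputTensor);
  TF_LITE_ENSURE(context, input != nullptr);
  TfLiteTensor* output = GetOutput(context, node, kOutputTensor);
  TF_LITE_ENSURE(context, output != nullptr);
  return context->ResizeTensor(context, output,
                               TfLiteIntArrayCopy(input->dims));
}

}  // namespace tflite
```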
PiperOrigin-RevId: 332521299
Change-Id: I29af455bcb48d0b92e58132d951a3badbd772d56
This codepath is unused when ruy is enabled, so stop linking it
into the kernel library.
PiperOrigin-RevId: 330617707
Change-Id: I0b583eff42a65f71d22ef500794fa74f30aa1b4e
Imported from GitHub PR https://github.com/tensorflow/tensorflow/pull/34903
### Description of issue:
When generating a .tflite file with TFLiteConverter, if the model contains
Conv2DTranspose layers, the bias cannot be folded into the TRANSPOSE_CONV
operator. This results in an extra ADD op following the TRANSPOSE_CONV op.
With other CONV-like layers (Conv2D, DepthwiseConv2D), the bias is folded
into the CONV layer.
(See the detailed TF issue: https://github.com/tensorflow/tensorflow/issues/34622)
### How this PR resolves it:
This PR resolves the issue by enabling TransposeConv with bias for TFLite:
- Update the TFLite graph_transform features:
  - fill the TransposeConv bias with zeros if there is no bias
  - fuse a following bias add into the preceding TransposeConv (test added)
- Update TransposeConv with bias:
  - add a bias input to TransposeConv
  - add an optional bias to the TransposeConv kernels (see the sketch below)
### Example of the result:
TRANSPOSE_CONV inputs:
1. output_shape
2. weights
3. activation
4. bias
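
As an illustration of the kernel-side change, here is a minimal sketch of how the optional bias input could be read in the kernel's Prepare; the index constants and surrounding structure are illustrative rather than taken from the actual implementation:

```c++
#include "tensorflow/lite/c/common.h"
#include "tensorflow/lite/kernels/kernel_util.h"

namespace tflite {

// Illustrative indices, following the input order listed above.
constexpr int kOutputShapeTensor = 0;
constexpr int kWeightsTensor = 1;
constexpr int kDataInputTensor = 2;
constexpr int kBiasTensor = 3;

TfLiteStatus Prepare(TfLiteContext* context, TfLiteNode* node) {
  // The bias is optional: models without it provide only three inputs.
  const bool has_bias = NumInputs(node) == 4;
  const TfLiteTensor* bias =
      has_bias ? GetOptionalInputTensor(context, node, kBiasTensor) : nullptr;
  // The kernel would hand `bias` (nullptr meaning "all zeros") to the
  // reference implementation together with the weights and input data.
  (void)bias;
  return kTfLiteOk;
}

}  // namespace tflite
```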

### Need to discuss:
~~currently this PR only updates the reference kernel for transpose_conv; the optimised kernel is commented out.~~
~~several tests need to be added as well, but~~ further suggestions are needed for adding additional tests.
Copybara import of the project:
--
1c6eb9c98229a9e8248dc1fe913a20cc6dd89332 by Peng Sun <peng.sun@arm.com>:
Fuse TransposeConv with Bias
When generating a .tflite file with TFLiteConverter, if the model contains
Conv2DTranspose layers, the bias cannot be folded into the TRANSPOSE_CONV
operator. This results in an extra ADD op following the TRANSPOSE_CONV op.
With other CONV-like layers (Conv2D, DepthwiseConv2D), the bias is folded
into the CONV layer.
(See TF issue: https://github.com/tensorflow/tensorflow/issues/34622)
We resolve this issue by enabling TransposeConv with bias for TFLite:
Update the TFLite graph_transform features:
fill the TransposeConv bias with zeros if there is no bias
fuse a following bias add into the preceding TransposeConv (test added)
Update TransposeConv with bias:
add a bias input to TransposeConv
add an optional bias to the TransposeConv kernels (test added)
--
22611b880c94eb753c88a0a3e2977200e55ebd2c by Peng Sun <peng.sun@arm.com>:
clang-format with google style.
COPYBARA_INTEGRATE_REVIEW=https://github.com/tensorflow/tensorflow/pull/34903 from psunn:TransposeConvWithBias 22611b880c94eb753c88a0a3e2977200e55ebd2c
PiperOrigin-RevId: 307872447
Change-Id: I367fcd65f2662f4c7846d37bc69dc43670c83961
The C types in lite/c/c_api_internal.h are not actually internal; rather, they
are common types used throughout the C++ and C APIs.
Rename the header accordingly.
PiperOrigin-RevId: 282494601
Change-Id: Ia784f35724d774db256ffcbbcdc5bb00e6574417
This change moves //tensorflow/contrib/lite to //tensorflow/lite in preparation
for TensorFlow 2.0's deprecation of contrib/. If you refer to TF Lite build
targets or headers, you will need to update them manually. If you use TF Lite
from the TensorFlow python package, "tf.contrib.lite" now points to "tf.lite".
Please update your imports as soon as possible.
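For example, an include of a TF Lite header would be updated along these lines (interpreter.h is shown purely as an illustration; the same applies to any other TF Lite header or build target):

```c++
// Before the move (old contrib/ path):
// #include "tensorflow/contrib/lite/interpreter.h"

// After the move (new top-level path):
#include "tensorflow/lite/interpreter.h"
```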
For more details, see https://groups.google.com/a/tensorflow.org/forum/#!topic/tflite/iIIXOTOFvwQ
@angersson and @aselle are conducting this migration. Please contact them if
you have any further questions.
PiperOrigin-RevId: 219536476