From 7ad1eb110f1966f6197f96f9e3b084137c350231 Mon Sep 17 00:00:00 2001
From: Smit Hinsu
Date: Wed, 19 Feb 2020 17:25:04 -0800
Subject: [PATCH] NFC: Add a TODO to move HLO Relu legalizations to TF to TF
 lowering

PiperOrigin-RevId: 296095163
Change-Id: Ic0b8d26c11c64e6584eef0da38b87e71d2dd03e8
---
 .../compiler/mlir/xla/transforms/legalize_tf_patterns.td | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/tensorflow/compiler/mlir/xla/transforms/legalize_tf_patterns.td b/tensorflow/compiler/mlir/xla/transforms/legalize_tf_patterns.td
index a78d9cc2d2d..872a288c259 100644
--- a/tensorflow/compiler/mlir/xla/transforms/legalize_tf_patterns.td
+++ b/tensorflow/compiler/mlir/xla/transforms/legalize_tf_patterns.td
@@ -368,6 +368,9 @@ def : Pat<(TF_ConstOp:$res ElementsAttr:$value), (HLO_ConstOp $value),
 // Relu op patterns.
 //===----------------------------------------------------------------------===//
 
+// TODO(hinsu): Make these patterns to TF to TF lowering. Relu6 lowering will
+// require HLO canonicalization of min and max on a tensor to ClampOp.
+
 // TODO(hinsu): Lower unsinged and quantized types after supporting
 // them in GetScalarOfType.
 def : Pat<(TF_ReluOp AnyRankedTensor:$input),