Change ops to raw_ops for linking

Formatting ops as `raw_ops.Add` automatically links them to the corresponding documentation.

PiperOrigin-RevId: 305010877
Change-Id: Ia6bda869aed9a3f49f78c11037b3e1ab89bb7289
Taehee Jeong 2020-04-06 06:00:52 -07:00 committed by TensorFlower Gardener
parent 9fbe2a4de4
commit cf88f3c5d2


@@ -85,30 +85,30 @@ TfLiteCoreMlDelegateDelete(delegate);
 Following ops are supported by the Core ML delegate.
-* Add
+* `raw_ops.Add`
     * Only certain shapes are broadcastable. In Core ML tensor layout,
       following tensor shapes are broadcastable. `[B, C, H, W]`, `[B, C, 1,
       1]`, `[B, 1, H, W]`, `[B, 1, 1, 1]`.
-* AveragePool2D
-* Concat
-* Conv2D
+* `raw_ops.AveragePool2D`
+* `raw_ops.Concat`
+* `raw_ops.Conv2D`
     * Weights and bias should be constant.
-* DepthwiseConv2D
+* `raw_ops.DepthwiseConv2D`
     * Weights and bias should be constant.
-* Hardswish
-* Logistic (aka Sigmoid)
-* MaxPool2D
-* Mul
+* `raw_ops.Hardswish`
+* `raw_ops.Logistic` (aka Sigmoid)
+* `raw_ops.MaxPool2D`
+* `raw_ops.Mul`
     * Only certain shapes are broadcastable. In Core ML tensor layout,
       following tensor shapes are broadcastable. `[B, C, H, W]`, `[B, C, 1,
       1]`, `[B, 1, H, W]`, `[B, 1, 1, 1]`.
-* Relu
-* ReluN1To1
-* Relu6
-* Reshape
-* ResizeBilinear
-* SoftMax
-* Tanh
+* `raw_ops.Relu`
+* `raw_ops.ReluN1To1`
+* `raw_ops.Relu6`
+* `raw_ops.Reshape`
+* `raw_ops.ResizeBilinear`
+* `raw_ops.SoftMax`
+* `raw_ops.Tanh`
 ## Feedback
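The broadcastable shapes the doc lists for `Add` and `Mul` follow elementwise broadcasting against a full `[B, C, H, W]` tensor. As a rough sketch of what that means, here is the analogous behavior in NumPy (the concrete dimension values `B=2, C=3, H=4, W=5` are illustrative only, and NumPy's layout and broadcasting rules stand in for Core ML's, which this example does not exercise directly):

```python
import numpy as np

# Illustrative dimensions for a [B, C, H, W] tensor.
full = np.ones((2, 3, 4, 5))          # [B, C, H, W]
per_channel = np.ones((2, 3, 1, 1))   # [B, C, 1, 1]
per_pixel = np.ones((2, 1, 4, 5))     # [B, 1, H, W]
scalar_like = np.ones((2, 1, 1, 1))   # [B, 1, 1, 1]

# Each listed shape broadcasts elementwise against the full tensor,
# as in an Add or Mul op; size-1 axes are stretched to match.
for other in (per_channel, per_pixel, scalar_like):
    assert (full + other).shape == (2, 3, 4, 5)
    assert (full * other).shape == (2, 3, 4, 5)
```

Shapes outside this list (e.g. broadcasting along an arbitrary mix of axes) are not claimed to be supported by the delegate.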