[tf.data] Add note about performance cost of Dataset.unbatch.

PiperOrigin-RevId: 316788559
Change-Id: I87d0d8a2be0a6d6751baa838ea8acc1eb9ee8d9a
This commit is contained in:
Andrew Audibert 2020-06-16 17:16:15 -07:00 committed by TensorFlower Gardener
parent 01cfc8a8a3
commit 7873e14cf9


@@ -2099,6 +2099,10 @@ name=None))
>>> list(dataset.as_numpy_iterator())
[1, 2, 3, 1, 2, 1, 2, 3, 4]
Note: `unbatch` requires a data copy to slice up the batched tensor into
smaller, unbatched tensors. When optimizing performance, try to avoid
unnecessary usage of `unbatch`.
Returns:
  A `Dataset`.
"""