Update generated Python Op docs.
Change: 144144091
Parent: 973d5afdb6
Commit: 5d9a05a8b0
@@ -904,7 +904,7 @@ pad(t, paddings, "SYMMETRIC") ==> [[2, 1, 1, 2, 3, 3, 2],

 - - -

-### `tf.concat_v2(values, axis, name='concat_v2')` {#concat_v2}
+### `tf.concat(values, axis, name='concat')` {#concat}

 Concatenates tensors along one dimension.
@@ -929,20 +929,20 @@ For example:

 ```python
 t1 = [[1, 2, 3], [4, 5, 6]]
 t2 = [[7, 8, 9], [10, 11, 12]]
-tf.concat_v2([t1, t2], 0) ==> [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
-tf.concat_v2([t1, t2], 1) ==> [[1, 2, 3, 7, 8, 9], [4, 5, 6, 10, 11, 12]]
+tf.concat([t1, t2], 0) ==> [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
+tf.concat([t1, t2], 1) ==> [[1, 2, 3, 7, 8, 9], [4, 5, 6, 10, 11, 12]]

 # tensor t3 with shape [2, 3]
 # tensor t4 with shape [2, 3]
-tf.shape(tf.concat_v2([t3, t4], 0)) ==> [4, 3]
-tf.shape(tf.concat_v2([t3, t4], 1)) ==> [2, 6]
+tf.shape(tf.concat([t3, t4], 0)) ==> [4, 3]
+tf.shape(tf.concat([t3, t4], 1)) ==> [2, 6]
 ```

 Note: If you are concatenating along a new axis consider using stack.
 E.g.

 ```python
-tf.concat_v2([tf.expand_dims(t, axis) for t in tensors], axis)
+tf.concat([tf.expand_dims(t, axis) for t in tensors], axis)
 ```

 can be rewritten as
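The hunk cuts off before the rewrite target; the full file added later in this commit shows it is `tf.stack(tensors, axis=axis)`. A minimal runnable check of that equivalence, assuming TF 1.x graph mode; the tensor names are illustrative only:

```python
import tensorflow as tf

# Two [2, 3] tensors, standing in for `tensors` in the snippet above.
t3 = tf.constant([[1, 2, 3], [4, 5, 6]])
t4 = tf.constant([[7, 8, 9], [10, 11, 12]])

# Concatenating along a new axis via expand_dims + concat ...
a = tf.concat([tf.expand_dims(t, 0) for t in [t3, t4]], 0)
# ... is the same as stacking directly.
b = tf.stack([t3, t4], axis=0)

with tf.Session() as sess:
    print(sess.run(tf.shape(a)))  # [2 2 3]
    print(sess.run(tf.shape(b)))  # [2 2 3]
```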
@@ -3131,3 +3131,12 @@ Compute gradients for a FakeQuantWithMinMaxVarsPerChannel operation.

 `sum_per_d(gradients * (inputs > max))`.

+## Other Functions and Classes
+
+- - -
+
+### `tf.concat_v2(values, axis, name='concat_v2')` {#concat_v2}
@@ -1973,9 +1973,13 @@ variable.

 - - -

-### `tf.contrib.learn.evaluate(graph, output_dir, checkpoint_path, eval_dict, update_op=None, global_step_tensor=None, supervisor_master='', log_every_steps=10, feed_fn=None, max_steps=None)` {#evaluate}
+### `tf.contrib.learn.evaluate(*args, **kwargs)` {#evaluate}

-Evaluate a model loaded from a checkpoint.
+Evaluate a model loaded from a checkpoint. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.

 Given `graph`, a directory to write summaries to (`output_dir`), a checkpoint
 to restore variables from, and a `dict` of `Tensor`s to evaluate, run an eval
@@ -2029,9 +2033,13 @@ and written to `output_dir`.

 - - -

-### `tf.contrib.learn.infer(restore_checkpoint_path, output_dict, feed_dict=None)` {#infer}
+### `tf.contrib.learn.infer(*args, **kwargs)` {#infer}

-Restore graph from `restore_checkpoint_path` and run `output_dict` tensors.
+Restore graph from `restore_checkpoint_path` and run `output_dict` tensors. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.

 If `restore_checkpoint_path` is supplied, restore from checkpoint. Otherwise,
 init all variables.
@@ -2061,14 +2069,22 @@ init all variables.

 ### `tf.contrib.learn.run_feeds(*args, **kwargs)` {#run_feeds}

-See run_feeds_iter(). Returns a `list` instead of an iterator.
+See run_feeds_iter(). Returns a `list` instead of an iterator. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.

 - - -

-### `tf.contrib.learn.run_n(output_dict, feed_dict=None, restore_checkpoint_path=None, n=1)` {#run_n}
+### `tf.contrib.learn.run_n(*args, **kwargs)` {#run_n}

-Run `output_dict` tensors `n` times, with the same `feed_dict` each run.
+Run `output_dict` tensors `n` times, with the same `feed_dict` each run. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.

 ##### Args:
@@ -2088,9 +2104,13 @@ Run `output_dict` tensors `n` times, with the same `feed_dict` each run.

 - - -

-### `tf.contrib.learn.train(graph, output_dir, train_op, loss_op, global_step_tensor=None, init_op=None, init_feed_dict=None, init_fn=None, log_every_steps=10, supervisor_is_chief=True, supervisor_master='', supervisor_save_model_secs=600, keep_checkpoint_max=5, supervisor_save_summaries_steps=100, feed_fn=None, steps=None, fail_on_nan_loss=True, monitors=None, max_steps=None)` {#train}
+### `tf.contrib.learn.train(*args, **kwargs)` {#train}

-Train a model.
+Train a model. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.

 Given `graph`, a directory to write outputs to (`output_dir`), and some ops,
 run a training loop. The given `train_op` performs one step of training on the
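The deprecation text above recommends tf.train.* utilities without showing a replacement. A minimal sketch of one possible migration for `tf.contrib.learn.train`, assuming TF 1.x graph mode; the toy model, hook choices, and `/tmp/model_out` directory are illustrative and not part of this commit:

```python
import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
    # Toy model standing in for the real `train_op` / `loss_op`.
    global_step = tf.contrib.framework.get_or_create_global_step()
    x = tf.random_normal([8, 3])
    w = tf.get_variable("w", [3, 1])
    loss_op = tf.reduce_mean(tf.square(tf.matmul(x, w)))
    tf.summary.scalar("loss", loss_op)
    train_op = tf.train.GradientDescentOptimizer(0.1).minimize(
        loss_op, global_step=global_step)

    hooks = [
        tf.train.StopAtStepHook(last_step=100),  # roughly `steps` / `max_steps`
        tf.train.NanTensorHook(loss_op),         # roughly `fail_on_nan_loss=True`
    ]
    # MonitoredTrainingSession handles init, checkpointing, and summaries,
    # covering most of what graph_actions.train did.
    with tf.train.MonitoredTrainingSession(checkpoint_dir="/tmp/model_out",
                                           hooks=hooks) as sess:
        while not sess.should_stop():
            sess.run(train_op)
```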
@@ -1,6 +1,10 @@
-### `tf.contrib.learn.train(graph, output_dir, train_op, loss_op, global_step_tensor=None, init_op=None, init_feed_dict=None, init_fn=None, log_every_steps=10, supervisor_is_chief=True, supervisor_master='', supervisor_save_model_secs=600, keep_checkpoint_max=5, supervisor_save_summaries_steps=100, feed_fn=None, steps=None, fail_on_nan_loss=True, monitors=None, max_steps=None)` {#train}
+### `tf.contrib.learn.train(*args, **kwargs)` {#train}

-Train a model.
+Train a model. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.

 Given `graph`, a directory to write outputs to (`output_dir`), and some ops,
 run a training loop. The given `train_op` performs one step of training on the
@@ -1,6 +1,10 @@
-### `tf.contrib.learn.evaluate(graph, output_dir, checkpoint_path, eval_dict, update_op=None, global_step_tensor=None, supervisor_master='', log_every_steps=10, feed_fn=None, max_steps=None)` {#evaluate}
+### `tf.contrib.learn.evaluate(*args, **kwargs)` {#evaluate}

-Evaluate a model loaded from a checkpoint.
+Evaluate a model loaded from a checkpoint. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.

 Given `graph`, a directory to write summaries to (`output_dir`), a checkpoint
 to restore variables from, and a `dict` of `Tensor`s to evaluate, run an eval
@@ -1,4 +1,8 @@
 ### `tf.contrib.learn.run_feeds(*args, **kwargs)` {#run_feeds}

-See run_feeds_iter(). Returns a `list` instead of an iterator.
+See run_feeds_iter(). Returns a `list` instead of an iterator. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.
@@ -1,6 +1,10 @@
-### `tf.contrib.learn.run_n(output_dict, feed_dict=None, restore_checkpoint_path=None, n=1)` {#run_n}
+### `tf.contrib.learn.run_n(*args, **kwargs)` {#run_n}

-Run `output_dict` tensors `n` times, with the same `feed_dict` each run.
+Run `output_dict` tensors `n` times, with the same `feed_dict` each run. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.

 ##### Args:
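For the simple `run_n` / `run_feeds` use case, a plain session loop covers the same ground. A minimal sketch, assuming TF 1.x; the placeholder and dict keys are illustrative stand-ins for `output_dict` / `feed_dict` / `n`:

```python
import tensorflow as tf

# Hypothetical stand-ins for the deprecated functions' arguments.
x = tf.placeholder(tf.float32, shape=[2])
output_dict = {"doubled": 2.0 * x}
feed_dict = {x: [1.0, 2.0]}
n = 3

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    results = [sess.run(output_dict, feed_dict=feed_dict) for _ in range(n)]
    print(results)  # n dicts of numpy values
```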
@@ -0,0 +1,58 @@
+### `tf.concat(values, axis, name='concat')` {#concat}
+
+Concatenates tensors along one dimension.
+
+Concatenates the list of tensors `values` along dimension `axis`. If
+`values[i].shape = [D0, D1, ... Daxis(i), ...Dn]`, the concatenated
+result has shape
+
+    [D0, D1, ... Raxis, ...Dn]
+
+where
+
+    Raxis = sum(Daxis(i))
+
+That is, the data from the input tensors is joined along the `axis`
+dimension.
+
+The number of dimensions of the input tensors must match, and all dimensions
+except `axis` must be equal.
+
+For example:
+
+```python
+t1 = [[1, 2, 3], [4, 5, 6]]
+t2 = [[7, 8, 9], [10, 11, 12]]
+tf.concat([t1, t2], 0) ==> [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
+tf.concat([t1, t2], 1) ==> [[1, 2, 3, 7, 8, 9], [4, 5, 6, 10, 11, 12]]
+
+# tensor t3 with shape [2, 3]
+# tensor t4 with shape [2, 3]
+tf.shape(tf.concat([t3, t4], 0)) ==> [4, 3]
+tf.shape(tf.concat([t3, t4], 1)) ==> [2, 6]
+```
+
+Note: If you are concatenating along a new axis consider using stack.
+E.g.
+
+```python
+tf.concat([tf.expand_dims(t, axis) for t in tensors], axis)
+```
+
+can be rewritten as
+
+```python
+tf.stack(tensors, axis=axis)
+```
+
+##### Args:
+
+
+* <b>`values`</b>: A list of `Tensor` objects or a single `Tensor`.
+* <b>`axis`</b>: 0-D `int32` `Tensor`. Dimension along which to concatenate.
+* <b>`name`</b>: A name for the operation (optional).
+
+##### Returns:
+
+  A `Tensor` resulting from concatenation of the input tensors.
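A quick runnable check of the shape rule documented above (`Raxis = sum(Daxis(i))`), assuming TF 1.x graph mode; the tensors are made up for illustration:

```python
import tensorflow as tf

a = tf.zeros([2, 3])      # Daxis(0) = 2 along axis 0
b = tf.zeros([4, 3])      # Daxis(1) = 4 along axis 0
c = tf.concat([a, b], 0)  # Raxis = 2 + 4 = 6

with tf.Session() as sess:
    print(sess.run(tf.shape(c)))  # [6 3]
print(c.get_shape())              # (6, 3), known statically
```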
@@ -1,58 +1,4 @@
 ### `tf.concat_v2(values, axis, name='concat_v2')` {#concat_v2}

 Concatenates tensors along one dimension.

-Concatenates the list of tensors `values` along dimension `axis`. If
-`values[i].shape = [D0, D1, ... Daxis(i), ...Dn]`, the concatenated
-result has shape
-
-    [D0, D1, ... Raxis, ...Dn]
-
-where
-
-    Raxis = sum(Daxis(i))
-
-That is, the data from the input tensors is joined along the `axis`
-dimension.
-
-The number of dimensions of the input tensors must match, and all dimensions
-except `axis` must be equal.
-
-For example:
-
-```python
-t1 = [[1, 2, 3], [4, 5, 6]]
-t2 = [[7, 8, 9], [10, 11, 12]]
-tf.concat_v2([t1, t2], 0) ==> [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
-tf.concat_v2([t1, t2], 1) ==> [[1, 2, 3, 7, 8, 9], [4, 5, 6, 10, 11, 12]]
-
-# tensor t3 with shape [2, 3]
-# tensor t4 with shape [2, 3]
-tf.shape(tf.concat_v2([t3, t4], 0)) ==> [4, 3]
-tf.shape(tf.concat_v2([t3, t4], 1)) ==> [2, 6]
-```
-
-Note: If you are concatenating along a new axis consider using stack.
-E.g.
-
-```python
-tf.concat_v2([tf.expand_dims(t, axis) for t in tensors], axis)
-```
-
-can be rewritten as
-
-```python
-tf.stack(tensors, axis=axis)
-```
-
-##### Args:
-
-
-* <b>`values`</b>: A list of `Tensor` objects or a single `Tensor`.
-* <b>`axis`</b>: 0-D `int32` `Tensor`. Dimension along which to concatenate.
-* <b>`name`</b>: A name for the operation (optional).
-
-##### Returns:
-
-  A `Tensor` resulting from concatenation of the input tensors.
@@ -1,6 +1,10 @@
-### `tf.contrib.learn.infer(restore_checkpoint_path, output_dict, feed_dict=None)` {#infer}
+### `tf.contrib.learn.infer(*args, **kwargs)` {#infer}

-Restore graph from `restore_checkpoint_path` and run `output_dict` tensors.
+Restore graph from `restore_checkpoint_path` and run `output_dict` tensors. (deprecated)
+
+THIS FUNCTION IS DEPRECATED. It will be removed after 2017-02-15.
+Instructions for updating:
+graph_actions.py will be deleted. Use tf.train.* utilities instead. You can use learn/estimators/estimator.py as an example.

 If `restore_checkpoint_path` is supplied, restore from checkpoint. Otherwise,
 init all variables.
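The `infer` / `evaluate` pattern (restore a checkpoint, then run a dict of tensors) maps directly onto tf.train.Saver. A minimal sketch, assuming TF 1.x and a graph whose variable names match the checkpoint; the variable, dict keys, and `/tmp/model_out` path are illustrative:

```python
import tensorflow as tf

# Rebuild the graph whose variables were checkpointed (names must match).
x = tf.placeholder(tf.float32, shape=[None, 3])
w = tf.get_variable("w", [3, 1])
output_dict = {"predictions": tf.matmul(x, w)}

saver = tf.train.Saver()
with tf.Session() as sess:
    checkpoint_path = tf.train.latest_checkpoint("/tmp/model_out")
    if checkpoint_path:
        saver.restore(sess, checkpoint_path)          # like `restore_checkpoint_path`
    else:
        sess.run(tf.global_variables_initializer())   # like "init all variables"
    results = sess.run(output_dict, feed_dict={x: [[1.0, 2.0, 3.0]]})
```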
@@ -139,6 +139,7 @@
* [`broadcast_dynamic_shape`](../../api_docs/python/array_ops.md#broadcast_dynamic_shape)
* [`broadcast_static_shape`](../../api_docs/python/array_ops.md#broadcast_static_shape)
* [`cast`](../../api_docs/python/array_ops.md#cast)
* [`concat`](../../api_docs/python/array_ops.md#concat)
* [`concat_v2`](../../api_docs/python/array_ops.md#concat_v2)
* [`depth_to_space`](../../api_docs/python/array_ops.md#depth_to_space)
* [`dequantize`](../../api_docs/python/array_ops.md#dequantize)
@@ -362,6 +363,7 @@
* [`scan`](../../api_docs/python/functional_ops.md#scan)

* **[TensorArray Operations](../../api_docs/python/tensor_array_ops.md)**:
  * [`concat`](../../api_docs/python/tensor_array_ops.md#concat)
  * [`gather`](../../api_docs/python/tensor_array_ops.md#gather)
  * [`identity`](../../api_docs/python/tensor_array_ops.md#identity)
  * [`split`](../../api_docs/python/tensor_array_ops.md#split)