Update generated Python Op docs.

Change: 147423074
This commit is contained in:
A. Unique TensorFlower 2017-02-13 18:48:39 -08:00 committed by TensorFlower Gardener
parent 78c491caaf
commit 74d516c6aa
5 changed files with 5 additions and 96 deletions

View File

@@ -7,10 +7,7 @@ Note: Functions taking `Tensor` arguments can also take anything accepted by
[TOC]
## Control Flow Operations
-TensorFlow provides several operations and classes that you can use to control
-the execution of operations and add conditional dependencies to your graph.
+Control Flow Operations. See the @{python/control_flow_ops} guide.
- - -
@@ -390,12 +387,6 @@ Example using shape_invariants:
```
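For reference, a minimal sketch along the lines of the `shape_invariants` example this hunk references, assuming the TF 1.0-era API; names and shapes are illustrative:

```python
import tensorflow as tf

i0 = tf.constant(0)
m0 = tf.ones([2, 2])

# m doubles its number of rows on every iteration, so its static shape
# changes; a looser invariant ([None, 2]) has to be supplied explicitly.
i_final, m_final = tf.while_loop(
    lambda i, m: i < 10,
    lambda i, m: [i + 1, tf.concat([m, m], axis=0)],
    loop_vars=[i0, m0],
    shape_invariants=[i0.get_shape(), tf.TensorShape([None, 2])])

with tf.Session() as sess:
    print(sess.run(tf.shape(m_final)))  # [2048    2]
```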
## Logical Operators
-TensorFlow provides several operations that you can use to add logical operators
-to your graph.
- - -
### `tf.logical_and(x, y, name=None)` {#logical_and}
@@ -462,12 +453,6 @@ Returns the truth value of x OR y element-wise.
x ^ y = (x | y) & ~(x & y).
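A short sketch of the logical operators this guide reference covers, assuming the TF 1.0-era API; `tf.logical_xor` implements exactly the `(x | y) & ~(x & y)` identity quoted above:

```python
import tensorflow as tf

x = tf.constant([True, True, False, False])
y = tf.constant([True, False, True, False])

# Element-wise boolean ops on tensors of dtype bool.
both = tf.logical_and(x, y)
either = tf.logical_or(x, y)
exactly_one = tf.logical_xor(x, y)

with tf.Session() as sess:
    print(sess.run([both, either, exactly_one]))
```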
## Comparison Operators
-TensorFlow provides several operations that you can use to add comparison
-operators to your graph.
- - -
### `tf.equal(x, y, name=None)` {#equal}
@@ -644,12 +629,6 @@ has the same shape as `x` and `y`, then it chooses which element to copy from
* <b>`ValueError`</b>: When exactly one of `x` or `y` is non-None.
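A minimal sketch of `tf.equal` together with `tf.where` (the op whose element-wise copying from `x` or `y` is described in this hunk), assuming the TF 1.0-era API; the inputs are illustrative:

```python
import tensorflow as tf

x = tf.constant([1, 2, 3, 4])
y = tf.constant([1, 0, 3, 0])

matches = tf.equal(x, y)          # element-wise comparison -> bool tensor
# Where the condition is True take the element from x, otherwise from y.
picked = tf.where(matches, x, y)

with tf.Session() as sess:
    print(sess.run(matches))  # [ True False  True False]
    print(sess.run(picked))   # [1 0 3 0]
```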
## Debugging Operations
-TensorFlow provides several operations that you can use to validate values and
-debug your graph.
- - -
### `tf.is_finite(x, name=None)` {#is_finite}
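A quick sketch of `tf.is_finite`, assuming the TF 1.0-era API; the NaN/Inf inputs are illustrative:

```python
import numpy as np
import tensorflow as tf

x = tf.constant([1.0, np.inf, np.nan, -2.0])

with tf.Session() as sess:
    # True only for entries that are neither NaN nor +/-Inf.
    print(sess.run(tf.is_finite(x)))  # [ True False False  True]
```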

View File

@@ -7,10 +7,7 @@ Note: Functions taking `Tensor` arguments can also take anything accepted by
[TOC]
## Script Language Operators.
-TensorFlow provides allows you to wrap python/numpy functions as
-TensorFlow operators.
+Script Language Operators. See the @{python/script_ops} guide.
- - -
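A minimal sketch of wrapping a NumPy function as an op with `tf.py_func`, as the script_ops guide describes, assuming the TF 1.0-era API; the function and shapes are illustrative:

```python
import numpy as np
import tensorflow as tf

def clip_and_square(x):
    # Plain NumPy code, executed by the Python interpreter at run time.
    return np.square(np.clip(x, 0.0, 1.0)).astype(np.float32)

inp = tf.placeholder(tf.float32, [None])
out = tf.py_func(clip_and_square, [inp], tf.float32)

with tf.Session() as sess:
    print(sess.run(out, feed_dict={inp: [-1.0, 0.5, 2.0]}))  # [0. 0.25 1.]
```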

View File

@@ -7,10 +7,7 @@ Note: Functions taking `Tensor` arguments can also take anything accepted by
[TOC]
## Tensor Handle Operations.
-TensorFlow provides several operators that allows the user to keep tensors
-"in-place" across run calls.
+Tensor Handle Operations. See the @{python/session_ops} guide.
- - -
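A minimal sketch of keeping a tensor "in-place" across `run` calls with `tf.get_session_handle` and `tf.get_session_tensor`, assuming the TF 1.0-era API; the constants are illustrative:

```python
import tensorflow as tf

a = tf.constant(10.0)
b = tf.constant(5.0)
c = tf.multiply(a, b)

with tf.Session() as sess:
    # Keep the value of c alive inside the session and get a handle to it.
    h = sess.run(tf.get_session_handle(c))
    # In a later run call, feed the handle back instead of recomputing c.
    p, x = tf.get_session_tensor(h.handle, tf.float32)
    y = tf.multiply(x, 2.0)
    print(sess.run(y, feed_dict={p: h.handle}))  # 100.0
```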

View File

@@ -7,12 +7,7 @@ Note: Functions taking `Tensor` arguments can also take anything accepted by
[TOC]
## Sparse Tensor Representation
-TensorFlow supports a `SparseTensor` representation for data that is sparse
-in multiple dimensions. Contrast this representation with `IndexedSlices`,
-which is efficient for representing tensors that are sparse in their first
-dimension, and dense along all other dimensions.
+Sparse Tensor Representation. See the @{python/sparse_ops} guide.
- - -
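A minimal sketch of the `SparseTensor` representation the sparse_ops guide covers (indices / values / dense_shape), assuming the TF 1.0-era API; the matrix is illustrative:

```python
import tensorflow as tf

# A 3x4 matrix with two non-zero entries, stored as indices/values/dense_shape.
sp = tf.SparseTensor(indices=[[0, 0], [1, 2]],
                     values=[1.0, 2.0],
                     dense_shape=[3, 4])
dense = tf.sparse_tensor_to_dense(sp, default_value=0.0)

with tf.Session() as sess:
    print(sess.run(dense))
```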
@@ -322,9 +317,6 @@ Alias for field number 1
## Conversion
- - -
### `tf.sparse_to_dense(sparse_indices, output_shape, sparse_values, default_value=0, validate_indices=True, name=None)` {#sparse_to_dense}
@@ -575,9 +567,6 @@ In this case the resulting `SparseTensor` has the following properties:
* <b>`ValueError`</b>: If `sp_ids` and `vocab_size` are lists of different lengths.
## Manipulation
- - -
### `tf.sparse_concat(axis, sp_inputs, name=None, expand_nonconcat_dim=False, concat_dim=None)` {#sparse_concat}
@@ -1026,8 +1015,6 @@ then the output will be a `SparseTensor` of shape `[5, 4]` and
* <b>`TypeError`</b>: If `sp_input` is not a `SparseTensor`.
## Reduction
- - -
### `tf.sparse_reduce_sum(sp_input, axis=None, keep_dims=False, reduction_axes=None)` {#sparse_reduce_sum}
@@ -1107,8 +1094,6 @@ which are interpreted according to the indexing rules in Python.
The reduced SparseTensor.
## Math Operations
- - -
### `tf.sparse_add(a, b, thresh=0)` {#sparse_add}
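A short sketch combining `tf.sparse_add` and `tf.sparse_reduce_sum`, assuming the TF 1.0-era signatures shown in this file; the example tensors are illustrative:

```python
import tensorflow as tf

a = tf.SparseTensor(indices=[[0, 0], [1, 1]], values=[1.0, 2.0], dense_shape=[2, 2])
b = tf.SparseTensor(indices=[[0, 0], [1, 0]], values=[3.0, 4.0], dense_shape=[2, 2])

summed = tf.sparse_add(a, b)                     # still a SparseTensor
row_sums = tf.sparse_reduce_sum(summed, axis=1)  # dense tensor [4.0, 6.0]

with tf.Session() as sess:
    print(sess.run(summed))    # SparseTensorValue with the merged entries
    print(sess.run(row_sums))
```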

View File

@@ -7,7 +7,7 @@ Note: Functions taking `Tensor` arguments can also take anything accepted by
[TOC]
## Variables
+Variables. See the @{python/state_ops} guide.
- - -
@@ -1170,12 +1170,6 @@ is on a different device it will get a copy of the variable.
## Variable helper functions
-TensorFlow provides a set of functions to help manage the set of variables
-collected in the graph.
- - -
### `tf.global_variables()` {#global_variables}
@@ -1261,7 +1255,6 @@ This convenience function returns the contents of that collection.
A list of Variable objects.
- - -
### `tf.global_variables_initializer()` {#global_variables_initializer}
@@ -1381,7 +1374,6 @@ logged by the C++ runtime. This is expected.
An Op, or None if there are no variables.
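A minimal sketch of the variable helper functions referenced here (`tf.global_variables`, `tf.global_variables_initializer`), assuming the TF 1.0-era API; the variables are illustrative:

```python
import tensorflow as tf

w = tf.Variable(tf.zeros([2, 2]), name="w")
b = tf.Variable(tf.zeros([2]), name="b")

# One op that runs the initializer of every variable in the collection.
init_op = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init_op)
    print([v.name for v in tf.global_variables()])  # ['w:0', 'b:0']
```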
- - -
### `tf.assign(ref, value, validate_shape=None, use_locking=None, name=None)` {#assign}
@@ -1471,9 +1463,6 @@ This makes it easier to chain operations that need to use the reset value.
to use the new value after the variable has been updated.
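A short sketch of the chaining behaviour described here: `tf.assign` returns a tensor holding the updated value. Assuming the TF 1.0-era API; the values are illustrative:

```python
import tensorflow as tf

v = tf.Variable(0.0)
# tf.assign returns a tensor that holds the new value of `ref` after the
# update, so downstream ops can depend on the assigned value directly.
assigned = tf.assign(v, 3.0)
doubled = assigned * 2.0

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(doubled))  # 6.0
```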
## Saving and Restoring Variables
- - -
### `class tf.train.Saver` {#Saver}
@@ -1851,7 +1840,6 @@ Converts this `Saver` to a `SaverDef` protocol buffer.
- - -
### `tf.train.latest_checkpoint(checkpoint_dir, latest_filename=None)` {#latest_checkpoint}
@@ -1871,7 +1859,6 @@ Finds the filename of latest saved checkpoint file.
The full path to the latest checkpoint or `None` if no checkpoint was found.
- - -
### `tf.train.get_checkpoint_state(checkpoint_dir, latest_filename=None)` {#get_checkpoint_state}
@@ -1926,12 +1913,6 @@ proto.
* <b>`RuntimeError`</b>: If the save paths conflict.
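A minimal save/restore sketch with `tf.train.Saver`, assuming the TF 1.0-era API; the checkpoint path is illustrative (with a checkpoint directory, `tf.train.latest_checkpoint(dir)` would supply the path instead):

```python
import tensorflow as tf

v = tf.Variable(42.0, name="v")
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    save_path = saver.save(sess, "/tmp/tf_saver_example.ckpt")

# Later (or in another process): restore the saved values instead of
# re-running the initializer.
with tf.Session() as sess:
    saver.restore(sess, save_path)
    print(sess.run(v))  # 42.0
```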
## Sharing Variables
-TensorFlow provides several classes and operations that you can use to
-create variables contingent on certain conditions.
- - -
### `tf.get_variable(name, shape=None, dtype=None, initializer=None, regularizer=None, trainable=True, collections=None, caching_device=None, partitioner=None, validate_shape=True, use_resource=None, custom_getter=None)` {#get_variable}
@@ -2506,7 +2487,6 @@ reduce the likelihood of collisions with kwargs.
* <b>`ValueError`</b>: if the name is None.
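A minimal variable-sharing sketch with `tf.get_variable` and `tf.variable_scope(..., reuse=True)`, assuming the TF 1.0-era API; the layer function and shapes are illustrative:

```python
import tensorflow as tf

def dense_layer(x):
    # get_variable either creates "w" or, under reuse=True, returns the
    # existing variable with the same fully qualified name.
    w = tf.get_variable("w", shape=[2, 2],
                        initializer=tf.constant_initializer(1.0))
    return tf.matmul(x, w)

with tf.variable_scope("layer"):
    y1 = dense_layer(tf.ones([1, 2]))
with tf.variable_scope("layer", reuse=True):
    y2 = dense_layer(tf.ones([1, 2]))   # shares layer/w with y1

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run([y1, y2]))
```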
- - -
### `tf.no_regularizer(_)` {#no_regularizer}
@@ -2514,7 +2494,6 @@ reduce the likelihood of collisions with kwargs.
Use this function to prevent regularization of variables.
- - -
### `class tf.constant_initializer` {#constant_initializer}
@@ -2820,9 +2799,6 @@ Args:
## Variable Partitioners for Sharding
- - -
### `tf.fixed_size_partitioner(num_shards, axis=0)` {#fixed_size_partitioner}
@@ -2908,21 +2884,6 @@ variable. The maximum number of such partitions (upper bound) is given by
`variable_scope`, `get_variable`, and `get_partitioned_variable_list`.
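A short sketch of sharding a variable with `tf.fixed_size_partitioner` via a `variable_scope` partitioner, assuming the TF 1.0-era API; the shapes and scope names are illustrative:

```python
import tensorflow as tf

# Shard a [10, 4] matrix into 2 pieces along axis 0.
with tf.variable_scope("emb",
                       partitioner=tf.fixed_size_partitioner(2, axis=0)):
    embeddings = tf.get_variable("weights", shape=[10, 4],
                                 initializer=tf.zeros_initializer())

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Two underlying shards (part_0 / part_1) back the logical variable.
    print([v.name for v in tf.global_variables()])
```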
## Sparse Variable Updates
-The sparse update ops modify a subset of the entries in a dense `Variable`,
-either overwriting the entries or adding / subtracting a delta. These are
-useful for training embedding models and similar lookup-based networks, since
-only a small subset of embedding vectors change in any given step.
-Since a sparse update of a large tensor may be generated automatically during
-gradient computation (as in the gradient of
-[`tf.gather`](../../api_docs/python/array_ops.md#gather)),
-an [`IndexedSlices`](#IndexedSlices) class is provided that encapsulates a set
-of sparse indices and values. `IndexedSlices` objects are detected and handled
-automatically by the optimizers in most cases.
- - -
### `tf.scatter_update(ref, indices, updates, use_locking=None, name=None)` {#scatter_update}
@@ -3499,9 +3460,6 @@ A `Tensor` containing the values of the slices.
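A minimal sketch of a sparse update with `tf.scatter_update`, plus the `IndexedSlices` gradient produced for `tf.gather`, assuming the TF 1.0-era API; shapes and indices are illustrative:

```python
import tensorflow as tf

emb = tf.Variable(tf.zeros([4, 3]))

# Overwrite only rows 0 and 2 of the dense variable; other rows untouched.
update = tf.scatter_update(emb, [0, 2], tf.ones([2, 3]))

# The gradient of tf.gather arrives as IndexedSlices (indices + values),
# which the optimizers apply sparsely to just the gathered rows.
rows = tf.gather(emb, [1, 3])
grad = tf.gradients(tf.reduce_sum(rows), [emb])[0]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(type(grad).__name__)   # IndexedSlices
    print(sess.run(update))
```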
### Read-only Lookup Tables
- - -
### `tf.initialize_all_tables(*args, **kwargs)` {#initialize_all_tables}
@@ -3540,10 +3498,6 @@ Returns an Op that initializes all tables of the default graph.
not tables the returned Op is a NoOp.
## Exporting and Importing Meta Graphs
- - -
### `tf.train.export_meta_graph(filename=None, meta_info_def=None, graph_def=None, saver_def=None, collection_list=None, as_text=False, graph=None, export_scope=None, clear_devices=False, **kwargs)` {#export_meta_graph}
@@ -3657,9 +3611,6 @@ device assignments have not changed.
(i.e., there are no variables to restore).
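A minimal meta-graph round trip with `tf.train.import_meta_graph`, assuming the TF 1.0-era API; the MetaGraphDef here is written by `Saver.save`, and `tf.train.export_meta_graph(filename=...)` can write it explicitly instead. Paths and names are illustrative:

```python
import tensorflow as tf

v = tf.Variable(1.0, name="v")
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # save() writes the checkpoint and, by default, the MetaGraphDef
    # next to it ("/tmp/mg_example.ckpt.meta").
    saver.save(sess, "/tmp/mg_example.ckpt")

# In a fresh graph: rebuild the structure from the .meta file, then restore
# the variable values from the checkpoint.
tf.reset_default_graph()
with tf.Session() as sess:
    new_saver = tf.train.import_meta_graph("/tmp/mg_example.ckpt.meta")
    new_saver.restore(sess, "/tmp/mg_example.ckpt")
    print(sess.run(tf.get_default_graph().get_tensor_by_name("v:0")))  # 1.0
```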
# Deprecated functions (removed after 2017-03-02). Please don't use them.
- - -
### `tf.all_variables(*args, **kwargs)` {#all_variables}