STT-tensorflow/tensorflow/tools/api/golden/v1/tensorflow.distribute.experimental.-collective-communication.pbtxt
commit 34024edf7f by Ayush Dubey (2019-02-27 16:01:57 -08:00)

Enable the user to choose between all-reduce implementations in
MultiWorkerMirroredStrategy.

Possible choices: AUTO, RING (which uses `common_runtime/ring_reducer.{cc,h}`),
and NCCL (which uses Nvidia NCCL for all-reduce).

PiperOrigin-RevId: 236000699

path: "tensorflow.distribute.experimental.CollectiveCommunication"
tf_class {
  is_instance: "<enum \'CollectiveCommunication\'>"
  member {
    name: "AUTO"
    mtype: "<enum \'CollectiveCommunication\'>"
  }
  member {
    name: "NCCL"
    mtype: "<enum \'CollectiveCommunication\'>"
  }
  member {
    name: "RING"
    mtype: "<enum \'CollectiveCommunication\'>"
  }
}
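
For context, a minimal usage sketch (not part of the golden file itself): the
enum recorded above is passed to the MultiWorkerMirroredStrategy constructor to
select the all-reduce implementation, assuming the TF 1.x/2.0-era
`communication` keyword argument that this commit's API surface corresponds to.

    # Sketch only: selecting an all-reduce implementation via the
    # CollectiveCommunication enum (TF 1.x/2.0-era experimental API).
    import tensorflow as tf

    # NCCL requires Nvidia GPUs; RING works on CPU and GPU; AUTO lets the
    # runtime choose an implementation.
    strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy(
        communication=tf.distribute.experimental.CollectiveCommunication.RING)

    with strategy.scope():
        # Variables created here are mirrored across workers and kept in
        # sync using the chosen collective implementation.
        model = tf.keras.Sequential([tf.keras.layers.Dense(1)])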