diff --git a/RELEASE.md b/RELEASE.md
index 6f3aa94c203..7fc620bc18a 100644
--- a/RELEASE.md
+++ b/RELEASE.md
@@ -164,7 +164,7 @@ Coinciding with this change, new releases of [TensorFlow's Docker images](https:
 * `saved_model_cli aot_compile_cpu` allows you to compile saved models to XLA header+object files and include them in your C++ programs.
 * Enable `Igamma`, `Igammac` for XLA.
 * Deterministic Op Functionality:
-  * XLA reduction emitter is deterministic when the environment variable `TF_DETERMINISTIC_OPS` is set to "true" or "1". This extends deterministic `tf.nn.bias_add` back-prop functionality (and therefore also deterministic back-prop of bias-addition in Keras layers) to include when XLA JIT complilation is enabled.
+  * XLA reduction emitter is deterministic when the environment variable `TF_DETERMINISTIC_OPS` is set to "true" or "1". This extends deterministic `tf.nn.bias_add` back-prop functionality (and therefore also deterministic back-prop of bias-addition in Keras layers) to include when XLA JIT compilation is enabled.
   * Fix problem, when running on a CUDA GPU and when either environment variable `TF_DETERMINSTIC_OPS` or environment variable `TF_CUDNN_DETERMINISTIC` is set to "true" or "1", in which some layer configurations led to an exception with the message "No algorithm worked!"
 * Tracing and Debugging:
   * Add source, destination name to `_send` traceme to allow easier debugging.
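
A minimal sketch (not part of the release notes) of how the environment-variable toggle described in the changed line might be used; it assumes a standard TensorFlow installation, and the Keras model is illustrative only:

```python
# Illustrative sketch only: enable the determinism behaviour described above.
# The variable must be set before TensorFlow is imported/initialized.
import os

os.environ["TF_DETERMINISTIC_OPS"] = "1"  # "true" is also accepted, per the note

import tensorflow as tf

# With the flag set, bias-addition back-prop (e.g. in a Keras Dense layer)
# is expected to be deterministic, including when XLA JIT compilation is enabled.
model = tf.keras.Sequential([tf.keras.layers.Dense(8, activation="relu")])
```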