*** Reason for rollback ***

Correctness > speed. Using log1p also improves accuracy significantly, especially for gradients.
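
A minimal standalone illustration (not part of the commit) of why log1p matters here: for moderately negative x, exp(x) is tiny compared to 1, so the intermediate sum in log(1 + exp(x)) rounds away most of exp(x)'s low-order bits, while log1p(exp(x)) avoids that rounding step.

#include <cmath>
#include <cstdio>

int main() {
  const double x = -20.0;
  const double ex = std::exp(x);            // ~2.06e-9
  const double naive = std::log(1.0 + ex);  // 1 + ex rounds, losing roughly half the digits
  const double stable = std::log1p(ex);     // accurate to full double precision
  std::printf("naive  = %.17g\nstable = %.17g\n", naive, stable);
  return 0;
}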

*** Original change description ***

Partial rollback of change to softplus functor. The large speedup observed was only true for arguments greater than ~5; for arguments less than ~5, a significant slowdown occurred.

PiperOrigin-RevId: 309499644
Change-Id: Ie137dd5e2fa798f8c1e0330375e3e0b6dbcc31a6
Author:    A. Unique TensorFlower
Date:      2020-05-01 16:57:59 -07:00
Committer: TensorFlower Gardener
Parent:    be894c5c42
Commit:    609c90f5b0

@@ -53,7 +53,7 @@ struct Softplus {
     activations.device(d) = too_large.select(
         features,                       // softplus(x) ~= x for x large
         too_small.select(features_exp,  // softplus(x) ~= exp(x) for x small
-                         (features_exp + features.constant(T(1))).log()));
+                         features_exp.log1p()));
   }
 };
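
For reference, a scalar sketch of the same piecewise scheme the functor uses after this rollback. The threshold below is an assumed value for illustration only; the actual kernel derives its cutoffs from the floating-point type's epsilon.

#include <cmath>

// Sketch only: scalar analogue of the piecewise evaluation above.
double softplus(double x) {
  const double kThreshold = 20.0;            // assumed cutoff, for illustration
  if (x > kThreshold) return x;              // softplus(x) ~= x for large x
  if (x < -kThreshold) return std::exp(x);   // softplus(x) ~= exp(x) for very negative x
  return std::log1p(std::exp(x));            // stable middle range via log1p
}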