*** Reason for rollback ***

Correctness > speed. And using log1p improves accuracy significantly, especially for gradients.
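To make the accuracy point concrete, here is a small standalone C++ sketch (not TensorFlow code; the argument -20 is just an illustrative value): softplus(x) = log(1 + exp(x)), and once exp(x) drops well below float32 epsilon, 1.0f + exp(x) rounds back to exactly 1.0f, so the naive formula returns 0 while log1p keeps the leading-order term.

#include <cmath>
#include <cstdio>

int main() {
  const float x = -20.0f;                  // softplus(-20) ~= exp(-20) ~= 2.06e-9
  const float e = std::exp(x);
  const float naive = std::log(1.0f + e);  // 1.0f + 2.06e-9f rounds to 1.0f, so log gives 0
  const float stable = std::log1p(e);      // keeps the leading-order term, ~2.06e-9
  std::printf("exp(x)          = %.9g\n", e);
  std::printf("log(1 + exp(x)) = %.9g\n", naive);
  std::printf("log1p(exp(x))   = %.9g\n", stable);
  return 0;
}

In this regime the naive formula has a relative error of 100% in the forward value, which is exactly what the log1p form avoids.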

*** Original change description ***

Partial rollback of change to softplus functor. The large speedup observed was only true for arguments greater than ~5. For arguments less than ~5, a significant slowdown would occur.
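As a rough, purely illustrative way to probe the speed claim (a hypothetical scalar micro-benchmark; the actual kernel evaluates vectorized Eigen expressions, so scalar libm timings are only a proxy for the regime-dependent behaviour described above):

#include <chrono>
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
  const int n = 1 << 20;
  std::vector<float> xs(n);
  for (int i = 0; i < n; ++i) xs[i] = -10.0f + 20.0f * i / n;  // sweep of arguments in [-10, 10)

  volatile float sink = 0.0f;  // prevents the loops from being optimized away
  auto t0 = std::chrono::steady_clock::now();
  for (float x : xs) sink = sink + std::log1p(std::exp(x));       // log1p formulation
  auto t1 = std::chrono::steady_clock::now();
  for (float x : xs) sink = sink + std::log(1.0f + std::exp(x));  // log(1 + .) formulation
  auto t2 = std::chrono::steady_clock::now();

  using us = std::chrono::microseconds;
  std::printf("log1p(exp(x))  : %lld us\n",
              (long long)std::chrono::duration_cast<us>(t1 - t0).count());
  std::printf("log(1 + exp(x)): %lld us\n",
              (long long)std::chrono::duration_cast<us>(t2 - t1).count());
  std::printf("checksum: %f\n", (double)sink);
  return 0;
}

Splitting the sweep into sub-ranges (for example below and above ~5) would mirror the regime-dependent observation in the description.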

PiperOrigin-RevId: 309487961
Change-Id: Ibe949c578a002d25250ba519acd16597981d1a4e
Authored by A. Unique TensorFlower on 2020-05-01 15:45:37 -07:00; committed by TensorFlower Gardener
parent b4d831bf38
commit 4877232727


@@ -53,7 +53,7 @@ struct Softplus {
activations.device(d) = too_large.select(
features, // softplus(x) ~= x for x large
too_small.select(features_exp, // softplus(x) ~= exp(x) for x small
-                         (features_exp + features.constant(T(1))).log()));
+                         features_exp.log1p()));
}
};
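For context beyond this one-line hunk, the following scalar sketch (a paraphrase with a hypothetical name, softplus_ref, not the kernel itself) shows the piecewise evaluation the Softplus functor expresses with Eigen selects; the threshold mirrors the kernel's machine-epsilon-based cutoff, but treat the exact constant as illustrative.

#include <cmath>
#include <limits>

// Numerically stable softplus(x) = log(1 + exp(x)) evaluated in three regimes.
float softplus_ref(float x) {
  // Below `threshold`, exp(x) is within machine epsilon of softplus(x);
  // above `-threshold`, x itself is.
  static const float threshold =
      std::log(std::numeric_limits<float>::epsilon()) + 2.0f;  // roughly -13.9 for float32
  if (x > -threshold) return x;   // softplus(x) ~= x for x large
  const float ex = std::exp(x);
  if (x < threshold) return ex;   // softplus(x) ~= exp(x) for x small
  return std::log1p(ex);          // the log1p expression shown in the hunk
}

The two outer branches correspond to the "softplus(x) ~= x" and "softplus(x) ~= exp(x)" comments in the hunk; the log1p expression covers the middle range where neither approximation is accurate enough.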