tf.compat.v1.nn.leaky_relu

Compute the Leaky ReLU activation function: f(x) = x for x >= 0 and f(x) = alpha * x for x < 0.

Source: Rectifier Nonlinearities Improve Neural Network Acoustic Models. A. L. Maas, A. Y. Hannun, A. Y. Ng. Proc. ICML, 2013.

Args
features: A Tensor representing preactivation values. Must be one of the following types: float16, float32, float64, int32, int64.
alpha: Slope of the activation function at x < 0.
name: A name for the operation (optional).

Returns
The activation value.
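
A minimal usage sketch (assuming TensorFlow 2.x with eager execution, where the v1 compatibility alias resolves to the same op as tf.nn.leaky_relu):

```python
import tensorflow as tf

# Preactivation values with both negative and positive entries.
features = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])

# Leaky ReLU: negative inputs are scaled by alpha, positive inputs
# pass through unchanged.
activations = tf.compat.v1.nn.leaky_relu(features, alpha=0.2)

print(activations.numpy())  # [-0.4 -0.1  0.   1.   3. ]
```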

References
Maas, A. L., Hannun, A. Y., and Ng, A. Y. Rectifier Nonlinearities Improve Neural Network Acoustic Models. Proc. ICML, 2013.