tf.keras.activations.relu

Applies the rectified linear unit activation function.

With default values, this returns the standard ReLU activation: max(x, 0), the element-wise maximum of 0 and the input tensor.

Modifying the default parameters allows you to use a non-zero threshold, to change the max value of the activation, and to use a non-zero multiple of the input for values below the threshold.

Examples:

x = [-10, -5, 0.0, 5, 10]

keras.activations.relu(x)
# [ 0.,  0.,  0.,  5., 10.]

keras.activations.relu(x, negative_slope=0.5)
# [-5. , -2.5,  0. ,  5. , 10. ]

keras.activations.relu(x, max_value=5.)
# [0., 0., 0., 5., 5.]

keras.activations.relu(x, threshold=5.)
# [-0., -0.,  0.,  0., 10.]
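
For reference, the following is a minimal NumPy sketch of the semantics these parameters imply; it is an illustration that reproduces the outputs above, not the library's actual implementation.

import numpy as np

def relu_sketch(x, negative_slope=0.0, max_value=None, threshold=0.0):
    # Illustrative re-implementation of the documented behavior (not Keras code).
    x = np.asarray(x, dtype=np.float32)
    if threshold != 0.0:
        # Pass values strictly above the threshold; zero out the rest.
        out = x * (x > threshold)
    else:
        out = np.maximum(x, 0.0)  # standard ReLU: max(x, 0)
    if max_value is not None:
        out = np.clip(out, 0.0, max_value)  # saturate at max_value
    if negative_slope != 0.0:
        # Values below the threshold contribute a scaled (negative) part.
        out = out - negative_slope * np.maximum(threshold - x, 0.0)
    return out

print(relu_sketch([-10, -5, 0.0, 5, 10], negative_slope=0.5))
# [-5.  -2.5  0.   5.  10. ]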

Arguments:

x: Input tensor.
negative_slope: A float that controls the slope for values lower than the threshold.
max_value: A float that sets the saturation threshold (the largest value the function will return).
threshold: A float giving the threshold value of the activation function below which values will be damped or set to zero.

Returns:

A tensor with the same shape and dtype as input x.
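
As a usage note, the function can be passed as the activation argument of a layer. The sketch below assumes a simple Dense layer and uses functools.partial to bind non-default parameters; custom callables like this may need extra handling (e.g. custom-object registration) when saving a model.

import functools
import tensorflow as tf

# Standard ReLU selected by name.
layer_default = tf.keras.layers.Dense(32, activation="relu")

# ReLU capped at 6.0 via a wrapped callable with a non-default max_value.
capped_relu = functools.partial(tf.keras.activations.relu, max_value=6.0)
layer_capped = tf.keras.layers.Dense(32, activation=capped_relu)

x = tf.random.normal((4, 16))
print(layer_capped(x).shape)  # (4, 32)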