tf.compat.v1.nn.crelu

tf.compat.v1.nn.crelu(
    features, name=None, axis=-1
)

Computes Concatenated ReLU.

Concatenates a ReLU that selects only the positive part of the activation with a ReLU that selects only the negative part. Note that as a result this non-linearity doubles the depth of the activations. Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units. W. Shang et al.
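Concretely, crelu(x) is equivalent to concatenating relu(x) and relu(-x) along the chosen axis. A minimal sketch, assuming TensorFlow 2.x where the compat.v1 symbol remains available:

import tensorflow as tf

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# crelu concatenates the positive and negative parts of the
# activation: [relu(x), relu(-x)] along the last axis by default.
y = tf.compat.v1.nn.crelu(x)
# y == [0., 0., 0., 1., 2., 2., 1., 0., 0., 0.]

# Equivalent construction from the definition:
y_manual = tf.concat([tf.nn.relu(x), tf.nn.relu(-x)], axis=-1)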

Args:
features: A Tensor with type float, double, int32, int64, uint8, int16, or int8.
name: A name for the operation (optional).
axis: The axis along which the output values are concatenated. Default is -1 (see the shape sketch below).
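Because the positive and negative parts are concatenated along axis, the output size along that axis is twice the input size. A sketch with a hypothetical NHWC activation map (the shape is chosen purely for illustration):

import tensorflow as tf

x = tf.random.normal([8, 32, 32, 16])  # batch of NHWC feature maps

y = tf.compat.v1.nn.crelu(x, axis=-1)
print(y.shape)  # (8, 32, 32, 32): the channel depth has doubled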

Returns:
A Tensor with the same type as features.

References:
Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units: Shang et al., 2016 (https://arxiv.org/abs/1603.05201).