relu

ReLU activation.

This layer applies the ReLU function, max(x, 0), to its input. It can also clip the output at an upper bound, i.e. min(max(x, 0), max_value).


Parameter

  • max_value : Number. The upper bound is applied only when max_value >= 0.f.
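
For reference, a minimal sketch of the computation described above, assuming an elementwise pass over a float buffer; the function name and signature are illustrative and not this library's actual API:

```cpp
#include <algorithm>
#include <vector>

// Sketch: y = min(max(x, 0), max_value) applied elementwise.
// A negative max_value is treated here as "no upper bound".
std::vector<float> relu(const std::vector<float>& input, float max_value = -1.0f) {
    std::vector<float> output(input.size());
    for (std::size_t i = 0; i < input.size(); ++i) {
        float y = std::max(input[i], 0.0f);   // standard ReLU
        if (max_value >= 0.0f) {
            y = std::min(y, max_value);       // optional upper bound
        }
        output[i] = y;
    }
    return output;
}

int main() {
    // Example: max_value = 6 reproduces the common ReLU6 behavior.
    std::vector<float> x = {-2.0f, 0.5f, 3.0f, 10.0f};
    std::vector<float> y = relu(x, 6.0f);     // -> {0, 0.5, 3, 6}
    return 0;
}
```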