elu

ELU activation.

This layer applies the Exponential Linear Unit (ELU) function to its input: f(x) = alpha * (exp(x) - 1) for x < 0, and f(x) = x for x >= 0.


Parameter

  • alpha : Number, scale for the negative section of the function (the factor multiplying exp(x) - 1 for negative inputs).
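
For reference, a minimal TypeScript sketch of the formula above. The standalone function name `elu` and the default `alpha = 1.0` are illustrative assumptions, not this layer's actual API:

```ts
// Elementwise ELU: alpha * (exp(x) - 1) for x < 0, x otherwise.
// NOTE: illustrative sketch; not the library's layer implementation.
function elu(xs: number[], alpha: number = 1.0): number[] {
  return xs.map((x) => (x < 0 ? alpha * (Math.exp(x) - 1) : x));
}

console.log(elu([-2, -0.5, 0, 1, 3]));
// [-0.8646647..., -0.3934693..., 0, 1, 3]
// Negative inputs saturate toward -alpha; non-negative inputs pass through.
```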