Activations

Activation functions are available under nn.activations.

Usage

To apply an activation function to a layer's output, pass its name as the activation argument:

nn.Dense(units=64, activation='relu')

Or pass the function object directly:

nn.Dense(units=64, activation=nn.activations.relu)
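Both forms resolve to the same callable. As an illustration, a layer might resolve its `activation` argument roughly like the sketch below; the names `_ACTIVATIONS` and `resolve_activation` are hypothetical and not part of the nn API:

```python
import math

# Illustrative registry mapping activation names to functions
# (scalar versions for simplicity; real implementations act on tensors).
_ACTIVATIONS = {
    "relu": lambda x: max(0.0, x),
    "tanh": math.tanh,
}

def resolve_activation(activation):
    """Hypothetical resolver: accept a name, a callable, or None."""
    if activation is None:
        return lambda x: x           # no activation: identity
    if callable(activation):
        return activation            # function object: use as-is
    return _ACTIVATIONS[activation]  # string: look up by name

fn = resolve_activation("relu")
print(fn(-3.0))  # 0.0
```

This pattern is why the two usage forms above are interchangeable: the string is only a convenient alias for the function itself.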

Available Activations

  • relu
  • relu6
  • crelu
  • elu
  • selu
  • softplus
  • softsign
  • softmax
  • sigmoid
  • tanh
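For reference, most of the listed activations have simple closed-form definitions. The sketch below gives plain-Python scalar versions (the library versions operate elementwise on tensors; `crelu` and `tanh` are omitted here, and the `selu` constants are the standard published values):

```python
import math

def relu(x):
    return max(0.0, x)

def relu6(x):
    # relu capped at 6
    return min(max(0.0, x), 6.0)

def elu(x, alpha=1.0):
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # scaled ELU with the fixed self-normalizing constants
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))

def softplus(x):
    return math.log1p(math.exp(x))

def softsign(x):
    return x / (1.0 + abs(x))

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    # shift by the max for numerical stability
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(relu(-2.0), relu6(8.0), sigmoid(0.0))  # 0.0 6.0 0.5
```

Note that `softmax` differs from the others: it normalizes a whole vector so its entries sum to 1, rather than acting on each element independently.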

See Also