pyaibox.nn package

Submodules

pyaibox.nn.activations module

pyaibox.nn.activations.crelu(x)

Computes Concatenated ReLU.

Concatenates a ReLU that selects only the positive part of the activation with a ReLU that selects only the negative part. Note that, as a result, this non-linearity doubles the depth of the activations. Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units, W. Shang et al.

Parameters

x (list or array) – inputs

Returns

outputs

Return type

array
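A minimal NumPy sketch of the rule above (a hypothetical helper, not the package's implementation; the axis argument is an assumption, since the signature above takes only x):

    import numpy as np

    def crelu_sketch(x, axis=-1):
        # ReLU of the input and of its negation; concatenating the two
        # doubles the size along the chosen axis.
        x = np.asarray(x, dtype=float)
        return np.concatenate([np.maximum(x, 0), np.maximum(-x, 0)], axis=axis)

For example, crelu_sketch([-2.0, 3.0]) gives [0., 3., 2., 0.].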

pyaibox.nn.activations.elu(x)

Computes the exponential linear unit (ELU) element-wise.

\[y = \begin{cases} x, & x \ge 0 \\ e^x - 1, & x < 0 \end{cases} \]

See Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs).

Parameters

x (list or array) – inputs

Returns

outputs

Return type

array
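A NumPy sketch of this piecewise rule (illustrative only; the helper name is invented):

    import numpy as np

    def elu_sketch(x):
        x = np.asarray(x, dtype=float)
        # expm1 computes e^x - 1 accurately near zero; clamping the
        # argument keeps the unused positive branch from overflowing.
        return np.where(x >= 0, x, np.expm1(np.minimum(x, 0)))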

pyaibox.nn.activations.leaky_relu(x, alpha=0.2)

Computes the Leaky ReLU activation function.

\[y = \begin{cases} x, & x \ge 0 \\ \alpha x, & x < 0 \end{cases} \]

See Rectifier Nonlinearities Improve Neural Network Acoustic Models.

Parameters
  • x (list or array) – inputs

  • alpha (float) – slope \(\alpha\) applied to the negative part

Returns

outputs

Return type

array
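The piecewise definition is a single np.where call; a sketch under the same default alpha:

    import numpy as np

    def leaky_relu_sketch(x, alpha=0.2):
        x = np.asarray(x, dtype=float)
        # alpha scales the negative part; alpha=0 recovers plain ReLU
        return np.where(x >= 0, x, alpha * x)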

pyaibox.nn.activations.linear(x)

Linear (identity) activation.

\[y = x \]
Parameters

x (list or array) – inputs

Returns

outputs

Return type

array

pyaibox.nn.activations.relu(x)

Computes the rectified linear activation (ReLU).

\[y = \max(x, 0) \]
Parameters

x (list or array) – inputs

Returns

outputs

Return type

array
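The formula maps directly to NumPy (a sketch, not pyaibox's actual code):

    import numpy as np

    def relu_sketch(x):
        # element-wise max(x, 0)
        return np.maximum(np.asarray(x, dtype=float), 0)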

pyaibox.nn.activations.relu6(x)

Computes Rectified Linear 6 (ReLU6).

\[y = \min(\max(x, 0), 6) \]

See Convolutional Deep Belief Networks on CIFAR-10, A. Krizhevsky.

Parameters

x (list or array) – inputs

Returns

outputs

Return type

array
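In NumPy the min/max pair collapses to a single clip (illustrative sketch):

    import numpy as np

    def relu6_sketch(x):
        # clip(x, 0, 6) is min(max(x, 0), 6)
        return np.clip(np.asarray(x, dtype=float), 0, 6)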

pyaibox.nn.activations.selu(x)

Computes the scaled exponential linear unit (SELU).

\[y = \lambda \begin{cases} x, & x \ge 0 \\ \alpha (e^x - 1), & x < 0 \end{cases} \]

where \(\alpha = 1.6732632423543772848170429916717\) and \(\lambda = 1.0507009873554804934193349852946\). See Self-Normalizing Neural Networks.

Parameters

x (list or array) – inputs

Returns

outputs

Return type

array
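A sketch with the two constants quoted above (hypothetical helper, not the library's implementation):

    import numpy as np

    SELU_ALPHA = 1.6732632423543772848170429916717
    SELU_LAMBDA = 1.0507009873554804934193349852946

    def selu_sketch(x):
        x = np.asarray(x, dtype=float)
        # lambda scales both branches; expm1(x) computes e^x - 1
        return SELU_LAMBDA * np.where(x >= 0, x, SELU_ALPHA * np.expm1(np.minimum(x, 0)))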

pyaibox.nn.activations.sigmoid(x)

Sigmoid function.

\[y = \frac{e^x}{e^x + 1} \]
Parameters

x (list or array) – inputs

Returns

outputs

Return type

array
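Since e^x / (e^x + 1) equals 1 / (1 + e^{-x}), a one-line NumPy sketch suffices (illustrative only):

    import numpy as np

    def sigmoid_sketch(x):
        # algebraically identical to e^x / (e^x + 1)
        return 1.0 / (1.0 + np.exp(-np.asarray(x, dtype=float)))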

pyaibox.nn.activations.softplus(x)

Softplus function.

\[y = \log(e^x + 1) \]
Parameters

x (list or array) – inputs

Returns

outputs

Return type

array
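One overflow-safe way to evaluate log(e^x + 1) is np.logaddexp (a sketch, not necessarily how pyaibox computes it):

    import numpy as np

    def softplus_sketch(x):
        # logaddexp(0, x) = log(e^0 + e^x) = log(1 + e^x), stable for large x
        return np.logaddexp(0.0, np.asarray(x, dtype=float))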

pyaibox.nn.activations.softsign(x)

Softsign function.

\[y = \frac{x}{|x| + 1} \]
Parameters

x (list or array) – inputs

Returns

outputs

Return type

array
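A direct NumPy transcription of the formula (sketch only):

    import numpy as np

    def softsign_sketch(x):
        x = np.asarray(x, dtype=float)
        # output is bounded in (-1, 1)
        return x / (np.abs(x) + 1.0)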

pyaibox.nn.activations.swish(x, beta=1.0)

Swish function.

\[y = x \cdot {\rm sigmoid}(\beta x) = \frac{e^{\beta x}}{e^{\beta x} + 1} \cdot x \]

See “Searching for Activation Functions” (Ramachandran et al., 2017).

Parameters
  • x (list or array) – inputs

  • beta (float) – scaling factor \(\beta\)

Returns

outputs

Return type

array
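A minimal NumPy sketch of x times sigmoid(beta x); with beta = 1 this is also known as SiLU:

    import numpy as np

    def swish_sketch(x, beta=1.0):
        x = np.asarray(x, dtype=float)
        # x * sigmoid(beta * x)
        return x / (1.0 + np.exp(-beta * x))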

pyaibox.nn.activations.tanh(x)

Tanh function.

\[y = \tanh(x) = \frac{e^{2x} - 1}{e^{2x} + 1} \]
Parameters

x (list or array) – inputs

Returns

outputs

Return type

array
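NumPy provides this directly; the sketch below notes the identity used above:

    import numpy as np

    def tanh_sketch(x):
        # np.tanh(x) equals (e^{2x} - 1) / (e^{2x} + 1)
        return np.tanh(np.asarray(x, dtype=float))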

Module contents