torchbox.nn package
Submodules
torchbox.nn.activations module
- torchbox.nn.activations.crelu(x)
Computes Concatenated ReLU.
Concatenates a ReLU that selects only the positive part of the activation with a ReLU that selects only the negative part. Note that this non-linearity doubles the depth of the activations. Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units, W. Shang et al.
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor
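The concatenation above can be sketched in plain Python (a list-based illustration of the formula, not the torchbox implementation; `crelu_sketch` is a hypothetical name):

```python
# CReLU sketch: concatenate ReLU(x) with ReLU(-x), which doubles the
# length (depth) of the activation vector.
def crelu_sketch(xs):
    pos = [max(v, 0.0) for v in xs]   # positive part of x
    neg = [max(-v, 0.0) for v in xs]  # positive part of -x
    return pos + neg

print(crelu_sketch([1.0, -2.0]))  # [1.0, 0.0, 0.0, 2.0]
```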
- torchbox.nn.activations.elu(x)
Computes exponential linear element-wise.
\[y = \begin{cases} x, & x \ge 0 \\ e^x - 1, & x < 0 \end{cases}\]See Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor
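The piecewise rule above can be sketched for a scalar input (an illustration, not the torchbox implementation; `elu_sketch` is a hypothetical name):

```python
import math

# ELU sketch: identity for non-negative inputs, e^x - 1 otherwise.
def elu_sketch(x):
    return x if x >= 0 else math.exp(x) - 1.0

print(elu_sketch(2.0))   # 2.0
print(elu_sketch(-1.0))  # e^-1 - 1, about -0.632
```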
- torchbox.nn.activations.leaky_relu(x, alpha=0.2)
Computes the Leaky ReLU activation function.
\[y = \begin{cases} x, & x \ge 0 \\ \alpha x, & x < 0 \end{cases}\]See Rectifier Nonlinearities Improve Neural Network Acoustic Models
- Parameters:
x (list or Tensor) – inputs
alpha (float) – slope for negative inputs (default 0.2)
- Returns:
outputs
- Return type:
tensor
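A scalar sketch of the formula above (an illustration, not the torchbox implementation; `leaky_relu_sketch` is a hypothetical name):

```python
# Leaky ReLU sketch: pass positives through unchanged, scale negatives
# by alpha (default 0.2, matching the signature above).
def leaky_relu_sketch(x, alpha=0.2):
    return x if x >= 0 else alpha * x

print(leaky_relu_sketch(3.0))   # 3.0
print(leaky_relu_sketch(-5.0))  # -1.0
```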
- torchbox.nn.activations.linear(x)
Computes the linear (identity) activation.
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor
- torchbox.nn.activations.relu(x)
Computes the rectified linear unit (ReLU).
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor
- torchbox.nn.activations.relu6(x)
Computes Rectified Linear 6, i.e. ReLU capped at 6.
Source: Convolutional Deep Belief Networks on CIFAR-10, A. Krizhevsky
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor
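The cap at 6 can be sketched as a clamp (an illustration, not the torchbox implementation; `relu6_sketch` is a hypothetical name):

```python
# ReLU6 sketch: ReLU clipped at 6, i.e. y = min(max(x, 0), 6).
def relu6_sketch(x):
    return min(max(x, 0.0), 6.0)

print(relu6_sketch(8.0))   # 6.0
print(relu6_sketch(-3.0))  # 0.0
```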
- torchbox.nn.activations.selu(x)
Computes the scaled exponential linear unit (SELU).
\[y = \lambda \begin{cases} x, & x \ge 0 \\ \alpha (e^x - 1), & x < 0 \end{cases}\]where \(\lambda \approx 1.0507\) and \(\alpha \approx 1.6733\). See Self-Normalizing Neural Networks
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor
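A scalar sketch of SELU using the fixed constants from the Self-Normalizing Neural Networks paper (an illustration, not the torchbox implementation; `selu_sketch` is a hypothetical name):

```python
import math

# Fixed constants from the SELU paper.
SELU_LAMBDA = 1.0507009873554805
SELU_ALPHA = 1.6732632423543772

def selu_sketch(x):
    # Scale the ELU-style piecewise rule by lambda.
    return SELU_LAMBDA * (x if x >= 0 else SELU_ALPHA * (math.exp(x) - 1.0))

print(selu_sketch(1.0))  # equals lambda, about 1.0507
```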
- torchbox.nn.activations.sigmoid(x)
Computes the sigmoid function \(y = 1/(1 + e^{-x})\).
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor
- torchbox.nn.activations.softplus(x)
Computes the softplus function \(y = \ln(1 + e^x)\).
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor
- torchbox.nn.activations.softsign(x)
Computes the softsign function \(y = x/(1 + |x|)\).
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor
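Softsign is bounded in (-1, 1) and saturates more gradually than tanh; a scalar sketch (an illustration, not the torchbox implementation; `softsign_sketch` is a hypothetical name):

```python
# Softsign sketch: y = x / (1 + |x|).
def softsign_sketch(x):
    return x / (1.0 + abs(x))

print(softsign_sketch(1.0))   # 0.5
print(softsign_sketch(-3.0))  # -0.75
```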
- torchbox.nn.activations.swish(x, beta=1.0)
Computes the Swish function \(y = x \cdot \mathrm{sigmoid}(\beta x)\).
See “Searching for Activation Functions” (Ramachandran et al. 2017)
- Parameters:
x (list or Tensor) – inputs
beta (float) – scaling factor inside the sigmoid (default 1.0)
- Returns:
outputs
- Return type:
tensor
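A scalar sketch of Swish (an illustration, not the torchbox implementation; `swish_sketch` is a hypothetical name):

```python
import math

# Swish sketch: y = x * sigmoid(beta * x), with beta defaulting to 1.0
# as in the signature above.
def swish_sketch(x, beta=1.0):
    return x * (1.0 / (1.0 + math.exp(-beta * x)))

print(swish_sketch(0.0))  # 0.0
```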
- torchbox.nn.activations.tanh(x)
Computes the hyperbolic tangent function.
- Parameters:
x (list or Tensor) – inputs
- Returns:
outputs
- Return type:
tensor