Activations
Activation functions for neural networks.
This module provides custom activation functions that extend the standard PyTorch activation functions with additional functionality and flexibility.
Classes:
- GeneralRelu : nn.Module – A generalized Rectified Linear Unit with configurable leaky slope, output subtraction, and maximum value clipping.
Notes
All activation functions in this module are designed to be compatible with PyTorch's nn.Module interface and can be used as drop-in replacements for standard activation functions in neural network architectures.
Examples:
>>> import torch
>>> from cmn_ai.activations import GeneralRelu
>>>
>>> # Create a generalized ReLU with custom parameters
>>> act = GeneralRelu(leak=0.1, sub=0.4, maxv=6.0)
>>>
>>> # Apply to input tensor
>>> x = torch.tensor([-2.0, -0.5, 0.5, 2.0])
>>> output = act(x)
>>> print(output)
tensor([-0.6000, -0.4500, 0.1000, 1.6000])
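>>> # As noted above, GeneralRelu is an nn.Module and can stand in for a
>>> # standard activation inside a model. A minimal sketch; the layer sizes
>>> # below are illustrative only and not part of the library.
>>> from torch import nn
>>> model = nn.Sequential(
...     nn.Linear(16, 32),
...     GeneralRelu(leak=0.1, sub=0.4, maxv=6.0),
...     nn.Linear(32, 1),
... )
>>> out = model(torch.randn(4, 16))
>>> out.shape
torch.Size([4, 1])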
GeneralRelu
Bases: Module
A generalized Rectified Linear Unit (ReLU) activation function with optional leaky slope, output subtraction, and maximum value clipping.
Parameters:
- leak (float, default: 0.1) – Negative slope for values less than zero, similar to LeakyReLU. If None, standard ReLU behavior is used (all negative values are set to 0).
- sub (float, default: 0.4) – A constant value to subtract from the activation output after applying ReLU/LeakyReLU. If None, no subtraction is applied.
- maxv (float, default: None) – Maximum value to clip the activation output to. If None (default), no clipping is applied.
Attributes:
- leak (float or None) – The negative slope applied to negative inputs.
- sub (float or None) – The value subtracted from the activation output.
- maxv (float or None) – The upper bound for output clipping.
Methods:
- forward – Applies the configured activation transformation to the input tensor.
Examples:
>>> import torch
>>> act = GeneralRelu(leak=0.1, sub=0.4, maxv=6.0)
>>> x = torch.tensor([-2.0, -0.5, 0.5, 2.0])
>>> act(x)
tensor([-0.6000, -0.4500, 0.1000, 1.6000])
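>>> # The values above can be reproduced element by element with plain tensor
>>> # operations; this is just arithmetic under the documented parameters,
>>> # not the library's own implementation.
>>> y = torch.where(x < 0, x * 0.1, x)  # leaky slope on negatives: [-0.2, -0.05, 0.5, 2.0]
>>> y = y - 0.4                         # subtract the constant: [-0.6, -0.45, 0.1, 1.6]
>>> y = y.clamp(max=6.0)                # clip at 6.0 (no effect here)
>>> torch.allclose(y, act(x))
True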
forward(x)
Apply the generalized ReLU activation.
Parameters:
- x (Tensor) – Input tensor.
Returns:
- Tensor – Activated tensor, possibly leaky for negative values, with optional subtraction and clipping applied.
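For reference, a minimal sketch of how forward could implement the behavior documented above. The class name GeneralReluSketch and the exact composition (F.leaky_relu followed by subtraction and clamping) are assumptions for illustration and may differ from the actual cmn_ai source.

    import torch.nn.functional as F
    from torch import Tensor, nn

    class GeneralReluSketch(nn.Module):
        # Hypothetical re-implementation for illustration only.
        def __init__(self, leak=0.1, sub=0.4, maxv=None):
            super().__init__()
            self.leak, self.sub, self.maxv = leak, sub, maxv

        def forward(self, x: Tensor) -> Tensor:
            # Leaky or plain ReLU depending on whether a slope was configured
            x = F.leaky_relu(x, self.leak) if self.leak is not None else F.relu(x)
            if self.sub is not None:
                x = x - self.sub            # constant shift of the output
            if self.maxv is not None:
                x = x.clamp(max=self.maxv)  # upper clipping
            return x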