scimba_torch.neural_nets.coordinates_based_nets.discontinuous_mlp
A Multi-Layer Perceptron (MLP) with discontinuous layers.
Each hidden layer can either be discontinuous or regular.
Classes

DiscontinuousLayer – Class that encodes a fully connected layer which can be discontinuous or not.

DiscontinuousMLP – A Multi-Layer Perceptron (MLP) with discontinuous layers.
- class DiscontinuousLayer(in_size, out_size, **kwargs)

  Bases: Module

  Class that encodes a fully connected layer which can be discontinuous or not.

  It computes \(y = \sigma(Ax + b) + \epsilon \odot H(Ax + b)\), where \(\sigma\) is the activation function, \(H\) is the Heaviside step function applied elementwise, and \(\epsilon\) is a learnable vector.
- Parameters:
  in_size (int) – The input dimension size.
  out_size (int) – The output dimension size.
  **kwargs – Keyword arguments including:
    activation_type (str): The activation function type. Defaults to “tanh”.
    dis (bool): If True, the layer includes the discontinuous term; otherwise it behaves as a regular layer. Defaults to True.
Example
>>> layer = DiscontinuousLayer(10, 5, activation_type='relu', dis=True)
- linearlayer
The linear transformation applied to the inputs.
- eps
The learnable parameters that multiply the Heaviside term. Its size equals the output size of the layer.
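For illustration, a minimal sketch of the computation above. This is a toy reimplementation under stated assumptions, not the library's actual code; the class name ToyDiscontinuousLayer and the tanh default are hypothetical.

import torch
from torch import nn

class ToyDiscontinuousLayer(nn.Module):
    """Toy version of y = sigma(Ax + b) + eps * H(Ax + b)."""

    def __init__(self, in_size, out_size, dis=True):
        super().__init__()
        self.linearlayer = nn.Linear(in_size, out_size)  # A and b
        self.eps = nn.Parameter(torch.zeros(out_size))   # learnable epsilon, one per output
        self.dis = dis

    def forward(self, x):
        z = self.linearlayer(x)
        y = torch.tanh(z)  # smooth part sigma(Ax + b)
        if self.dis:
            # Heaviside step, treated as a constant in the backward pass;
            # gradients still reach eps (and A, b through the smooth term).
            step = (z > 0).to(z.dtype)
            y = y + self.eps * step
        return y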
- class DiscontinuousMLP(in_size, out_size, **kwargs)

  Bases: ScimbaModule

  A Multi-Layer Perceptron (MLP) with discontinuous layers. Each hidden layer can either be discontinuous or regular.
- Parameters:
  in_size (int) – Input dimension.
  out_size (int) – Output dimension.
  **kwargs – Keyword arguments including:
    activation_type (str): The type of activation function used for the hidden layers. Defaults to “tanh”.
    activation_output (str): The type of activation function for the output layer. Defaults to “id”.
    layer_sizes (list[int]): List of sizes for each hidden layer. Defaults to [10, 20, 20, 20, 5].
    layer_type (list[bool]): List of booleans indicating whether each hidden layer should be discontinuous. Defaults to [False, False, True, False, False].
- Raises:
ValueError – If layer_sizes and layer_type lists have different lengths.
Example
>>> model = DiscontinuousMLP(
...     10, 5, activation_type="relu", activation_output="tanh",
...     layer_sizes=[50, 30], layer_type=[False, True]
... )
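As a sketch of how the two lists are assumed to pair up (hidden layer i maps widths[i] to widths[i + 1], and layer_type[i] marks it discontinuous; this illustrates the length constraint above, it is not the constructor's actual code):

in_size, out_size = 10, 5
layer_sizes = [50, 30]
layer_type = [False, True]

widths = [in_size] + layer_sizes  # [10, 50, 30]
for i, dis in enumerate(layer_type):
    print(f"hidden layer {i}: {widths[i]} -> {widths[i + 1]}, discontinuous={dis}")
print(f"output layer: {widths[-1]} -> {out_size}")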
The list of discontinuous or regular layers in the model.
- output_layer
The final output layer.
- forward(inputs, with_last_layer=True)
Forward pass through the discontinuous MLP network.
- Parameters:
  inputs (Tensor) – Input tensor.
  with_last_layer (bool) – Whether to apply the final output layer. Defaults to True.
- Return type:
  Tensor
- Returns:
  Output tensor after processing through the MLP.
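An end-to-end usage sketch, assuming the import path in this page's title and the constructor signature documented above; the batch size and shapes are illustrative.

import torch
from scimba_torch.neural_nets.coordinates_based_nets.discontinuous_mlp import (
    DiscontinuousMLP,
)

model = DiscontinuousMLP(
    10, 5, activation_type="relu", activation_output="tanh",
    layer_sizes=[50, 30], layer_type=[False, True],
)
x = torch.rand(128, 10)                      # batch of 128 inputs in R^10
y = model(x)                                 # full forward pass, shape (128, 5)
h = model.forward(x, with_last_layer=False)  # hidden features, shape (128, 30)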