scimba_torch.neural_nets.coordinates_based_nets.res_net¶
Residual Network (ResNet) architectures.
Classes

GenericResNet | A general Residual Network (ResNet) architecture.
- class GenericResNet(in_size, out_size, **kwargs)[source]¶
Bases: ScimbaModule

A general Residual Network (ResNet) architecture.
The layer structure is defined by the layer_structure parameter, which specifies the width, depth, and skip connections. layer_structure is a list where:
the first element is the width of the hidden layers,
the second element is the number of layers,
the remaining elements are pairs of integers representing the skip connections.
- For instance, the default value [10, 6, [1, 3], [4, 6]] means:
10 hidden units in each layer,
6 layers,
skip connection from layer 1 to layer 3,
skip connection from layer 4 to layer 6.
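As an illustration, the decomposition of such a layer_structure list can be sketched with a small helper (hypothetical, not part of the scimba_torch API):

```python
def parse_layer_structure(layer_structure):
    """Split a layer_structure list into (width, depth, skip pairs).

    Hypothetical helper illustrating the convention described above:
    [width, depth, *skip_pairs].
    """
    width, depth, *skips = layer_structure
    for start, end in skips:
        # A skip must go forward and stay within the network depth.
        assert 1 <= start < end <= depth, "invalid skip connection"
    return width, depth, skips

# The default structure: 10 units, 6 layers, skips 1->3 and 4->6.
width, depth, skips = parse_layer_structure([10, 6, [1, 3], [4, 6]])
```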
- Parameters:
in_size (int) – Dimension of the input
out_size (int) – Dimension of the output
**kwargs – Additional keyword arguments:
activation_type (str, default="tanh"): The activation function type to use in hidden layers.
activation_output (str, default="id"): The activation function type to use in the output layer.
layer_structure (list, default=[10, 6, [1, 3], [4, 6]]): A list representing the layer structure of the ResNet.
weights_norm_bool (bool, default=False): If True, applies weight normalization to the layers.
Example

>>> model = GenericResNet(
...     4, 1, activation_type='tanh',
...     layer_structure=[20, 6, [1, 3], [4, 6]]
... )
A list of hidden linear layers.
- output_layer¶
The final output linear layer.
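The attributes above suggest a forward pass that threads the input through the hidden layers while adding each recorded skip connection. A minimal NumPy sketch of this idea (an illustrative reimplementation, not the library's code; adding the residual after the activation is an assumption of this sketch):

```python
import numpy as np

def resnet_forward(x, weights, biases, skips):
    """Illustrative sketch: tanh hidden layers with additive skips.

    `skips` is a list of [start, end] pairs, as in layer_structure:
    the output of layer `start` is added to the output of layer `end`
    (here after the activation, an assumption for this sketch).
    """
    outputs = {}
    h = x
    for idx, (W, b) in enumerate(zip(weights, biases), start=1):
        h = np.tanh(W @ h + b)          # hidden layer with tanh activation
        for start, end in skips:
            if end == idx:
                h = h + outputs[start]  # residual (skip) addition
        outputs[idx] = h
    return h

# Toy usage: 3 hidden layers of width 2, skip from layer 1 to layer 3.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((2, 2)) for _ in range(3)]
biases = [np.zeros(2) for _ in range(3)]
y = resnet_forward(np.ones(2), weights, biases, skips=[[1, 3]])
```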