Noise layers

class lasagne.layers.DropoutLayer(incoming, p=0.5, rescale=True, shared_axes=(), **kwargs)

Dropout layer

Sets values to zero with probability p. See notes for disabling dropout during testing.

Parameters:
incoming : a Layer instance or a tuple

the layer feeding into this layer, or the expected input shape

p : float or scalar tensor

The probability of setting a value to zero

rescale : bool

If True (the default), scale the input by 1 / (1 - p) when dropout is enabled, to keep the expected output mean the same.

shared_axes : tuple of int

Axes to share the dropout mask over. By default, each value can be dropped individually. shared_axes=(0,) uses the same mask across the batch. shared_axes=(2, 3) uses the same mask across the spatial dimensions of 2D feature maps.

See also

dropout_channels
Drops full channels of feature maps
spatial_dropout
Alias for dropout_channels()
dropout_locations
Drops full pixels or voxels of feature maps

Notes

The dropout layer is a regularizer that randomly sets input values to zero; see [1], [2] for why this might improve generalization.

The behaviour of the layer depends on the deterministic keyword argument passed to lasagne.layers.get_output(). If True, the layer behaves deterministically, and passes on the input unchanged. If False or not specified, dropout (and possibly scaling) is enabled. Usually, you would use deterministic=False at train time and deterministic=True at test time.

References

[1] Hinton, G., Srivastava, N., Krizhevsky, A., Sutskever, I., Salakhutdinov, R. R. (2012): Improving neural networks by preventing co-adaptation of feature detectors. arXiv preprint arXiv:1207.0580.
[2] Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. R. (2014): Dropout: A Simple Way to Prevent Neural Networks from Overfitting. Journal of Machine Learning Research, 15(Jun), 1929-1958.
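
A minimal usage sketch (layer sizes are arbitrary): dropout is active in the training expression and disabled in the test expression, and shared_axes ties the dropout mask across the given dimensions.

>>> from lasagne.layers import InputLayer, DenseLayer, DropoutLayer
>>> from lasagne.layers import get_output
>>> l_in = InputLayer((None, 100))
>>> l_hid = DenseLayer(l_in, num_units=200)
>>> l_drop = DropoutLayer(l_hid, p=0.5)
>>> train_out = get_output(l_drop)  # dropout enabled (deterministic=False)
>>> test_out = get_output(l_drop, deterministic=True)  # input passed through unchanged
>>> l_shared = DropoutLayer(l_hid, p=0.5, shared_axes=(0,))  # one mask for the whole batch
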
lasagne.layers.dropout

alias of DropoutLayer

lasagne.layers.dropout_channels(incoming, *args, **kwargs)

Convenience function to drop full channels of feature maps.

Adds a DropoutLayer that sets feature map channels to zero, across all locations, with probability p. For convolutional neural networks, this may give better results than independent dropout [1].

Parameters:
incoming : a Layer instance or a tuple

the layer feeding into this layer, or the expected input shape

*args, **kwargs

Any additional arguments and keyword arguments are passed on to the DropoutLayer constructor, except for shared_axes.

Returns:
layer : DropoutLayer instance

The dropout layer with shared_axes set to drop channels.

References

[1] J. Tompson, R. Goroshin, A. Jain, Y. LeCun, C. Bregler (2014): Efficient Object Localization Using Convolutional Networks. https://arxiv.org/abs/1411.4280
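
A sketch of typical use after a convolution (shapes and filter counts are arbitrary); for 4D inputs, this should be equivalent to a DropoutLayer with its mask shared over the spatial axes:

>>> from lasagne.layers import InputLayer, Conv2DLayer, DropoutLayer
>>> from lasagne.layers import dropout_channels
>>> l_in = InputLayer((None, 3, 32, 32))
>>> l_conv = Conv2DLayer(l_in, num_filters=16, filter_size=3)
>>> l_cd = dropout_channels(l_conv, p=0.5)  # drops whole feature maps
>>> l_eq = DropoutLayer(l_conv, p=0.5, shared_axes=(2, 3))  # same masking for 4D inputs
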
lasagne.layers.spatial_dropout(incoming, *args, **kwargs)

alias of dropout_channels()

lasagne.layers.dropout_locations(incoming, *args, **kwargs)

Convenience function to drop full locations of feature maps.

Adds a DropoutLayer that sets feature map locations (i.e., pixels or voxels) to zero, across all channels, with probability p.

Parameters:
incoming : a Layer instance or a tuple

the layer feeding into this layer, or the expected input shape

*args, **kwargs

Any additional arguments and keyword arguments are passed on to the DropoutLayer constructor, except for shared_axes.

Returns:
layer : DropoutLayer instance

The dropout layer with shared_axes set to drop locations.
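
A sketch analogous to the one for dropout_channels() (shapes are arbitrary); here the mask is shared over the channel axis instead:

>>> from lasagne.layers import InputLayer, Conv2DLayer, DropoutLayer
>>> from lasagne.layers import dropout_locations
>>> l_in = InputLayer((None, 3, 32, 32))
>>> l_conv = Conv2DLayer(l_in, num_filters=16, filter_size=3)
>>> l_ld = dropout_locations(l_conv, p=0.5)  # zeroes a pixel in all channels at once
>>> l_eq = DropoutLayer(l_conv, p=0.5, shared_axes=(1,))  # same masking
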

class lasagne.layers.GaussianNoiseLayer(incoming, sigma=0.1, **kwargs)

Gaussian noise layer.

Add zero-mean Gaussian noise of given standard deviation to the input [1].

Parameters:
incoming : a Layer instance or a tuple

the layer feeding into this layer, or the expected input shape

sigma : float or tensor scalar

Standard deviation of added Gaussian noise

Notes

The Gaussian noise layer is a regularizer. As with DropoutLayer, pass deterministic=False (the default) to lasagne.layers.get_output() at train time to enable the noise, and deterministic=True at test time to disable it.

References

[1] K.-C. Jim, C. Giles, and B. Horne (1996): An analysis of noise in recurrent neural networks: convergence and generalization. IEEE Transactions on Neural Networks, 7(6):1424-1438.
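
A minimal usage sketch (sizes are arbitrary), following the same deterministic convention as DropoutLayer:

>>> from lasagne.layers import InputLayer, GaussianNoiseLayer, DenseLayer
>>> from lasagne.layers import get_output
>>> l_in = InputLayer((None, 100))
>>> l_noise = GaussianNoiseLayer(l_in, sigma=0.1)
>>> l_out = DenseLayer(l_noise, num_units=10)
>>> train_out = get_output(l_out)  # noise added (deterministic=False)
>>> test_out = get_output(l_out, deterministic=True)  # noise disabled
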
get_output_for(input, deterministic=False, **kwargs)
Parameters:
input : tensor

output from the previous layer

deterministic : bool

If True, noise is disabled; see notes