This package provides an easy way to build and train simple or complex neural networks.
A network is built from Modules, and several sub-classes of Module are available: container classes like
Sequential, Parallel and Concat, which can contain simple layers like
Linear, Mean, Max and Reshape, as well as convolutional layers and transfer
functions like Tanh.
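For example, a small multi-layer perceptron can be assembled by adding layers and transfer functions to a Sequential container. The following is a minimal sketch; the layer sizes used here (10 inputs, 25 hidden units, 1 output) are arbitrary.

```lua
require 'nn'

-- Containers hold layers and transfer functions in order.
local mlp = nn.Sequential()        -- chains its modules one after another
mlp:add(nn.Linear(10, 25))         -- fully connected layer: 10 inputs, 25 hidden units
mlp:add(nn.Tanh())                 -- transfer function
mlp:add(nn.Linear(25, 1))          -- output layer with a single unit

-- Run a random input through the network.
local output = mlp:forward(torch.randn(10))
print(output)
```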
Loss functions are implemented as sub-classes of Criterion. They are helpful for training a neural network on classical tasks.
Common criterions are the Mean Squared Error criterion, implemented in MSECriterion,
and the cross-entropy criterion, implemented in ClassNLLCriterion.
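A criterion compares a network output to a target: forward returns a scalar loss, and backward returns the gradient of that loss with respect to the output. Below is a minimal sketch with MSECriterion, using random tensors in place of a real network output and target.

```lua
require 'nn'

local criterion = nn.MSECriterion()
local output = torch.randn(5)      -- stand-in for a network output
local target = torch.randn(5)      -- desired values

local loss = criterion:forward(output, target)         -- scalar loss value
local gradOutput = criterion:backward(output, target)  -- gradient w.r.t. the output
print(loss)
```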
Finally, the StochasticGradient class provides a
high-level way to train a neural network of your choice, even though it is
easy to train a neural network yourself with a simple for loop.
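The sketch below shows both options: training with StochasticGradient, and the equivalent manual loop of forward, backward and parameter update. The toy dataset, layer sizes and learning rate are illustrative only; StochasticGradient expects a dataset object with a size() method and indexed {input, target} pairs.

```lua
require 'nn'

-- Toy dataset in the format StochasticGradient expects:
-- dataset:size() gives the number of examples, dataset[i] is {input, target}.
local dataset = {}
function dataset:size() return 100 end
for i = 1, dataset:size() do
  local input = torch.randn(10)
  local target = torch.Tensor(1):fill(input:sum() > 0 and 1 or -1)
  dataset[i] = {input, target}
end

local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 25))
mlp:add(nn.Tanh())
mlp:add(nn.Linear(25, 1))

local criterion = nn.MSECriterion()

-- Option 1: high-level training with StochasticGradient.
local trainer = nn.StochasticGradient(mlp, criterion)
trainer.learningRate = 0.01
trainer:train(dataset)

-- Option 2: the equivalent manual loop over one epoch.
for i = 1, dataset:size() do
  local input, target = dataset[i][1], dataset[i][2]
  criterion:forward(mlp:forward(input), target)            -- forward pass and loss
  mlp:zeroGradParameters()                                  -- reset accumulated gradients
  mlp:backward(input, criterion:backward(mlp.output, target)) -- backpropagate
  mlp:updateParameters(0.01)                                -- gradient descent step
end
```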