Sequential provides a means to plug layers together in a feed-forward manner: the output of each module is passed as input to the next. E.g. creating a multi-layer perceptron with one hidden layer is as easy as:
require "nn"
require "lab"

mlp = nn.Sequential()
mlp:add( nn.Linear(10, 25) ) -- 10 inputs, 25 hidden units
mlp:add( nn.Tanh() )         -- hyperbolic tangent transfer function
mlp:add( nn.Linear(25, 1) )  -- 1 output

print(mlp:forward(lab.randn(10)))

which gives the output:
-0.1815
[torch.Tensor of dimension 1]