Selects one index along a given dimension of an n x p x q x ... Tensor, returning the corresponding (n-1)-dimensional slice.
Example:

```lua
mlp = nn.Sequential()
mlp:add(nn.Select(1, 3))

require "lab"
x = lab.randn(10, 5)
print(x)
print(mlp:forward(x))
```

gives the output:

```
 0.9720 -0.0836  0.0831 -0.2059 -0.0871
 0.8750 -2.0432 -0.1295 -2.3932  0.8168
 0.0369  1.1633  0.6483  1.2862  0.6596
 0.1667 -0.5704 -0.7303  0.3697 -2.2941
 0.4794  2.0636  0.3502  0.3560 -0.5500
-0.1898 -1.1547  0.1145 -1.1399  0.1711
-1.5130  1.4445  0.2356 -0.5393 -0.6222
-0.6587  0.4314  1.1916 -1.4509  1.9400
 0.2733  1.0911  0.7667  0.4002  0.1646
 0.5804 -0.5333  1.1621  1.5683 -0.1978
[torch.Tensor of dimension 10x5]

 0.0369
 1.1633
 0.6483
 1.2862
 0.6596
[torch.Tensor of dimension 5]
```
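For intuition, the forward pass of `nn.Select(1, 3)` picks out the same slice as the Tensor `select` method; a minimal sketch (assuming the `lab`-era `Tensor:select(dim, index)` API):

```lua
require "lab"
x = lab.randn(10, 5)

-- Both of these yield the values of the third row of x:
a = nn.Select(1, 3):forward(x)  -- module output (a 5-element Tensor)
b = x:select(1, 3)              -- raw Tensor slice along dimension 1
print(a)
print(b)
```

The difference is that `nn.Select` is a module, so it can sit inside a `nn.Sequential` or `nn.Concat` and participate in backpropagation.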
This can be used in conjunction with Concat to emulate the behavior of Parallel, or to select various parts of an input Tensor on which to perform operations. Here is a fairly complicated example:
```lua
require "lab"

mlp = nn.Sequential()
c = nn.Concat(2)
for i = 1, 10 do
   local t = nn.Sequential()
   t:add(nn.Select(1, i))
   t:add(nn.Linear(3, 2))
   t:add(nn.Reshape(2, 1))
   c:add(t)
end
mlp:add(c)

pred = mlp:forward(lab.randn(10, 3))
print(pred)

for i = 1, 10000 do      -- Train for a few iterations
   x = lab.randn(10, 3)
   y = lab.ones(2, 10)
   pred = mlp:forward(x)

   criterion = nn.MSECriterion()
   err = criterion:forward(pred, y)
   gradCriterion = criterion:backward(pred, y)

   mlp:zeroGradParameters()
   mlp:backward(x, gradCriterion)
   mlp:updateParameters(0.01)
   print(err)
end
```