module = SplitTable(dimension)
Creates a module that takes a Tensor as input and outputs a table of Tensors, splitting the input Tensor along the specified dimension dimension.
Example 1:
require "lab" mlp=nn.SplitTable(2) x=lab.randn(4,3) pred=mlp:forward(x) for i,k in pairs(pred) do print(i,k); endgives the output:
1 1.3885 1.3295 0.4281 -1.0171 [torch.Tensor of dimension 4] 2 -1.1565 -0.8556 -1.0717 -0.8316 [torch.Tensor of dimension 4] 3 -1.3678 -0.1709 -0.0191 -2.5871 [torch.Tensor of dimension 4]
Example 2:
require "lab" mlp=nn.SplitTable(1) pred=mlp:forward(lab.randn(10,3)) for i,k in pairs(pred) do print(i,k); endgives the output:
1 1.6114 0.9038 0.8419 [torch.Tensor of dimension 3] 2 2.4742 0.2208 1.6043 [torch.Tensor of dimension 3] 3 1.3415 0.2984 0.2260 [torch.Tensor of dimension 3] 4 2.0889 1.2309 0.0983 [torch.Tensor of dimension 3]
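Splitting along dimension 1 of the 10x3 input produces ten Tensors of dimension 3 (only the first four are shown above). SplitTable itself has no trainable parameters; in the backward pass it simply stacks the table of output gradients back into a single Tensor with the shape of the input. A minimal sketch, assuming the same lab/nn environment as the examples above:

require "lab"
m=nn.SplitTable(1)
x=lab.randn(10,3)
out=m:forward(x)           -- table of 10 Tensors of dimension 3
-- reuse the outputs as gradients, just to check the shapes
gradInput=m:backward(x, out)
print(gradInput:size())    -- 10x3, the same size as the input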
A more complicated example:
require "lab" mlp=nn.Sequential(); --Create a network that takes a Tensor as input mlp:add(nn.SplitTable(2)) c=nn.ParallelTable() --The two Tensors go through two different Linear c:add(nn.Linear(10,3)) --Layers in Parallel c:add(nn.Linear(10,7)) mlp:add(c) --Outputing a table with 2 elements p=nn.ParallelTable() --These tables go through two more linear layers p:add(nn.Linear(3,2)) -- separately. p:add(nn.Linear(7,1)) mlp:add(p) mlp:add(nn.JoinTable(1)) --Finally, the tables are joined together and output. pred=mlp:forward(lab.randn(10,2)) print(pred) for i=1,100 do -- A few steps of training such a network.. x=lab.ones(10,2); y=torch.Tensor(3); y:copy(x:select(2,1,1):narrow(1,1,3)) pred=mlp:forward(x) criterion= nn.MSECriterion() local err=criterion:forward(pred,y) local gradCriterion = criterion:backward(pred,y); mlp:zeroGradParameters(); mlp:backward(x, gradCriterion); mlp:updateParameters(0.05); print(err) end