This package implements a new module, nn.DAG, which inherits from nn.Container and allows combining modules into an arbitrary acyclic graph.
A typical use would be:
```lua
require 'nn'
-- (plus this package, which defines nn.DAG)

model = nn.DAG()

-- b, c, d, and e were not defined in the original snippet; the modules
-- below are plausible choices consistent with the graph that follows
a = nn.Linear(100, 10)
b = nn.ReLU()
c = nn.Linear(10, 15)
d = nn.Tanh()
e = nn.Tanh()

model:connect(a, b, c)
model:connect(b, nn.Linear(10, 15), nn.ReLU(), d)
model:connect(c, nn.Mul(-1), e)

model:setInput(a)
model:setOutput({ d, e })

input = torch.Tensor(30, 100):uniform()
output = model:updateOutput(input)
```
which would encode the following graph
```
                 +--> Linear(10, 15) --> ReLU --> d -->
                /
  --> a --> b -+
                \
                 +--> c --> Mul(-1) --> e -->
```
and run a forward pass with a random batch of 30 samples.
Note that DAG:connect allows adding several edges at once. This is particularly useful for chaining anonymous modules that each have a single predecessor and a single successor.
If a node has a single predecessor, it receives that predecessor's output unchanged as its input. If it has multiple predecessors, their outputs are collected into a table, and that table is used as the node's input. The indices of the outputs in that table reflect the order of the DAG:connect() calls.
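As a sketch of the multiple-predecessor case (the node f and the choice of nn.CAddTable are hypothetical, not part of the example above), a node fed by two connect() calls receives a table of two outputs:

```lua
-- hypothetical extension: f receives { output of d, output of e }
-- as its input, in the order of the two connect() calls below
f = nn.CAddTable()
model:connect(d, f)
model:connect(e, f)
```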
The expected input (respectively the produced output) is a nested table of tensors reflecting the structure of the nested table of modules provided to DAG:setInput (respectively DAG:setOutput).
So for instance, in the example above, the model expects a tensor as input, since it is the input to the module a, and its output is a table of two tensors, corresponding to the outputs of d and e respectively.
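Conversely, a DAG whose input is declared as a table of nodes expects a table of tensors. A minimal sketch (all module names here are hypothetical, not taken from the example above):

```lua
dag = nn.DAG()

g = nn.Linear(5, 10)
h = nn.Linear(7, 10)
s = nn.CAddTable()

dag:connect(g, s)
dag:connect(h, s)

dag:setInput({ g, h })  -- input must be { tensor of size 5, tensor of size 7 }
dag:setOutput(s)        -- output is a single tensor of size 10

y = dag:updateOutput({ torch.randn(5), torch.randn(7) })
```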
*Francois Fleuret, Jan 13th, 2017*