# pytorch-containers

This repository aims to help former Torchies more seamlessly transition to the "Containerless" world of
[PyTorch](https://github.com/pytorch/pytorch)
by providing a list of PyTorch implementations of [Torch Table Layers](https://github.com/torch/nn/blob/master/doc/table.md).
...flexibility as your architectures become more complex, and it's also a lot easier than
remembering the exact functionality of ConcatTable, or any of the other tables for that matter.

Two other things to note:
- To work with autograd, we must wrap our input in a `Variable` (we can also pass a Python iterable of Variables)
- PyTorch requires us to add a batch dimension, which is why we call `.unsqueeze(0)` on the input; both points are sketched below
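A minimal sketch of both points together (the shape is chosen arbitrarily for illustration):

```Python
import torch
from torch.autograd import Variable

x = torch.randn(5)            # a single 5-dimensional input
x = Variable(x.unsqueeze(0))  # add the batch dimension -> shape (1, 5), then wrap for autograd
```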
```Python
class TableModule(nn.Module):
    def __init__(self):
        super(TableModule, self).__init__()

    def forward(self, x1, x2):
        x_sum = x1 + x2  # could use sum() if the input is given as a Python iterable
        x_sub = x1 - x2
        x_div = x1 / x2
        x_mul = x1 * x2
        ...
```
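As the inline comment hints, when the inputs arrive as a Python iterable, the built-in `sum()` can do the elementwise addition for us; a small sketch with two equally shaped Variables:

```Python
import torch
from torch.autograd import Variable

inputs = [Variable(torch.ones(1, 5)), Variable(torch.ones(1, 5) * 2)]
x_sum = sum(inputs)  # elementwise: equivalent to inputs[0] + inputs[1]
```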
## Intuitively Build Complex Architectures

Now we will visit a more complex example that combines several of the above operations.
The graph below is a random network that I created using the Torch [nngraph](https://github.com/torch/nngraph) package. The Torch model definition using nngraph can be found [here](https://github.com/amdegroot/pytorch-containers/blob/master/complex_graph.lua) and a raw Torch implementation can be found [here](https://github.com/amdegroot/pytorch-containers/blob/master/complex_net.lua) for comparison to the PyTorch code that follows.
```Python
    def forward(self, x):
        """
        Here we see that autograd allows us to safely reuse Variables in
        defining the computational graph. We could also reuse Modules or even
        use loops or conditional statements.

        Note: Some of this could be condensed, but it is laid out the way it
        is for clarity.
        """
        x = self.net1(x)
        x1 = self.branch1(x)                 # SplitTable (implicitly)
        y = self.net2(x)
        x2 = self.branch2(y)                 # SplitTable (implicitly)
        x3 = self.net3(y).view(-1)
        output = torch.cat((x1, x2, x3), 0)  # JoinTable
        return output
```
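To make the docstring's point about loops and conditionals concrete, here is a purely illustrative forward pass (the `self.shared` module is hypothetical, not part of this network) showing that the graph is rebuilt on every call, so Python control flow and module reuse come for free:

```Python
def forward(self, x):
    x = self.shared(x)      # first application of a shared module
    if x.data.norm() > 1:   # data-dependent branch, re-evaluated on each forward pass
        x = self.shared(x)  # the same module (and the same weights) reused
    return x
```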
This is a loop to define our VGG conv layers, derived from [pytorch/vision](https://github.com/pytorch/vision/blob/master/torchvision/models/vgg.py) (maybe a little overkill for our small case):

```Python
def make_layers(params, ch):
    layers = []
    channels = ch
    ...

net = ComplexNet(make_layers([64, 64], 3), make_layers([128, 128], 64))
```
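The loop body is not shown above (`...`); assuming it follows the torchvision VGG pattern it is derived from, the builder would look roughly like this (a sketch, not necessarily the repo's exact code):

```Python
import torch.nn as nn

def make_layers(params, ch):
    layers = []
    channels = ch
    for p in params:  # each entry in params is an output channel count
        layers += [nn.Conv2d(channels, p, kernel_size=3, padding=1),
                   nn.ReLU(inplace=True)]
        channels = p
    return nn.Sequential(*layers)
```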
This documented Python code can be found [here](https://github.com/amdegroot/pytorch-containers/blob/master/complex_net.py).