Carl Vondrick edited this page Sep 12, 2016 · 1 revision

Network

How to access intermediate activations

net:forward(input) -- run a forward pass first to populate each module's .output
net.modules[LAYER].output
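As a minimal sketch, assuming the `nn` package and a toy `nn.Sequential` (the layer sizes here are arbitrary):

```lua
require 'nn'

local net = nn.Sequential()
net:add(nn.Linear(10, 20))
net:add(nn.ReLU())
net:add(nn.Linear(20, 5))

local input = torch.randn(10)
net:forward(input)                 -- fills in .output on every module
local act = net.modules[2].output  -- activations after the ReLU (module 2)
print(act:size())                  -- a tensor of size 20
```

Note that `net.modules` indexes the immediate children of a container, so for nested containers you may need to index recursively (or use `net:get(LAYER)`).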

How to share or tie weights

The line below creates a copy of the network whose parameters are shared (tied) with the original, rather than copied. The running_mean and running_var fields apply only to batch normalization layers; you can omit them if your network has none.

net2 = net:clone('weight', 'bias', 'gradWeight', 'gradBias', 'running_mean', 'running_var')
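To see that the parameters are truly shared and not copied, here is a small sketch (assumes `nn` and a one-layer network; the sizes are arbitrary):

```lua
require 'nn'

local net  = nn.Sequential():add(nn.Linear(4, 3))
local net2 = net:clone('weight', 'bias', 'gradWeight', 'gradBias')

-- The underlying tensors are shared, so an in-place update
-- to net's weights is immediately visible through net2:
net.modules[1].weight:fill(1)
print(net2.modules[1].weight[1][1])  -- 1
```

This is why the clone must be made with in-place sharing in mind: replacing a weight tensor (rather than updating it in place) breaks the tie.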

How to freeze weights for all layers

net:apply(function(m) m.accGradParameters = function() end end)

How to freeze weights for a specific layer

net.modules[LAYER].accGradParameters = function() end
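A sketch of what freezing does, assuming `nn` and a two-layer toy network: replacing accGradParameters with a no-op stops that layer from accumulating parameter gradients, while gradients still flow through it to earlier layers.

```lua
require 'nn'

local net = nn.Sequential():add(nn.Linear(4, 3)):add(nn.Linear(3, 2))
net.modules[1].accGradParameters = function() end  -- freeze layer 1

local input = torch.randn(4)
net:zeroGradParameters()
net:forward(input)
net:backward(input, torch.ones(2))

print(net.modules[1].gradWeight:sum())       -- 0: layer 1 accumulated nothing
print(net.modules[2].gradWeight:norm() > 0)  -- true: layer 2 still trains
```

Remember to call net:zeroGradParameters() before freezing mid-training, since a frozen layer's stale gradWeight would otherwise keep being applied by the optimizer.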