torch-residual-networks
training data also normalized?
Hi G,
I am new to Torch, so just a quick question about fetching training data. The training data is supposed to be normalized, too. However, I see no such operation in the dataTrain:getBatch() call. Specifically, the code here does not pass the input value back to batch. Can you point out where I misunderstood? Thanks!
If I'm reading correctly:
- this line https://github.com/gcr/torch-residual-networks/blob/master/data/cifar-dataset.lua#L58 creates a reference variable input that points to one image in the batch.inputs tensor
- this line https://github.com/gcr/torch-residual-networks/blob/master/data/cifar-dataset.lua#L74 , i.e. the one you point to:
  - flips the input, i.e. the image pointed to by input, which is in the batch.inputs tensor
  - copies the result of the flip back into the batch.inputs tensor, via the input reference variable
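The write-through behavior described above can be illustrated with NumPy, whose basic indexing returns views that share storage, much like Torch tensor indexing. This is only a sketch of the idea, not the repo's actual Lua code; the names batch_inputs and inp are made-up stand-ins for batch.inputs and input:

```python
import numpy as np

# Hypothetical stand-in for batch.inputs: 4 images of size 3x32x32.
batch_inputs = np.arange(4 * 3 * 32 * 32, dtype=np.float32).reshape(4, 3, 32, 32)

# Indexing yields a view over one image, not a copy --
# it shares the same underlying storage as batch_inputs.
inp = batch_inputs[0]

# Flip the image horizontally (reverse the last axis) and write the
# result back through the view with an in-place assignment.
inp[:] = inp[:, :, ::-1].copy()

# The flip is visible in batch_inputs, because inp shares its storage.
assert np.shares_memory(inp, batch_inputs)
assert batch_inputs[0, 0, 0, 0] == 31.0  # first pixel row is now reversed
```

The same mechanism is why the flipped image ends up in the batch tensor even though the code never explicitly assigns it back.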
(Edit: actually, input is not a reference variable; it's a brand-new torch tensor. However, that tensor has the exact same underlying storage as the original tensor, and any changes to the data in the input tensor write through to the exact same storage in the original tensor. This is probably the new information you are looking for?)
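The "brand-new tensor, same underlying storage" distinction can be checked directly. Here is a small NumPy sketch of the same behavior (NumPy views act like Torch's shared-storage tensors here; the variable names are invented for illustration):

```python
import numpy as np

batch_inputs = np.zeros((4, 3, 32, 32), dtype=np.float32)

# Indexing returns a brand-new array object...
inp = batch_inputs[0]
assert inp is not batch_inputs

# ...but it wraps the exact same underlying storage,
assert np.shares_memory(inp, batch_inputs)

# so writes through inp land in batch_inputs too.
inp[0, 0, 0] = 42.0
assert batch_inputs[0, 0, 0, 0] == 42.0
```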
Yes, that's right. Perhaps this should have been made clearer with a comment or something: input refers to the same memory as batch.inputs, so mutating the values at input will also propagate to batch.inputs.