Kimmy
Oh, I guess I misunderstood, pardon. So this experiment was on an ordinary CaffeNet, not a residual network?
Thanks, that makes sense. It's interesting because it challenges the commonly held assumption that batch norm before ReLU is better than after. I'd be interested to see how much of an...
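Just to be explicit about the two orderings I mean, here's a minimal PyTorch-style sketch (the original experiment was presumably in Caffe, and the channel counts here are only illustrative):

```python
import torch.nn as nn

# Conventional ordering: conv -> batch norm -> ReLU
bn_before_relu = nn.Sequential(
    nn.Conv2d(64, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(inplace=True),
)

# The ordering under discussion: conv -> ReLU -> batch norm
bn_after_relu = nn.Sequential(
    nn.Conv2d(64, 64, kernel_size=3, padding=1),
    nn.ReLU(inplace=True),
    nn.BatchNorm2d(64),
)
```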
Awesome!! Do you mind if I merge your fixes into my branch?
Yes, that's right. Perhaps this should have been made more clear with a comment or something: `input` refers to the same memory as `batch.inputs`, so mutating the values at `input`...
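Roughly what I mean, as a toy numpy example (the `Batch` class here is just for illustration, not the real one):

```python
import numpy as np

class Batch:
    def __init__(self, inputs):
        self.inputs = inputs  # stored without copying

batch = Batch(np.zeros((2, 3), dtype=np.float32))
input = batch.inputs          # `input` is another name for the same array, not a copy

input[0, 0] = 42.0            # mutating through `input`...
print(batch.inputs[0, 0])     # ...is visible through `batch.inputs` -> 42.0
```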
Hm, that's strange. Does evaluating `(require 'tumblesocks)` fix it for that session?
Hm, interesting. You may need to mess with the learning rate; it certainly isn't supposed to explode that first time. It's normal for loss to increase a little bit (from...
Oops, sorry! Glad you found the issue. Should we add some initialization code to keep others from being bitten? When I ran the experiments in January, they worked; I wonder...
Hello! Glad you're enjoying Tumblesocks! I'm afraid I don't use Tumblr very often, so I'm not sure how videos are supposed to work. Does pasting the embed code into the...
Hello! Are you interested in **visualizing** the depth map for human consumption, or do you want to convert it to greyscale int16 format to work with other systems? First, load your...
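Something like this sketch covers both routes; I'm assuming a float numpy array and the numpy/matplotlib/imageio stack, and the file names and scale factor are placeholders you'd adjust:

```python
import numpy as np
import matplotlib.pyplot as plt
import imageio.v2 as imageio

depth = np.load("depth.npy")          # placeholder: float32 array of depth values

# Option 1: visualize for human consumption with a colormap
plt.imshow(depth, cmap="viridis")
plt.colorbar(label="depth")
plt.savefig("depth_vis.png")

# Option 2: convert to signed 16-bit greyscale for other systems
scale = 1000.0                        # e.g. metres -> millimetres; use whatever the other system expects
depth_i16 = np.clip(depth * scale,
                    np.iinfo(np.int16).min,
                    np.iinfo(np.int16).max).astype(np.int16)
imageio.imwrite("depth_16bit.tif", depth_i16)  # TIFF keeps signed 16-bit; PNG would need uint16
```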
Could we make this configurable? I'm working on a machine with unified GPU memory, so RAM use isn't a problem. Having an unconfigurable hardcoded limit would prevent me from upscaling...