
Using Im2col and bottom_is_im2col needs more memory

Open lazatsoc opened this issue 6 years ago • 0 comments

I am trying to train VGG16 on my own data, with images cropped to 224x224. With the VGG16 model as provided in the Caffe model zoo (https://gist.github.com/ksimonyan/211839e770f7b538e2d8) I can train with batch size 32. After replacing every convolution layer with an explicit im2col layer followed by a convolution layer with `bottom_is_im2col`, the largest batch size that trains without an "Out of memory" error is 8.

First, is this expected behavior, given that standard convolution layers already use im2col internally? Second, is there a way to reduce the memory usage? Thanks in advance.
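A likely explanation: inside a standard convolution layer, the im2col result lives in a temporary buffer that is reused across images and layers, but an explicit im2col layer produces a top blob that Caffe must keep for every layer (plus a matching diff blob for backprop). For a 3x3 kernel that blob is k*k = 9x the size of the layer's input. The sketch below (a rough estimate, not measured from Caffe; the layer list is the standard VGG16 configuration, float32, pad 1, stride 1) tallies that overhead per image:

```python
# Rough per-image memory estimate for explicit im2col blobs in VGG16.
# Assumption: 3x3 kernels, pad 1, stride 1, so output spatial size == input.

BYTES = 4  # float32

# (in_channels, input height = width) for the 13 VGG16 conv layers
conv_layers = [
    (3, 224), (64, 224),
    (64, 112), (128, 112),
    (128, 56), (256, 56), (256, 56),
    (256, 28), (512, 28), (512, 28),
    (512, 14), (512, 14), (512, 14),
]

def im2col_bytes(in_ch, hw, k=3):
    # im2col matrix: (in_ch * k * k) rows x (out_h * out_w) columns
    return in_ch * k * k * hw * hw * BYTES

def input_bytes(in_ch, hw):
    # the conv layer's input activation, for comparison
    return in_ch * hw * hw * BYTES

total_im2col = sum(im2col_bytes(c, s) for c, s in conv_layers)
total_input = sum(input_bytes(c, s) for c, s in conv_layers)

print(f"per-image im2col blobs: {total_im2col / 2**20:.0f} MiB")
print(f"per-image conv inputs:  {total_input / 2**20:.0f} MiB")
print(f"overhead ratio: {total_im2col / total_input:.0f}x")
```

So each image carries on the order of 300 MiB of extra stored blobs (double that if diffs are allocated for backprop), which is consistent with the batch size dropping from 32 to 8.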

lazatsoc · Dec 12 '18