DeepLearnToolbox
out of memory for CNN
Hello. I have many bitmap images, over 100,000 of them. When I feed them to the CNN, I get an out-of-memory error from MATLAB, even after resizing each bitmap to 32x32. Can I divide the bitmap images into smaller groups and call cnntrain() on each group in turn? In that case, will the CNN still be trained properly on all the training bitmaps? For example,
```matlab
cnn = cnnsetup(cnn, train_x1, train_y1);
opts.alpha = 1;
opts.batchsize = 50;
opts.numepochs = 10;
cnn = cnntrain(cnn, train_x1, train_y1, opts);

train_x = read_from_file("train_x2");
train_y = read_from_file("train_y2");
cnn = cnntrain(cnn, train_x, train_y, opts);

train_x = read_from_file("train_x3");
train_y = read_from_file("train_y3");
cnn = cnntrain(cnn, train_x, train_y, opts);

train_x = read_from_file("train_x4");
train_y = read_from_file("train_y4");
cnn = cnntrain(cnn, train_x, train_y, opts);

% ... (and so on)

[er, bad] = cnntest(cnn, test_x, test_y);
```
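One caveat with training chunk-by-chunk like this: with `opts.numepochs = 10`, each chunk gets ten full passes before the next chunk is even loaded, so the network can drift toward the most recent chunk. A common workaround is to set `numepochs` to 1 and put the epoch loop on the outside, so every pass cycles through all the chunks. A minimal sketch, assuming the chunk files `train_x1` ... `train_x4` and the `read_from_file` helper from the snippet above (the helper and file names are the asker's, not part of the toolbox):

```matlab
% Sketch: chunked training that interleaves all chunks within each epoch.
% Assumes read_from_file() loads one chunk; names are hypothetical.
opts.alpha     = 1;
opts.batchsize = 50;
opts.numepochs = 1;            % one pass per cnntrain() call

train_x = read_from_file('train_x1');
train_y = read_from_file('train_y1');
cnn = cnnsetup(cnn, train_x, train_y);   % set up once, with the first chunk

numChunks = 4;
numEpochs = 10;
for epoch = 1:numEpochs                  % outer loop over epochs ...
    for k = 1:numChunks                  % ... inner loop over chunks
        train_x = read_from_file(sprintf('train_x%d', k));
        train_y = read_from_file(sprintf('train_y%d', k));
        cnn = cnntrain(cnn, train_x, train_y, opts);
    end
end

[er, bad] = cnntest(cnn, test_x, test_y);
```

Since cnntrain() updates the network weights in place and returns the updated `cnn`, chaining calls this way should accumulate learning across chunks; only the chunk currently in `train_x`/`train_y` needs to fit in memory.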