AI_challenger_Chinese_Caption
I got the out of memory error...
All the code worked well except prepro_ai_challenger.py.
```python
for i, img in enumerate(imgs):
    # load the image
    real_path = img['filepath'] + "/" + img['filename']
    I = skimage.io.imread(os.path.join(params['images_root'] + "/", real_path))  # note the path
    # handle grayscale input images
    if len(I.shape) == 2:
        I = I[:, :, np.newaxis]
        I = np.concatenate((I, I, I), axis=2)
    I = I.astype('float32') / 255.0
    I = torch.from_numpy(I.transpose([2, 0, 1])).cuda()
    I = Variable(preprocess(I), volatile=True)
    tmp_fc, tmp_att = my_resnet(I)
    # write to h5
    dset_fc[i] = tmp_fc.data.cpu().float().numpy()
    dset_att[i] = tmp_att.data.cpu().float().numpy()
    if i % 1000 == 0:
        print 'processing %d/%d (%.2f%% done)' % (i, N, i * 100.0 / N)
```
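For what it's worth, below is a minimal sketch of the same loop with explicit inference mode and per-iteration cleanup. It assumes PyTorch >= 0.4 (so `torch.no_grad()` replaces `Variable(..., volatile=True)`) and reuses `imgs`, `params`, `preprocess`, `my_resnet`, `dset_fc`, `dset_att`, and `N` exactly as they are defined in prepro_ai_challenger.py; it is a workaround sketch, not the repository's code.

```python
import os

import numpy as np
import skimage.io
import torch

with torch.no_grad():  # never build an autograd graph during feature extraction
    for i, img in enumerate(imgs):
        # load the image
        real_path = os.path.join(img['filepath'], img['filename'])
        I = skimage.io.imread(os.path.join(params['images_root'], real_path))
        # handle grayscale input images by repeating the single channel
        if len(I.shape) == 2:
            I = np.stack((I, I, I), axis=2)
        I = I.astype('float32') / 255.0
        I = torch.from_numpy(I.transpose([2, 0, 1])).cuda()
        tmp_fc, tmp_att = my_resnet(preprocess(I))
        # copy the features to host memory right away
        dset_fc[i] = tmp_fc.float().cpu().numpy()
        dset_att[i] = tmp_att.float().cpu().numpy()
        # drop every reference to GPU tensors before the next iteration
        del I, tmp_fc, tmp_att
        if i % 1000 == 0:
            torch.cuda.empty_cache()  # release unused cached memory back to the GPU
            print('processing %d/%d (%.2f%% done)' % (i, N, i * 100.0 / N))
```

The idea is simply that each feature is copied to host memory and all GPU references are dropped before the next image, so nothing can accumulate across iterations.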
I got an out-of-memory error. Have you ever run into this?
Make sure your GPU has at least 8 GB of memory to generate the image features.
@lxtGH Yes. My GPU is a 1080 with 8 GB. While the code was running I watched `nvidia-smi` and saw the GPU memory usage keep increasing. It looks like the memory is not being released.
I think this may require more than 8 GB of GPU memory; I use a 12 GB GPU.
Have you run `nvidia-smi` while this part is executing? I got the OOM error after about 3000 pictures, and there are 24,000 pictures in total... If the used memory keeps increasing without being released, I don't think 12 GB is enough either.
Yes, I did. The memory was increasing; I remember it needed about 8300 MB in total to run. Right now I can't run it because there isn't enough free GPU memory.
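If it helps narrow this down: `nvidia-smi` also counts PyTorch's caching allocator, so the number it reports can stay high or keep growing even after tensors are freed. A small helper like the sketch below (assuming PyTorch >= 0.4, where these counters exist; `log_gpu_memory` is just a made-up name) reports only the memory held by live tensors, which is what actually matters for the OOM:

```python
import torch

def log_gpu_memory(i):
    # memory currently held by live tensors vs. the peak since the process started
    alloc = torch.cuda.memory_allocated() / 1024.0 ** 2
    peak = torch.cuda.max_memory_allocated() / 1024.0 ** 2
    print('iter %d: %.0f MB allocated, %.0f MB peak' % (i, alloc, peak))

# call it every few hundred iterations inside the extraction loop, e.g.:
# if i % 500 == 0:
#     log_gpu_memory(i)
```

If the allocated number grows steadily from iteration to iteration, something in the loop really is holding on to GPU tensors; if it stays flat while `nvidia-smi` grows, it is just the caching allocator.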
XD....
Thank you for sharing. Could you give me your e-mail address, if that's convenient? @lxtGH
[email protected] @summerZXH
Can we use multiple GPUs?
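Not a maintainer, but one simple way to spread this preprocessing over several GPUs is to shard the image list and run one process per GPU. The `--shard_id`/`--num_shards` options below are hypothetical additions, not part of prepro_ai_challenger.py, and each shard would need to write its own HDF5 file (keeping the original indices) so the features can be merged afterwards:

```python
# Launch one copy of the script per GPU, e.g.:
#   CUDA_VISIBLE_DEVICES=0 python prepro_ai_challenger.py --shard_id 0 --num_shards 2 &
#   CUDA_VISIBLE_DEVICES=1 python prepro_ai_challenger.py --shard_id 1 --num_shards 2 &

# Inside the script, each process keeps only its own slice of `imgs`,
# together with the original indices so the per-shard outputs can be merged:
shard = [(i, img) for i, img in enumerate(imgs)
         if i % params['num_shards'] == params['shard_id']]
```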