
Kernel crashes after using all available RAM

Open LarisaKozhekina opened this issue 5 years ago • 2 comments

Hi guys! I am trying to run Tutorial 3 (and 4) in Google Colaboratory notebooks (GPU runtime, 12 GB RAM), using only subset0 from LUNA16. The kernel crashes when I execute the "Shortcut to RadIO capabilities: pipelines-submodule" section of Tutorial 3, at In [34]: batch_crops = (luna_dataset >> crops_sampling).next_batch(7)

Is the data from one subset too big for 12 GB of RAM?

Timestamp Level Message
Feb 17, 2019, 12:41:07 PM WARNING WARNING:root:kernel 29f58e92-c82d-4d59-8881-97130a866161 restarted
Feb 17, 2019, 12:41:07 PM INFO KernelRestarter: restarting kernel (1/5), keep random ports
Feb 17, 2019, 12:40:43 PM WARNING tcmalloc: large alloc 5872025600 bytes == 0xfe3e8000 @ 0x7fc77217c001 0x7fc76686fb85 0x7fc7668d2b43 0x7fc7668d4a86 0x7fc76696c868 0x5030d5 0x506859 0x504c28 0x501b2e 0x591461 0x59ebbe 0x507c17 0x504c28 0x502540 0x502f3d 0x506859 0x504c28 0x502540 0x502f3d 0x506859 0x504c28 0x58650d 0x59ebbe 0x507c17 0x504c28 0x501b2e 0x591461 0x59ebbe 0x507c17 0x502209 0x502f3d
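For context, the failed allocation reported by tcmalloc above is roughly 5.5 GiB, which a 12 GB runtime may well fail to satisfy on top of what is already loaded. A quick check (the byte count is copied from the log line; the conversion is plain arithmetic):

    alloc_bytes = 5_872_025_600              # from the tcmalloc warning above
    print(f"{alloc_bytes / 2**30:.2f} GiB")  # -> 5.47 GiB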

My notebook is https://drive.google.com/open?id=1j3xoSUvKF-Z6rl6lO3HcJ-dLxBKUUkzx

Thank you for your help.

LarisaKozhekina avatar Feb 17 '19 09:02 LarisaKozhekina

Hello, @LarisaKozhekina!

Thanks for the issue. In fact, when using the line

batch_crops = (luna_dataset >> crops_sampling).next_batch(7)

one only passes a batch of 7 scans through the sampling pipeline, not the whole dataset (subset0). Yet, it is possible that 7 scans are already too many for 12 GB of RAM! Still, it seems to me that your problem can be easily solved:

  1. It looks like you are trying to run the tutorial cell by cell. If that's the case, keep in mind that you don't need all the previous calculations when running cell #34! Just don't forget to load the nodule locations info, nodules_df (cell #7).
  2. If running only the needed cells doesn't help, you might benefit from reducing the size of the batch passed through the pipeline (see the sketch after this list):
batch_crops = (luna_dataset >> crops_sampling).next_batch(4)
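
For reference, here is a minimal sketch of the only cells really needed before cell #34, assuming the standard setup from the RadIO LUNA tutorials. The paths are placeholders, crops_sampling is the pipeline built in the tutorial's pipelines cell (its construction is omitted here), and the import location of FilesIndex/Dataset may differ between RadIO versions:

    import pandas as pd
    from radio import CTImagesMaskedBatch
    # older RadIO releases expose these under radio.dataset, newer ones under radio.batchflow
    from radio.dataset import FilesIndex, Dataset

    # Index the .mhd scans of subset0 and wrap them in a dataset (path is a placeholder)
    luna_index = FilesIndex(path='/path/to/LUNA16/subset0/*.mhd', no_ext=True)
    luna_dataset = Dataset(index=luna_index, batch_class=CTImagesMaskedBatch)

    # Nodule locations from the LUNA16 annotations file (cell #7 of the tutorial)
    nodules_df = pd.read_csv('/path/to/LUNA16/annotations.csv')

    # With crops_sampling defined as in the tutorial's pipelines cell,
    # pull a smaller batch through the pipeline
    batch_crops = (luna_dataset >> crops_sampling).next_batch(4)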

Best, Alex.

akoryagin avatar Mar 18 '19 15:03 akoryagin

Hello Alexey, thank you. I had to disable the GPU in Google Colaboratory and reduce the batch size to 4. -- Best regards, Larisa

LarisaKozhekina avatar Mar 18 '19 15:03 LarisaKozhekina