darknet
OpenCV(4.6.0) /croot/opencv-suite_1691620365762/work/modules/core/src/alloc.cpp:73: error: (-4:Insufficient memory) Failed to allocate 82924982787168 bytes in function 'OutOfMemoryError'
I have one class and I am trying to train YOLOv3 on a custom dataset, so I am using:
time ./darknet detector train build/darknet/x64/obj.data cfg/yolo_vessel.cfg yolov3.weights -dont_show -ext_output < train.txt > results/result.txt
but I get this error, which says that it needs a huge amount of memory for augmentation purposes:
OpenCV(4.6.0) /croot/opencv-suite_1691620365762/work/modules/core/src/alloc.cpp:73: error: (-4:Insufficient memory) Failed to allocate 82924982787168 bytes in function OutOfMemoryError
along with this error:
OpenCV can't augment image: 255 x 255 OpenCV(4.6.0) /croot/opencv-suite_1691620365762/work/modules/core/src/matrix.cpp:246: error: (-215:Assertion failed) s >= 0 in function setSize
But I have 64 GB of RAM, a GeForce 1070 with 8 GB, and an Intel i7, and I have tried disabling the augmentation parameters; the same thing happens.
Testing: batch=64 subdivisions=64
Training: batch=64 subdivisions=64
width=255
height=255
channels=3
momentum=0.949
decay=0.0005
angle=0
saturation=1.5
exposure=1.5
hue=.1
learning_rate=0.001
burn_in=1000
max_batches=2000
policy=steps
steps=1600,1800
scales=1,1
I have tried to reduce batch_size to 1, but it gives me an error saying I should increase it back to 64 along with the subdivisions. Am I missing something?
That message is only a suggestion, and it appears when you have batch_size=1.
The OutOfMemoryError means your GPU is out of memory; the GeForce 1070 only has 8 GB.
For example, with an 8 GB 30-series RTX card I can train with batch=64 and subdivisions=16, or batch=64 and subdivisions=32, among other variants.
You need to experiment with the numbers; keep in mind the trade-offs between batch size and subdivisions.
https://stats.stackexchange.com/questions/153531/what-is-batch-size-in-neural-network
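The batch-vs-subdivisions trade-off above comes down to simple arithmetic: darknet pushes batch/subdivisions images through the GPU per forward/backward pass, so GPU memory pressure scales with that mini-batch size while the effective batch stays the same. A minimal sketch of that relationship (the helper name `mini_batch` and the sample numbers are illustrative, not part of darknet's API):

```python
def mini_batch(batch: int, subdivisions: int) -> int:
    """Images resident on the GPU at once for a given darknet cfg.

    Illustrative helper, not part of darknet: the effective batch is
    split into `subdivisions` chunks that are processed sequentially.
    """
    if batch % subdivisions != 0:
        # darknet cfgs conventionally use subdivisions that evenly divide batch
        raise ValueError("subdivisions should evenly divide batch")
    return batch // subdivisions

# Same effective batch of 64, with decreasing GPU memory pressure:
for subs in (8, 16, 32, 64):
    print(f"batch=64 subdivisions={subs} -> {mini_batch(64, subs)} images per pass")
```

With subdivisions=64 only one image is on the GPU per pass, which is the lowest-memory configuration for a batch of 64; if that still overflows, the remaining lever is reducing the network input width/height in the cfg.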