convert_yolo_weights
Error when converting YOLOv4-tiny model to Keras: buffer is too small for requested array
Hi,
I'm running the convert_weights_to_keras.py script with weights trained in Darknet and the corresponding cfg file.
The conversion fails with this error:
Parsing section convolutional_19
conv2d bn leaky (3, 3, 384, 256)
Traceback (most recent call last):
File "convert_weights_to_keras.py", line 320, in
It seems the weights file ends earlier than expected: the script tries to read 3538944 bytes for this layer, but only 2786388 bytes remain in the file.
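For what it's worth, the requested size matches the printed layer shape exactly: 3 * 3 * 384 * 256 float32 weights * 4 bytes = 3538944 bytes. Converters of this kind typically read each layer's weights straight out of the file into a buffer-backed NumPy array, and that is the call that raises "buffer is too small for requested array" when the file runs out early. A minimal sketch of that read pattern (illustrative only, the names are not the script's actual ones):

import numpy as np

def read_conv_weights(weights_file, shape):
    # Illustrative sketch; convert_weights_to_keras.py will differ in
    # detail, but the failure mode is the same.
    n_floats = int(np.prod(shape))          # 3*3*384*256 = 884736 for this layer
    raw = weights_file.read(n_floats * 4)   # float32 -> 4 bytes each
    # If fewer bytes remain in the file, np.ndarray raises:
    #   TypeError: buffer is too small for requested array
    return np.ndarray(shape=shape, dtype="float32", buffer=raw)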
The input size in the cfg is not square (it's 640x480), but I'm certain there's no mismatch between the weights and the cfg file.
Why could this be happening?
Has anything changed in the binary weights format that the script may need to be adapted to?
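The only format change I'm aware of is the size of the "seen" counter in the file header: the header is major, minor, revision as three 32-bit ints, followed by a counter that is 8 bytes when major*10+minor >= 2 and 4 bytes otherwise, so a converter that assumes the wrong size would have every later read shifted. A quick, illustrative way to check which variant a weights file uses (the filename is a placeholder):

import struct

with open("your-model.weights", "rb") as f:
    major, minor, revision = struct.unpack("<3i", f.read(12))
    seen_bytes = 8 if major * 10 + minor >= 2 else 4   # per darknet's own load code
    seen = int.from_bytes(f.read(seen_bytes), "little")
    print(major, minor, revision, seen, "-> header size:", 12 + seen_bytes, "bytes")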
Any pointers would be greatly appreciated, thanks.
Hi @gabouy, I'm facing the same issue. Were you able to find a solution?
My hunch is that the converter does not support route sections like this one:
[route]
layers=-1
groups=2
group_id=1
If those groups/group_id fields are ignored, the routed channel count is not halved, which probably inflates the size calculations for the convolutional weights further down the line.
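To make that concrete: a [route] with groups=2 splits the routed layer's output channels into two groups and forwards only the one selected by group_id, so only half the channels reach the next layer. A converter that ignores those two keys would feed twice as many input channels into the following convolution and try to read twice as many weight bytes. A rough sketch of the channel bookkeeping (the function is mine, not the converter's actual code):

def route_output_channels(routed_layer_channels, groups=1, group_id=0):
    # A plain [route] concatenates the channels of the referenced layers.
    # With groups=N, darknet splits that output into N equal channel groups
    # and forwards only the slice picked by group_id, so 1/N of the channels
    # reach the next layer (group_id selects which slice, not how many).
    total = sum(routed_layer_channels)
    if groups > 1:
        assert total % groups == 0
        total //= groups
    return total

# YOLOv4-tiny style example: routing back to a 64-channel layer with
# groups=2, group_id=1 leaves 32 input channels for the next conv, not 64.
print(route_output_channels([64], groups=2, group_id=1))  # 32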