tf_trt_models

Unable to change batch_size

Open EvGe22 opened this issue 5 years ago • 3 comments

I am running the detection example with ssd_inception_v2 and have changed max_batch_size to 24, but when I try to run inference on a batch of any size other than 1 I get this error:

ValueError: Cannot feed value of shape (12, 300, 300, 3) for Tensor 'input:0', which has shape '(1, ?, ?, 3)'

Is there anything else that needs to be changed?

EvGe22 avatar Sep 20 '18 11:09 EvGe22

You may need to change the batch size of the placeholder node to be consistent with what you provide to trt.create_inference_graph. This can be done at the GraphDef level.

Could you try the following before calling trt.create_inference_graph:

for node in frozen_graph.node:
    if 'Placeholder' in node.op:
        node.attr['shape'].shape.dim[0].size = 12 # or whatever you set max_batch_size to

Thanks!

ghost avatar Sep 24 '18 23:09 ghost
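For reference, the patch above can be sketched end to end. This is a hedged, minimal sketch using TF 1.x-style graph APIs (via tf.compat.v1); a tiny stand-in graph with the same (1, ?, ?, 3) input is built here, since the real frozen ssd_inception_v2 graph from the issue is not available:

```python
# Minimal sketch: patch the batch dimension of every Placeholder in a
# frozen GraphDef so it matches the max_batch_size passed to TF-TRT.
import tensorflow.compat.v1 as tf

MAX_BATCH_SIZE = 12  # keep consistent with max_batch_size given to TF-TRT

# Stand-in for the frozen detection graph, with the same (1, ?, ?, 3) input
# placeholder that appears in the error message above.
graph = tf.Graph()
with graph.as_default():
    tf.placeholder(tf.float32, shape=(1, None, None, 3), name='input')
frozen_graph = graph.as_graph_def()

# Rewrite the leading (batch) dimension of each placeholder node in place.
for node in frozen_graph.node:
    if 'Placeholder' in node.op:
        node.attr['shape'].shape.dim[0].size = MAX_BATCH_SIZE

# frozen_graph can now be handed to trt.create_inference_graph(...).
```

Note that this edits the serialized GraphDef protobuf directly, so it works on a frozen graph without rebuilding the model; the feed shape at inference time must then match the new batch dimension.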

Yeah, I figured it out eventually, thanks! I actually changed the shape of the placeholder in the build_detection_graph function instead.

EvGe22 avatar Sep 26 '18 08:09 EvGe22

Great. I will work on adding this feature in.

ghost avatar Sep 26 '18 17:09 ghost