YOLO-World
How do I export an inference model that supports dynamic batch size?
I'm facing the following problem:
- Does the exported ONNX model support dynamic batch size?