serve
Running two YOLO models in parallel on more than one GPU
Can I run two YOLO models in parallel on more than one GPU? What would be the best way to optimize the ensemble of the two models? Thank you.
Hi @divastar, are the two models exactly the same? As in, do they share weights, not communicate with each other, and run on the same inputs? If so, all you need to do is set initial_workers to some value greater than 1 in your config.properties.
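For example, a minimal sketch of the relevant config.properties settings (assuming TorchServe; the property names below are the generic worker/GPU settings, not a verified recipe for your exact model):

```
# config.properties (sketch): give each registered model multiple workers
# and make both GPUs visible so TorchServe can assign workers across them
default_workers_per_model=2
number_of_gpu=2
```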
If your question is about running two models in parallel and aggregating their results in some way, you would need to write a custom handler where, in your __init__() function, you would load the weights for the second model and then call both models, along the lines of the sketch below.
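A minimal sketch of such a handler (assumptions: it subclasses TorchServe's BaseHandler and does the extra loading in initialize(), which receives the model directory and device context; the second-weights file name, the choice of cuda:1, and the way the two outputs are combined are all placeholders you would adapt to your models):

```python
import torch
from ts.torch_handler.base_handler import BaseHandler


class TwoModelEnsembleHandler(BaseHandler):
    def initialize(self, context):
        # Standard BaseHandler setup: loads the first (serialized) model onto self.device.
        super().initialize(context)

        # Hypothetical second set of weights packaged in the same .mar archive.
        model_dir = context.system_properties.get("model_dir")
        second_weights = f"{model_dir}/second_yolo.pt"  # assumed file name

        # Put the second model on another GPU if more than one is visible.
        self.second_device = (
            torch.device("cuda:1") if torch.cuda.device_count() > 1 else self.device
        )
        self.second_model = torch.jit.load(second_weights, map_location=self.second_device)
        self.second_model.eval()

    def inference(self, data, *args, **kwargs):
        # Run both models on the same preprocessed batch.
        with torch.no_grad():
            out_a = self.model(data.to(self.device))
            out_b = self.second_model(data.to(self.second_device))
        # Aggregation is model-specific; returning both raw outputs is just a placeholder
        # for whatever ensembling (e.g. merging detections before NMS) you actually want.
        return [out_a, out_b]
```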