
Support dual inference at the same time on FPGA

tk26eng opened this issue 4 years ago · 3 comments

Currently the runtime only supports running one inference on the FPGA. But sometimes running two models at the same time is useful, so we might need a feature for that.
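
For illustration, a minimal sketch of what "dual inference" could look like on the host side, assuming a hypothetical `Network` wrapper per compiled model (this is not the actual blueoil runtime API): two host threads each drive their own model, and a mutex serializes access to the single FPGA accelerator.

```cpp
// Hedged sketch only: Network, its run() method, and the mutex-based
// time-sharing scheme are assumptions, not existing blueoil code.
#include <cstdio>
#include <mutex>
#include <thread>
#include <vector>

// Hypothetical stand-in for one compiled model's runtime object.
struct Network {
    explicit Network(const char* name) : name_(name) {}
    // Pretend this offloads to the FPGA; two models would contend for
    // the same accelerator, which is why access is serialized below.
    void run(const std::vector<float>& in, std::vector<float>& out) {
        out.assign(in.size(), 0.0f);  // placeholder result
        std::printf("%s ran on the FPGA\n", name_);
    }
    const char* name_;
};

std::mutex fpga_mutex;  // guards the single FPGA accelerator

void infer(Network& net, const std::vector<float>& in, std::vector<float>& out) {
    std::lock_guard<std::mutex> lock(fpga_mutex);  // one model on the device at a time
    net.run(in, out);
}

int main() {
    Network a("model_a"), b("model_b");
    std::vector<float> in(128, 1.0f), out_a, out_b;
    // Two host threads, one per model; FPGA access is time-shared.
    std::thread ta(infer, std::ref(a), std::cref(in), std::ref(out_a));
    std::thread tb(infer, std::ref(b), std::cref(in), std::ref(out_b));
    ta.join();
    tb.join();
}
```

Whether time-sharing the device like this would be acceptable, or whether the two models would need truly concurrent execution on the fabric, would shape the actual design.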

tk26eng · Jun 19 '20

Related issue: #666

primenumber · Jun 19 '20

@tk26eng @primenumber

Just a related question: Is there a possibility of using multiple FPGAs for inference? (Basically, if the model is large, two FPGAs could be used to run two different models, and the results synced for a combined output.)
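
To make the question concrete, a hedged sketch of that idea, again with a hypothetical `Network` type; the `device_id` binding, the `model_part_a`/`model_part_b` split, and the concatenation step are all assumptions, not existing blueoil features. Each part of a split model runs on its own FPGA in parallel, and the host joins the two result halves.

```cpp
// Hedged sketch only: multi-FPGA dispatch is not implemented in blueoil.
#include <cstdio>
#include <future>
#include <vector>

// Hypothetical model runtime bound to a given FPGA device id.
struct Network {
    Network(const char* name, int device_id) : name_(name), device_(device_id) {}
    std::vector<float> run(const std::vector<float>& in) {
        std::printf("%s ran on FPGA %d\n", name_, device_);
        return std::vector<float>(in.size(), 0.0f);  // placeholder result
    }
    const char* name_;
    int device_;
};

int main() {
    Network part_a("model_part_a", /*device_id=*/0);
    Network part_b("model_part_b", /*device_id=*/1);
    std::vector<float> in(128, 1.0f);

    // Each half of the split model runs on its own FPGA, in parallel.
    auto fa = std::async(std::launch::async, [&] { return part_a.run(in); });
    auto fb = std::async(std::launch::async, [&] { return part_b.run(in); });

    // "Sync the results for combined output": concatenate the two halves.
    std::vector<float> combined = fa.get();
    std::vector<float> half_b = fb.get();
    combined.insert(combined.end(), half_b.begin(), half_b.end());
    std::printf("combined output size: %zu\n", combined.size());
}
```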

kalpitthakkar-lm · Jun 22 '20

@kalpitthakkar-lm

> @tk26eng @primenumber
>
> Just a related question: Is there a possibility of using multiple FPGAs for inference? (Basically, if the model is large, two FPGAs could be used to run two different models, and the results synced for a combined output.)

I don't think we have any plan to use multiple FPGAs for inference. Using a bigger FPGA is an easier way to handle a large model than spreading it across multiple FPGAs.

tk26eng · Jun 23 '20