ArpanGyawali

Results: 7 comments of ArpanGyawali

Thank you, I didn't see it earlier. Yes, do I need to convert the model to TorchScript and use Docker to deploy it? What inference script should I use so...
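For context on the TorchScript step mentioned above, a minimal sketch of exporting a PyTorch model for deployment might look like the following. `TinyNet` is a hypothetical stand-in; the real nnDetection network would be loaded from its trained checkpoint instead, and tracing a full detection pipeline may need more care than shown here.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the trained detection network; the real
# nnDetection model would be restored from its checkpoint instead.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv3d(1, 8, kernel_size=3, padding=1)

    def forward(self, x):
        return self.conv(x)

model = TinyNet().eval()

# Trace the model with a dummy input so it can be serialized and later
# loaded inside a Docker container without the Python class definition.
example = torch.randn(1, 1, 16, 16, 16)
scripted = torch.jit.trace(model, example)
scripted.save("model_ts.pt")

# The saved archive reloads anywhere that has torch installed.
reloaded = torch.jit.load("model_ts.pt")
out = reloaded(example)
print(out.shape)
```

The traced archive bundles weights and graph together, which is why the serving container only needs `torch` rather than the full training codebase.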

Yes, there is an entrypoint, but for that the end user needs to install nnDetection, right? What do you mean by adapting the model?

@mibaumgartner Does that mean inference cannot be performed on a device with no GPU support? Is a GPU mandatory for inference as well, or is there a way inference/prediction can...

@mibaumgartner When will inference on CPU be available in a later release?
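As background on the CPU question: generic PyTorch models can usually be loaded and run on CPU by remapping the checkpoint's device at load time. This is a sketch of that standard mechanism only, not an official nnDetection CPU inference path — whether nnDetection's full pipeline supports it is exactly what the question above asks.

```python
import torch
import torch.nn as nn

# Hypothetical toy model standing in for a GPU-trained network.
model = nn.Linear(4, 2)
torch.save(model.state_dict(), "ckpt.pt")

# map_location forces all tensors in the checkpoint onto CPU,
# even if it was saved from a CUDA run.
state = torch.load("ckpt.pt", map_location=torch.device("cpu"))
model.load_state_dict(state)
model.eval()

# Inference without gradients, entirely on CPU.
with torch.no_grad():
    x = torch.randn(1, 4)
    y = model(x)
print(y.device, y.shape)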

Hi @Thibescobar, has there been progress in production/deployment of the model, i.e., the first step of creating the Docker image and then using TensorRT? Is this successful, and what's...

@Thibescobar, thank you so much. Yes, just after the first step, without using TensorRT, the inference time is quite efficient. Can you thoroughly explain what you did to create...

@mibaumgartner @Thibescobar, any comments?