mikelam92
Hi @vaishu950102, did you find a solution?
We are using nvcr.io/nvidia/tritonserver:22.03-py3. The question is: will there be continued support for Ubuntu 18.04? If not, is the recommendation to build our own image based on the section "unsupported...
We encountered the same issue: the server gets stuck loading the model without returning an error. How can we properly diagnose this? Thanks.
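One way to narrow down where a model load stalls is to start the server with verbose logging enabled. This is a sketch of such an invocation, assuming the model repository lives at `/models` (adjust the path for your setup):

```shell
# Launch Triton with verbose logging so per-model load steps are printed;
# the last message before the hang usually points at the stuck backend/model.
tritonserver --model-repository=/models \
  --log-verbose=1 \
  --log-info=1 \
  --log-error=1
```

With `--log-verbose=1` the server prints each stage of model loading, so the point where output stops indicates which model or backend is hanging.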
Hi @Tabrizian, we may go down the path of building the image ourselves, and we would like to follow up on the questions [here](https://github.com/triton-inference-server/server/issues/4590#issuecomment-1182839279). Thank you!
https://github.com/pytorch/pytorch/issues/18325 — moving `model.py` to wherever you load the models with `torch.load` should resolve this.
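The reason moving `model.py` helps: `torch.save` on a full model pickles the model class *by reference* (module name plus class name), so `torch.load` must be able to import that same module from the loading script's environment. The sketch below demonstrates the underlying pickle mechanism with the standard library only; the module name `model` and class `Net` are illustrative stand-ins for your `model.py`:

```python
import pickle
import sys
import types

# Simulate a "model.py" module defining the class, as it exists when saving.
model_mod = types.ModuleType("model")
exec("class Net:\n    pass\n", model_mod.__dict__)
sys.modules["model"] = model_mod

# Pickling stores only a reference ("model", "Net"), not the class body.
data = pickle.dumps(model_mod.Net())

# On the loading side, if model.py is not importable, unpickling fails.
del sys.modules["model"]
try:
    pickle.loads(data)
except ModuleNotFoundError as e:
    print("load failed:", e)

# Making the module importable again (the analogue of placing model.py
# next to the loading script) lets the load succeed.
sys.modules["model"] = model_mod
obj = pickle.loads(data)
print(type(obj).__name__)
```

This is why the robust alternative is saving only the `state_dict` and reconstructing the model class explicitly before calling `load_state_dict`.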