
Python Backend complains "triton_python_backend_utils" has no attribute "InferenceRequest"

Open Michael-Jing opened this issue 2 years ago • 12 comments

I'm using Python business logic scripting and a conda-packed Python environment with Python 3.8. Both the 22.06 and 22.07 versions show the following error message: "UNAVAILABLE: Internal: AttributeError: module 'triton_python_backend_utils' has no attribute 'InferenceRequest'". However, it works fine on the third-party Docker image flyingmachine/tritonserver-w-ort-1.11.0.

Michael-Jing avatar Aug 05 '22 02:08 Michael-Jing

@Tabrizian ^^^

tanmayv25 avatar Aug 05 '22 18:08 tanmayv25

Hi @Michael-Jing, sorry about the delay. Can you please share the structure of your model repository? Are you copying triton_python_backend_utils into your model directory? If yes, that is why you are observing this error.

Tabrizian avatar Aug 09 '22 18:08 Tabrizian

Hi, sorry for the late reply. I don't copy triton_python_backend_utils. Part of the repository structure is shown in the attached image,

and I use the following config:

parameters: { key: "EXECUTION_ENV_PATH", value: {string_value: "$$TRITON_MODEL_DIRECTORY/../python38.tar.gz"} }
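
A layout consistent with that EXECUTION_ENV_PATH would look roughly like this (names below are illustrative placeholders, not the exact ones from the screenshot):

model_repository/
├── python38.tar.gz          # conda-pack'ed Python 3.8 environment
└── my_bls_model/            # hypothetical model name
    ├── config.pbtxt         # contains the EXECUTION_ENV_PATH parameter above
    └── 1/
        └── model.py         # defines TritonPythonModel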

Michael-Jing avatar Aug 23 '22 09:08 Michael-Jing

Opening a bug with the team to investigate further. Most likely it is a user issue. @Michael-Jing Can you share your model repository and exact steps to reproduce the issue? We don't think tritonserver-w-ort-1.11.0 should make any difference. Can you look carefully into the differences between the two?

tanmayv25 avatar Sep 07 '22 18:09 tanmayv25

Hi, I found that the reason for the error is that I defined a function with the following signature in my worker.py:

import triton_python_backend_utils as pbu

def process(request: pbu.InferenceRequest):

After I removed the pbu.InferenceRequest type annotation from request, it works fine.
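
A possible workaround that keeps the hint (a sketch relying on standard Python behaviour, not anything Triton-specific) is to quote the annotation, since string annotations are only stored, not evaluated, when the function is defined:

import triton_python_backend_utils as pbu

# The quoted annotation is never resolved at module import time,
# so the not-yet-injected attribute is not looked up.
def process(request: "pbu.InferenceRequest"):
    ...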

Michael-Jing avatar Sep 12 '22 07:09 Michael-Jing

Thanks for updating us, Michael! Closing ticket.

dyastremsky avatar Sep 30 '22 22:09 dyastremsky

Why would removing the type annotation help? Or maybe I should ask: why does using pbu.InferenceRequest cause an issue?

sfc-gh-zhwang avatar Jul 20 '23 18:07 sfc-gh-zhwang

I find this confusing as well. All the examples make use of the type pb_utils.InferenceRequest, but it's not defined in the triton_python_backend_utils.py script as a Python type; it seems to be imported from c_python_backend_utils.InferenceRequest via https://github.com/triton-inference-server/python_backend/blob/main/src/pb_stub.cc
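
A quick way to see the difference (a diagnostic fragment, not a complete model; the exact output depends on the Triton version) is to dump the module's public attributes at import time and again inside initialize:

import triton_python_backend_utils as pb_utils

# At module import time, only what triton_python_backend_utils.py itself
# defines is visible here.
print(sorted(n for n in dir(pb_utils) if not n.startswith("_")))

class TritonPythonModel:
    def initialize(self, args):
        # By now the stub has injected the C classes (InferenceRequest,
        # Tensor, Logger, ...), so they show up as well.
        print(sorted(n for n in dir(pb_utils) if not n.startswith("_")))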

david-waterworth avatar Aug 21 '23 23:08 david-waterworth

We are also facing issues with triton_python_backend_utils annotations. Currently, we are using triton server 22.05-py3 and don't face any issues with the following code inside our models:

from typing import List

import triton_python_backend_utils as tpbu

class TritonPythonModel:
    ...
    def execute(self, requests: List[List[tpbu.Tensor]]) -> List[List[tpbu.Tensor]]:
        ...

However, after updating the Triton server to 23.08-py3, we now get the following error:

AttributeError: module 'triton_python_backend_utils' has no attribute 'Tensor'

When we remove the annotations, the model is loaded successfully.

Why is this the case?
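
In the meantime, one way to keep the annotations without triggering the import-time lookup (a sketch assuming PEP 563 deferred evaluation, available since Python 3.7) is:

from __future__ import annotations  # annotations are stored as strings, not evaluated
from typing import List

import triton_python_backend_utils as tpbu

class TritonPythonModel:
    def execute(self, requests: List[List[tpbu.Tensor]]) -> List[List[tpbu.Tensor]]:
        ...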

tobiasvitt avatar Sep 13 '23 07:09 tobiasvitt

Facing the same issue with pb_utils.Tensor and pb_utils.InferenceRequest, using triton server 23.07-py3

Okay, I'll remove the annotations...

Fleyderer avatar Oct 25 '23 11:10 Fleyderer

@Tabrizian @tanmayv25, any update here? The ticket was closed, but it seems like many people have issues with this. How can this class of bug be fixed without removing annotations? I don't see why support for annotations would intentionally be dropped. For me, this is still a bug.

tobiasvitt avatar Oct 31 '23 08:10 tobiasvitt

Sorry, we still haven't been able to get to this issue. We'll update you as soon as there is any news.

Tabrizian avatar Jan 13 '24 00:01 Tabrizian

The issue appears to be the timing of when the stub is set up, i.e.

https://github.com/triton-inference-server/python_backend/blob/ba616e26c256f11c41f7249c6a55220af8becee9/src/pb_stub.cc#L442

If I use properties/methods of the Python stub at the top level of my module, e.g.

import triton_python_backend_utils as pb_utils
logger = pb_utils.Logger

this fails with

error: creating server: Invalid argument - load failed for model 'encoder': version 1 is at UNAVAILABLE state: Internal: AttributeError: module 'triton_python_backend_utils' has no attribute 'Logger'

But the following works fine

from typing import Dict

import triton_python_backend_utils as pb_utils

class TritonPythonModel:
    def initialize(self, args: Dict[str, str]) -> None:
        logger = pb_utils.Logger

And I can use the C versions at the module level (including in type annotations).

So the script appears to be imported, then the stub is set up, then the model is initialised. Ideally, the order would be: set up the stub, import the script, then initialise the model.
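
Given that ordering, one pattern that sidesteps the problem (just a sketch, not an officially documented recommendation) is to push every pb_utils attribute access into code that only runs after the stub is set up:

import triton_python_backend_utils as pb_utils

def get_logger():
    # Looked up at call time, i.e. after the stub has injected Logger;
    # calling this at module import time would still fail.
    return pb_utils.Logger

class TritonPythonModel:
    def initialize(self, args):
        self.logger = get_logger()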

I've upgraded to 23.12 and still seeing this.

david-waterworth avatar Feb 28 '24 22:02 david-waterworth