Python Backend complains "triton_python_backend_utils" has no attribute "InferenceRequest"
I'm using the Python business logic scripting and a conda-packed Python environment with Python 3.8. Both the 22.06 and 22.07 versions show the following error message: "UNAVAILABLE: Internal: AttributeError: module 'triton_python_backend_utils' has no attribute 'InferenceRequest'". However, it works fine on the third-party docker image flyingmachine/tritonserver-w-ort-1.11.0.
@Tabrizian ^^^
Hi @Michael-Jing, sorry about the delay. Can you please share the structure of your model repository? Are you copying triton_python_backend_utils into your model directory? If so, that is why you are observing this error.
Hi, sorry for the late reply. I don't copy triton_python_backend_utils.
Part of the repository is like this
and I use the following config:
parameters: { key: "EXECUTION_ENV_PATH", value: {string_value: "$$TRITON_MODEL_DIRECTORY/../python38.tar.gz"} }
Opening a bug with the team to investigate further. Most likely it is a user issue. @Michael-Jing, can you share your model repository and the exact steps to reproduce the issue? We don't think tritonserver-w-ort-1.11.0 should make any difference. Can you look carefully into the differences between the two?
Hi, I found that the reason for the error is that I defined a function with the following signature in my worker.py:
```python
import triton_python_backend_utils as pbu

def process(request: pbu.InferenceRequest):
    ...
```
After I removed the pbu.InferenceRequest type annotation from request, it works fine.
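If you want to keep the annotation, one possible workaround (just a sketch, assuming the error comes from the annotation being evaluated while the module is imported) is to enable postponed evaluation of annotations (PEP 563), so pbu.InferenceRequest is never looked up at import time:

```python
from __future__ import annotations  # PEP 563: annotations stored as strings

import triton_python_backend_utils as pbu


def process(request: pbu.InferenceRequest):
    # The annotation is no longer evaluated when the module is imported,
    # so the lookup of InferenceRequest on triton_python_backend_utils is
    # skipped entirely at import time.
    ...
```

Note that anything which later calls typing.get_type_hints() on this function would still try to resolve the name, so this only avoids the failure during import.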
Thanks for updating us, Michael! Closing ticket.
Why would removing the type annotation help? Or maybe I should ask, why would using pbu.InferenceRequest cause an issue?
I find this confusing as well: all the examples make use of the type pb_utils.InferenceRequest, but it's not defined in the triton_python_backend_utils.py script as a Python type; it seems to be imported from c_python_backend_utils.InferenceRequest via https://github.com/triton-inference-server/python_backend/blob/main/src/pb_stub.cc
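A minimal probe of that behaviour (just a sketch, assuming the classes are attached to triton_python_backend_utils by the stub as linked above rather than being defined in the .py file) would be:

```python
import triton_python_backend_utils as pb_utils

# On the affected releases this can be False at module import time, because
# the stub has not yet attached the C classes to the module.
print(hasattr(pb_utils, "InferenceRequest"))


class TritonPythonModel:
    def initialize(self, args):
        # By the time initialize() runs, the stub has been set up, so the
        # attribute should be present.
        print(hasattr(pb_utils, "InferenceRequest"))

    def execute(self, requests):
        # Hypothetical pass-through: one empty response per request.
        return [pb_utils.InferenceResponse(output_tensors=[]) for _ in requests]
```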
We are also facing issues with triton_python_backend_utils annotations.
Currently, we are using triton server 22.05-py3 and don't face any issues with the following code inside our models:
```python
from typing import List
import triton_python_backend_utils as tpbu

class TritonPythonModel:
    ...
    def execute(self, requests: List[List[tpbu.Tensor]]) -> List[List[tpbu.Tensor]]:
        ...
```
However, we tried to update the triton server to 23.08-py3, and now we get the following error:
AttributeError: module 'triton_python_backend_utils' has no attribute 'Tensor'
When we remove the annotations, the model is loaded successfully.
Why is this the case?
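In the meantime, one workaround that keeps the annotations (a sketch, not an officially documented pattern) is to write them as string forward references, so tpbu.Tensor is never looked up while the module is imported, i.e. before the stub has attached it:

```python
from typing import List

import triton_python_backend_utils as tpbu


class TritonPythonModel:
    # String (forward-reference) annotations are not evaluated at import
    # time, so a missing tpbu.Tensor attribute cannot raise here.
    def execute(
        self, requests: "List[List[tpbu.Tensor]]"
    ) -> "List[List[tpbu.Tensor]]":
        ...
```

A module-wide `from __future__ import annotations` achieves the same effect, as in the earlier sketch.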
Facing the same issue with pb_utils.Tensor and pb_utils.InferenceRequest, using triton server 23.07-py3.
Okay, I removed the annotations...
@Tabrizian @tanmayv25, any update here? The ticket was closed, but it seems like many people have issues with this. How can this class of bug be fixed without removing annotations? I don't see a reason why it would be intentional to stop supporting annotations. For me this is still a bug.
Sorry, we still haven't been able to get to this issue. We will update you as soon as there are any updates.
The issue appears to be the timing of when the stub is set up, i.e.
https://github.com/triton-inference-server/python_backend/blob/ba616e26c256f11c41f7249c6a55220af8becee9/src/pb_stub.cc#L442
If I use properties/methods of the Python stub at the top level of my module, e.g.
```python
import triton_python_backend_utils as pb_utils

logger = pb_utils.Logger
```
this fails with
error: creating server: Invalid argument - load failed for model 'encoder': version 1 is at UNAVAILABLE state: Internal: AttributeError: module 'triton_python_backend_utils' has no attribute 'Logger'
But the following works fine
```python
from typing import Dict
import triton_python_backend_utils as pb_utils

class TritonPythonModel:
    def initialize(self, args: Dict[str, str]) -> None:
        logger = pb_utils.Logger
```
And I can use the c_python_backend_utils versions at the module level (including type annotations).
So the script appears to be imported first, then the stub is set up, then the model is initialized. Ideally the order would be: set up the stub, import the script, then initialize the model.
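Given that ordering, a workaround that keeps module-level convenience (a sketch, assuming the attributes are attached and pb_utils.Logger.log_info is available by the time initialize()/execute() run in your release) is to defer the lookup behind a small helper instead of binding the attribute at import time:

```python
import triton_python_backend_utils as pb_utils


def get_logger():
    # Hypothetical helper: resolve the attribute only when it is actually
    # needed, i.e. after the stub has attached Logger to the module.
    return pb_utils.Logger


class TritonPythonModel:
    def initialize(self, args):
        get_logger().log_info("model initialized")

    def execute(self, requests):
        get_logger().log_info(f"received {len(requests)} requests")
        # Hypothetical pass-through: one empty response per request.
        return [pb_utils.InferenceResponse(output_tensors=[]) for _ in requests]
```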
I've upgraded to 23.12 and am still seeing this.