matthost

22 comments by matthost

Same (on v0.18.1), but the error is `‘j_decompress_ptr’ was not declared in this scope` when building from source. It seems like installing libjpeg-turbo should fix it, but it hasn't for me yet - maybe something...

Updating `TORCHVISION_INCLUDE` to include the directory containing `jpeglib.h` fixed the error for me. But yeah, it would be better if it worked as expected in the release :)
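For reference, this is roughly the environment I set before rebuilding torchvision from source. The paths are examples only - point them at wherever your libjpeg-turbo headers and libraries actually live (e.g. `brew --prefix jpeg-turbo` on macOS):

```shell
# Example locations - substitute the directory that actually contains jpeglib.h
# and the one containing the libjpeg-turbo shared libraries.
export TORCHVISION_INCLUDE=/opt/libjpeg-turbo/include
export TORCHVISION_LIBRARY=/opt/libjpeg-turbo/lib64
```

With those exported, rebuild torchvision from source in the same shell so the build picks up the headers.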

The breaking changes do not look too bad. I am considering whether to try patching this on a fork, or to try to get `torch.compile` / `torch_tensorrt.compile` working instead.

Not yet - I'm not sure whether I'll need to, after seeing that the hardware-compatibility feature is as recent as 8.6. Will share if I do. Any reason you want 10?

This isn't necessarily Windows-specific. On macOS, connecting to a Linux machine, I have been seeing this for over a year. Edit: Adding `IdentityFile` to my ssh config actually did...
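For anyone hitting the same thing, the change was an entry like this in `~/.ssh/config` (host alias, address, user, and key path below are all placeholders - use your own):

```
Host my-linux-box
    HostName 192.0.2.10
    User me
    IdentityFile ~/.ssh/id_ed25519
```

`IdentityFile` pins the exact private key to offer, instead of letting the client try every key it can find.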

Perhaps I am doing something wrong? I tried stacking the images with numpy, which did not help. Perhaps related to https://github.com/open-mmlab/mmdeploy/issues/2808
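Concretely, the stacking attempt looked roughly like this - dummy zero arrays stand in for the real decoded images, and `np.stack` adds a new leading batch axis:

```python
import numpy as np

# Three dummy HxWxC "images" standing in for real decoded frames.
imgs = [np.zeros((224, 224, 3), dtype=np.uint8) for _ in range(3)]

# Stack along a new leading axis to form an NHWC batch.
batch = np.stack(imgs)
print(batch.shape)  # (3, 224, 224, 3)
```

The resulting array has the batch dimension first, which is what a batched model input would normally expect.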

I found the `Detector` class has a `batch` method, so I gave that a try: `detector.batch([img, img, img])`. But it seems to be passing a single image to the model, which...

The `from mmdeploy.apis import inference_model` API seems to work if you pass a list to `img`.

Assuming plain TensorRT library inference using C++, based on the screenshot. My understanding is that we need to load the plugins by (1) loading the `libmmdeploy_tensorrt_ops.so` library, and (2) calling `initLibNvInferPlugins` in...
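A rough sketch of those two steps, resolving `initLibNvInferPlugins` via `dlsym` so it compiles without the TensorRT headers. The library paths and the null logger argument are assumptions - adjust them for your build:

```cpp
#include <dlfcn.h>
#include <cstdio>

// (1) dlopen the mmdeploy custom-op library with RTLD_GLOBAL so its
//     plugin creators register themselves with TensorRT's plugin registry.
// (2) Resolve initLibNvInferPlugins from libnvinfer_plugin and call it
//     to register the standard TensorRT plugins as well.
bool load_mmdeploy_plugins(const char* ops_path, const char* trt_plugin_path) {
    void* ops = dlopen(ops_path, RTLD_NOW | RTLD_GLOBAL);
    if (!ops) {
        std::fprintf(stderr, "dlopen(%s): %s\n", ops_path, dlerror());
        return false;
    }
    void* plugin_lib = dlopen(trt_plugin_path, RTLD_NOW | RTLD_GLOBAL);
    if (!plugin_lib) {
        std::fprintf(stderr, "dlopen(%s): %s\n", trt_plugin_path, dlerror());
        return false;
    }
    // Signature from NvInferPlugin.h:
    //   extern "C" bool initLibNvInferPlugins(void* logger, const char* libNamespace);
    using InitFn = bool (*)(void*, const char*);
    auto init = reinterpret_cast<InitFn>(dlsym(plugin_lib, "initLibNvInferPlugins"));
    if (!init) {
        std::fprintf(stderr, "dlsym: %s\n", dlerror());
        return false;
    }
    return init(nullptr, "");  // null logger is an assumption for this sketch
}
```

You would call this once, e.g. `load_mmdeploy_plugins("libmmdeploy_tensorrt_ops.so", "libnvinfer_plugin.so")`, before deserializing the engine, so the custom-op creators are registered when the engine is loaded.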

I also don't see `initLibNvInferPlugins` anywhere in the mmdeploy repo. Not sure how they are doing it so far.