
How to deal with unsupported operations in TensorRT-LLM?

Open EpiSlim opened this issue 1 year ago

Hi team, I am trying to add a custom model to TRT-LLM. The original model in PyTorch uses torch.complex for a few tensors, in addition to FFT ops like torch.fft.fft, torch.fft.ifft, and torch.fft.rfft. From the documentation, it seems that such ops are not directly supported in TRT-LLM. Is there a workaround for this? For example, is it possible to mix TRT-LLM modules with PyTorch modules?
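One common workaround for a missing complex dtype is to carry the real and imaginary parts as two separate real tensors, so every complex op becomes a small combination of real ops. This is a minimal NumPy sketch of that idea (not TRT-LLM code; the function name is hypothetical), shown for complex multiplication:

```python
import numpy as np

def complex_mul(ar, ai, br, bi):
    # Complex multiply using only real tensors:
    # (ar + i*ai) * (br + i*bi) = (ar*br - ai*bi) + i*(ar*bi + ai*br)
    return ar * br - ai * bi, ar * bi + ai * br

# Check the decomposition against native complex arithmetic.
a = np.array([1.0, 2.0]) + 1j * np.array([3.0, -1.0])
b = np.array([0.5, 4.0]) + 1j * np.array([2.0, 1.0])
cr, ci = complex_mul(a.real, a.imag, b.real, b.imag)
assert np.allclose(cr + 1j * ci, a * b)
```

Since the real and imaginary parts are plain real tensors, they can flow through ordinary TRT-LLM elementwise and matmul layers.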

EpiSlim avatar Sep 11 '24 14:09 EpiSlim

@EpiSlim currently, mixing Torch and TRT ops is not supported.

However, it would be possible if you wrap the torch operations in a TRT plugin using the TRT Python plugin interface.
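As an alternative to writing a plugin, the FFT ops specifically can sometimes be rewritten as dense matrix multiplications, since the DFT is linear, and matmuls are natively supported in TRT-LLM. This is a hedged NumPy sketch of the equivalence (the function name is hypothetical; for long sequences a plugin is likely faster, since the matmul form is O(N²)):

```python
import numpy as np

def rfft_as_matmul(x):
    # Real FFT of a length-N signal expressed as two real matmuls.
    # The DFT is linear: X[k] = sum_n x[n] * exp(-2j*pi*k*n/N),
    # so its real/imag parts are fixed cos/sin matrices that can be
    # baked into the network as constant weights.
    n = x.shape[-1]
    k = np.arange(n // 2 + 1)[:, None]   # rfft keeps N//2+1 bins
    t = np.arange(n)[None, :]
    angle = -2.0 * np.pi * k * t / n
    re = x @ np.cos(angle).T             # real part of the spectrum
    im = x @ np.sin(angle).T             # imaginary part of the spectrum
    return re, im

# Verify against NumPy's reference rfft.
x = np.random.default_rng(0).standard_normal(8)
re, im = rfft_as_matmul(x)
ref = np.fft.rfft(x)
assert np.allclose(re, ref.real) and np.allclose(im, ref.imag)
```

torch.fft.ifft can be handled the same way with the conjugate-transposed matrices, and the resulting real/imaginary outputs stay in plain real tensors throughout the network.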

Would you mind sharing your model with more context, so that this feature can be considered in future planning?

litaotju avatar Sep 30 '24 13:09 litaotju

This issue is stale because it has been open for 30 days with no activity. Remove the stale label or comment, or this will be closed in 15 days.

github-actions[bot] avatar Oct 31 '24 02:10 github-actions[bot]

Closing this as stale. Feel free to reopen.

poweiw avatar May 21 '25 20:05 poweiw