JIT not working for (de)compress
Bug
I see that you are testing whether JIT of the EntropyBottleneck works (and it does), but scripting the compress/decompress methods of an EntropyBottleneck does not work, which means that the EntropyBottleneck as a whole cannot really be used with JIT.
To Reproduce
from compressai.entropy_models import EntropyBottleneck
import torch
torch.jit.script(EntropyBottleneck(4).compress)
torch.jit.script(EntropyBottleneck(4).decompress)

Expected behavior
Successful JIT conversion.
Environment
- compressai==1.1.5
- torch==1.7.1
Hey, thanks for the report. I don't think there's anything we can do at this point to address this. The entropy modules' compress/decompress methods rely on inheritance and dynamic lists, which are not supported by TorchScript. I'm not even sure we could export these methods anyway, since they also call a C++ extension. Any reason you want to use TorchScript here?
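To illustrate the limitation described above, here is a minimal sketch (the Scriptable/Unscriptable modules are made-up stand-ins, not compressai code) showing that TorchScript handles plain tensor ops fine but rejects a compress-style method that builds a heterogeneous Python list:

```python
import torch
from torch import nn, Tensor


class Scriptable(nn.Module):
    """Plain tensor arithmetic: torch.jit.script handles this."""

    def forward(self, x: Tensor) -> Tensor:
        return x * 2.0


class Unscriptable(nn.Module):
    """Mimics a compress-style method that mixes tensors and byte
    strings in one Python list, a pattern TorchScript rejects."""

    def forward(self, x: Tensor) -> Tensor:
        return x * 2.0

    @torch.jit.export
    def compress_like(self, x: Tensor):
        out = []                  # inferred as List[Tensor]
        out.append(x.sum())
        out.append("header")      # appending a str breaks scripting
        return out


scripted = torch.jit.script(Scriptable())   # succeeds

try:
    torch.jit.script(Unscriptable())        # compile error
    scripting_failed = False
except Exception:
    scripting_failed = True
```

This is only a toy reproduction of the "dynamic lists" problem; the real compress/decompress paths also dispatch through inherited helpers and a C++ range-coder extension, which scripting cannot capture either.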
I see, no worries.
Nothing crucial: I put a new pretrained compressor on Torch Hub, which gives users the option of downloading a scripted version of the model, and mine wasn't working, which is how I noticed this.
Ah yes, that's an interesting use case. I'll keep this open for now until we figure out the next steps.
Closing for now; accepting a related PR.