Mateen Ulhaq
### Automatic definition of `load_state_dict` and `update`

```python
from compressai.models.base import CompressionModel

class ExampleModel(CompressionModel):
    def __init__(self, N, M):
        super().__init__()
        self.entropy_bottleneck_1 = EntropyBottleneck(M)
        self.entropy_bottleneck_2 = EntropyBottleneck(M)
        ...
        self.gaussian_conditional_1 = GaussianConditional(None)
        self.gaussian_conditional_2...
```
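To illustrate the idea behind the automatic behavior, here is a minimal, compressai-free sketch: the base class walks its own attributes and updates every entropy model it finds, so subclasses never need to override `update` themselves. The `Mini*` class names are stand-ins for illustration, not CompressAI's actual implementation.

```python
class MiniEntropyBottleneck:
    """Stand-in for an entropy model whose CDF tables need refreshing."""
    def __init__(self):
        self.updated = False

    def update(self):
        self.updated = True

class MiniCompressionModel:
    """Stand-in base class: discovers entropy-model attributes automatically."""
    def update(self):
        for attr in vars(self).values():
            if isinstance(attr, MiniEntropyBottleneck):
                attr.update()

class MiniExample(MiniCompressionModel):
    def __init__(self):
        # No custom update() needed; the base class finds both bottlenecks.
        self.eb_1 = MiniEntropyBottleneck()
        self.eb_2 = MiniEntropyBottleneck()

m = MiniExample()
m.update()
print(m.eb_1.updated, m.eb_2.updated)  # True True
```

In the real library, the same discovery is what lets `CompressionModel` provide working `update` and `load_state_dict` for any subclass that registers its entropy models as submodules.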
Yes, their non-trainable parameters should all have the same default initial values. This is fine. The section I wrote about "Different scale tables" was just in case you wanted to...
Yes, these are `N, M`, and are used when calling the model's `__init__`: https://github.com/InterDigitalInc/CompressAI/blob/ee91d536bd934fc1f8b1532f78db4c94072ae26d/compressai/models/google.py#L242 The quality parameter refers to different image quality targets that each separate model was trained on....
The qualities are just a convenient way to number the different trained models, which produce small files (quality=1) or large files (quality=8). The lambdas shown were used to train these...
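The relationship between lambda and quality can be sketched with the usual rate-distortion Lagrangian, which mirrors the `RateDistortionLoss` used in CompressAI's training examples. The `255 ** 2` scaling assumes MSE computed on `[0, 1]`-normalized pixels; the lambda values below are illustrative, not the actual trained values.

```python
def rd_loss(bpp, mse, lmbda):
    """Rate-distortion trade-off: small lambda -> prioritize rate (small
    files), large lambda -> prioritize distortion (high-quality files)."""
    return lmbda * 255 ** 2 * mse + bpp

# With a larger lambda, the same distortion costs more, so training
# drives the model toward lower MSE at the expense of more bits.
low_quality = rd_loss(bpp=0.2, mse=0.001, lmbda=0.001)
high_quality = rd_loss(bpp=0.2, mse=0.001, lmbda=0.1)
print(low_quality < high_quality)  # True
```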
Yes, that is correct.
aux_loss is described here:

- https://interdigitalinc.github.io/CompressAI/models.html#compressai.models.CompressionModel.aux_loss
- https://github.com/InterDigitalInc/CompressAI/issues/167

Usually, it goes down to ~30 after the first few epochs. It's unusual that it stays large in your case. Possible scenarios:...
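Conceptually, the aggregate aux loss is just the sum of each entropy bottleneck's own quantile loss, which is why one misbehaving bottleneck keeps the total large. Here is a hedged sketch of that aggregation, based on the documented behavior rather than the actual source; `FakeEntropyBottleneck` is a stand-in for illustration.

```python
class FakeEntropyBottleneck:
    """Stand-in for EntropyBottleneck; .loss() would normally measure how
    far the learned .quantiles are from the distribution's true tails."""
    def __init__(self, loss_value):
        self._loss = loss_value

    def loss(self):
        return self._loss

def aux_loss(modules):
    # Sum the quantile loss over every entropy bottleneck in the model.
    return sum(m.loss() for m in modules if isinstance(m, FakeEntropyBottleneck))

bottlenecks = [FakeEntropyBottleneck(12.0), FakeEntropyBottleneck(18.0)]
print(aux_loss(bottlenecks))  # 30.0
```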
The RD loss (0.6) looks good. My guess is that at least one of the entropy bottlenecks' `.quantiles` is not being optimized. Could you post the code for your model...
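One way to check this guess is to verify that every `.quantiles` parameter actually reached the aux optimizer. The helper below is hypothetical (not part of CompressAI); it relies on the fact that PyTorch optimizers hold references to the same tensor objects, so identity comparison works. Plain `object()` instances stand in for real tensors here.

```python
def missing_from_aux_optimizer(named_params, optimizer_params):
    """Return names of '.quantiles' parameters not covered by the optimizer."""
    opt_ids = {id(p) for p in optimizer_params}
    return [name for name, p in named_params
            if name.endswith(".quantiles") and id(p) not in opt_ids]

# Stand-ins for tensors: two quantile parameters and one conv weight.
q_low, q_high, weight = object(), object(), object()
named = [
    ("eb_l.quantiles", q_low),
    ("eb_h.quantiles", q_high),
    ("g_a.0.weight", weight),
]

# Simulate an aux optimizer that only received one of the two bottlenecks:
print(missing_from_aux_optimizer(named, [q_low]))  # ['eb_h.quantiles']
```

With a real model, you would pass `model.named_parameters()` and the parameters collected from the aux optimizer's `param_groups`; any name in the result is a bottleneck whose quantiles are silently not being trained.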
It might be because of the custom `HighLowEntropyBottleneck`. If `HighLowEntropyBottleneck` is implemented like this:

```python
class HighLowEntropyBottleneck(EntropyModel):
    def __init__(self, N):
        self.eb_l = EntropyBottleneck(N)
        self.eb_h = EntropyBottleneck(N)
```

Then try this:...
The code you posted seems fine to me after a bit of review. I can't see why it's not reducing the aux loss. The parameters for the aux optimizer are...
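For reference, CompressAI's example training script (at the time of writing) splits the parameters by name: the aux optimizer gets exactly the parameters whose name ends in `.quantiles`, and the main optimizer gets the rest. A minimal sketch of that split, operating on parameter names only:

```python
def split_parameter_names(names):
    """Partition parameter names the way the example training script does:
    '.quantiles' parameters go to the aux optimizer, the rest to the main one."""
    aux = {n for n in names if n.endswith(".quantiles")}
    main = set(names) - aux
    return main, aux

names = [
    "g_a.0.weight",
    "entropy_bottleneck_1.quantiles",
    "entropy_bottleneck_2.quantiles",
]
main, aux = split_parameter_names(names)
print(sorted(aux))
# ['entropy_bottleneck_1.quantiles', 'entropy_bottleneck_2.quantiles']
```

If a custom entropy model's quantile parameters are registered under a name that does not end in `.quantiles`, this split would quietly leave them out of the aux optimizer, which would explain an aux loss that never decreases.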