Xiangyu Guo
Check out the Colab notebook the author shared; it has the SR inference code.
I guess the main question is why `weighting` is calculated like that; `normalization`, I'd guess, is just a byproduct of the `weighting`.
Simply running `pip install -e .` in the root directory should resolve your issue.
Where is this line coming from?
You just need to implement a dataset and point the config at it, that's all.
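For reference, here's a minimal sketch of what such a dataset could look like. The class name, the `file_path_` key, and the folder layout are all illustrative, not the repo's actual API; a real implementation would subclass `torch.utils.data.Dataset` and load/transform the images in `__getitem__`:

```python
import glob
import os


class FolderDataset:
    """Toy map-style dataset over an image folder.

    In a real repo this would subclass torch.utils.data.Dataset,
    but the required interface is just __len__ and __getitem__.
    """

    def __init__(self, root, ext="png"):
        # Collect all image paths under `root` with the given extension.
        self.paths = sorted(glob.glob(os.path.join(root, f"*.{ext}")))

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        # A real implementation would open and preprocess the image here;
        # returning the path keeps this sketch dependency-free.
        return {"file_path_": self.paths[idx]}
```

Many of these repos instantiate the dataset from a config entry (e.g. a `target:` import path plus `params:`), so once the class exists you just reference it there.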
@XavierXiao I bet that with activation checkpointing plus bf16 this is achievable, or at least close.
@XavierXiao try the FairScale checkpointing; it saves more memory than the one implemented in OpenAI's code.
You can debug this issue by checking the output of each step. It's likely a NaN issue from fp16 (which you can resolve by switching to fp32), or some weights are...
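As a toy illustration of why fp16 can blow up while fp32 stays fine (the numbers are made up; the relevant fact is that fp16's largest representable value is about 65504, so intermediates past that become inf/NaN):

```python
import numpy as np


def first_bad_step(values, dtype):
    """Run a toy accumulation in the given dtype and return the index of the
    first non-finite intermediate, or None if every step stays finite.

    This mimics the "check the output of each step" debugging approach:
    walk the computation and stop at the first inf/NaN.
    """
    acc = dtype(1.0)
    for i, v in enumerate(values):
        acc = dtype(acc * v)  # cast each intermediate back to the target dtype
        if not np.isfinite(acc):
            return i
    return None


# 300 * 300 = 90000 already exceeds fp16's max (~65504), so fp16 overflows
# to inf on the second multiply, while fp32 handles the same values easily.
steps = [300.0, 300.0, 300.0, 300.0]
```

The same per-step finiteness check (e.g. `torch.isfinite(x).all()` after each layer or loss term) is how you'd localize which op first produces the NaN before deciding whether fp32, loss scaling, or a weight fix is the right remedy.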
Hi zepmck, curious what the annotations look like?
I think some of the recent semantic segmentation work might be similar to what you're trying to achieve. Basically, you frame the problem as an image-to-image translation problem: raw image...