Eduardo Hirata-Miyasaki
I merged #114. @ziw-liu feel free to make the changes
From the discussion with @ziw-liu, we will move forward by temporarily adding `UNeXt2_2D` as an alias for fcmae for the 0.2.0 release. Merging the architectures right now is not feasible...
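For illustration only, a minimal sketch of the aliasing approach (the class and constructor here are stand-ins, not the repo's actual API): configs that reference `UNeXt2_2D` keep working in 0.2.0 without merging the two implementations.

```python
# Stand-in for the existing fcmae model class (illustrative, not the real class).
class FullyConvolutionalMAE:
    def __init__(self, in_channels: int = 1, out_channels: int = 1) -> None:
        self.in_channels = in_channels
        self.out_channels = out_channels


# Temporary alias for the 0.2.0 release; to be removed once the
# architectures are actually unified.
UNeXt2_2D = FullyConvolutionalMAE

# Existing configs/scripts can keep instantiating the 2D entry point by name.
model = UNeXt2_2D(in_channels=1, out_channels=2)
```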
Re-tested this. Seems to pass CI after merging #244. @ziw-liu
closing in favor of #273
I find it more useful when it's decompressed rather than compressed. We can report both if needed. I think `zarr.array` exposes `nbytes_stored`. What do you guys think?
ended up adding `uncompressed size [GB]`
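For reference, a minimal sketch of reporting both sizes from a zarr array (assuming zarr-python v2, where `nbytes_stored` is a property; in v3 it is a method, and the array contents below are just a placeholder):

```python
import numpy as np
import zarr

# Placeholder data; in practice this would be the dataset's zarr array.
arr = zarr.array(np.random.rand(64, 512, 512), chunks=(1, 512, 512))

compressed_gb = arr.nbytes_stored / 1e9  # bytes actually stored (compressed)
uncompressed_gb = arr.nbytes / 1e9       # decompressed (in-memory) size

print(f"compressed size [GB]: {compressed_gb:.3f}")
print(f"uncompressed size [GB]: {uncompressed_gb:.3f}")
```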
Using prefetch=4 vs. prefetch=2 has no effect on the training speed for the neuromast VS training. Here we are mostly limited by the CPU->GPU transfer.
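A minimal sketch of the knobs being compared, using a toy PyTorch `DataLoader` (values and dataset are illustrative, not the actual training config); with the CPU->GPU copy as the bottleneck, raising `prefetch_factor` from 2 to 4 is not expected to change throughput.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset standing in for the real training data.
dataset = TensorDataset(
    torch.randn(1024, 1, 256, 256), torch.randn(1024, 1, 256, 256)
)

loader = DataLoader(
    dataset,
    batch_size=16,
    num_workers=4,      # CPU workers doing loading/augmentation
    prefetch_factor=2,  # batches prefetched per worker; 2 vs. 4 made no difference here
    pin_memory=True,    # pinned host memory speeds up host-to-device copies
)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
for x, y in loader:
    # non_blocking copies can overlap the transfer with compute when memory is pinned
    x = x.to(device, non_blocking=True)
    y = y.to(device, non_blocking=True)
    break
```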
No, this branch is going to be closed in favor of #240
We will make a new folder under `examples/configs` and add a text file that asks users to use the configs in the repo.
As discussed, the HF model is pinned to use