
integration with Flux

Open CarloLucibello opened this issue 3 years ago • 2 comments

It would be helpful to add to the documentation an example of integrating DataAugmentation.jl into a pure Flux pipeline, e.g. https://github.com/FluxML/model-zoo/blob/master/vision/vgg_cifar10/vgg_cifar10.jl

An alternative is to modify that model zoo example by adding data augmentation, which is quite standard on CIFAR10. I could do that if you can provide some pointers.

CarloLucibello avatar Oct 13 '21 11:10 CarloLucibello

The most flexible way to drop DataAugmentation.jl into any workflow is to write a function that augments a single image and use that function in the rest of the pipeline. For example:

using DataAugmentation

function augmentimage(img, sz; augmentations = DataAugmentation.Identity())
    # compose a random resized crop with any additional augmentations
    tfm = RandomResizeCrop(sz) |> augmentations
    # apply the transform to the wrapped image item and unwrap the augmented image
    return apply(tfm, Image(img)) |> itemdata
end

How to integrate with the rest of the workflow depends on what other tools you're using, for example:

  • use it in an iterator or comprehension together with Flux.DataLoader (see the sketch below)
  • mapobs it over a data container and use that with DataLoaders.DataLoader
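
As a minimal sketch of the first option, assuming you already have a vector of images imgs and a matching vector of targets labels (both names are just illustrative, not part of any package API):

using Flux

nepochs = 10
for epoch in 1:nepochs
    # re-augment every epoch so each pass sees fresh random crops
    xs = [augmentimage(img, (32, 32)) for img in imgs]
    loader = Flux.DataLoader((xs, labels); batchsize = 32, shuffle = true)
    for (x, y) in loader
        # x is still a vector of augmented images here; stack it into an array
        # of the layout your model expects before the actual training step
    end
end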

Hope this helps, let me know if you need any other pointers.

lorenzoh avatar Oct 13 '21 14:10 lorenzoh

As far as I can tell, outputs of DataAugmentation.jl are in Images.jl's HWC (height, width, color-channels) format, whereas Flux generally recommends WHCN, e.g. in the Conv docstring:

Image data should be stored in WHCN order (width, height, channels, batch).

The pretrained models in Metalhead.jl also require inputs in WHCN format.

It would therefore be nice to have transformations that (a manual workaround is sketched after this list):

  1. permute the height and width dimensions
  2. add a batch dimension
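
In the meantime, a rough sketch of doing this conversion by hand, assuming the augmented output img is an Images.jl image (an H×W matrix of RGB colorants; the function name towhcn is just illustrative):

using ImageCore  # provides channelview, part of the Images.jl ecosystem

# Convert an H×W image of RGB colorants into a W×H×C×1 Float32 array (WHCN, batch size 1).
function towhcn(img)
    chw = Float32.(channelview(img))      # C×H×W array of channel values
    whc = permutedims(chw, (3, 2, 1))     # reorder to W×H×C
    return reshape(whc, size(whc)..., 1)  # append a singleton batch dimension
end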

adrhill avatar Feb 08 '24 15:02 adrhill