distiller
Adaptation of Distiller for graphConvolution (Pytorch Geometric and custom modules)
Dear people of distiller,
I would like to add pruning to my project: https://github.com/nicolas-chaulet/deeppointcloud-benchmarks. It contains SOTA models for point cloud data. How complex would it be to extend distiller to support at least PyTorch Geometric (https://github.com/rusty1s/pytorch_geometric), or the custom modules in my project?
Best, Thomas Chaton
Hi @tchaton,
Sorry for the late response...
Although we haven't yet tested the benchmarks for graph convolutions, pruning (at least fine-grained) should work the same for a graph conv as it would for a regular conv / linear layer.
In fact, as defined in the GCN paper, GCNConv is just a regular linear layer operation on the feature map X, multiplied on the left by a 'normalized' adjacency matrix. You could view this operation as analogous to linear->layer_norm. Of course, the linear layer consists of a parameter (in this case, theta), and you could apply pruning to this layer using our framework with the compression scheduler, no problem. The same goes for every other parametric layer in the network.
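To illustrate the point above, here is a minimal numpy sketch (not Distiller or PyTorch Geometric code; the function names and shapes are assumptions for illustration). It writes GCNConv in the Kipf & Welling form X' = D^{-1/2}(A + I)D^{-1/2} X Theta, so the learnable part is just the matrix Theta of a linear layer, and applies fine-grained magnitude pruning to it with a binary mask, exactly as one would for a conv/linear weight:

```python
import numpy as np

def gcn_conv(adj, x, theta):
    """GCNConv sketch: adj is (n, n), x is (n, f_in), theta is (f_in, f_out)."""
    a_hat = adj + np.eye(adj.shape[0])          # add self-loops
    deg = a_hat.sum(axis=1)                     # node degrees of A + I
    d_inv_sqrt = np.diag(deg ** -0.5)           # D^{-1/2}
    norm_adj = d_inv_sqrt @ a_hat @ d_inv_sqrt  # 'normalized' adjacency
    return norm_adj @ (x @ theta)               # linear op on X, then propagate

def magnitude_prune(theta, sparsity):
    """Fine-grained pruning: zero the smallest-magnitude entries of theta."""
    k = int(theta.size * sparsity)
    threshold = np.sort(np.abs(theta), axis=None)[k]
    mask = (np.abs(theta) >= threshold).astype(theta.dtype)
    return theta * mask, mask

rng = np.random.default_rng(0)
adj = (rng.random((4, 4)) > 0.5).astype(float)
adj = np.maximum(adj, adj.T)                    # symmetric toy graph
x = rng.standard_normal((4, 8))
theta = rng.standard_normal((8, 16))

pruned_theta, mask = magnitude_prune(theta, sparsity=0.5)
out = gcn_conv(adj, x, pruned_theta)            # shapes are unchanged by pruning
```

Since the adjacency multiplication carries no parameters, only Theta is prunable, which is why a scheduler that prunes linear-layer weights applies unchanged here.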
Cheers, Lev
Hey @levzlotnik,
That is what I thought; I will give it a try!
Best regards, T.C