
[FAQ] How to extract feature importance?

Open SalvatoreRa opened this issue 9 months ago • 1 comments

Hi, great project!

Is there a way to extract feature importance for a model once you have trained it? Or to apply other interpretability algorithms?

have a nice day,

Salvatore

SalvatoreRa avatar Mar 14 '25 18:03 SalvatoreRa

Since most models included in the library do not inherently support feature importances, there isn't a built-in way to extract them. However, since everything is in torch, the library is compatible with interpretability libraries such as shap or captum.
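To illustrate, here is a minimal sketch of gradient-based feature importance on any trained torch model; the model and data below are stand-ins, not DeepTab's actual API, and captum's `Saliency` / `IntegratedGradients` automate the same idea with more rigor.

```python
# Minimal sketch: gradient-based feature importance for a trained
# torch model. The model and inputs are hypothetical stand-ins.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a trained tabular model (4 features -> 1 output).
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
model.eval()

X = torch.randn(16, 4, requires_grad=True)  # a batch of tabular inputs
model(X).sum().backward()                   # d(output)/d(input) per sample

# Mean absolute input gradient per feature = a crude importance score.
importance = X.grad.abs().mean(dim=0)
print(importance.shape)  # torch.Size([4])
```

Since the gradients flow through the whole torch graph, the same pattern works for any differentiable model in the library.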

AnFreTh avatar Mar 18 '25 11:03 AnFreTh