
Extracting feature importance

adam-haber opened this issue · 1 comment

Is it possible to extract "feature importance" from a WTTE-RNN model?

adam-haber · Dec 03 '17 15:12

In theory, feature importance for neural networks is a tricky question, and as far as I know there is no neat trick or clear winning strategy like there is for trees (though I know very little about the subject).

There are some approaches, ranging from advanced plotting, to checking the derivative of the output w.r.t. the inputs, to training with dropout, shutting off certain inputs, and evaluating on the test set. Maybe this question can get you started.
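The "shutting off certain inputs and evaluating on the test set" idea can be approximated at evaluation time as permutation importance: shuffle one feature across sequences, re-score the model on held-out data, and measure the drop. A minimal, framework-agnostic sketch; the `score_fn`, `predict`, and toy data here are hypothetical stand-ins, not part of the wtte-rnn API:

```python
import numpy as np

def permutation_importance(score_fn, X, n_repeats=5, seed=0):
    """Permutation importance for sequence data.

    score_fn(X) -> float is a held-out evaluation of a trained
    model (e.g. mean log-likelihood); higher is better.
    X has shape (n_sequences, n_timesteps, n_features).
    """
    rng = np.random.default_rng(seed)
    baseline = score_fn(X)
    importances = np.zeros(X.shape[-1])
    for j in range(X.shape[-1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            # shuffle feature j across sequences, leaving others intact
            perm = rng.permutation(X.shape[0])
            Xp[:, :, j] = X[perm, :, j]
            drops.append(baseline - score_fn(Xp))
        importances[j] = float(np.mean(drops))
    return importances

# Toy check: a fake "model" that only uses feature 0, so only
# shuffling feature 0 should hurt the score.
rng = np.random.default_rng(1)
X = rng.normal(size=(64, 10, 3))
y = X[:, :, 0].sum(axis=1)
predict = lambda data: data[:, :, 0].sum(axis=1)
score_fn = lambda data: -float(np.mean((predict(data) - y) ** 2))
imp = permutation_importance(score_fn, X)
```

For a real WTTE-RNN, `score_fn` could be the mean Weibull log-likelihood of the model's predictions on a test set; the shuffling itself stays the same.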

In practice, in my experience a more informal analysis can be helpful with wtte-rnn: just eyeball plots of the sequences of predictions and features lined up, and (depending on your problem/data) you can sometimes see quite clearly that certain data makes an impact on the state of the RNN. This won't help with feature selection, of course.
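The "lined up" plots above can be as simple as stacked subplots sharing a time axis, one for a feature and one for a predicted Weibull parameter. A hedged sketch with synthetic data (the step-shaped feature and the `pred_alpha` series are made up for illustration, not model output):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt

# Synthetic sequence: a feature that switches on at t=30, and a
# hypothetical predicted Weibull alpha that drops in response.
T = 50
t = np.arange(T)
rng = np.random.default_rng(0)
feature = (t > 30).astype(float) + rng.normal(scale=0.05, size=T)
pred_alpha = 10.0 - 8.0 * (t > 30) + rng.normal(scale=0.2, size=T)

# Stack the feature and the prediction on a shared time axis so
# responses of the model state to the input are easy to eyeball.
fig, (ax0, ax1) = plt.subplots(2, 1, sharex=True, figsize=(8, 4))
ax0.plot(t, feature)
ax0.set_ylabel("feature")
ax1.plot(t, pred_alpha)
ax1.set_ylabel("predicted alpha")
ax1.set_xlabel("timestep")
fig.savefig("aligned_predictions.png")
```

With real data you would plot each candidate feature against the predicted alpha/beta sequences for a handful of individuals and look for consistent co-movement.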

ragulpr · Dec 03 '17 16:12