SVM-RFE
I see that the "rfe" method of the fselect() function does not seem to support SVMs: it requires the learner to have the "importance" property, which classif.svm does not provide, so it raises the error:
Error in learner$importance() : attempt to apply non-function
How can I implement RFE with an SVM in mlr3?
Here is my code, which does not achieve this:
task <- tsk("pima")
learner <- lrn("classif.svm")
instance <- fselect(
method = "rfe",
task = task,
learner = learner,
resampling = rsmp("cv", folds = 10),
measure = msr("classif.ce"),
store_models = TRUE,
term_evals = 10
)
Which importance method for an SVM do you have in mind?
For example, recursive feature elimination? But how can I implement this method for an SVM?
I meant the importance method of the SVM, on whose basis the recursive feature elimination would take place.
The error you showed (Error in learner$importance() : attempt to apply non-function) happens because the learner does not have an importance method, which you can read about here.
The error message should be better, I agree, so I opened an issue: https://github.com/mlr-org/mlr3fselect/issues/55. You can read more about RFE here.
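A quick way to check this is a learner's properties field, which lists whether it provides importance. A small illustration, assuming mlr3verse is installed:

library(mlr3verse)

# "importance" is missing from the SVM learner's properties
lrn("classif.svm")$properties

# a decision tree, for example, does list it
lrn("classif.rpart")$properties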
@xhxlilium You are probably confused because there are approaches for using RFE with an SVM. In fact, the initial request for RFE was for use with an SVM: https://github.com/mlr-org/mlr3fselect/issues/1. Unfortunately, we cannot support this; see https://github.com/mlr-org/mlr3learners/issues/161. However, RFE works for a plethora of models and is not limited to the SVM. You can use RFE with any learner that has the importance property. You can search for these learners at https://mlr-org.com/learners.html; just type importance into the search box.
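For example, here is a minimal sketch of RFE with a learner that does have the importance property (classif.rpart on the pima task from the original post; this assumes a recent mlr3fselect, where the method is passed as fs("rfe") rather than method = "rfe"):

library(mlr3verse)

task <- tsk("pima")
# rpart reports variable importance, so RFE can rank and drop features
learner <- lrn("classif.rpart")

instance <- fselect(
  fselector = fs("rfe"),
  task = task,
  learner = learner,
  resampling = rsmp("cv", folds = 10),
  measures = msr("classif.ce"),
  store_models = TRUE
)

# selected feature subset and the archive of evaluated subsets
instance$result_feature_set
instance$archive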
I will add a better error message. Thanks.
RFE now works on an SVM: https://mlr-org.com/gallery/optimization/2023-02-07-recursive-feature-elimination/#support-vector-machine.
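For reference, a rough sketch of the approach shown in the linked gallery post (assumptions: a current mlr3verse, the mlr3fselect.svm_rfe callback shipped with mlr3fselect, and the sonar task used in the gallery; the callback derives importance from the weights of a linear SVM, so the kernel must be linear):

library(mlr3verse)

task <- tsk("sonar")
# a linear kernel is required so that feature weights can be extracted
learner <- lrn("classif.svm", type = "C-classification", kernel = "linear")

instance <- fselect(
  fselector = fs("rfe"),
  task = task,
  learner = learner,
  resampling = rsmp("cv", folds = 10),
  measures = msr("classif.ce"),
  # the callback attaches an $importance() method to the SVM learner
  callbacks = list(clbk("mlr3fselect.svm_rfe")),
  store_models = TRUE
)

instance$result_feature_set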
Cool! Thanks very much!