# tune
Tools for tidy parameter tuning
``` r
library(tidymodels)
#> ── Attaching packages ──────────────────────────────────── tidymodels 0.0.2 ──
#> ✔ broom    0.5.2        ✔ purrr    0.3.2
#> ✔ dials    0.0.2.9001   ✔ recipes  0.1.6.9000
#> ✔ dplyr    0.8.3        ✔ ...
```
For example, someone may want to tune a random forest or other tree-based model to maximize accuracy while using a minimal number of predictors. I'm not sure if we could/should...
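A hedged sketch of what this could look like with the current API, assuming the tidymodels packages (plus the `ranger` engine) are installed and using the built-in `two_class_dat` data: tune `mtry` and `min_n` for accuracy, then prefer the smallest `mtry` among near-best candidates via `select_by_one_std_err()`. This is one possible reading of "maximize accuracy with minimal predictors", not a confirmed design.

``` r
# Sketch: assumes tidymodels + ranger are installed.
library(tidymodels)

set.seed(1)

rf_spec <- rand_forest(mtry = tune(), min_n = tune()) |>
  set_engine("ranger") |>
  set_mode("classification")

folds <- vfold_cv(two_class_dat, v = 5)

rf_res <- tune_grid(
  rf_spec,
  Class ~ .,
  resamples = folds,
  grid = 10,
  metrics = metric_set(accuracy)
)

# Among candidates within one standard error of the best accuracy,
# take the one with the smallest mtry (fewest sampled predictors).
select_by_one_std_err(rf_res, mtry, metric = "accuracy")
```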
so that we can have a `typesum` method that prints something like `` or similar
Consider what happens when you try to tune a model with a grid containing `penalty = -1` vs. one containing `penalty = c(-1, 1)`, where `-1` is seen as a...
Since `parsnip` wants to fit the whole path (and ignores the given single penalty value), we need to find an approach for using `linear_reg(penalty = 10^-5, mixture = tune())` or...
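A minimal sketch of the usage in question, assuming tidymodels and the glmnet engine: the `penalty` value is fixed at `10^-5` in the model spec while `mixture` is marked for tuning, so only `mixture` appears in the grid.

``` r
# Sketch: assumes tidymodels + glmnet are installed.
library(tidymodels)

set.seed(1)

spec <- linear_reg(penalty = 10^-5, mixture = tune()) |>
  set_engine("glmnet")

folds <- vfold_cv(mtcars, v = 5)

# Only mixture is tuned; penalty stays fixed in the spec
res <- tune_grid(
  spec,
  mpg ~ .,
  resamples = folds,
  grid = tibble(mixture = c(0, 0.5, 1))
)

collect_metrics(res)
```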
Use the `step_spline_*()` functions instead. `tune_grid()` uses them, and maybe others do too.
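For instance, a recipe using the newer spline steps might look like the sketch below (assuming a recipes version that provides `step_spline_natural()`), with `deg_free` marked for tuning:

``` r
# Sketch: assumes tidymodels with recipes >= 1.0 (step_spline_* steps).
library(tidymodels)

rec <- recipe(mpg ~ ., data = mtcars) |>
  step_spline_natural(disp, deg_free = tune())

# The tunable spline parameter is picked up from the recipe
extract_parameter_set_dials(rec)
```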
In the code below, the `tune_grid()` function is given a workflow and (inappropriately) a recipe. The recipe is absorbed by `...`, and an unhelpful warning is issued. We...
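A hedged reproduction of that call pattern, assuming tidymodels plus glmnet: the workflow already contains the recipe, so passing the recipe again positionally lands it in `...`.

``` r
# Sketch: assumes tidymodels + glmnet are installed.
library(tidymodels)

set.seed(1)

rec <- recipe(mpg ~ ., data = mtcars) |>
  step_normalize(all_numeric_predictors())

wflow <- workflow() |>
  add_recipe(rec) |>
  add_model(linear_reg(penalty = tune()) |> set_engine("glmnet"))

folds <- vfold_cv(mtcars, v = 5)

# Incorrect usage: the workflow already has the recipe, so the extra
# positional `rec` falls into `...` and triggers the warning.
res <- tune_grid(wflow, rec, resamples = folds)
```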
Ref: https://github.com/tidymodels/rsample/issues/457 Most or all errors thrown in this package are raised via `rlang::abort()`. We are transitioning to `cli::cli_abort()` to make use of the richer styling options for errors via...
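The transition looks roughly like the following sketch (the `check_positive()` helper is hypothetical, for illustration only; it assumes the rlang and cli packages):

``` r
library(rlang)
library(cli)

# Before: abort() with manual string building
check_positive <- function(x) {
  if (x <= 0) {
    rlang::abort(paste0("`x` must be positive, not ", x, "."))
  }
  invisible(x)
}

# After: cli_abort() with inline markup ({.arg}) and glue interpolation
check_positive2 <- function(x) {
  if (x <= 0) {
    cli::cli_abort("{.arg x} must be positive, not {x}.")
  }
  invisible(x)
}
```

Both raise a classed condition that testing tools like `expect_error()` can match; `cli_abort()` additionally styles argument names and supports pluralization.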
In [this blog post](https://www.mm218.dev/posts/2024-07-19-tidymodels/), Mike writes about the costly mistake of not setting `save_pred = TRUE` and `save_workflow = TRUE` when resampling models intended for stacking. One change that...
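The settings in question can be sketched as below, assuming tidymodels; stacking later needs the out-of-sample predictions and the fitted workflow, which these control options retain.

``` r
# Sketch: assumes tidymodels is installed.
library(tidymodels)

set.seed(1)

# Both options must be TRUE for the results to be usable for stacking
ctrl <- control_resamples(save_pred = TRUE, save_workflow = TRUE)

res <- fit_resamples(
  linear_reg(),
  mpg ~ .,
  resamples = vfold_cv(mtcars, v = 5),
  control = ctrl
)

# Out-of-sample predictions are now retained in the results
collect_predictions(res)
```

When tuning, the analogous call is `control_grid(save_pred = TRUE, save_workflow = TRUE)`; the stacks package also ships `control_stack_resamples()` and `control_stack_grid()`, which set both for you.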