
parsnip wrappers for tree-based models

Results: 20 bonsai issues

The error message `Error in init(env): For early stopping, valids must have at least one element` is produced when trying to train a multi-class model with `set_engine("lightgbm", ...`

bug
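One avenue the issue hints at: bonsai's lightgbm engine documents a `validation` engine argument (the proportion of training data held out for the internal validation set that early stopping requires). A minimal sketch of a spec that requests early stopping along with a validation split; whether this combination works for the multi-class case is exactly what the bug report is about:

```r
library(bonsai)   # also loads parsnip

# Hedged sketch: stop_iter enables early stopping; `validation` asks
# bonsai to hold out 20% of the training data as the lightgbm valids set.
lgb_spec <- boost_tree(trees = 500, stop_iter = 10) |>
  set_engine("lightgbm", validation = 0.2) |>
  set_mode("classification")
```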

Within LightGBM, [`num_leaves`](https://lightgbm.readthedocs.io/en/latest/Parameters.html#num_leaves) is capped at 2^[`max_depth`](https://lightgbm.readthedocs.io/en/latest/Parameters.html#max_depth). For example, if `num_leaves` is set to 1000 and `max_depth` is set to 5, then LightGBM will likely end...

feature
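The interaction above can be made concrete with a sketch (illustrative only; in bonsai, parsnip's `tree_depth` maps to LightGBM's `max_depth`, while `num_leaves` is passed as an engine argument):

```r
library(bonsai)   # also loads parsnip

# With max_depth = 5, LightGBM can grow at most 2^5 = 32 leaves, so a
# num_leaves of 1000 is effectively capped at 32 and silently ignored.
spec <- boost_tree(tree_depth = 5) |>
  set_engine("lightgbm", num_leaves = 1000) |>
  set_mode("regression")
```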

Thanks for creating this excellent package. I created a [similar fork of treesnip](https://gitlab.com/ccao-data-science---modeling/packages/lightsnip) but am planning to replace it with `{bonsai}` in all our production models. One feature that I...

feature

There is some code in `{bonsai}` that looks like it was intended to support `multi_predict(..., type = "raw")` for `{lightgbm}` classification models. https://github.com/tidymodels/bonsai/blob/6c090e16f1a5476da1699ff14d8927f92fbe2c83/R/lightgbm_data.R#L146-L158 However, I don't believe `{bonsai}` actually respects...

bug

aorsf is a great addition to bonsai! Any chance of supporting `mtry_prop`? ``` r library(tidymodels) library(bonsai) set.seed(1) folds <- ... set_mode("regression") lgbm_wflow <- ... |> add_model(mod_lgbm) |> add_recipe(rec) aorsf_wflow <- ... |> add_model(mod_aorsf) |> add_recipe(rec) # lightgbm supports...

feature

Hi, the model fit fails if `mtry` is specified for the `aorsf` engine. If it is not specified, the fit works with the engine's default values. ```r library(bonsai) #> Loading required package:...

``` → A | warning: The following argument(s) are guarded by bonsai and will not be passed to LightGBM: init_model There were issues with some computations A: x55 ``` In...

It appears to be set up correctly; we just need to test that data flows correctly through `fit()` and `predict()`.

Noticed while working on emlwr the other day that `bonsai::train_lightgbm()` is quite a bit slower than `lightgbm::lgb.train()`, probably due to the handling of categorical variables / conversion to `lgb.Dataset`. Observed...

I recently discovered that LightGBM has a [`multiclassova` objective function](https://lightgbm.readthedocs.io/en/latest/Parameters.html#core-parameters) (multi-class one-vs-all) that treats multi-class classification as a set of binary classification tasks. Getting quite decent results with this approach...
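A hedged sketch of what using that objective through bonsai might look like. Note the assumption here: bonsai normally chooses an objective automatically from the model mode, so whether a user-supplied `objective` engine argument is forwarded to `lgb.train()` (rather than guarded or overridden) is part of what this request would need to settle:

```r
library(bonsai)   # also loads parsnip

# Assumed pass-through: ask LightGBM for one-vs-all multi-class
# training via its multiclassova objective.
ova_spec <- boost_tree() |>
  set_engine("lightgbm", objective = "multiclassova") |>
  set_mode("classification")
```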