Lukas Burk

53 comments by Lukas Burk

Ah, I totally forgot that was even an option, sorry! Yes, when I try that the behavior is exactly the same. The only difference of course being the first line...

Ah too bad, but oh well, will watch that other issue as well just in case, thanks! There were two workarounds stated in https://github.com/getcursor/cursor/issues/1027#issuecomment-2454651328, so for anyone else running into...

Sure! This issue is the only thing blocking me from going all-in on Positron for my R projects, so I'm happy to do all the debugging things you can think...

I'd like to point out that the error reported by @sameet here is `[Error - 17:48:58.411] Error resolving authority`, which is _not_ different from what I mention in the original issue...

Looks like every VSCode fork (including Cursor and Windsurf) that can't use Microsoft's Remote SSH extension relies on that one random person's (no offense) open source library in a classic...

I'm trying to add early stopping to the XGBoost learner in my benchmark based on this chapter, and I'm not sure whether I just misunderstand a few things or maybe...

Great, thanks! Is there something I can do to keep the `evaluation_log` around though? Considering a previous experiment without `AutoTuner`:

```r
library(mlr3)
library(mlr3pipelines)
library(mlr3proba)
library(mlr3extralearners)
task = tsk("lung")
xgb_base...
```

> you are accessing the final model fit but in the final model fit there is no early stopping. Ah right, of course, makes sense 😅 I don't think I...
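For reference, outside of mlr3 the plain xgboost R API does keep the per-iteration metrics on the fitted model when early stopping is used. This is only a minimal sketch with toy data and arbitrary parameters, not the mlr3 setup from the thread:

```r
library(xgboost)

# Toy data shipped with the xgboost package
data(agaricus.train, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

# Early stopping needs a watchlist to monitor a metric on
bst <- xgb.train(
  params = list(objective = "binary:logistic", eta = 0.3),
  data = dtrain,
  nrounds = 100,
  watchlist = list(train = dtrain),
  early_stopping_rounds = 5,
  verbose = 0
)

# The per-iteration metrics survive on the fitted model object,
# along with the iteration early stopping settled on
bst$evaluation_log
bst$best_iteration
```

The open question in the thread is essentially whether `AutoTuner` exposes an equivalent of `evaluation_log` from the final model fit.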

I have not done any stacking yet, but the book [has a section](https://mlr3book.mlr-org.com/chapters/chapter8/non-sequential_pipelines_and_tuning.html#sec-pipelines-stack) on that so I hope that helps?
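In case a concrete starting point is useful, the chapter linked above builds on the `ppl("stacking", ...)` graph from mlr3pipelines. A rough sketch, with arbitrary placeholder learners and task, might look like this:

```r
library(mlr3)
library(mlr3pipelines)
library(mlr3learners)

task <- tsk("sonar")

# Base learners are cross-validated internally; their out-of-fold
# predictions become features for the super learner
base_learners <- list(lrn("classif.rpart"), lrn("classif.kknn"))
super_learner <- lrn("classif.log_reg")

graph <- ppl("stacking",
  base_learners = base_learners,
  super_learner = super_learner
)

# Wrap the graph as a regular learner and evaluate it
stacked <- as_learner(graph)
rr <- resample(task, stacked, rsmp("cv", folds = 3))
rr$aggregate(msr("classif.ce"))
```

The `ppl("stacking")` helper also takes arguments such as `use_features` and `folds` for controlling how the level-0 predictions are produced; see its documentation for details.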