ja-thomas

Results: 33 issues by ja-thomas

- Definition of the parameter space (ranges)
- How large to set the validation set?
- How large should early_stopping_limit be?

Now merging into the correct branch. This might fix the CI problems.

```
task = convertOMLTaskToMlr(getOMLTask(2073))
Error in instantiateResampleInstance.CVDesc(desc, length(ci), task) :
  Cannot use more folds (10) than size (5)!
```

The mlr task:

```
Browse[2]> mlr.task
Supervised task: yeast
Type: classif...
```
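The error comes from the basic k-fold constraint: k-fold cross-validation needs at least k observations. A minimal base-R sketch of that check (illustrative names only, not mlr's actual `instantiateResampleInstance` internals):

```r
# Sketch of the k-fold constraint that triggers the error above.
# (Hypothetical helper; mlr performs an equivalent check internally.)
make_cv_folds <- function(n, k) {
  if (k > n) {
    stop(sprintf("Cannot use more folds (%d) than size (%d)!", k, n))
  }
  # assign each of the n row indices to one of k folds
  split(sample(seq_len(n)), rep_len(seq_len(k), n))
}

folds <- make_cv_folds(5, 3)   # works: 3 folds over 5 rows
# make_cv_folds(5, 10)         # reproduces the error message
```

For the yeast task above, either the CV description would need fewer folds or the task more rows.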

(Crosspost from https://github.com/mlr-org/mlrCPO/issues/49.) CPOs don't work with runTaskMlr (see the link for an example). Not sure whether this has to be solved in mlrCPO or in OpenML.

In mboostLSS_fit(), all offset values of the distribution parameters (mu, sigma, ...) are written into each distribution parameter model; e.g., the fit object of the mu parameter has a variable...

```
> cb3 = cb$clone(deep = TRUE)
Error in envRefInferField(x, what, getClass(class(x)), selfEnv) :
  '.__enclos_env__' is not a valid field or method name for reference class "Rcpp_BlearnerFactoryList"
```

This should either be caught or (preferably) handled correctly as a pure intercept model.

I think this is another reason to use a data abstraction layer. But we want to keep the ability to do random subsampling in every iteration.

In some cases, e.g. splines, we actually need to create new data, but for simple base learners, e.g. linear ones, a pointer to the data column would be enough.
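The distinction can be sketched in R using environments as the "pointer" mechanism. All names here are hypothetical, not compboost's actual API; the spline case uses a polynomial basis as a stand-in for a real B-spline basis:

```r
# Hypothetical data-abstraction sketch: a linear base learner can reference
# the raw column, while a spline base learner must materialize new data.
data_store <- new.env()
data_store$x <- runif(100)

linear_design <- function(store) {
  # no new data created: the design matrix is built from the stored column
  # on demand, so the store acts like a shared pointer
  function() cbind(intercept = 1, x = store$x)
}

spline_design <- function(store, df = 5) {
  # new data actually created once: a basis matrix derived from the column
  # (polynomial basis here as a simple stand-in for B-splines)
  basis <- outer(store$x, seq_len(df), `^`)
  function() basis
}

get_linear <- linear_design(data_store)
get_spline <- spline_design(data_store)
dim(get_linear())  # 100 x 2, rebuilt from the shared column
dim(get_spline())  # 100 x 5, stored as its own matrix
```

The memory trade-off is the point: the linear learner adds essentially nothing per learner, while each spline learner holds an n-by-df matrix of its own.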

later

There is the package [DiceDesign](https://cran.r-project.org/web/packages/DiceDesign/index.html), which contains a large number of possible ways to create initial designs. It may be worth a look.
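For comparison, the kind of space-filling initial design DiceDesign provides can be approximated in base R; a minimal Latin hypercube sketch (DiceDesign offers far more sophisticated constructions such as maximin and low-discrepancy designs):

```r
# Minimal Latin hypercube sample on [0, 1]^d in base R.
# Each column places exactly one point in each of the n equal-width strata.
lhs_design <- function(n, d) {
  sapply(seq_len(d), function(j) (sample(n) - runif(n)) / n)
}

set.seed(1)
design <- lhs_design(10, 2)  # 10 points in [0, 1]^2
# every stratum [k/10, (k+1)/10) contains exactly one point per column
table(floor(design[, 1] * 10))
```

Scaling each column to the corresponding parameter range then yields an initial design for the parameter space defined above.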
