Jakob Richter
It depends on the configuration of the Slurm cluster. Most likely `/tmp` will seldom be accessible from different nodes. But the same can also be true for directories in the home...
I think this is out of scope for `batchtools`, because which approach is best is a potentially subjective choice. You could just handle it yourself by writing a...
Related discussion: https://github.com/mllg/batchtools/issues/222
There has to be a commonly shared directory. All results (the return value of your algorithm function) will be saved there, so when everything is finished the results are already in...
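A minimal local sketch of that workflow (assuming `batchtools` is installed; on a real cluster you would point `file.dir` at the commonly shared directory instead of a temporary one, and jobs would run via the configured Slurm cluster functions rather than interactively):

```r
library(batchtools)

# the registry's file.dir must live on a directory all nodes can read/write;
# a temporary directory is used here only to keep the example self-contained
reg = makeRegistry(file.dir = tempfile("registry"), seed = 1)

# each job's return value is written into the registry's file.dir
batchMap(function(x) x^2, x = 1:3, reg = reg)
submitJobs(reg = reg)
waitForJobs(reg = reg)

# once all jobs are done, the results are already on disk and can be collected
res = reduceResultsList(reg = reg)
```

Run interactively this executes the jobs in the current session; on a cluster the collection step is the same, since `reduceResultsList()` only reads the result files from `file.dir`.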
This error still persists:

``` r
task = wpbc.task
lrn = makeLearner("surv.rpart")
rin = makeResampleDesc("Holdout")
mm = setAggregation(cindex, test.join)
r = resample(lrn, task, rin, measures = mm)
#[Resample] holdout iter:...
```
And:

``` r
task = yeast.task
lrn = makeLearner("multilabel.cforest")
rin = makeResampleDesc("Holdout")
mm = setAggregation(multilabel.acc, test.join)
r = resample(lrn, task, rin, measures = mm)
#[Resample] holdout iter: 1
#Error in...
```
In other words: We should not write custom code for every task class in `test.join` but find a generalized solution.
It is implemented here: https://github.com/mlr-org/mlrMBO/pull/397
You can pass your own design with `mlr:::makeTuneControlMBO(..., mbo.design = generateDesign(n = 4*length(ps$pars), fun = randomLHS))`. Is that sufficient?
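For context, a fuller sketch of that call (assuming `mlr`, `ParamHelpers`, and the `lhs` package for `randomLHS`; the parameter set `ps` is purely illustrative, and `mbo.design` is the argument added in the PR linked above):

```r
library(mlr)
library(ParamHelpers)
library(lhs)

# illustrative parameter set for tuning rpart
ps = makeParamSet(
  makeNumericParam("cp", lower = 0.001, upper = 0.1),
  makeIntegerParam("minsplit", lower = 2, upper = 20)
)

# custom initial design: 4 points per parameter, drawn by Latin hypercube sampling
des = generateDesign(n = 4 * length(ps$pars), par.set = ps, fun = lhs::randomLHS)

# hand the design to the MBO tune control instead of letting it generate one
ctrl = makeTuneControlMBO(mbo.design = des)
```

The design is an ordinary `data.frame` with one column per parameter, so you could also construct it by hand instead of via `generateDesign()`.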
I'm all for better documentation. But in this case I don't favor more function arguments that kind of "rival" existing arguments (i.e. `mbo.design`) and thus would clutter the...