Jakob Richter
What is subsetting a ParamSet? 1) Removing a Param A. 2) Setting a Param A to a fixed value to obtain something like a "hyperplane". Version 1) can break 1)...
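A minimal sketch of the two notions, assuming paradox's `ps()` sugar, the `$values` field, and an in-place `$subset()` method (the exact `$subset()` behaviour is an assumption here, and the Params are made up for illustration):

```r
library(paradox)

# Small ParamSet with a dependency: 'sigma' is only active when kernel == "rbf".
param_set <- ps(
  kernel = p_fct(c("rbf", "linear")),
  sigma  = p_dbl(0, 10, depends = kernel == "rbf")
)

# Version 2): fix 'kernel' to a constant value -- all Params stay in the set,
# we just restrict attention to the "hyperplane" kernel == "rbf".
param_set$values$kernel <- "rbf"

# Version 1): remove Params from the set entirely (assumed in-place $subset()).
param_set$subset("kernel")
# Doing it the other way around (keeping 'sigma' but removing 'kernel', which
# 'sigma' depends on) is the case where version 1) can break dependencies.
```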
Hmm. I wanted to find the discussion we already had with @berndbischl about this topic. Maybe someone can link it if they find it. We also came to the conclusion...
Proposal: tag the Param with "sequential"
Then the Param would have the tags "train", "predict" and "predict_sequential", right?
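A hedged sketch of how that could look with paradox's `tags` argument and `$get_values(tags = ...)` (the Params and values here are made up for illustration):

```r
library(paradox)

# Hypothetical Params: 'epochs' matters for training, 'horizon' only for
# (sequential) prediction.
param_set <- ps(
  epochs  = p_int(1, 100, tags = "train"),
  horizon = p_int(1, 10,  tags = c("predict", "predict_sequential"))
)

param_set$values <- list(epochs = 50, horizon = 3)

# Pull out only the values relevant for a sequential predict step.
param_set$get_values(tags = "predict_sequential")
#> $horizon
#> [1] 3
```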
For completeness I will repeat what I said [here](https://github.com/mlr-org/paradox/pull/260). The defaults are just documentation (this was kind of our understanding IMHO):
- We cannot ensure that they are correct...
Thanks. That's a nice start. If you want, you can close this issue.
I think it is perfectly fine that the code runs like that. A tuner is told to run on an instance and we specifically allow it(?) to not be empty...
Would it be better if `reevaluate` was a property of the TuningInstance instead of an argument that has to be passed all the time? It could be an active binding...
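A rough sketch of what that could look like, using an R6 active binding (class and field names are made up here, not the actual mlr3tuning API):

```r
library(R6)

TuningInstanceSketch <- R6Class("TuningInstanceSketch",
  private = list(
    .reevaluate = FALSE
  ),
  active = list(
    # Active binding: reading returns the flag, assignment validates and stores it,
    # so callers set it once on the instance instead of passing it to every call.
    reevaluate = function(value) {
      if (missing(value)) {
        private$.reevaluate
      } else {
        stopifnot(is.logical(value), length(value) == 1L, !is.na(value))
        private$.reevaluate <- value
      }
    }
  )
)

inst <- TuningInstanceSketch$new()
inst$reevaluate          # FALSE
inst$reevaluate <- TRUE  # flips the behaviour for all subsequent calls
```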
Thanks for spotting that bug. I have to fix it in mlr, but it is just an output bug. Measures that are to be maximized are multiplied by -1 because...
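A toy illustration of the sign convention (this is not mlr's actual code, just the general idea):

```r
# An optimizer that only minimizes can handle a measure that should be maximized
# by multiplying it with -1 internally; the sign has to be flipped back before
# results are shown, otherwise the output looks wrong even though the tuning is fine.
measure_value  <- 0.87                 # e.g. accuracy, to be maximized
internal_value <- -1 * measure_value   # what the minimizing optimizer works on
reported_value <- -1 * internal_value  # flip back before printing
reported_value
#> [1] 0.87
```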
The webserver needs to be better before that happens.