Benjamin Bossan
> quite different validation results between pytorch and skorch. Yes, I need to investigate further, or perhaps someone else can spot a mistake. > Does skorch do some weight initialization automatically...
Regarding said callback: We could also have a more general function signature that takes the callback and the other arguments as input. This would allow things like this: ```python #...
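Since the original snippet is truncated, here is a minimal sketch of what such a more general signature could look like: a function that receives the callback itself plus arbitrary extra arguments and dispatches to it. All names here (`invoke_callback`, the keyword arguments) are hypothetical, not part of skorch's API.

```python
# Hypothetical sketch: instead of a fixed parameter list, take the
# callback and forward whatever other arguments are relevant.
def invoke_callback(callback, net=None, **extra):
    # Unknown keyword arguments are simply passed through to the
    # callback, so new arguments don't require a signature change.
    return callback(net=net, **extra)

# Usage: any callable with a compatible signature works.
result = invoke_callback(lambda net=None, **kw: kw.get("epoch"), epoch=3)
```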
In the abstract, it's quite hard to say what the reason could be, so if you have a minimal code example to reproduce the behavior, that would be great. In general,...
What would you use it for? I only see `CVSplit` at the moment.
No, I don't think that it should be skorch's job to set seeds for torch and numpy. I could see a helper function that does it, but otherwise I would...
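Such a helper function, outside of skorch itself, might look like the sketch below. The name `set_seeds` is hypothetical; it seeds Python's, NumPy's, and (when installed) torch's global RNGs, which is the usual way to make runs reproducible.

```python
import random

import numpy as np


def set_seeds(seed):
    """Hypothetical helper: seed all relevant global RNGs at once.

    Not part of skorch; torch is seeded only if it is importable.
    """
    random.seed(seed)
    np.random.seed(seed)
    try:
        import torch
        torch.manual_seed(seed)
    except ImportError:
        pass
```

With this in place, calling `set_seeds(0)` before constructing the net makes successive runs produce the same random draws.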
> Scikit-learn classifiers that have random state allow for a `random_state` keyword in `__init__` We could do this. At the moment, I only see `CVSplit` as a potential target for...
Hmm, I think it's a little late for renaming the callbacks themselves. And even if we rename the argument, I would still keep it so that lower is better by...
I think this is from the official pytorch documentation, maybe it's okay to leave it as is?
I don't see the feature as strictly necessary but would be okay with it being added. @ottonemo @thomasjpfan any opinions on that? If this change is made, I would suggest...
> What would be the use case for preventing casting? I can imagine that some people would like to have complete control over it, say, running part of the model...