Alexander März
You might want to start looking into https://github.com/stanfordmlgroup/ngboost I am not sure to what extent we can borrow from it. Initially, I didn't want to change the implementation of catboost...
Thanks for your comment. Currently, I am still working on getting the package ready. Once I have solved the problem of estimating all distributional parameters simultaneously, I will release...
Dear @mengyaoji, thanks for the interest in the project. What you describe is indeed odd behaviour. The hyper-parameters themselves don't look unusual. It is difficult to say...
@mengyaoji A data snippet would be great. We can also use it as an example of how to use the distribution. You don't need to provide the full dataset with...
@mengyaoji You may also want to try the normalization `distribution.stabilize = "MAD"`. Since XGBoostLSS updates the parameter estimates using gradients and Hessians, it is important that these are comparable...
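For illustration, here is a minimal sketch of what MAD-based stabilization does to the gradients; the function name and the exact formula are assumptions for illustration only, not XGBoostLSS's internal implementation.

```python
import numpy as np

def stabilize_mad(values: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Rescale gradients (or Hessians) by their median absolute deviation.

    Illustrative sketch only: XGBoostLSS exposes stabilization via
    `distribution.stabilize = "MAD"`; the internal formula may differ.
    """
    centre = np.nanmedian(values)
    mad = np.nanmedian(np.abs(values - centre))
    return values / np.maximum(mad, eps)

# Gradients of two distributional parameters can live on very different scales,
# e.g. a location parameter vs. a scale parameter.
grad_loc = np.random.normal(0.0, 100.0, size=1000)
grad_scale = np.random.normal(0.0, 0.01, size=1000)

# After stabilization both are on a comparable scale, so neither parameter
# dominates the boosting updates.
grad_loc_stab = stabilize_mad(grad_loc)
grad_scale_stab = stabilize_mad(grad_scale)
```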
@mengyaoji Kindly asking if there is any update on this you can share?
@yuezhihan Thanks for the clarification. I do have some follow-up questions on this. - Looking at the [forecasting code](https://github.com/yuezhihan/ts2vec/blob/main/tasks/forecasting.py#L35), do you use the entire dataset (train+valid+test) to create the representations?...
@yuezhihan Let me try to illustrate the problem. I am using the Electricity consumption dataset as an example, where the aim is to forecast a univariate time series 24 steps ahead....
@yuezhihan Thanks for the detailed answer. Things are getting clearer now. If I understand you correctly, based on the rolling evaluation, we use

```python
features[0:1, 320] to forecast labels[1, 24]
features[0:2, 320] to ...
```
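To make that pairing concrete, here is a small sketch of how I read the rolling evaluation; the shapes (320-dim representations, 24-step horizon) come from the indices above, while the ridge read-out and all other names are my assumptions about the linked forecasting code, not verified against it.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Assumed shapes for illustration: T timesteps, 320-dim representations,
# a 24-step forecast horizon.
T, repr_dim, horizon = 1000, 320, 24
features = np.random.randn(T, repr_dim)   # per-timestep representation
series = np.random.randn(T + horizon)     # univariate target series

# Rolling pairing: the representation at timestep t is used to forecast
# the next `horizon` values of the series.
X = features                                                        # (T, 320)
y = np.stack([series[t + 1 : t + 1 + horizon] for t in range(T)])   # (T, 24)

# A simple linear read-out maps representations to the 24-step-ahead targets
# (my understanding of what the linked forecasting task does).
model = Ridge(alpha=0.1).fit(X[:800], y[:800])   # fit on the earlier part
preds = model.predict(X[800:])                   # forecast the later part
print(preds.shape)                               # (200, 24)
```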
@yuezhihan Many thanks for your great and detailed explanations, very much appreciated!! I now have a much better understanding. If I may raise another question: based on your code example...