Fabio Sigrist

42 comments by Fabio Sigrist

The issue with OpenMP support is now also solved. Starting with version 0.7.8.4, the pre-compiled macOS arm64 wheel on PyPI (installation with `pip install -U gpboost`) should work as expected.

@joselznom: Thanks for your feedback. Concerning the Mac: are you compiling from source, or is the pre-compiled pip wheel used? Have you uninstalled and reinstalled brew after you migrated to an...

@joselznom: glad to hear that it works on Linux. I keep wondering why it does not work on your Mac with an arm64 processor. Initially, I also had a similar...

I reset my arm64 Mac to factory settings to double-check whether it works. After migrating from an old non-arm64 Mac, I did the following steps so that installation with...

Thank you for your feedback and suggestion. I have to admit that it is currently unclear to me how one can use sample weights for Gaussian process and random effects...

Yes, you are right. But this is not possible for Gaussian processes / random effects, as there is not one loss per sample but only one "global" loss for all...
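To make this concrete, here is a sketch of the Gaussian case (illustrative notation, following the usual mixed-effects / GP setup with fixed effects $F(X)$, random effects $b$, and design matrix $Z$): the negative log-likelihood is a single function of the full response vector,

$$
L(y) = \frac{1}{2}\,(y - F(X))^\top \Psi^{-1} (y - F(X)) + \frac{1}{2}\log\det(\Psi) + \text{const}, \qquad \Psi = Z \Sigma_b Z^\top + \sigma^2 I.
$$

Because $\Psi^{-1}$ couples all observations, this loss does not decompose into a per-sample sum the way standard independent-observation boosting losses do.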

Thank you for the hint. No, this is not a meaningful option. In my opinion, the option proposed in the issue you mention (allowing the user to provide weights to...

Yes, this seems like a reasonable approach for Gaussian data. That's the same approach I also mentioned in [this comment](https://github.com/fabsig/GPBoost/issues/12#issuecomment-782266684):

> for Gaussian data, one might weight the error variances...
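One possible way to formalize weighting the error variances (a sketch of how such weights could enter the model, not an implemented feature): give observation $i$ an error variance scaled by its weight $w_i$,

$$
\varepsilon_i \sim N\!\left(0, \sigma^2 / w_i\right), \qquad \Psi_w = Z \Sigma_b Z^\top + \sigma^2\,\mathrm{diag}\!\left(1/w_1, \dots, 1/w_n\right),
$$

so a larger $w_i$ means a smaller error variance and hence more influence of observation $i$ on the fit.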

Yes, using this terminology, it is about 'probability' == 'sampling' == 'scale up' == 'representation' weights. As far as I know, this is the predominant way weights are used in machine learning. You want...

The GPBoost / LaGaBoost algorithm is not yet implemented for categorical data with more than two categories. You might consider a "one-against-all" approach where you create K - 1...