Mads Dabros
Will do once published. You are more than welcome to help! :)
Hi @MiWeiss Yes, it is more or less ready. For now, it has the same functionality as the old XGBoost lib, but with code cleanup, targeting .NET 8, and native packages for...
Hi @MiWeiss and @madskonradsen I have released the renewed version of the old XGBoost.Net wrapper. The new name is XGBoostSharp (following the trend of TorchSharp). The first release is on...
Hi @SULVAL The implementation in SharpLearning is not intended for running in "full batch mode" as such. All the different optimizers are based on stochastic gradient descent, so is...
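To illustrate the distinction the comment above draws, here is a minimal, generic sketch contrasting full-batch gradient descent (one update per pass over the whole dataset) with mini-batch stochastic gradient descent (many updates per pass, each on a random subset). This is not SharpLearning's API; it is a hedged, library-agnostic example using least-squares linear regression, with all names and parameters invented for illustration.

```python
import numpy as np

# Synthetic regression problem (all values here are illustrative, not from SharpLearning).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.01, size=200)

def full_batch_gd(X, y, lr=0.1, epochs=200):
    """One gradient step per epoch, computed over ALL samples at once."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient over the full dataset
        w -= lr * grad
    return w

def sgd(X, y, lr=0.05, epochs=200, batch_size=16, seed=1):
    """Many gradient steps per epoch, each on a shuffled mini-batch."""
    batch_rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        idx = batch_rng.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)  # gradient on one mini-batch
            w -= lr * grad
    return w

w_gd = full_batch_gd(X, y)
w_sgd = sgd(X, y)
```

Both variants converge to roughly the same weights here, but an SGD-style optimizer makes its progress from per-batch updates, which is why a library built around it does not naturally run in a single full-batch mode.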