jsleight
I'd guess you're getting the wrong version of Apache Commons somehow. You should have `commons-lang3` v3.7
This will depend on which xgboost runtime you are using. We have two xgboost runtimes:

- [default] dmlc xgboost, which is C++ behind the scenes. DMLC xgboost does have threading...
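As a rough sketch of what controlling that C++ threading looks like: dmlc xgboost exposes an `nthread` parameter in its params map. The objective and the thread counts below are illustrative values, not taken from this thread.

```python
# Sketch: the default dmlc runtime is C++ under the hood, and its thread
# pool is controlled by the `nthread` parameter in the params map.
# Values here are illustrative assumptions.
params = {
    "objective": "binary:logistic",
    "nthread": 1,  # pin the C++ runtime to one thread for predictable latency
}

# A multi-threaded configuration just raises the same knob:
params_mt = dict(params, nthread=8)
```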
I definitely would not expect `predict(50_rows) < predict(1_rows)`. `predict(50_rows) / 50 < predict(1_rows)` would obviously make sense. The only ideas I have are some weirdness in the benchmarking setup, like cache...
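To make the expectation concrete, here is a toy cost model (a hypothetical stand-in, not the real runtime): each predict call pays a fixed setup overhead plus a small per-row cost, so the amortized per-row time shrinks with batch size, but a 50-row call should still never be cheaper in absolute terms than a 1-row call. The overhead and per-row numbers are assumptions.

```python
# Hypothetical per-call cost model (microseconds); values are assumptions.
FIXED_OVERHEAD_US = 100   # per-call setup cost
PER_ROW_US = 2            # marginal cost per scored row

def predict_cost_us(n_rows):
    """Total cost of one predict call over n_rows rows."""
    return FIXED_OVERHEAD_US + PER_ROW_US * n_rows

one_call_50 = predict_cost_us(50)        # one 50-row call
fifty_calls_1 = 50 * predict_cost_us(1)  # fifty 1-row calls, same rows
per_row_batched = one_call_50 / 50       # amortized cost per row
```

Under this model `predict(50) > predict(1)` always holds, while `predict(50) / 50 < predict(1)` holds whenever there is any fixed overhead, which is why the reported absolute ordering looks like a benchmarking artifact.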
hmmm, I'm not too familiar with the `sparkxgb` package. Is there a reason you aren't using [the built-in pyspark bindings for xgboost](https://github.com/dmlc/xgboost/blob/master/python-package/xgboost/spark/estimator.py) instead? They were just added in v1.7, so...
hmmm @WeichenXu123 any thoughts on this since I know you added the pyspark bindings in xgboost.
In case anyone else sees this issue and is nervous: we've connected via other channels. We'll update here as issues get resolved and fixes become available.
I think you need to alter the Travis YAML to make Scala 2.13 available: https://github.com/combust/mleap/blob/master/.travis.yml#L25
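A minimal sketch of what that change might look like. The env key shape and the existing version entry are assumptions; check the linked line of `.travis.yml` for the actual structure.

```yaml
# .travis.yml (sketch): add a Scala 2.13 entry alongside the existing ones.
# Key names and the 2.12 patch version are illustrative assumptions.
env:
  - SCALA_VERSION=2.12.13   # existing entry (illustrative)
  - SCALA_VERSION=2.13.11   # new: make Scala 2.13 available to the build
```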
@db-scnakandala any rough timeline on your end? I'm happy to merge and release artifacts soon after the PR is complete
> Can we make sure that the spark version we are using for scala 2.13.11 is compatible or not? Because the jdk version is at openjdk11 and the scala...
> @db-scnakandala Also, I was also facing this issue locally where the following were not found. > https://s3-us-west-2.amazonaws.com/xgboost-maven-repo/snapshot/ml/dmlc/xgboost4j_2.12/2.0.0-SNAPSHOT/xgboost4j_${scala.binary.version}-2.0.0-20230817.090749-594.jar: not found: https://s3-us-west-2.amazonaws.com/xgboost-maven-repo/snapshot/ml/dmlc/xgboost4j_2.12/2.0.0-SNAPSHOT/xgboost4j_${scala.binary.version}-2.0.0-20230817.090749-594.jar > > Is there a way to resolve this...