[TOREE-556] Support Scala 2.13
Co-authored-by: Neil Skinner [email protected]
Co-authored-by: Cheng Pan [email protected]
This PR is based on #199, and makes the project compatible with both Scala 2.12 and 2.13.
Usage:
```shell
sbt ++2.12 clean test
sbt ++2.13 clean test
```
or
```shell
SCALA_VERSION=2.12 sbt clean test
SCALA_VERSION=2.13 sbt clean test
```
or
```shell
make SCALA_VERSION=2.12 dev
make SCALA_VERSION=2.13 dev
```
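The `++2.12`/`++2.13` switches above rely on sbt's standard cross-building support. As a rough illustration of how an `SCALA_VERSION` environment variable can be wired into it, here is a minimal `build.sbt` sketch; the concrete patch versions and the env-var handling are illustrative assumptions, not Toree's actual build definition:

```scala
// build.sbt -- minimal cross-build sketch (illustrative, not Toree's real build file).
// Full patch versions are assumptions; pick whatever the project actually pins.
val scala212 = "2.12.17"
val scala213 = "2.13.8"

// Allow SCALA_VERSION=2.12 / SCALA_VERSION=2.13 to pick the default,
// falling back to Scala 2.12 when the variable is unset.
val defaultScala = sys.env.get("SCALA_VERSION") match {
  case Some("2.13") => scala213
  case _            => scala212
}

ThisBuild / scalaVersion       := defaultScala
// `sbt ++2.12 ...` / `sbt ++2.13 ...` switch between these entries.
ThisBuild / crossScalaVersions := Seq(scala212, scala213)
```

With this in place, `sbt +test` runs the test suite once per entry in `crossScalaVersions`, which is how a CI matrix typically covers both Scala lines.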
CI is updated to cover both Scala 2.12 and 2.13 tests. I also tested some basic functionality locally with the Docker environment.
@pan3793 what is the status of this pr?
@lresende I remember the Coursier log API changed significantly, and we need to adapt to the new API to recover the log display. Let me find time in the coming weeks to fix it and finish the Scala 2.13 support.
Thank you for the updates @pan3793. I did not do much testing, but I updated my env to Spark 3.4.3 + Scala 2.13 and at least it connects and establishes a Spark session... will try to spend more time on validation during the week.
@lresende this PR is ready to review now.
also cc @requaos
kindly ping @lresende ~
@pan3793, @lresende , @requaos do we feel comfortable to merge the PR to support 2.13?
hello all, can this be merged? if there is additional work, is there anything I can help w/?
hello all, can this be merged? what's pending on this? is there anything I can help w/?
@lresende, since there are many users requesting Scala 2.13 support, I wonder if we can move this forward.
Currently, the PR makes Toree work with both Scala 2.12 and Scala 2.13, but leaves Scala 2.13 as the default. On second thought, I think we should use Scala 2.12 as the default, since that won't introduce any breaking changes. Given Spark 3.4 and prior are EOL, we should also move to Spark 3.5. I see you are preparing the 0.6.0 release. Maybe after that, we can move to Spark 4.x and Scala 2.13. WDYT?
I agree, let me try to push a release candidate this weekend and we merge after the release?
@lresende, if I change the default Scala version to 2.12, it's fine to include this patch in 0.6. But it's also fine to defer it to 0.7 if you don't want to take risks.
It would be super helpful if we can have this in the latest release as well.
My suggestion is to have a last release with 2.12 which supports old Spark releases, and in the future, if needed, we can branch from it and provide patches, etc. Then we can focus on Scala 2.13 for future releases. Having said that, it does not mean we can't do one release after another, in a short period of time, mostly to have the support for both Scala versions.
if we do one release after another, when would Spark 4 and Scala 2.13 support be available? is there anything that I can help with to accelerate the process?
I have updated the PR to keep Scala 2.12 as the default, @lresende it's up to you to include this in the upcoming 0.6 or defer it to the next release.
@pan3793 I was going to test this locally and noticed the conflicts. I have a rebased version locally, but if you want to rebase to make sure there are no conflicts, please go ahead; otherwise, I can push the changes from my local copy.
@lresende I have resolved the conflicts
CI passed, let me merge this to allow other PRs to have both Scala 2.12 and 2.13 CI coverage.
Thank you for merging this. When would this be available on PyPI? Do we support Spark 4 currently?
@fangyh20 I am waiting for #229 and will try to make an RC
@fangyh20 Spark 4 is not supported yet (I guess we need to make it support Java 17+ first, then try Spark 4.0), PRs are welcome