Ability to build for all Scala versions
Allows building for all Spark/Scala version combinations. Introduces new system properties to control the behavior:
- `allScalaVersions=true` lets Spark build for all Scala versions
- the previous property `scalaVersion` is now a version list via the `scalaVersions` property, provides backwards compatibility
- the previous property `defaultScalaVersion` is now a version list via the `defaultScalaVersions` property, provides backwards compatibility
The defaults, via gradle.properties, do not change.
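For illustration, a hedged sketch of how these system properties might be passed on the command line (the exact invocation and the Scala version values shown are assumptions, not taken from this PR):

```shell
# Build every supported Spark/Scala version combination
# (assumes the allScalaVersions flag introduced by this PR)
./gradlew build -DallScalaVersions=true

# Or restrict the build to an explicit list of Scala versions
# (the comma-separated format and the versions are hypothetical)
./gradlew build -DscalaVersions=2.12,2.13
```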
Also...
- Unify processing of the `xyzVersions` properties everywhere
- Fix compilation bug in Flink 1.19 JMH code (it was not built before)
- Update release jobs/script
- Remove "hack" for `:iceberg-bom`
- Simplify and fix related usages of the build system properties
> `allScalaVersions=true` lets Spark build for all Scala versions

Forgive my ignorance. Does it affect the artifacts we build? Does it affect how we build them? Does it affect how we test them?
There is no change in how things are built or tested.
@RussellSpitzer you have time to review?
This pull request has been marked as stale due to 30 days of inactivity. It will be closed in 1 week if no further activity occurs. If you think that’s incorrect or this pull request requires a review, please simply write any comment. If closed, you can revive the PR at any time and @mention a reviewer or discuss it on the [email protected] list. Thank you for your contributions.
This pull request has been closed due to lack of activity. This is not a judgement on the merit of the PR in any way. It is just a way of keeping the PR queue manageable. If you think that is incorrect, or the pull request requires review, you can revive the PR at any time.