datafusion-comet
Write documentation for working with Comet's spark-4.0 profile in IntelliJ
What is the problem the feature request solves?
When switching from the default profile to the spark-4.0 profile (and the jdk-17 profile), I ran into various issues with building and running tests.
I plan on writing up some documentation on how to do this. Here are some brief notes.
- Close IntelliJ project
- Delete .idea folder
- make release PROFILES="-Pspark-4.0 -Pscala-2.13"
- open folder as project in IntelliJ
- select spark-4.0 and jdk-17 profiles (be sure to deselect the other profiles if needed)
- re-sync / re-import maven project
- build / rebuild
- run tests, adding the following JVM arguments (these are available in the pom.xml); a command-line equivalent is sketched after this list
-XX:+IgnoreUnrecognizedVMOptions
--add-opens=java.base/java.lang=ALL-UNNAMED
--add-opens=java.base/java.lang.invoke=ALL-UNNAMED
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED
--add-opens=java.base/java.io=ALL-UNNAMED
--add-opens=java.base/java.net=ALL-UNNAMED
--add-opens=java.base/java.nio=ALL-UNNAMED
--add-opens=java.base/java.util=ALL-UNNAMED
--add-opens=java.base/java.util.concurrent=ALL-UNNAMED
--add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED
--add-opens=java.base/jdk.internal.ref=ALL-UNNAMED
--add-opens=java.base/sun.nio.ch=ALL-UNNAMED
--add-opens=java.base/sun.nio.cs=ALL-UNNAMED
--add-opens=java.base/sun.security.action=ALL-UNNAMED
--add-opens=java.base/sun.util.calendar=ALL-UNNAMED
-Djdk.reflect.useDirectMethodHandle=false
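For comparison, here is a minimal command-line sketch of the same flow. It is an illustration under assumptions rather than part of the notes above: the wildcardSuites property (scalatest-maven-plugin) and the CometCastSuite name are examples only, and whether the jdk-17 profile must also be passed on the command line is not confirmed here.

# remove stale IntelliJ metadata before re-importing the project
rm -rf .idea
# build with the spark-4.0 / scala-2.13 profiles
make release PROFILES="-Pspark-4.0 -Pscala-2.13"
# run a single suite via Maven; when tests are run this way, the --add-opens
# arguments listed above are typically picked up from the argLine in pom.xml
mvn test -Pspark-4.0 -Pscala-2.13 -DwildcardSuites=org.apache.comet.CometCastSuite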
Describe the potential solution
No response
Additional context
No response
- Close IntelliJ project
- Delete .idea folder
- make release PROFILES="-Pspark-4.0 -Pscala-2.13"
- open folder as project in IntelliJ
Why are these steps needed? Is it because the artifactId of the spark project is comet-spark-spark${spark.version.short}_${scala.binary.version}?
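A hedged illustration of how that parameterized artifactId resolves (an assumption on my part, not confirmed in this issue; the spark/target path is also assumed):

# with spark.version.short=4.0 and scala.binary.version=2.13, the spark
# module's artifactId would resolve to roughly comet-spark-spark4.0_2.13,
# so the jar name (and the module IntelliJ indexes) changes with the
# selected profiles; listing the build output should show it
ls spark/target/comet-spark-spark*_*.jar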