jackylee
By default, a Spark failure produces only an error message; no exception is thrown unless the user sets spark_statement_errors_are_fatal. The result of running SQL needs this behavior, too. For most users, the...
**Describe the bug** We met a case where there is `a
**Describe the bug** We hit a problem while using `ReuseExchangeAndSubquery`: it caches the plan under the key `exchange.canonicalized`, and the SparkPlan is re-initialized in the process of computing `canonicalized`. During this...
**Describe the bug** We hit this problem when data whose encoding is gbd has been written to Parquet and we want to read it back. For vanilla Spark,...
**Describe the bug** We hit a "job aborted" error with WSCG (whole-stage code generation): ``` java.lang.RuntimeException: split_part is currently not supported in WSCG. Caused by: java.lang.RuntimeException: split_part is currently not supported...
**Is your feature request related to a problem or challenge? Please describe what you are trying to do.** Every time we read an Arrow record batch from Parquet, we will use...
**Describe the bug** When we start Spark with a user-specified Spark extension and Gazelle together, it throws "Spark extensions are already specified before enabling Gazelle plugin: ". **To Reproduce** Start spark...
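The conflict above can presumably be reproduced by launching Spark with both a user extension and the Gazelle plugin configured. A minimal sketch, with loud assumptions: `my.custom.SparkExtension` is a placeholder for any user extension, and the Gazelle plugin class name below is an assumption about the plugin's setup rather than something confirmed by this report:

```shell
# Assumed reproduction: configuring spark.sql.extensions explicitly while
# also enabling the Gazelle plugin (which injects its own extensions) is
# believed to trigger the "Spark extensions are already specified" error.
# "my.custom.SparkExtension" is a hypothetical placeholder class.
spark-shell \
  --conf spark.sql.extensions=my.custom.SparkExtension \
  --conf spark.plugins=com.intel.oap.GazellePlugin
```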
**Is your feature request related to a problem or challenge? Please describe what you are trying to do.** We are using `setSafe` to write partitionValues into Vectors when calling `buildReaderWithPartitionValues`....
**Is your feature request related to a problem or challenge? Please describe what you are trying to do.** I have tried to use JDK 11 to compile and run the project. However...
**Describe the bug** We hit a core dump when running a SQL query with hash aggregation. In this query, the aggregation key is constant and one of the selected columns has...