[x] join: incorrect result in left/right outer join (BHJ, SHJ)
[x] compilation failed in ExistenceJoin due to the "not" operator in WSCG
[x] NullPointerException when getting metrics in WSCG
[x] incorrect result in semi join condition check
[x] Max for NaN [in Spark, NaN is greater than all non-NaN numeric values]
[x] precision loss of castVARCHAR in both Gandiva and WSCG
[x] some data types (BooleanType, DateType, StringType) may not be supported in Aggregate Actions
[x] ColumnarInMemoryTableScanExec does not support row data as input
[x] sum and avg should return null if all input values are null (currently zero) [SPARK-8828 sum should return null if all input values are null]
[x] Aggregate (sum, avg, ...) of a Literal should be fully supported. In this case, the input batch is empty and the native calculation should be skipped. For now, only count_literal is supported. [cte-legacy.sql]
[x] grouping for Literal is not supported
[x] Count with a condition is not supported; currently aggregateExpression.filter is ignored. [Support filter clause for aggregate function with hash aggregate]
[x] ColumnarConditionProjector: filter without project is not supported [SPARK-32788: non-partitioned table scan should not have partition filter]
[x] ColumnarLike: escapeChar should be supported. [SPARK-33677: LikeSimplification should be skipped if pattern contains any escapeChar]
[x] ColumnarLike: Failed to make LLVM module because the 'like' function requires a literal as its second parameter [like-all.sql]
[x] SMJ: segfault [SPARK-25988: self join with aliases on partitioned tables]
[x] SMJ: incorrect result in LeftAnti [scalar-subquery-select.sql]
[x] In WSCG project, some expressions (not, equal, ...) need to set "check_str_", otherwise compilation fails. [many cases in the SQL tests, e.g., except.sql]
[x] incorrect result in LeftAnti BHJ on null values in the non-codegen version, caused by null_set [NOT IN predicate subquery]
[x] incorrect result in LeftAnti BHJ on null values in the codegen version
[x] incorrect result in LeftSemi BHJ w/o codegen, caused by BooleanType not being handled correctly in HashRelationKernel [group-by.sql, ut: "groupby"]
[x] Window: segfault [null inputs]
[x] Window: KnownFloatingPointNormalized cannot be cast to org.apache.spark.sql.catalyst.expressions.AttributeReference [NaN and -0.0 in window partition keys]
[x] Concat: different result [string concat]
[x] decimalArithmeticOperations.sql: precision, handling for overflow and precision loss
[x] ColumnarSorter: key not found in output attributes due to an upper-/lower-case mismatch [order-by-nulls-ordering.sql]
[x] ConvertUtils, getAttrFromExpr: some other expressions should be handled [windowFrameCoercion.sql, postgreSQL/select_implicit.sql]
[ ] Aggregate group-by: a very small value (-1.2345678901234e-200) is treated as equal to 0.0 in the normalize function [union.sql]
[ ] an exception is expected when int4 or int8 overflows [postgreSQL/int4.sql, postgreSQL/int8.sql]
[x] NotImplemented: Function min_max has no kernel matching input types (array[date32[day]]) [subquery/scalar-subquery/scalar-subquery-predicate.sql]
[x] max for bool has incorrect result
[x] WSCG NullPointerException: s"ColumnarWSCG can't doCodeGen on ${child}". In ColumnarSortExec, doCodeGen: ColumnarCodegenContext = null [SPARK-34003: fix char/varchar fails w/ order by functions]
[x] incorrect result in columnar Expand [cube]
[x] nullOnDivideByZero should be supported in stddev
[x] incorrect result for groupby bool
[ ] different result for Timestamp, possibly due to timezone [in-order-by: different result for timestamp]
[x] Sort doCodegen is null, causing an NPE
[x] sort by literal
[ ] makeCopy, tree in ColumnarCollapseCodegenStages
[x] Failed to make LLVM module due to Return type of root node int64 does not match that of expression timestamp[us, tz=UTC]
[x] "Not a valid date value 2014-31-12": to_date should return null instead of throwing an exception [function to_date]
[ ] incorrect result in date and timestamp functions [DateFunctionsSuite]
[x] SMJ segfault caused by field-name casing
[x] SMJ: incorrect left/right outer join result
[x] runtime error in aggregate caused by field-name handling
[ ] "divide by zero" exception from Gandiva in decimal divide
[ ] window has incorrect result ["two inner joins with condition"]
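Several items above hinge on Spark SQL's numeric semantics rather than on this repo's code. For the "Max for NaN" item, Spark orders NaN above every non-NaN value, so a max over a column containing NaN must return NaN. A minimal Python sketch of that ordering (the `spark_max` helper is hypothetical, for illustration only):

```python
import math

def spark_max(values):
    # Spark SQL treats NaN as greater than every non-NaN numeric value,
    # so max() over values containing NaN must return NaN. Sorting by the
    # tuple (is_nan, value) puts NaN above all ordinary floats.
    return max(values, key=lambda v: (math.isnan(v), v))

print(spark_max([1.0, float("nan"), 2.0]))  # nan
print(spark_max([1.0, 2.0]))                # 2.0
```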
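The SPARK-8828 item (sum/avg over all-null input) expects null, not zero: nulls are skipped during accumulation, and an accumulator that never received a value stays null. A sketch of that expected behavior, assuming `None` models SQL null (`spark_sum` is a hypothetical helper, not code from this repo):

```python
def spark_sum(values):
    # SPARK-8828 semantics: null (None) inputs are skipped; if every
    # input is null, the result is null, NOT zero.
    acc = None
    for v in values:
        if v is not None:
            acc = v if acc is None else acc + v
    return acc

print(spark_sum([None, None]))  # None
print(spark_sum([1, None, 2]))  # 3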
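Similarly, the to_date item expects null on unparsable input: with the default yyyy-MM-dd pattern, "2014-31-12" has no valid month, and Spark 3.x's to_date yields null rather than raising. A sketch of that contract (the `spark_to_date` helper and its strftime-style format are illustrative assumptions):

```python
from datetime import date, datetime

def spark_to_date(s, fmt="%Y-%m-%d"):
    # Expected to_date contract: unparsable input (here "2014-31-12",
    # which has month 31) maps to null/None instead of an exception.
    try:
        return datetime.strptime(s, fmt).date()
    except ValueError:
        return None

print(spark_to_date("2014-31-12"))  # None
print(spark_to_date("2014-12-31"))  # 2014-12-31
```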