
[feature] Support syntax "select a from t partition(p1)"

LittleFall opened this issue 3 years ago · 1 comment

Is your feature request related to a problem? Please describe. Currently, TiSpark doesn't support the MySQL/TiDB partition selection syntax select col_name from table_name partition(partition_name):

spark.sql("select a from t where partition(p0)").show(false)
org.apache.spark.sql.AnalysisException: Undefined function: 'partition'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 22
  at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15$$anonfun$applyOrElse$51.apply(Analyzer.scala:1395)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15$$anonfun$applyOrElse$51.apply(Analyzer.scala:1395)
  at org.apache.spark.sql.catalyst.analysis.package$.withPosition(package.scala:53)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15.applyOrElse(Analyzer.scala:1394)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$15.applyOrElse(Analyzer.scala:1386)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:258)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:258)
...

docs: https://github.com/pingcap/tispark/pull/1976

Describe the solution you'd like Support this syntax.

Describe alternatives you've considered We can still use a where condition to filter the partitions.

scala> spark.sql("select a from t where a<100").show(false)
21/03/23 10:24:44 WARN ObjectStore: Failed to get database test, returning NoSuchObjectException
21/03/23 10:24:44 WARN ObjectStore: Failed to get database test, returning NoSuchObjectException
+---+
|a  |
+---+
+---+
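To illustrate why a where filter can stand in for the partition() clause: in a RANGE-partitioned table, each partition corresponds to an interval on the partitioning column, so selecting a partition by name is equivalent to filtering on that partition's bounds. The sketch below shows the idea in plain Python; it is not TiSpark code, and the table layout (partitioned on column a, with p0 holding a < 100 and p1 holding 100 <= a < 200) as well as the names PARTITIONS, partition_predicate, and rewrite are hypothetical, invented for this example.

```python
import re

# Hypothetical RANGE partition layout on column `a`, as if declared with:
#   PARTITION p0 VALUES LESS THAN (100),
#   PARTITION p1 VALUES LESS THAN (200)
# The bounds are illustrative assumptions, not taken from the issue.
PARTITIONS = [("p0", 100), ("p1", 200)]

def partition_predicate(name, partitions=PARTITIONS):
    """Return a WHERE predicate on `a` selecting exactly the named partition."""
    lower = None
    for pname, upper in partitions:
        if pname == name:
            if lower is None:
                return f"a < {upper}"
            return f"a >= {lower} and a < {upper}"
        lower = upper  # the next partition starts where this one ends
    raise KeyError(f"unknown partition: {name}")

def rewrite(sql):
    """Rewrite 'select ... from t partition(pN)' into a where-based query."""
    m = re.fullmatch(r"(select .+ from \w+) partition\((\w+)\)", sql.strip())
    if not m:
        return sql  # no partition clause; leave the query untouched
    return f"{m.group(1)} where {partition_predicate(m.group(2))}"

print(rewrite("select a from t partition(p0)"))
# select a from t where a < 100
print(rewrite("select a from t partition(p1)"))
# select a from t where a >= 100 and a < 200
```

The rewritten query can then be passed to spark.sql() as usual, which is essentially what the manual where-condition workaround above does by hand.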


LittleFall commented on Mar 23 '21

Sorry for the late reply. Sadly, we can't support it, because Spark SQL doesn't support this syntax.

shiyuhang0 commented on Apr 02 '22