[spark] Support drop partition from top level
Purpose
Support dropping a partition from the top level in Spark when the partition spec has more than one field.
Example:
```sql
CREATE TABLE tbl (id int, data string)
USING paimon
PARTITIONED BY (dt string, hour string, event string);

ALTER TABLE tbl DROP PARTITION (dt='2023-01-01');
```
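The intended semantics can be sketched outside Paimon: a partial spec that covers only leading partition fields selects every full partition whose values agree with it, and all of those are dropped. This is an illustrative sketch only (the function and data below are hypothetical, not Paimon's implementation):

```python
def match_partitions(partitions, partial_spec):
    """Return the full partition specs that agree with every
    field given in the partial spec (a cascading match)."""
    return [p for p in partitions
            if all(p.get(k) == v for k, v in partial_spec.items())]

# Hypothetical partitions of the example table tbl
partitions = [
    {"dt": "2023-01-01", "hour": "00", "event": "click"},
    {"dt": "2023-01-01", "hour": "01", "event": "view"},
    {"dt": "2023-01-02", "hour": "00", "event": "click"},
]

# DROP PARTITION (dt='2023-01-01') should resolve to both
# 2023-01-01 partitions and leave the 2023-01-02 one intact.
to_drop = match_partitions(partitions, {"dt": "2023-01-01"})
```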
Tests
- DDLTestBase.scala
- DropPartitionParserTest.java
- PaimonPartitionManagementTest.scala
API and Format
No
Documentation
No
Can have a look at https://github.com/apache/spark/pull/48108
It looks like the Spark community hasn’t accepted this PR. Could we implement the feature within Paimon instead? From a data-ops perspective, cascading deletes based on parent partitions is a very useful capability.
Would you mind taking a look? @Zouxxyy @JingsongLi
> Can have a look at apache/spark#48108
> It looks like the Spark community hasn’t accepted this PR. Could we implement the feature within Paimon instead? From a data-ops perspective, cascading deletes based on parent partitions is a very useful capability.
+1. Can we support this in Paimon first to enable dropping from the top level? I think it's a useful feature that would enhance Paimon's Spark operations. cc @Zouxxyy @JingsongLi