[#5015] Support Spark Metadata
Purpose
Currently, Paimon is missing the Spark metadata description for Spark versions above 3.3. Since supporting Spark DEFAULT VALUE also requires this metadata description, this PR adds support for it along with some related changes.
Linked issue: close #5015
In the future we could use schema evolution to transfer the option fields.xxx.metadataJson into field.metadata.
PS: There are some open questions about DEFAULT VALUE: it is not compatible with StructField or NestedField, so this update only supports basic types. (I think the reason is the Spark version; 3.5.4 does not support the SQL syntax.)
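As a rough illustration of the mapping described above, here is a minimal Scala sketch of how a column default could be carried in Spark's StructField metadata. The metadata keys ("CURRENT_DEFAULT", "EXISTS_DEFAULT") are the ones Spark's default-value resolution reads; treating them as the target of the fields.xxx.metadataJson mapping is my assumption, not something stated in this PR.

```scala
import org.apache.spark.sql.types.{IntegerType, Metadata, MetadataBuilder, StructField}

// Assumed metadata keys for Spark's default-value handling.
val metadata: Metadata = new MetadataBuilder()
  .putString("CURRENT_DEFAULT", "42") // default expression applied to new INSERTs
  .putString("EXISTS_DEFAULT", "42")  // default value for rows written before the column existed
  .build()

// A basic-type field carrying the default; nested types are not covered by this PR.
val field: StructField = StructField("id", IntegerType, nullable = true, metadata)
```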
Tests
Steps (a minimal sketch follows the list):
- [x] Test creating a table with a default value, then inserting using DEFAULT
- [x] Test adding a column with a default value
- [x] Test dropping a column that has a default value
- [x] Test setting a default value on a column that does not yet have one
- [x] Test updating the default value on a column that already has one
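The sketch below walks through the test steps above as Spark SQL statements driven from Scala. It assumes a Spark 3.5+ session with the Paimon catalog registered under the name `paimon` and a connector that accepts these default-value DDL statements (which is what this PR adds); the warehouse path, database, table, and column names are illustrative.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[2]")
  .config("spark.sql.catalog.paimon", "org.apache.paimon.spark.SparkCatalog")
  .config("spark.sql.catalog.paimon.warehouse", "file:/tmp/paimon-warehouse")
  .getOrCreate()

// 1. Create a table with a default value, then insert using DEFAULT.
spark.sql("CREATE TABLE paimon.default.t (id INT, name STRING DEFAULT 'unknown')")
spark.sql("INSERT INTO paimon.default.t VALUES (1, DEFAULT)")

// 2. Add a column with a default value.
spark.sql("ALTER TABLE paimon.default.t ADD COLUMN city STRING DEFAULT 'beijing'")

// 3. Drop a column that has a default value.
spark.sql("ALTER TABLE paimon.default.t DROP COLUMN city")

// 4. Set a default on a column that does not have one yet.
spark.sql("ALTER TABLE paimon.default.t ALTER COLUMN id SET DEFAULT 0")

// 5. Change the default on a column that already has one.
spark.sql("ALTER TABLE paimon.default.t ALTER COLUMN name SET DEFAULT 'n/a'")

spark.sql("SELECT * FROM paimon.default.t").show()
```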
API and Format
Documentation
cc @Zouxxyy
@Zouxxyy Hello, I have completed this PR, but there are a couple of open questions. Because the UpdateColumnDefaultValue class does not exist in Spark 3.3/3.4, I use reflection to check for it. I think my PR has some formatting errors that need to be fixed; could you review it and give me some suggestions?
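For reference, a hedged sketch of such a reflection guard is shown below. The object name is hypothetical, and the fully qualified class name is my assumption based on Spark's TableChange nested-class naming; the PR comment only mentions the class UpdateColumnDefaultValue.

```scala
object UpdateColumnDefaultValueSupport {

  // Assumed binary name of Spark's TableChange.UpdateColumnDefaultValue inner class.
  private val ClassName =
    "org.apache.spark.sql.connector.catalog.TableChange$UpdateColumnDefaultValue"

  // True when the running Spark version ships this TableChange; per the comment
  // above, on Spark 3.3/3.4 this resolves to false.
  lazy val isAvailable: Boolean =
    try {
      Class.forName(ClassName)
      true
    } catch {
      case _: ClassNotFoundException => false
    }
}
```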
Hi @davidyuan1223 , default value support for Spark is implemented in https://github.com/apache/paimon/pull/5754
I'm closing this PR now. Feel free to create new PRs to improve it and fix bugs.