[Bug] "Can't extract bucket from row in dynamic bucket mode" when using Hive (MR or Tez) to insert into a Paimon primary-key table with dynamic bucket
Search before asking
- [x] I searched in the issues and found nothing similar.
Paimon version
1.0-20241111
Compute Engine
Hive 3.1.2
Minimal reproduce step
1) Create the table with Spark SQL:

   CREATE TABLE spark_catalog.tmp.hive2paimon_withpk (
       receive_time BIGINT NOT NULL,
       source_ip STRING NOT NULL,
       source_port INT,
       sdk_version STRING
   ) USING paimon TBLPROPERTIES (
       'bucket' = '-1',
       'dynamic-bucket.initial-buckets' = '1',
       'dynamic-bucket.target-row-num' = '1000',
       'file.format' = 'orc',
       'primary-key' = 'receive_time,source_ip');

2) Insert into the table from Hive (MR or Tez):

   insert into tmp.hive2paimon_withpk select 1740016451, '192.168.100.200', 34738, '2.9.9';
What doesn't meet your expectations?
    org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:234)
    ... 7 more
Caused by: java.lang.IllegalArgumentException: Can't extract bucket from row in dynamic bucket mode, you should use 'TableWrite.write(InternalRow row, int bucket)' method.
    at org.apache.paimon.table.sink.DynamicBucketRowKeyExtractor.bucket(DynamicBucketRowKeyExtractor.java:44)
    at org.apache.paimon.table.sink.TableWriteImpl.toSinkRecord(TableWriteImpl.java:205)
    at org.apache.paimon.table.sink.TableWriteImpl.writeAndReturn(TableWriteImpl.java:174)
    at org.apache.paimon.table.sink.TableWriteImpl.write(TableWriteImpl.java:147)
    at org.apache.paimon.hive.mapred.PaimonRecordWriter.write(PaimonRecordWriter.java:67)
    ... 17 more
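For context: with 'bucket' = '-1' (dynamic bucket mode) the bucket cannot be derived from the row itself, so the exception is thrown as soon as a writer calls the single-argument write. Below is a minimal Java sketch of the difference between that call and the two-argument overload the exception message points to. It uses Paimon's public batch write API; the class name `DynamicBucketWriteSketch`, the method `writeOneRow`, and the hard-coded bucket id are illustrative placeholders only, not the actual Hive integration code.

```java
import java.util.List;

import org.apache.paimon.data.BinaryString;
import org.apache.paimon.data.GenericRow;
import org.apache.paimon.data.InternalRow;
import org.apache.paimon.table.Table;
import org.apache.paimon.table.sink.BatchTableCommit;
import org.apache.paimon.table.sink.BatchTableWrite;
import org.apache.paimon.table.sink.BatchWriteBuilder;
import org.apache.paimon.table.sink.CommitMessage;

public class DynamicBucketWriteSketch {

    // Writes the row from the reproduce step into the dynamic-bucket table.
    static void writeOneRow(Table table) throws Exception {
        BatchWriteBuilder builder = table.newBatchWriteBuilder();
        BatchTableWrite write = builder.newWrite();

        // Same row as the Hive INSERT in the reproduce step.
        InternalRow row = GenericRow.of(
                1740016451L,
                BinaryString.fromString("192.168.100.200"),
                34738,
                BinaryString.fromString("2.9.9"));

        // write.write(row);
        // ^ the single-argument overload; with 'bucket' = '-1' this is the call
        //   that throws "Can't extract bucket from row in dynamic bucket mode".

        int bucket = 0; // placeholder: a real writer must obtain this from a
                        // bucket assigner / index lookup, not a constant
        write.write(row, bucket); // the overload the exception message asks for

        List<CommitMessage> messages = write.prepareCommit();
        BatchTableCommit commit = builder.newCommit();
        commit.commit(messages);

        write.close();
        commit.close();
    }
}
```

Presumably the Hive writer (PaimonRecordWriter) would need something equivalent to the bucket assignment the Flink/Spark sinks do before calling write, rather than the single-argument overload.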
Anything else?
No response
Are you willing to submit a PR?
- [ ] I'm willing to submit a PR!