Spark Load fails when the source table is a Hudi table
### Steps to reproduce the behavior (Required)
```sql
LOAD LABEL db.linyu_20240424_hudi2sr (
    DATA FROM TABLE hudi_table
    INTO TABLE linyu001_hudi_2_sr_test
    TEMPORARY PARTITION (temp__p20210102)
    SET (
        `_hoodie_commit_time` = `_hoodie_commit_time`,
        `_hoodie_commit_seqno` = `_hoodie_commit_seqno`,
        `_hoodie_record_key` = `_hoodie_record_key`,
        `_hoodie_partition_path` = `_hoodie_partition_path`,
        `_hoodie_file_name` = `_hoodie_file_name`,
        `dt` = `dt`,
        `id` = `id`,
        `name` = `name`,
        `age` = `age`,
        `ts` = `ts`
    )
    WHERE (`dt` = '20210102')
) WITH RESOURCE 'xxxxx' (
    "spark.yarn.tags" = "h2s_foit_linyu20240422abc001",
    "spark.dynamicAllocation.enabled" = "true",
    "spark.executor.memory" = "3g",
    "spark.executor.memoryOverhead" = "2g",
    "spark.streaming.batchDuration" = "5",
    "spark.executor.cores" = "1",
    "spark.yarn.executor.memoryOverhead" = "4g",
    "spark.speculation" = "false",
    "spark.dynamicAllocation.minExecutors" = "10",
    "spark.dynamicAllocation.maxExecutors" = "100"
) PROPERTIES (
    "timeout" = "72000",
    "spark_load_submit_timeout" = "36000"
)
```
### Expected behavior (Required)

The load succeeds.
### Real behavior (Required)

The load fails with:

```
Unexpected exception: Source table hudi_table is not HiveTable
```
### StarRocks version (Required)

2.5.12
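
The error message suggests that the `DATA FROM TABLE` form of Spark Load only accepts a Hive external table (`ENGINE = HIVE`) as its source, and rejects other external table types such as Hudi. As a hedged workaround sketch (not verified on this cluster), the Hudi data could be exposed to StarRocks through a Hive external table that points at the same underlying Hive Metastore table; all resource, database, and column type names below are placeholder assumptions:

```sql
-- Hypothetical workaround: wrap the Hudi source in a Hive external table,
-- which is the source type Spark Load's DATA FROM TABLE expects.
-- "hive_resource", "source_db", and the column types are assumptions.
CREATE EXTERNAL TABLE hudi_table_as_hive (
    `_hoodie_commit_time`    VARCHAR(64),
    `_hoodie_commit_seqno`   VARCHAR(64),
    `_hoodie_record_key`     VARCHAR(128),
    `_hoodie_partition_path` VARCHAR(256),
    `_hoodie_file_name`      VARCHAR(256),
    `dt`   VARCHAR(16),
    `id`   BIGINT,
    `name` VARCHAR(128),
    `age`  INT,
    `ts`   BIGINT
)
ENGINE = HIVE
PROPERTIES (
    "resource" = "hive_resource",
    "database" = "source_db",
    "table"    = "hudi_table"
);
```

The `LOAD LABEL ... DATA FROM TABLE` statement would then reference `hudi_table_as_hive` instead of `hudi_table`. Whether the Hive reader can handle the Hudi file layout depends on the Hudi table type (copy-on-write vs. merge-on-read), so this only sketches the direction the error message points to.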