
[SUPPORT] Data duplication caused by a flaw in deleting invalid files before commit

Open beyond1920 opened this issue 8 months ago • 9 comments

Dear community, our user complained that after their daily job, which writes to a Hudi COW table, finished, the downstream reading jobs found many duplicate records today. The daily job has been running in production for a long time, and this is the first time it has produced such a wrong result. The user provided a duplicated record as an example to help debug: the record appears in 3 base files which belong to different file groups.

Looking at today's writer job, the Spark application finished successfully. In the driver log, two of those files are marked as invalid files to be deleted, and only one file is marked as valid. In the clean stage task log, those two files are also marked to be deleted, and there is no exception in the task either. Those two files already existed on HDFS before the clean stage began, but they still existed after the clean stage.

Finally, I found the root cause: a corner case in HDFS. fs.delete does not throw any exception when this happens; it only returns false if HDFS does not delete the file successfully. I also checked the fs.delete API, and this return-false-on-failure behavior matches its documented contract.
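A minimal sketch of the failure mode, assuming a standalone caller of the Hadoop FileSystem API (the file path is hypothetical). It shows that FileSystem#delete reports this kind of failure only through its boolean return value, so a caller that ignores it leaves the "deleted" file visible to readers:

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DeleteReturnValueDemo {
  public static void main(String[] args) throws IOException {
    FileSystem fs = FileSystem.get(new Configuration());
    // Hypothetical invalid base file left over from a failed write attempt.
    Path invalidFile = new Path("/tmp/hudi/invalid-base-file.parquet");

    // FileSystem#delete throws IOException only for hard failures (e.g. RPC errors).
    // If HDFS simply does not remove the file, it returns false and the caller
    // sees no exception at all.
    boolean deleted = fs.delete(invalidFile, false);

    if (!deleted && fs.exists(invalidFile)) {
      // Without a check like this, the invalid file silently survives and stays
      // visible to downstream readers, which is how the duplicates in this issue appeared.
      System.err.println("Delete silently failed for " + invalidFile);
    }
  }
}
```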

I think we should check the return value of fs.delete in HoodieTable#deleteInvalidFilesByPartitions to avoid wrong results. Besides, it's probably necessary to audit all the other places that call fs.delete. Any suggestions? A rough sketch of the check I have in mind is below.
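This is only an illustrative sketch of the proposed check, not the actual Hudi code path; the helper name is hypothetical, and the real logic in HoodieTable#deleteInvalidFilesByPartitions runs per partition with the engine context. The idea is to fail the commit when fs.delete reports a silent failure:

```java
import java.io.IOException;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hudi.exception.HoodieIOException;

public class InvalidFileCleaner {
  // Hypothetical helper showing the proposed return-value check for each invalid file.
  static void deleteInvalidFile(FileSystem fs, String filePath) {
    Path path = new Path(filePath);
    try {
      boolean deleted = fs.delete(path, false);
      // A false return means the file was NOT removed; surface an error instead of
      // proceeding with the commit, so readers never see the invalid base file.
      if (!deleted && fs.exists(path)) {
        throw new HoodieIOException("Failed to delete invalid file before commit: " + filePath);
      }
    } catch (IOException e) {
      throw new HoodieIOException("Error while deleting invalid file: " + filePath, e);
    }
  }
}
```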

beyond1920 · Jun 08 '24 16:06