Haoning Sun
@Jackson-Wang-7 Can you help review this code?
@yyongycy I have tested fetching 15,000 entries with the recursive option, and performance improved by nearly 40%.
@tcrain I will add a property for areDescendantsLoaded in getBucket. In fact, in our scenarios syncing metadata is a must; of course, I also have to consider the more general...
@lucyge2022 @yyongycy Any other suggestions? I think the problem can be solved step by step: the issue of pulling all files on every request can be addressed first, which is...
@yyongycy @Jackson-Wang-7 Please help review the issue and solution.
> @Haoning-Sun Thanks for your fix. Is there any error log in proxy.log? Can you attach the error stack here?

Error log:
```
ERROR S3Handler - Exception during create s3handler: alluxio.proxy.s3.S3Exception:...
```
@yyongycy @Jackson-Wang-7 Please help review the issue and solution.
> wondering whether this error exists in 2.10?

This is an issue if `async` is enabled and the entire large file is uploaded in a single request.
When the client writes data to the worker, it is first written to the worker's Alluxio block via `mCurrentBlockOutStream`, and then the data sent to the worker by `mUnderStorageOutputStream` is...
The reason for sending two copies of the data to the worker is probably known: the Alluxio client writes to the worker block by block, and each block corresponds to a...
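The two-stream behavior described above can be sketched as a simple "tee" output stream. This is a minimal, hypothetical illustration, not Alluxio's actual implementation; the field names `mBlockOut` and `mUnderStorageOut` only stand in for the `mCurrentBlockOutStream` and `mUnderStorageOutputStream` mentioned in the comment, and everything else here is invented for the sketch.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Hypothetical sketch of the dual-write pattern: every byte the client
// writes is duplicated to two sinks, one for the Alluxio block on the
// worker and one for the under storage.
public class TeeOutStream extends OutputStream {
  private final OutputStream mBlockOut;        // stand-in for mCurrentBlockOutStream
  private final OutputStream mUnderStorageOut; // stand-in for mUnderStorageOutputStream

  public TeeOutStream(OutputStream blockOut, OutputStream underStorageOut) {
    mBlockOut = blockOut;
    mUnderStorageOut = underStorageOut;
  }

  @Override
  public void write(int b) throws IOException {
    mBlockOut.write(b);        // first copy: Alluxio block storage
    mUnderStorageOut.write(b); // second copy: under storage
  }

  public static void main(String[] args) throws IOException {
    ByteArrayOutputStream block = new ByteArrayOutputStream();
    ByteArrayOutputStream ufs = new ByteArrayOutputStream();
    try (OutputStream out = new TeeOutStream(block, ufs)) {
      out.write("hello".getBytes());
    }
    // Both sinks receive the same bytes.
    System.out.println(block.toString() + " " + ufs.toString());
  }
}
```

Under this (assumed) model, the duplication is inherent to writing through both streams at once, which is why the same payload reaches the worker twice.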