ForwardXu

Results: 23 issues by ForwardXu

1. Environment: CentOS 7, Java 1.8091, Maven 3.9, Node 6.14.4, Yarn 1.10.1
2. Run `./ide.sh build` ![image](https://user-images.githubusercontent.com/10494131/47471650-d6042280-d83d-11e8-9d72-d00e8bc6d32b.png)

```
2022-07-18 16:49:53 org.apache.hudi.exception.HoodieIOException: Could not load filesystem view storage properties from hdfs://XXXXXX/user/tdw/warehouse/csig_billing_rt_ods.db/ods_dev_flow_t_operation_flow_ri/.hoodie/.aux/view_storage_conf.properties
    at org.apache.hudi.util.ViewStorageProperties.loadFromProperties(ViewStorageProperties.java:78)
    at org.apache.hudi.util.StreamerUtil.getHoodieClientConfig(StreamerUtil.java:213)
    at org.apache.hudi.util.StreamerUtil.getHoodieClientConfig(StreamerUtil.java:152)
    at org.apache.hudi.util.StreamerUtil.createWriteClient(StreamerUtil.java:376)
    at org.apache.hudi.util.StreamerUtil.createWriteClient(StreamerUtil.java:360)
    at org.apache.hudi.sink.compact.CompactFunction.open(CompactFunction.java:81)
    at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:34)
    at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
    at ...
```

priority:minor
flink
table-service

```
2022-07-19 05:44:23 org.apache.hudi.exception.HoodieIOException: Could not read commit details from hdfs://XXXXXX/.hoodie/20220719053423274.deltacommit
    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.readDataFromPath(HoodieActiveTimeline.java:763)
    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.getInstantDetails(HoodieActiveTimeline.java:264)
    at org.apache.hudi.common.table.timeline.HoodieDefaultTimeline.getInstantDetails(HoodieDefaultTimeline.java:372)
    at org.apache.hudi.hadoop.utils.HoodieInputFormatUtils.getCommitMetadata(HoodieInputFormatUtils.java:511)
    at org.apache.hudi.sink.partitioner.profile.WriteProfiles.getCommitMetadata(WriteProfiles.java:194)
    at org.apache.hudi.source.IncrementalInputSplits.lambda$inputSplits$71(IncrementalInputSplits.java:183)
    at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
    at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
    ...
```

priority:major
flink
table-service

Fixed an issue where narrowing a `long` to an `int` could overflow silently.
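The issue snippet does not include the patched code; a minimal sketch of the unsafe pattern and a guarded alternative (the `NarrowingDemo` class and method names here are hypothetical, for illustration only) could look like:

```java
public class NarrowingDemo {
    // Unsafe: a plain cast silently truncates values outside the int range.
    static int unsafeNarrow(long value) {
        return (int) value;
    }

    // Safe: Math.toIntExact throws ArithmeticException on overflow
    // instead of truncating silently.
    static int safeNarrow(long value) {
        return Math.toIntExact(value);
    }

    public static void main(String[] args) {
        long big = Integer.MAX_VALUE + 1L; // 2147483648, does not fit in int
        System.out.println(unsafeNarrow(big)); // -2147483648 (silent overflow)
        try {
            safeNarrow(big);
        } catch (ArithmeticException e) {
            System.out.println("overflow detected");
        }
    }
}
```

The guarded version surfaces the overflow as an exception at the narrowing point rather than letting a wrapped value propagate.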

Fix issue [CALCITE-3329](https://issues.apache.org/jira/browse/CALCITE-3329): achieve features similar to Facebook's osquery, e.g. `select * from os_version;` `select * from system_info;` `select * from mounts;` `select * from interface_addresses;` `select * from memory_info;`...

`JSON_EXTRACT(json_doc, path[, path] ...)` Returns data from a JSON document, selected from the parts of the document matched by the `path` arguments. Returns `NULL` if any argument is `NULL` or...
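The path-matching semantics described above can be illustrated with a small Java sketch that evaluates a simplified `$.a.b`-style path against nested maps standing in for a JSON document. This is an illustration of the behavior, not the Calcite implementation:

```java
import java.util.Map;

public class JsonExtractDemo {
    // Evaluate a simplified path like "$.a.b" against nested maps; returns
    // null when the path does not match, mirroring JSON_EXTRACT's NULL result.
    // Real JSON paths (array indices, wildcards) are out of scope here.
    static Object extract(Map<String, Object> doc, String path) {
        Object current = doc;
        for (String key : path.replaceFirst("^\\$\\.?", "").split("\\.")) {
            if (key.isEmpty()) return current;        // path "$" -> whole doc
            if (!(current instanceof Map)) return null;
            current = ((Map<?, ?>) current).get(key);
            if (current == null) return null;
        }
        return current;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = Map.<String, Object>of("a", Map.of("b", 42));
        System.out.println(extract(doc, "$.a.b")); // 42
        System.out.println(extract(doc, "$.a.c")); // null (no match)
    }
}
```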

Fix ISSUE [CALCITE-3137](https://issues.apache.org/jira/browse/CALCITE-3137)

This pull request is for issue [CALCITE-3130](https://issues.apache.org/jira/browse/CALCITE-3130).

`JSON_INSERT(json_doc, path, val[, path, val] ...)` `JSON_REPLACE(json_doc, path, val[, path, val] ...)` `JSON_SET(json_doc, path, val[, path, val] ...)` Inserts data into a JSON document and returns the result. Returns `NULL`...
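The three functions differ in when a path takes effect: `JSON_INSERT` only adds missing values, `JSON_REPLACE` only updates existing ones, and `JSON_SET` does both. A small Java sketch over a flat map (an illustration of these MySQL-style semantics, not the Calcite implementation, and ignoring general path syntax) makes the contrast concrete:

```java
import java.util.HashMap;
import java.util.Map;

public class JsonModifyDemo {
    enum Mode { INSERT, REPLACE, SET }

    // Apply one (key, value) update to a flat map under the three modes:
    // INSERT adds only missing keys, REPLACE updates only existing keys,
    // SET unconditionally writes the value.
    static Map<String, Object> modify(Map<String, Object> doc, Mode mode,
                                      String key, Object val) {
        Map<String, Object> out = new HashMap<>(doc);
        boolean exists = out.containsKey(key);
        if (mode == Mode.INSERT && !exists) out.put(key, val);
        if (mode == Mode.REPLACE && exists) out.put(key, val);
        if (mode == Mode.SET) out.put(key, val);
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new HashMap<>(Map.of("a", 1));
        System.out.println(modify(doc, Mode.INSERT, "a", 9));  // "a" exists: unchanged
        System.out.println(modify(doc, Mode.REPLACE, "b", 9)); // "b" missing: unchanged
        System.out.println(modify(doc, Mode.SET, "b", 9));     // "b" added
    }
}
```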

## *Tips*
- *Thank you very much for contributing to Apache Hudi.*
- *Please review https://hudi.apache.org/contribute/how-to-contribute before opening a pull request.*

## What is the purpose of the pull request

...

meta-sync
priority:major
spark-sql