Liang, Chen (digitalCHN)
> This works for me:
>
> ```
> FROM gcr.io/spark-operator/spark:v3.1.1-hadoop3
>
> USER root
>
> ADD https://xxx.com/artifactory/apixio-spark/org/apache/hadoop/hadoop-aws/2.7.4/hadoop-aws-2.7.4.jar $SPARK_HOME/jars/
> ADD https://xxx.com/artifactory/apixio-spark/com/amazonaws/aws-java-sdk-bundle/1.7.4.2/aws-java-sdk-1.7.4.2.jar $SPARK_HOME/jars/
>
> RUN groupadd -g 185...
> ```
It has expired.
@ad1happy2go Thanks for the reply. Hudi did transform the partition-column timestamp value into the date-format value according to the `hoodie.keygen.timebased.output.dateformat:yyyy-MM-dd` config. At the same time, the original timestamp value...
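
A minimal sketch of the write path being described, assuming a Spark shell in Scala with the Hudi bundle on the classpath; the table name, column names, base path, and sample rows are hypothetical, and the key configs (`TimestampBasedKeyGenerator`, `hoodie.keygen.timebased.output.dateformat`) are the ones referenced above:

```scala
// Sketch: write an IoT-style frame where the epoch-millis event time is used
// as the partition field and rendered as yyyy-MM-dd in the partition path.
// Table/column names and the base path are illustrative only.
import spark.implicits._
import org.apache.spark.sql.SaveMode

val df = Seq(
  ("dev-001", 1696118400000L, 21.5),   // hypothetical sample rows
  ("dev-002", 1696204800000L, 19.8)
).toDF("device_id", "event_ts", "temperature")

df.write.format("hudi").
  option("hoodie.table.name", "iot_events").
  option("hoodie.datasource.write.recordkey.field", "device_id").
  option("hoodie.datasource.write.precombine.field", "event_ts").
  option("hoodie.datasource.write.partitionpath.field", "event_ts").
  option("hoodie.datasource.write.keygenerator.class",
         "org.apache.hudi.keygen.TimestampBasedKeyGenerator").
  option("hoodie.keygen.timebased.timestamp.type", "EPOCHMILLISECONDS").
  option("hoodie.keygen.timebased.output.dateformat", "yyyy-MM-dd").
  mode(SaveMode.Append).
  save("/tmp/hudi/iot_events")
```

With these options the partition path is the `yyyy-MM-dd` rendering of `event_ts`, which is the transformation described in the comment above.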
@ad1happy2go In the IoT scenario I've been working on, the event time is adopted as the partition column. At the same time, we would query data based...
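
A minimal sketch of the query side under the same assumptions as the write sketch above (hypothetical base path and column names); it filters on Hudi's `_hoodie_partition_path` metadata column, which holds the formatted date string, so a single day's partition can be pruned:

```scala
// Sketch: read the table written above and restrict the scan to one
// event-date partition via the formatted partition path.
import org.apache.spark.sql.functions.col

val events = spark.read.format("hudi").load("/tmp/hudi/iot_events")

events.
  filter(col("_hoodie_partition_path") === "2023-10-01").  // yyyy-MM-dd partition value
  select("device_id", "event_ts", "temperature").
  show()
```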