[FLINK-28693][table] Fix Janino compilation failure caused by generated code referencing a class in table-planner
What is the purpose of the change
The code generated by codegen references a class in the table-planner package, but classes in table-planner are hidden by table-planner-loader, so the classloader cannot find the referenced class at runtime.
This PR moves the referenced class WatermarkGeneratorCodeGeneratorFunctionContextWrapper from table-planner to table-runtime.
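For illustration, here is a minimal, self-contained Java sketch of the isolation mechanism at play. This is not Flink's actual loader implementation; it only mimics the idea of a classloader that hides a package prefix, the way table-planner-loader hides planner internals. The class name and the hidden prefix are assumptions for the example.

```java
import java.net.URL;
import java.net.URLClassLoader;

// Sketch of a classloader that refuses to resolve classes under a hidden
// package prefix, analogous to how table-planner-loader isolates planner
// internals from user code and from Janino-compiled generated code.
public class HidingClassLoaderSketch extends URLClassLoader {

    // Hypothetical prefix standing in for the isolated planner package.
    private static final String HIDDEN_PREFIX = "org.apache.flink.table.planner.";

    public HidingClassLoaderSketch(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        if (name.startsWith(HIDDEN_PREFIX)) {
            // Generated code that references a planner class ends up here:
            // the class exists in the planner jar but is invisible through
            // this loader, so the Janino-compiled code cannot link against it.
            throw new ClassNotFoundException(name + " is hidden by the planner loader");
        }
        return super.loadClass(name, resolve);
    }
}
```

Moving the referenced wrapper into table-runtime sidesteps the problem, because table-runtime classes remain visible to the generated code.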
Brief change log
- Moves the class WatermarkGeneratorCodeGeneratorFunctionContextWrapper from table-planner to table-runtime
Verifying this change
This PR can't be verified by IT or UT cases, because the bug can only be reproduced when the table-planner jar is not on the classpath (i.e., only table-planner-loader is).
So we need to verify this PR manually, using the following steps:
- Run Kafka and prepare some data. You can start Docker and debug the test in flink-connector-kafka to do this: https://github.com/apache/flink-connector-kafka/blob/abf4563e0342abe25dc28bb6b5457bb971381f61/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/table/KafkaTableITCase.java#L488C17-L488C58
Note: pause the debugger right after the data has been prepared: https://github.com/apache/flink-connector-kafka/blob/abf4563e0342abe25dc28bb6b5457bb971381f61/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/table/KafkaTableITCase.java#L531C9-L531C48
- Run start-cluster.sh to start Flink.
- Run sql-client.sh and execute a DDL that adds a computed column and a watermark column (see the Table API sketch after these steps), like:
ts AS COALESCE(`timestamp`, CURRENT_TIMESTAMP),
WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
- Run select * from kafka_table
Before this fix, you will see an exception thrown. After applying this fix and rebuilding the Flink repo, re-run steps 2-4, and you can see the actual data in sql-client.
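For reference, the manual steps above can also be reproduced programmatically. The following is a minimal sketch using the Table API, assuming a local Kafka broker on localhost:9092 and a hypothetical JSON topic named test-topic; the schema, apart from the computed column and the watermark from the steps above, is made up for the example.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

// Sketch of the verification scenario: before the fix, the watermark codegen
// for the computed column fails with a Janino compile error at runtime;
// after the fix, rows stream back from the query.
public class WatermarkReproSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // DDL with a computed column and a watermark, mirroring the steps above.
        tEnv.executeSql(
                "CREATE TABLE kafka_table (\n"
                        + "  `user` STRING,\n"
                        + "  `timestamp` TIMESTAMP_LTZ(3),\n"
                        + "  ts AS COALESCE(`timestamp`, CURRENT_TIMESTAMP),\n"
                        + "  WATERMARK FOR ts AS ts - INTERVAL '5' SECOND\n"
                        + ") WITH (\n"
                        + "  'connector' = 'kafka',\n"
                        + "  'topic' = 'test-topic',\n"
                        + "  'properties.bootstrap.servers' = 'localhost:9092',\n"
                        + "  'scan.startup.mode' = 'earliest-offset',\n"
                        + "  'format' = 'json'\n"
                        + ")");

        tEnv.executeSql("SELECT * FROM kafka_table").print();
    }
}
```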
Does this pull request potentially affect one of the following parts:
- Dependencies (does it add or upgrade a dependency): no
- The public API, i.e., is any changed class annotated with @Public(Evolving): no
- The serializers: no
- The runtime per-record code paths (performance sensitive): no
- Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn, ZooKeeper: no
- The S3 file system connector: no
Documentation
- Does this pull request introduce a new feature? no
- If yes, how is the feature documented? not applicable
CI report:
- 0c6522cdfb5a9e7d07a58bc7381832500f0118a3 Azure: SUCCESS
Bot commands
The @flinkbot bot supports the following commands:
- @flinkbot run azure: re-run the last Azure build
Is there a chance to add a test/ITest for this issue?
Hi, @snuyanzin. I think regular IT and UT tests cannot help here. If tests must be added, I guess they could go into the flink-stream-sql-test module under flink-end-to-end-tests, but I'm not sure whether that module depends only on table-planner-loader and not on table-planner. We would also need to mock a source that supports SupportsWatermarkPushDown (a sketch of such a mock follows below).
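For what it's worth, here is a skeleton of what such a mocked source might look like. The class name is hypothetical and the runtime provider is elided; it only shows the SupportsWatermarkPushDown wiring that makes the planner push the generated watermark strategy into the source.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.table.connector.ChangelogMode;
import org.apache.flink.table.connector.source.DynamicTableSource;
import org.apache.flink.table.connector.source.ScanTableSource;
import org.apache.flink.table.connector.source.abilities.SupportsWatermarkPushDown;
import org.apache.flink.table.data.RowData;

// Skeleton (hypothetical, not part of this PR) of a test source that accepts
// a pushed-down watermark, so the planner generates the watermark code under test.
public class MockWatermarkPushDownSource implements ScanTableSource, SupportsWatermarkPushDown {

    private WatermarkStrategy<RowData> watermarkStrategy;

    @Override
    public void applyWatermark(WatermarkStrategy<RowData> watermarkStrategy) {
        // The planner hands the generated watermark strategy over here.
        this.watermarkStrategy = watermarkStrategy;
    }

    @Override
    public ChangelogMode getChangelogMode() {
        return ChangelogMode.insertOnly();
    }

    @Override
    public ScanRuntimeProvider getScanRuntimeProvider(ScanContext runtimeProviderContext) {
        // A real test would return a provider that emits a few test rows;
        // elided here to keep the skeleton short.
        throw new UnsupportedOperationException("test scaffolding goes here");
    }

    @Override
    public DynamicTableSource copy() {
        return new MockWatermarkPushDownSource();
    }

    @Override
    public String asSummaryString() {
        return "MockWatermarkPushDownSource";
    }
}
```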
I have tested this change on 1.18.0 under the same conditions as reported in FLINK-28693, and it fixes the issue. The job deploys, consumes, and processes Kafka messages using the UDF as expected.
This is a pr we would like merged. It looks like @snuyanzin asked for tests to be added. @xuyangzhong are you looking at adding the tests?
Hi, @davidradl . I'm attempting to add tests for it, but my recent schedule has been quite tight. I will do my best to re-push the PR within the next few days. Considering your dependency on this PR, another quick solution would be to cherry-pick this PR to your branch and repackage Flink.
Hi, @snuyanzin . I have added the test in flink-end-to-end-tests. I would appreciate it if you have time to review it.
@xuyangzhong thanks for the update. It looks good from my side; I put a couple of minor comments.
Could you also please create backports for other branches?
@snuyanzin Sure. I will create PRs to backport to 1.19 and 1.18 and comment the links here after this PR is merged. WDYT?
That is also OK with me.
Thanks very much for the backport @snuyanzin 👍