
[Bug] quick-start-flink not working

Open zhouhoo opened this issue 9 months ago • 7 comments

Search before asking

  • [X] I had searched in the issues and found no similar issues.

What happened

When I tried SeaTunnel on Flink as the official doc describes (see the link), I got no error and the Flink job ran, but the console had NO sink data output as the doc says it should. I tried to ask the Slack channel for help, but the Slack link is outdated. BTW, I also tried to sink to a MySQL database; still no data was written.

SeaTunnel Version

2.3.3

SeaTunnel Config

Only the seatunnel-env.sh file was modified:
FLINK_HOME=/root/shen/flink-1.13.5

Running Command

1. Add mysql-connector-j-8.3.0.jar to lib.
2. Install some plugins, including fake-source and console.
3. ./bin/start-seatunnel-flink-13-connector-v2.sh --config ./config/v2.batch.config.template
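For reference, the quick-start batch template pairs a FakeSource with a Console sink; a minimal sketch of what such a config looks like (field names and row count are illustrative, not copied from the shipped template):

```
env {
  # run as a finite batch job
  parallelism = 1
  job.mode = "BATCH"
}

source {
  FakeSource {
    # generate a handful of synthetic rows
    row.num = 16
    schema = {
      fields {
        name = "string"
        age = "int"
      }
    }
  }
}

sink {
  # print each row to the log of the task that runs the sink
  Console {}
}
```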

Error Exception

Job has been submitted with JobID 607d2d65753670b5f381fc502cb65c57
Program execution finished
Job with JobID 607d2d65753670b5f381fc502cb65c57 has finished.
Job Runtime: 2207 ms

Zeta or Flink or Spark Version

flink 1.13.5

Java or Scala Version

No response

Screenshots

(screenshot attached)

Are you willing to submit PR?

  • [ ] Yes I am willing to submit a PR!

Code of Conduct

zhouhoo avatar May 10 '24 13:05 zhouhoo

The config you submitted is FakeSource to Console; the console output is printed on your worker node, not on the node where you submitted the job.

liunaijie avatar May 11 '24 03:05 liunaijie

You can update the config to write data to a local file or another connector, then verify the result.
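For example, the Console sink could be swapped for a LocalFile sink so the output is easy to inspect on disk. A sketch, assuming SeaTunnel 2.3.x option names; the path is hypothetical and other LocalFile options are left at their defaults:

```
sink {
  LocalFile {
    # hypothetical output directory on the worker node
    path = "/tmp/seatunnel/output"
    # write plain text rows for easy inspection
    file_format_type = "text"
  }
}
```

After the job finishes, the generated rows should appear as files under the configured path on the node that ran the sink task.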

liunaijie avatar May 11 '24 03:05 liunaijie

The config you submitted is FakeSource to Console; the console output is printed on your worker node, not on the node where you submitted the job.

Flink worker node? I started Flink and SeaTunnel all on the same host; there is only one host. It's just a quick start.

zhouhoo avatar May 11 '24 06:05 zhouhoo

You can update the config to write data to a local file or another connector, then verify the result.

I also tried sinking to a MySQL database; no data was written.

zhouhoo avatar May 11 '24 06:05 zhouhoo

Try this to verify:

  1. Submit the v2.streaming.conf.template config (this is a streaming job config).
  2. Log in to the Flink UI and check the running Flink jobs; verify that this job has been submitted to your Flink cluster.
  3. Check the logs.
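On a single-host setup, step 3 usually means the TaskManager log under $FLINK_HOME/log, since Console sink output is printed by the task that runs the sink rather than by the submitting shell. A sketch, assuming Flink's default log file naming:

```shell
# Console sink rows land in the TaskManager (taskexecutor) log,
# not in the terminal where start-seatunnel-flink-*.sh was run.
tail -n 100 "$FLINK_HOME"/log/flink-*-taskexecutor-*.log
```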

liunaijie avatar May 11 '24 06:05 liunaijie

Try this to verify:

  1. Submit the v2.streaming.conf.template config (this is a streaming job config).
  2. Log in to the Flink UI and check the running Flink jobs; verify that this job has been submitted to your Flink cluster.
  3. Check the logs.

I tried. The job was successfully submitted to Flink with no errors and is in RUNNING status, but no data is being consumed. The Flink logs show that some checkpoints completed successfully. SeaTunnel has no log output.

zhouhoo avatar May 11 '24 07:05 zhouhoo

Try this to verify:

  1. Submit the v2.streaming.conf.template config (this is a streaming job config).
  2. Log in to the Flink UI and check the running Flink jobs; verify that this job has been submitted to your Flink cluster.
  3. Check the logs.

I tried. The job was successfully submitted to Flink with no errors and is in RUNNING status, but no data is being consumed. The Flink logs show that some checkpoints completed successfully. SeaTunnel has no log output.

Hi, I packaged from dev, using Flink 1.13.5, and made some updates to v2.streaming.conf.template:

  1. Set env.parallelism and FakeSource.parallelism to 1.
  2. Set FakeSource.row.num to 100.

Then it ran well. I can see the result in the TaskManager's log.
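The two changes above, applied to the streaming template, would look roughly like this (only the touched settings are shown; the rest of the template is unchanged):

```
env {
  # single task chain so all output lands in one TaskManager log
  parallelism = 1
  job.mode = "STREAMING"
}

source {
  FakeSource {
    parallelism = 1
    # emit 100 synthetic rows
    row.num = 100
  }
}
```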

liunaijie avatar May 11 '24 07:05 liunaijie

This issue has been automatically marked as stale because it has not had recent activity for 30 days. It will be closed in the next 7 days if no further activity occurs.

github-actions[bot] avatar Jun 11 '24 00:06 github-actions[bot]