
[Bug] [seatunnel-connector-spark-jdbc] after the sink completes, a temporary table is created at the destination

[Open] liuyongfei1 opened this issue 2 years ago • 11 comments

Search before asking

  • [X] I had searched in the issues and found no similar issues.

What happened

When running a JDBC-to-JDBC data sync with the Spark engine, after the sync completes successfully an empty table with exactly the same schema as the target table is created on the destination side.

SeaTunnel Version

The latest dev branch. 2.2.3 also has this problem.

SeaTunnel Config

env {
  spark.app.name = "lyf_test_spark_0909"
  spark.executor.instances = 2
  spark.executor.cores = 1
  spark.executor.memory = "1g"
}
source {
  jdbc {
    driver = "com.mysql.cj.jdbc.Driver"
    url = "jdbc:mysql://10.*.*.*:3306/davinci?useUnicode=true&characterEncoding=utf8&useSSL=false"
    table = "sea1"
    result_table_name = "sea1_data"
    user = "*****"
    password = "**********"
  }
}
transform {
  sql {
    sql = "select id,name from sea1_data"
  }
}
sink {
  jdbc {
    saveMode = "update"
    driver = "com.mysql.cj.jdbc.Driver"
    url = "jdbc:mysql://10.*.*.*:3306/davinci?useUnicode=true&characterEncoding=utf8&useSSL=false"
    user = "*"
    password = "*"
    dbTable = "sea1_data"
    customUpdateStmt = "insert into sea3(id,name) values(?,?)"
  }
}

Running Command

/apache-seatunnel-incubating-2.1.3-SNAPSHOT/bin/start-seatunnel-spark.sh --master yarn --deploy-mode cluster --config /apache-seatunnel-incubating-2.1.3-SNAPSHOT/config/test.spark.jdbc.jdbc.conf

Error Exception

No errors were reported; the job ran successfully.

Flink or Spark Version

spark version: 2.3.2.3.1.5.0-152

Java or Scala Version

Java: 1.8, Scala: 2.11

Screenshots

1. The job ran successfully and data was written to the target table: [screenshot: table-4]

2. An empty table with exactly the same structure as the target table is created on the destination side: [screenshot: table-5]
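In text form, what the two screenshots show can be checked on the destination like this (a hypothetical verification, using the table names from the config above and anticipating laglangyue's diagnosis below that the empty table is the one named by dbTable):

-- sea3 receives the rows written through customUpdateStmt
SELECT COUNT(*) FROM sea3;
-- sea1_data is the spurious empty table with the same structure
SELECT COUNT(*) FROM sea1_data;
SHOW TABLES LIKE 'sea%';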

Are you willing to submit PR?

  • [ ] Yes I am willing to submit a PR!

Code of Conduct

  • [X] I agree to follow this project's Code of Conduct

liuyongfei1 · Sep 15 '22 11:09

If you are not in a hurry, I will try this weekend. But it may exceed my ability. On Monday, I will ask others to help me repair it.

laglangyue · Sep 16 '22 02:09

Thanks, hoping for good news!

liuyongfei1 · Sep 16 '22 03:09

Did you test with the Flink engine?

laglangyue · Sep 18 '22 08:09

I have found the code that causes the bug, but the code is inconsistent with the description in the documentation. I need to ask a committer.

laglangyue · Sep 18 '22 09:09

> Did you test with the Flink engine?

The Flink engine is OK.

liuyongfei1 · Sep 18 '22 09:09

> I have found the code that causes the bug, but the code is inconsistent with the description in the documentation. I need to ask a committer.

Thanks.

liuyongfei1 · Sep 18 '22 09:09

The bug is in org.apache.spark.sql.execution.datasources.jdbc2.DefaultSource: it is triggered when the table names in dbTable and customUpdateStmt are inconsistent.

laglangyue · Sep 18 '22 10:09

[screenshot of the relevant code]

laglangyue · Sep 18 '22 10:09
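To make the diagnosed failure mode concrete, here is a minimal plain-JDBC sketch in Scala (the helper and flow are hypothetical illustrations, not the actual DefaultSource source): with saveMode = "update", the save path still creates the table named by dbTable when it does not exist, while every row is written through customUpdateStmt, so a name mismatch leaves the dbTable table empty.

import java.sql.{Connection, DriverManager}

object SpuriousTableSketch {
  // Hypothetical create-if-absent step: this is what leaves the empty table
  // behind when customUpdateStmt targets a different table than dbTable.
  def ensureTableExists(conn: Connection, dbTable: String, ddlBody: String): Unit = {
    val rs = conn.getMetaData.getTables(null, null, dbTable, Array("TABLE"))
    val exists = rs.next()
    rs.close()
    if (!exists) {
      val stmt = conn.createStatement()
      try stmt.executeUpdate(s"CREATE TABLE $dbTable ($ddlBody)") // empty shell
      finally stmt.close()
    }
  }

  def main(args: Array[String]): Unit = {
    val conn = DriverManager.getConnection("jdbc:mysql://host:3306/davinci", "user", "password")
    try {
      // Step 1: the table named by dbTable ("sea1_data") is created empty.
      ensureTableExists(conn, "sea1_data", "id INT, name VARCHAR(255)")
      // Step 2: all rows go through customUpdateStmt, which targets "sea3",
      // so "sea1_data" never receives a single row.
      val ps = conn.prepareStatement("insert into sea3(id,name) values(?,?)")
      ps.setInt(1, 1)
      ps.setString(2, "example")
      ps.executeUpdate()
      ps.close()
    } finally conn.close()
  }
}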

You just need to keep the table names consistent. dbTable is not the source table name; it is the sink table name, i.e. the table of the Sink-Datasource. In your config the two are inconsistent:

dbTable = "sea1_data"
customUpdateStmt = "insert into sea3(id,name) values(?,?)"

laglangyue · Sep 18 '22 10:09
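For reference, a sink block with consistent table names might look like this (assuming sea3 is the intended destination table; this is the reporter's config with only dbTable changed):

sink {
  jdbc {
    saveMode = "update"
    driver = "com.mysql.cj.jdbc.Driver"
    url = "jdbc:mysql://10.*.*.*:3306/davinci?useUnicode=true&characterEncoding=utf8&useSSL=false"
    user = "*"
    password = "*"
    dbTable = "sea3"
    customUpdateStmt = "insert into sea3(id,name) values(?,?)"
  }
}

With dbTable and customUpdateStmt pointing at the same table, no extra empty table should be created.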

OK, thanks. I'll try it.

liuyongfei1 · Sep 20 '22 01:09

This issue has been automatically marked as stale because it has not had recent activity for 30 days. It will be closed in the next 7 days if no further activity occurs.

github-actions[bot] · Oct 21 '22 00:10

This issue has been closed because it has not received a response for too long. You could reopen it if you encounter similar problems in the future.

github-actions[bot] · Nov 25 '22 00:11