[Bug] [Connector-V2] Hive Sink Connector cannot write data into Hive table
Search before asking
- [X] I have searched the issues and found no similar issues.
What happened
The Hive sink connector cannot write data into a Hive table: the job finishes without any exception, but the written data is not visible in the target table.
SeaTunnel Version
dev
SeaTunnel Config
env {
  # You can set Flink configuration here
  execution.parallelism = 3
  job.name = "test_hive_source_to_hive"
}

source {
  Hive {
    table_name = "test_hive.test_hive_source_orc"
    metastore_uri = "thrift://ctyun7:9083"
    result_table_name = "tmp_table"
  }
}

transform {
  sql {
    sql = "select test_tinyint, test_smallint, test_int, test_bigint, test_boolean, test_float, test_double, test_string, test_binary, test_timestamp, test_decimal, test_char, test_varchar, test_date, 'p1' as test_par1, 'p2' as test_par2 from tmp_table"
  }
}

sink {
  # Write the transformed rows into the target Hive table
  Hive {
    table_name = "test_hive.test_hive_sink_text_simple"
    metastore_uri = "thrift://ctyun7:9083"
    partition_by = ["test_par1", "test_par2"]
    sink_columns = ["test_tinyint", "test_smallint", "test_int", "test_bigint", "test_boolean", "test_float", "test_double", "test_string", "test_binary", "test_timestamp", "test_decimal", "test_char", "test_varchar", "test_date", "test_par1", "test_par2"]
  }
}
Running Command
sh bin/start-seatunnel-flink-connector-v2.sh --config config/flink_hiveorc_to_hivetext_simple.conf
Error Exception
No Exception
Flink or Spark Version
SeaTunnel version: dev
Hadoop version: 2.10.2
Flink version: 1.12.7
Spark version: 2.4.3 (Scala 2.11.12)
Java or Scala Version
JDK 1.8
Screenshots
No response
Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
I found that the Hive sink connector does not run MSCK REPAIR TABLE after the files are committed, so the newly written partition directories are never registered in the metastore. Another way is to use the Hive metastore API to add the partitions directly.
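For illustration only, a minimal Java sketch (not the connector's actual code) of the two approaches above, using the metastore_uri, database, table, and partition values from the config in this issue. The HiveServer2 JDBC URL (ctyun7:10000) in the MSCK variant is an assumed placeholder, not something stated in the report:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.Arrays;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;

public class RegisterSinkPartitions {

    // Option A: register one partition directly in the metastore -- what the
    // sink itself could do right after committing files, since it already
    // knows the metastore URI and the partition values it just wrote.
    // appendPartition creates the partition at the table's default location
    // and throws AlreadyExistsException if it is already registered.
    static void addPartitionViaMetastore() throws Exception {
        HiveConf conf = new HiveConf();
        conf.set("hive.metastore.uris", "thrift://ctyun7:9083");
        HiveMetaStoreClient client = new HiveMetaStoreClient(conf);
        try {
            // Partition values in partition-key order: test_par1=p1, test_par2=p2.
            client.appendPartition("test_hive", "test_hive_sink_text_simple",
                    Arrays.asList("p1", "p2"));
        } finally {
            client.close();
        }
    }

    // Option B: have Hive rescan the table directory and register every
    // partition it finds on disk. The HiveServer2 host/port is a placeholder.
    static void msckRepairViaJdbc() throws Exception {
        try (Connection conn =
                     DriverManager.getConnection("jdbc:hive2://ctyun7:10000/test_hive");
             Statement stmt = conn.createStatement()) {
            stmt.execute("MSCK REPAIR TABLE test_hive_sink_text_simple");
        }
    }
}

Option A avoids a full directory scan and matches the "use the Hive metastore to add partitions" suggestion; Option B is the coarser MSCK REPAIR TABLE route.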
Assigned to me
This issue has been automatically marked as stale because it has not had recent activity for 30 days. It will be closed in the next 7 days if no further activity occurs.
This issue has been closed because it has not received a response for a long time. You can reopen it if you encounter similar problems in the future.