[Bug] [File-Connector] There is a problem with the path written to the file connector text and orc files.
Search before asking
- [X] I had searched in the issues and found no similar issues.
What happened
I tried to read from Kafka and write to Hive. The data was written to HDFS normally, but it could not be read back, and I found that the file path on HDFS was wrong. This is the problematic path: [hdfs://nameservice1/user/hive/warehouse/test.db/university\ T_858684107524145153_678ce70c71_0_1_0.txt]. An unexpected separator appears before the file name; the actual path should be [hdfs://nameservice1/user/hive/warehouse/test.db/university/T_858684107524145153_678ce70c71_0_1_0.txt]
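Since the job was submitted from a Windows machine (the config below references `D:/...` paths), one plausible cause, stated here as an assumption rather than a confirmed diagnosis, is that the connector joins the target directory and the file name with the OS-dependent `File.separator` instead of the `/` that HDFS URIs require. A minimal sketch of the pitfall and a safe join (`joinHdfs` is a hypothetical helper, not actual SeaTunnel code):

```java
public class HdfsPathJoin {

    // Pitfall: File.separator is "\" on Windows, which corrupts HDFS URIs.
    static String joinOsDependent(String dir, String file) {
        return dir + java.io.File.separator + file;
    }

    // HDFS paths always use "/", regardless of the client OS.
    static String joinHdfs(String dir, String file) {
        return dir.endsWith("/") ? dir + file : dir + "/" + file;
    }

    public static void main(String[] args) {
        String dir = "hdfs://nameservice1/user/hive/warehouse/test.db/university";
        String file = "T_858684107524145153_678ce70c71_0_1_0.txt";
        System.out.println(joinHdfs(dir, file));
        // -> hdfs://nameservice1/user/hive/warehouse/test.db/university/T_858684107524145153_678ce70c71_0_1_0.txt
    }
}
```

Using Hadoop's `org.apache.hadoop.fs.Path` for all HDFS path construction would avoid this class of bug entirely.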
SeaTunnel Version
dev
SeaTunnel Config
env {
  job.mode = "batch"
  parallelism = "1"
  job.retry.times = "0"
  job.name = "aace8bb9f8864562b0264ea75e3991f5"
  checkpoint.interval = "30000"
}

source {
  Kafka {
    schema = {
      fields {
        "university": "string"
        #"AppearTime": "timestamp",
        #"Calling": "int",
        #"DeviceID": "string",
        #"Direction": "string",
        #"DisappearTime": "timestamp",
      }
    }
    format = "json"
    bootstrap.servers = "10.28.xxxx:9092"
    format_error_handle_way = "skip"
    topic = "student2"
    consumer.group = "1111"
    semantics = EXACTLY_ONCE
    start_mode = "earliest"
    result_table_name = "hive1"
  }
}

sink {
  # write the data to Hive
  Hive {
    source_table_name = "hive1"
    table_name = "test.university2"
    metastore_uri = "thrift://xxxx:9083"
    hdfs_site_path = "D:/安装包/kerberos/hive认证/hdfs-site.xml"
    hive_site_path = "D:/安装包/kerberos/hive认证/hive-site.xml"
    kerberos_principal = "hive/[email protected]"
    krb5_path = "D:/安装包/kerberos/hive认证/krb5.conf"
    kerberos_keytab_path = "D:/安装包/kerberos/hive认证/hive.service.keytab"
  }
}
Running Command
-e local
Error Exception
None
Zeta or Flink or Spark Version
dev
Java or Scala Version
1.8
Screenshots
No response
Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
Because of this malformed path, Hive reads the table as empty.