chunjun
FlinkSQL job consumes Kafka data, but the files written to Hive are empty
Search before asking
- [X] I had searched in the issues and found no similar issues.
What happened
Problem:
In the FlinkSQL job, the kafka-x source consumes data from the topic and the hive-x sink receives the records, but the files written to HDFS stay empty: even after forcing a flush, the .data files remain empty.
Problem code:
SQL executed:

```sql
CREATE TABLE source (
    id int,
    device_id string
) WITH (
    'connector' = 'kafka-x',
    'topic' = 'test',
    'properties.bootstrap.servers' = '172.18.8.203:9092',
    'scan.startup.mode' = 'earliest-offset',
    'format' = 'json'
);

CREATE TABLE test_hive (
    id int,
    device_id string
) WITH (
    'connector' = 'hive-x',
    'properties.hadoop.user.name' = 'hadoop',
    'default-fs' = 'hdfs://hadoop02:8020',
    'file-type' = 'text',
    'url' = 'jdbc:hive2://172.18.8.208:10000/ljgk_dw',
    'username' = 'root',
    'password' = '123456',
    'encoding' = 'utf-8',
    'field-delimiter' = '\t',
    'partition' = 'pt',
    'partition-type' = 'DAY',
    'sink.parallelism' = '1',
    'write-mode' = 'overwrite',
    'table-name' = 'test_hive01'
);

INSERT INTO test_hive SELECT * FROM source;
```
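One thing that may be worth checking (an assumption on my part, not something confirmed in this report): Flink streaming file-based sinks generally only roll and commit in-progress files on checkpoint, so if checkpointing is not enabled the data files can stay empty indefinitely. A minimal sketch of enabling it from the SQL client before running the INSERT:

```sql
-- Hypothetical diagnostic step, assuming the job runs without checkpointing:
-- enable periodic checkpoints so the sink can commit its in-progress files.
SET 'execution.checkpointing.interval' = '60s';
```

If the .data files start receiving records after a checkpoint completes, the missing checkpoint configuration is likely the cause rather than the hive-x connector itself.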
What you expected to happen
The records consumed from Kafka should be written into the .data files of the Hive table on HDFS.
How to reproduce
Run the SQL above against a Kafka topic that contains data; the .data files written to HDFS remain empty.
Anything else
No response
Version
1.12_release
Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
Code of Conduct
- [X] I agree to follow this project's Code of Conduct