
Exchangis configured mysql > hive: is there support for writing data to Hive?

Alecor-sudo opened this issue 4 years ago • 2 comments


  • 1: I have configured the MySQL data source and the Hive data source on Exchangis.
  • 2: When the job is executed, the data results are written to HDFS as ".gz"-compressed files.

Question: Is there support for writing data directly into Hive, or an option to do so?

For example: $hive> LOAD DATA LOCAL INPATH '/home/hadoop/tmp/user/hive/warehouse/hivedata.db/pub_info/exchangis_hive_w__03458f02_e1a8_499f_b4c3_301ba473b1d3.gz' OVERWRITE INTO TABLE pub_info;
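
A side note on the LOAD DATA example above, in case anyone needs the manual workaround while a better fix is found: LOAD DATA LOCAL INPATH reads from the local filesystem of the machine running the Hive client, whereas LOAD DATA INPATH (without LOCAL) takes a file that is already on HDFS. Since the job writes its ".gz" output to HDFS, a sketch of the manual load could look like this (path, database, and table name reused from the example above; adjust to your environment):

  -- Switch to the database that owns the target table.
  USE hivedata;
  -- Move the gzip-compressed export from HDFS into the existing table; Hive reads .gz
  -- text files transparently for TextFile tables, so no manual decompression is needed.
  LOAD DATA INPATH '/home/hadoop/tmp/user/hive/warehouse/hivedata.db/pub_info/exchangis_hive_w__03458f02_e1a8_499f_b4c3_301ba473b1d3.gz'
  OVERWRITE INTO TABLE pub_info;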


Alecor-sudo · Jun 01 '20 13:06

It seems that your configuration is wrong. Check whether your Hive table's storage location is "/user/hive/warehouse/hivedata.db/pub_info/". Then check whether the HDFS path in your Hive data source model configuration is "hdfs://{authority}/home/hadoop/tmp"; if it is, replace it with "hdfs://{authority}".
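
For anyone hitting the same symptom, a quick way to confirm the table's actual storage location before adjusting the data source path is to inspect its metadata in Hive (a sketch, reusing the database and table names from the example above):

  -- The "Location:" line in the output shows the HDFS directory the table actually uses.
  DESCRIBE FORMATTED hivedata.pub_info;
  -- Per the advice above, the Hive data source model in Exchangis should then point at the
  -- filesystem root, e.g. hdfs://{authority}, not a subdirectory like hdfs://{authority}/home/hadoop/tmp.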

Davidhua1996 · Jun 01 '20 14:06

Thank you. This error was due to my own configuration. It works normally after the modification.

😀

Alecor-sudo · Jun 02 '20 02:06

This problem has been solved! The latest version is Exchangis 1.1.1. You may want to check it out.

jefftlin · Sep 29 '22 06:09