
A data integration framework

Results: 293 chunjun issues

## Development Task

1. chunjun 1.12 has fixed the inability to specify a Hive partition
2. chunjun 1.12 has extended Hive reading
3. chunjun 1.12 has extended RabbitMQ support

enhancement

Feature request: Add prefix in redis sink key to adapt the business

feature-request

We can configure the Redis key with a descriptive value that identifies our business by setting a field called "keyPrefix" in the Redis config. [755](https://github.com/DTStack/chunjun/issues/755)
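A minimal sketch of what such a prefix could look like when applied to sink keys. The field name `keyPrefix` comes from the issue; the class and method names here are illustrative assumptions, not ChunJun's actual Redis connector API.

```java
// Hypothetical sketch: prepend a configurable "keyPrefix" (the field proposed
// in the issue) to each record key before it is written to a Redis sink.
public class RedisKeyPrefixer {
    private final String keyPrefix;

    public RedisKeyPrefixer(String keyPrefix) {
        // A null or empty prefix leaves keys unchanged.
        this.keyPrefix = keyPrefix == null ? "" : keyPrefix;
    }

    /** Build the final Redis key: "<prefix>:<rawKey>", or the raw key alone. */
    public String buildKey(String rawKey) {
        return keyPrefix.isEmpty() ? rawKey : keyPrefix + ":" + rawKey;
    }

    public static void main(String[] args) {
        RedisKeyPrefixer prefixer = new RedisKeyPrefixer("orders");
        System.out.println(prefixer.buildKey("1001")); // orders:1001
    }
}
```

A separator such as `:` follows common Redis key-naming practice, so keys from different business domains stay distinguishable.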

Script: `java -cp /opt/chunjun/lib/* com.dtstack.chunjun.client.Launcher -mode yarn-per-job -jobType sync -job /opt/chunjun/chunjun-examples/json/stream/stream.json -chunjunDistDir chunjun-dist -flinkConfDir /opt/chunjun/flinkconf -hadoopConfDir /opt/chunjun/hadoopconf -flinkLibDir /opt/chunjun/flinklib -confProp "{\"flink.checkpoint.interval\":60000,\"yarn.application.queue\":\"default\"}"` Error summary: File file:/home/flink/.flink/application_1653375620246_0033/chunjun-connector-stream-master.jar does not exist Error log: Application application_1653375620246_0033...

**Is your feature request related to a problem? Please describe.** A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] **Describe the solution you'd...

feature-request

When reading FTP data, `FtpSourceFactory#createSource` calls `ftpConfig.setColumn` with `FieldConf` entries whose index is null, so the corresponding branch in `FtpInputFormat#nextRecordInternal` never executes. The fix is to change the code shown in figure 1 to `ftpConfig.setColumn(syncConf.getReader().getFieldList());`

bug

A MySQL column of type int is converted to `BigDecimalColumn` in the JDBC stage, but the HBase write stage has no corresponding type, so an exception is thrown.
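The type gap can be illustrated with plain `java.math.BigDecimal`: a sink that only understands primitive int must convert the JDBC-side decimal back, or fail. The method name is illustrative, not ChunJun's converter API.

```java
import java.math.BigDecimal;

// Sketch of the conversion a sink would need: a MySQL INT arrives wrapped in
// a BigDecimal-backed column, and writing it to HBase as an int requires an
// exact narrowing conversion.
public class IntColumnConversion {
    /** Convert a BigDecimal carrying an INT value to int, rejecting lossy values. */
    static int toInt(BigDecimal value) {
        // intValueExact throws ArithmeticException on fractional values or
        // int overflow, mirroring the sink-side failure from the issue.
        return value.intValueExact();
    }

    public static void main(String[] args) {
        System.out.println(toInt(new BigDecimal("42"))); // 42
        try {
            toInt(new BigDecimal("3.5"));
        } catch (ArithmeticException e) {
            System.out.println("lossy value rejected");
        }
    }
}
```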

bug

In `com.dtstack.chunjun.connector.jdbc.source.JdbcInputFormatBuilder#checkFormat`: `if (conf.isIncrement()) { if (StringUtils.isBlank(conf.getIncreColumn())) { sb.append("increColumn can't be empty when increment is true;\n"); } conf.setSplitPk(conf.getIncreColumn()); if (conf.getParallelism() > 1) { conf.setSplitStrategy("mod"); } }` The call `conf.setSplitPk(conf.getIncreColumn());` (highlighted in the original report) unconditionally overwrites splitPk with the increment column.
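One plausible fix, sketched below under the assumption that the bug is the unconditional overwrite, is to assign splitPk only when the user has not set it. `Conf` is a stand-in for ChunJun's `JdbcConf`, and this is not a confirmed patch.

```java
// Sketch of a guarded checkFormat: keep a user-supplied splitPk instead of
// unconditionally replacing it with increColumn. Conf is a simplified
// stand-in for ChunJun's JdbcConf.
public class CheckFormatSketch {
    static class Conf {
        boolean increment;
        String increColumn;
        String splitPk;
        int parallelism = 1;
        String splitStrategy;
    }

    static void checkFormat(Conf conf, StringBuilder errors) {
        if (conf.increment) {
            if (conf.increColumn == null || conf.increColumn.trim().isEmpty()) {
                errors.append("increColumn can't be empty when increment is true;\n");
            }
            // Guarded assignment instead of the unconditional setSplitPk call.
            if (conf.splitPk == null || conf.splitPk.trim().isEmpty()) {
                conf.splitPk = conf.increColumn;
            }
            if (conf.parallelism > 1) {
                conf.splitStrategy = "mod";
            }
        }
    }

    public static void main(String[] args) {
        Conf conf = new Conf();
        conf.increment = true;
        conf.increColumn = "update_time";
        conf.splitPk = "id"; // user-supplied splitPk survives the check
        checkFormat(conf, new StringBuilder());
        System.out.println(conf.splitPk); // id
    }
}
```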

bug

To prevent duplicate job submission: if the submitted task passes in a jobId that is 32 characters long, use it as the jobId of the running Flink job; if no jobId is passed in, use the jobId generated by Flink itself. Related issue: 758

See this related Taier issue: https://github.com/DTStack/Taier/issues/463
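The selection rule above can be sketched as a small helper. The 32-character check matches the length of a Flink `JobID` rendered as hex; the generator below is a simple stand-in, not Flink's actual `JobID` class.

```java
import java.util.UUID;

// Sketch of the jobId selection rule: reuse a caller-supplied 32-character id
// so resubmissions map to the same Flink job, otherwise fall back to a
// freshly generated id (as Flink would do on its own).
public class JobIdSelector {
    /** Use the supplied jobId only when it is exactly 32 characters long. */
    static String selectJobId(String suppliedJobId) {
        if (suppliedJobId != null && suppliedJobId.length() == 32) {
            return suppliedJobId;
        }
        // Stand-in generator: a UUID without dashes is 32 hex characters.
        return UUID.randomUUID().toString().replace("-", "");
    }

    public static void main(String[] args) {
        String fixed = "0123456789abcdef0123456789abcdef";
        System.out.println(selectJobId(fixed).equals(fixed)); // true
        System.out.println(selectJobId(null).length());       // 32
    }
}
```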