DataSphereStudio
DataSphereStudio is a one-stop data application development & management portal, covering scenarios including data exchange, desensitization/cleansing, analysis/mining, quality measurement, visualizati...
1. Installing linkis jobtypes. Following the official documentation's automated install, the last step of `sh install.sh` fails with `{"error":"Missing required parameter 'execid'."}`. We never saw the success output the documentation promises ("if the installation succeeds, it will finally print: `{"status":"success"}`"), yet the installed linkis job plugin was visible under Azkaban's `/plugins/jobtypes` directory. Investigation showed that the last step of the install script calls `curl http://azkaban_ip:executor_port/executor?action=reloadJobTypePlugins` to reload the plugins. After restarting the Azkaban executor, its log showed the plugin had been loaded: `INFO [JobTypeManager][Azkaban] Loaded jobtype linkis com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.AzkabanDssJobType`. We could not pin down the cause at the time and skipped it; after successfully publishing a linkis job to Azkaban and revisiting the question, we concluded the error is most likely a false alarm. 2. Publishing a project from DSS to Azkaban. Symptom: the log reports that the current user does not exist in Azkaban. Investigation: the supposedly missing user could in fact access Azkaban normally, and the exception stack was swallowed, leaving few useful logs. Remote debugging showed that `AzkabanSecurityService#getSession` failed immediately on `httpClient.execute(httpPost, context)`. Our Azkaban has HTTPS enabled, but the login call used here does not support HTTPS; as a temporary workaround we disabled HTTPS on Azkaban. 3. Follow-up to issue 2. After the fix above, publishing still failed, but the response from `response = httpClient.execute(httpPost, context);` had changed to "Incorrect Login". It turned out the Azkaban login request sent the password under the parameter name `userpwd` instead of `password`; after correcting it and repackaging, publishing succeeded. 4....
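Issue 3 above comes down to a form field name: Azkaban's AJAX API authenticates via a POST with `action=login`, `username`, and `password` parameters. A minimal sketch of building that request (the host and credentials are placeholders, and the helper name is hypothetical):

```python
from urllib.parse import urlencode

def build_azkaban_login(base_url, username, password):
    """Build the body of Azkaban's AJAX login request.

    The bug described in issue 3 was sending the password as "userpwd";
    Azkaban's login endpoint expects the parameter name "password".
    """
    params = {"action": "login", "username": username, "password": password}
    return base_url, urlencode(params)

url, body = build_azkaban_login("http://azkaban_ip:8081", "alice", "secret")
```

On success Azkaban returns a JSON body containing a `session.id`, which subsequent calls pass along; on failure it returns an `error` field, which is where the "Incorrect Login" message came from.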
**Is your feature request related to a problem? Please describe.** DSS does not manage the user's development -> test -> production release process; it is recommended to add control over this process. **Describe the...
Symptom: a single workflow node in a project runs fine on its own, but saving the workflow reports: `NoClassDefFoundError: Could not initialize class dispatch.Http$`. Cause: `netty-3.6.2.Final.jar` is missing from the linkis-publish microservice's upgrade package. Fix: upload the jar and restart the linkis-publish microservice.
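For this class of `NoClassDefFoundError`, a quick sanity check is to confirm whether the jar actually exists under the service's lib directory before restarting. A small sketch (the directory path in the example is an assumption for illustration):

```python
from pathlib import Path

def find_jars(lib_dir, keyword):
    """Return the names of jars under lib_dir whose filename contains keyword."""
    return sorted(p.name for p in Path(lib_dir).glob("*.jar") if keyword in p.name)

# Example: look for the netty jar in a (hypothetical) linkis-publish lib directory.
# find_jars("/appcom/linkis/linkis-publish/lib", "netty")
```

An empty result for `"netty"` would point at the missing-jar cause described above.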
After the node selected by a sendemail item is deleted, publishing the project reports an error: `NoSuchElementException: No value present`.
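`No value present` is the message Java's `Optional.get()` throws when called on an empty `Optional` — here, presumably because the sendemail item still references the deleted node's id at publish time. The defensive pattern (validate references and fail with a descriptive error before publishing) can be sketched as follows; all names and structures are hypothetical simplifications:

```python
def check_email_targets(nodes, email_node):
    """Raise a descriptive error if an email node references a missing node.

    nodes: dict mapping node_id -> node definition.
    email_node: dict with a 'targets' list of node ids it sends content from.
    """
    missing = [t for t in email_node.get("targets", []) if t not in nodes]
    if missing:
        raise ValueError(f"sendemail node references deleted node(s): {missing}")

nodes = {"n1": {}, "n2": {}}
check_email_targets(nodes, {"targets": ["n1"]})  # passes silently
```

A check like this turns an opaque `NoSuchElementException` into an error message that names the stale reference.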
Creating a workflow in DSS reports: `com.webank.wedatasphere.linkis.bml.common.HdfsResourceHelper 74 upload - hadoop write to hdfs:///tmp/linkis/hadoop/20191212/09282afb-bb3c-4b9b-9cb8-c78ece3a2428.json failed, reason is, IOException: java.io.IOException: You have not permission to access path /tmp/linkis/hadoop/20191212/09282afb-bb3c-4b9b-9cb8-c78ece3a2428.json`. My permission on the directory is drwxrwxrwx -...
Checking the linkis-metadata component's log, the error is: `ERROR (Druid-ConnectionPool-Create) ERROR DruidDataSource - create connection error java.sql.SQLSyntaxErrorException: Syntax error: Encountered "" at line 1, column 8.`
```
2019-12-12 23:04:41.706 ERROR [Engine-Scheduler-ThreadPool-2] com.webank.wedatasphere.appjoint.QualitisAppJoint 106 submit - Error! Can not submit job
java.lang.NullPointerException: null
    at com.webank.wedatasphere.appjoint.QualitisNodeExecution.submit(QualitisNodeExecution.java:67) [dss-qualitis-appjoint-0.6.0.jar:?]
    at com.webank.wedatasphere.dss.appjoint.execution.core.LongTermNodeExecution.execute(LongTermNodeExecution.java:65) [dss-appjoint-core-0.6.0.jar:?]
    at com.webank.wedatasphere.dss.linkis.appjoint.entrance.execute.AppJointEntranceEngine.execute(AppJointEntranceEngine.scala:169) [linkis-appjoint-entrance-0.6.0.jar:?]
    at com.webank.wedatasphere.dss.linkis.appjoint.entrance.job.AppJointEntranceJob$$anonfun$1.apply(AppJointEntranceJob.scala:75) [linkis-appjoint-entrance-0.6.0.jar:?]
    at com.webank.wedatasphere.dss.linkis.appjoint.entrance.job.AppJointEntranceJob$$anonfun$1.apply(AppJointEntranceJob.scala:75)...
```
**Symptom:** after logging in, DSS cannot connect to the database. **Attempted fix:** the Hive meta information was added to `${LINKIS_HOME}/linkis-metadata/conf/linkis.properties` as follows:
##datasource
wds.linkis.server.login.use.default=false
hive.meta.url=jdbc:mysql://127.0.0.1:3306/hive?characterEncoding=UTF-8
hive.meta.user=xxxx
hive.meta.password=xxxx
**Log output** (`${LINKIS_HOME}/linkis-metadata/logs/linkis-metadata.out`):
ERROR (main) ERROR DruidDataSource - mysql should not use 'PoolPreparedStatements'
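The last log line is a check built into Alibaba Druid itself: it raises "mysql should not use 'PoolPreparedStatements'" when prepared-statement pooling is enabled against a MySQL data source, since MySQL gains little from it. The corresponding setter on the pool is `DruidDataSource.setPoolPreparedStatements(false)`; if the pool is driven from a properties file, the hedged fix looks like the fragment below — the exact key name depends on how linkis-metadata wires up Druid, so treat it as an assumption:

```properties
# Hypothetical key name; the underlying Druid setting is poolPreparedStatements.
# Leaving this true against MySQL triggers the error quoted above.
poolPreparedStatements=false
```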
YARN queue problem. The spark-enginemanager log file shows the following error:
```
ERROR [qtp1670196451-332] com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful 72 apply - error code(错误码): 111006, error message(错误信息): Get the Yarn queue information exception.(获取Yarn队列信息异常).
com.webank.wedatasphere.linkis.common.exception.ErrorException: errCode: 111006 ,desc: Get the Yarn queue information exception.(获取Yarn队列信息异常) ,ip:...
```
Creating a new JDBC script file and opening it reports an error, as shown in the attached image.