KubeFATE
KubeFATE example: uploading data to HDFS fails
**What deployment mode are you using?** Kubernetes (minikube v1.20.0)
**What KubeFATE and FATE version are you using?** KubeFATE: v1.4.5, FATE: v1.9.0
**What OS are you using for docker-compose or Kubernetes? Please also state the OS version.**
- OS: Ubuntu
- Version: 22.04
I followed this tutorial: https://github.com/FederatedAI/KubeFATE/blob/master/docs/tutorials/Build_Two_Parties_FATE_Cluster_in_One_Linux_Machine_with_MiniKube.md
In fate-9999, running pipeline_tutorial_upload_jerry.ipynb up to `pipeline_upload.upload(drop=1)` hits an HDFS-connection-related error while uploading the data. FATEBoard job log:
Manually installing pyarrow and an HDFS client in the
k8s_fateflow_python-0_fate-9999_a661fab6-509a-4893-b6c8-cc3f6a6a94b5_0
container, and setting the relevant environment variables, still did not make the upload succeed. The following were modified inside that container:
- ~/.bashrc
- /data/projects/fate/conf/service_conf.yaml
- $HADOOP_HOME/etc/hadoop/core-site.xml
How should I resolve this? Thanks.
Please get into your Hadoop namenode pod and run `hdfs dfsadmin -report`, then check whether the IP address reported for your datanode is the datanode pod's IP.
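To illustrate the check above: in the `hdfs dfsadmin -report` output, each live datanode's address appears on a `Name:` line, which you can compare against the pod IP from `kubectl get pods -o wide`. A minimal sketch (the pod name, namespace, and sample report text below are assumptions, not taken from this cluster):

```python
import re

# Inside the namenode pod (pod name and namespace are assumptions):
#   kubectl exec -it -n fate-9999 namenode-0 -- hdfs dfsadmin -report
# Then compare the reported datanode address with:
#   kubectl get pods -n fate-9999 -o wide

def datanode_ips(report_text):
    """Extract datanode IPs from `hdfs dfsadmin -report` output.

    Each live datanode section starts with a line like
    `Name: 10.244.0.12:9866 (datanode-0...)`.
    """
    return re.findall(r"^Name:\s*(\d{1,3}(?:\.\d{1,3}){3}):\d+",
                      report_text, flags=re.MULTILINE)

# Illustrative excerpt of a report (addresses are made up):
sample_report = """\
Live datanodes (1):

Name: 10.244.0.12:9866 (datanode-0.datanode.fate-9999.svc.cluster.local)
Hostname: datanode-0
Decommission Status : Normal
"""

pod_ip = "10.244.0.12"  # hypothetical pod IP from `kubectl get pods -o wide`
print(datanode_ips(sample_report))            # ['10.244.0.12']
print(pod_ip in datanode_ips(sample_report))  # True
```

If the reported address does not match the datanode pod's IP, FATE Flow's upload to HDFS will fail even though the namenode itself is reachable.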