
tiup deployment fails on Rocky 9.2

jebter opened this issue on Oct 20 '23 · 2 comments

Bug Report

Please answer these questions before submitting your issue. Thanks!

  1. What did you do?

tiup cluster deploy tidb-750 v7.5.0 ~/topology.yaml --user root -p

tiup is checking updates for component cluster ...
Starting component cluster: /home/tidb/.tiup/components/cluster/v1.12.1/tiup-cluster deploy tidb-750 v7.5.0 /home/tidb/topology.yaml --user root -p
Input SSH password:

  • Detect CPU Arch Name

    • Detecting node 172.16.6.50 Arch info ... Done
  • Detect CPU OS Name

    • Detecting node 172.16.6.50 OS info ... Done

Please confirm your topology:
Cluster type:    tidb
Cluster name:    tidb-750
Cluster version: v7.5.0
Role          Host         Ports                            OS/Arch       Directories
----          ----         -----                            -------       -----------
pd            172.16.6.50  2379/2380                        linux/x86_64  /data1/tidb-deploy/pd-2379,/data1/tidb-data/pd-2379
tikv          172.16.6.50  20160/20180                      linux/x86_64  /data1/tidb-deploy/tikv-20160,/data2/tikv-20160
tidb          172.16.6.50  4000/10080                       linux/x86_64  /data1/tidb-deploy/tidb-4000
tiflash       172.16.6.50  9000/8123/3930/20170/20292/8234  linux/x86_64  /data1/tidb-deploy/tiflash-9000,/data1/tiflash-9000
prometheus    172.16.6.50  9090/12020                       linux/x86_64  /data1/tidb-deploy/prometheus-9090,/data1/tidb-data/prometheus-9090
grafana       172.16.6.50  3000                             linux/x86_64  /data1/tidb-deploy/grafana-3000
alertmanager  172.16.6.50  9093/9094                        linux/x86_64  /data1/tidb-deploy/alertmanager-9093,/data1/tidb-data/alertmanager-9093
Attention:
    1. If the topology is not what you expected, check your yaml file.
    2. Please confirm there is no port/directory conflicts in same host.
Do you want to continue? [y/N]: (default=N) y
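For reference, a minimal single-host topology.yaml that would produce the layout confirmed above might look like the sketch below. The host and the non-default tikv/tiflash data directories are taken from the confirmation output; every other field is an assumption, since the reporter's actual file is not attached.

```yaml
# Illustrative reconstruction only -- the reporter's topology.yaml was not attached.
global:
  user: "tidb"                       # assumed
  deploy_dir: "/data1/tidb-deploy"   # matches the deploy directories shown above
  data_dir: "/data1/tidb-data"       # matches the pd/prometheus/alertmanager data dirs

pd_servers:
  - host: 172.16.6.50

tidb_servers:
  - host: 172.16.6.50

tikv_servers:
  - host: 172.16.6.50
    data_dir: "/data2/tikv-20160"    # assumed override, per the /data2/tikv-20160 entry above

tiflash_servers:
  - host: 172.16.6.50
    data_dir: "/data1/tiflash-9000"  # assumed override, per the /data1/tiflash-9000 entry above

monitoring_servers:
  - host: 172.16.6.50

grafana_servers:
  - host: 172.16.6.50

alertmanager_servers:
  - host: 172.16.6.50
```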

  • Generate SSH keys ... Done
  • Download TiDB components
    • Download pd:v7.5.0 (linux/amd64) ... Done
    • Download tikv:v7.5.0 (linux/amd64) ... Done
    • Download tidb:v7.5.0 (linux/amd64) ... Done
    • Download tiflash:v7.5.0 (linux/amd64) ... Done
    • Download prometheus:v7.5.0 (linux/amd64) ... Done
    • Download grafana:v7.5.0 (linux/amd64) ... Done
    • Download alertmanager: (linux/amd64) ... Done
    • Download node_exporter: (linux/amd64) ... Done
    • Download blackbox_exporter: (linux/amd64) ... Done
  • Initialize target host environments
    • Prepare 172.16.6.50:22 ... Done
  • Deploy TiDB instance
    • Copy pd -> 172.16.6.50 ... Done
    • Copy tikv -> 172.16.6.50 ... Done
    • Copy tidb -> 172.16.6.50 ... Done
    • Copy tiflash -> 172.16.6.50 ... Done
    • Copy prometheus -> 172.16.6.50 ... Done
    • Copy grafana -> 172.16.6.50 ... Done
    • Copy alertmanager -> 172.16.6.50 ... Done
    • Deploy node_exporter -> 172.16.6.50 ... Done
    • Deploy blackbox_exporter -> 172.16.6.50 ... Done
  • Copy certificate to remote host
  • Init instance configs
    • Generate config pd -> 172.16.6.50:2379 ... Done
    • Generate config tikv -> 172.16.6.50:20160 ... Done
    • Generate config tidb -> 172.16.6.50:4000 ... Done
    • Generate config tiflash -> 172.16.6.50:9000 ... Done
    • Generate config prometheus -> 172.16.6.50:9090 ... Done
    • Generate config grafana -> 172.16.6.50:3000 ... Done
    • Generate config alertmanager -> 172.16.6.50:9093 ... Done
  • Init monitor configs
    • Generate config node_exporter -> 172.16.6.50 ... Done
    • Generate config blackbox_exporter -> 172.16.6.50 ... Done

Enabling component pd
    Enabling instance 172.16.6.50:2379
Failed to enable unit: Unit file pd-2379.service does not exist.

Error: failed to enable/disable pd: failed to enable: 172.16.6.50 pd-2379.service, please check the instance's log(/data1/tidb-deploy/pd-2379/log) for more detail.: executor.ssh.execute_failed: Failed to execute command over SSH for 'root@172.16.6.50:22' {ssh_stderr: Failed to enable unit: Unit file pd-2379.service does not exist. , ssh_stdout: , ssh_command: export LANG=C; PATH=$PATH:/bin:/sbin:/usr/bin:/usr/sbin /usr/bin/sudo -H bash -c "systemctl daemon-reload && systemctl enable pd-2379.service"}, cause: Process exited with status 1

Verbose debug logs has been written to /home/tidb/.tiup/logs/tiup-cluster-debug-2023-10-20-16-27-24.log.
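The failing step is the `systemctl enable` that TiUP runs over SSH; the exact command is visible in the ssh_command field of the error above. A minimal check on the target host, assuming TiUP writes the unit file to the standard /etc/systemd/system/ path and using the deploy directory from the error message, would be something like:

```bash
# Run on 172.16.6.50 with a sudo-capable user.
# The unit path and deploy directory below are assumptions based on the error output.

# Was the systemd unit file written at all?
ls -l /etc/systemd/system/pd-2379.service

# Did the deploy step at least lay down the PD deploy directory?
ls -l /data1/tidb-deploy/pd-2379/

# Re-run the exact command TiUP executed over SSH:
sudo bash -c "systemctl daemon-reload && systemctl enable pd-2379.service"
```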

  2. What did you expect to see?

  3. What did you see instead?

  4. What version of TiUP are you using (tiup --version)?

1.13.1 tiup
Go Version: go1.21.1
Git Ref: v1.13.1
GitHash: 3653dc521afbd0da505cdbe4bcbc92c39fe66b74

jebter · Oct 20 '23 08:10

playground is ok

jebter · Oct 20 '23 08:10

tiup playground v7.5.0

tiup is checking updates for component playground ...
Starting component playground: /home/tidb/.tiup/components/playground/v1.12.1/tiup-playground v7.5.0
Start pd instance:v7.5.0
Start tikv instance:v7.5.0
Start tidb instance:v7.5.0
Waiting for tidb instances ready
127.0.0.1:4000 ... Done
Start tiflash instance:v7.5.0
Waiting for tiflash instances ready
127.0.0.1:3930 ... Done

🎉 TiDB Cluster is started, enjoy!

Connect TiDB:   mysql --host 127.0.0.1 --port 4000 -u root
TiDB Dashboard: http://127.0.0.1:2379/dashboard
Grafana:        http://127.0.0.1:3000

jebter · Oct 20 '23 08:10
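Side note: tiup playground starts the components as local child processes of the playground itself, so it never touches systemd and does not exercise the systemctl enable path that fails in the cluster deploy above. A quick way to confirm the playground instance is serving queries, assuming a local mysql client is installed, is:

```bash
mysql --host 127.0.0.1 --port 4000 -u root -e "SELECT VERSION();"
```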