docker-volume-sshfs
Connection reset by peer
Hi, when I use this plugin it always gives an error; can somebody tell me how to solve it?
dingrui@ubuntu:/var/lib/docker$ docker run -d --name sshfs-container --volume-driver vieux/sshfs --mount src=sshvolume,target=/app,volume-opt=sshcmd=root@ipaddr:/home/dingrui,volume-opt=passwd=***** nginx:latest
WARN[0000] `--volume-driver` is ignored for volumes specified via `--mount`. Use `--mount type=volume,volume-driver=...` instead.
ebea8552688de12a197c042fb1908937bd80b548029d84c45c7e9c910f84a826
docker: Error response from daemon: error while mounting volume '': VolumeDriver.Mount: sshfs command execute failed: exit status 1 (read: Connection reset by peer
).
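The warning on the first line points at the likely first fix: `--volume-driver` is ignored when the volume is specified via `--mount`, so the driver has to be passed inside the `--mount` string itself. A sketch of the equivalent command, keeping the original placeholders (`ipaddr`, the masked password, and the `passwd` option name are taken verbatim from the command above):

```shell
# Sketch: move the driver into --mount instead of the ignored --volume-driver.
# ipaddr and ***** are placeholders from the original command.
docker run -d --name sshfs-container \
  --mount type=volume,volume-driver=vieux/sshfs,src=sshvolume,target=/app,volume-opt=sshcmd=root@ipaddr:/home/dingrui,volume-opt=passwd=***** \
  nginx:latest
```

This only addresses the warning; the "Connection reset by peer" error discussed below has a separate cause.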
I too have the same issue. Any solution to this?
Oh dear, just when I was looking for a solution.
My sshvolume only works when using -o password=xxx; when I use -o IdentityFile=xxx, the error is always Connection reset by peer.
I think the issue is related to #61. Creating the volume with the -o debug flag and then trying to run busybox shows this message:
docker: Error response from daemon: error while mounting volume '': VolumeDriver.Mount: sshfs command execute failed: exit status 1 (FUSE library version: 2.9.7
nullpath_ok: 0
nopath: 0
utime_omit_ok: 0
Bad owner or permissions on /root/.ssh/config
read: Connection reset by peer
).
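To reproduce that debug output, the volume has to be recreated with the plugin's debug option (mentioned above) and then mounted by a throwaway container. A sketch, assuming the same placeholder host path and the IdentityFile option from the earlier comment:

```shell
# Sketch: recreate the volume with -o debug so sshfs prints verbose
# output when the mount fails. ipaddr is a placeholder.
docker volume rm sshvolume
docker volume create -d vieux/sshfs \
  -o sshcmd=root@ipaddr:/home/dingrui \
  -o IdentityFile=/root/.ssh/id_rsa \
  -o debug \
  sshvolume

# Trigger the mount; the FUSE/sshfs debug lines appear in the daemon error.
docker run --rm -v sshvolume:/app busybox ls /app
```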
I didn't have the config file. I then created an empty config file, owned by root and chmod 600. The error is still the same.
root@ideapad:~/.ssh# ls -l
total 4
-rw------- 1 root root 0 Mar 6 19:19 config
-rw-r--r-- 1 root root 444 Mar 6 12:38 known_hosts
I can reproduce this error, even when following the instructions from https://github.com/vieux/docker-volume-sshfs.
OK, the solution is described in #58: the volume create command must have the SSH key set to /root/.ssh/id_rsa instead of /home/your-user/.ssh/id_rsa, because the plugin mounts it differently (check #58 for details).
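Concretely, the fix above means the IdentityFile path must be the one the plugin sees inside its own mount namespace, not the path on the host. A sketch, with ipaddr and the remote directory as placeholders:

```shell
# Sketch: the plugin bind-mounts the host's root SSH directory, so the
# key must be referenced as /root/.ssh/id_rsa from the plugin's view.
docker volume create -d vieux/sshfs \
  -o sshcmd=root@ipaddr:/home/dingrui \
  -o IdentityFile=/root/.ssh/id_rsa \
  sshvolume
```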
This didn't work for me by just changing to /root/.ssh/id_rsa, because the incorrect config was still applied to the volume that had already been created during my failed attempts: it still pointed at my user's ssh dir. I needed to remove the volume and create it again with the proper config.
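Since a named volume keeps the options it was created with, the stale volume has to be dropped before recreating it with the corrected key path. A sketch of the two steps described above (volume name and sshcmd target are placeholders):

```shell
# Sketch: remove the volume created with the wrong key path, then
# recreate it so the corrected IdentityFile option takes effect.
docker volume rm sshvolume
docker volume create -d vieux/sshfs \
  -o sshcmd=root@ipaddr:/home/dingrui \
  -o IdentityFile=/root/.ssh/id_rsa \
  sshvolume
```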