
sshfs volume listed with incorrect statistics

Open kode54 opened this issue 3 years ago • 7 comments

I just installed version 0.5.0 on Arch Linux, and I have an sshfs volume connected to my remote TrueNAS Core server, hosting a 9.9T or so ZFS pool.

df output:

Filesystem                     1K-blocks                 Used          Available Use% Mounted on
chris@server:/mnt/storage     9377490260           8294366400         1083123860  89% /mnt/storage

duf output for duf /mnt/storage:

╭────────────────────────────────────────────────────────────────────────────╮
│ 1 fuse device                                                              │
├──────────────┬──────┬──────┬────────┬────────┬──────────┬──────────────────┤
│ MOUNTED ON   │ SIZE │ USED │  AVAIL │  USE%  │ TYPE     │ FILESYSTEM       │
├──────────────┼──────┼──────┼────────┼────────┼──────────┼──────────────────┤
│ /mnt/storage │ 2.2P │ 1.9P │ 258.2T │  88.4% │ fuse.ssh │ chris@server     │
│              │      │      │        │        │ fs       │ :/mnt/storage    │
╰──────────────┴──────┴──────┴────────┴────────┴──────────┴──────────────────╯

kode54 · Dec 21 '20

I've just come to report the same thing. Same situation: SSHFS on Ubuntu 20.04 (running on WSL2), and the mount is from TrueNAS Core.

df output:

Filesystem                          Size  Used Avail Use% Mounted on
truenas:/mnt/raid-z/entertainment/  6.7T  2.0T  4.8T  30% /tmp/from-NAS

duf output:

╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ 1 fuse device                                                                                                           │
├───────────────┬──────┬────────┬───────┬───────────────────────────────┬────────────┬────────────────────────────────────┤
│ MOUNTED ON    │ SIZE │   USED │ AVAIL │              USE%             │ TYPE       │ FILESYSTEM                         │
├───────────────┼──────┼────────┼───────┼───────────────────────────────┼────────────┼────────────────────────────────────┤
│ /tmp/from-NAS │ 1.7P │ 503.5T │  1.2P │ [#####...............]  29.4% │ fuse.sshfs │ truenas:/mnt/raid-z/entertainment/ │
╰───────────────┴──────┴────────┴───────┴───────────────────────────────┴────────────┴────────────────────────────────────╯

M1XZG · Jun 20 '21

Probably the same as #44, don't you think?

IGLOU-EU · Mar 22 '22

I'm seeing the same thing. It doesn't seem to be the same as #44: the difference there is about 5% of usage, whereas here it's a 200+ factor in reported size (9.9T -> 2.2P, or 6.7T -> 1.7P). FWIW, the percentage used is more or less the same between df and duf (the difference is <1%).
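For what it's worth, a roughly 256x inflation is exactly what you'd get by multiplying the statfs block count by the preferred I/O size (`f_bsize`) instead of the fragment size (`f_frsize`). A minimal sketch, assuming hypothetical values of `f_frsize` = 4 KiB and `f_bsize` = 1 MiB (the block count below is back-derived from the df output in the first report):

```shell
# Hypothetical statfs values for the ~9.4 TB pool reported above.
# POSIX says capacity = f_blocks * f_frsize; f_bsize is only the
# preferred I/O transfer size and can be much larger on FUSE mounts.
blocks=2344372565   # f_blocks, in f_frsize units (assumed)
frsize=4096         # f_frsize: fundamental fragment size (assumed)
bsize=1048576       # f_bsize: preferred I/O size (assumed)

echo "f_blocks * f_frsize = $(( blocks * frsize )) bytes"   # ~8.7 TiB, like df
echo "f_blocks * f_bsize  = $(( blocks * bsize )) bytes"    # ~2.2 PiB, like duf
```

The ratio 1048576 / 4096 = 256 lines up with both reports (9.9T -> 2.2P and 6.7T -> 1.7P are each a factor of roughly 256), so mixing up the two fields would explain the numbers.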

nitnelave · Aug 02 '22

Hi @nitnelave, can you share the output of df and duf?

IGLOU-EU · Aug 02 '22

$ df -h
Filesystem                                                              Size  Used Avail Use% Mounted on
udev                                                                    7.7G     0  7.7G   0% /dev
tmpfs                                                                   1.6G   18M  1.6G   2% /run
/dev/md2                                                                2.7T  2.1T  513G  81% /
tmpfs                                                                   7.7G     0  7.7G   0% /dev/shm
tmpfs                                                                   5.0M     0  5.0M   0% /run/lock
tmpfs                                                                   7.7G     0  7.7G   0% /sys/fs/cgroup
/dev/loop1                                                              6.5M  6.5M     0 100% /snap/ffsend/48
/dev/loop4                                                              6.7M  6.7M     0 100% /snap/ffsend/49
/dev/loop10                                                              30M   30M     0 100% /snap/node/6268
/dev/md1                                                                487M  159M  303M  35% /boot
u2*****@u2*****.host.com:/u2*****@u2*****.host.com                      4.9T  4.0T  971G  81% /mnt/storagebox
/dev/loop12                                                              30M   30M     0 100% /snap/node/6331
/dev/loop13                                                             114M  114M     0 100% /snap/core/13308
/dev/loop5                                                               47M   47M     0 100% /snap/snapd/16010
/dev/loop7                                                               44M   44M     0 100% /snap/certbot/2133
/dev/loop0                                                               47M   47M     0 100% /snap/snapd/16292
/dev/loop3                                                               44M   44M     0 100% /snap/certbot/2192
/dev/loop8                                                              114M  114M     0 100% /snap/core/13425
/dev/loop6                                                               62M   62M     0 100% /snap/core20/1581
/dev/loop9                                                               62M   62M     0 100% /snap/core20/1587
tmpfs                                                                   1.6G     0  1.6G   0% /run/user/1000
$ duf
╭───────────────────────────────────────────────────────────────────────────────────────────╮
│ 2 local devices                                                                           │
├────────────┬────────┬────────┬────────┬───────────────────────────────┬──────┬────────────┤
│ MOUNTED ON │   SIZE │   USED │  AVAIL │              USE%             │ TYPE │ FILESYSTEM │
├────────────┼────────┼────────┼────────┼───────────────────────────────┼──────┼────────────┤
│ /          │   2.7T │   2.0T │ 512.3G │ [###############.....]  76.2% │ ext4 │ /dev/md2   │
│ /boot      │ 486.8M │ 158.4M │ 302.9M │ [######..............]  32.5% │ ext3 │ /dev/md1   │
╰────────────┴────────┴────────┴────────┴───────────────────────────────┴──────┴────────────╯
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ 1 fuse device                                                                                                                │
├─────────────────┬──────┬─────────┬────────┬───────────────────────────────┬────────────┬─────────────────────────────────────┤
│ MOUNTED ON      │ SIZE │    USED │  AVAIL │              USE%             │ TYPE       │ FILESYSTEM                          │
├─────────────────┼──────┼─────────┼────────┼───────────────────────────────┼────────────┼─────────────────────────────────────┤
│ /mnt/storagebox │ 1.2P │ 1007.3T │ 242.7T │ [################....]  80.6% │ fuse.sshfs │ u2*****@u2*****@host.com:           │
│                 │      │         │        │                               │            │ /u2*****@u2*****@host.com           │
╰─────────────────┴──────┴─────────┴────────┴───────────────────────────────┴────────────┴─────────────────────────────────────╯
╭───────────────────────────────────────────────────────────────────────────────────────────────╮
│ 6 special devices                                                                             │
├────────────────┬──────┬───────┬───────┬───────────────────────────────┬──────────┬────────────┤
│ MOUNTED ON     │ SIZE │  USED │ AVAIL │              USE%             │ TYPE     │ FILESYSTEM │
├────────────────┼──────┼───────┼───────┼───────────────────────────────┼──────────┼────────────┤
│ /dev           │ 7.6G │    0B │  7.6G │                               │ devtmpfs │ udev       │
│ /dev/shm       │ 7.7G │    0B │  7.7G │                               │ tmpfs    │ tmpfs      │
│ /run           │ 1.5G │ 17.1M │  1.5G │ [....................]   1.1% │ tmpfs    │ tmpfs      │
│ /run/lock      │ 5.0M │    0B │  5.0M │                               │ tmpfs    │ tmpfs      │
│ /run/user/1000 │ 1.5G │    0B │  1.5G │                               │ tmpfs    │ tmpfs      │
│ /sys/fs/cgroup │ 7.7G │    0B │  7.7G │                               │ tmpfs    │ tmpfs      │
╰────────────────┴──────┴───────┴───────┴───────────────────────────────┴──────────┴────────────╯

nitnelave · Aug 02 '22

Hi @nitnelave, I can't reproduce this bug with sshfs. Would it be possible for you to give more information about this storage and its connection?

IGLOU-EU · Aug 07 '22

Hmm, I don't know what else I can say about it. The /etc/fstab entry is:

sshfs#u******@u******.host.com:/u******@u******.host.com /mnt/storagebox fuse auto,_netdev,delay_connect,user,allow_other,noatime,follow_symlinks,IdentityFile=/root/storagebox_key,reconnect,ServerAliveInterval=15,ServerAliveCountMax=3

Does that help?
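One thing that might help narrow it down (a suggestion, assuming GNU coreutils is available on the client): print the two block sizes the kernel reports for the mount, since df and duf may disagree on which one to multiply by:

```shell
# GNU stat with --file-system: %s = preferred I/O block size (f_bsize),
# %S = fundamental fragment size (f_frsize), %b = total blocks.
stat -f -c 'bsize=%s frsize=%S blocks=%b' /mnt/storagebox
```

If `bsize` comes back much larger than `frsize` (for example 1048576 vs 4096), that mismatch would account for the inflated sizes in this thread.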

nitnelave · Aug 07 '22