ibm-spectrum-scale-csi

Hardlinks are not being copied after snapshot restore PVC operation - mmxcp

Open: Jainbrt opened this issue (1 comment)

Describe the bug: Hardlinks are not being copied after a snapshot restore PVC operation.

To Reproduce (steps to reproduce the behavior):

1. Created a StorageClass and a PVC (independent fileset):

[root@oc-w3 independent]# oc get sc ibm-spectrum-scale-csi-fileset -o yaml
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"storage.k8s.io/v1","kind":"StorageClass","metadata":{"annotations":{},"name":"ibm-spectrum-scale-csi-fileset"},"parameters":{"volBackendFs":"gpfs0"},"provisioner":"spectrumscale.csi.ibm.com","reclaimPolicy":"Delete"}
  creationTimestamp: "2020-09-02T11:14:41Z"
  managedFields:
  - apiVersion: storage.k8s.io/v1
    fieldsType: FieldsV1
    fieldsV1:
      f:metadata:
        f:annotations:
          .: {}
          f:kubectl.kubernetes.io/last-applied-configuration: {}
      f:parameters:
        .: {}
        f:volBackendFs: {}
      f:provisioner: {}
      f:reclaimPolicy: {}
      f:volumeBindingMode: {}
    manager: oc
    operation: Update
    time: "2020-09-02T11:14:41Z"
  name: ibm-spectrum-scale-csi-fileset
  resourceVersion: "133523711"
  selfLink: /apis/storage.k8s.io/v1/storageclasses/ibm-spectrum-scale-csi-fileset
  uid: c8416991-24a1-4c11-9c59-cc2f8e6fbc4d
parameters:
  volBackendFs: gpfs0
provisioner: spectrumscale.csi.ibm.com
reclaimPolicy: Delete
volumeBindingMode: Immediate

[root@oc-w3 independent]# oc get pvc
NAME                  STATUS   VOLUME                                     CAPACITY   ACCESS MODES   STORAGECLASS                     AGE
scale-fset            Bound    pvc-0ee86d4c-f939-4225-bf83-c088a3593ce0   1Gi        RWX            ibm-spectrum-scale-csi-fileset   34m
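
For reference, the PVC manifest used for scale-fset is not included above. A minimal sketch of what it likely looks like, assuming only the names and values visible in the oc get pvc output (RWX access mode, 1Gi request, the ibm-spectrum-scale-csi-fileset StorageClass):

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: scale-fset
spec:
  accessModes:
  - ReadWriteMany
  resources:
    requests:
      storage: 1Gi
  storageClassName: ibm-spectrum-scale-csi-fileset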

2. Created hardlinked files in the data directory of the created PVC:
[root@oc-w3 pvc-0ee86d4c-f939-4225-bf83-c088a3593ce0-data]# ln filewithlink filewithlink1
[root@oc-w3 pvc-0ee86d4c-f939-4225-bf83-c088a3593ce0-data]# ls -iltr
total 0
655360 -rw-r--r--. 2 root root 0 Sep  8 21:25 filewithlink1
655360 -rw-r--r--. 2 root root 0 Sep  8 21:25 filewithlink
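
As a quick sanity check (not part of the original report), stat can confirm the two names really are hardlinks before the snapshot is taken:

stat -c '%i %h %n' filewithlink filewithlink1
# expected: both lines print the same inode (655360 here) and a link count of 2
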
3. Created a VolumeSnapshotClass and a VolumeSnapshot of the created PVC:
[root@oc-w3 independent]# oc get volumesnapshotclasses.snapshot.storage.k8s.io
NAME         DRIVER                      DELETIONPOLICY   AGE
snapclass1   spectrumscale.csi.ibm.com   Delete           26m
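
The snapclass1 manifest itself is not shown; a minimal sketch of what it likely contains, based only on the driver and deletion policy reported above (v1beta1 API, matching the VolumeSnapshot below):

apiVersion: snapshot.storage.k8s.io/v1beta1
kind: VolumeSnapshotClass
metadata:
  name: snapclass1
driver: spectrumscale.csi.ibm.com
deletionPolicy: Delete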

[root@oc-w3 independent]# cat volumesnapshot.yaml
apiVersion: snapshot.storage.k8s.io/v1beta1
kind: VolumeSnapshot
metadata:
  name: snap1
spec:
  volumeSnapshotClassName: snapclass1
  source:
    persistentVolumeClaimName: scale-fset


[root@oc-w3 independent]# oc apply -f  volumesnapshot.yaml
volumesnapshot.snapshot.storage.k8s.io/snap1 created

[root@oc-w3 independent]# oc get volumesnapshot
NAME    READYTOUSE   SOURCEPVC    SOURCESNAPSHOTCONTENT   RESTORESIZE   SNAPSHOTCLASS   SNAPSHOTCONTENT                                    CREATIONTIME   AGE
snap1   true         scale-fset                           1Gi           snapclass1      snapcontent-fb911a37-03aa-42ba-ae10-864fff2d760b   <invalid>      6s
4. Created a PVC from the snapshot:
[root@oc-w3 independent]# oc apply -f pvcfilesetfromsnapshot.yaml
persistentvolumeclaim/restore-pvc created

[root@oc-w3 ]# oc get pvc 
NAME                  STATUS    VOLUME                                     CAPACITY   ACCESS MODES   STORAGECLASS                     AGE
scale-fset            Bound     pvc-0ee86d4c-f939-4225-bf83-c088a3593ce0   1Gi        RWX            ibm-spectrum-scale-csi-fileset   38m
restore-pvc           Bound     pvc-f782b3b2-5bd6-4304-9546-78aee7b73b09   10Gi       RWO            ibm-spectrum-scale-csi-fileset   18s
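
pvcfilesetfromsnapshot.yaml is not shown either; a sketch of what it likely contains, assuming the names above (restore-pvc, snap1) and the 10Gi/RWO values reported by oc get pvc:

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: restore-pvc
spec:
  accessModes:
  - ReadWriteOnce
  resources:
    requests:
      storage: 10Gi
  storageClassName: ibm-spectrum-scale-csi-fileset
  dataSource:
    name: snap1
    kind: VolumeSnapshot
    apiGroup: snapshot.storage.k8s.io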

5. Checked the data on the restored PVC; both names are present, but they have different inodes and a link count of 1:
[root@oc-w3 ]# ls /ibm/gpfs0/pvc-f782b3b2-5bd6-4304-9546-78aee7b73b09/pvc-f782b3b2-5bd6-4304-9546-78aee7b73b09-data/
filewithlink  filewithlink1

[root@oc-w3 ]# ls -iltr  /ibm/gpfs0/pvc-f782b3b2-5bd6-4304-9546-78aee7b73b09/pvc-f782b3b2-5bd6-4304-9546-78aee7b73b09-data/
total 0
803585 -rw-r--r--. 1 root root 0 Sep  8 21:25 filewithlink1
803584 -rw-r--r--. 1 root root 0 Sep  8 21:25 filewithlink

Expected behavior: The hardlink should be preserved, i.e. both names should still share the same inode on the restored PVC.

Environment

OCP 4.5.8
Driver image: quay.io/ibm-spectrum-scale-dev/ibm-spectrum-scale-csi-driver@sha256:f3ca0ecc73ea3fbed26265c898e9c9dae30afac8d89ff78d064c5397e556aedc
Operator image: quay.io/ibm-spectrum-scale-dev/ibm-spectrum-scale-csi-operator@sha256:677e73b26842a514ce25d27386d12d2acf616559affa18eb47f47fe619277fcb

Jainbrt (Sep 08 '20)

I can recreate the behavior described in this issue on the latest builds with a k8s setup.

When I restored a snapshot to a PVC, the hardlink from the snapshot was created as a new, independent file in the PVC, i.e. the hardlink no longer points to the original file's inode; it got a new inode number. So writes to the original file afterwards are not reflected in the hardlinked file.
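
This is what a file-by-file copy produces when it does not preserve link relationships (whether the restore path, e.g. mmxcp, copies that way is my assumption and is not confirmed by this log). A plain-shell illustration of the difference:

mkdir src && echo data > src/file1 && ln src/file1 src/file1_hardlink
cp -r src copy1    # plain recursive copy: each name gets its own inode, link count drops to 1
cp -a src copy2    # -a implies --preserve=all (including links): both names still share one inode
stat -c '%i %h %n' copy1/* copy2/*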

Builds used for testing:

k8s - v1.20.1
IBM Spectrum Scale - 5.1.1.0 210107.122040
Snapshot API version: snapshot.storage.k8s.io/v1
quay.io/ibm-spectrum-scale-dev/ibm-spectrum-scale-csi-operator:snapshots
quay.io/ibm-spectrum-scale-dev/ibm-spectrum-scale-csi-driver:snapshots
us.gcr.io/k8s-artifacts-prod/sig-storage/snapshot-controller:v4.0.0

Test log:

1. PVC created; a file and a hardlink to it were created inside it, and a snapshot was taken. Both the original file and the hardlink appear in the snapshot with the same inode:
[root@ck-x-master 2021_01_13-21:15:20 test_snapshot]$ kubectl -n ibm-spectrum-scale-csi-driver get pvc pvc11
NAME    STATUS   VOLUME                                     CAPACITY   ACCESS MODES   STORAGECLASS   AGE
pvc11   Bound    pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f   1Gi        RWX            sc-ind         15s
[root@ck-x-master 2021_01_13-21:15:21 test_snapshot]$

[root@ck-x-master 2021_01_13-22:00:52 test_snapshot]$ cd /mnt/fs1/pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f/pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f-data/
[root@ck-x-master 2021_01_13-22:01:10 pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f-data]$ ls -ltr
total 0
[root@ck-x-master 2021_01_13-22:01:11 pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f-data]$ echo ganesha > file1
[root@ck-x-master 2021_01_13-22:01:19 pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f-data]$ ln file1 file1_hardlink
[root@ck-x-master 2021_01_13-22:01:44 pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f-data]$ ls -lthri
total 1.0K
1835520 -rw-r--r-- 2 root root 8 Jan 13 22:01 file1_hardlink
1835520 -rw-r--r-- 2 root root 8 Jan 13 22:01 file1
[root@ck-x-master 2021_01_13-22:01:46 pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f-data]$

[root@ck-x-master 2021_01_13-22:02:37 test_snapshot]$  kubectl -n ibm-spectrum-scale-csi-driver get volumesnapshot vs11
NAME   READYTOUSE   SOURCEPVC   SOURCESNAPSHOTCONTENT   RESTORESIZE   SNAPSHOTCLASS   SNAPSHOTCONTENT                                    CREATIONTIME   AGE
vs11   true         pvc11                               1Gi           vsclass1        snapcontent-6c284ad3-48ca-4e16-8336-79dea172a72b   1s             4s
[root@ck-x-master 2021_01_13-22:02:41 test_snapshot]$

[root@ck-x-master 2021_01_13-22:02:42 test_snapshot]$ cd /mnt/fs1/pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f/.snapshots/snapshot-6c284ad3-48ca-4e16-8336-79dea172a72b/pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f-data/
[root@ck-x-master 2021_01_13-22:03:11 pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f-data]$ ls -lthri
total 1.0K
1835520 -rw-r--r-- 2 root root 8 Jan 13 22:01 file1_hardlink
1835520 -rw-r--r-- 2 root root 8 Jan 13 22:01 file1
[root@ck-x-master 2021_01_13-22:03:17 pvc-4c54d6d3-b524-4b56-8464-6a1df3fcf09f-data]$
2. Restored the snapshot to a new PVC:
[root@ck-x-master 2021_01_13-22:04:08 test_snapshot]$ cat pvcfromvs11.yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
   name: pvcfromvs11
spec:
   accessModes:
   - ReadWriteMany
   resources:
      requests:
         storage: 1Gi
   storageClassName: sc-ind
   dataSource:
      name: vs11
      kind: VolumeSnapshot
      apiGroup: snapshot.storage.k8s.io
[root@ck-x-master 2021_01_13-22:04:11 test_snapshot]$ kubectl -n ibm-spectrum-scale-csi-driver apply -f pvcfromvs11.yaml
persistentvolumeclaim/pvcfromvs11 created
[root@ck-x-master 2021_01_13-22:04:27 test_snapshot]$
[root@ck-x-master 2021_01_13-22:04:30 test_snapshot]$ kubectl -n ibm-spectrum-scale-csi-driver get pvc pvcfromvs11 -w
NAME          STATUS    VOLUME   CAPACITY   ACCESS MODES   STORAGECLASS   AGE
pvcfromvs11   Pending                                      sc-ind         11s
pvcfromvs11   Pending   pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17   0                         sc-ind         17s
pvcfromvs11   Bound     pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17   1Gi        RWX            sc-ind         17s
^C[root@ck-x-master 2021_01_13-22:04:45 test_snapshot]$
3. Observed that the hardlink was copied as a separate file (inode 2097664) instead of sharing the original file's inode (2097665); appending to file1 afterwards does not change file1_hardlink:
[root@ck-x-master 2021_01_13-22:04:48 test_snapshot]$ cd /mnt/fs1/pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17/pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17-data/
[root@ck-x-master 2021_01_13-22:05:05 pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17-data]$ ls -ltrhi
total 1.0K
2097664 -rw-r--r-- 1 root root 8 Jan 13 22:01 file1_hardlink
2097665 -rw-r--r-- 1 root root 8 Jan 13 22:01 file1
[root@ck-x-master 2021_01_13-22:05:08 pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17-data]$ cat file1
ganesha
[root@ck-x-master 2021_01_13-22:05:14 pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17-data]$ cat file1_hardlink
ganesha
[root@ck-x-master 2021_01_13-22:08:19 pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17-data]$ echo ganesha >> file1
[root@ck-x-master 2021_01_13-22:08:25 pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17-data]$ ls -ltrhi
total 1.0K
2097664 -rw-r--r-- 1 root root  8 Jan 13 22:01 file1_hardlink
2097665 -rw-r--r-- 1 root root 16 Jan 13 22:08 file1
[root@ck-x-master 2021_01_13-22:08:28 pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17-data]$ cat file1
ganesha
ganesha
[root@ck-x-master 2021_01_13-22:08:30 pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17-data]$ cat file1_hardlink
ganesha
[root@ck-x-master 2021_01_13-22:08:34 pvc-5e686c39-b3d5-4b5e-b8d1-1786d3099c17-data]$
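
A quick way to detect this class of problem after a restore (a sketch, not part of the original log): list files whose link count is greater than 1 in the source data directory and run the same command in the restored data directory; with this bug, the second listing comes back empty.

find . -xdev -type f -links +1 -printf '%n %i %p\n'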

kulkarnicr (Jan 14 '21)