
Upgrade to 22.07 TridentBackendConfig failed

fabian-born opened this issue 3 years ago · 2 comments

Describe the bug After upgrading from v21.10.0 to v22.07.0, only the backends using the ontap-nas driver are stuck in a failed state:

message: Failed to apply the backend update; updating the data plane IP address isn't currently supported

But the IP address wasn't changed. It looks like issue #759, except that I use cert-based authentication, not username and password.

Environment

  • Trident version: 22.07.0
  • Trident installation flags used: Operator
  • Container runtime: Docker
  • Kubernetes version: 1.23.10
  • Kubernetes orchestrator: Rancher v2.6.6
  • OS: Ubuntu 20.04
  • NetApp backend types: ontap-nas (ONTAP 9.10.1)
  • Other:

To Reproduce Steps to reproduce the behavior:

Backend:

apiVersion: trident.netapp.io/v1
kind: TridentBackendConfig
metadata:
  name: backend-tbc-ontap-nas-cert
spec:
  version: 1
  storageDriverName: ontap-nas
  managementLIF: 192.168.69.60
  backendName: tbc-ontap-nas-cert-3
  svm: trident
  exportPolicy: default
  storagePrefix: wildboar_
  credentials:
    name: ontap-nfs-svm-secret-cert
  clientCertificate: LS0tLS1CRUdJTiBDR...

Secrets:

apiVersion: v1
kind: Secret
metadata:
  name: ontap-nfs-svm-secret-cert
type: Opaque
stringData:
  clientPrivateKey: LS0tLS1C...

Upgrade (based on our documentation):

kubectl delete -f trident-2101/deploy/bundle.yaml 
kubectl apply -f trident-2207/deploy/bundle.yaml 
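
After applying the new bundle, the backend state can be inspected through the user-facing CRD before digging into the pod logs. A minimal check, assuming Trident runs in the trident namespace (the backend config name below is the one from this issue):

```shell
# List TridentBackendConfig objects and their status/phase
kubectl get tridentbackendconfig -n trident

# Show the failure message reported for the affected backend config
kubectl describe tridentbackendconfig backend-tbc-ontap-nas-cert -n trident
```

In the failed state described here, the status message is the "updating the data plane IP address isn't currently supported" error quoted above.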

Logfile:

time="2022-09-28T14:17:59Z" level=debug msg="Updated persistent state version." OrchestratorAPIVersion=1 PersistentStoreVersion=crdv1 requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Retrieved backend secret." requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=warning msg="clientPrivateKey is specified in both config and secret; overriding from secret."
time="2022-09-28T14:17:59Z" level=debug msg="Processing backend." handler=Bootstrap persistentBackend.BackendUUID=6bc19857-c4f4-4bee-9000-2f0d09917f11 persistentBackend.Name=tbc-ontap-nas-cert-3 persistentBackend.configRef=73abc85a-76cf-4f9b-a81e-2cb6b3afa9ff persistentBackend.online=true persistentBackend.state=online requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Parsed storage prefix." requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal storagePrefix=wildboar_
time="2022-09-28T14:17:59Z" level=debug msg="Credentials field not empty." requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Parsed commonConfig: Version:1 StorageDriverName:\"ontap-nas\" BackendName:\"tbc-ontap-nas-cert-3\" Debug:false DebugTraceFlags:map[string]bool(nil) DisableDelete:false StoragePrefixRaw:json.RawMessage{0x22, 0x77, 0x69, 0x6c, 0x64, 0x62, 0x6f, 0x61, 0x72, 0x5f, 0x22} StoragePrefix:(*string)(0xc00007f8a0) SerialNumbers:[]string{} DriverContext:\"\" LimitVolumeSize:\"\" Credentials:<REDACTED> " requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Retrieved backend secret." requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Initializing storage driver." driver=ontap-nas requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=warning msg="clientPrivateKey is specified in both config and secret; overriding from secret."
time="2022-09-28T14:17:59Z" level=error msg="Could not initialize storage driver." error="error initializing ontap-nas driver: more than one authentication method (username/password and clientPrivateKey) present in backend config; please ensure only one authentication method is provided" requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Failed storage backend." backendName=tbc-ontap-nas-cert-3 backendUUID= driver=ontap-nas requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="NewStorageBackendForConfig failed." backend="&{0xc00020ca80 tbc-ontap-nas-cert-3 6bc19857-c4f4-4bee-9000-2f0d09917f11 false failed map[] map[] 73abc85a-76cf-4f9b-a81e-2cb6b3afa9ff false}" backendUUID=6bc19857-c4f4-4bee-9000-2f0d09917f11 configRef=73abc85a-76cf-4f9b-a81e-2cb6b3afa9ff err="problem initializing storage driver 'ontap-nas': error initializing ontap-nas driver: more than one authentication method (username/password and clientPrivateKey) present in backend config; please ensure only one authentication method is provided" requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=warning msg="Cannot terminate an uninitialized backend." backend=tbc-ontap-nas-cert-3 backendUUID=6bc19857-c4f4-4bee-9000-2f0d09917f11 driver=ontap-nas requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal state=failed
time="2022-09-28T14:17:59Z" level=warning msg="Problem adding backend." backendErr="problem initializing storage driver 'ontap-nas': error initializing ontap-nas driver: more than one authentication method (username/password and clientPrivateKey) present in backend config; please ensure only one authentication method is provided" handler=Bootstrap newBackendExternal="&{tbc-ontap-nas-cert-3 6bc19857-c4f4-4bee-9000-2f0d09917f11 file CommonStorageDriverConfig:&storagedrivers.CommonStorageDriverConfig{Version:1, StorageDriverName:\"ontap-nas\", BackendName:\"tbc-ontap-nas-cert-3\", Debug:false, DebugTraceFlags:map[string]bool(nil), DisableDelete:false, StoragePrefixRaw:json.RawMessage{0x22, 0x77, 0x69, 0x6c, 0x64, 0x62, 0x6f, 0x61, 0x72, 0x5f, 0x22}, StoragePrefix:(*string)(0xc0002604c0), SerialNumbers:[]string(nil), DriverContext:\"csi\", LimitVolumeSize:\"\", Credentials:map[string]string{\"name\":\"<REDACTED>\", \"type\":\"<REDACTED>\"}} ManagementLIF:\"\" DataLIF:\"\" IgroupName:\"\" SVM:\"\" Username:<REDACTED> Password:<REDACTED> Aggregate:\"\" UsageHeartbeat:\"\" QtreePruneFlexvolsPeriod:\"\" QtreeQuotaResizePeriod:\"\" QtreesPerFlexvol:\"\" LUNsPerFlexvol:\"\" EmptyFlexvolDeferredDeletePeriod:\"\" NfsMountOptions:\"\" LimitAggregateUsage:\"\" AutoExportPolicy:false AutoExportCIDRs:[]string(nil) OntapStorageDriverPool:storagedrivers.OntapStorageDriverPool{Labels:map[string]string(nil), Region:\"\", Zone:\"\", SupportedTopologies:[]map[string]string(nil), OntapStorageDriverConfigDefaults:storagedrivers.OntapStorageDriverConfigDefaults{SpaceAllocation:\"\", SpaceReserve:\"\", SnapshotPolicy:\"\", SnapshotReserve:\"\", SnapshotDir:\"\", UnixPermissions:\"\", ExportPolicy:\"\", SecurityStyle:\"\", SplitOnClone:\"\", FileSystemType:\"\", Encryption:\"\", Mirroring:\"\", TieringPolicy:\"\", QosPolicy:\"\", AdaptiveQosPolicy:\"\", CommonStorageDriverConfigDefaults:storagedrivers.CommonStorageDriverConfigDefaults{Size:\"\"}}} 
Storage:[]storagedrivers.OntapStorageDriverPool(nil) UseCHAP:false UseREST:false ChapUsername:<REDACTED> ChapInitiatorSecret:<REDACTED> ChapTargetUsername:<REDACTED> ChapTargetInitiatorSecret:<REDACTED> ClientPrivateKey:<REDACTED> ClientCertificate:\"\" TrustedCACertificate:\"\" ReplicationPolicy:\"\" ReplicationSchedule:\"\"  map[] failed false [] 73abc85a-76cf-4f9b-a81e-2cb6b3afa9ff}" requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Parsed storage prefix." requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal storagePrefix=wildboar_
time="2022-09-28T14:17:59Z" level=debug msg="Credentials field not empty." requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Parsed commonConfig: Version:1 StorageDriverName:\"ontap-nas\" BackendName:\"tbc-ontap-nas-cert-3\" Debug:false DebugTraceFlags:map[string]bool(nil) DisableDelete:false StoragePrefixRaw:json.RawMessage{0x22, 0x77, 0x69, 0x6c, 0x64, 0x62, 0x6f, 0x61, 0x72, 0x5f, 0x22} StoragePrefix:(*string)(0xc0005db1f0) SerialNumbers:[]string{} DriverContext:\"\" LimitVolumeSize:\"\" Credentials:<REDACTED> " requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Retrieved backend secret." requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Initializing storage driver." driver=ontap-nas requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=warning msg="clientPrivateKey is specified in both config and secret; overriding from secret."
time="2022-09-28T14:17:59Z" level=error msg="Could not initialize storage driver." error="error initializing ontap-nas driver: more than one authentication method (username/password and clientPrivateKey) present in backend config; please ensure only one authentication method is provided" requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Failed storage backend." backendName=tbc-ontap-nas-cert-3 backendUUID= driver=ontap-nas requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=debug msg="Backend information." newBackendExternal="&{tbc-ontap-nas-cert-3 6bc19857-c4f4-4bee-9000-2f0d09917f11 file CommonStorageDriverConfig:&storagedrivers.CommonStorageDriverConfig{Version:1, StorageDriverName:\"ontap-nas\", BackendName:\"tbc-ontap-nas-cert-3\", Debug:false, DebugTraceFlags:map[string]bool(nil), DisableDelete:false, StoragePrefixRaw:json.RawMessage{0x22, 0x77, 0x69, 0x6c, 0x64, 0x62, 0x6f, 0x61, 0x72, 0x5f, 0x22}, StoragePrefix:(*string)(0xc0002604c0), SerialNumbers:[]string(nil), DriverContext:\"csi\", LimitVolumeSize:\"\", Credentials:map[string]string{\"name\":\"<REDACTED>\", \"type\":\"<REDACTED>\"}} ManagementLIF:\"\" DataLIF:\"\" IgroupName:\"\" SVM:\"\" Username:<REDACTED> Password:<REDACTED> Aggregate:\"\" UsageHeartbeat:\"\" QtreePruneFlexvolsPeriod:\"\" QtreeQuotaResizePeriod:\"\" QtreesPerFlexvol:\"\" LUNsPerFlexvol:\"\" EmptyFlexvolDeferredDeletePeriod:\"\" NfsMountOptions:\"\" LimitAggregateUsage:\"\" AutoExportPolicy:false AutoExportCIDRs:[]string(nil) OntapStorageDriverPool:storagedrivers.OntapStorageDriverPool{Labels:map[string]string(nil), Region:\"\", Zone:\"\", SupportedTopologies:[]map[string]string(nil), OntapStorageDriverConfigDefaults:storagedrivers.OntapStorageDriverConfigDefaults{SpaceAllocation:\"\", SpaceReserve:\"\", SnapshotPolicy:\"\", SnapshotReserve:\"\", SnapshotDir:\"\", UnixPermissions:\"\", ExportPolicy:\"\", SecurityStyle:\"\", SplitOnClone:\"\", FileSystemType:\"\", Encryption:\"\", Mirroring:\"\", TieringPolicy:\"\", QosPolicy:\"\", AdaptiveQosPolicy:\"\", CommonStorageDriverConfigDefaults:storagedrivers.CommonStorageDriverConfigDefaults{Size:\"\"}}} Storage:[]storagedrivers.OntapStorageDriverPool(nil) UseCHAP:false UseREST:false ChapUsername:<REDACTED> ChapInitiatorSecret:<REDACTED> ChapTargetUsername:<REDACTED> ChapTargetInitiatorSecret:<REDACTED> ClientPrivateKey:<REDACTED> ClientCertificate:\"\" TrustedCACertificate:\"\" ReplicationPolicy:\"\" ReplicationSchedule:\"\" 
 map[] failed false [] 73abc85a-76cf-4f9b-a81e-2cb6b3afa9ff}" newBackendExternal.BackendUUID=6bc19857-c4f4-4bee-9000-2f0d09917f11 newBackendExternal.Name=tbc-ontap-nas-cert-3 newBackendExternal.State=failed persistentBackend.BackendUUID=6bc19857-c4f4-4bee-9000-2f0d09917f11 persistentBackend.Name=tbc-ontap-nas-cert-3 persistentBackend.State=online requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal
time="2022-09-28T14:17:59Z" level=info msg="Added an existing backend." backend=tbc-ontap-nas-cert-3 backendUUID=6bc19857-c4f4-4bee-9000-2f0d09917f11 configRef=73abc85a-76cf-4f9b-a81e-2cb6b3afa9ff handler=Bootstrap online=true persistentBackends.BackendUUID=6bc19857-c4f4-4bee-9000-2f0d09917f11 requestID=e46fbc7f-bc73-4a12-850c-f85f67135d10 requestSource=Internal state=failed

Username and password were never used!

kubectl get secrets -n trident ontap-nfs-svm-secret-cert -o yaml                                                                                                                                                                                                       
apiVersion: v1
data:
  clientPrivateKey: TFMwdExTMUNSVWRKVGlC...
kind: Secret
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"v1","kind":"Secret","metadata":{"annotations":{},"name":"ontap-nfs-svm-secret-cert","namespace":"trident"},"stringData":{"clientPrivateKey":"LS0tLS1CRUd..."},"type":"Opaque"}
  creationTimestamp: "2022-09-28T13:58:42Z"
  name: ontap-nfs-svm-secret-cert
  namespace: trident
  resourceVersion: "20889"
  uid: aad58bdd-b391-4a23-b643-8cd27f7e452d
type: Opaque

fabian-born · Sep 28 '22 14:09

my workaround:

k get tbe -n trident -o yaml

apiVersion: v1
items:
- apiVersion: trident.netapp.io/v1
  backendName: tbc-ontap-nas-cert-3
  backendUUID: 6bc19857-c4f4-4bee-9000-2f0d09917f11
  config:
    ontap_config:
 [ ... ]
      clientCertificate: LS0tLS1CRUd...
      clientPrivateKey: secret:ontap-nfs-svm-secret-cert
      credentials:
        name: ontap-nfs-svm-secret-cert
      dataLIF: 192.168.69.61
      debug: false
      debugTraceFlags: null
      defaults:
[ ... ]
      username: secret:ontap-nfs-svm-secret-cert

I patched the TBE to remove the username and then restarted the deployment:

kubectl patch tridentbackend tbe-hzwl6 --type=merge -p '{"config":{"ontap_config":{"username": ""}}}' -n trident
kubectl rollout restart deployment -n trident trident-csi

fabian-born · Sep 28 '22 15:09

Hello @fabian-born,

This issue is the same as #759. The workaround here would have been to set an empty username (username: "") and password (password: "") in your secret. The workaround you identified works as well, but we caution users against modifying internal Trident CRDs because of possible unknown consequences. For Trident backends, TridentBackendConfig is the external, user-facing CRD that should be changed.
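
Applied to the secret shown earlier in this issue, that suggestion would look roughly like this (a sketch only; the key material is elided as in the original):

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: ontap-nfs-svm-secret-cert
  namespace: trident
type: Opaque
stringData:
  clientPrivateKey: LS0tLS1C...
  # Explicitly empty credentials keep Trident from treating the
  # username/password path as a second authentication method (#759)
  username: ""
  password: ""
```

Unlike patching the internal TridentBackend CRD, this only touches objects the user already owns.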

The fix for this issue, commit 9b78a23, is already in master and will be included in the Trident 22.10 release.

rohit-arora-dev · Sep 28 '22 15:09

This issue was fixed with commit 9b78a23 and is included in the Trident v22.10 release.

gnarl · Feb 12 '23 23:02