
az storage container create

DandaSowmithri opened this issue 2 years ago · 6 comments

This is autogenerated. Please review and update as needed.

Describe the bug

Command Name az storage container create

Errors:

The command failed with an unexpected error. Here is the traceback:
'MaxRetryError' object has no attribute 'lower'
Traceback (most recent call last):
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\urllib3/connectionpool.py", line 699, in urlopen
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\urllib3/connectionpool.py", line 382, in _make_request
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\urllib3/connectionpool.py", line 1010, in _validate_conn
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\urllib3/connection.py", line 416, in connect
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\urllib3/util/ssl_.py", line 449, in ssl_wrap_socket
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
  File "ssl.py", line 512, in wrap_socket
  File "ssl.py", line 1070, in _create
  File "ssl.py", line 1341, in do_handshake
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\requests/adapters.py", line 439, in send
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\urllib3/connectionpool.py", line 755, in urlopen
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\urllib3/util/retry.py", line 574, in increment
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='dandatstate.blob.core.windows.net', port=443): Max retries exceeded with url: /tstate?restype=container (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/multiapi/storage/v2018_11_09/common/storageclient.py", line 321, in _perform_request
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/multiapi/storage/v2018_11_09/common/_http/httpclient.py", line 86, in perform_request
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\requests/sessions.py", line 542, in request
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\requests/sessions.py", line 655, in send
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\requests/adapters.py", line 514, in send
requests.exceptions.SSLError: HTTPSConnectionPool(host='dandatstate.blob.core.windows.net', port=443): Max retries exceeded with url: /tstate?restype=container (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/cli/core/commands/__init__.py", line 692, in _run_job
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/cli/core/commands/__init__.py", line 328, in __call__
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/cli/core/commands/command_operation.py", line 121, in handler
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/cli/command_modules/storage/operations/blob.py", line 148, in create_container
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/multiapi/storage/v2018_11_09/blob/baseblobservice.py", line 685, in create_container
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/multiapi/storage/v2018_11_09/common/storageclient.py", line 430, in _perform_request
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/multiapi/storage/v2018_11_09/common/storageclient.py", line 361, in _perform_request
azure.common.AzureException: HTTPSConnectionPool(host='dandatstate.blob.core.windows.net', port=443): Max retries exceeded with url: /tstate?restype=container (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\knack/cli.py", line 231, in invoke
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/cli/core/commands/__init__.py", line 658, in execute
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/cli/core/commands/__init__.py", line 721, in _run_jobs_serially
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/cli/core/commands/__init__.py", line 713, in _run_job
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/cli/command_modules/storage/__init__.py", line 382, in new_handler
  File "D:\a\1\s\build_scripts\windows\artifacts\cli\Lib\site-packages\azure/cli/command_modules/storage/__init__.py", line 285, in handler
AttributeError: 'MaxRetryError' object has no attribute 'lower'
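The final AttributeError is a secondary failure in the CLI's own error handling: the storage handler assumes it receives a string error message and calls .lower() on it, but here it receives a MaxRetryError exception object instead. A minimal sketch of that pattern (simplified and hypothetical, not the actual CLI code):

```python
class MaxRetryError(Exception):
    """Stand-in for urllib3.exceptions.MaxRetryError."""

def handle_error(message):
    # Assumes `message` is a str; an exception object has no .lower(),
    # which masks the underlying SSL/connection error with AttributeError.
    return "authentication" in message.lower()

err = MaxRetryError("Max retries exceeded with url: /tstate?restype=container")
try:
    handle_error(err)
except AttributeError as exc:
    print(exc)  # 'MaxRetryError' object has no attribute 'lower'
```

So the AttributeError is only noise; the real root cause is the SSL certificate verification failure further up the chain.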

To Reproduce:

Steps to reproduce the behavior. Note that argument values have been redacted, as they may contain sensitive information.

  • Put any pre-requisite steps here...
  • az storage container create --name {} --account-name {} --account-key {}

Expected Behavior

Environment Summary

Windows-10-10.0.19043-SP0
Python 3.10.3
Installer: MSI

azure-cli 2.35.0 *
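"CERTIFICATE_VERIFY_FAILED: unable to get local issuer certificate" usually means an intercepting proxy or firewall is re-signing TLS traffic with a certificate the CLI's bundled CA store does not trust. The Azure CLI's HTTP stack (requests/urllib3) honors the REQUESTS_CA_BUNDLE environment variable, so one common workaround is pointing it at a PEM bundle that also contains the proxy's root CA. A sketch (the bundle path is hypothetical):

```shell
# Hypothetical path: a PEM file containing the default CAs plus the
# intercepting proxy's root certificate.
export REQUESTS_CA_BUNDLE="$HOME/certs/corp-ca-bundle.pem"
# Then re-run the failing command in the same shell, e.g.:
#   az storage container create --name <name> --account-name <account>
echo "REQUESTS_CA_BUNDLE=$REQUESTS_CA_BUNDLE"
```

On Windows (cmd.exe) the equivalent is `set REQUESTS_CA_BUNDLE=C:\path\to\bundle.pem` before running az.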

Additional Context

DandaSowmithri avatar Jun 15 '22 18:06 DandaSowmithri

storage

yonzhan avatar Jun 16 '22 01:06 yonzhan

Does the dandatstate storage account exist?

evelyn-ys avatar Jun 16 '22 02:06 evelyn-ys

Yes, this issue occurred with azure-cli 2.36.0 as well.

azure-cli 2.36.0 *
core 2.36.0 *
telemetry 1.0.6

Dependencies:
msal 1.18.0
azure-mgmt-resource 20.0.0

Python location '/home/ecaeskickstartadmininstallation/ecaes-analytics/bin/python3.7'
Extensions directory '/home/ecaeskickstartadmininstallation/.azure/cliextensions'

Python (Linux) 3.7.5 (default, Dec 9 2021, 17:04:37)

Error Client-Request-ID=996a3084-f15a-11ec-9f57-00224899ed03 Retry policy did not allow for a retry: , HTTP status code=Unknown, Exception=HTTPSConnectionPool(host='ecaanalyticblobstgtwo.blob.core.windows.net', port=443): Max retries exceeded with url: /eca-analytics?restype=container (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f6819d07190>: Failed to establish a new connection: [Errno -2] Name or service not known')).

kjoth avatar Jun 21 '22 12:06 kjoth

@kjoth Please ensure that the storage account ecaanalyticblobstgtwo exists. The error message says 'Name or service not known'.

evelyn-ys avatar Jun 21 '22 12:06 evelyn-ys

@evelyn-ys, the storage account is present. It may be an issue with accessing storage from the Azure VM's network.

A VM was created in Azure in a different region from the storage account. From that VM we try to access the storage account by running az storage container create, after adding the VM's public IP to the firewall rules. For some reason the VM is unable to resolve the DNS name of the blob storage.

On further debugging, we found that the storage account has a private endpoint into another network. If we disable or delete the private endpoint, the VM can resolve the DNS name and execute the command.

We have yet to figure out why the private endpoint causes this issue.
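"[Errno -2] Name or service not known" is the raw getaddrinfo failure: the VM cannot resolve the blob endpoint at all. When a storage account has a private endpoint, its public DNS name becomes a CNAME into a privatelink zone, and a VM whose DNS is wired to a private DNS zone that lacks a record for this account can fail to resolve the name entirely. A quick check to run on the affected VM (hostname taken from the error above; this only tests resolution, not reachability):

```python
import socket

def can_resolve(hostname: str) -> bool:
    """Return True if DNS resolution succeeds, mirroring the lookup
    urllib3 performs before opening a connection."""
    try:
        socket.getaddrinfo(hostname, 443)
        return True
    except socket.gaierror:
        # "[Errno -2] Name or service not known" surfaces as gaierror
        return False

# On the affected VM, try:
#   can_resolve("ecaanalyticblobstgtwo.blob.core.windows.net")
print(can_resolve("localhost"))  # True
```

If this returns False for the blob endpoint, the fix is on the DNS side (private DNS zone links / A records), not in the CLI.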

kjoth avatar Jun 23 '22 06:06 kjoth

@DandaSowmithri, could you check:

  1. From where is this command being executed? Is it from an Azure VM?
  2. If so, do the Azure VM and the storage account reside in the same region?

kjoth avatar Jun 23 '22 06:06 kjoth

Hi, we're sending this friendly reminder because we haven't heard back from you in a while. We need more information about this issue to help address it. Please be sure to give us your input within the next 7 days. If we don't hear back from you within 14 days of this comment the issue will be automatically closed. Thank you!

ghost avatar Aug 19 '22 14:08 ghost