
With SAS token policy enabled, upload/download not working

latschesar-atanassov opened this issue 1 month ago • 4 comments

Storage Explorer Version

1.40.2

Regression From

No response

Architecture

x64

Storage Explorer Build Number

20251101.1

Platform

Windows

OS Version

Windows 11

Bug Description

To reproduce the issue

  • Set up a storage account with a SAS expiration policy enabled (expiry interval of 1 day), and enable the option to block SAS usage once the upper expiry limit is exceeded.
  • Have an Entra ID user with permissions to upload/download files to/from the storage account.
  • Open Azure Storage Explorer and sign in with the Entra ID user. You will be able to see all folders and files within the storage account, but once you try to upload or download files/folders, you will see the error below in the Activities window:

"Transfer of .... to .... failed: 0 items transferred (used SAS, discovery completed)"

If I use "Copy AzCopy Command to Clipboard" and try the URL in the browser, I receive a telling error message: <AuthenticationErrorDetail>Policy violated by no signed start.</AuthenticationErrorDetail>

Additionally, the documentation states that "when a SAS expiration policy is in effect for the storage account, the signed start field is required for every SAS": https://docs.azure.cn/en-us/storage/common/sas-expiration-policy?tabs=azure-portal#about-sas-expiration-policies

If I disable the SAS token policy, file upload and download work flawlessly.

So my first guess is that the signed start field is not set in the SAS token, and my request is therefore blocked by the policy.
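That guess is easy to check against the SAS query string AzCopy logs. A minimal sketch in Python, using an illustrative URL shaped like the redacted one from the log (the account, container, blob name, and signature are placeholders, not the real values):

```python
from urllib.parse import urlsplit, parse_qs

# Illustrative SAS URL shaped like the one in the AzCopy log
# (account/container/blob names and the signature are placeholders).
sas_url = ("https://account.blob.core.windows.net/container/test"
           "?se=2025-12-03T05%3A49%3A20Z&sig=REDACTED&sp=rwl&sr=c&sv=2025-07-05")

# parse_qs maps each query parameter name to a list of its values.
params = parse_qs(urlsplit(sas_url).query)

has_signed_start = "st" in params   # signed start: required under the policy
has_expiry = "se" in params         # signed expiry: present in the log
```

The logged query string carries `se`, `sig`, `sp`, `sr`, and `sv` but no `st`, which matches the "Policy violated by no signed start" error text.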

Resource Types

Blob

Authentication Method

None

Connection Type

Sign in (subscription)

Steps to Reproduce

To reproduce the issue

  1. Set up a storage account with a SAS expiration policy enabled (expiry interval of 1 day), and enable the option to block SAS usage once the upper expiry limit is exceeded.
  2. Have an Entra ID user with permissions to upload/download files to/from the storage account.
  3. Open Azure Storage Explorer and sign in with the Entra ID user. You will be able to see all folders and files within the storage account, but once you try to upload or download files/folders, you will see the error below in the Activities window:

"Transfer of .... to .... failed: 0 items transferred (used SAS, discovery completed)"

Actual Experience

Open Azure Storage Explorer and sign in with the Entra ID user. You will be able to see all folders and files within the storage account, but once you try to upload or download files/folders, you will see the error below in the Activities window:

"Transfer of .... to .... failed: 0 items transferred (used SAS, discovery completed)"

Expected Experience

I would expect the file upload/download to succeed.

Additional Context

No response

latschesar-atanassov · Nov 26 '25 21:11

@latschesar-atanassov Can you please share your AzCopy log file with us? You can find the AzCopy logs by following the instructions here: https://learn.microsoft.com/en-us/troubleshoot/azure/azure-storage/blobs/alerts/storage-explorer-troubleshooting?tabs=Windows#azcopy-logs

The logs can give us more insight as to why you are experiencing failing transfers.

richardMSFT · Dec 01 '25 18:12

The SAS expiration limit on the storage account is set to 1 day, so I have also set the SAS duration in the Azure Storage Explorer settings to 1 day.

2025/12/02 05:49:23 AzcopyVersion  10.30.1
2025/12/02 05:49:23 OS-Environment  windows
2025/12/02 05:49:23 OS-Architecture  amd64
2025/12/02 05:49:23 Log times are in UTC. Local time is 2 Dec 2025 06:49:23
2025/12/02 05:49:23 ISO 8601 START TIME: to copy files that changed before or after this job started, use the parameter --include-before=2025-12-02T05:49:18Z or --include-after=2025-12-02T05:49:18Z
2025/12/02 05:49:23 Any empty folders will not be processed, because source and/or destination doesn't have full folder support
2025/12/02 05:49:23 Job-Command copy C:\####\test https://####.blob.core.windows.net/####/test?se=2025-12-03t05%3A49%3A20z&sig=-REDACTED-&sp=rwl&sr=c&sv=2025-07-05 --output-type=json --cancel-from-stdin --overwrite=prompt --from-to=LocalBlob --blob-type Detect --follow-symlinks --check-length=true --put-md5 --follow-symlinks --disable-auto-decoding=false --recursive --log-level=INFO 
2025/12/02 05:49:23 Number of CPUs: 14
2025/12/02 05:49:23 Max file buffer RAM 7.000 GB
2025/12/02 05:49:23 Max concurrent network operations:  will be dynamically tuned up to 3000 (Based on auto-tuning limit. Set AZCOPY_CONCURRENCY_VALUE environment variable to override)
2025/12/02 05:49:23 Check CPU usage when dynamically tuning concurrency: true (Based on hard-coded default. Set AZCOPY_TUNE_TO_CPU environment variable to true or false override)
2025/12/02 05:49:23 Max concurrent transfer initiation routines: 64 (Based on hard-coded default. Set AZCOPY_CONCURRENT_FILES environment variable to override)
2025/12/02 05:49:23 Max enumeration routines: 16 (Based on hard-coded default. Set AZCOPY_CONCURRENT_SCAN environment variable to override)
2025/12/02 05:49:23 Parallelize getting file properties (file.Stat): false (Based on AZCOPY_PARALLEL_STAT_FILES environment variable)
2025/12/02 05:49:23 Max open files when downloading: 2147479959 (auto-computed)
2025/12/02 05:49:23 Final job part has been created
2025/12/02 05:49:23 Trying 4 concurrent connections (initial starting point)
2025/12/02 05:49:23 Final job part has been scheduled
2025/12/02 05:49:23 INFO: [P#0-T#0] Starting transfer: Source "\\\\?\\C:\\####\\test" Destination "https://####.blob.core.windows.net/####/test". Specified chunk size 8388608
2025/12/02 05:49:23 ==> REQUEST/RESPONSE (Try=1/62.0704ms, OpTime=273.194ms) -- RESPONSE STATUS CODE ERROR
   HEAD https://####.blob.core.windows.net/####/test?se=2025-12-03T05%3A49%3A20Z&sig=-REDACTED-&sp=rwl&sr=c&sv=2025-07-05
   Accept: application/xml
   User-Agent: Microsoft/Azure/Storage/ azsdk-go-azblob/v1.6.2 (go1.24.6; Windows_NT)
   X-Ms-Client-Request-Id: 2548d2be-091a-4824-6a99-be71b10d7f8d
   x-ms-version: 2025-05-05
   --------------------------------------------------------------------------------
   RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
   Date: Tue, 02 Dec 2025 05:49:21 GMT
   Server: Microsoft-HTTPAPI/2.0
   X-Ms-Error-Code: AuthenticationFailed
   X-Ms-Request-Id: b8910046-001e-008b-3a4f-63ee2f000000
Response Details: 

2025/12/02 05:49:23 ERR: [P#0-T#0] UPLOADFAILED: \\?\C:\####\test : 000 : Could not check destination file existence. HEAD https://####.blob.core.windows.net/####/test
--------------------------------------------------------------------------------
RESPONSE 403: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
ERROR CODE: AuthenticationFailed
--------------------------------------------------------------------------------
Response contained no body
--------------------------------------------------------------------------------

   Dst: https://####.blob.core.windows.net/####/test
2025/12/02 05:49:23 JobID=353ed500-467b-734e-5133-22c6ddd8d3c7, Part#=0, TransfersDone=1 of 1
2025/12/02 05:49:23 all parts of entire Job 353ed500-467b-734e-5133-22c6ddd8d3c7 successfully completed, cancelled or paused
2025/12/02 05:49:23 is part of Job which 1 total number of parts done 
2025/12/02 05:49:25 PERF: primary performance constraint is Unknown. States: X:  0, O:  0, M:  0, L:  0, R:  0, D:  0, W:  0, F:  0, B:  0, E:  0, T:  0, GRs:  4
2025/12/02 05:49:25 0.0 %, 0 Done, 1 Failed, 0 Pending, 0 Skipped, 1 Total, 
2025/12/02 05:49:25 Closing Log

latschesar-atanassov · Dec 02 '25 06:12

Just want to point out: my assumption is that the signed start field "st" is missing from the SAS token.

As the documentation says, "when a SAS expiration policy is in effect for the storage account, the signed start field is required for every SAS", and the "st" field is indeed absent from the SAS query string in the log above. But I might be wrong, of course.

latschesar-atanassov · Dec 02 '25 07:12

UPDATE

I found a workaround so that Azure Storage Explorer does not use a SAS token for uploading/downloading files. You just need to set the RBAC permissions as follows:

  • Reader on the Storage Account
  • Storage Blob Data Contributor on the Container inside the Storage Account

If RBAC is configured this way, Azure Storage Explorer will not use a SAS token and will authenticate with Entra ID (OAuth) instead, so you don't run into this issue.

latschesar-atanassov · Dec 04 '25 15:12