Winget repository does not respond over IPv6
Brief description of your issue
The main repository (https://cdn.winget.microsoft.com/cache) resolves with an AAAA record but does not properly respond to HTTP requests on IPv6 (although it is IPv6-pingable). This causes winget to hang indefinitely on machines with IPv6 enabled due to IPv6 taking precedence for hosts that have AAAA records.
Steps to reproduce
`nslookup cdn.winget.microsoft.com`:

```
Server:  one.one.one.one
Address:  1.1.1.1

Non-authoritative answer:
Name:      s-part-0049.t-0009.t-msedge.net
Addresses: 2620:1ec:bdf::77
           13.107.246.77
Aliases:   cdn.winget.microsoft.com
           winget-cache-pme-cxfsgwfxarb8hwg0.z01.azurefd.net
           star-azurefd-prod.trafficmanager.net
           shed.dual-low.s-part-0049.t-0009.t-msedge.net
```
`ping cdn.winget.microsoft.com`:

```
Pinging s-part-0049.t-0009.t-msedge.net [2620:1ec:bdf::77] with 32 bytes of data:
Reply from 2620:1ec:bdf::77: time=32ms
Reply from 2620:1ec:bdf::77: time=5ms

Ping statistics for 2620:1ec:bdf::77:
    Packets: Sent = 2, Received = 2, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
    Minimum = 5ms, Maximum = 32ms, Average = 18ms
```
`curl -v https://cdn.winget.microsoft.com/cache/source.msix`:

```
* Host cdn.winget.microsoft.com:443 was resolved.
* IPv6: 2620:1ec:bdf::77
* IPv4: 13.107.246.77
*   Trying [2620:1ec:bdf::77]:443...
* schannel: disabled automatic use of client certificate
* ALPN: curl offers http/1.1
(no response, stopped with Ctrl+C)
```
However, `curl -v --ipv4 https://cdn.winget.microsoft.com/cache/source.msix` does work:

```
* Host cdn.winget.microsoft.com:443 was resolved.
* IPv6: (none)
* IPv4: 13.107.246.77
*   Trying 13.107.246.77:443...
* schannel: disabled automatic use of client certificate
* ALPN: curl offers http/1.1
* ALPN: server accepted http/1.1
* Connected to cdn.winget.microsoft.com (13.107.246.77) port 443
* using HTTP/1.x
> GET /cache/source.msix HTTP/1.1
> Host: cdn.winget.microsoft.com
> User-Agent: curl/8.10.1
> Accept: */*
>
* Request completely sent off
* schannel: remote party requests renegotiation
* schannel: renegotiating SSL/TLS connection
* schannel: SSL/TLS connection renegotiated
* schannel: remote party requests renegotiation
* schannel: renegotiating SSL/TLS connection
* schannel: SSL/TLS connection renegotiated
* schannel: failed to decrypt data, need more data
* schannel: failed to decrypt data, need more data
< HTTP/1.1 200 OK
...
```
Expected behavior
`curl -v https://cdn.winget.microsoft.com/cache/source.msix` returns a response
Actual behavior
`curl -v https://cdn.winget.microsoft.com/cache/source.msix` hangs indefinitely
Environment
```
Windows Package Manager v1.10.320
Copyright (c) Microsoft Corporation. All rights reserved.

Windows: Windows.Desktop v10.0.26100.3194
System Architecture: Arm64
Package: Microsoft.DesktopAppInstaller v1.25.320.0

Winget Directories
---------------------------------------------------------------------------------------------------------
Logs                               %LOCALAPPDATA%\Packages\Microsoft.DesktopAppInstaller_8wekyb3d8bbwe\L…
User Settings                      %LOCALAPPDATA%\Packages\Microsoft.DesktopAppInstaller_8wekyb3d8bbwe\L…
Portable Links Directory (User)    %LOCALAPPDATA%\Microsoft\WinGet\Links
Portable Links Directory (Machine) C:\Program Files\WinGet\Links
Portable Package Root (User)       %LOCALAPPDATA%\Microsoft\WinGet\Packages
Portable Package Root              C:\Program Files\WinGet\Packages
Portable Package Root (x86)        C:\Program Files (x86)\WinGet\Packages
Installer Downloads                %USERPROFILE%\Downloads
Configuration Modules              %LOCALAPPDATA%\Microsoft\WinGet\Configuration\Modules

Links
---------------------------------------------------------------------------
Privacy Statement   https://aka.ms/winget-privacy
License Agreement   https://aka.ms/winget-license
Third Party Notices https://aka.ms/winget-3rdPartyNotice
Homepage            https://aka.ms/winget
Windows Store Terms https://www.microsoft.com/en-us/storedocs/terms-of-sale

Admin Setting                             State
--------------------------------------------------
LocalManifestFiles                        Disabled
BypassCertificatePinningForMicrosoftStore Disabled
InstallerHashOverride                     Disabled
LocalArchiveMalwareScanOverride           Disabled
ProxyCommandLineOptions                   Disabled
DefaultProxy                              Disabled
```
~~Is there a temporary workaround for this, short of disabling IPv6 each time we need to run `winget upgrade`?~~
Read here: https://github.com/microsoft/winget-cli/issues/5269#issuecomment-3009976114.
Is there maybe a list of AAAA records we can block in something like AdGuard Home? I tried the line below, but it still hangs during update:

```
||cdn.winget.microsoft.com^$dnstype=AAAA
```
ok, I found a solution for AdGuard Home users:
- go to Filters --> DNS rewrites
- click on Add DNS rewrite
- paste `cdn.winget.microsoft.com` in the first field (Enter domain name or wildcard)
- write `A` in the second field (Enter IP address or domain name)
- hit Save

This forces AdGuard Home to just keep the A record for that specific domain from upstream. I just tested it in my network environment and I can now successfully run `winget upgrade` with IPv6 enabled.

Adding a rewrite for `cdn.winget.microsoft.com` is enough; there are no other domains to add. Hope this helps.
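If you want to check that the rewrite took effect, querying AdGuard Home directly for both record types should show the AAAA answer suppressed while the A record still resolves. A hedged sketch (`192.168.1.2` is a placeholder for your AdGuard Home address):

```shell
# After the rewrite, the AAAA query should return no addresses;
# the A query should resolve normally from upstream.
nslookup -type=AAAA cdn.winget.microsoft.com 192.168.1.2
nslookup -type=A cdn.winget.microsoft.com 192.168.1.2
```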
I had similar issues. In my case, it was solved by adding the following firewall rule on my (custom) router:

```
ip6tables -A FORWARD -o ppp0 -p tcp -m tcp --tcp-flags SYN,RST SYN -j TCPMSS --clamp-mss-to-pmtu
```

ppp0 is the interface to my ISP's modem. It uses PPPoE, so its MTU is 8 bytes smaller than the standard Ethernet MTU, which caused the issues. This rule clamps (limits) the MSS value in TCP packets to ensure they fit in this smaller MTU.
The background of the issue is explained here: https://forum.mikrotik.com/t/trying-to-understand-the-need-for-mss-clamping/175749
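For anyone wondering where the numbers come from, the MSS arithmetic for a PPPoE uplink can be sketched like this (the 1500-byte Ethernet MTU and the header sizes are the standard values; the PPPoE link itself is the assumption):

```shell
# PPPoE adds an 8-byte header, shrinking the usable MTU below Ethernet's 1500.
mtu=$((1500 - 8))          # link MTU on the PPPoE interface
mss=$((mtu - 40 - 20))     # minus IPv6 header (40) and TCP header (20)
echo "MTU=$mtu MSS=$mss"   # → MTU=1492 MSS=1432
```

So clamping to the PMTU on such a link should yield an MSS of 1432; anything negotiated above that produces packets that no longer fit through the tunnel.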
Wow, that's extremely interesting. I experimented a bit with my router (it's a MikroTik running RouterOS, so the config is slightly different from iptables) and it seems that MSS was indeed the issue; however, clamping to PMTU still didn't solve it. What ended up working for me is setting the MSS to 1400. Interestingly enough, I saw the same issue with a few other servers hosted on Azure. I wonder if this is an issue with my ISP or router that surfaces because of the IPv6 configuration in Azure (maybe other cloud services don't accept large TCP MSS values?), or if the Azure IPv6 gateways themselves have an MTU bottleneck somewhere.
Anyway, I'm gonna close this issue, since it's probably not an issue in Winget. For future readers who end up here, the RouterOS configuration that ended up fixing this for me is:

```
/ipv6 firewall mangle add action=change-mss chain=forward new-mss=1400 out-interface=sfp1 protocol=tcp tcp-flags=syn,!rst
```

Adapt this to your router and experiment with lower MSS values if it still doesn't work.
Based on the excellent diagnostic work by @Stamgastje and @hadeutscher regarding MSS clamping, I can confirm this appears to be the root cause. I did some layer-by-layer testing that supports your MTU/MSS theory:
- IPv6 ICMP: Works (Path MTU Discovery functional)
- IPv6 TCP Handshake: Establishes successfully
- IPv6 TLS Negotiation: Completes without issues
- IPv6 HTTP Data Transfer: Hangs indefinitely ← MSS/fragmentation bottleneck
The fact that TCP+TLS succeed but HTTP data transfer fails strongly indicates that:
- Small packets (handshakes) pass through fine
- Large data packets hit MTU restrictions and get dropped
- No ICMP "Packet Too Big" messages reach the client (common with stateful firewalls)
- This creates the indefinite hang behavior
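The size asymmetry above can be made concrete with some back-of-the-envelope arithmetic (the 1400-byte path MTU and the ~300-byte ClientHello are assumed figures, purely for illustration):

```shell
# Hypothetical undiscovered MTU bottleneck somewhere on the IPv6 path.
path_mtu=1400
# Full-sized data segment from a 1500-byte-MTU sender: MSS 1440 + IPv6 (40) + TCP (20).
segment=$((1440 + 40 + 20))
# A typical TLS ClientHello (~300-byte payload) plus the same headers.
hello=$((300 + 40 + 20))
[ "$hello" -le "$path_mtu" ] && echo "handshake packet ($hello bytes): fits"
[ "$segment" -gt "$path_mtu" ] && echo "data segment ($segment bytes): silently dropped"
```

With no ICMPv6 "Packet Too Big" making it back, the sender never learns to shrink its segments and just retransmits the oversized ones forever, which is exactly the observed hang.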
In my eyes this all suggests Microsoft's Azure Front Door / msedge.net infrastructure is the problem. Possible causes:
- Aggressive stateful firewalls dropping large IPv6 packets
- Missing or misconfigured IPv6 Path MTU Discovery responses
- IPv6-specific MSS handling differences vs IPv4
@hadeutscher's RouterOS fix (`new-mss=1400`) and @Stamgastje's PPPoE clamping both point to the same conclusion: Microsoft's CDN IPv6 infrastructure currently cannot handle standard MSS values (it worked a few weeks ago at least).
My temporary workaround: add IPv4 mappings to the hosts file:

```
40.90.65.187 cdn.winget.microsoft.com # Force IPv4
13.107.253.44 www.powershellgallery.com # Force IPv4
```

(yes, PowerShell Gallery updates / installs currently also don't work; it seems to be the same issue)
Since this is closed, I filed #5693 with complete diagnostic details and suggested escalation
Can confirm the IPv6 issue.
Added the IPv4 addresses to the hosts file like @manuboek said; worked as a temporary workaround.