
Silverlight.download Recipe 0 Byte Download

badstreff opened this issue 8 years ago • 7 comments

This looks to be an issue with how the URLDownloader.py processor sends its request. I've written up a curl command by hand, similar to what the processor sends:

    curl --silent --show-error --no-buffer --dump-header - \
      --speed-time 30 --location \
      --url http://www.microsoft.com/getsilverlight/handlers/getsilverlight.ashx \
      -A "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_3) AppleWebKit/536.28.10 (KHTML, like Gecko) Version/6.0.3 Safari/536.28.10" \
      --output /Users/Shared/silverlight.dmg

This results in a 403 Forbidden error.

However, when specifying a keep-alive header and raising the speed time to 120 (slow redirects?), everything downloads smoothly.
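For reference, the working variant looks roughly like this (a sketch of the workaround; the exact keep-alive header string is my assumption, not something the processor sends today):

    curl --silent --show-error --no-buffer --dump-header - \
      --speed-time 120 --location \
      --header "Connection: keep-alive" \
      --url http://www.microsoft.com/getsilverlight/handlers/getsilverlight.ashx \
      -A "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_3) AppleWebKit/536.28.10 (KHTML, like Gecko) Version/6.0.3 Safari/536.28.10" \
      --output /Users/Shared/silverlight.dmg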

My quick fix would be to add support for specifying the speed-time value, plus a boolean flag for the keep-alive header, to the processor's inputs. Would it be better to write a custom Silverlight URL provider processor?
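If those inputs existed, they could be overridden per-run with autopkg's -k option, something like this (CURL_SPEED_TIME and CURL_KEEP_ALIVE are hypothetical input names, not variables URLDownloader actually supports):

    # Hypothetical input variables, for illustration only
    autopkg run -v Silverlight.download -k CURL_SPEED_TIME=120 -k CURL_KEEP_ALIVE=true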

badstreff avatar Mar 29 '16 15:03 badstreff

I've found that even when requesting http://www.microsoft.com/getsilverlight/handlers/getsilverlight.ashx via Safari there is a very long pause of a minute or more before the download commences.

If you'd like to write and maintain a SilverlightDownloader Processor, take a stab at it. I wonder, though, given Silverlight's deprecated status, if it's worth the effort.

gregneagle avatar Mar 29 '16 15:03 gregneagle

Looks like my initial guess that this was a cookie issue was wrong; it's a timeout issue. I also suspect this could be added to the URLDownloader processor without interfering with any existing recipes.

badstreff avatar Mar 29 '16 15:03 badstreff

Or it could just be a temporary Microsoft issue that will go away.

gregneagle avatar Mar 29 '16 15:03 gregneagle

I agree at this point, but it would be nice to have more lenient rules that users could override when downloading from possibly sluggish sites. I'll check the existing recipe periodically over the next few days and see if it succeeds at any point without making any changes.

badstreff avatar Mar 29 '16 16:03 badstreff

I'd think that in the time it would take you to recognize that some site had become sluggish, edit an override (or create one) to insert a higher timeout value, and run the recipe again (possibly more than once as you experiment with timeout values)... it would be faster and easier to just manually download the stupid thing and provide it to an upstream recipe using the --pkg flag.
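Something like this, for example (Silverlight.munki is an assumed name for a child recipe; point --pkg at whatever you downloaded by hand):

    # Feed a manually downloaded disk image to the recipe, skipping the download step
    autopkg run -v Silverlight.munki --pkg ~/Downloads/Silverlight.dmg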

gregneagle avatar Mar 29 '16 16:03 gregneagle

(But don't let me stop you from adding that functionality to URLDownloader)

gregneagle avatar Mar 29 '16 16:03 gregneagle

I've noticed this 0-byte issue on this recipe as well as a few others. If I go in, remove the 0-byte file, and re-run, everything clears up fine. It isn't consistent about when it happens, but from time to time it happens to me too.
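A minimal cleanup sketch, assuming the default AutoPkg cache location and that the stalled download is a zero-byte .dmg somewhere in the cache:

    # Delete any zero-byte disk images from the AutoPkg cache, then re-run the recipe
    find ~/Library/AutoPkg/Cache -name "*.dmg" -size 0 -delete
    autopkg run -v Silverlight.download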

RicknTx avatar Jul 02 '16 15:07 RicknTx