
Responsive images cause performance issues on high-resolution devices

Open mor10 opened this issue 5 years ago • 34 comments

tl;dr: Correctly marked up responsive images (srcset + sizes) result in high-resolution devices (2x, 3x, 4x, 5x, 6x DPR) downloading and using very large files, causing significant performance hits, especially on mobile devices on mobile networks.

Scenario

  • Layout with full-bleed (edge-to-edge) image, served in a responsive layout to all device widths.
  • srcset list includes image sources of varying widths, from small (~400px wide) to very large (~3000px wide), to account for mobile, tablet, desktop, large monitors, high-resolution laptops, etc.
  • sizes attribute marked up to spec with natural breakpoints.

Under these circumstances, a high-resolution smartphone will download larger image files to meet screen resolution demands, thus negating the original intent (as I understand it) of the RICG responsive images work which was to lower the bandwidth hit on mobile devices when viewing images.

This scenario is further complicated by modern design trends including full-bleed images as described above. Because the srcset list must include very large image sources to account for large high-resolution screens, small devices with high-resolution screens have access to and will use these same sources resulting in wasted bandwidth and significantly degraded performance.
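The selection arithmetic at work here can be sketched in a few lines. Per the spec, the browser evaluates sizes to a CSS-pixel slot width, multiplies by the device pixel ratio, and fetches a source that covers that many physical pixels. Exact tie-breaking is left to each browser; this sketch (with illustrative widths, not any real browser's algorithm) simply takes the smallest sufficient candidate:

```javascript
// Rough sketch of spec-style source selection for a full-bleed image
// (sizes="100vw"). Real browsers may apply their own heuristics; this
// simply picks the smallest candidate that covers the needed pixels.
function pickSource(candidates, viewportCssPx, dpr) {
  const slotCssPx = viewportCssPx;      // sizes evaluates to 100vw here
  const neededPx = slotCssPx * dpr;     // physical pixels to fill the slot
  const sorted = [...candidates].sort((a, b) => a - b);
  // Smallest source at least as wide as needed, else the largest available.
  return sorted.find(w => w >= neededPx) ?? sorted[sorted.length - 1];
}

const srcsetWidths = [400, 768, 1024, 1568, 2304, 3000];

// 400 CSS px viewport on a 1x screen: the 400w source suffices.
pickSource(srcsetWidths, 400, 1); // 400
// Same viewport on a 3x phone: 1200 physical px needed -> 1568w.
pickSource(srcsetWidths, 400, 3); // 1568
// On a 4x phone: 1600 physical px needed -> the 2304w source.
pickSource(srcsetWidths, 400, 4); // 2304
```

The same small phone, solely by virtue of its screen density, jumps from the 400w file to sources sized for desktop layouts — which is the performance hit described above.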

Practical use case

This issue came to my attention while working on the new content editor for WordPress ("Gutenberg"). The content editor allows for images to be aligned wide and full, the latter meaning the image takes up the full available width which often means the width of the viewport.

When updating WordPress to generate appropriate srcset and sizes attributes to account for these layouts, I found significant performance reductions on mobile devices caused by the download of larger-than-required images as per the sizes attribute.

A functional example, with detailed explanation, is available here: https://gutenberg.mor10.com/responsive-images-demo-corrected-code/

The result of this discovery was a sub-optimal workaround for WordPress core which is live on every WordPress site running version 5.0 or above: To avoid the performance hit on mobile devices, the largest image source in the srcset list is 1024px unless the theme explicitly states otherwise. In other words, browsers are served with smaller-than-necessary images, causing a poor user experience.

A functional example of the current auto-generated output as of WordPress 5.1 is available here: https://gutenberg.mor10.com/image-test-current/

Summation: Due to the performance issues introduced by the srcset + sizes combination, 33.3% of the web is currently shipping the wrong image sources on purpose as a workaround. That said, this same issue will be experienced by anyone setting up a site with full-bleed images as described above, WordPress or not.

Possible solutions

From my perspective as a front-end developer, the ideal solution would be to amend the spec to allow developers to declare, through attributes or similar on each individual image, the pixel density the sizes attribute should be measured against. Something like this:

<img 
  src="fish-1-1024x684.jpg" alt="" 
  srcset="
    fish-1-1024x684.jpg 1024w, 
    fish-1-300x200.jpg 300w, 
    fish-1-768x513.jpg 768w, 
    fish-1-1568x1047.jpg 1568w" 
  sizes="
    (min-width: 1168px) calc(6 * (100vw / 12) - 28px),
    (min-width: 768px) calc(8 * (100vw / 12) - 28px),
    calc(100vw - (2 * 1rem))"
  resolution="1x"
/>

Alternatively, the browser could detect available bandwidth and other factors and actively throttle the srcset list accordingly so high-resolution mobile devices on mobile networks would receive the appropriate resolution interpretation based on available data and performance. This of course brings into question how to measure bandwidth and download limits, especially for users on max-megabytes-per-month plans.

I know this type of throttling is technically possible using client hints, but configuring client hints and server side solutions is beyond the capacity of most CMSes and site owners, and puts the onus of having the web work as expected on the individual user.

In lieu of the bandwidth throttling suggested above, a third option could be to put browser limits on how high the resolution can be for images (a 2x limit on a 4x screen, for example).

cc @yoavweiss @joemcgill @getsource

mor10 avatar Mar 07 '19 22:03 mor10

Alternatively, the browser could detect available bandwidth and other factors and actively throttle the srcset list accordingly so high-resolution mobile devices on mobile networks would receive the appropriate resolution interpretation based on available data and performance.

^ This. IMHO, the spec is about as art-directionish as it should be and all this should be left to the browsers. But I wholeheartedly agree that this needs to be addressed.

Hlsgs avatar Mar 07 '19 23:03 Hlsgs

I discussed this with @mor10 last week. It seems like the ubiquity of high-resolution screens has made it so that there's a need for a cap on the DPR levels browsers take into account, either implicitly or explicitly.

From my perspective, the main hurdle towards an on-by-default cap, automatically enforced by the browser, is the lack of data on the cut-off ratio, where higher-resolution doesn't necessarily mean better user-experience. If such data could be provided, that'd be helpful on that front.

Barring that, the fastest route towards a solution here would be an opt-in cap (e.g. a maxresolution attribute). Those are not mutually exclusive, and we could start out with an opt-in and modify its default value from the current infinity into a more reasonable default once we have data to back it up.
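An opt-in cap could slot into source selection by clamping the density the browser measures against. A minimal sketch, assuming a hypothetical attribute (spelled "maxresolution" here purely for illustration — nothing is spec'ed):

```javascript
// Sketch of how a hypothetical opt-in cap ("maxresolution", illustrative
// only) could clamp the density used during srcset source selection.
function effectiveDensity(devicePixelRatio, maxResolution = Infinity) {
  return Math.min(devicePixelRatio, maxResolution);
}

function pickCapped(candidates, slotCssPx, dpr, maxResolution) {
  const neededPx = slotCssPx * effectiveDensity(dpr, maxResolution);
  const sorted = [...candidates].sort((a, b) => a - b);
  return sorted.find(w => w >= neededPx) ?? sorted[sorted.length - 1];
}

const widths = [400, 768, 1024, 1568, 2304];

// Uncapped 3x phone at a 400 CSS px slot needs 1200px -> 1568w.
pickCapped(widths, 400, 3);    // 1568
// With a cap of 2 the same phone needs only 800px -> 1024w.
pickCapped(widths, 400, 3, 2); // 1024
```

A nice property of this shape is that the default can start at Infinity (today's behavior) and later move to a data-backed value without changing the attribute's contract.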

Regarding bandwidth based restrictions, it's possible that @tarunban has looked into that.

/cc @tabatkins @eeeps

yoavweiss avatar Mar 11 '19 05:03 yoavweiss

If memory serves, most browsers used to cap the selection process at 2x. If browsers are no longer specifically doing this, then I would agree that a markup solution would be nice so that developers could control the resolution caps themselves. I would suggest doing this via a meta value that applied to the whole page, rather than on an individual image basis, but I could see a use case for both options.

joemcgill avatar Mar 11 '19 14:03 joemcgill

The entire point of this feature is that browsers should be smart enough to be able to intelligently choose the "best" resolution to download; they have access to more information than the page author does, and whatever heuristics they come up with to intelligently pick between options get automatically applied to every page using this feature, rather than only being for whatever tiny % of people use a particularly good image-choosing JS library.

If browsers aren't making good decisions, that's a bug on browsers. There shouldn't be any need for the page to further intervene here. It's definitely plausible that browsers are being naive here, so please file bugs showing bad results. ^_^

tabatkins avatar Mar 11 '19 20:03 tabatkins

First of all: I agree with @tabatkins , the spec gives browsers great power here, and with that power comes all of the responsibility. The bugs belong with them.

While you're all here though... I barfed out some initial research and thoughts re: Hi-DPR’s diminishing quality returns. https://observablehq.com/@eeeps/visual-acuity-and-device-pixel-ratio

TL;DR @joemcgill I don't remember any browser limiting DPR to 2, but by the looks of it, maybe that's not such a terrible rule-of-thumb to start with.

eeeps avatar Mar 18 '19 22:03 eeeps

I too agree browsers should be doing this, but from what I see in the real world, they are not. Which gives us an impossible dilemma of either serving gigantic images for fancy 5x screens and making people on expensive data plans pay for image data they neither want nor need, or serving too-small images to all devices causing people on larger screens to question their eyesight.

How do we move forward in a constructive way here? Is the next step reaching out to browser manufacturers and asking them where bandwidth / dataplan throttling is in the pipeline?

Just to reiterate, this issue is currently causing 33% of the web to receive incorrectly sized images via WordPress (through no fault of WordPress). Finding a path toward a solution is non-trivial but matters to the overall UX of the web.

mor10 avatar Mar 18 '19 23:03 mor10

While you're all here though... I barfed out some initial research and thoughts re: Hi-DPR’s diminishing quality returns. https://observablehq.com/@eeeps/visual-acuity-and-device-pixel-ratio

@eeeps - That's amazing work. Thank you for that!! My read is that the mean 25YO can't see much more than 2x, but that probably means that some chunk of that population can. Is there more data/research on the distribution within that group?

@tabatkins - thoughts on the above and the conclusion? Should browsers use that to cut-off DPR calculations at ~2x? Higher?

yoavweiss avatar Mar 19 '19 10:03 yoavweiss

Is the next step reaching out to browser manufacturers and asking them where bandwidth / dataplan throttling is in the pipeline?

@mor10 - you already have 😺 As a browser implementor, the next step is to gather data on what the ideal cut-off is, which @eeeps has already started above. Once we reach conclusions on that research, I can probably send an intent to modify the current behavior.

Once Chrome ships this and proves that it's useful behavior, I suspect it won't take too long for other browser vendors to follow.

yoavweiss avatar Mar 19 '19 10:03 yoavweiss

As an aside, I do think that while a default should be spec'ed, that this should be overridden by the UA and/or markup.

I can imagine scenarios where the UA might desire a max-DPR that is non-standard and >2x. I think of TVs as an example. While mobile devices (phones, laptops, watches) have a generally accepted interaction distance, TVs have less standard uses. Home viewing is one thing, but dashboards in the office or campus, or even signage in a store or trade hall, have very different viewing and interaction distances. For this reason I could reasonably see a TV manufacturer wanting a 5x DPR to compensate for those viewing and interaction distances.

colinbendell avatar Mar 19 '19 12:03 colinbendell

As an aside, I do think that while a default should be spec'ed, that this should be overridden by the UA and/or markup.

I'm not sure we even want to specify a default, although we may add some data-backed recommendation of a good cut-off value, which browsers can then adopt as they see fit.

yoavweiss avatar Mar 19 '19 12:03 yoavweiss

@mor10 do you have any more detailed data on the “costs” side of things?

I ran your test page through Chrome dev tools on a 1024x768 px viewport at DPRs:

  • 1x: 839K
  • 2x: 1.4MB
  • 3x: 1.4MB

...which, 1.4MB for a page with three images on it is not great, but I feel like there's a better story here about the sort of widespread performance impact that WordPress is trying to avoid by limiting images to 1024w. What would happen to a typical WP site if it adopted the new Twenty Nineteen theme and all limits were removed, so that the largest srcset resources were the full, user-uploaded versions?

eeeps avatar Mar 19 '19 14:03 eeeps

I'll update the example with larger image sizes. The original was put together to demonstrate the negative effects of small images on wide screens.

mor10 avatar Mar 19 '19 15:03 mor10

@eeeps Here's a new post with an extended range of srcset options, most of which are wider than what WordPress now outputs. The new generated image sizes are based on this proposal which also holds the rationale for the size breakdown based on popular viewport widths.

For reference, my Pixel 3 in horizontal mode now pulls down the 2304px images for all three images in the example.

https://gutenberg.mor10.com/responsive-images-demo-extended-srcset-range/

mor10 avatar Mar 19 '19 18:03 mor10

I asked for this or a similar feature back in 2014 (an optimumdensity attribute, I think).

After it was shut down, I created a JS plugin for lazySizes.

@eeeps There is also a demo page which allows you to constrain pixel density and outputs the sizes. If you think this could be useful I can tweak the tool with other images and a wider range of density. Note: You must use a high density device to use this tool currently.

@yoavweiss It would also be nice to tweak the source selection algorithm with this, because the simple arithmetic middle isn't that good either. My algorithm looked bad because I'm bad at math, but an algorithm that tends to select the smaller image when both candidates are high-dpi is better, and of course it should tend to select the higher-dpi image when both are low-dpi.

aFarkas avatar Mar 20 '19 07:03 aFarkas

For reference, the WordPress core ticket to find a solution has been punted to a future release: https://core.trac.wordpress.org/ticket/45407#comment:33

mor10 avatar Mar 21 '19 18:03 mor10

Hi, has something changed in this topic?

lukasz-galka avatar Jan 05 '20 18:01 lukasz-galka

I can see "SrcsetMaxDensity" was committed here, but I can't understand whether it shipped or not.

@yoavweiss would you be able to send an update? Thanks

PS: of course using picture is an option, but it's much more verbose and I think it should be used only when the images have different ratios and other corner cases.

verlok avatar Mar 01 '21 11:03 verlok

I've never seen a good solution with <picture> and srcset-w in <source>s.

You could use <picture> with multiple <source>s with srcset-x, but you won't be able to deal with a fluid layout properly. It only works for multi-steps fixed layouts (as The Guardian for example).

What I usually do is reduce the quality of the image (increasing the compression) on high density screens, as with Compressive Images, to limit network data. But it's really really bad for memory usage on the device.

Here is some simplified pseudo-code for a full viewport width image:

<picture>
  <source
    media="(min-resolution: 3dppx)"
    sizes="100vw"
    srcset="
      img.jpg?w=300&q=15 300w,
      img.jpg?w=600&q=15 600w, …">
  <img
    sizes="100vw"
    srcset="
      img.jpg?w=300&q=auto 300w,
      img.jpg?w=600&q=auto 600w, …"
    alt="…">
</picture>

I use q=15 as a "low quality" parameter, and q=auto as the "best" computed quality for optimized weight and preserved visual quality. The actual syntax depends on the image CDN/service you use.

media="(min-resolution: 3dppx)" is "enough" here even if Safari doesn't support dppx because I chose to cap the quality for devices with density above 3dppx and there is no Apple device with a higher density (yet?).

If you want to limit quality above 2dppx (as it would be enough as shown in @eeeps 's study), you have to add min-device-pixel-ratio and it starts becoming more verbose.

So yes, a simple way to limit the maximum resolution the browser uses to compute the size of the required image would be awesome.

nhoizey avatar Mar 01 '21 17:03 nhoizey

I've never seen a good solution with `<picture>` and srcset-w in `<source>`s.

I wrote an article about it, How to cap image fidelity to 2x and save 45% image weight on high-end mobile phones, which uses some sources with x, some others with w.

That solution works well to cap image fidelity on smaller viewports (smartphones), and let the w descriptor do its job on larger viewports (tablets and computers).

The thing is, that solution is ~a nightmare~ quite difficult to design even for an experienced developer, it requires a lot of upfront thinking, and it's hell to maintain. Especially when you have to write server-side code that generates that picture tag.

That's why I think that a max-density attribute would be great to solve that problem with a single img tag.

verlok avatar Mar 02 '21 09:03 verlok

That's amazing data, thanks! /cc @kenjibaheux

yoavweiss avatar Mar 02 '21 09:03 yoavweiss

@verlok I never thought about using both x and w but for different viewports, well done! 👍

Depending on the target of the site, I would still consider feature phones with a 1dppx density, which would really benefit from smaller images. I use a Nokia 8110 4G, for example, with a 240px viewport and 167ppi density, to test some sites built for some clients.

As you say, it's quite difficult to generate such code, even more if the layout changes a lot on multiple viewport breakpoints.

We need max-density!

nhoizey avatar Mar 02 '21 09:03 nhoizey

max-density to the rescue! 💪

verlok avatar Mar 02 '21 20:03 verlok

max-density is cool, but I think we need to think about who gets to make max-useful-resolution decisions, and why...

Initial conversations around max-useful-resolutions were predicated on srcset selection being a browser decision – leveraging the srcset spec's flexibility to eliminate wasted bytes and pixels that almost no one (p90?) would ever see. Changes that affect every srcset loaded by the browser must necessarily be conservative and worst-case (boo) because they affect every single srcset load (yay!).

max-density makes this an author decision (what tradeoff do I want to make for this particular image?). This gives authors the ability to make content- and context- informed decisions using information that browsers don't have access to (yay) but requires authors to do work/not make mistakes, doesn't work for existing content, and probably will never work for a majority of content. (boo)

I strongly think there should be some kind of user preference here, too. Maybe that's just some spec language mandating that browsers can always override max-density (and then tie an override value to something like low-data mode). Author's context/content-dependent decisions are one thing -- end-user decisions about their own particular priorities (especially ones that crank the max down) should trump everything.

Long term, I want all of the above, structured according to the priority of constituencies:

  1. Browsers do their best to make very conservative gains here, for everyone, leveraging the flexibility afforded them by the srcset spec.
  2. Authors can override browsers' default values with some kind of attribute (max-density).
  3. Users via some preference or another (e.g. data-saver mode), have ultimate say, and can override author-provided maxes.
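The three tiers above can be sketched as a small cap-resolution function. Every identifier here is hypothetical — nothing is spec'ed — and the user's downward override is modeled as trumping everything, per the priority of constituencies:

```javascript
// Sketch of resolving an effective density cap from the three
// constituencies: browser default < author attribute < user preference.
// All names are hypothetical; nothing here is a spec'ed mechanism.
function effectiveCap({ browserDefault = Infinity, authorMax, userMax } = {}) {
  let cap = browserDefault;                      // 1. conservative browser default
  if (authorMax !== undefined) cap = authorMax;  // 2. author override (max-density)
  if (userMax !== undefined) cap = Math.min(cap, userMax); // 3. user has final say
  return cap;
}

// Author asks for up to 3x, but the user's data-saver mode caps at 1x.
effectiveCap({ browserDefault: 2, authorMax: 3, userMax: 1 }); // 1
// No author or user input: the browser default stands.
effectiveCap({ browserDefault: 2 }); // 2
```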

eeeps avatar Mar 03 '21 07:03 eeeps

@eeeps I thought we shouldn't introduce more attributes with a dash. Isn't maxdensity as the content attribute and maxDensity as the IDL attribute the way to go?

aFarkas avatar Mar 03 '21 08:03 aFarkas

I can see "SrcsetMaxDensity" was committed here, but I can't understand whether it shipped or not.

@yoavweiss would you be able to send an update? Thanks

It's landed behind a flag but not enabled by default. This could be the basis for an origin trial to test out that approach.

yoavweiss avatar Mar 03 '21 12:03 yoavweiss

This could be the basis for an origin trial to test out that approach.

I'd love to try that. When will it be ready to enable? :)

verlok avatar Mar 03 '21 13:03 verlok

You can try it out locally by running Chrome with the --enable-blink-features=SrcsetMaxDensity flag. I'm working on figuring out how we can enable this for an Origin Trial.

yoavweiss avatar Mar 03 '21 13:03 yoavweiss

You can try it out locally by running Chrome with the --enable-blink-features=SrcsetMaxDensity flag.

Hey @yoavweiss, I've tried that, both on Stable and Canary, I got this. What am I doing wrong?

[screenshot]

verlok avatar Mar 16 '21 17:03 verlok

Any news on this @yoavweiss?

verlok avatar Apr 13 '21 08:04 verlok

More generically, I think browsers should do more in automatically selecting smaller images from a srcset attribute when:

  • network conditions are poor
  • the device is high-end
  • user preference is to save data

What do you think?
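Some of those signals are already exposed to scripts via the Network Information API (`navigator.connection`, Chromium-only at the time of writing). A sketch of how a user agent — or a script today — might derive a density cap from them; the cap values are illustrative assumptions, not a proposed standard:

```javascript
// Sketch of deriving a density cap from device/network signals. The
// Network Information API (navigator.connection) is Chromium-only;
// the cap values below are illustrative, not a proposed standard.
function densityCapFor({ saveData = false, effectiveType = "4g", dpr = 1 } = {}) {
  if (saveData) return 1;                    // explicit user preference wins
  if (effectiveType === "slow-2g" || effectiveType === "2g") return 1;
  if (effectiveType === "3g") return 1.5;
  return Math.min(dpr, 2);                   // ~2x is plenty on good networks
}

// In a browser, one would feed it the real signals:
// const { saveData, effectiveType } = navigator.connection ?? {};
// const cap = densityCapFor({ saveData, effectiveType, dpr: devicePixelRatio });

densityCapFor({ saveData: true, dpr: 3 });      // 1
densityCapFor({ effectiveType: "3g", dpr: 3 }); // 1.5
densityCapFor({ effectiveType: "4g", dpr: 3 }); // 2
```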

verlok avatar Apr 13 '21 08:04 verlok