
When is a lazy-loaded image "about to intersect the viewport" #5408

Closed
zcorpan opened this issue Mar 27, 2020 · 40 comments · Fixed by #5917

Comments

@zcorpan
Member

zcorpan commented Mar 27, 2020

If img's lazy loading attribute is in the Lazy state, img does not intersect the viewport, and img is not about to intersect the viewport, then return true.

https://html.spec.whatwg.org/multipage/images.html#updating-the-image-data:lazy-loading-attribute

An element is said to intersect the viewport when it is being rendered and its associated CSS layout box intersects the viewport.

Note: This specification does not define the precise timing for when the intersection is tested, but it is suggested that the timing match that of the Intersection Observer API. [INTERSECTIONOBSERVER]

https://html.spec.whatwg.org/multipage/rendering.html#intersect-the-viewport

When to start loading a lazy-loaded image is a key aspect of the feature, but the spec doesn't give advice beyond what is quoted above. Right now, different implementations do different things: Chromium starts loading early (I think currently 3000px to 8000px before entering the viewport, depending on effective network speed and latency), Gecko and WebKit start loading late (when at least 1px is visible). See https://www.ctrl.blog/entry/lazy-loading-viewports.html -- they argue that the implemented extremes are too early and too late; nobody has the goldilocks "just right" behavior, yet.

From my experiments, it seems Chromium only applies the "margin" for top-level page scrolling. For images that are in scrollable elements, or in iframes, the loading starts when the element is at least 1px visible. The spec doesn't differentiate between different cases of "about to become visible". The element scroll container case is common for image carousels.

See this demo: https://lazy-img-demo.glitch.me/

To view the same demo in an iframe: https://glitch.com/edit#!/lazy-img-demo - click "Show" and then "Next to The Code".

I'm curious what JS libraries that implement lazy-loaded images do. Have they iterated on this, and know something we could apply here?

Usually, details like this are left to the UA to optimize. However, I think it's important to get some consistency in implementations for web developers to be able to use the feature and know that browsers won't load all images anyway (because their scrollable area is smaller than the browser's lazy margins) and won't load images too late, resulting in users always seeing images load after they're within the viewport.

cc @domfarolino @bengreenstein @emilio @smfr @othermaciej @rwlbuis

@smfr

smfr commented Mar 27, 2020

What is "the viewport" as specified here? Layout viewport or visual viewport? Needs to reference the non-existent CSS Viewport spec (which I'm working on).

@bengreenstein
Contributor

bengreenstein commented Mar 27, 2020 via email

@pazguille

Hi! I think it should be configurable, like the Intersection Observer API. In this case, we could use vh instead of px.

@zcorpan
Member Author

zcorpan commented Mar 27, 2020

How would you configure it? Would you configure it differently for different situations? What should the default be?

@mikesherov

How would you configure it?

One way is to have attributes that mimic what the IntersectionObserver API provides. That's what a lot of JS-based lazy-loaders do.

I could also imagine a more configurable API that could be media-query-esque but respond to different effective connection speeds.

But at minimum, parity with IntersectionObserver would go a long way.
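As a rough illustration of what that parity could look like in userland (everything below is hypothetical: the data-src/data-root-margin attributes and the default margin are made up for this sketch, not an existing standard):

// Hypothetical sketch of a JS lazy-loader that forwards per-image attributes into
// IntersectionObserver options; attribute names and defaults are illustrative only.
function observeLazyImages(defaultRootMargin = '0px 0px 300px 0px') {
  document.querySelectorAll('img[data-src]').forEach((img) => {
    const io = new IntersectionObserver((entries, observer) => {
      for (const entry of entries) {
        if (entry.isIntersecting) {
          entry.target.src = entry.target.dataset.src; // start the real load
          observer.unobserve(entry.target);
        }
      }
    }, { rootMargin: img.dataset.rootMargin || defaultRootMargin });
    io.observe(img);
  });
}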

@zcorpan
Member Author

zcorpan commented Mar 28, 2020

Ok, but I meant, if it was possible to configure the margin, how big would you make it?

@mikesherov

Ok, but I meant, if it was possible to configure the margin, how big would you make it?

By default, I typically do one viewport height's distance.

Would you configure it differently for different situations?

If I was trying to be super smart about it, I'd factor in:

  1. The effective connection speed.
  2. The dimensions of the image (if possible to determine ahead of time).
  3. The average amount the user has scrolled thus far on the page.
  4. Whether the rest of the page has loaded yet, akin to requestIdleCallback.

Especially numbers 3 & 4. If the image is just below the fold on initial page load and the user hasn't scrolled at all yet, I'd want to load the image at the earlier of two events: the user begins to scroll, or window.onload has fired.
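A sketch of that heuristic (values and names are illustrative, not from any particular library): wait for the earlier of the first scroll or the window load event, then observe with a one-viewport-height margin.

// Illustrative sketch of the heuristic above; not any library's actual implementation.
function lazyLoadBelowTheFold(images) {
  let started = false;
  const start = () => {
    if (started) return;
    started = true;
    const io = new IntersectionObserver((entries, observer) => {
      for (const entry of entries) {
        if (entry.isIntersecting) {
          entry.target.src = entry.target.dataset.src;
          observer.unobserve(entry.target);
        }
      }
    }, { rootMargin: `${window.innerHeight}px 0px` }); // ~one viewport height of look-ahead
    images.forEach((img) => io.observe(img));
  };
  // The earlier of the two events: the user begins to scroll, or window load fires.
  window.addEventListener('scroll', start, { once: true, passive: true });
  window.addEventListener('load', start, { once: true });
}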

@othermaciej

Besides the above-mentioned things, another factor to include is scrolling speed: at the extreme, how fast the user is actually scrolling, but in a simpler form, how fast users typically scroll on a given device.

Just guessing, but I'd expect that users would usually scroll past a viewport height faster on a typical mobile device than on a typical desktop/laptop device.

I'm not sure whether waiting for the load event before loading any images below the fold is right. Often the slowest outliers in a page are ad frames, and users can read and scroll without waiting for those.

@mikesherov

Often the slowest outliers in a page are ad frames, and users can read and scroll without waiting for those.

Correct, that's why I said to wait for load OR scroll, whichever is earlier. But if ad frames are a sticking point, amend my earlier statement with window.load (net of subresources).

The reason to wait is that while users often do scroll before "above the fold" is completely loaded, it's less likely, and you don't want bandwidth contention between above-the-fold images, CSS, and JS vs. below-the-fold images.

@smfr

smfr commented Mar 30, 2020

@zcorpan
Member Author

zcorpan commented Apr 6, 2020

Research of JS libraries that do lazy-loading

Analysis summary

To do

I haven't yet looked at httparchive to see how web pages typically configure the rootMargin (or equivalent) when using these libraries. My hypothesis is that most use the default or whatever the examples suggest, i.e. 0-300px, but some use other values like 100vh. There won't be many that do something elaborate like different settings for different connection types or changing the margin in response to scrolling, etc.

Comments on Twitter

See this twitter thread: https://twitter.com/bocoup/status/1243580618811666432

A few points:

  • "There is nothing worse than when you fully loaded a page to read, went into no network zone (plane, subway, train), and then you scroll to see an image just started loading."
  • "I would like to add this to be considered as well: https://danluu.com/web-bloat/ " -- "I would say one needs to adjust the threshold depending on the connection speed. And in case it is too low, get back to an eager strategy so that people can leave their phone downloading the required assets to later pick it up and read the whole page without interruptions."
  • "We've been using about a viewport height away in triggering them, but are using importance soon too for things like hero images. Native lazy was often so large that the current implementation didn't trigger for most of our pages."
  • "For image carousels 3 in each direction, except on mobile Safari due to bugs and also except in chrome due to an RTL bug."

@zcorpan
Member Author

zcorpan commented Apr 9, 2020

I tried to figure out how commonly these libraries are used in httparchive. This was a bit tricky, and the actual usage might differ from this, but I hope it gives an indication.

Row  num     lib
1    162796  lazysizes
2    64042   lazy load
3    17938   blazy.js
4    13792   layzr
5    12622   jquery unveil
6    9722    lozad.js
7    5485    vue-lazyload.js
8    248     echo
9    51      react-lazyload

This roughly matches the number of stars on GitHub, though -- lazysizes is most common, followed by lazyload.

query
-- Query 1: extract which lazy-loading library (if any) each page's response body references.
-- The matching pages appear to be saved as `bocoup-2018.bocoup_httparchive.lazy_lib_pages`, used below.
SELECT * FROM (
  SELECT page, REGEXP_EXTRACT(body, r'(?i)(?:/\*[\!\*](?:\s+\*)?\s+|"\./|",|\(\'|\),)(lazysizes|lazy load|Vue-Lazyload\.js|lozad.js|layzr|data_bg_multi_hidpi|react-lazyload|jQuery Unveil|\[data-echo\], \[data-echo-background\]|hey, \[be\]Lazy\.js|yallLoad)') AS lib
  FROM `httparchive.response_bodies.2020_03_01_desktop`
) WHERE lib IS NOT NULL

-- Query 2: count pages per (lower-cased) library name.
SELECT COUNT(0) AS num, LOWER(lib) FROM `bocoup-2018.bocoup_httparchive.lazy_lib_pages`
GROUP BY LOWER(lib)
ORDER BY num DESC

@zcorpan
Member Author

zcorpan commented Apr 9, 2020

Looking only at pages that configure expand for lazysizes, the values they set it to are as follows (rounded to the nearest 100). 6917 pages do this (4.25% of pages using lazysizes).

Row  num   expand
1    3238  300
2    1299  1200
3    695   200
4    689   0
5    447   1000
6    348   100
7    91    500
8    47    400
9    38    800
10   13    700
11   4     1500
12   3     1300
13   2     600
14   2     2000
15   1     8000
query
-- Query 1: extract the configured lazySizesConfig.expand value for pages known to use lazysizes.
-- The results appear to be saved as `bocoup-2018.bocoup_httparchive.lazysizes_config_expand_pages`, used below.
SELECT * FROM (
  SELECT rb.url, REGEXP_EXTRACT(body, r'lazySizesConfig\.expand\s*=\s*(\d+)') AS expand
  FROM `httparchive.response_bodies.2020_03_01_desktop` AS rb
  JOIN `bocoup-2018.bocoup_httparchive.lazysizes_pages` AS p ON rb.url = p.page
) WHERE expand IS NOT NULL

-- Query 2: count pages per expand value, rounded to the nearest 100.
SELECT COUNT(0) AS num, CAST(CAST(expand AS FLOAT64) / 100 AS INT64) * 100 AS expand
FROM `bocoup-2018.bocoup_httparchive.lazysizes_config_expand_pages`
GROUP BY expand
ORDER BY num DESC

@zcorpan
Member Author

zcorpan commented Apr 9, 2020

lazysizes also allows setting expand on a per-image basis with the data-expand attribute. 59 pages do this (so pretty rare to do at all, 0.03% of pages using lazysizes):

Row page data-expand
1 http://www.coop.nl/ 20
2 http://www.jacques-tourtaux.com/ 600
3 https://adao.co.uk/ 0
4 https://advantus.mitiendanube.com/ 1000
5 https://axosis.mitiendanube.com/ 1000
6 https://bale.mitiendanube.com/ 1000
7 https://bearddesign.co/ 1000
8 https://bettiautopecas.com.br/ 1000
9 https://biancachandon.com/ 600
10 https://crane-brothers.com/ 1
11 https://creacours.com/ 200
12 https://franklinpetfood.com/ 300
13 https://isport.blesk.cz/ 10
14 https://kajal.mitiendanube.com/ 1000
15 https://lillet.de/ 1
16 https://loja.carcoating.com.br/ 1000
17 https://lojadancanope.com.br/ 1000
18 https://monmarche.geantcasino.fr/ 100
19 https://oppositehq.com/ 1000
20 https://sociedadedavirtude.lojavirtualnuvem.com.br/ 1000
21 https://themacindex.com/ 5
22 https://tofinoresortandmarina.com/ 0
23 https://www.ateliesilvinhaborges.com.br/ 1000
24 https://www.azazie.ca/ 1
25 https://www.azazie.com/ 1
26 https://www.babybam.com.ar/ 1000
27 https://www.balcoessobmedida.com.br/ 1000
28 https://www.cameraninja.com.br/ 1000
29 https://www.coop.nl/ 20
30 https://www.englishblinds.co.uk/ 100
31 https://www.fingerindustries.com.ar/ 1000
32 https://www.fusoseiki.co.jp/ 200
33 https://www.grupopignataro.com.ar/ 1000
34 https://www.hfiperformance.com.ar/ 1000
35 https://www.hiyacar.co.uk/ 50
36 https://www.inlinestore.com.br/ 1000
37 https://www.its.de/ 10
38 https://www.jahnreisen.de/ 10
39 https://www.laquintaresort.com/ 20
40 https://www.laquintaresort.com/ 20
41 https://www.laquintaresort.com/ 20
42 https://www.laquintaresort.com/ 20
43 https://www.mca.ie/ 600
44 https://www.monk.ca/ 300
45 https://www.motorespesados.com/ 1000
46 https://www.netflights.com/ 10
47 https://www.newhomesguide.com/ 225
48 https://www.rakkau.com.br/ 1000
49 https://www.rileychildrens.org/ 1
50 https://www.tiendabarista.com.co/ 1000
51 https://www.tracom.co.jp/ 100
52 https://www.tudotranquilo.com.br/ 1000
53 https://www.tui.dk/ 1
54 https://www.tui.fi/ 1
55 https://www.tui.no/ 1
56 https://www.tui.se/ 1
57 https://www.vaporever.com.ar/ 1000
58 https://www.vineyardvines.com/ 0
59 https://xiaobox.com.br/ 1000

@zcorpan
Member Author

zcorpan commented Apr 9, 2020

OK, so, what can we conclude?

  • Most JS libraries don't do anything fancy, just have a static rootMargin that can be configured
  • LazySizes seems to be the most common library, and it has more involved logic with automatically shrinking and growing expand based on idleness and scrolling, I think. The default expand can be configured and set per image.
  • ~4% of pages using lazysizes configure the expand, ~0% set data-expand per image.

I think the browser is usually in a better position to determine when to load images based on the user's connectivity and scrolling pattern and such. But this should be in the same ballpark as what web developers are doing, and should be consistent between browsers, so that web developers want to use the native feature over JS libraries. Ideally the behavior should be smart enough that users aren't annoyed by seeing images start loading after they scroll (which JS libraries often fail at, as far as I can tell).

I think the browser should have some margin also for images in element scroll containers (for image carousels) and iframes, not just the top-level page scrolling.

In some situations the web developer is in a better position to predict when it's a good time to load an image (because the page might be driving scrolling, e.g. image carousel). There is an API already for "please load this image now", though -- set loading = "eager". Providing a way to override the browser's lazy logic with a per-document or per-image rootMargin seems like it could regress the user experience, if we assume that the browser managed to implement something better than a static rootMargin.

@mikesherov

@zcorpan what amazing research you've done here. Indeed, lazysizes has the lion's share of usage here, but I'd hesitate to draw any conclusions about its default configuration being a signal to constrain what browsers make available to developers.

What makes lazysizes so awesome is:

Intelligent prefetch/Intelligent resource prioritization: lazysizes prefetches/preloads near the view assets to improve user experience, but only while the browser network is idling (see also expand, expFactor and loadMode options). This way in view elements are loaded faster and near of view images are preloaded lazily before they come into view.

But what makes it even more awesome is the expand attribute. Even though only 4% of sites using lazysizes use it, it feels critical to be able to give the browser additional info as to how sensitive lazy-loading should be, and it feels critical to me that you get that control without JS. To me, it seems there really isn't a perfect one-size-fits-all here, no matter how smart the browser default is, and devs will continue to use IntersectionObserver instead (or no lazy-loading at all!) when the default fails them.
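For reference, tuning lazySizes with the options mentioned above looks roughly like this (the values are illustrative, not the library's defaults; see the lazySizes docs for exact semantics):

// Illustrative lazySizes configuration; values are examples only.
window.lazySizesConfig = window.lazySizesConfig || {};
window.lazySizesConfig.expand = 300;     // base expansion ("about to intersect") distance in px
window.lazySizesConfig.expFactor = 1.7;  // how far expand may grow while the network is idle
window.lazySizesConfig.loadMode = 2;     // how eagerly "near view" elements are included
// Per-image override via the data-expand attribute:
//   <img data-src="product.jpg" data-expand="600" class="lazyload" alt="">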

@aFarkas

aFarkas commented Apr 10, 2020

@zcorpan asked me whether I can describe the rationale and the functionality of lazySizes' flexible expand feature.

The rationale for this feature is that lazy-loaded elements that are not currently inside the viewport should not consume network bandwidth while in-viewport elements are still loading. In the end it should give you a better UX. On one hand we preload things before the user can see them, so the user doesn't have to wait; on the other hand, as soon as the user sees something that needs to load, we don't preload, because that would divide the bandwidth across currently unneeded elements.

I can describe some mechanics because they might be interesting for some implementation ideas.

  1. lazySizes has a flexible margin value: shrink: 0, default: Math.min(calc(vh/vw - 1px), 600)
  2. lazySizes has three different visible checks: a) normal page viewport, b) scroll container viewport, if inside one (your carousel use case) and c) visibility hidden

Depending on the loading state of the document and how many lazy elements are currently loading, lazySizes switches between those visibility checks and expand values. For example, until the page has loaded and been scrolled (you had the same idea with ad frames as me), we use the shrink expand and do all visibility checks. After that we switch between them based on how many elements are currently loading. So first we start with the most conservative check (0 margin + all visibility checks); after that, if we have no currently loading elements, we expand our search.
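Condensing that description into pseudocode (this is only my reading of the mechanics above, not lazySizes' actual code):

// Sketch of the adaptive expand described above; thresholds and the formula are
// simplifications of what the comment describes, not the library's real logic.
function currentExpandPx({ pageLoaded, userHasScrolled, imagesCurrentlyLoading }) {
  const shrinkExpand = 0;                                       // most conservative margin
  const defaultExpand = Math.min(window.innerHeight - 1, 600);  // roughly "vh - 1px, capped at 600"
  // Until the page has loaded and been scrolled, stay conservative and run all checks.
  if (!pageLoaded && !userHasScrolled) return shrinkExpand;
  // While elements are still downloading, don't widen the search; expand it when idle.
  return imagesCurrentlyLoading > 0 ? shrinkExpand : defaultExpand;
}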

About scroll speed:
If a user scrolls extremely fast it is always impossible to get it right and find the sweet spot of preloading the right amount so the user doesn't see any image loading. Also, if the user scrolls faster than the viewport height, it means they want to jump somewhere without seeing the middle part. You should not preload for this use case. What you can do instead is try not to load so many elements in the middle. lazySizes has a queue in front of the browser's download queue, which makes sure that if there are more than 6/8 elements loading, all checks are idling.

About the scroll container check:
I would argue that 99% of carousels have a width of 100vw so it is in most situations aligned to the page viewport.

I'm currently on nicotine detox so I really have difficulty concentrating, sorry for that.

@ozcoder

ozcoder commented Apr 11, 2020

Sounds like we need more attributes to make it more configurable and cover more use cases, but with sensible defaults that different browser vendors reach a consensus about.

Warning assumptions ahead.
A lot of ecommerce and news sites have a footer image that is never seen. Lazy-loading this (and, in practice, never loading this unseen image) would be best. I doubt it would affect SEO. There would be other cases like this too where we don't want the browser to download the image at all unless it's going to be seen soon. DataSaver is a factor too.

Other times we might want an image to download eventually. Sometimes we might even change our mind after DOMContentLoaded and want to set an attribute saying that this image, which was flagged to download only if about to be seen, should now also download after onload, and the lazy-load thread would notice this flag.

Product images below the fold might have SEO juice and so would want to be loaded after the onLoad event, although I am not sure about this. The bots would still have the image URL, alt text, title text etc.

Responsive images can further complicate matters on whether the designer needs the image to be there to hold the layout together, although they shouldn't be doing this.

Will the picture element have loading="lazy" for each of its different media query attributes, or just for the tag itself?

Nowadays Microsoft has thousands of HTML/CSS tests. Has anyone heard from them about their defaults, or are they only using what Chromium provides?

Sorry for the rambling, I just want this to be really useful in different cases.

@zcorpan
Member Author

zcorpan commented Apr 11, 2020

@aFarkas , thank you, that is very useful!

If a user scrolls extremely fast it is always impossible to get it right and find the sweet spot of preloading the right amount so the user doesn't see any image loading. Also, if the user scrolls faster than the viewport height, it means they want to jump somewhere without seeing the middle part. You should not preload for this use case. What you can do instead is try not to load so many elements in the middle. lazySizes has a queue in front of the browser's download queue, which makes sure that if there are more than 6/8 elements loading, all checks are idling.

So I think there are two common cases for fast scrolling on touch devices:

  • the user flicks and waits for the scrolling to stop by itself
  • the user flicks and then stops the scrolling soon after to quickly scroll some desired amount

For the first case, I think the browser already knows where the scroll position will end up, and could start loading those images as soon as the scrolling momentum is known. For the second case, it seems a bit harder to get right.

On desktop browsers (without touch), the scrolling patterns are probably different. If the user uses the scrollbar thumb to quickly scroll somewhere, there is no scrolling momentum to predict the final scroll position.

@zcorpan
Member Author

zcorpan commented Apr 11, 2020

Sounds like we need more attributes to make it more configurable and cover more use cases,

I'm not convinced of this. I think we should improve the defaults first, and then see what the remaining problems are (if any).

but with sensible defaults that different browser vendors reach a consensus about.

Yes. 🙂

Warning assumptions ahead.
A lot of ecommerce and news sites have a footer image that is never seen. Lazy-loading this (and, in practice, never loading this unseen image) would be best. I doubt it would affect SEO. There would be other cases like this too where we don't want the browser to download the image at all unless it's going to be seen soon. DataSaver is a factor too.

Could they remove the entire footer?

Other times we might want an image to download eventually. Sometimes we might even change our mind after DOMContentLoaded and want to set an attribute saying that this image, which was flagged to download only if about to be seen, should now also download after onload, and the lazy-load thread would notice this flag.

When would you want to do this? Do you have a URL where this is done today?

Product images below the fold might have SEO juice and so would want to be loaded after the onLoad event, although I am not sure about this. The bots would still have the image URL, alt text, title text etc.

I think this doesn't change anything for this issue.

Responsive images can further complicate matters on whether the designer needs the image to be there to hold the layout together, although they shouldn't be doing this.

You can set the right aspect ratio for the image with the width and height attributes on img. There is still an open issue for when different sources have different aspect ratios, though: #4968

Will the picture element have loading="lazy" for each of its different media query attributes, or just for the tag itself?

The loading attribute on img applies to all sources in the picture.

Nowadays Microsoft has thousands of HTML/CSS tests.

Which tests do you mean?

Has anyone heard from them about their defaults, or are they only using what Chromium provides?

I assume the latter for this case.

@ozcoder

ozcoder commented Apr 11, 2020

Warning assumptions ahead.
A lot of ecommerce and news sites have a footer image that is never seen. Lazy-loading this (and, in practice, never loading this unseen image) would be best. I doubt it would affect SEO. There would be other cases like this too where we don't want the browser to download the image at all unless it's going to be seen soon. DataSaver is a factor too.

Could they remove the entire footer?

There are other things, such as links in the footer that some people want to see and will scroll to the bottom for. I was just giving an example of an image that most of the time wouldn't need to be downloaded, but would be if it is going to be seen soon.

Other times we might want an image to download eventually. Sometimes we might even change our mind after DOMContentLoaded and want to set an attribute saying that this image, which was flagged to download only if about to be seen, should now also download after onload, and the lazy-load thread would notice this flag.

When would you want to do this? Do you have a URL where this is done today?

I'm not sure, just another scenario I thought of. Maybe something like this: some user interaction would cause the browser to scroll an element that is way down the page into view, while nearby is an image that was set to lazy-load if it's going to be seen soon (and would be loaded if the user manually scrolled down there), but because of some earlier interactions you are now confident that the scroll-into-view is likely to happen, and so you want the image to download after onload as a sort of preload. Pretty contrived example, probably not worth worrying about, and I don't have any URL examples.

Nowadays Microsoft has thousands of HTML/CSS tests.

Which tests do you mean?

It's so long ago, I can't remember any real details. I think when they were working on IE8 and trying to be better with standards, mostly CSS. Around the ACID3 era I think. Some people there developed lots of tests to check they were following standards and they found some issues with the descriptions/explanations of some of the standards and in doing so help make them better. I have never worked for Microsoft, so I don't know any internal details.

@zcorpan
Member Author

zcorpan commented Apr 12, 2020

I was just giving an example of an image that most of the time wouldn't need to be downloaded, but would be if it is going to be seen soon.

Ok, then I think a normal loading=lazy should handle this case.

I'm not sure, just another scenario I thought of. Maybe something like this: some user interaction would cause the browser to scroll an element that is way down the page into view, while nearby is an image that was set to lazy-load if it's going to be seen soon (and would be loaded if the user manually scrolled down there), but because of some earlier interactions you are now confident that the scroll-into-view is likely to happen, and so you want the image to download after onload as a sort of preload.

You can tell the image to load by changing the loading attribute to eager.
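For example (the selector here is illustrative):

// Switch a deferred image from lazy to eager so the browser starts fetching it now.
document.querySelector('img#carousel-next').loading = 'eager';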

As for tests, ok. We'll write new tests for this issue in https://github.com/web-platform-tests/wpt when we change the spec. 🙂

Cc @gregwhitworth for any input from MS.

@rik

rik commented Apr 12, 2020

Regarding footers, I think the web platform is missing lazy CSS images

@rwlbuis

rwlbuis commented Apr 14, 2020

For WebKit the current approach is to use compositor information (https://bugs.webkit.org/show_bug.cgi?id=203557).

On my 15" macbook pro this typically gives values around 1800px on my test page (https://mathiasbynens.be/demo/img-loading-lazy) and on iPhone ES (simulator) around 800px.

@zcorpan
Member Author

zcorpan commented Apr 16, 2020

Thanks, @rwlbuis . Can you give a summary of the approach taken in your patch, and rationale?

@aFarkas

aFarkas commented Apr 20, 2020

I must reiterate this: no matter whether you have a fixed "margin" of 100px, 300px, 800px or 1800px, a flexible/adaptive value is always much more powerful.

Think of the default situation: during the onload phase you have two images in view, but due to your extended margin value of 100-1800px you are loading, for example, 6 images in parallel. Those 4 unnecessary image downloads are literally cutting the bandwidth in half. Of course, as soon as those two images are loaded you can start to preload those 4 images.

Also, in earlier versions of lazysizes I had much higher extended margin values than now, and a lot of developers were complaining about it (partially because they did not understand how the adaptive margin speeds up in-view images compared to out-of-view images). By cutting it down to a maximum of innerHeight - 1 / innerWidth - 1, most complaints went away.

@smfr

smfr commented Apr 20, 2020

I agree that this needs to be specified more precisely, rather than just saying "it's based on something implemented in WebKit". WebKit changes the compositor coverage for scrollables based on scrolling velocity, in ways that could change in future. I don't think web-facing behavior should be built on top of it (sorry, I did suggest it initially, but now think that was a mistake).

@addyosmani

addyosmani commented Apr 26, 2020

When to start loading a lazy-loaded image is a key aspect of the feature, but the spec doesn't give advice beyond what is quoted above.

Hey folks. I wanted to provide some background for how we arrived on the current thresholds in Chromium in case it helps with alignment on the question "when should we consider an image is about to intersect with the viewport".

Scroll speed: We believe how fast users typically scroll on a given device matters (perhaps similar to @othermaciej's point?).

We attempted to optimize for perceived performance by setting conservative thresholds we believed would minimize how often users would quickly scroll down to an image that has not yet loaded - ideally, you shouldn't be staring at some blank pixels.

Part of this is to work around a platform limitation: you cannot easily configure a placeholder for a natively lazy-loaded image without using JavaScript. JavaScript lazy-loaders often have more flexibility here. It's often possible to, say, use a generic placeholder image, LQIP, SQIP, etc., but the platform doesn't exactly solve for this. We can reserve dimensions for the image, maybe even set some UA-specific background-color, but nothing as close (yet) to what's possible in userland.

Network quality: As captured in our implementation, we adjust thresholds based on the user's effective connection type.

Given how widely Chromium is used in regions where network quality can be highly variable, we wanted to balance giving users on a fast connection different thresholds (i.e. load more images on 4G) while keeping in mind quality and data-plan costs, and loading less if you're on, say, slow 2G/3G.

Now, I personally believe Chromium's current thresholds are different enough from what users get by default with libraries like lazySizes that they can sometimes come across as unintuitive. Like @mikesherov, I often configure my JS lazy-loading libraries to use one viewport height's distance for rootMargin. The data savings here can be significant (e.g. ~40-50%). In contrast, Chromium's current thresholds might get you ~10-15%.

it feels critical to be able to give the browser additional info as to how sensitive lazy-loading should be

+1. I would separate this out into two questions: what should the defaults be, and what should the API surface for supporting configuration be?

Fwiw, I would personally love to give developers control over lazy-loading sensitivity, whether this is done in a preset manner (e.g. loading=very-lazy) or via a model that follows IntersectionObserver and provides very granular customization.

If I was throwing longer-term questions and ideas out there...

  • How would we feel about bringing back <img lowsrc> or an <img placeholder> attribute to address the "...avoid users who scroll too fast looking at empty pixels" problem?
  • How much do other vendors care about the empty pixels problem? Is it actually OK to not worry about users scrolling too fast, and punt on solving this until we can do placeholders or modern image formats can help us address the platform gap? My two cents: we probably need to factor scroll speed into any decisions about the "when" of intersection and what thresholds we use.

@rik

rik commented Apr 27, 2020

Think of the default situation: during the onload phase you have two images in view, but due to your extended margin value of 100-1800px you are loading, for example, 6 images in parallel. Those 4 unnecessary image downloads are literally cutting the bandwidth in half. Of course, as soon as those two images are loaded you can start to preload those 4 images.

I can echo that feedback. Here's a scenario I'm seeing on a website I currently maintain (and I believe this is a common pattern):

<link rel="stylesheet" href="stylesheet.css">
<!-- In viewport -->
<div style="background-image: url(hero.jpg)">
</div>

<!-- Below viewport --> 
<img loading="lazy" src="product1.jpg" alt="">
<img loading="lazy" src="product2.jpg" alt="">
<img loading="lazy" src="product3.jpg" alt="">
<img loading="lazy" src="product4.jpg" alt="">

Browsers will start to download the stylesheet and the product images. Once the stylesheet is downloaded and layout has been performed, hero.jpg will start downloading, but it is now competing for bandwidth with images that are irrelevant at the moment. During the initial load, Firefox's current behaviour is the one I prefer.

@smfr

smfr commented Apr 27, 2020

How would we feel about bringing back <img lowsrc> or an <img placeholder> attribute to address the "...avoid users who scroll too fast looking at empty pixels" problem?

Would prefer a <picture> element solution, preferably with some styleability based on state.

How much do other vendors care about the empty pixels problem?

We do somewhat, but existing JS solutions show empty pixels often enough that maybe having defaults that match them is good enough. Aggressive fetching seems worse than empty pixels.

I do think that giving authors some customizability of lazy loading would be reasonable, but I'm not sure what that would look like declaratively. Maybe it would be OK for authors who want something more than the default behavior to fall back to Intersection Observer.

@zcorpan
Member Author

zcorpan commented May 4, 2020

To keep this on track, I'd like to scope this issue to getting consistency in the behavior for the feature as-is. New API for placeholder image or customizing the thresholds should be separate issues.

Cases to consider

Scrolling vertically & horizontally for:

  • Top-level viewport
  • Element scroll container scrolling
  • Nested browsing context scrolling i.e. iframe

Input to the decision model

The things that an implementation could use as input for the decision (a rough sketch combining them follows after this list):

  • typical scrolling speed on the current device - how far is the user likely to scroll in a short amount of time
  • actual current scrolling speed / momentum - where is the user likely to end up with the current scroll
  • current network quality - a fast, low latency connection can be more lazy (smaller threshold)
  • current network saturation - images not currently in view can be lower priority than images currently in view (assuming no current fast scroll)
  • data saving mode - if the user has indicated that they wish to save data, be more lazy (smaller threshold)
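As a purely illustrative sketch of how those inputs might be combined (names, values, and the overall shape are assumptions, not any browser's actual heuristics):

// Hypothetical combination of the inputs above into a load-ahead margin, in pixels.
// Nothing here reflects a real implementation; it only shows how the signals could interact.
function lazyLoadMarginPx({ typicalScrollPxPerSec, currentScrollPxPerSec, effectiveType, saveData }) {
  // How far is the user likely to scroll in the next second or two?
  const expectedScrollPx = Math.max(typicalScrollPxPerSec, currentScrollPxPerSec) * 1.5;
  // Slower connections need more lead time (larger threshold)...
  const networkFactor = { 'slow-2g': 3, '2g': 2.5, '3g': 1.5, '4g': 1 }[effectiveType] ?? 1;
  // ...while data-saving mode asks to be more lazy (smaller threshold).
  const saveDataFactor = saveData ? 0.5 : 1;
  return Math.round(expectedScrollPx * networkFactor * saveDataFactor);
}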

Not inputs to the decision model

  • The dimensions of the image - I'm not aware of any implementations or experiments with this
  • The loading state of the document - though lazySizes implements this, I'm not yet convinced that the loading strategy for lazy images should be different before and after the document has reached a certain loading state.

Privacy

The implemented behavior should not expose information about the user that the page doesn't already have access to otherwise. For example, if the implementation doesn't expose battery levels, the battery level should not be an input to the model. The "typical scrolling speed on the current device" shouldn't be so precise as to help fingerprint a user.

Issues

@zcorpan
Member Author

zcorpan commented May 5, 2020

Scrolling vertically & horizontally for:

  • Element scroll container scrolling

So, I'm not sure how this would work. In particular, for the image carousel use case, using only the implicit root (which I think browser implementations do now) would mean that there is no threshold for the element scroll container case, so those images would only start loading after they are partially in view.

There is likely a performance hit to observing all scrollable elements when lazy images are used. Is there a good way to make it "do what I want" without adding more API surface? Or is the web developer explicitly setting the root the best way to solve this? Edit: filed w3c/IntersectionObserver#431
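For the "web developer explicitly sets the root" option, the userland version would look roughly like this (element names and the margin are illustrative):

// Sketch: an IntersectionObserver whose root is the carousel's scroll container,
// so rootMargin applies to scrolling inside it. Names/values are illustrative only.
const carousel = document.querySelector('.carousel');
const io = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      entry.target.src = entry.target.dataset.src;
      observer.unobserve(entry.target);
    }
  }
}, { root: carousel, rootMargin: '0px 600px' }); // 600px of horizontal look-ahead
carousel.querySelectorAll('img[data-src]').forEach((img) => io.observe(img));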

  • Nested browsing context scrolling i.e. iframe

Should lazy images in iframes use the implicit root, or the images' node document as the intersection root? The former takes away the rootMargin if the origins aren't similar-origin, per IntersectionObserver spec.

Edit: in #5510 we've set root to the implicit root (rather than the image's node document).

@mikkorantalainen

It seems that Chrome currently has logic (for the top-level document) that if a lazy image is within ~3000px of the visible viewport, it starts loading it. Firefox starts to load it once it should already be rendering the first row or column of pixels within the image. Clearly Firefox's logic is always going to cause visibly delayed rendering. On the other hand, Chrome will often load the whole page.

How about keeping track of a preload margin per site instead? Maybe start with 500px, but keep a log of how many pixels of extra margin you had at the time the image was fully loaded; if you had more than, say, 50px extra margin, reduce the margin. If you had less than 50px extra margin, increase the margin. How much to change the margin at once? I'd suggest trying to target the 50px extra margin and doing a binary search towards it. For example, start loading an image by default when it's closer than 500px to the viewport. Once that image is fully loaded and the user is still 400px from the image, you can compute that the image took "100px" worth of loading time and your preload margin should be closer to 150px (including the 50px extra margin above). Split the difference and use 0.5*(500px - 150px) = 175px as the new safety margin.

This would result in a pretty fast-converging algorithm needing only one integer value of memory per site. I think one value per site is required because different sites have such huge variance in loading speed. Being logically a binary search, it should be able to quickly adjust to scrolling speed changes even within a single infinite-scroll page.

The extra margin above is needed to combat the issue that different images will have different byte sizes even if the pixel dimensions are the same. With suitable tuning, the above algorithm should be pretty good at getting the images loaded just in time, unless the user changes scrolling speed very rapidly.
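One way to read that proposal in code (a sketch only; the update rule is one interpretation of the "split the difference" step, and all numbers are taken from the comment or are illustrative):

// Sketch of a per-site adaptive preload margin, as proposed above. Not an implemented
// browser algorithm; the update rule is one possible reading of "binary search towards it".
let marginPx = 500;          // starting preload margin for the site
const targetExtraPx = 50;    // desired leftover distance when an image finishes loading

// Called when a lazy image finishes loading, with its remaining distance to the viewport.
function updateMargin(distanceLeftPx) {
  const consumedPx = marginPx - distanceLeftPx;           // scroll distance "used up" while loading
  const idealMarginPx = Math.max(0, consumedPx + targetExtraPx);
  marginPx = Math.round((marginPx + idealMarginPx) / 2);  // step halfway toward the ideal value
}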

@mikkorantalainen

If the user is currently scrolling fast, it might be sensible to load only one lazy-loaded image at a time, to be able to skip more images if the user scrolls so fast that all images cannot be loaded in any case. That should reduce the latency to start loading the visible images once the user slows down enough.

@zcorpan
Member Author

zcorpan commented May 8, 2020

Thanks @mikkorantalainen, that sounds like an interesting approach. It's difficult to evaluate how well it would work in practice without an experimental implementation. It seems to me, though, that it may end up with too small a margin if the user scrolls slowly for a while and then scrolls quickly, for example. If we'd like images to be available when users quickly scroll a screen length or so, I think the implementation needs to work from the assumption that the user can do so at any time.

@scott-little

scott-little commented Jun 12, 2020

Sorry for the late reply here, I'd just like to add some more explanation for Chrome's choices of thresholds to what zcorpan and Addy mentioned earlier.

Chrome currently uses relatively conservative thresholds, as other folks have mentioned above - typically 3000px on a fast network, and larger thresholds on slower networks (since the images are expected to need longer to load in). These current thresholds used for loading=lazy are the same ones that were developed for the Automatic LazyLoad behavior that Android Chrome users who've turned on Lite Mode will see, which attempts to lazily load page content where suitable (even if it's not marked loading=lazy) in order to reduce data usage and speed up critical content.

The main regression metric that we've focused on in Chrome for these thresholds is what we're calling image visible load time, which measures how long an image is in the viewport before it finishes loading. The goal was to choose thresholds large enough that we can minimize visible load time regressions, such that typically the user experience would match what they'd see without lazy load, plus the data savings and speedups of critical content.

The initial thresholds are purposely overly conservative, since that way the user experience errs more on the side of matching what users would see without any lazy loading.

I am experimenting with more aggressive thresholds (1250px on 4G-speed networks) that get some additional data savings without any significant regressions in visible load time. I'm hoping to launch these more aggressive thresholds for Chrome soon.

I've also experimented with even more aggressive thresholds (750px on 4G-speed networks), but at that point the visible load time regressions start to become more noticeable. I've also experimented with using less conservative thresholds for slow networks (e.g. 2G-speed networks), but from the data so far, it looks like there isn't much room to get more aggressive for slow networks.

@zcorpan
Member Author

zcorpan commented Sep 15, 2020

I've implemented my earlier comment in #5917 except not including this:

current network saturation - images not currently in view can be lower priority than images currently in view (assuming no current fast scroll)

This has to do with fetch priority, and I think is a bit orthogonal to this issue. Image priority could depend on in-viewportness regardless of the loading attribute. But also the ideal logic for this could be counter-intuitive: if the user scrolls while 3 images that are currently in view are still loading, and the "next screen" will have 3 other images that will be in view when the scroll is done, it would be better to have the new images be fetched with higher priority.

@zcorpan
Member Author

zcorpan commented Oct 29, 2020

This was discussed a few days ago in the WHATWG TPAC breakout session. Minutes at https://www.w3.org/2020/10/26-whatwg-minutes.html#lazy

zcorpan: want to discuss different approaches between browsers with regard to when they're going to load the image. Specifically, the rootMargin on the IntersectionObserver. E.g. Firefox uses 0 rootMargin. Chromium uses a network-dependent rootMargin, 1250px to 8000px.

zcorpan: Open questions: 1. Are people happy with the Chromium behavior? 2. There are suggestions in the HTML Standard about what information to consider.

emilio: Firefox update: currently shipping 0 margin default (but user-configurable). Actively looking into updated strategies with the performance team for better defaults. Developer feedback is that they like the control JS lazy-loading gives them. Maybe a different topic, but worth discussing… would the value be global, or per image, or what?

vmpstr: IntersectionObserver doesn't apply to nested scrollers. Any way to deal with that?

zcorpan: yes, this is an open issue with the IntersectionObserver spec. No non-hacky way to really ground this on IntersectionObserver. It'd be ideal for an IO to opt in to specifying a rootMargin that applies to all scrollable containers. w3c/IntersectionObserver#431

domenic: a bit surprised we're using IO as the basis given this and many other mismatches.

emilio: agreed, but it's getting better.

zcorpan: browsers do use IO to implement lazyloading, so probably worth keeping this layering and resolving the IO issues.

fantasai: Authors might want to adjust the rootMargin based on their guesses as to the user’s scrolling behavior, but it seems more likely that they need to adjust the timing due to differences in resource sizes. So maybe providing hints as to the size of each resource would be more useful more of the time (and would avoid interfering with user prefs or UA smarts as to scrolling behavior and network speed/latency).

emilio: need more data on what authors need, what they’re doing now

zcorpan: some JS libraries allow per-image customization. Most seem to have small rootMargin values (but that often results in the images not being loaded by the time they're seen).

@zcorpan
Member Author

zcorpan commented Oct 29, 2020

There will be another TPAC breakout session tomorrow (30 October 14:00–15:00 UTC) to discuss changes to IntersectionObserver to better support lazy-loading use cases.

https://www.w3.org/2020/10/TPAC/breakout-schedule.html#intersectionobserver
