core(byte-efficiency): replace pessimistic graph with optimistic #15651
Conversation
@@ -4457,7 +4457,7 @@
     "warnings": [],
     "metricSavings": {
       "FCP": 0,
-      "LCP": 300
+      "LCP": 610
Now this is surprising. LCP savings should be going down, not up.
I did some investigation using `PageDependencyGraph.printGraph`; here are the results for the `uses-text-compression` LCP graphs (you may want to paste these into a different viewer to see them better):
######## OPTIMISTIC
====== | http://localhost:10200/dobetterweb/dbw_tester.html
= | cpu
===== | http://localhost:10200/dobetterweb/dbw_tester.css?delay=100
===== | http://localhost:10200/dobetterweb/unknown404.css?delay=200
================== | http://localhost:10200/dobetterweb/dbw_tester.css?delay=2200
===== | http://localhost:10200/dobetterweb/dbw_disabled.css?delay=200&isdisabled
======================== | http://localhost:10200/dobetterweb/dbw_tester.css?delay=3000&capped
================ | http://localhost:10200/dobetterweb/dbw_tester.css?delay=2000&async=true
============================ | http://localhost:10200/dobetterweb/dbw_tester.css?delay=3000&async=true
========= | http://localhost:10200/dobetterweb/dbw_tester.js
========= | http://localhost:10200/dobetterweb/empty_module.js?delay=500
= | cpu
================================================ | http://localhost:10200/dobetterweb/fcp-delayer.js?delay=5000
============ | http://localhost:10200/dobetterweb/third_party/aggressive-promise-polyfill.js
====== | http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js
=========== | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg?isr1
= | cpu
=========== | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg?isr2
=========== | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg?isr3
=========== | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg
= | cpu
========== | cpu
======= | http://localhost:10200/dobetterweb/dbw_tester.css?scriptActivated&delay=200
========= | http://localhost:10200/dobetterweb/dbw_tester.html
============================= | http://localhost:10200/dobetterweb/lighthouse-1024x680.jpg?lcp&redirect=lighthouse-1024x680.jpg%3Fre
======= | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg?iar2
== | cpu
===== | http://localhost:10200/dobetterweb/empty.css
================ | http://localhost:10200/dobetterweb/lighthouse-1024x680.jpg?redirected-lcp
######## PESSIMISTIC
====== | http://localhost:10200/dobetterweb/dbw_tester.html
= | cpu
===== | http://localhost:10200/dobetterweb/dbw_tester.css?delay=100
===== | http://localhost:10200/dobetterweb/unknown404.css?delay=200
================= | http://localhost:10200/dobetterweb/dbw_tester.css?delay=2200
===== | http://localhost:10200/dobetterweb/dbw_disabled.css?delay=200&isdisabled
======================== | http://localhost:10200/dobetterweb/dbw_tester.css?delay=3000&capped
================ | http://localhost:10200/dobetterweb/dbw_tester.css?delay=2000&async=true
============================ | http://localhost:10200/dobetterweb/dbw_tester.css?delay=3000&async=true
========= | http://localhost:10200/dobetterweb/dbw_tester.js
========= | http://localhost:10200/dobetterweb/empty_module.js?delay=500
= | cpu
=============================================== | http://localhost:10200/dobetterweb/fcp-delayer.js?delay=5000
============ | http://localhost:10200/dobetterweb/third_party/aggressive-promise-polyfill.js
====== | http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js
====== | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg?iar1
=========== | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg?isr1
= | cpu
=========== | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg?isr2
=========== | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg?isr3
=========== | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg
================================================= | http://localhost:10200/dobetterweb/lighthouse-rotating.gif
= | cpu
========= | cpu
======= | http://localhost:10200/dobetterweb/dbw_tester.css?scriptActivated&delay=200
========= | http://localhost:10200/dobetterweb/dbw_tester.html
============================= | http://localhost:10200/dobetterweb/lighthouse-1024x680.jpg?lcp&redirect=lighthouse-1024x680.jpg%3Fre
======= | http://localhost:10200/dobetterweb/lighthouse-480x318.jpg?iar2
== | blob:http://localhost:10200/3f2bc9df-684b-4541-837c-1590152ef65d
== | cpu
===== | http://localhost:10200/dobetterweb/empty.css
= | filesystem:http://localhost:10200/temporary/empty-0.43448333410631723.png
================ | http://localhost:10200/dobetterweb/lighthouse-1024x680.jpg?redirected-lcp
COMPRESSABLE: http://localhost:10200/dobetterweb/dbw_tester.html -- SAVE % 66.14231358964166
COMPRESSABLE: http://localhost:10200/dobetterweb/third_party/aggressive-promise-polyfill.js -- SAVE % 80.95526868808166
COMPRESSABLE: http://localhost:10200/dobetterweb/dbw_tester.html -- SAVE % 66.14231358964166
The most important difference I spotted is that `lighthouse-rotating.gif` is missing from the optimistic graph, and yet it is the last node to finish. This means the byte savings on the second `dbw_tester.html` shouldn't impact the pessimistic graph simulation, because the second `dbw_tester.html` is downloaded in parallel with `lighthouse-rotating.gif`. This just happens to be the way the requests downloaded, and therefore we can't rely on the optimistic graph providing a smaller savings estimate than the pessimistic graph for LCP.
That being said, both optimistic and pessimistic LCP graphs cast very wide nets when deciding which network requests to include and that might just make the results kinda noisy. It might be worth looking into a new optimistic LCP definition that focuses on the specific LCP request.
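To make the parallelism point concrete, here is a toy sketch (all durations are made up) of why shrinking a request that runs alongside the longest node changes nothing in the pessimistic graph but does shorten the optimistic one:

```js
// Toy numbers only: tail branch durations (ms) for each graph.
// The pessimistic graph still contains lighthouse-rotating.gif, the
// longest-running node; the optimistic graph does not.
const pessimisticTail = {rotatingGif: 4900, secondHtml: 900};
const optimisticTail = {secondHtml: 900};

// Compressing the second dbw_tester.html shrinks its branch from 900 to 300.
const compressedHtml = 300;

// The pessimistic estimate is still bounded by the gif, so the savings are ~0.
const pessimisticSavings =
  Math.max(pessimisticTail.rotatingGif, pessimisticTail.secondHtml) -
  Math.max(pessimisticTail.rotatingGif, compressedHtml); // 0

// The optimistic graph has no gif, so the full reduction shows up as savings.
const optimisticSavings = optimisticTail.secondHtml - compressedHtml; // 600
```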
Took a look at only a handful of sites; it seems like pessimistic savings being lower than optimistic savings only happens occasionally, and solely for LCP.
We could:
1. use the min of pessimistic & optimistic, or
2. just run with and rely on what optimistic gives us, even if it might give us more savings.

I'm open to both, though leaning toward (1) to prioritize these audits less when sorting; a minimal sketch of both options follows.
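Purely as an illustration (variable names match the diff further down; the numbers loosely echo the 300 ms vs 610 ms expectation change above and are not real measurements):

```js
// Illustrative sketch only: assume both per-graph LCP savings estimates
// (in milliseconds) have already been computed by the simulator.
const lcpOptimisticGraphSavings = 610;
const lcpPessimisticGraphSavings = 300;

// Option (1): take the min of the two estimates, biasing toward the
// smaller, more conservative number.
const lcpSavingsOption1 = Math.min(lcpOptimisticGraphSavings, lcpPessimisticGraphSavings); // 300

// Option (2): trust the optimistic graph alone, even when it reports
// the larger savings.
const lcpSavingsOption2 = lcpOptimisticGraphSavings; // 610
```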
> It might be worth looking into a new optimistic LCP definition that focuses on the specific LCP request.

I agree, worth taking note of this to dive into later.
Adam convinced me to go with (2). Let's leave a comment in there about this weirdness.
  results,
  pessimisticLCPGraph,
  simulator,
  {providedWastedBytesByUrl: result.wastedBytesByUrl, label: 'lcp'}
);

// Use the min savings between the two graphs for LCP metricSavings.
const lcpMinGraphSavings = Math.min(lcpOptimisticGraphSavings, lcpPessimisticGraphSavings);
Thinking about this a little more, I think it was a mistake for our modus operandi to be "use the smallest estimate for savings possible". We should still use the optimistic graph because it is less noisy and focuses on the requests that definitely impacted LCP (at least in principle), but I don't think we need to add this arbitrary `Math.min` constraint just to get the lowest estimate. If a handful of edge cases have a higher savings estimate as a result, that's fine with me but...

As I said in my previous comment, I think we should explore an optimistic LCP graph that is modeled on the LCP critical path, which should further reduce the noise in our LCP savings estimate here. As an initial thought, the optimistic graph could include just the LCP resource and any render-blocking requests.
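As a rough sketch of that idea (hypothetical helper and request fields, not Lighthouse's actual graph API), the filter would keep only the LCP resource plus any render-blocking requests:

```js
/**
 * Hypothetical sketch, not Lighthouse's real API: narrow a flat list of
 * request records down to an "LCP critical path" set consisting of the
 * LCP resource itself plus any render-blocking requests.
 */
function filterToLcpCriticalRequests(requests, lcpUrl) {
  return requests.filter(request =>
    request.url === lcpUrl || request.isRenderBlocking
  );
}

// Example with made-up records mirroring the test page above.
const requests = [
  {url: '/dobetterweb/dbw_tester.css?delay=100', isRenderBlocking: true},
  {url: '/dobetterweb/lighthouse-rotating.gif', isRenderBlocking: false},
  {url: '/dobetterweb/lighthouse-1024x680.jpg?redirected-lcp', isRenderBlocking: false},
];
const critical = filterToLcpCriticalRequests(
  requests,
  '/dobetterweb/lighthouse-1024x680.jpg?redirected-lcp'
);
// -> keeps the render-blocking stylesheet and the LCP image; drops the gif.
```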
"scoreDisplayMode": "metricSavings", | ||
"numericValue": 3450, | ||
"numericUnit": "millisecond", | ||
"displayValue": "Potential savings of 666 KiB", | ||
"metricSavings": { | ||
"FCP": 0, | ||
"LCP": 3300 | ||
"LCP": 0 |
This is a good change. Byte savings on the large gif are good, but it isn't the LCP resource, so its LCP impact is superficial.
notes: go/primoprio-follow-up