
Lighthouse V10.0.2 Not As Good As V9.6.8 #14902

Closed
Generosus opened this issue Mar 16, 2023 · 7 comments

Comments

@Generosus

Good Day!

After performing multiple tests (no changes made to our websites), we noticed PSI scores obtained via Lighthouse V10.0.2 are not as good as those obtained with V9.6.8.

Metrics mostly impacted are: LCP and TBT.

We also noticed PSI scores vary tremendously between runs. Results are not consistent.

Can anyone comment on this? Any explanations? Could there be a bug in V10.0.2?

Thank you :)

@adamraine
Member

Can you please provide a URL that can reproduce this?

@Generosus
Author

Generosus commented Mar 16, 2023

Absolutely. Click here.

When we run Lighthouse within our browser (V9.6.8), scores are good and results are consistent. When we run Lighthouse externally through PSI (V10.0.2), the opposite occurs.

Recommend at least 6 test runs per case to see the difference. Also, if you don't mind, please test other sites as well.

Thank you :)

@adamraine
Member

There were changes to scoring in v10.0 (see the release notes), but I don't think they would have affected your page specifically.

I'm not able to reproduce a major downgrade in performance caused by updating to v10.0.2. For example, I'm consistently getting performance scores around 95 with v10.

Reducing score variance in PSI remains an active area of work, but this could also be normal page speed variance. Please take a look at our variability docs, which go into more detail on variance mitigations.
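A common mitigation for run-to-run variance is to run Lighthouse several times and compare the median rather than any single run. Below is a minimal sketch of that idea, assuming the `lighthouse` CLI (installed via npm) and Chrome are available locally; the URL and run count are placeholders, not details from this report.

```python
# Minimal sketch: run the Lighthouse CLI several times against one URL and
# report the median performance score and TBT, a common way to smooth out
# run-to-run variance. Assumes `lighthouse` (npm i -g lighthouse) and Chrome
# are installed; the URL below is a placeholder.
import json
import statistics
import subprocess

URL = "https://example.com"  # placeholder
RUNS = 6

scores, tbts = [], []
for i in range(RUNS):
    result = subprocess.run(
        [
            "lighthouse", URL,
            "--only-categories=performance",
            "--output=json",
            "--output-path=stdout",
            "--chrome-flags=--headless",
            "--quiet",
        ],
        capture_output=True, text=True, check=True,
    )
    lhr = json.loads(result.stdout)
    scores.append(lhr["categories"]["performance"]["score"] * 100)
    tbts.append(lhr["audits"]["total-blocking-time"]["numericValue"])
    print(f"run {i + 1}: score={scores[-1]:.0f}, TBT={tbts[-1]:.0f} ms")

print(f"median score: {statistics.median(scores):.0f}")
print(f"median TBT:   {statistics.median(tbts):.0f} ms")
```

Comparing medians across versions (rather than single runs) makes it much easier to tell a real regression from ordinary noise.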

@devtools-bot

Thanks! Appreciate you filing this bug. 👏

This is a known issue, most well described in #10657. So, we'll automatically close this as a duplicate.

🤖 Beep beep boop.

@Generosus
Author

Generosus commented Mar 16, 2023

Hey @adamraine,

Thanks for checking. At our end, this is what we get in one of our sample test runs: Performance Score: 83 (average), TBT: 620 ms (fail).

Since there are many variables to consider, more work is probably needed to pin down the culprit(s).

Hey @devtools-bot,

Thank you. It's comforting to know we're not the only ones who noticed this.

Concerning issue #10657, is there a Google document that provides a good set of instructions for lay people to ensure they're testing websites in a consistent manner?

For one, we believe we're testing in a consistent manner (i.e., simply running PSI several times within the hour, with no changes to our environment), but we still get different results with V10.0.2, as reported above. With V9.6.8, we don't have this issue.

Thanks for your time and assistance.
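For repeatedly querying PSI itself in a scripted, consistent way, a minimal sketch follows, assuming the public PageSpeed Insights v5 API; the URL and run count are placeholders, and an API key is only needed for higher request volumes.

```python
# Minimal sketch: hit the PageSpeed Insights v5 API several times for the
# same URL and compare the spread of the Lighthouse performance scores it
# returns. The URL and run count are placeholders.
import json
import statistics
import urllib.parse
import urllib.request

ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
URL = "https://example.com"  # placeholder
RUNS = 6

scores = []
for i in range(RUNS):
    query = urllib.parse.urlencode(
        {"url": URL, "strategy": "mobile", "category": "PERFORMANCE"}
    )
    with urllib.request.urlopen(f"{ENDPOINT}?{query}") as resp:
        data = json.load(resp)
    score = data["lighthouseResult"]["categories"]["performance"]["score"] * 100
    scores.append(score)
    print(f"run {i + 1}: performance score = {score:.0f}")

print(f"min/median/max: {min(scores):.0f} / "
      f"{statistics.median(scores):.0f} / {max(scores):.0f}")
```

Logging the min/median/max across a batch of runs gives a concrete picture of the variance being described, which is more useful for comparison than any individual score.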

@Generosus
Author

Hey @adamraine,

Thanks for sharing the info. We'll dive into it next week.

Yeah, I can see where different testing environments can yield different results. A well-vetted filter or "smart gimbal", though, could help produce consistent results (with a margin of error not to exceed 5-10%). Something to think about. Future project?

Hey @devtools-bot,

Your call, but I'm not sure why you closed this ticket so soon without giving the public a chance to read it and comment on it, especially since we're dealing with a fairly new PSI version (10.0.2).

Any chance you can reopen it, or cross-reference it to issue #6708 (still open)?

Thank you :)
