tests(smoke): make oopif-requests assertions more specific #14100

Merged · 10 commits into master from oopifreqs-asssertion · Jun 22, 2022

Conversation

paulirish (Member):

fixes #14099

We currently just assert that there are 70+ requests. I've seen CI failures for this with 69, 67, and 41 requests. Here's a diff of the "correct" run vs one of the failing ones: https://diff.googleplex.com/#key=AfquQTaPCU4B

In the cases where it fails, it's either missing subresources from paulirish.com or items from the youtube iframe (images and JS, generally). These missing requests don't indicate (to me) any oopif instrumentation issue. I suspect the failures just indicate slow outbound network on GHA's runners (I guess).

This assertion feels stronger to me. Yeah?
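
For context, a rough sketch of the two assertion styles being discussed, following the expectation shape quoted later in this thread. The exact contents of the oopif-requests expectations file aren't reproduced in this conversation, so the property names (length, _includes) and the specific entries below are illustrative assumptions, not the merged code.

// Count-only style: passes as long as "enough" requests were recorded.
'network-requests': {
  details: {
    items: {
      length: '>70', // assumed form of the original count assertion
    },
  },
},

// Specific-request style: each listed request must be present, finished,
// and have the expected status code and resource type.
'network-requests': {
  details: {
    items: {
      _includes: [ // assumed smokehouse "array contains" matcher
        {url: 'https://www.youtube.com/embed/NZelrwd_iRs', finished: true, statusCode: 200, resourceType: 'Document'},
        {url: /https:\/\/www\.youtube\.com\/.*?\/embed.js/, finished: true, statusCode: 200, resourceType: 'Script'},
      ],
    },
  },
},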

@paulirish paulirish requested a review from a team as a code owner June 8, 2022 01:43
@paulirish paulirish requested review from brendankenny and removed request for a team June 8, 2022 01:43
adamraine (Member):

> These missing requests don't indicate (to me) any oopif instrumentation issue. I suspect the failures just indicate slow outbound network on GHA's runners.

This hasn't been the case when it comes to our other, more reproducible oopif test failures (perf-diagnostics-third-party and oopif-scripts). In those cases there definitely is an oopif instrumentation problem because we verified that the network requests happened, and LH just didn't catch them. For this reason, I don't want to assume that the condition is too strict.

In fact, given how infrequent oopif-requests failures are, I might prefer making the condition more strict if the current failures are indeed caused by an instrumentation failure.

paulirish (Member, Author):

@adamraine, when you say "strict", are you referring to the >70 requests assertion? (Because to me, it is, but also it isn't.)

I'm very fine with asserting more than what I put in this PR. But I do feel like we should have a test page that we can put extremely firm assertions on (i.e., one that doesn't depend on YouTube and Disqus asset URLs).

adamraine (Member):

Yeah, I mean having more concrete assertions, not necessarily increasing the >70 resource count.

// FYI: Youtube has a ServiceWorker which sometimes cancels the document request. As a result, there will sometimes be multiple requests for this file.
{url: 'https://www.youtube.com/embed/NZelrwd_iRs', finished: true, statusCode: 200, resourceType: 'Document'},
// Subresource of youtube embed
{url: /https:\/\/www\.youtube\.com\/.*?\/embed.js/, finished: true, statusCode: 200, resourceType: 'Script'},
Collaborator:

why have all the finished/statusCode/resourceType stuff?

paulirish (Member, Author):

generally: why not?

specifically:

finished tells us it ... finished. While working on this, I was seeing cases where, in an iframe, we didn't get the loadingFinished event for that document.

However, a request that gets Network.loadingFailed is also marked as finished, hence the check of statusCode; it's -1 in bad cases.

And resourceType is there to basically verify it was transferred the way we expect. I don't expect it to catch much, but I don't see a reason not to include it.
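
To make the bad case concrete: a request in the OOPIF that receives Network.loadingFailed still shows up as finished, but with statusCode -1, so the specific assertion flags it. The entry below is illustrative (it reuses the URL from the snippet above), not taken from an actual failing run.

// Hypothetical network-requests item for a failed OOPIF document request:
// finished is true even though loading failed, and statusCode is -1.
{url: 'https://www.youtube.com/embed/NZelrwd_iRs', finished: true, statusCode: -1, resourceType: 'Document'},
// The expectation of {finished: true, statusCode: 200} above would fail on this entry,
// pointing at the exact request that went wrong instead of an off-by-N count.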

@@ -47,11 +47,20 @@ const expectations = {
'network-requests': {
details: {
items: {
adamraine (Member):

Can you add the first non-document request for each iframe as well? This would be a check to ensure we start listening for OOPIF requests in time. For me, those were:

  • paulirish.com: https://www.googletagmanager.com/gtag/js?id=G-PGXNGYWP8E
  • YT embed: https://www.youtube.com/s/player/966d033c/www-player.css

Collaborator:

gtag: OK.
yt: that's gonna go stale real quick; just check for any CSS?

adamraine (Member):

> that's gonna go stale real quick; just check for any CSS?

Maybe look for https://www.youtube.com/s/player/*/www-player.css. I think looking for this specific CSS file is important; it was frequently one of the "missing" requests over in perf-diagnostics-third-party, which also has a YT embed.

paulirish (Member, Author):

@adamraine added.
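
For reference, entries along these lines would cover the two first-subresource checks suggested above. The exact lines added to the PR aren't shown in this thread, so the regex and resourceType values below are an illustrative guess that follows the format of the earlier snippet.

// First non-document request from the paulirish.com frame:
{url: 'https://www.googletagmanager.com/gtag/js?id=G-PGXNGYWP8E', finished: true, statusCode: 200, resourceType: 'Script'},
// First non-document request from the YouTube embed; the player hash changes between
// releases, so match it with a pattern rather than a fixed URL:
{url: /https:\/\/www\.youtube\.com\/s\/player\/.*?\/www-player\.css/, finished: true, statusCode: 200, resourceType: 'Stylesheet'},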

adamraine (Member):

This is good, I think oopif-requests is much less flaky now.

The problem is that it's still failing, so we might need to disable this smoke test and add it to the backlog.

@connorjclark connorjclark changed the title tests(smoke): update oopif-requests assertion tests(smoke): update oopif-requests assertion to be more specific Jun 13, 2022
@connorjclark connorjclark changed the title tests(smoke): update oopif-requests assertion to be more specific tests(smoke): make oopif-requests assertions more specific Jun 13, 2022
@connorjclark connorjclark requested review from connorjclark and removed request for brendankenny June 22, 2022 17:53
@connorjclark connorjclark merged commit b7efa1d into master Jun 22, 2022
@connorjclark connorjclark deleted the oopifreqs-asssertion branch June 22, 2022 17:54
Linked issue: FR oopif-requests is flaky