
Fix(?): flaky TestSentReceivedMetrics test #1149

Merged (3 commits into master, Sep 11, 2019)

Conversation

@imiric (Contributor) commented Sep 9, 2019

This _should_ fix the flaky `TestSentReceivedMetrics` test.
See https://circleci.com/gh/loadimpact/k6/6474, https://circleci.com/gh/loadimpact/k6/6484 .

The issue was introduced in a63bb58 (PR #1138): the previous implementation of `getWebsocketEchoHandler()` closed the connection after writing, and this change brings that behavior back.

It seems that closing the connection generates additional metrics which `TestSentReceivedMetrics` _sometimes_ takes into account, hence the flakiness.

Please run `go test -count=1 ./core` on this branch a few times on your local machine to confirm it's fixed. :)

There are a few things here unrelated to the fix, but check out the individual commits, and let me know if you'd prefer another PR for them. I just added them here since they're kind of related.

```diff
 }
 if noReuseReceived < reuseReceived {
-	t.Errorf("noReuseReceived=%f is greater than reuseReceived=%f", noReuseReceived, reuseReceived)
+	t.Errorf("reuseReceived=%f is greater than noReuseReceived=%f", reuseReceived, noReuseReceived)
```

@imiric (author) commented:

This is now correct, right? 😄

The error message noReuseReceived=28325.000000 is greater than reuseReceived=28329.000000 seemed wrong to me :)

@codecov-io commented Sep 9, 2019

Codecov Report

Merging #1149 into master will decrease coverage by 0.01%.
The diff coverage is 55.55%.


```diff
@@            Coverage Diff             @@
##           master    #1149      +/-   ##
==========================================
- Coverage   73.61%   73.59%   -0.02%
==========================================
  Files         144      144
  Lines       10527    10529       +2
==========================================
  Hits         7749     7749
- Misses       2321     2322       +1
- Partials      457      458       +1
```

| Impacted Files | Coverage Δ |
| --- | --- |
| lib/testutils/httpmultibin/httpmultibin.go | 92.53% <55.55%> (-1.41%) ⬇️ |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data.
Last update a2077dd...ae564fd.

@cuonglm (Contributor) commented Sep 9, 2019

> Please run `go test -count=1 ./core` on this branch a few times on your local machine to confirm it's fixed. :)

Why not `-race`?

@imiric (author) commented Sep 9, 2019

> Why not `-race`?

Go ahead. I was just testing with `-count=1 ./core` since it was the quickest way to reproduce the issue (and to avoid caching issues when switching branches), but `-race` or running the full suite should work as well. You just need to wait longer for the occasional failure :)

@imiric imiric requested a review from cuonglm September 11, 2019 07:49
@cuonglm previously approved these changes Sep 11, 2019
@cuonglm cuonglm mentioned this pull request Sep 11, 2019
lib/testutils/httpmultibin/httpmultibin.go (outdated review comment, resolved)
This _should_ fix flaky test `TestSentReceivedMetrics`.
See https://circleci.com/gh/loadimpact/k6/6474, https://circleci.com/gh/loadimpact/k6/6484 .

This issue was introduced in a63bb58 (PR #1138), where the previous
implementation of `getWebsocketEchoHandler()` closed the connection
after writing, so this change brings that back.

It seems that closing the connection creates additional metrics which
`TestSentReceivedMetrics` takes into account _sometimes_, hence the
flakiness.

From discussions with @na--, we agreed to remove the persistent
connection implementation of a63bb58 (the `for` loop here) since that
matches the previous version, but we'll create a new WS test that
tests multiple message passing, since none of our current tests seem
to do this.
@na-- (Member) left a comment:

LGTM, just don't forget to add a new v0.26.0 issue for the multi-message WS test like we talked on slack.

@imiric imiric merged commit 1f913f8 into master Sep 11, 2019
@imiric imiric deleted the fix/flaky-metrics-test branch September 11, 2019 15:28
@na-- na-- added this to the v0.26.0 milestone Sep 13, 2019
@mstoykov (Contributor) left a comment:

Post-factum: I was pretty sure there was a failing test that looked like it sends multiple messages, but after going through all the tests, while some seem like they do it... they don't :(
They just loop multiple times without sending anything, or if they do send something they expect an error. So probably something else was wrong and I fixed it along the way ...

```diff
@@ -667,7 +667,7 @@ func TestRunTags(t *testing.T) {
 	})

 	group("websockets", function() {
-		var response = ws.connect("wss://HTTPSBIN_IP:HTTPSBIN_PORT/ws-echo", params, function (socket) {
+		var response = ws.connect("WSBIN_URL/ws-echo", params, function (socket) {
```

This should've been `WSSBIN_URL`, although ... apparently it doesn't matter :(

5 participants