
Threshold enhancements - custom exit codes per threshold #680

Open
SaberStrat opened this issue Jun 22, 2018 · 4 comments
Labels
enhancement · evaluation needed (proposal needs to be validated or tested before fully implementing it in k6) · ux

Comments

@SaberStrat

Respect thresholds in setup/teardown code

The docs for thresholds say:

"A big difference between the init stage and setup/teardown stages is that you have the full k6 API available in the latter"

However, thresholds don't seem to work there: code that adds to a metric triggers a threshold correctly and prints the metric value in the results output when it runs in the default function (VU code), but moving the same code into the setup code leads to neither an abort on threshold failure nor the metric being printed in the output.

This might be intentional, in that custom metrics only get collected at the end of a VU iteration, but it still feels like this should be possible at the end of the setup/teardown function as well.
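A minimal reproduction of the reported behavior might look like the sketch below. It requires the k6 runtime to execute, and the metric and threshold names are illustrative:

```javascript
// Sketch: a Counter bumped in setup() vs. in the VU code (names are illustrative).
import { Counter } from 'k6/metrics';

const setupErrors = new Counter('setup_errors');

export let options = {
  thresholds: {
    // Expect the run to fail if any error is counted.
    setup_errors: ['count<1'],
  },
};

export function setup() {
  // As reported, adding to the metric here neither trips the
  // threshold nor shows the metric in the end-of-test output.
  setupErrors.add(1);
}

export default function () {
  // Moving the same call here makes the threshold behave as expected:
  // setupErrors.add(1);
}
```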

Custom exit code setting per threshold
When running k6 in a CI environment, I would like to be able to distinguish the type of test failure.

Currently there is either fail or no fail (non-zero exit code or exit code zero). I know I could add failure messages in the form of tags to the result data and then parse the data after the test, but since a dynamic exit code is already a thing, it would feel natural to be able to influence the value of the non-zero exit code.

With that I could label the test run appropriately in my CI tool - e.g. "p90 exceeded SLA" when a Trend threshold failed, or "error rate exceeded 10%" when a Rate threshold failed.

Something like:

export let options = {
  thresholds: {
    "error rate": [{
      threshold: "rate<0.1",
      abortOnFail: true,
      exitCodeOnFail: 2
    }]
  }
};

With that, when the error rate metric exceeds 0.1, the test will abort and k6 will exit with code 2.
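On the CI side, distinct exit codes could then be mapped to run labels. A sketch, assuming the hypothetical exitCodeOnFail: 2 above and the generic threshold-failure code 99 that k6 uses today:

```shell
#!/bin/sh
# Sketch: map a k6 exit code to a CI label.
# In a pipeline this would follow: k6 run load-test.js; status=$?
label_for_exit_code() {
  case "$1" in
    0) echo "test passed" ;;
    2) echo "error rate exceeded 10%" ;;          # hypothetical exitCodeOnFail: 2
    *) echo "test failed (exit code $1)" ;;       # e.g. 99, the generic threshold failure
  esac
}

label_for_exit_code 0
label_for_exit_code 2
label_for_exit_code 99
```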

@na-- (Member) commented Jun 22, 2018

I think that the first part of the issue ("Respect thresholds in setup/teardown code") happens because at the moment k6 doesn't emit any metrics from setup() and teardown(), which was recently noticed in another context and mentioned in this issue. That should soon be fixed by this pull request, but I'll add a unit test there that specifically tests for thresholds to be sure.

Regarding the exitCodeOnFail - I think it's a good idea and could be very useful in some CI use cases. We should probably restrict the allowed user-configurable exit codes to some specific range, so k6 can have "reserved" exit codes for non-threshold errors, but other than that, I don't see any issues with implementing it.

@na-- (Member) commented Jan 26, 2021

It might be a good idea to also allow 0 as an exit code for some thresholds, besides whatever range we reserve for them, as evidenced by #1778. Basically, a way to consider some thresholds only soft failures that don't abort the whole CI pipeline when they fail. They could still be useful, since users can now use the handleSummary() function (#1768) to take actions based on the failed thresholds.

edit: I renamed the issue, since the first part has already been released for a long time in k6 v0.22.0 (#678)
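The handleSummary()-based soft-failure idea could be sketched as follows, assuming the summary data shape where each metric carries a thresholds map with a per-threshold ok flag; the metric names and the mocked data object are illustrative:

```javascript
// Sketch: collect failed thresholds from k6 summary data, assuming
// data.metrics[name].thresholds[expression].ok marks pass/fail.
function failedThresholds(data) {
  const failed = [];
  for (const [metric, def] of Object.entries(data.metrics)) {
    for (const [expr, res] of Object.entries(def.thresholds || {})) {
      if (!res.ok) failed.push(`${metric}: ${expr}`);
    }
  }
  return failed;
}

// In a k6 script this would live in `export function handleSummary(data)`;
// here we exercise the helper with a mocked summary object.
const mockData = {
  metrics: {
    'error rate': { thresholds: { 'rate<0.1': { ok: false } } },
    http_req_duration: { thresholds: { 'p(90)<500': { ok: true } } },
  },
};

console.log(failedThresholds(mockData)); // logs the failing thresholds
```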

@na-- na-- changed the title Requests for threshold enhancements Custom exit codes per threshold Jan 26, 2021
@na-- na-- changed the title Custom exit codes per threshold Threshold enhancements - custom exit codes per threshold Jan 26, 2021
@na-- (Member) commented Jan 26, 2021

Now that I think about it, what should happen if multiple thresholds fail? We can't really exit with the exit code of the top threshold, since the thresholds are defined as an unordered map. Maybe after #1443 / #1441 are done we can do that, but if we implement this issue before them, we can only use the generic thresholdHaveFailedErrorCode we use now, i.e. 99.

edit: and the actual reserved ranges for user-specified exit codes should be reserved and documented as a part of #870 before this as well

@na-- na-- added the evaluation needed proposal needs to be validated or tested before fully implementing it in k6 label Jan 26, 2021
@na-- na-- modified the milestones: v1.0.0, TBD Nov 9, 2022
@codebien codebien removed this from the TBD milestone Sep 27, 2023
@codebien (Contributor) commented Nov 2, 2023

Hey @dgzlopes, it would be good to have your opinion here.
