Emit an errors metric and have a default error rate script abort threshold #877
All exceptions during an iteration will add to the `errors` metric. And ideally, we should improve the UX of `fail()`. A script like

```javascript
import { fail } from "k6";

export default function () {
  fail("test");
}
```

will produce an error like this: …
whereas something like …
@lukecusolito, sorry for the late response, I somehow missed your question until I just now looked at this issue for another reason 😞 To answer your question, a single check or a … As shown in the script example from #735 (comment), you can sort of implement this even now, with a few lines of code, using custom metrics and thresholds (a sketch follows below). And using the same approach of custom metrics and thresholds, you can also implement a script abort if only a single check or iteration fails, if that's what your use case is.

This issue is focused on actually emitting these iteration errors as a metric, and having some sensible default threshold that automatically aborts the test if too many iterations result in an error. After all, if 50% of your iterations are interrupted by an error, your load test probably isn't very meaningful...
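A minimal sketch of that custom-metrics-and-thresholds workaround, assuming a hypothetical `iteration_errors` metric name and an illustrative 10% abort threshold (neither is something k6 ships by default):

```javascript
import { Rate } from "k6/metrics";

// Hypothetical name; any custom Rate metric works here.
const iterationErrors = new Rate("iteration_errors");

export const options = {
  thresholds: {
    // Abort the whole test run once more than 10% of iterations error out.
    iteration_errors: [{ threshold: "rate<0.1", abortOnFail: true }],
  },
};

export default function () {
  try {
    // ... the actual test logic goes here ...
    iterationErrors.add(false); // iteration finished cleanly
  } catch (e) {
    iterationErrors.add(true); // count the interrupted iteration
    throw e; // re-throw so k6 still reports the error
  }
}
```

To abort on a single failed check or iteration instead, the same pattern works with the threshold tightened to `rate<=0`.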
Thinking a bit more about this, and considering that #1007 added a …
Another reason for that is that it seems good UX for us to update the UI counter for interrupted iterations (i.e. the …). Finally, regarding thresholds... I'm not sure if we should add a default one, considering there might not be a way for users to overwrite it or remove it currently? This needs to be checked, but regardless, if we even want the possibility of adding a sensible threshold based on … Unfortunately, that means k6 will emit more noise for each iteration (both …). So, I'm not sure if we should have a …
I like the idea of being able to differentiate between the different reasons k6 has interrupted an iteration. I did want to see if we can just add it on top of the current Iterations metric ... but …
I also think that Rate is a better fit, at least given the current shortcomings of the threshold API ...
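To illustrate why a Rate fits the current threshold API better than a Counter (an illustration with assumed metric names, not anything decided in this thread): a Rate threshold stays meaningful no matter how long the test runs, while a Counter threshold is an absolute cap that has to be re-tuned whenever the duration or VU count changes.

```javascript
import { Rate } from "k6/metrics";

// Assumed custom metric, as in the sketch further up.
const iterationErrors = new Rate("iteration_errors");

export const options = {
  thresholds: {
    // Rate-based: "fewer than 5% of iterations may error", duration-independent.
    iteration_errors: ["rate<0.05"],
    // Counter-based alternative: an absolute cap that only makes sense for one
    // particular test duration and VU count.
    // iteration_error_count: ["count<100"],
  },
};

export default function () {
  iterationErrors.add(false); // placeholder iteration body
}
```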
Even though it will be a minor breaking change, since we don't emit …
Somewhat connected issue that can probably be implemented at the same time as this one: #1250
Hey, I started exploring this issue, but it's a bit difficult to gather precise requirements because of the amount of related issues and discussions. So far I'm mostly working based on this comment.

One question so far: #1007 also introduced … This is not a metric, but an internal counter incremented by specific executors, and is shown in the test run UI (…). At first glance it seems that the proposed …
This is somewhat true, but the devil, as always, is in the details... The … It'd be difficult to drop … And while currently …
The easiest way to detect failed iterations would be in …
They don't really overlap all that much. A better name for …
As I mentioned above, unless I am very much mistaken, you should be able to update both the atomic counter and emit the metric from a single place across all executors, the …
This avoids outputting "GoError" as mentioned in #877 (comment)
It would be very useful if we resurrect the currently unused `errors` metric and also add a default threshold that would abort the script if the error rate exceeds a certain configurable value. This issue is a summary of an idea that has been discussed in various other issues and PRs before: …
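As a rough sketch of how that could look from a user's perspective (the built-in `errors` metric, its Rate semantics, the 50% default, and the override syntax are all assumptions about the proposal, not existing k6 behavior):

```javascript
export const options = {
  thresholds: {
    // Hypothetical built-in default, overridable or removable per test:
    // abort the run once half of all iterations have ended in an error.
    errors: [{ threshold: "rate<0.5", abortOnFail: true }],
  },
};
```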