
Sub-metrics without values are rendered below wrong main metric #2518

Closed
efdknittlfrank opened this issue May 6, 2022 · 8 comments · Fixed by #2519
efdknittlfrank commented May 6, 2022

Brief summary

This one is only a minor bug/annoyance (a separate question is: should you have metrics without values at all? It can happen …)

Now that #2512 is fixed (thank you very much!), metrics and thresholds work a bit differently in 0.38.1.

One small problem is that thresholds for sub-metrics whose main metric never had any values recorded are rendered below a different main metric.

k6 version

0.38.1

OS

Windows and Docker

Docker version and image (if applicable)

0.38.1

Steps to reproduce the problem

A simple script to reproduce the bug:

import { Counter } from 'k6/metrics';

const counter1 = new Counter("one");
const counter2 = new Counter("two");

export const options = {
	thresholds: {
		'one{tag:xyz}': [],
	},
};

export default function() {
	console.log('not submitting metric1');
	counter2.add(42);
}

Expected behaviour

Two possible implementations come to mind:

1. Render (main-)metrics, even without recorded values

     data_received........: 0 B 0 B/s
     data_sent............: 0 B 0 B/s
     iteration_duration...: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
     iterations...........: 1   499.950005/s
     one..................: 0   0/s
       { tag:xyz }........: 0   0/s
     two..................: 42  20997.90021/s

2. Hide metrics, including their sub-metrics, if no value was ever recorded

     data_received........: 0 B 0 B/s
     data_sent............: 0 B 0 B/s
     iteration_duration...: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
     iterations...........: 1   499.950005/s
     two..................: 42  20997.90021/s

I think option 1 would be preferable, because it doesn't surprise users with "why is my metric gone?".

Actual behaviour

     data_received........: 0 B 0 B/s
     data_sent............: 0 B 0 B/s
     iteration_duration...: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
     iterations...........: 1   499.950005/s
       { tag:xyz }........: 0   0/s
     two..................: 42  20997.90021/s

As you can see, {tag:xyz} is a sub-metric of one, but it is rendered below iterations. Metric one is not rendered at all (which is somewhat plausible and understandable). Sub-metrics without values on their main metric seem to be rendered below the alphabetically preceding main metric; e.g. foobar{tag:xyz} without any values would be rendered below data_sent.


na-- commented May 6, 2022

Thanks for the report again 🙇 I can reproduce this 😞

I am not sure this alone is worth a v0.38.2 release, but I'll tag it as such for now; we might have other issues as well.

@na-- na-- added this to the v0.38.2 milestone May 6, 2022
efdknittlfrank (Author) commented

@na-- IMHO this is too low prio to justify a 0.38.2 release. It only affects the output, not the actual execution of the test or the metric collection. The output will be a bit confusing only if there are metrics without values and sub-metrics of those metrics are defined as thresholds.

Confusing, but nothing to worry too much about. For Counter metrics, a simple workaround exists: call .add(0) once at the start of your test. For the other metrics, it's not as straightforward. But again, I wouldn't worry too much about this. A minor annoyance, nothing more :)


na-- commented May 6, 2022

This is a weird issue, because the last code line here should have prevented it 😕

k6/metrics/engine/engine.go

Lines 121 to 127 in 44cc995

// Mark the metric (and the parent metric, if we're dealing with a
// submetric) as observed, so they are shown in the end-of-test summary,
// even if they don't have any metric samples during the test run
me.markObserved(metric)
if metric.Sub != nil {
	me.markObserved(metric.Sub.Metric)
}

See the comment: this case is exactly why that code exists, so I think metric.Sub might not be getting initialized properly, @oleiade?


oleiade commented May 6, 2022

Thanks again for the report @efdknittlfrank, looking into it, and will update you as soon as I have news 🙇🏻


na-- commented May 6, 2022

Ah, I think I found the problem 🤦‍♂️ The above code is me.markObserved(metric.Sub.Metric) when it should be me.markObserved(metric.Sub.Parent)... 😭 I'll submit a PR shortly.


na-- commented May 6, 2022

Should be fixed by #2519

While this bug alone is probably not sufficient to warrant releasing v0.38.2 (after all, before v0.38.0 and the fix of #1346, you would not see anything for either the one metric or its one{tag:xyz} sub-metric at all 😅), it seems like we might have another, somewhat more serious bug (#2520), so we'll likely release v0.38.2 with both fixes early next week.

@efdknittlfrank, thank you for an excellent bug report again! If you have any more bugs to report, I wouldn't say we would be happy, precisely, but we'd appreciate them very much! 😅


oleiade commented May 11, 2022

Hi @efdknittlfrank,

Heads up that the newly released v0.38.2 (https://github.com/grafana/k6/releases/tag/v0.38.2) contains the fix for this issue 🤝

efdknittlfrank (Author) commented

@oleiade wonderful! Thanks a bunch
