
Relax necessity of declaring labelset in advance #298

Open
dt-rush opened this issue Oct 23, 2019 · 14 comments · May be fixed by #368

Comments

@dt-rush

dt-rush commented Oct 23, 2019

There is no inherent restriction in Prometheus that requires a metric's labelset to be declared in advance. Labels can appear in or disappear from the timeseries a node exports. The restriction in this JS client appears to be a consequence of design choices in the internal hashmap property used by the four metric classes, the utils.setValue() function, and possibly the utils.getLabels() function. This issue is a placeholder for further investigation.
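
As a minimal sketch of the restriction in question (hedged: the metric and label names here are illustrative, and the exact failure mode depends on the prom-client version):

const { Counter } = require('prom-client');

const requests = new Counter({
  name: 'http_requests_total',
  help: 'Total HTTP requests',
  labelNames: ['method'], // labelset must be declared up front
});

requests.inc({ method: 'GET' });              // fine: "method" was pre-declared
requests.inc({ method: 'GET', code: '200' }); // rejected: "code" was never declared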

@zbjornson
Collaborator

Not opposed to relaxing the requirement, but also note that #299 (comment) made it so TypeScript/ts-lint users will get a warning if they try to use a label that wasn't pre-declared.

@dt-rush
Author

dt-rush commented Nov 15, 2019

Thanks for pointing that out @zbjornson.

I'm working on a PR and tests for this atm.

As further justification for this change, I cite the robustperception.io blog post on predeclaring metrics:

The first reason relates to what was previously covered in Existential issues with metrics: metrics that only sometimes exist are difficult to deal with in PromQL. By declaring the metric before your application uses it, the client library can initialise it to 0. This applies even if the Counter is never subsequently incremented. However, this doesn't work for metrics with labels: the client library doesn't know what labels might come up, so you have to do it yourself.

(emphasis added)
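
In prom-client terms, "do it yourself" would look something like the following (a hedged sketch; whether an increment of zero registers the series may depend on the client version):

const { Counter } = require('prom-client');

const requests = new Counter({
  name: 'http_requests_total',
  help: 'Total HTTP requests',
  labelNames: ['method', 'code'],
});

// Pre-initialise the label combinations you know about by incrementing by zero,
// so the series exist (at 0) before any real traffic arrives.
requests.inc({ method: 'GET', code: '200' }, 0);
requests.inc({ method: 'POST', code: '200' }, 0);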

@dt-rush
Author

dt-rush commented Apr 19, 2020

@zbjornson as mentioned in the discussion in #299, that'll be reverted as it adds to the unnecessary rigidity of using this client.

@dt-rush
Author

dt-rush commented Apr 19, 2020

That is all sorted out nicely. However, there is one more issue to bring up:

In the Writing client libraries # Labels docs, we read:

Client libraries MUST NOT under any circumstances allow users to have different label names for the same metric for Gauge/Counter/Summary/Histogram or any other Collector offered by the library.

The wording of this is not easy to interpret, since its only possible meaning seems to contradict what is said (and observed in practice) elsewhere: the appearance of a new labelset simply creates a new timeseries, so there is no problem having different label names for the same metric. Even in the current system of pre-declaring a labelset {a, b, c, d}, there is no problem exporting metrics at various times labelled with subsets: {a, b}, {c, d}, {c}, {b, d}, etc... that is, "different label names for the same metric".

This was brought up before in this SO question, where the user similarly points out this contradiction.

From the standpoint of the exported metrics, there is really no difference, when exporting

some_counter{f="1"} 1
some_counter{f="1", g="1"} 1
some_counter{g="1"} 1
some_counter{g="1", h="1"} 1
some_counter{f="1", h="1"} 1

between

new Counter({name: "some_counter", labelNames: ["f", "g", "h"]})

and

new Counter({name: "some_counter"});

in cases like these, if we simply allow labels to appear whenever they occur.

Implicitly, the "MUST NOT" amounts to a claim (which can't be correct, can it?) that the only permitted metric is one which uses its full labelspace all of the time. But that's nonsense, as shown above.

dt-rush added a commit to dt-rush/prom-client that referenced this issue Apr 19, 2020
…e labelset for metrics

+ improved validation testing

Change-type: minor
Connects-to: siimon#298
Signed-off-by: dt-rush <[email protected]>
@dt-rush
Author

dt-rush commented Apr 19, 2020

Note: that branch can be merged, but it still won't work for TypeScript clients unless #299 (https://github.com/siimon/prom-client/pull/299/files) is reverted.

@dt-rush
Author

dt-rush commented Apr 19, 2020

@siimon I wonder if you'd be able to give any insight here. It seems this can go ahead, but I'm not sure.

@dt-rush
Author

dt-rush commented Apr 19, 2020

I see other client libraries also (unnecessarily?) restricting label names in this way, and yet, as explained above, relaxing it doesn't seem to be at odds with how exporters already work, even in the case of predeclared labels. There's literally no difference to the Prometheus process running the scrape: say we declared labelNames {a, b, c, ... , y, z} but only exported a through m for a year, and then suddenly export a metric with a through r. The "added" labels were "part of the labelset" from the very beginning, but from the perspective of the scrape they "just now appeared".

@dt-rush
Author

dt-rush commented Apr 24, 2020

bump @zbjornson, perhaps you can weigh in on this?

@zbjornson
Collaborator

Sorry for the delay. Hoping to have time to look this weekend.

@siimon
Owner

siimon commented Apr 28, 2020

Sorry for being late on the ball here, and thanks for your suggestion and input. Personally I would love to drop this too, since it just adds complexity.

My main concern, though, is to keep it aligned with other Prometheus clients in other languages, specifically the officially supported ones. Have they dropped this requirement too? If not, then I would probably wait until they have.

@dt-rush
Author

dt-rush commented May 6, 2020

@siimon this logic produces a deadlock, though, no? We can't proceed until all proceed? If this works, and the tests added in PR #368 show that it does, could we not go ahead?

@dt-rush
Author

dt-rush commented Jul 22, 2020

For posterity: As of now, until or unless this is implemented, simply declaring all possible labels in advance is an option if that set is finite and known in advance.
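
Concretely, the workaround is just to list every label name that could ever apply when the metric is created (a hedged sketch; the names are illustrative):

const { Counter } = require('prom-client');

// Declare the full, finite set of label names up front, even if most
// observations only ever use a subset of them.
const events = new Counter({
  name: 'app_events_total',
  help: 'Application events',
  labelNames: ['type', 'source', 'region', 'tenant'],
});

// Any subset of the declared names is then acceptable at increment time.
events.inc({ type: 'login', source: 'web' });
events.inc({ type: 'login', source: 'web', region: 'eu' });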

@natewaddoups

natewaddoups commented Jul 29, 2024

simply declaring all possible labels in advance is an option if that set is finite and known in advance.

Or you could commit to never learning anything new about the scenario that you're instrumenting.

...which is just not reasonable.

Would it be OK to merge the changes proposed earlier if they're gated behind a flag that's disabled by default?

That would keep the existing behavior for people who, for whatever reason, want to keep the existing behavior, while providing a path forward for those of us who wish to improve our instrumentation as we learn new things about the scenarios that we're instrumenting.
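
Purely as an illustration (the option name below is hypothetical and does not exist in prom-client today), the flag-gated behaviour might look like:

const { Counter } = require('prom-client');

const events = new Counter({
  name: 'app_events_total',
  help: 'Application events',
  strictLabelNames: false, // hypothetical flag; a default of true keeps today's behavior
});

// With the hypothetical flag off, new label names could appear later as the
// instrumentation evolves, without re-declaring the metric.
events.inc({ type: 'login' });
events.inc({ type: 'login', experiment: 'b' });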

I realize those changes are ~4 years old now and might need work to catch up with the current state of prom-client. But is the approach good?

I ask because my team at work is adopting Prometheus for a service that is currently using DataDog, and the lack of flexibility seems really unfortunate. Our code sometimes evolves in ways that benefit from additional insight from existing metrics. (Whose doesn't?)
