testing: add Elapsed() method to testing.B #43620
CC @aclements
This isn't too onerous a workaround; measuring elapsed time is both short and simple. An alternative fix here is to add methods on …
Sure, not onerous in terms of lines of code. However, it seems to me that timing code execution is a core responsibility of a benchmark framework, so it seems odd to cede that responsibility to your users. I haven't even looked at the code under the hood, but you could imagine that the …
Yes, this would be great. This is what I had in mind with the "Expose the measured time so users can compute the rate themselves and report it with the existing API" idea in the description. It does seem slightly awkward that you'd need some contract about calling …
I wouldn't be opposed to exposing the measured elapsed time of the benchmark. While it's true that often this isn't hard to measure for yourself, it becomes much harder in benchmarks that use …
Peeking at the internals, this could be as simple as adding a …
Returning 0 seems like it would invite bugs, but I'm not sure between the other two options.
A fourth option: stop the timer. Easy to document, yields the correct outcome in most situations with a minimum of fuss.
I would advocate for …
How do we progress this proposal through the process to get a decision? Based on the discussion I think the simplest concrete proposal is the addition of the following method to testing.B:

```go
// Elapsed returns the measured elapsed time of the benchmark. If the timer is
// running, it will be stopped.
func (b *B) Elapsed() time.Duration {
	b.StopTimer()
	return b.duration
}
```

Perhaps an additional note in the documentation for …
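For illustration, a sketch of the kind of benchmark where measuring the time yourself gets awkward: the timer is paused around per-iteration setup, so the duration tracked by the framework (and exposed by a method like the one above) is the number you actually want. The makeInput and process helpers and the items/s unit are made up for the example:

```go
package bench_test

import "testing"

// makeInput and process are hypothetical helpers standing in for
// per-iteration setup and for the work being measured.
func makeInput(i int) []int  { return []int{i} }
func process(data []int) int { return len(data) }

func BenchmarkProcess(b *testing.B) {
	processed := 0
	for i := 0; i < b.N; i++ {
		b.StopTimer()
		data := makeInput(i) // untimed setup
		b.StartTimer()
		processed += process(data) // measured work
	}
	// The framework's measured time excludes the StopTimer/StartTimer
	// windows above, which hand-rolled wall-clock timing could not easily do.
	// The proposed Elapsed exposes that measured time for the rate metric.
	b.ReportMetric(float64(processed)/b.Elapsed().Seconds(), "items/s")
}
```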
Leaving the timer running seems much less surprising than silently stopping the timer.
This proposal has been added to the active column of the proposals project.
Great to see this taken up!
Yes, I can see the argument for that.
Assuming Elapsed reports elapsed time but does not change the running state of the timer, does anyone object to adding Elapsed?
Based on the discussion above, this proposal seems like a likely accept.
No change in consensus, so accepted. 🎉

Change https://go.dev/cl/420254 mentions this issue:

Elapsed returns the measured elapsed time of the benchmark, but does not change the running state of the timer.

Fixes golang#43620.

Change-Id: Idd9f64c4632518eec759d2ffccbf0050d84fcc03
Reviewed-on: https://go-review.googlesource.com/c/go/+/420254
Reviewed-by: Dmitri Shuralyov <[email protected]>
TryBot-Result: Gopher Robot <[email protected]>
Auto-Submit: Bryan Mills <[email protected]>
Reviewed-by: Dmitri Shuralyov <[email protected]>
Run-TryBot: hopehook <[email protected]>
Reviewed-by: Bryan Mills <[email protected]>

Change https://go.dev/cl/449077 mentions this issue:

Updates #43620.

Change-Id: If2b6f37d79c055ca5799071bf70fcc9d12b8a2a0
Reviewed-on: https://go-review.googlesource.com/c/go/+/449077
Reviewed-by: Russ Cox <[email protected]>
Run-TryBot: Bryan Mills <[email protected]>
Auto-Submit: Bryan Mills <[email protected]>
TryBot-Result: Gopher Robot <[email protected]>
Propose to add an Elapsed method to testing.B, as follows:
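A sketch of the proposed method, reflecting the behavior that was ultimately agreed on above (the running state of the timer is left unchanged). The body shown here is an assumption: it relies on internal timer fields (duration, timerOn, start) along the lines of what the testing package keeps, and is not the authoritative implementation:

```go
// Elapsed returns the measured elapsed time of the benchmark.
// It does not change the running state of the timer.
func (b *B) Elapsed() time.Duration {
	d := b.duration
	if b.timerOn {
		// Include the time accumulated since the timer was last started.
		d += time.Since(b.start)
	}
	return d
}
```

Motivation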
Custom benchmark metrics are a powerful feature. Go introduced partial support with the testing.B.ReportMetric() API following #26037. However, this API seems to leave a gap in supporting rate metrics. Specifically, rate metrics are those that would be divided by the total benchmark time and reported with the /s suffix, such as the built-in MB/s. These cannot be reported using the existing API since there is no way to access the measured time.

Rate metrics have many use cases. The existence of MB/s proves at least one. You could also imagine reporting packets/s for a network benchmark, or points/s for a geometry benchmark. I was just working on a benchmark for the avo assembler and wanted to report instructions/s.
Rate metrics are supported by other benchmark libraries, such as Google Benchmark, which has become somewhat of a de facto standard for C++ benchmarking. Google Benchmark's user counters are configurable to support rate metrics with the kIsRate flag, along with many others. Google Benchmark also supports the special-case SetItemsProcessed to support examples like packets/s or points/s mentioned previously. As an example, the abseil library uses SetItemsProcessed to report the rate at which items are pushed to a vector.

I don't think there are satisfactory workarounds for this at the moment. You can measure the time yourself and use it to report a rate metric with ReportMetric, as sketched below. Alternatively, you can abuse SetBytes and post-process the output or inform your users to interpret it differently.
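A minimal sketch of that measure-it-yourself workaround, using only the existing testing API; the doWork helper and the items/s unit are placeholders:

```go
package bench_test

import (
	"testing"
	"time"
)

// doWork stands in for the operation being benchmarked; it returns the
// number of items processed in one iteration.
func doWork() int { return 1 }

// BenchmarkWorkaround times the loop by hand and reports a rate metric
// through the existing ReportMetric API.
func BenchmarkWorkaround(b *testing.B) {
	items := 0
	b.ResetTimer()
	start := time.Now()
	for i := 0; i < b.N; i++ {
		items += doWork()
	}
	elapsed := time.Since(start)
	b.ReportMetric(float64(items)/elapsed.Seconds(), "items/s")
}
```

Note that the hand-measured duration only approximates what the framework itself measures (for example, it cannot easily account for StopTimer/StartTimer windows), which is part of the gap described here.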
Propose extending the testing.B API such that this is possible. Ideas:

- A ReportRate method, similar to ReportMetric but the value will be divided by time.
- A SetItemsProcessed method similar to Google Benchmark.
- Expose the measured time so users can compute the rate themselves and report it with the existing API.

Following discussion below, we propose the Elapsed() method given above.
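With the accepted Elapsed method, which leaves the timer running, the instructions/s case from the motivation could look roughly like the sketch below; the assembleFunc helper is a stand-in invented for the example, not real avo API:

```go
package asm_test

import "testing"

// assembleFunc stands in for assembling one function; it returns the
// number of instructions emitted.
func assembleFunc() int { return 42 }

func BenchmarkAssemble(b *testing.B) {
	instructions := 0
	for i := 0; i < b.N; i++ {
		instructions += assembleFunc()
	}
	// Elapsed does not change the running state of the timer, so it can
	// simply be read at the end and used to turn the raw count into a rate.
	b.ReportMetric(float64(instructions)/b.Elapsed().Seconds(), "instructions/s")
}
```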