Add performance tests to AWS SDK Extension #243
Conversation
@codeboten @lzchen Would it be possible for us to get a …
Force-pushed from f427f9c to 49d9390
Cool! Interested to see this in action.
@@ -49,6 +49,12 @@ See
[`tox.ini`](https://github.com/open-telemetry/opentelemetry-python-contrib/blob/master/tox.ini)
for more detail on available tox commands.

### Benchmarks

Performance progression of benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.
Suggested change:
- Performance progression of benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.
+ Performance benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.
I'm a bit confused about "throughput". What does that mean? The number of iterations over some time period?
Suggesting against "progression" as I think that implies that we are trying to trend benchmarks one direction or another. I don't think we're consciously trying to improve performance, but we are trying to monitor it so it doesn't get out of control.
From what I understand, `pytest-benchmark` runs the test as many times as it can in a second (this is configurable). So the graph this action outputs shows how many times it was able to run the function per second (in the mean case) as a measure of performance, and how this metric changes between commits.
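To make that concrete, here is a minimal sketch of a `pytest-benchmark` test, assuming a simple span-creation workload; the test and helper names are illustrative, not the ones added in this PR. Only the `benchmark` fixture comes from the plugin.

```python
# Minimal sketch of a pytest-benchmark test (illustrative names).
# The `benchmark` fixture is provided by the pytest-benchmark plugin; it calls
# the given function repeatedly and records timing stats (mean, rounds, etc.)
# that the benchmark action turns into the throughput-vs-commit graph.
from opentelemetry import trace


def create_span():
    tracer = trace.get_tracer(__name__)
    with tracer.start_as_current_span("benchmarked-span"):
        pass


def test_simple_span_creation(benchmark):
    benchmark(create_span)
```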
@toumorokoshi That makes sense! What if we called it "history", though, to confirm that this is with respect to commits (and not a load test or something that would have "time" on the x-axis)?
Suggested change:
- Performance progression of benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.
+ Performance history of benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.
.gitignore (outdated)
@@ -35,6 +35,9 @@ coverage.xml
.cache
htmlcov

# Benchmarks
*.json
Same comment as core PR. Otherwise LGTM! 🚢
Force-pushed from 1b512e9 to ced5b69
Thanks for adding these, I've enabled the GH pages for this repo.
@codeboten Thanks for enabling gh-pages! Unfortunately I don’t think it will show up, because we decided in open-telemetry/opentelemetry-python#1443 to just push to …
Force-pushed from 0e69d10 to 9ac09ef
Co-authored-by: Aaron Abbott <[email protected]>
Force-pushed from 1ba5c64 to 43804d2
Description
Follow-up to the core repo PR, which added a way to add performance tests: open-telemetry/opentelemetry-python#1443
Type of change
Please delete options that are not relevant.
How Has This Been Tested?
Running `tox -e test-sdkextension-aws` locally, I can produce the following results:
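For reference, the raw numbers can also be exported to a JSON file (which is why `*.json` is gitignored above). A hedged sketch of such an invocation, assuming the tox environment forwards extra arguments to pytest via `{posargs}` and has `pytest-benchmark` installed; the output file name is illustrative:

```sh
# Assumes tox passes {posargs} through to pytest in this environment.
# --benchmark-json is a pytest-benchmark option; the file name is illustrative.
tox -e test-sdkextension-aws -- --benchmark-json=benchmark-output.json
```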
Does This PR Require a Core Repo Change?
Checklist:
See contributing.md for styleguide, changelog guidelines, and more.
- [ ] Changelogs have been updated
- [ ] Unit tests have been added