Add benchmark tests for common operations #335
Comments
I would like to work on this. @toumorokoshi
@rajibmitra sounds great! I think just getting the pattern established would be a good first task. We don't have any targets right now, so I think just having the timing run during CI (by adding to tox) would be great.
@rmitr are you still interested in working on this one?
As part of #514 I did some research, and I think pytest-benchmark is a good framework to use. It integrates nicely with pytest and provides some useful features that make implementing performance tests easier.
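For illustration, a first benchmark could look something like the sketch below. This is only a rough sketch under assumptions: the file name, span name, and the `TracerProvider` import from `opentelemetry-sdk` are illustrative choices, not an established pattern in this repo; only the `benchmark` fixture itself comes from pytest-benchmark.

```python
# Hypothetical file: tests/performance/test_benchmark_trace.py
# Minimal pytest-benchmark sketch; names and paths are assumptions.
from opentelemetry.sdk.trace import TracerProvider

tracer = TracerProvider().get_tracer(__name__)


def create_span():
    # The operation under test: start and end a single span.
    with tracer.start_as_current_span("benchmark-span"):
        pass


def test_simple_start_span(benchmark):
    # The `benchmark` fixture calls the target repeatedly and reports
    # min/max/mean/stddev timings in the pytest summary.
    benchmark(create_span)
```

Running this with `pytest --benchmark-only` prints a timing table per test, and `--benchmark-json` can export the results for tracking over time in CI.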
Follow-up to ensure it's part of GA.
Here is also a link to a related PR: open-telemetry/opentelemetry-specification#748
* fix: dont trace ourselves (closes open-telemetry#332) Signed-off-by: Olivier Albertini <[email protected]>
* fix: add mayurkale22 recommendations Signed-off-by: Olivier Albertini <[email protected]>
Closing in favour of #1767
A common concern with instrumentation is the performance hit on applications. To help us quantify and track how well we're doing there, we should add some benchmark tests that run as part of CI.
There is a pytest plugin that could help here: https://pypi.org/project/pytest-benchmark/
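As a rough illustration of what benchmarking a "common operation" could look like, the sketch below times span creation plus attribute setting. Everything here is hypothetical (the choice of operation, attribute keys, and round/iteration counts), not an agreed-on test plan; it only assumes pytest-benchmark's `benchmark.pedantic` helper and the SDK's `TracerProvider`.

```python
# Hypothetical sketch: benchmark creating a span with a few attributes.
from opentelemetry.sdk.trace import TracerProvider

tracer = TracerProvider().get_tracer(__name__)


def create_span_with_attributes():
    # A slightly richer "common operation": one span plus two attributes.
    span = tracer.start_span("benchmark-span")
    span.set_attribute("http.method", "GET")
    span.set_attribute("http.status_code", 200)
    span.end()


def test_span_with_attributes(benchmark):
    # pedantic() fixes the number of rounds/iterations explicitly, which
    # tends to make CI runs more comparable than the default calibration.
    benchmark.pedantic(create_span_with_attributes, rounds=50, iterations=100)
```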