Can we add free threading benchmarks to the suite? #40

Is it feasible to integrate the benchmarks from https://github.com/facebookincubator/ft_utils/tree/main into this repo?
@SonicField

Comments
There are a lot of things we learned in getting reliable free-threading (FT) benchmarks. We can absolutely either figure out how to integrate the ft_utils benchmarks here or port the learnings over. One challenge is that getting good benchmarks requires some of the machinery from ft_utils, so we need to manage that dependency somehow.
pyperformance has a "manifest" mechanism by which you can pull in benchmarks from another repo, and I think @ericsnowcurrently's intent was to start collecting concurrency benchmarks in https://github.com/faster-cpython/concurrency-benchmarks. But that's all just a starting point, and we can deal with the details however seems most appropriate -- we can either move the necessary ft_utils machinery there, or just specify ft_utils as a dependency (it can even be a GitHub dependency to start with).

Alternatively, pyperf is the current place where all the "magic" to accurately measure single-threaded things lives, so I could imagine it growing free-threaded skills as well. Longer term, this seems like the right "community" place for this to be.

But I think at this point we should do whatever is easiest to get something working. I'm happy to take a first crack at getting one of the ft_utils benchmarks running within the pyperformance ecosystem and report back here about how it goes.
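For concreteness, a pyperformance manifest is (roughly) a small tab-separated file; the benchmark name and directory below are made up for illustration, not actual files in either repo:

```
[benchmarks]

name	metafile
ft_slots	<local>
```

If I have the format right, `<local>` tells pyperformance to look for the benchmark's metadata in `bm_ft_slots/pyproject.toml` next to the manifest, and you would run it with `pyperformance run --manifest ./MANIFEST`.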
Thanks Michael, my suggestion would be to take a look at benchmark_utils and, for example, the slots bench:
https://github.com/facebookincubator/ft_utils/blob/main/slots_bench.py

It's fairly simple stuff really; the tricky part is stopping the benchmarks from just measuring themselves, by using things like LocalWrapper and BatchExecutor. The utils also use barriers to ensure threads actually run concurrently.
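For anyone unfamiliar, the barrier idea in plain stdlib terms looks roughly like the following; this is a minimal sketch, not the actual ft_utils code:

```python
import threading
import time

def bench_concurrent(work, n_threads=8, iterations=100_000):
    """Time `work` running in n_threads threads at once.

    A barrier releases all workers together, so we measure the
    concurrent region rather than thread startup.
    """
    barrier = threading.Barrier(n_threads + 1)  # workers + the timing thread

    def worker():
        barrier.wait()  # hold here until every thread is ready
        for _ in range(iterations):
            work()

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()

    barrier.wait()  # last party arrives: all workers start together
    start = time.perf_counter()
    for t in threads:
        t.join()
    return time.perf_counter() - start

# e.g. elapsed = bench_concurrent(lambda: sum(range(100)))
```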
I'll look into how I/we can improve the implementation by using pyperf. My current approach is not very sophisticated.

The future will become clearer when we get there...
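For context, a basic single-threaded pyperf benchmark looks like the sketch below (the workload and names are a toy stand-in for the slots bench, not the real thing); the open question is how to extend this pattern to threads:

```python
import pyperf

class Slotted:
    """Toy stand-in for the ft_utils slots workload."""
    __slots__ = ("value",)

    def __init__(self):
        self.value = 0

def read_slot(obj, n=1000):
    # Repeated __slots__ attribute reads -- single-threaded only.
    for _ in range(n):
        obj.value

runner = pyperf.Runner()
runner.bench_func("slot_read", read_slot, Slotted())
```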