Profiling workflow on workflow_dispatch
#1012
Conversation
This looks great - no specific comments from me. I notice you're using doxygen in the profiling repo. I'd prefer to use sphinx to keep things consistent.
No problem - I've swapped it to sphinx.
- `.gitignore` will ignore profiling outputs
- `dev.in` and `dev.txt` list `pyinstrument` as a requirement
- `MANIFEST.in` ignores the `profiling/` directory
@tamuri This OK to be merged in? (I don't have merge authority for this repository.)
Yes, fine to merge - I've given you permissions.
Short Description
Provides the "skeleton" for implementing profile tracking of runs across the repository history (#686).
Adds the `run_profiling.yml` workflow, which profiles the `scale_run.py` script and exports the resulting `.pyisession` file to the `TLOmodel-profiling` repository. That repo is under the UCL organisation so everyone should be able to see it, but I can fix that if this isn't the case. The workflow runs on self-hosted runners and uses a `tox` environment to run the profiling script. Preview the profiling site here.
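For context, here is a minimal sketch of how pyinstrument can produce a `.pyisession` file plus rendered reports like the ones described above. The `entry_point` callable, function name, and file names are placeholders for illustration, not the actual contents of `run_profiling.py`:

```python
# Illustrative sketch only: profile a callable with pyinstrument and save the
# raw session plus optional HTML/JSON renderings. All names are placeholders.
from pathlib import Path

from pyinstrument import Profiler
from pyinstrument.renderers import HTMLRenderer, JSONRenderer


def profile_entry_point(entry_point, output_dir: Path) -> None:
    """Profile `entry_point` (standing in for scale_run's refactored main)."""
    profiler = Profiler()
    profiler.start()
    entry_point()
    profiler.stop()

    output_dir.mkdir(parents=True, exist_ok=True)

    # The raw session is the .pyisession artefact pushed to TLOmodel-profiling;
    # it can be re-rendered later without re-running the simulation.
    profiler.last_session.save(str(output_dir / "scale_run.pyisession"))

    # Either/both human-readable renderings.
    (output_dir / "scale_run.html").write_text(HTMLRenderer().render(profiler.last_session))
    (output_dir / "scale_run.json").write_text(JSONRenderer().render(profiler.last_session))
```

The advantage of keeping the raw session alongside the rendered reports is that the downstream repository can regenerate or re-style the reports without repeating the (expensive) profiled run.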
Full Changes
- Adds the `.github/workflows/run_profiling.yml` workflow. Profiling outputs are deployed to `year/month/day/HHMM/{files.out}` on the `TLOmodel-profiling` repository. Deployment targets the `results` branch (so that in future we can trigger rendering on `push`es to this branch).
- Changes to the `src/scripts/profiling` directory:
  - `scale_run.py` has the "main" refactored so that it can be imported from the profiling script.
  - `_paths.py`: file to fix some important paths to avoid filesystem errors.
  - `_parameters.py`: file to reliably track the parameters that should be passed to the models that we are profiling. NOTE: for the purposes of testing new features, the profiling only runs a month-long simulation. This can be edited once we're happy with the workflow and how the outputs are presented/moved to the output repo.
  - `run_profiling.py`: new script that can be run from the command line. Profiles the `scale_run` model and writes the outputs to either/both HTML and JSON (see the sketch after this list for an illustration of the output layout).
- `vscode`'s linter has also zealously run over a couple of other files like `shared.py`, which is bloating the diff.
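As referenced in the list above, a short sketch of how the timestamped `year/month/day/HHMM/` destination and the either/both HTML/JSON choice could look from the command line. The flag names and the final print are assumptions for illustration, not the script's real interface:

```python
# Illustrative sketch only: build a year/month/day/HHMM destination path and
# expose --html/--json switches. Flag names are assumed, not the PR's actual CLI.
import argparse
from datetime import datetime, timezone
from pathlib import Path

parser = argparse.ArgumentParser(description="Profile the scale_run model.")
parser.add_argument("--html", action="store_true", help="also write an HTML report")
parser.add_argument("--json", action="store_true", help="also write a JSON report")
args = parser.parse_args()

# Timestamped layout matching year/month/day/HHMM/{files.out} on the
# results branch of TLOmodel-profiling.
destination = Path(datetime.now(timezone.utc).strftime("%Y/%m/%d/%H%M"))
print(f"Requested outputs (html={args.html}, json={args.json}) would be "
      f"deployed under {destination}/")
```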