
Set up Slurm pipeline for performance regression testing. #280

Closed
ali-ramadhan wants to merge 1 commit into master from slurmci-setup

Conversation

ali-ramadhan
Member

Right now only one benchmark job is run to check for performance regressions, which is important since we sometimes introduce rogue bugs that kill performance.

cc @simonbyrne: will this work with the Slurm CI / CliMA bot framework you've set up? It will be awesome to start using it. No MPI stuff yet, but we will probably add some soon.
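
For reference, here is a minimal sketch (not part of this PR) of the kind of regression check such a Slurm benchmark job could run. `run_benchmark`, the baseline file name, and the 10% tolerance are illustrative assumptions, not the actual benchmark suite:

```julia
# Sketch of a performance-regression check a Slurm benchmark job might run.
# The benchmark body, baseline file, and tolerance are placeholders.
using Printf

# Stand-in for the real benchmark, e.g. timing a fixed number of
# Oceananigans time steps. Returns elapsed wall time in seconds.
function run_benchmark()
    x = randn(10^6)
    return @elapsed sum(abs2, x)
end

function check_regression(baseline_file="benchmark_baseline.txt"; tol=0.10)
    run_benchmark()            # warm up so JIT compilation isn't timed
    elapsed = run_benchmark()

    if !isfile(baseline_file)
        write(baseline_file, string(elapsed))
        @info "No baseline found; recorded $(round(elapsed, digits=4)) s as the baseline."
        return
    end

    baseline = parse(Float64, read(baseline_file, String))
    @printf("baseline = %.4f s, current = %.4f s\n", baseline, elapsed)

    # A nonzero exit status here is what would flag the regression in CI.
    if elapsed > (1 + tol) * baseline
        error("Performance regression: " *
              "$(round(100 * (elapsed / baseline - 1), digits=1))% slower than baseline.")
    end
end

check_regression()
```

On the cluster, a script like this could be wrapped in an sbatch job so the scheduler allocates the GPU node and a nonzero exit status marks the job as failed.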

@coveralls

coveralls commented Jun 9, 2019

Pull Request Test Coverage Report for Build 717

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage remained the same at 62.429%

Totals Coverage Status
  • Change from base Build 698: 0.0%
  • Covered Lines: 550
  • Relevant Lines: 881

💛 - Coveralls

@codecov

codecov bot commented Jun 9, 2019

Codecov Report

Merging #280 into master will not change coverage.
The diff coverage is n/a.

Impacted file tree graph

@@           Coverage Diff           @@
##           master     #280   +/-   ##
=======================================
  Coverage   62.51%   62.51%           
=======================================
  Files          23       23           
  Lines         883      883           
=======================================
  Hits          552      552           
  Misses        331      331

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update fc12f60...2fad83c. Read the comment docs.

@codecov

codecov bot commented Jun 9, 2019

Codecov Report

Merging #280 into master will decrease coverage by 0.09%.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff            @@
##           master     #280     +/-   ##
=========================================
- Coverage    62.6%   62.51%   -0.1%     
=========================================
  Files          70       23     -47     
  Lines        2054      883   -1171     
=========================================
- Hits         1286      552    -734     
+ Misses        768      331    -437
Impacted Files                                          Coverage Δ
src/utils.jl                                            26.08% <0%> (-45.49%) ⬇️
src/fields.jl                                           46.66% <0%> (-6.09%) ⬇️
src/Oceananigans.jl                                     71.42% <0%> (-3.58%) ⬇️
src/clock.jl                                            100% <0%> (ø) ⬆️
src/AbstractOperations/show_abstract_operations.jl
src/AbstractOperations/computations.jl
src/Operators/derivative_operators.jl
src/Operators/tracer_advection_operators.jl
...ure_implementations/leith_enstrophy_diffusivity.jl
...nce_closure_implementations/blasius_smagorinsky.jl
... and 75 more

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 91e5626...45bf8a8. Read the comment docs.

@ali-ramadhan
Member Author

@simonbyrne Sounds like you guys had some issues with Slurm CI?

If so, I'll close this PR and look for another place to test and benchmark multi-GPU stuff.

@simonbyrne
Member

Sorry @ali-ramadhan for not replying.

I'm still trying to iron out the kinks in our current setup (we get failures pretty often for unknown reasons). I was also hoping to figure out a container system.

@ali-ramadhan
Member Author

Closing this PR for now. Will reopen once we consider Slurm CI again, potentially when we add MPI support with PR #590.

ali-ramadhan deleted the slurmci-setup branch August 4, 2020 13:42