
Depletion restart with mpi #2778

Merged: 6 commits into openmc-dev:develop on Apr 9, 2024

Conversation

@church89 (Contributor) commented Nov 16, 2023

Description

This PR fixes an inconsistency when restarting a depletion simulation with MPI enabled. The call to the openmc/deplete/coupled_operator.py `_load_previous_results()` method from `_get_start_data()` in openmc/deplete/abc.py doubles the size of the depletion results file and starts appending new results from there.
To reproduce, simply run

    mpirun -n 2 --bind-to numa --map-by numa python -m mpi4py run_depletion.py

and

    mpirun -n 2 --bind-to numa --map-by numa python -m mpi4py restart_depletion.py

from the pincell_depletion example.
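
For context, here is a minimal restart sketch in the spirit of the pincell_depletion example; the chain file name, timesteps, and power are placeholders, and the `prev_results` keyword of `CoupledOperator` is assumed to match the example scripts rather than taken from this PR.

```python
# restart_depletion.py -- illustrative sketch, not the exact example script
import openmc
import openmc.deplete

# Load the geometry/materials/settings XML written for the original run
model = openmc.Model.from_xml()

# Read the results of the previous depletion run so new steps append to them
prev_results = openmc.deplete.Results("depletion_results.h5")

op = openmc.deplete.CoupledOperator(
    model, chain_file="chain_simple.xml", prev_results=prev_results)

# Continue depleting: two 30-day steps at a placeholder total power (W)
integrator = openmc.deplete.PredictorIntegrator(
    op, timesteps=[30.0, 30.0], power=1e4, timestep_units='d')
integrator.integrate()
```

Running this under `mpirun ... python -m mpi4py restart_depletion.py` as above is what triggers the doubled results file described in this PR.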

Checklist

  • I have performed a self-review of my own code
  • I have run clang-format (version 15) on any C++ source files (if applicable)
  • I have followed the style guidelines for Python source files (if applicable)
  • I have made corresponding changes to the documentation (if applicable)
  • I have added tests that prove my fix is effective or that my feature works (if applicable)

@gridley (Contributor) commented Dec 22, 2023

Can you add some comments to the code explaining why this division by two is necessary only in the MPI case? Also, why is the length doubled when using MPI?

@church89 (Contributor, Author) commented Jan 5, 2024

Hi @gridley,
It's this bit here in the `_load_previous_results` method:

```python
if comm.size != 1:
    prev_results = self.prev_res
    self.prev_res = Results()
    mat_indexes = _distribute(range(len(self.burnable_mats)))
    for res_obj in prev_results:
        new_res = res_obj.distribute(self.local_mats, mat_indexes)
        self.prev_res.append(new_res)
```

that appends `new_res` objects twice.  
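
To make the MPI-only branch above more concrete, here is a toy sketch (not OpenMC's actual `_distribute` helper) of how indices of burnable materials can be split across ranks so that each process keeps only its local share of the previously saved results:

```python
def distribute(items, rank, size):
    """Toy even split of items across size MPI ranks; returns this rank's slice."""
    min_chunk, extra = divmod(len(items), size)
    # The first `extra` ranks receive one additional item each
    start = rank * min_chunk + min(rank, extra)
    stop = start + min_chunk + (1 if rank < extra else 0)
    return items[start:stop]

all_mats = list(range(10))          # indices of 10 burnable materials
print(distribute(all_mats, 0, 4))   # rank 0 of 4 -> [0, 1, 2]
print(distribute(all_mats, 3, 4))   # rank 3 of 4 -> [8, 9]
```

In serial there is nothing to split, which is why the branch is guarded by `comm.size != 1`.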

@drewejohnson (Contributor) left a comment

Is there an implied bug here where we double the size of the depletion results if we're doing MPI? Regarding the logic you pointed out:

            for res_obj in prev_results:
                new_res = res_obj.distribute(self.local_mats, mat_indexes)
                self.prev_res.append(new_res)

is that intentional, is it a bug, or is it doing something we're not quite aware of?

(Review threads on openmc/deplete/abc.py and examples/pincell_depletion/restart_depletion.py: outdated, resolved)
@church89 (Contributor, Author) commented Feb 5, 2024

@drewejohnson @gridley, I think I've figured out what's happening here:
`self.prev_res = Results()` does not reinitialize `prev_res` to an empty Results object (the default argument in results.py is `filename='depletion_results.h5'`, so it re-reads the existing results file), which is why its size had doubled after the for loop.
I've simply set the depletion results filename argument to `None`, and it now works as expected, without the strange division by two later on.
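
For reference, a sketch of the change being described, assuming it lands in `_load_previous_results()` in openmc/deplete/coupled_operator.py; treat this as illustrative rather than the literal diff:

```python
if comm.size != 1:
    prev_results = self.prev_res
    # Start from an empty Results container: the default filename argument
    # would re-read depletion_results.h5 and double the number of entries.
    self.prev_res = Results(filename=None)
    mat_indexes = _distribute(range(len(self.burnable_mats)))
    for res_obj in prev_results:
        new_res = res_obj.distribute(self.local_mats, mat_indexes)
        self.prev_res.append(new_res)
```

With the container starting empty, the restart no longer needs the division by two that was questioned earlier.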

@paulromano (Contributor) left a comment

Just a style fix and this should be good to go. Sorry for the delay on this!

(Review thread on openmc/deplete/coupled_operator.py: outdated, resolved)
@paulromano (Contributor) commented

@drewejohnson Any further comments or do you approve of the change here?

@paulromano enabled auto-merge (squash) on Apr 4, 2024
@paulromano merged commit 569c087 into openmc-dev:develop on Apr 9, 2024 (18 checks passed)
@drewejohnson (Contributor) commented

Thanks for your patience!

church89 added a commit to openmsr/openmc that referenced this pull request Jul 18, 2024