Objective

Handle particle access completely via the script interface and eliminate the particle-related MPI callbacks.
The non-script-interface MPI callbacks are in the way of having more than one espressomd.System object per script.
They are also the last "client" of the current particle memory layout, which we may wish to replace for performance reasons.
Background
The current ParticleHandle Python object is already backed by the script interface. However, the script object only lives on the head node (defined via __so_creation_policy__ in the class definition in particle_data.py).
The script interface implementation (src/script_interface/particle_data/ParticleHandle.hpp/cpp) then uses manually written MPI calls to send/get the particle information from a remote MPI node. The goal is to eliminate these callbacks and let the script interface do all the communication.
Steps
Switch the creation policy of the ParticleHandle script object to global (src/python/espressomd/particle_data.py). This means that all commands (parameter set/get, method calls) on the object will run on all MPI nodes.
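Concretely, this is a small change in the class definition. A sketch follows; the base class body and the policy value "GLOBAL" are placeholders for illustration — only the attribute name __so_creation_policy__ comes from the description above:

```python
# Sketch of the change in src/python/espressomd/particle_data.py.
# ScriptInterfaceHelper is stubbed here; the policy string is an assumption.
class ScriptInterfaceHelper:
    pass

class ParticleHandle(ScriptInterfaceHelper):
    # Previously the script object was created on the head node only.
    # A global policy makes every parameter set/get and method call
    # run on all MPI ranks.
    __so_creation_policy__ = "GLOBAL"
```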
To ease the transition, guard all existing setters and getters in src/script_interface/particle_data/ParticleHandle.cpp so that they only run on MPI rank 0. Before each call, e.g., the v setters and getters defined in the constructor via add_parameters(), and in the method calls in do_call_method():
- use m_comm.rank() to determine the MPI rank
- if it is != 0, return None
After this is done, the particle access from Python should work unchanged, even though the creation policy is now global.
If it is not yet present, copy in the src/script_interface/communication.hpp file from the Walberla branch (on github.com/RudolfWeeber/espresso). This contains the necessary communication helpers.
Now the setters/getters and method calls can be refactored step by step.
Getters (defined in the constructor via add_parameters()):
- Declare a result variable as a boost::optional, e.g., boost::optional<Utils::Vector3d> for the velocity.
- Get the particle information via cell_structure.get_local_particle().
- If it is a nullptr (the particle is not local on this node), set the result to an empty optional {}.
- If the particle is a ghost (i.e., the specific MPI node has only a ghost but not the original particle), set the result to an empty optional as well (determine via Particle::is_ghost()).
- Otherwise, the current node has the particle and can return the info. Set the result to an optional with the respective particle property, e.g., {p.v()} (see src/core/Particle.hpp).
- Use reduce_optional() from src/script_interface/communication.hpp to make the result available on the head node, where Python runs, and to make sure that only one MPI rank provided an answer.
- Return the result. Casting to a Variant should work automatically for all standard cases.
- Test. If it works, consider putting the entire code in a function template (use decltype to find out the return type of the particle property in a generic fashion).
To be continued for setters and method calls.
Fixes #4616
Description of changes:
- replace most of the particle property setters' MPI callbacks with fully MPI-parallel code
- a few MPI callbacks remain for properties that are relevant to reaction methods (position, velocity, type, charge), but these will be removed in the near future by #4617
- some code duplication was introduced in `particle_node.cpp` and will be removed in the future, when the particle management becomes more parallel (i.e. no more particle cache, id tracking and type tracking)
- remove 300 lines of C++ code
- deprecate `system.auto_exclusions(n)` in favor of `system.part.auto_exclusions(distance=n)`
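The deprecation can be illustrated with a minimal stand-in; the classes below are stubs, not the real espressomd objects, and only the two method signatures come from the description above:

```python
import warnings

# Stubs mimicking the relevant slice of the espressomd API surface.
class ParticleList:
    def auto_exclusions(self, distance):
        # new, preferred entry point
        return distance

class System:
    def __init__(self):
        self.part = ParticleList()

    def auto_exclusions(self, n):
        # deprecated: forwards to the new method with a warning
        warnings.warn(
            "system.auto_exclusions(n) is deprecated, use "
            "system.part.auto_exclusions(distance=n)",
            DeprecationWarning)
        return self.part.auto_exclusions(distance=n)

system = System()
system.part.auto_exclusions(distance=1)  # preferred
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    system.auto_exclusions(1)            # deprecated, still works
```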