Provide support for the mpi_f08 module #497

Merged: 5 commits into geodynamics:main on Feb 29, 2024

Conversation

@tukss (Contributor) commented Feb 24, 2024

This PR adds a new option, --mpi-f08, to configure, which bases all of Rayleigh's MPI calls on the mpi_f08 module instead of the mpi module. The default (mpi) is left untouched, but the hope is that the new option helps us work around many compiler warnings and errors caused by stricter type checking. In the long term we will probably have to switch to mpi_f08, as this is what the MPI standard highly recommends, and the old mpi module might go away at some point.
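For readers unfamiliar with the two support methods, here is a minimal sketch (a toy program, not Rayleigh code) of what a call site looks like once use mpi is replaced by use mpi_f08: handles become derived types such as type(MPI_Comm) and the ierror argument becomes optional.

```fortran
! A minimal sketch (toy program, not Rayleigh code) of a call site under
! 'use mpi_f08': handles such as the communicator become derived types
! (type(MPI_Comm)) and the ierror argument is optional, while under
! 'use mpi' both are plain integers.
program hello_f08
   use mpi_f08
   implicit none
   integer :: rank, nprocs
   type(MPI_Comm) :: comm   ! with 'use mpi' this would be 'integer :: comm'

   call MPI_Init()
   comm = MPI_COMM_WORLD
   call MPI_Comm_rank(comm, rank)
   call MPI_Comm_size(comm, nprocs)
   print '(a,i0,a,i0)', 'rank ', rank, ' of ', nprocs
   call MPI_Finalize()
end program hello_f08
```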

Ideally we should have the CI build and test all MPI variants. I don't want to replicate all the code in the GitHub Actions workflow, though. If anyone knows a more elegant way of running the same steps with different configure parameters, I'd appreciate it.

Rationale for adding this:

  • mpi_f08 is the only standard-compliant way to use MPI from Fortran; see the discussion in Section 19.1.1 of the MPI-4.1 standard.
  • We have had issues with the F90 module (mpi) and the F77 bindings (mpif.h), where the compiler inferred the type of the buffer argument from the first call and issued errors when the same routine was later called with a buffer of a different type. mpi_f08 declares its buffers as Type(*) to avoid this (see the sketch below).
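A small illustration of the second point, assuming a toy program that is not part of this PR: with use mpi and no strict interfaces, some compilers lock in the buffer type from the first call and then reject a later call with a different buffer type, whereas the mpi_f08 interfaces declare choice buffers as type(*), dimension(..) and accept both calls below.

```fortran
! Toy example (not from this PR): the same collective is called once with an
! integer buffer and once with a Real*8 buffer.  Under 'use mpi_f08' the
! buffer dummy argument is type(*), dimension(..), so both calls compile;
! under 'use mpi' some compilers flag the second call as a type mismatch.
program mixed_buffers
   use mpi_f08
   implicit none
   integer :: ierr, ibuf(4)
   real*8  :: rbuf(4)   ! declared like Rayleigh's current buffers

   call MPI_Init(ierr)
   ibuf = 0
   rbuf = 0.0d0
   call MPI_Bcast(ibuf, 4, MPI_INTEGER, 0, MPI_COMM_WORLD, ierr)
   call MPI_Bcast(rbuf, 4, MPI_REAL8,   0, MPI_COMM_WORLD, ierr)
   call MPI_Finalize(ierr)
end program mixed_buffers
```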

This also includes some other MPI cleanups:

  • Many MPI calls were passed a status variable that we never inspected. By passing MPI_STATUS_IGNORE instead, the code becomes simpler and the MPI library is not forced to fill a status object we never use.
  • The MPI calls used MPI_DOUBLE_PRECISION as the data type, even though all our buffers are declared as Real*8. While the two are the same in most situations, the Real*8 declaration is nonstandard and there is no guarantee that it always yields an IEEE 754 binary64 (double-precision) number. MPI provides the MPI_REAL8 type, which matches the Real*8 declarations in Rayleigh. In the long run we should switch to the standard way of declaring variables using the kind parameter. Both cleanups are illustrated in the sketch after this list.
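A hedged sketch of both cleanups in one place (the routine recv_chunk and its arguments are hypothetical, not taken from Rayleigh): the receive passes MPI_STATUS_IGNORE instead of a throwaway status variable and uses MPI_REAL8 to match a Real*8 buffer.

```fortran
! Hypothetical helper, not Rayleigh code: receive n Real*8 values, ignoring
! the status and matching the buffer declaration with MPI_REAL8.
subroutine recv_chunk(buf, n, source, tag, comm)
   use mpi_f08
   implicit none
   integer, intent(in)        :: n, source, tag
   type(MPI_Comm), intent(in) :: comm
   real*8, intent(out)        :: buf(n)
   integer :: ierr

   ! MPI_STATUS_IGNORE tells the library not to fill a status object we
   ! would never read; MPI_REAL8 corresponds to the Real*8 declaration,
   ! whereas MPI_DOUBLE_PRECISION corresponds to 'double precision'.
   call MPI_Recv(buf, n, MPI_REAL8, source, tag, comm, MPI_STATUS_IGNORE, ierr)
end subroutine recv_chunk
```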

From the commit messages:

  • This was not used anywhere and makes the F08 transition easier.
  • No need to let MPI fill the status information if we are going to ignore it anyway.
  • All our buffers use Real*8 instead of Double Precision. While these are virtually always the same, this is cleaner and might save us errors on unconventional architectures/compilers.
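As a pointer for the longer-term change mentioned in the description, here is a minimal sketch (module and parameter names are hypothetical, not part of this PR) of the kind-based declaration style that would replace Real*8:

```fortran
! Hypothetical module, not part of this PR: a single place to define the
! working-precision kind so buffers can be declared portably.
module rayleigh_kinds
   use iso_fortran_env, only: real64
   implicit none
   integer, parameter :: dp = real64   ! IEEE 754 binary64 on conforming compilers
end module rayleigh_kinds

! Usage at a declaration site:
!   use rayleigh_kinds, only: dp
!   real(dp), allocatable :: buffer(:)
```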
@tukss force-pushed the mpi_f08 branch 3 times, most recently from de6f88b to 7932e74, on February 24, 2024 06:39
@tukss (Contributor, Author) commented Feb 24, 2024

I've set up a matrix build for the CI that tests all the MPI configurations. It's a bit wasteful because it builds the documentation for each configuration.

The mpi_f08 module currently doesn't work in the conda build because the MPICH 3.4 we're using there doesn't seem to provide that module, at least not in the configuration we're using.

@feathern (Contributor) left a comment


Hi @tukss ,

This looks good to me -- approving.

Also, I completely agree with switching to the more standard declaration format using 'kind.' Back in 2020, I started a branch of Rayleigh where I was doing just that, but then got sidetracked. Perhaps we should look back into this at this summer's workshop.

-Nick

@feathern merged commit 75f6667 into geodynamics:main on Feb 29, 2024
7 checks passed
@tukss deleted the mpi_f08 branch on March 2, 2024 01:49