
openPMD: Support for mesh refinement #1842

Merged — 3 commits merged into ECP-WarpX:development on May 20, 2021
Conversation

@guj (Contributor) commented on Mar 27, 2021

Implements the openPMD mesh-refinement extension: openPMD/openPMD-standard#252

  • Field naming structure (ADIOS2); refined levels get a _lvl<level> suffix appended to the field name (see the read-back sketch after this list):
 double  /data/1000/fields/j/x              {2048, 2048, 2048}
 double  /data/1000/fields/j/y              {2048, 2048, 2048}
 double  /data/1000/fields/j/z              {2048, 2048, 2048}
 double  /data/1000/fields/j_lvl1/x         {4096, 4096, 4096}
 double  /data/1000/fields/j_lvl1/y         {4096, 4096, 4096}
 double  /data/1000/fields/j_lvl1/z         {4096, 4096, 4096}
 double  /data/1000/fields/rho              {2048, 2048, 2048}
 double  /data/1000/fields/rho_lvl1         {4096, 4096, 4096}

(future: a _<P> patch-number suffix for HDF5 as a fallback)

  • Particles live on the finest level of each patch but are, by default, initialized on the coarse mesh.
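
For illustration, here is a minimal read-back sketch using the openPMD-api Python bindings (not part of this PR). The iteration number and field names are taken from the listing above; the series path diags/diag1/openpmd_%T.bp and the slab extents are placeholders.

 import openpmd_api as io

 # Open the ADIOS2 series read-only; "%T" expands to the iteration number.
 series = io.Series("diags/diag1/openpmd_%T.bp", io.Access.read_only)
 it = series.iterations[1000]

 # The coarse mesh and its first refinement level are stored as separate meshes.
 rho      = it.meshes["rho"][io.Mesh_Record_Component.SCALAR]       # {2048, 2048, 2048}
 rho_lvl1 = it.meshes["rho_lvl1"][io.Mesh_Record_Component.SCALAR]  # {4096, 4096, 4096}
 jx_lvl1  = it.meshes["j_lvl1"]["x"]                                # refined vector component

 # Queue reads of small slabs (offset, extent) instead of the full blocks,
 # then flush once to perform the actual I/O.
 coarse_slab = rho.load_chunk([0, 0, 0], [64, 2048, 2048])
 fine_slab   = rho_lvl1.load_chunk([0, 0, 0], [64, 4096, 4096])
 jx_slab     = jx_lvl1.load_chunk([0, 0, 0], [64, 4096, 4096])
 series.flush()

 print(coarse_slab.shape, fine_slab.shape, jx_slab.shape)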

@guj requested a review from @ax3l on Mar 27, 2021, 17:35
@ax3l self-assigned this on Mar 29, 2021
@ax3l added the label component: openPMD (openPMD I/O) on Mar 29, 2021
@ax3l changed the title from "Added support for mesh refinement" to "openPMD: Support for mesh refinement" on Mar 29, 2021
@ax3l changed the title from "openPMD: Support for mesh refinement" to "[WIP] openPMD: Support for mesh refinement" on Apr 1, 2021
Two review threads on Source/Diagnostics/WarpXOpenPMD.cpp (outdated, resolved)
@ax3l changed the title from "[WIP] openPMD: Support for mesh refinement" to "openPMD: Support for mesh refinement" on May 20, 2021
@ax3l (Member) commented on May 20, 2021

After a small cleanup commit, I ran more tests today (ADIOS2, HDF5): the particles and fields look great, and so do particles transitioning between patches.

I am double-checking with @RevathiJambunathan that the back-transformed diagnostics (BTD) still work, and after that I think this can go in :)

@RevathiJambunathan (Member) commented:

Thanks for this PR!

I tested this with back-transformed diagnostics (BTD) and got an error:

STEP 64 starts ...
STEP 64 ends. TIME = 6.814271277e-13 DT = 1.064729887e-14
Walltime = 19.13449687 s; This step = 0.184028665 s; Avg. per step = 0.2989765135 s
terminate called after throwing an instance of 'std::runtime_error'
  what():  [Series] Detected illegal access to iteration that has been closed previously.
SIGABRT
/usr/bin/addr2line: DWARF error: could not find variable specification at offset 240e1
/usr/bin/addr2line: DWARF error: could not find variable specification at offset 240e1
See Backtrace.0.0 file for details
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 DUP FROM 0
with errorcode 6.

Here is the input file:

inputs_3d_slice_reformatted.txt

@guj (Contributor, Author) commented on May 20, 2021

> I tested this with back-transformed diagnostics (BTD) and got an error: […]

Thanks, I will take a look later today.

@ax3l (Member) commented on May 20, 2021

The crash is likely due to additional patch parts that should remain in #1909 for the transition to group- and variable-based I/O options. I'll clean them out and re-test.
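
For context, here is a minimal sketch (not taken from the BTD code; the series name, iteration number, and attribute below are placeholders) of the openPMD-api access pattern that produces this error: once an iteration has been closed via the streaming-oriented close(), touching it again is reported as an illegal access.

 import openpmd_api as io

 # Placeholder series name; the ADIOS2 and HDF5 backends behave the same way here.
 series = io.Series("btd_example.bp", io.Access.create)

 it = series.iterations[64]
 it.set_attribute("note", "writes for step 64 would go here")
 it.close()  # streaming-style close: iteration 64 is finished

 # Accessing iteration 64 again after close() is illegal; openPMD-api throws a
 # std::runtime_error, "[Series] Detected illegal access to iteration that has
 # been closed previously", at the access or at the latest on the next flush.
 series.iterations[64].set_attribute("note", "too late")
 series.flush()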

@guj (Contributor, Author) commented on May 20, 2021

> The crash is likely due to additional patch parts that should remain in #1909 […]

It would be great to see why #1909 is not passing CI.

Commit: Move un-related streaming API changes out of this PR.
@ax3l (Member) commented on May 20, 2021

I removed the un-related streaming API changes from this PR.
I performed @RevathiJambunathan's test again and BTD works again :)

Let's also add BTD with openPMD to CI in a follow-up PR.

@ax3l merged commit 45e6483 into ECP-WarpX:development on May 20, 2021