Hi,
I tried to calculate the susceptibility (Lindhard and RPA) for a multiband metal but got the following error:
Warning: could not identify MPI environment!
Starting serial run at: 2022-09-26 00:59:34.973154
num_wann = 28
(700, 28)
Segmentation fault (core dumped)
Can you please help with this problem?
Can the linearized Eliashberg equation be solved for a system with several bands (e.g. this system) with the TPRF code?
Can Eliashberg calculations be run with MPI on multiple processors? I got different results for single-core and multi-core runs of the square-lattice Eliashberg gap example given in the documentation.
I am sharing the necessary files for the multiband metallic system:
wann_tprf.py.txt
AuBe.wout.txt
AuBe-bands.dat.txt
AuBe_hr.dat.txt
Thanks,
Partha
Thank you for reaching out. I have tested running your script with N_k = 2^3, and it runs just fine.
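In case it helps, one way to debug this is to shrink the k-point grid in the script and scale it back up once the run succeeds. A minimal sketch, assuming a TBLattice object built from the Wannier90 data as in the TPRF documentation (the exact method names may differ between TPRF versions):

    # Hypothetical sketch: run on a small test grid first. `H` is assumed to
    # be the TBLattice built from the Wannier90 files; get_kmesh/fourier
    # follow the TPRF docs but may vary between versions.
    n_k = (2, 2, 2)               # small test grid; increase once this runs
    kmesh = H.get_kmesh(n_k=n_k)
    e_k = H.fourier(kmesh)        # dispersion on the reduced k-mesh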
I think the problem is that the calculation runs out of memory. Currently the script is trying to compute the susceptibility on an N_k = 32^3 k-point grid, and with N_w = 28 Wannier bands the generalized susceptibility has N_w^4 * N_k ≈ 2 × 10^10 elements per frequency point. Stored as complex doubles this comes to roughly 0.3 TB (terabytes) of RAM per frequency, so the full calculation quickly needs on the order of 1 TB to store.
This is an example of how the quartic scaling with respect to the number of orbitals puts hard constraints on which calculations can currently be done.
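For concreteness, the estimate above can be reproduced with a few lines of plain Python (a back-of-the-envelope sketch only; the 16 bytes per element assumes complex double precision):

    # Memory estimate for the generalized susceptibility chi_{abcd}(k):
    # N_w^4 orbital index combinations times N_k k-points.
    n_wann = 28          # Wannier orbitals, from the num_wann printout above
    n_k = 32**3          # 32 x 32 x 32 k-point grid
    bytes_per_elem = 16  # complex128

    n_elem = n_wann**4 * n_k
    print(f"{n_elem:.1e} elements per frequency")                     # ~2.0e+10
    print(f"{n_elem * bytes_per_elem / 1e12:.2f} TB per frequency")   # ~0.32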
Regarding your other questions, could you please provide a guide on how to reproduce the MPI issue you observe? A script, a description of the steps taken to run the single-core and multi-core calculations, and a note on how the results differ would be fantastic. Please also consider posting separate questions as separate issues; this makes it easier to handle multiple issues in parallel.