Abnormal Termination

PostPosted: Tue Mar 07, 2017 7:19 am
by ganguly
Dear VASP user,
We are trying to run VASP on our cluster, but the jobs do not complete successfully. I have pasted the end of the error file along with the last lines of the OUTCAR file for your kind perusal. Please suggest how to solve this problem.

Error file
Last few lines:

MPI: (gdb) A debugging session is active.
MPI:
MPI: Inferior 1 [process 17792] will be detached.
MPI:
MPI: Quit anyway? (y or n) [answered Y; input not from terminal]
MPI: Detaching from program: /proc/17792/exe, process 17792

MPI: -----stack traceback ends-----   [repeated for each aborting rank]
MPI: MPI_COMM_WORLD rank 84 has terminated without calling MPI_Finalize()
MPI: aborting job

OUTCAR File

Last few lines:

--------------------------------------------------------------------------------------------------------


Maximum index for augmentation-charges 543 (set IRDMAX)


--------------------------------------------------------------------------------------------------------


First call to EWALD: gamma= 0.292
Maximum number of real-space cells 6x 6x 1
Maximum number of reciprocal cells 2x 2x11

FEWALD executed in parallel
FEWALD: cpu time 0.0480: real time 0.0468
entering main loop


----------------------------------------- Iteration 1( 1) ---------------------------------------


POTLOK: cpu time 0.0280: real time 0.0578
SETDIJ: cpu time 0.0920: real time 0.1248



Looking forward to hearing from you.
With Regards,
Bishwajit

Re: Abnormal Termination

PostPosted: Tue Mar 07, 2017 2:58 pm
by admin
1. It looks like your MPI is not correctly installed.
2. It could also be a memory problem.
Try calculating a smaller cell with decreased precision.
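As a hedged illustration of "smaller cell with decreased precision", an INCAR fragment along these lines reduces the memory footprint for a test run (the specific values here are assumptions; tune them for your own system):

```
# Hypothetical INCAR fragment for a reduced-memory test run
PREC   = Normal      # lower precision than PREC = Accurate
LREAL  = Auto        # real-space projection, cheaper for large cells
NPAR   = 4           # distribute bands over groups of cores (tune per machine)
ENCUT  = 400         # modest plane-wave cutoff (eV) for the test calculation
```

If this test calculation also crashes, that points away from memory exhaustion and toward the MPI build itself.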

Re: Abnormal Termination

PostPosted: Thu Mar 16, 2017 7:20 pm
by ganguly
Thank you very much for your reply. We have tried a smaller cell but got the same error. MPI works fine with other parallel programs, e.g. GROMACS and CP2K. On our other cluster, memory is an issue. Could you please tell us how we can solve the memory issue?
Best regards,
Bishwajit

Re: Abnormal Termination

PostPosted: Fri Mar 17, 2017 10:41 am
by admin
If a smaller cell gives the same error, then this is not a memory issue.
Have a look at the MPI installation used to build VASP. The fact that
other programs (GROMACS, CP2K) run correctly does not mean that
the MPI installation of VASP is correct as well.
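One hedged way to check this is to inspect which MPI library the VASP binary is actually linked against and which MPI launcher is picked up at job time (the binary path below is hypothetical; substitute your own build location):

```shell
# List the MPI shared libraries the VASP executable is linked against
# (path is hypothetical; use your own binary location)
ldd /path/to/vasp_std | grep -i mpi

# Confirm which mpirun/mpiexec is first on PATH when the job runs
which mpirun
mpirun --version
```

A mismatch between the MPI shown by `ldd` and the `mpirun` used to launch the job (e.g. the binary linked against one MPI while another MPI's launcher is on PATH) is a common cause of ranks aborting without calling MPI_Finalize, while other correctly built programs such as GROMACS and CP2K continue to run fine.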