[Pw_forum] run PWscf on national grid service in UK
Pieremanuele Canepa
pc229 at kent.ac.uk
Thu May 14 11:08:30 CEST 2009
Dear All,
I am writing to you because I am attempting to run PWscf jobs on the NGS
(National Grid Service, http://www.grid-support.ac.uk/) provided here in the
UK, without any success. With the help of one of the people in charge of the
NGS, I was able to compile PWscf (release 4.0.5) on it. This grid uses a
particular, non-free MPI called Platform MPI
(http://www.scali.com/Products/platform-mpi). I compiled espresso using the
Intel Fortran and C compilers together with these MPI libraries.
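For reference, the build was configured roughly along the following lines (I
am quoting from memory, so the exact compiler wrappers and flags used on the
NGS nodes may have differed):

    # assumed configure invocation for espresso-4.0.5 with Intel compilers
    ./configure CC=icc F77=ifort F90=ifort MPIF90=mpif90
    make pw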
Now, when I try to run a job on the grid, the output I get is reported
below:
Program PWSCF v.4.0.5 starts ...
Today is 14May2009 at 9:23: 5
Parallel version (MPI)
Number of processors in use: 8
R & G space division: proc/pool = 8
For Norm-Conserving or Ultrasoft (Vanderbilt) Pseudopotentials or PAW
Current dimensions of program pwscf are:
Max number of different atomic species (ntypx) = 10
Max number of k-points (npk) = 40000
Max angular momentum in pseudopotentials (lmaxx) = 3
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
from read_namelists : error # 17
reading namelist control
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
stopping ...
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
from read_namelists : error # 17
from read_namelists : error # 17
reading namelist control
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
TID    HOST_NAME   COMMAND_LINE      STATUS     TERMINATION_TIME
=====  ==========  ================  =========  ===================
00000  cn54.ngs.r  /home/ngs0802/bi  Exit (2)   05/14/2009 09:23:05
00001  cn54.ngs.r  /home/ngs0802/bi  Exit (2)   05/14/2009 09:23:06
00002  cn54.ngs.r  /home/ngs0802/bi  Exit (2)   05/14/2009 09:23:05
00003  cn54.ngs.r  /home/ngs0802/bi  Exit (2)   05/14/2009 09:23:05
00004  cn54.ngs.r  /home/ngs0802/bi  Exit (2)   05/14/2009 09:23:05
00005  cn54.ngs.r  /home/ngs0802/bi  Exit (2)   05/14/2009 09:23:05
00006  cn54.ngs.r  /home/ngs0802/bi  Exit (2)   05/14/2009 09:23:05
00007  cn54.ngs.r  /home/ngs0802/bi  Exit (2)   05/14/2009 09:23:05
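For what it is worth, my input begins with a standard &control namelist,
something like the following (a minimal sketch; the prefix and directory
values here are placeholders, not my actual paths):

    &control
       calculation = 'scf'       ! self-consistent field run
       prefix      = 'test'      ! placeholder job name
       outdir      = './tmp/'    ! placeholder scratch directory
       pseudo_dir  = './'        ! placeholder pseudopotential directory
    /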
Initially, I thought it was related to my input, so I tried to run it on my
usual PC cluster, which uses OpenMPI, and there it seems to work as it
should. What am I supposed to do? Do you have any suggestions? Does anyone
know how to handle this?
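In case it helps, on both machines I launch the calculation in essentially
the same way, feeding the input file on standard input (a sketch; the input
and output file names are placeholders, and the Platform MPI launcher on the
NGS may take different options):

    # on the NGS with Platform MPI (assumed launcher syntax)
    mpirun -np 8 pw.x < scf.in > scf.out

    # on my local cluster with OpenMPI, where it works
    mpirun -np 8 pw.x < scf.in > scf.out

The command is the same in both cases; only the MPI stack underneath
differs.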
Best Regards, Piero
--
Pieremanuele Canepa
Room 230
School of Physical Sciences, Ingram Building,
University of Kent, Canterbury, Kent,
CT2 7NH
United Kingdom
-----------------------------------------------------------