Dear All,

I am writing because I am attempting to run PWscf jobs on the NGS (National Grid Service, http://www.grid-support.ac.uk/) provided here in the UK, so far without success. With the help of one of the people in charge of the NGS I was able to compile PWscf (release 4.0.5) on it. This grid uses a particular, non-free MPI implementation called Platform MPI (http://www.scali.com/Products/platform-mpi). I compiled espresso with the Intel Fortran and C compilers and these MPI libraries.
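For reference, the build was configured roughly along the lines below; the compiler and wrapper names are the ones on the NGS machine, so take them as indicative only:

    ./configure CC=icc F77=ifort F90=ifort MPIF90=mpif90
    make pw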
Now, when I try to run a job on the grid, this is the output I get:

     Program PWSCF     v.4.0.5  starts ...
     Today is 14May2009 at  9:23: 5

     Parallel version (MPI)

     Number of processors in use:       8
     R & G space division:  proc/pool =    8

     For Norm-Conserving or Ultrasoft (Vanderbilt) Pseudopotentials or PAW

     Current dimensions of program pwscf are:
     Max number of different atomic species (ntypx) = 10
     Max number of k-points (npk) =  40000
     Max angular momentum in pseudopotentials (lmaxx) =  3

 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
     from  read_namelists  : error #        17
      reading namelist control
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

     stopping ...
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
     from  read_namelists  : error #        17
     from  read_namelists  : error #        17
      reading namelist control
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

TID   HOST_NAME   COMMAND_LINE            STATUS            TERMINATION_TIME
===== ========== ================  =======================  ===================
00000 cn54.ngs.r /home/ngs0802/bi  Exit (2)                 05/14/2009 09:23:05
00001 cn54.ngs.r /home/ngs0802/bi  Exit (2)                 05/14/2009 09:23:06
00002 cn54.ngs.r /home/ngs0802/bi  Exit (2)                 05/14/2009 09:23:05
00003 cn54.ngs.r /home/ngs0802/bi  Exit (2)                 05/14/2009 09:23:05
00004 cn54.ngs.r /home/ngs0802/bi  Exit (2)                 05/14/2009 09:23:05
00005 cn54.ngs.r /home/ngs0802/bi  Exit (2)                 05/14/2009 09:23:05
00006 cn54.ngs.r /home/ngs0802/bi  Exit (2)                 05/14/2009 09:23:05
00007 cn54.ngs.r /home/ngs0802/bi  Exit (2)                 05/14/2009 09:23:05

Initially I thought the problem was in my input, so I ran the same job on my usual PC cluster, which uses OpenMPI, and there it works as it should. What am I supposed to do? Do you have any suggestions? Does anyone know how to deal with this?
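In case it helps, the job is launched more or less as below; the actual submission goes through the grid's batch system, so the executable path and input/output file names here are only placeholders:

    mpirun -np 8 /path/to/pw.x < pw.scf.in > pw.scf.out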
Best Regards,
Piero
--
Pieremanuele Canepa
Room 230
School of Physical Sciences, Ingram Building,
University of Kent, Canterbury, Kent,
CT2 7NH
United Kingdom
-----------------------------------------------------------