[Pw_forum] failure with npool

Eduardo Ariel Menendez Proupin eariel99 at gmail.com
Mon Jul 27 17:44:43 CEST 2009


Hi,
I have found a problem with a benchmark calculation with pw.x: it aborts
when running with the -npool option, but runs normally without it. The
calculation uses 34 k-points.


I have tried with
mpirun -np 32 pw.x -npool 2
mpirun -np 32 pw.x -npool 4
mpirun -np 4 pw.x -npool 2

on two machines, using OpenMPI and HP-MPI, on both single and multiple
nodes. It fails every time.
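For reference, pw.x splits the k-points as evenly as possible over the pools, so none of the pool counts above is degenerate. A quick sketch (my own illustration of the counts, not actual QE code; the exact ordering QE uses may differ):

```python
# Toy illustration of an as-even-as-possible split of nks k-points over
# npool pools (assumption: only the per-pool counts matter here, not the
# actual assignment order used inside pw.x).

def kpoints_per_pool(nks, npool):
    base, rest = divmod(nks, npool)
    return [base + 1 if i < rest else base for i in range(npool)]

print(kpoints_per_pool(34, 2))  # [17, 17]
print(kpoints_per_pool(34, 4))  # [9, 9, 8, 8]
```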

Looking at the output, I see the following messages:

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
     from addusdens_r : error #         1
     from addusdens_r : error #         1
     expected  360.00000000, found  101.25021916: wrong charge, increase ecutrho
     from addusdens_r : error #         1
     expected  360.00000000, found  101.25021916: wrong charge, increase ecutrho
     expected  360.00000000, found  101.25021916: wrong charge, increase ecutrho
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
     stopping ...


     stopping ...
     stopping ...
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
     from addusdens_r : error #         1
     expected  360.00000000, found  101.25021916: wrong charge, increase ecutrho
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

     stopping ...

The previous output was with -npool 4. Using -npool 2, I get output like
this:

 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
     from addusdens_r : error #         1
     expected  360.00000000, found  180.00040683: wrong charge, increase ecutrho
     from addusdens_r : error #         1
     expected  360.00000000, found  180.00040683: wrong charge, increase ecutrho

It looks like pw.x is having trouble summing the charge densities over the pools.
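The numbers are at least consistent with that: with -npool 2 the reported charge (180.00) is almost exactly half of the expected 360 electrons, i.e. what a single pool would see on its own if the inter-pool sum were skipped. A toy sketch of this mechanism (pure illustration with uniform k-point weights, not the actual pw.x/addusdens_r code; the real weights of an 8x8x1 mesh are not uniform, which would explain the 101.25 seen with -npool 4):

```python
# Toy illustration (not pw.x code): each pool integrates the density only
# from its own k-points, so the full charge is recovered only after an
# inter-pool reduction (in pw.x this would be an MPI sum over the
# inter-pool communicator).

def pool_charges(kpoint_weights, npool):
    """Split k-point weights round-robin over npool pools and return the
    partial (weight-summed) charge fraction each pool sees."""
    pools = [kpoint_weights[i::npool] for i in range(npool)]
    return [sum(p) for p in pools]

NELEC = 360.0
# Assumption for illustration: 34 k-points with equal weights summing to 1.
weights = [1.0 / 34] * 34

partial = [NELEC * c for c in pool_charges(weights, 2)]
total = sum(partial)

print(partial)  # each pool alone sees roughly half of the 360 electrons
print(total)    # the full 360 is recovered only after summing over pools
```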

Here is the input (the pseudos are from the web site)

 &CONTROL
                 calculation = 'scf' ,
                restart_mode = 'from_scratch' ,
                       outdir = '.' ,
                   pseudo_dir = '.'
                      prefix = 'cdsebench' ,
                      wfcdir = '/tmp',
 /
 &SYSTEM
                       ibrav = 0,
                   celldm(1) = 1.8897261,
                         nat = 40,
                        ntyp = 2,
                     ecutwfc = 30.0 ,
                     ecutrho = 180.0 ,
                 occupations = 'smearing' ,
                     degauss = 0.02 ,
                    smearing = 'gaussian' ,
!                   qcutz=150., q2sigma=2.0, ecfixed=24.0
 /
 &ELECTRONS
            electron_maxstep = 60,
                    conv_thr = 1.0D-6 ,
                 startingpot = 'atomic' ,
                 startingwfc = 'random' ,
!                 mixing_mode = 'TF' ,
                 mixing_beta = 0.7D0,
              diagonalization = 'david' ,
                        tqr = .true.
 /

CELL_PARAMETERS hexagonal
     4.373836756  0.000000000  0.000000000
    -2.187115631  3.787739854  0.000000000
     0.000000000  0.000000000 71.411016248

ATOMIC_SPECIES
   Cd  112.41000  Cd.pbe-van.UPF
   Se  78.960000  Se.pbe-van.UPF

ATOMIC_POSITIONS (crystal)
Cd    0.000000000    0.000000000    0.000000000
Cd    0.000000000    0.000000000    0.100000000
Cd    0.000000000    0.000000000    0.200000000
Cd    0.000000000    0.000000000    0.300000000
Cd    0.000000000    0.000000000    0.400000000
Cd    0.000000000    0.000000000    0.500000000
Cd    0.000000000    0.000000000    0.600000000
Cd    0.000000000    0.000000000    0.700000000
Cd    0.000000000    0.000000000    0.800000000
Cd    0.000000000    0.000000000    0.900000000
Cd    0.666663821    0.333312908    0.050000151
Cd    0.666663821    0.333312908    0.150000151
Cd    0.666663821    0.333312908    0.250000151
Cd    0.666663821    0.333312908    0.350000151
Cd    0.666663821    0.333312908    0.450000151
Cd    0.666663821    0.333312908    0.550000151
Cd    0.666663821    0.333312908    0.650000151
Cd    0.666663821    0.333312908    0.750000151
Cd    0.666663821    0.333312908    0.850000151
Cd    0.666663821    0.333312908    0.950000151
Se    0.000000000    0.000000000    0.037565413
Se    0.000000000    0.000000000    0.137565413
Se    0.000000000    0.000000000    0.237565413
Se    0.000000000    0.000000000    0.337565413
Se    0.000000000    0.000000000    0.437565413
Se    0.000000000    0.000000000    0.537565413
Se    0.000000000    0.000000000    0.637565413
Se    0.000000000    0.000000000    0.737565413
Se    0.000000000    0.000000000    0.837565413
Se    0.000000000    0.000000000    0.937565413
Se    0.666665290    0.333319516    0.087565394
Se    0.666665290    0.333319516    0.187565394
Se    0.666665290    0.333319516    0.287565394
Se    0.666665290    0.333319516    0.387565394
Se    0.666665290    0.333319516    0.487565394
Se    0.666665290    0.333319516    0.587565394
Se    0.666665290    0.333319516    0.687565394
Se    0.666665290    0.333319516    0.787565394
Se    0.666665290    0.333319516    0.887565394
Se    0.666665290    0.333319516    0.987565394
K_POINTS automatic
8 8 1 0 0 0


Testing the speed, version 4.1 is a bit slower than 4.0.4 (about 9% more
time in this benchmark: 39.5 vs. 36 minutes on 32 CPUs). I guess this is
irrelevant in the face of Moore's law.



Thank you,
Best regards

-- 
Eduardo Menendez
Departamento de Fisica
Facultad de Ciencias
Universidad de Chile
Phone: (56)(2)9787439
URL: http://fisica.ciencias.uchile.cl/~emenendez