include("../../include/msg-header.inc"); ?>
From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2007-05-24 11:49:34
To add to what George said -- it looks like you have multiple
different implementations of MPI installed on your machine (LAM/MPI,
MPICH, MPICH2, ...?). Ensure that you compile and run your
application entirely with *one* implementation of MPI (they are not
binary compatible).
Keep in mind that each MPI implementation is a wholly separate
project and code base. This list can only provide help with Open MPI
(not LAM/MPI, MPICH, or MPICH2). Other MPI implementations have
their own support mailing lists.
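A quick way to check which MPI your wrapper compilers actually
resolve to (a sketch, not specific to your setup; "-show" is the
MPICH-style flag, while LAM and Open MPI spell it "-showme"):

  which mpicxx
  mpicxx -show      # MPICH-style: print the underlying compile/link command
  mpicxx -showme    # LAM / Open MPI spelling of the same flag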
On May 24, 2007, at 11:44 AM, George Bosilca wrote:
> There are two problems. First, it looks like you're using LAM and
> not Open MPI, since there are missing lam_ symbols. Second, please
> use mpicxx to link your application, as it will add all the missing
> libraries; see the sketch below.
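>
> For example, a sketch of the failing link line with g++ swapped for
> the wrapper (use whichever MPI's mpicxx you actually configured
> against; all other flags unchanged):
>
> /usr/local/mpich/bin/mpicxx -Wall -Wno-unused -g -DDEBUG -DUNIX \
>   -DMPI_VERSION -DMPI_DEBUG -o xoopic main.o xgmain.o \
>   -L../otools -lotools -L../advisor -ladvisor -L../physics \
>   -lphysics ... (remaining flags exactly as in the g++ line below)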
>
> george.
>
> On May 24, 2007, at 10:38 AM, Jung, Soon-wook wrote:
>
>> Hello, users,
>>
>> Currently, I'm trying to compile XOOPIC (a 2D plasma simulation
>> program with optional MPI parallel operation) with MPI support.
>>
>> I had no problem compiling XOOPIC in single-machine operation mode;
>> however, when MPI mode is enabled, it generates about four or more
>> pages of error messages.
>>
>> The Linux cluster I'm using is restricted to single-machine
>> operation due to some technical problems. (Only one node is
>> currently working.)
>>
>>
>>
>> The error shown below doesn't appear to originate from XOOPIC, but
>> rather from the MPI link step.
>>
>> (I've written a simple MPI test C program, compiled it with mpicc,
>> and run it with mpirun. It worked fine.)
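>>
>> For reference, the test was of roughly this form (a minimal sketch,
>> not the exact file):
>>
>> #include <mpi.h>
>> #include <stdio.h>
>>
>> int main(int argc, char **argv)
>> {
>>     int rank, size;
>>     MPI_Init(&argc, &argv);               /* start MPI */
>>     MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's rank */
>>     MPI_Comm_size(MPI_COMM_WORLD, &size); /* number of processes */
>>     printf("Hello from rank %d of %d\n", rank, size);
>>     MPI_Finalize();                       /* shut down MPI */
>>     return 0;
>> }
>>
>> (Compiled as "mpicc -o mpi-test mpi-test.c" and run as, e.g.,
>> "mpirun -np 2 ./mpi-test".)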
>>
>>
>>
>> * I have MPICH version 1.2.7p1
>>
>> * gcc and g++ version 3.4.6
>>
>> * OS: CentOS
>>
>> * Both MPICH & MPICH2 are installed in the machine. (/usr/local/
>> mpich, /usr/local/mpich2)
>>
>>
>>
>> 1. These are my PATH & LD_LIBRARY_PATH settings:
>>
>> ---------------------------------------------
>>
>> echo $PATH
>>
>> /usr/kerberos/sbin:/usr/local/mpich/bin:/usr/local/mpich2/bin:/usr/
>> local/pbs/sbin:/usr/local/pbs/bin:/usr/local/maui/sbin:/usr/local/
>> maui/bin:/usr/local/bwatch:/usr/local/hpc/bin:/opt/intel/compiler/
>> 9.1/bin:/opt/absoft/bin:/usr/local/ldap/bin:/usr/local/ldap/sbin:/
>> usr/kerberos/bin:/usr/local/mpich/bin:/usr/local/mpich2/bin:/usr/
>> local/pbs/sbin:/usr/local/pbs/bin:/usr/local/maui/sbin:/usr/local/
>> maui/bin:/usr/local/bwatch:/usr/local/hpc/bin:/opt/intel/compiler/
>> 9.1/bin:/opt/absoft/bin:/usr/local/ldap/bin:/usr/local/ldap/sbin:/
>> usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/usr/local/ganglia/bin:/
>> home/jooilyoon/bin:/usr/local/ganglia/bin:/usr/local/mpich:/usr/
>> local/mpich2
>>
>>
>>
>> echo $LD_LIBRARY_PATH
>>
>> /opt/intel/compiler/9.1/lib:/opt/intel/mkl/8.1.1/lib/em64t::/usr/
>> local/mpich/bin:/usr/local/mpich2/bin/:/usr/local/mpich:/usr/local/
>> mpich2:/usr/local/mpich/lib:/usr/local/mpich2/lib
>>
>> ------------------------------------------------
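>>
>> (Since both MPICH trees appear on PATH, a quick check such as the
>> following shows every match in PATH order; added for illustration,
>> not part of the original output:)
>>
>> type -a mpicc
>> type -a mpicxx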
>>
>>
>>
>>
>>
>> 2. Below are the MPI-relevant messages from configure. (No problem
>> seems to occur here.)
>>
>> ------------------------------------------------
>>
>> checking for mpicxx... /usr/local/mpich/bin/mpicxx
>>
>> checking for mpicc... /usr/local/mpich/bin/mpicc
>>
>> ------------------------------------------------
>>
>>
>>
>> 3. Below are the relevant lines from config.log:
>>
>> ------------------------------------------------
>>
>> configure:7702: checking for mpicxx
>>
>> configure:7720: found /usr/local/mpich/bin/mpicxx
>>
>> configure:7732: result: /usr/local/mpich/bin/mpicxx
>>
>> configure:7833: checking for mpicc
>>
>> configure:7851: found /usr/local/mpich/bin/mpicc
>>
>> configure:7863: result: /usr/local/mpich/bin/mpicc
>>
>> -------------------------------------------------
>>
>>
>>
>> 4. This is the error message when make is executed:
>>
>> --------------------------------------------------
>>
>> g++ -Wall -Wno-unused -g -DDEBUG -DUNIX -DMPI_VERSION -
>> DMPI_DEBUG -o xoopic main.o xgmain.o -L../otools -lotools -
>> L../advisor -ladvisor -L../physics -lphysics -lotools -ladvisor -
>> Wl,-rpath,. -L/usr/lib -ltk8.4 -L/usr/lib -ltcl8.4 -L/usr/local/
>> xgrafix/lib -Wl,-rpath,/usr/local/xgrafix/lib -lXGC250 -L/usr/
>> X11R6/lib64 -Wl,-rpath,/usr/X11R6/lib64 -lXpm -ldl -L/usr/lib -
>> lz -Wl,-rpath,/usr/lib -lm
>>
>>
>>
>> main.o(.text+0x18d): In function `main':
>>
>> /home/jooilyoon/xoopic2/xg/main.cpp:72: undefined reference to
>> `MPI_Init'
>>
>> main.o(.text+0x19c):/home/jooilyoon/xoopic2/xg/main.cpp:76:
>> undefined reference to `MPI_Comm_group'
>>
>> main.o(.text+0x1b1):/home/jooilyoon/xoopic2/xg/main.cpp:77:
>> undefined reference to `MPI_Comm_create'
>>
>> main.o(.text+0x1c1):/home/jooilyoon/xoopic2/xg/main.cpp:78:
>> undefined reference to `MPI_Comm_rank'
>>
>> ../otools/libotools.a(diagn.o)(.text+0x85de): In function
>> `Diagnostics::UpdatePreDiagnostics()':
>>
>> /home/jooilyoon/xoopic2/otools/diagn.cpp:867: undefined reference
>> to `lam_mpi_sum'
>>
>> ../otools/libotools.a(diagn.o)(.text+0x85e3):/home/jooilyoon/
>> xoopic2/otools/diagn.cpp:867: undefined reference to `lam_mpi_float'
>>
>> ../otools/libotools.a(diagn.o)(.text+0x85ed):/home/jooilyoon/
>> xoopic2/otools/diagn.cpp:867: undefined reference to `MPI_Reduce'
>>
>> ../otools/libotools.a(diagn.o)(.gnu.linkonce.r._ZTVN3MPI2OpE
>> +0x20): undefined reference to `MPI::Op::Init(void (*)(void
>> const*, void*, int, MPI::Datatype const&), bool)'
>>
>> ../otools/libotools.a(diagn.o)(.gnu.linkonce.r._ZTVN3MPI2OpE
>> +0x28): undefined reference to `MPI::Op::Free()'
>>
>> ../otools/libotools.a(diagn.o)
>> (.gnu.linkonce.t._ZN4PMPI8Datatype8Set_nameEPKc+0x1d): In function
>> `PMPI::Datatype::Set_name(char const*)':
>>
>> /usr/include/mpi2cxx/datatype_inln.h:260: undefined reference to
>> `MPI_Type_set_name'
>>
>> ../otools/libotools.a(diagn.o)
>> (.gnu.linkonce.t._ZN4PMPI8Datatype8Set_attrEiPKv+0x23): In
>> function `PMPI::Datatype::Set_attr(int, void const*)':
>>
>> /usr/include/mpi2cxx/datatype_inln.h:253: undefined reference to
>> `MPI_Type_set_attr'
>>
>> ../otools/libotools.a(diagn.o)
>> (.gnu.linkonce.t._ZNK4PMPI8Datatype8Get_nameEPcRi+0x25): In
>> function `PMPI::Datatype::Get_name(char*, int&) const':
>>
>> /usr/include/mpi2cxx/datatype_inln.h:246: undefined reference to
>> `MPI_Type_get_name'
>>
>> ../otools/libotools.a(diagn.o)
>> (.gnu.linkonce.t._ZNK4PMPI8Datatype12Get_envelopeERiS1_S1_S1_
>> +0x3e): In function `PMPI::Datatype::Get_envelope(int&, int&,
>> int&, int&) const':
>>
>>
>>
>> (Several more pages of similar errors follow.)
>>
>>
>>
>> ../otools/libotools.a(dump.o)(.text+0xd1a): In function `Quit':
>>
>> /home/jooilyoon/xoopic2/otools/dump.cpp:375: undefined reference
>> to `MPI_Finalize'
>>
>> collect2: ld returned 1 exit status
>>
>> make[2]: *** [xoopic] Error 1
>>
>> make[2]: Leaving directory `/home/jooilyoon/xoopic2/xg'
>>
>> make[1]: *** [all-recursive] Error 1
>>
>> make[1]: Leaving directory `/home/jooilyoon/xoopic2'
>>
>> make: *** [all] Error 2
>>
>> [root@node1 xoopic2]#
>>
>> --------------------------------------------------------
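>>
>> (The lam_ symbols above suggest the objects were compiled against
>> LAM's mpi.h. The following shows which mpi.h the compiler picks up
>> by default; a sketch added for illustration, not part of the
>> original output:)
>>
>> echo '#include <mpi.h>' | g++ -E -x c++ - | grep '/mpi\.h'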
>>
>>
>>
>> All output from configure, make, and make install, plus the
>> config.log files, is compressed into mpi-output.tar, attached to
>> this e-mail.
>>
>> Can anybody please advise me on this problem? Any suggestions would
>> be sincerely appreciated.
>>
>> Thanks.
>>
>>
>>
>> Jung, Soon-Wook
>>
>> <ompi-output.tar>
>> _______________________________________________
>> users mailing list
>> users_at_[hidden]
>> http://www.open-mpi.org/mailman/listinfo.cgi/users
>
> _______________________________________________
> users mailing list
> users_at_[hidden]
> http://www.open-mpi.org/mailman/listinfo.cgi/users
--
Jeff Squyres
Cisco Systems