h1. Install EMAN2/SPARX

*Before you begin:* Please note that Appion 2.2, Appion 2.2-redux, and any earlier releases do not include any features that require EMAN2/SPARX to be installed. Appion 3.0, the development trunk, and future releases will require this software.

This documentation assumes you are using CentOS 6 (written as of CentOS 6.5). It is best to install EMAN2/SPARX from source to get MPI working on a cluster and to avoid conflicts between two different versions of python on your system, since the EMAN2/SPARX binaries all come with their own python pre-installed.

h2. Install pre-requisite packages for compiling EMAN2

h3. yum based packages

* Make sure EPEL is installed; if not, go here: [[Download additional Software (CentOS Specific)]]
* Use yum to install the devel libraries:

<pre>
sudo yum install fftw-devel gsl-devel boost-python numpy libjpeg-devel \
 PyQt4-devel cmake ipython hdf5-devel libtiff-devel libpng-devel \
 PyOpenGL db4-devel python-argparse openmpi-devel python-pip
</pre>

h3. Download and install FTGL and FLTK

download

<pre>
wget 'http://downloads.sourceforge.net/project/ftgl/FTGL%20Source/2.1.3%7Erc5/ftgl-2.1.3-rc5.tar.gz?r=https%3A%2F%2Fsourceforge.net%2Fprojects%2Fftgl%2Ffiles%2FFTGL%2520Source%2F&ts=1464027951&use_mirror=pilotfiber' -O ftgl-2.1.3-rc5.tar.gz
wget 'http://fltk.org/pub/fltk/1.3.3/fltk-1.3.3-source.tar.gz'
</pre>

unzip

<pre>
tar zxvf ftgl-2.1.3-rc5.tar.gz
tar zxvf fltk-1.3.3-source.tar.gz
</pre>

setup, compile, and install (repeat these steps in each source directory)

setup (shown for ftgl; use @cd fltk-1.3.3@ for the FLTK build)

<pre>
cd ftgl-2.1.3~rc5
./configure
</pre>

compile

<pre>
make
</pre>

install

<pre>
sudo make install
</pre>

h2. Download the source

# To download the source code go to the link: http://blake.bcm.edu/emanwiki/EMAN2
# Click on *"Main EMAN2 Download Page"*
# Click on *"Download EMAN 2"*, whichever is the latest release of EMAN.
# Scroll down to *Source* and download the source archive, e.g. *eman2.12.source.tar.gz*

h2. Work with the source

# go to the directory with the source code
# extract the archive: <pre>tar zxvf eman2.12.source.tar.gz</pre>
# go into the build directory: <pre>cd EMAN2/src/build</pre>
# start the configure script: <pre>cmake ../eman2/</pre>
#* Note: alternatively you can run @ccmake ../eman2/@ and configure all of the parameters interactively.
#* I had to disable ftgl (fonts in OpenGL) to get 2.12 to compile.
#* Example cmake output:

<pre>
-- The C compiler identification is GNU 4.4.7
-- The CXX compiler identification is GNU 4.4.7
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Looking for fseek64
-- Looking for fseek64 - not found
-- Looking for fseeko
-- Looking for fseeko - found
-- Looking for ftell64
-- Looking for ftell64 - not found
-- Looking for ftello
-- Looking for ftello - found
-- Configuring done
-- Generating done
-- Build files have been written to: /emg/sw/EMAN2/src/build
</pre>

# start compiling: <pre>make</pre>
# install: <pre>sudo make install</pre>

h2. Set environment variables

h3. bash

<pre>sudo nano /etc/profile.d/eman2.sh</pre>

<pre>
export EMAN2DIR=/usr/local/EMAN2
export PATH=${EMAN2DIR}/bin:${PATH}
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:${EMAN2DIR}/lib
export PYTHONPATH=${EMAN2DIR}/lib:${EMAN2DIR}/bin
</pre>

h3. c shell

<pre>sudo nano /etc/profile.d/eman2.csh</pre>

<pre>
setenv EMAN2DIR /usr/local/EMAN2
setenv PATH ${EMAN2DIR}/bin:${PATH}
setenv LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:${EMAN2DIR}/lib
setenv PYTHONPATH ${EMAN2DIR}/lib:${EMAN2DIR}/bin
</pre>
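If the variables are set correctly, a new shell should be able to import the EMAN2 python module. The short check below is a sketch of our own (not from the EMAN2 documentation); it only assumes the standard @EMData@ class exported by the @EMAN2@ module and should print the image dimensions without any import or library errors:

<pre>
# sanity_check.py -- minimal sketch; run in a fresh shell so the
# /etc/profile.d/eman2.* variables above are in effect
from EMAN2 import EMData     # fails if PYTHONPATH or LD_LIBRARY_PATH is wrong

img = EMData(64, 64)         # allocate a small 2-D image in memory
img.to_zero()                # touch the data to exercise the C++ bindings
print img.get_xsize(), img.get_ysize()   # expect: 64 64
</pre>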
h2. Install MyMPI for MPI functions

see http://sparx-em.org/sparxwiki/MPI-installation or https://www.nbcr.net/pub/wiki/index.php?title=MyMPI_Setup

This fixes the following problem:

<pre>
from mpi import mpi_init
ImportError: No module named mpi
</pre>

This module was very difficult to get working; it seems to be a poorly supported python wrapper for MPI. So, what we are going to do is compile the module, rename it, and create a wrapper around it, essentially a wrapper around the wrapper. We can only hope they switch to "mpi4py":http://mpi4py.scipy.org/ in the future.

* First, add the openmpi module so that MyMPI can find it:

<pre>
module add openmpi-i386
  -or-
module add openmpi-x86_64
</pre>

* Download the source: from the main software page http://ncmi.bcm.tmc.edu/ncmi/software/software_details?selected_software=counter_222 scroll to *MPI Support*, click on *Download MPI Support*, then click on *pydusa-1.15es.tgz*
* Extract: <pre>tar zxvf pydusa-1.15es.tgz</pre>
* go into the directory: <pre>cd pydusa-1.15es</pre>
* If you are on a 64-bit machine, you may have problems with the configure script finding the numpy headers: <pre>nano configure</pre> find the lines:

<pre>
elif test -d ${PY_PREFIX}/lib/python$PY_VERSION/site-packages/numpy/core/include; then
   PY_HEADER_NUMPY="-I${PY_PREFIX}/lib/python$PY_VERSION/site-packages/numpy/core/include"
</pre>

and replace them with:

<pre>
elif test -d ${PY_PREFIX}/lib64/python$PY_VERSION/site-packages/numpy/core/include; then
   PY_HEADER_NUMPY="-I${PY_PREFIX}/lib64/python$PY_VERSION/site-packages/numpy/core/include"
</pre>

* Configure. For c shell:

<pre>
setenv MPIROOT /usr/lib64/openmpi
setenv MPIINC /usr/include/openmpi-x86_64
setenv MPILIB ${MPIROOT}/lib
setenv MPIBIN ${MPIROOT}/bin
./configure
</pre>

For bash:

<pre>
export MPIROOT=/usr/lib64/openmpi
export MPIINC=/usr/include/openmpi-x86_64
export MPILIB=${MPIROOT}/lib
export MPIBIN=${MPIROOT}/bin
./configure
</pre>

Please ensure mpicc was found. If it is not found, a version of MPI will be installed that may cause errors. Look for the following error and correct the issue before moving on:

<pre>
configure: error: Unable to find mpicc! mpicc location can be specified with --with-mpicc
</pre>

* compile the source: <pre>make</pre>
* copy the mpi.so to site-packages under a different name:

<pre>
sudo mkdir /usr/lib64/python2.6/site-packages/mympi/
sudo touch /usr/lib64/python2.6/site-packages/mympi/__init__.py
sudo cp -v src/mpi.so /usr/lib64/python2.6/site-packages/mympi/mpi.so
</pre>

* create the wrapper around the wrapper: <pre>sudo nano /usr/lib64/python2.6/site-packages/mpi.py</pre>

<pre>
import ctypes
# pre-load the MPI library with global symbol visibility
mpi = ctypes.CDLL('libmpi.so.1', ctypes.RTLD_GLOBAL)
# re-export the renamed MyMPI bindings
from mympi.mpi import *
</pre>

This needs to be done to avoid the error:

<pre>
python: symbol lookup error: /usr/lib64/openmpi/lib/openmpi/mca_paffinity_hwloc.so: undefined symbol: mca_base_param_reg_int
</pre>

* test 1:

<pre>
python -c 'import mpi'
python -c 'import sys; from mpi import mpi_init; mpi_init(len(sys.argv), sys.argv)'
</pre>

* test 2: <pre>sxisac.py start.hdf</pre> (Note: start.hdf does not need to exist; if it does not, the script should exit with a "file not found" error rather than an import error.)
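For an end-to-end check on a multi-core machine or cluster node, a tiny "hello rank" script can be run under @mpirun@. This is a sketch of our own, not part of pydusa or SPARX; the imported names are assumptions based on the @mpi_init@ call in test 1 above and on how the SPARX scripts use the module, and the file name @mpi_hello.py@ is hypothetical:

<pre>
# mpi_hello.py -- minimal sketch to verify that every MPI rank can load the
# wrapper module; the function names below are assumed to match the pydusa
# bindings used by SPARX and may differ in other versions.
import sys
from mpi import mpi_init, mpi_comm_rank, mpi_comm_size, mpi_finalize, MPI_COMM_WORLD

mpi_init(len(sys.argv), sys.argv)        # same call as in test 1 above
rank = mpi_comm_rank(MPI_COMM_WORLD)     # this process's rank
size = mpi_comm_size(MPI_COMM_WORLD)     # total number of processes
print "hello from rank %d of %d" % (rank, size)
mpi_finalize()
</pre>

Run it with a few processes, e.g. <pre>mpirun -np 4 python mpi_hello.py</pre> and it should print one line per rank.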
h2. Test to see if the code works

see http://blake.bcm.edu/emanwiki/EMAN2/FAQ/EMAN2_unittest

<pre>
cd EMAN2/test/rt
./rt.py
</pre>

h2. Documentation

* http://blake.bcm.edu/emanwiki/EMAN2/Install
* http://blake.bcm.edu/emanwiki/EMAN2/FAQ/eman2BuildFAQ

______

[[Install EMAN|< Install EMAN 1]] | [[Install SPIDER|Install SPIDER >]]

______