Bug #2331
IMAGIC MSA
Status: Closed, 0% done
Description
This is on behalf of Travis, who has done this analysis before, running on Guppy.
His calculation is here: /ami/data15/appion/13mar06b/align/imagicmsa1
The file 'imagicmsa1.appionsub.log' points to an IMAGIC log file:
IMAGIC: /ami/data15/appion/13mar06b/align/imagicmsa1/imagicMultivariateStatisticalAnalysis.batch
{'expid': 11308, 'projectid': 391, 'mask_dropoff': 0.10000000000000001, 'lpfilt': 5, 'rundir': '/ami/data15/appion/13mar06b/align/imagicmsa1', 'description': 'alignment zero tilt', 'lpfilt_imagic': 0.43463599999999997, 'MSAdistance': 'modulation', 'bin': 1, 'hpfilt_imagic': 0.0036219666666666667, 'apix': 1.0865899999999999, 'hpfilt': 600, 'boxsize': 144, 'numiters': 50, 'sessionname': '13mar06b', 'nproc': 8, 'mask_radius': 0.90000000000000002, 'alignid': '814', 'overcorrection': 0.80000000000000004, 'runname': 'imagicmsa1', 'commit': True, 'jobtype': None}
... aligned stack pixel size: 1.08659
... aligned stack box size: 144
Traceback (most recent call last):
  File "/opt/myamisnap/bin/imagicMSA.py", line 327, in <module>
    imagicMSA.start()
  File "/opt/myamisnap/bin/imagicMSA.py", line 294, in start
    apIMAGIC.checkLogFileForErrors(os.path.join(self.params['rundir'], "imagicMultivariateStatisticalAnalysis.log"))
  File "/opt/myamisnap/lib/appionlib/apIMAGIC.py", line 180, in checkLogFileForErrors
    apDisplay.printError("ERROR IN IMAGIC SUBROUTINE, please check the logfile: "+logfile)
  File "/opt/myamisnap/lib/appionlib/apDisplay.py", line 57, in printError
    raise Exception, colorString("\n *** FATAL ERROR ***\n"+text+"\n\a","red")
Exception:
 *** FATAL ERROR ***
ERROR IN IMAGIC SUBROUTINE, please check the logfile: /ami/data15/appion/13mar06b/align/imagicmsa1/imagicMultivariateStatisticalAnalysis.log
The file imagicMultivariateStatisticalAnalysis.log
ends with:
Next useful IMAGIC command: DISPLAY
Choose mode of operation:
FRESH_MSA
REFINE
Please specify option [FRESH_MSA] : YES
**ERROR: Option specified incorrectly
*******************************************************
ERROR in program/subroutine: UIB_OPTION_ERROR
*******************************************************
Which makes it seem as if the input was malformed. His original job file is:
#!/bin/csh
#PBS -l walltime=240:00:00
#PBS -l nodes=2:ppn=4
#PBS -m e
#PBS -j oe
#PBS -V
updateAppionDB.py 6775 R 391
webcaller.py '/opt/myamisnap/bin/appion imagicMSA.py --projectid=391 --alignid=814 --lpfilt=5 --hpfilt=600 --mask_radius=0.9 --mask_dropoff=0.1 --bin=1 --numiters=50 --MSAdistance=modulation --overcorrection=0.8 --description="alignment zero tilt" --nproc=8 --commit --rundir=/ami/data15/appion/13mar06b/align/imagicmsa1 --runname=imagicmsa1 --expid=11308 ' /ami/data15/appion/13mar06b/align/imagicmsa1/imagicmsa1.appionsub.log
updateAppionDB.py 6775 D 391
exit
Gabe already chatted with us and suggested that the version of IMAGIC had perhaps changed, breaking the syntax. But this was already mostly written so I'm submitting it anyway. Hopefully it'll be a quick fix. For now we're sourcing Gabe's .cshrc to try to sync up with his MSA invocation.
Updated by Ryan Hoffman over 11 years ago
- Status changed from New to Assigned
On Garibaldi, if we source Gabe's .cshrc, this now works fine.
Probably some modification could/should be made to the myami module so that all users get the right environment without sourcing Gabe's .cshrc?
Updated by Amber Herold over 11 years ago
I'll put in a request on Monday to update the myami package on Garibaldi.
What is added to the .cshrc file?
Updated by Melody Campbell over 11 years ago
Could someone walk me through sourcing Gabe's .cshrc on garibaldi, please? I tried creating the command through Appion via 'just show command', then starting an interactive session on garibaldi, then sourcing /home/glander/.cshrc, and then submitting, but it doesn't seem to work: it says "$IMAGIC_ROOT directory is not specified."
thanks!!
Updated by Gabriel Lander over 11 years ago
Melody - if you log in to garibaldi & just load the imagic module, and then echo $IMAGIC_ROOT, does that show you "/opt/applications/imagic/100312"?
My .cshrc is just setting the environment variables to use my local install of myami.
Updated by Melody Campbell over 11 years ago
awesome, this works, thanks so much gabe!! have a good weekend!!
Updated by Amber Herold over 11 years ago
Hey Gabe,
Do you have any time today to come chat with me about this? Guppy is not working, and the imagic version has not changed.
Updated by Melody Campbell over 11 years ago
Hi,
On April 16th some change occurred between 13:31 and 17:45.
Here's the last imagic msa that worked for me: /ami/data15/appion/13jan26d/align/imagicmsa17
and here's the first one that DIDN'T work: /ami/data15/appion/13jan26d/align/imagicmsa18
Cheers,
Melody
Updated by Melody Campbell over 11 years ago
Here is one that I ran on garibaldi yesterday, generating the command through longboard/beta, then pasting into garibaldi after starting an interactive session on a myami node, loading the imagic module, and sourcing gabe's cshrc.
/ami/data15/appion/13mar22c/align/imagicmsa-bin4
Updated by Amber Herold over 11 years ago
- Assignee set to Amber Herold
OK.
Garibaldi should work as long as the imagic module is loaded. So in your .cshrc file in your home directory, you should have something like this:
# Additional modules
module load myami/trunk
module load relion/1.1
module load imagic
There is no need to source Gabe's .cshrc file, which should be avoided.
On Guppy, there was an update to openmpi on April 16th which has changed the way that imagic parses parameters in its batch file.
Openmpi had to be recompiled specifically for using with Torque, so that our mpi jobs could run across nodes on Guppy. See #2295 for more info on that.
Today, I will attempt to add logic to our imagic scripts to detect if we are using Torque enabled openmpi, and if so, skip writing out the params that imagic does not parse (the ones between the mpi call and FRESH_MSA).
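A detection along those lines could look like the sketch below. The function names are hypothetical (not existing Appion functions); the check simply looks for "tm" MCA components in `ompi_info` output, as the Open MPI FAQ suggests:

```python
import subprocess

def has_tm_components(ompi_info_output):
    """True if the ompi_info output lists Torque/PBS ('tm') MCA components.

    Matching ": tm " avoids false positives such as the 'ptmalloc2'
    component, whose name also contains the substring 'tm'.
    """
    return any(": tm " in line for line in ompi_info_output.splitlines())

def openmpi_has_torque_support():
    """Run ompi_info from PATH and report whether it is a Torque-enabled build."""
    try:
        output = subprocess.check_output(["ompi_info"]).decode()
    except (OSError, subprocess.CalledProcessError):
        return False  # ompi_info not on PATH, or it failed to run
    return has_tm_components(output)
```

The batch-writing code could then branch on `openmpi_has_torque_support()` when deciding which parameter lines to emit.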
Updated by Gabriel Lander over 11 years ago
Ok, I think we got to the bottom of this.
I'm incorporating the changes that will be necessary if we ever update our version of imagic.
r17527
Updated by Amber Herold over 11 years ago
Well, according to: http://www.open-mpi.org/faq/?category=building#build-rte-tm
I should be able to test if the torque enabled version of mpi is used by looking for the following components:
shell$ ompi_info | grep tm
MCA pls: tm (MCA v1.0, API v1.0, Component v1.0)
MCA ras: tm (MCA v1.0, API v1.0, Component v1.0)
We came to the conclusion that garibaldi works and guppy does not because they have different versions of Open MPI. However:
garibaldi00 home/amber> ompi_info | grep tm
MCA memory: ptmalloc2 (MCA v2.0, API v2.0, Component v1.4.3)
MCA ras: tm (MCA v2.0, API v2.0, Component v1.4.3)
MCA plm: tm (MCA v2.0, API v2.0, Component v1.4.3)

amber@guppy ~] ompi_info | grep tm
MCA ras: tm (MCA v2.0, API v2.0, Component v1.6.4)
MCA plm: tm (MCA v2.0, API v2.0, Component v1.6.4)
MCA ess: tm (MCA v2.0, API v2.0, Component v1.6.4)
It looks like they are both using torque enabled versions. Or perhaps they are different versions, but I'm not sure how to find out.
Sargis, do you have any ideas on how to tell the versions apart, if they are different?
Updated by Sargis Dallakyan over 11 years ago
Component v seems to indicate the version of ompi. I have checked IMAGIC on guppy and it includes openmpi version 1.4.1:
[root@guppy bin]# pwd
/opt/Imagic/openmpi/bin
[root@guppy bin]# ./ompi_info | grep tm
MCA memory: ptmalloc2 (MCA v2.0, API v2.0, Component v1.4.1)
Updated by Amber Herold over 11 years ago
OK. Is imagic using the openmpi at /opt/Imagic/openmpi/bin? It looks like this version of openmpi has NOT been compiled for use with Torque since it does not list the tm components.
So from what I can tell:
- The version of Open MPI that Imagic uses on Guppy is not Torque enabled and is version 1.4.1:
amber@guppy /opt/Imagic/openmpi/bin] ./ompi_info
Package: Open MPI imagic@Obelix Distribution
Open MPI: 1.4.1
Open MPI SVN revision: r22421
Open MPI release date: Jan 14, 2010
- Other software running on Guppy is using the Torque enabled version:
[amber@guppy ~]$ ompi_info
Package: Open MPI root@guppy Distribution
Open MPI: 1.6.4
Open MPI SVN revision: r28081
Open MPI release date: Feb 19, 2013
- On Garibaldi if I only load module imagic, openmpi is configured for torque and ompi_info gives:
garibaldi00 home/amber> ompi_info
Package: Open MPI jcducom@garibaldi00 Distribution
Open MPI: 1.4.3
Open MPI SVN revision: r23834
Open MPI release date: Oct 05, 2010
- Running imagic on garibaldi works fine with torque enabled v1.4.3
- Running on Guppy does not work.
Should we change the version of OpenMPI that Imagic uses on guppy to one that is Torque enabled?
Updated by Gabriel Lander over 11 years ago
I really don't like running openMPI out of the IMAGIC directory, and really think that all programs should use the same openMPI. IMAGIC should be compiled to work with the global torque-enabled install of openmpi.
Updated by Sargis Dallakyan over 11 years ago
Sounds good. As far as I know, the way it's currently set up in appion, IMAGIC is using the default openMPI on the system. I don't know how to compile IMAGIC with the global torque-enabled install of openmpi, but I have found some options that we can change at /opt/Imagic/imagic.drv:
#
# IMAGIC LICENSE (template)
#
# Be careful with changing the lines below
#
RELEASE = 5.7                  # IMAGIC major.minor release
CHECK   = ON                   # automatic version checking (ON/OFF)
CPU     = 1                    # number of CPUs (processes)
MPI     = OPENMPI              # MPI implementation (see: MPI_EXIST in mpi_lib.f)
PBS     = NO                   # PBS support (YES/NO)
FFT     = SINGLETON            # Fourier transform (CUDA/FFTW/SINGLETON)
DISP1   = 1,X_WINDOWS,MYUNX    # XWindows display device
PLOT1   = 1,X_WINDOWS,MYUNX    # plot device XWindows
PLOT2   = 2,PS,MYUNX           # plot device PostScript
PLOT3   = 3,FILE,MYUNX         # plot device IMAGIC image file
CAMERA1 = 1,HEEL-O-MAT,MYUNX   # scanner device
EOF
If someone can show me how to run IMAGIC MSA, I can change "PBS = NO" to "PBS = YES" (or "ON") to see if that helps.
Updated by Amber Herold over 11 years ago
Here is a command to test with
/opt/myamisnap/bin/appion imagicMSA.py --projectid=303 --alignid=31 --lpfilt=5 --hpfilt=600 --mask_radius=0.9 --mask_dropoff=0.1 --bin=1 --numiters=5 --MSAdistance=modulation --overcorrection=0.8 --description="test for guppy" --nproc=1 --commit --rundir=/ami/data15/appion/zz07jul25b/align/imagicmsa5 --runname=imagicmsa5 --expid=8556 --jobtype=alignanalysis
Updated by Sargis Dallakyan over 11 years ago
Thanks Amber. It runs fine for me:
[sargis@guppy ~]$ /opt/myamisnap/bin/appion imagicMSA.py --projectid=303 --alignid=31 --lpfilt=5 --hpfilt=600 --mask_radius=0.9 --mask_dropoff=0.1 --bin=1 --numiters=5 --MSAdistance=modulation --overcorrection=0.8 --description="test for guppy" --nproc=1 --commit --rundir=/ami/data15/appion/zz07jul25b/align/imagicmsa5 --runname=imagicmsa5 --expid=8556 --jobtype=alignanalysis
... Time stamp: 13apr24l51
... Function name: imagicMSA
... Appion directory: /opt/myamisnap/lib
... Using split database
Connected to database: 'ap303'
... Committing data to database
... Run directory: /ami/data15/appion/zz07jul25b/align/imagicmsa5
... Writing function log to: imagicMSA.log
... Running on host: guppy
... Uploading ScriptData....
... Running on host: guppy
... Found 8 processors on this machine
... Running on host: guppy
... Running Appion version 'trunk'
... copying aligned stack into working directory for operations with IMAGIC
{'bin': 1, 'lpfilt_imagic': 0.65199999999999991, 'hpfilt_imagic': 0.0054333333333333326, 'expid': 8556, 'overcorrection': 0.80000000000000004, 'description': 'test for guppy', 'projectid': 303, 'boxsize': 120, 'mask_dropoff': 0.10000000000000001, 'apix': 1.6299999999999999, 'hpfilt': 600, 'alignid': '31', 'runname': 'imagicmsa5', 'nproc': 1, 'commit': True, 'rundir': '/ami/data15/appion/zz07jul25b/align/imagicmsa5', 'MSAdistance': 'modulation', 'lpfilt': 5, 'mask_radius': 0.90000000000000002, 'jobtype': 'alignanalysis', 'numiters': 5}
... aligned stack pixel size: 1.63
... aligned stack box size: 120
Running IMAGIC .batch file: See imagicMultivariateStatisticalAnalysis.log for details
IMAGIC: /ami/data15/appion/zz07jul25b/align/imagicmsa5/imagicMultivariateStatisticalAnalysis.batch
... completed in 25.27 sec
EMAN: proc2d /ami/data15/appion/zz07jul25b/align/imagicmsa5/eigenimages.img /ami/data15/appion/zz07jul25b/align/imagicmsa5/eigenimages.img inplace
... inserting Align Analysis Run parameters into database
... Alignment time: 25.28 sec
... Database Insertion time: 31.48 msec
... Closing out function log: imagicMSA.log
... Ended at Wed, 24 Apr 2013 11:52:19
... Memory increase during run: 2.918 MB
Total run time: 28.49 sec
... Run directory: /ami/data15/appion/zz07jul25b/align/imagicmsa5
[sargis@guppy ~]$ which mpirun
/usr/bin/mpirun
[sargis@guppy ~]$ more /ami/data15/appion/zz07jul25b/align/imagicmsa5/
eigenimages.hed  imagicMSA.log                                  my_msa.lis   start.img
eigenimages.img  imagicMultivariateStatisticalAnalysis.batch*   my_msa.plt   thread000.log
eigenpixels.hed  imagicMultivariateStatisticalAnalysis.log      pixcoos.hed  thread001.log
eigenpixels.img  msamask.hed                                    pixcoos.img  _imagic.dff
msamask.img      start.hed
[sargis@guppy ~]$ more /ami/data15/appion/zz07jul25b/align/imagicmsa5/imagicMSA.log
[ sargis@guppy: Wed Apr 24 11:51:51 2013 ]
/opt/myamisnap/bin/imagicMSA.py \
 --projectid=303 --alignid=31 --lpfilt=5 --hpfilt=600 --mask_radius=0.9 \
 --mask_dropoff=0.1 --bin=1 --numiters=5 --MSAdistance=modulation \
 --overcorrection=0.8 --description='test for guppy' --nproc=1 \
 --commit --rundir=/ami/data15/appion/zz07jul25b/align/imagicmsa5 \
 --runname=imagicmsa5 --expid=8556 --jobtype=alignanalysis
[Wed Apr 24 11:52:19 2013] finished run of imagicMSA
Updated by Gabriel Lander over 11 years ago
The example Amber sent only runs on 1 processor and doesn't use the MPI version.
Run with more than 1 processor by changing the "nproc" parameter in the command.
Also, check the batch file created by appion to make sure that it is using the global mpirun install, and not IMAGIC's.
you should see a file "imagicMultivariateStatisticalAnalysis.batch" in the run directory, and it should contain a line that looks like:
mpirun -np 8 -x IMAGIC_BATCH /opt/Imagic/align/msa.e_mpi << EOF
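Gabe's check can be scripted. Here is a minimal sketch (the helper name is mine, not part of Appion) that reports which mpirun executable an Appion-generated batch file would invoke:

```python
def mpirun_in_batch(batch_text):
    """Return the mpirun path invoked by an IMAGIC .batch script, or None.

    Scans the script text for the mpirun invocation (a line containing
    both 'mpirun' and '-np') and returns its first token, e.g. a bare
    "mpirun" means whatever is first on PATH, while
    "/opt/Imagic/openmpi/bin/mpirun" means IMAGIC's bundled copy.
    """
    for line in batch_text.splitlines():
        if "mpirun" in line and "-np" in line:
            return line.split()[0]
    return None
```

Usage would be something like `mpirun_in_batch(open("imagicMultivariateStatisticalAnalysis.batch").read())` from the run directory.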
Updated by Gabriel Lander over 11 years ago
mpirun isn't working for me on guppy. Using nodes=2:ppn=4, executing the command:
mpirun -np 8 -x IMAGIC_BATCH /opt/Imagic/msa/msa.e_mpi
I get this error:
--------------------------------------------------------------------------
A requested component was not found, or was unable to be opened. This
means that this component is either not installed or is unable to be
used on your system (e.g., sometimes this means that shared libraries
that the component requires are unable to be found/loaded). Note that
Open MPI stopped checking at the first component that it did not find.
Host: guppy-16
Framework: ess
Component: tm
--------------------------------------------------------------------------
[guppy-16:07162] [[INVALID],INVALID] ORTE_ERROR_LOG: Error in file runtime/orte_init.c at line 120
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_ess_base_open failed
--> Returned value Error (1) instead of ORTE_SUCCESS
-------------------------------------------------------------------------
[guppy-16:07162] [[INVALID],INVALID] ORTE_ERROR_LOG: Error in file orted/orted_main.c at line 325
Updated by Amber Herold over 11 years ago
Sorry Sargis, on the test command you need to use nproc > 1 to use MPI.
Updated by Amber Herold over 11 years ago
Gabe, looks like it could not find the torque component of openmpi on guppy-16.
Updated by Sargis Dallakyan over 11 years ago
- Status changed from Assigned to In Code Review
My attempt to compile IMAGIC module msa.e fails with the following message.
[root@guppy Imagic]# ./fori_mpi.b msa.e mpi
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 Compiling IMAGIC module msa.e  -- Linux version --
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ABSOFT or INTEL Fortran required for MPI version
ERROR in compilation of msa.e.f
I'm now trying to get the compiled version of msa.e_mpi to work on guppy. I'm using imagicMSA.php to generate the imagicMSA.py command:
/opt/myamisnap/bin/appion imagicMSA.py --projectid=303 --alignid=31 --lpfilt=5 --hpfilt=600 --mask_radius=0.9 --mask_dropoff=0.1 --bin=1 --numiters=50 --MSAdistance=modulation --overcorrection=0.8 --description="" --nproc=8 --commit --rundir=/ami/data15/appion/zz07jul25b/align/imagicmsa9 --runname=imagicmsa9 --expid=8556 --jobtype=alignanalysis
When I run this on guppy I'm getting:
[sargis@guppy ~]$ /opt/myamisnap/bin/appion imagicMSA.py --projectid=303 --alignid=31 --lpfilt=5 --hpfilt=600 --mask_radius=0.9 --mask_dropoff=0.1 --bin=1 --numiters=50 --MSAdistance=modulation --overcorrection=0.8 --description="" --nproc=8 --commit --rundir=/ami/data15/appion/zz07jul25b/align/imagicmsa9 --runname=imagicmsa9 --expid=8556 --jobtype=alignanalysis
... Time stamp: 13apr26k15
... Function name: imagicMSA
... Appion directory: /opt/myamisnap/lib
... Using split database
Connected to database: 'ap303'
... Committing data to database
... Run directory: /ami/data15/appion/zz07jul25b/align/imagicmsa9
... Writing function log to: imagicMSA.log
... Running on host: guppy
... Uploading ScriptData....
... Running on host: guppy
... Found 8 processors on this machine
... Running on host: guppy
... Running Appion version 'trunk'
... copying aligned stack into working directory for operations with IMAGIC
{'bin': 1, 'lpfilt_imagic': 0.65199999999999991, 'hpfilt_imagic': 0.0054333333333333326, 'expid': 8556, 'overcorrection': 0.80000000000000004, 'description': '', 'projectid': 303, 'boxsize': 120, 'mask_dropoff': 0.10000000000000001, 'apix': 1.6299999999999999, 'hpfilt': 600, 'alignid': '31', 'runname': 'imagicmsa9', 'nproc': 8, 'commit': True, 'rundir': '/ami/data15/appion/zz07jul25b/align/imagicmsa9', 'MSAdistance': 'modulation', 'lpfilt': 5, 'mask_radius': 0.90000000000000002, 'jobtype': 'alignanalysis', 'numiters': 50}
... aligned stack pixel size: 1.63
... aligned stack box size: 120
Running IMAGIC .batch file: See imagicMultivariateStatisticalAnalysis.log for details
IMAGIC: /ami/data15/appion/zz07jul25b/align/imagicmsa9/imagicMultivariateStatisticalAnalysis.batch
It stayed like this forever; I hit Control-C and got:
^CTraceback (most recent call last):
  File "/opt/myamisnap/bin/imagicMSA.py", line 341, in <module>
Traceback (most recent call last):
  File "/opt/myamisnap/bin/appion", line 185, in <module>
    imagicMSA.start()
  File "/opt/myamisnap/bin/imagicMSA.py", line 308, in start
    rcode = subprocess.call(commandLine)
    apIMAGIC.checkLogFileForErrors(os.path.join(self.params['rundir'], "imagicMultivariateStatisticalAnalysis.log"))
  File "/usr/lib64/python2.6/subprocess.py", line 480, in call
  File "/opt/myamisnap/lib/appionlib/apIMAGIC.py", line 180, in checkLogFileForErrors
    apDisplay.printError("ERROR IN IMAGIC SUBROUTINE, please check the logfile: "+logfile)
  File "/opt/myamisnap/lib/appionlib/apDisplay.py", line 57, in printError
    raise Exception, colorString("\n *** FATAL ERROR ***\n"+text+"\n\a","red")
Exception:
 *** FATAL ERROR ***
ERROR IN IMAGIC SUBROUTINE, please check the logfile: /ami/data15/appion/zz07jul25b/align/imagicmsa9/imagicMultivariateStatisticalAnalysis.log
    return p.wait(timeout=timeout)
  File "/usr/lib64/python2.6/subprocess.py", line 1296, in wait
    pid, sts = _eintr_retry_call(os.waitpid, self.pid, 0)
  File "/usr/lib64/python2.6/subprocess.py", line 462, in _eintr_retry_call
    return func(*args)
KeyboardInterrupt
The log file /ami/data15/appion/zz07jul25b/align/imagicmsa9/imagicMultivariateStatisticalAnalysis.log has the same error message as in the original bug Description:
Please specify option [MODULATION] : MODULATION
Input (= output) file (aligned "images") [my_ali] : start
Use local scratch files [YES] : MSAMASK
**WARNING: answer corrected to YES
**WARNING: Local directories for scratch files not defined in lognames.drv
No local scratch files used
Input MSA mask file [my_msamask] : eigenimages
**ERROR: No such IMAGIC image file available
*******************************************************
ERROR in program/subroutine: UIB_CHECK_EXIST
From /ami/data15/appion/zz07jul25b/align/imagicmsa9/imagicMultivariateStatisticalAnalysis.batch I see that it needs "YES" or "NO" between start and msamask:
...
/opt/Imagic/openmpi/bin/mpirun -np 8 -x IMAGIC_BATCH /opt/Imagic/msa/msa.e_mpi <<EOF >> imagicMultivariateStatisticalAnalysis.log
YES
8
FRESH_MSA
modulation
start
msamask
eigenimages
The working imagic msa has NO between start and msamask:
[sargis@guppy bin]$ more /ami/data15/appion/13jan26d/align/imagicmsa17/imagicMultivariateStatisticalAnalysis.batch
#!/bin/csh -f
setenv IMAGIC_BATCH 1
...
/opt/Imagic//openmpi/bin/mpirun -np 8 -x IMAGIC_BATCH /opt/Imagic//msa/msa.e_mpi <<EOF >> imagicMultivariateStatisticalAnalysis.log
YES
8
FRESH_MSA
modulation
start
NO
msamask
...
I've made changes in r17539 to add the missing "NO\n". Please review this change. We can test it after /opt/myamisnap is updated overnight.
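For reference, the shape of that fix can be sketched as follows. `msa_batch_input` is an illustrative helper, not the actual r17539 code; the point is that the heredoc answers are positional, so the YES/NO answer for the "Use local scratch files" prompt must appear between the input stack and the mask file, or every following answer is consumed by the wrong prompt:

```python
def msa_batch_input(nproc, distance, local_scratch=False):
    """Build the stdin lines fed to msa.e_mpi via the <<EOF heredoc.

    Hypothetical sketch: answer order mirrors the working batch file,
    including the scratch-files answer that was missing.
    """
    lines = [
        "YES",          # run in parallel (MPI)
        str(nproc),     # number of processors
        "FRESH_MSA",    # mode of operation
        distance,       # MSA distance criterion, e.g. "modulation"
        "start",        # input (aligned) stack
        "YES" if local_scratch else "NO",  # use local scratch files (the missing answer)
        "msamask",      # input MSA mask file
    ]
    return "\n".join(lines) + "\n"
```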
Updated by Gabriel Lander over 11 years ago
- Assignee changed from Amber Herold to Sargis Dallakyan
This will have to be done for all MPI versions of IMAGIC programs (see below). Our sysadmin at Berkeley was able to compile Imagic to work with openmpi; I'm adding him to the thread, hopefully he has some advice.
/opt/Imagic/align/alimass.e_mpi
/opt/Imagic/align/mralign.e_mpi
/opt/Imagic/incore/incprep.e_mpi
/opt/Imagic/pick/pick_m_all.e_mpi
/opt/Imagic/threed/true3d.e_mpi
/opt/Imagic/align/mralign_brute.e_mpi
/opt/Imagic/angrec/euler.e_mpi
/opt/Imagic/msa/msa.e_mpi
/opt/Imagic/threed/true_3d.e_mpi
Updated by Sargis Dallakyan over 11 years ago
Which version of IMAGIC was at Berkeley and did you have ABSOFT or INTEL Fortran compiler? We have IMAGIC 5.7 on guppy that includes Open MPI: 1.4.1. I have compiled openmpi-1.4.5-2.src.rpm with Torque support and can replace /opt/Imagic/openmpi so that it can run parallel (MPI) jobs across nodes, if needed.
Updated by Sargis Dallakyan over 11 years ago
- Status changed from In Code Review to Assigned
Thank you Tom. Sounds good. I'm archiving this message for now. Looking forward to implementing similar changes in our system for imagic as well.
On 04/26/2013 01:47 PM, Tom Houweling wrote:
This is what I do with all versions of imagic on our torque cluster here at Berkeley, and so far it has been working fine. I do not use the openmpi that is bundled with imagic but instead build my own openmpi 1.4.x with infiniband and torque support. (1.6.x does not work.)

Here is my config for Centos 6.x:

configure --prefix=/opt/qb3/openmpi-1.4.5 --with-tm=/opt/torque --with-openib=/usr -enable-orterun-prefix-by-default --enable-shared --enable-static --disable-dlopen

At runtime openmpi is in the path and LD_LIBRARY_PATH before imagic. We use the "modules" environment to pick the version of imagic we want. I only use the GNU compilers.

Tom
--
Tom Houweling - QB3 Nogales Lab
Computer Analyst @ Howard Hughes Medical Institute
University of California Berkeley, 708D Stanley Hall, Berkeley, CA 94720
Updated by Sargis Dallakyan over 10 years ago
- Status changed from Assigned to In Test
- Assignee changed from Sargis Dallakyan to Gabriel Lander
- Affected Version changed from Appion/Leginon 2.1.0 to Appion/Leginon 2.2.0
Searched for the calls to *.e_mpi programs and made changes in topologyAlignment.py (r18486). This was the only place left running the MPI version of IMAGIC programs with the global mpirun instead of str(self.imagicroot)+"/openmpi/bin/mpirun". This is expected to work on guppy, garibaldi, and external distributions like the one in Berkeley.
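The substance of that change can be sketched like this; the helper name and signature are illustrative, not the actual topologyAlignment.py code:

```python
import os

def imagic_mpirun_prefix(imagicroot, nproc):
    """Build the mpirun prefix for invoking IMAGIC's *.e_mpi programs.

    Illustrative sketch of the r18486 idea: use the openmpi bundled
    under $IMAGIC_ROOT rather than whatever mpirun happens to be first
    on PATH, so the MPI runtime always matches the one the *.e_mpi
    binaries were built against.
    """
    mpirun = os.path.join(str(imagicroot), "openmpi", "bin", "mpirun")
    return "%s -np %d -x IMAGIC_BATCH " % (mpirun, nproc)
```

The returned prefix would then be prepended to the *.e_mpi path when writing the batch file.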
Updated by Gabriel Lander over 10 years ago
- Status changed from In Test to Closed