Bug #4348 (open): CL2D upload to appion

Added by Reginald McNulty about 8 years ago. Updated about 8 years ago.

Status: New
Priority: Normal
Assignee: -
Category: -
Target version: -
Start date: 07/28/2016
Due date:
% Done: 0%
Estimated time:
Affected Version: Appion/Leginon 3.2
Show in known bugs: No
Workaround:

Description

I get an MPI error when running CL2D with myami/3.2. When I try with an older myami/trunk, the CL2D run directory contains class average stacks, but the results have not been uploaded to Appion. The same command has been successful with other runs. Is there a way to continue the upload? Errors from both runs are below:

MPI Error:
the local scratch directory for this session is: /scratch/rmcnulty/1329166.garibaldi01-adm.cluster.net
the distributed scratch directory for this session is: /gpfs/work/rmcnulty/1329166.garibaldi01-adm.cluster.net
Warning: no access to tty (Bad file descriptor).
Thus no job control in this shell.
... Time stamp: 16jul28u07
... Function name: runXmippCL2D
... Appion directory: /opt/applications/myami/3.2/lib/python2.7/site-packages
... Processing hostname: nodeb0226
... Using split database
Connected to database: 'ap367'
... Committing data to database
... Getting stack data for stackid=388
Old stack info: 'box192bin2'

Traceback (most recent call last):
File "/opt/applications/myami/3.2/bin/runXmippCL2D.py", line 659, in <module>
cl2d = CL2D
File "/opt/applications/myami/3.2/lib/python2.7/site-packages/appionlib/appionScript.py", line 85, in init
self.checkConflicts()
File "/opt/applications/myami/3.2/bin/runXmippCL2D.py", line 106, in checkConflicts
apDisplay.printError("There is no MPI installed")
File "/opt/applications/myami/3.2/lib/python2.7/site-packages/appionlib/apDisplay.py", line 65, in printError
raise Exception, colorString("\n * FATAL ERROR *\n"+text+"\n\a","red")
Exception: * FATAL ERROR *
There is no MPI installed
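
For reference, the traceback shows checkConflicts() in runXmippCL2D.py aborting because no MPI launcher is visible inside the batch job's environment, even though OpenMPI 1.4.3 exists on the cluster (see the ldd output further down). Below is a minimal standalone sketch of that kind of check, which could be run from the job script before launching runXmippCL2D.py; the helper name and error message are illustrative, not taken from the Appion source.

import os
import subprocess

def find_mpirun():
    # Hypothetical helper: walk PATH the way the shell would and return
    # the first executable named "mpirun", or None if it is not visible.
    for directory in os.environ.get("PATH", "").split(os.pathsep):
        candidate = os.path.join(directory, "mpirun")
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None

mpirun = find_mpirun()
if mpirun is None:
    # Usually fixed by loading the MPI module in the job script
    # (e.g. "module load openmpi/1.4.3") before calling runXmippCL2D.py.
    raise RuntimeError("mpirun not found on PATH in this job environment")
subprocess.check_call([mpirun, "--version"])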

No results uploaded to Appion (error with myami/trunk):
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-4096.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-8192.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-12288.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-16384.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-20480.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-24576.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-28672.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-32768.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-36864.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-40960.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-45056.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-49152.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-53248.hed (4096 kB)
... /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack-54986.hed (1738 kB)
... wrote 54986 particles to file /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack.hed
... size match 53.7 MB vs. 53.7 MB
... finished stack merge of /gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack.hed in 2 min 25 sec
merged 54986 particles in 21 min 12 sec
... Getting stack data for stackid=388

lines= ['\tlibmpi.so.0 => /opt/applications/openmpi/1.4.3/gnu/lib/libmpi.so.0 (0x00007f74d95a5000)\n', '\tlibmpi_cxx.so.0 => /opt/applications/openmpi/1.4.3/gnu/lib/libmpi_cxx.so.0 (0x00007f74d938a000)\n', '\tlibopen-rte.so.0 => /opt/applications/openmpi/1.4.3/gnu/lib/libopen-rte.so.0 (0x00007f74d90fd000)\n', '\tlibopen-pal.so.0 => /opt/applications/openmpi/1.4.3/gnu/lib/libopen-pal.so.0 (0x00007f74d8ea4000)\n']
Wrote 54986 particles to file
/gpfs/group/em/appion/rmcnulty/16jun10b/align/cl2d1/alignedStack.hed
Traceback (most recent call last):
File "/opt/applications/myami/trunk/bin/runXmippCL2D.py", line 660, in <module>
cl2d.start()
File "/opt/applications/myami/trunk/bin/runXmippCL2D.py", line 632, in start
self.apix = apStack.getStackPixelSizeFromStackId(self.runparams['stackid'])*self.runparams['bin']
File "/opt/applications/myami/trunk/lib/python2.6/site-packages/appionlib/apStack.py", line 706, in getStackPixelSizeFromStackId
stackdata = getOnlyStackData(stackId, msg=msg)
File "/opt/applications/myami/trunk/lib/python2.6/site-packages/appionlib/apStack.py", line 198, in getOnlyStackData
stackdata = appiondata.ApStackData.direct_query(stackid)
File "/opt/applications/myami/trunk/lib/python2.6/site-packages/sinedon/data.py", line 428, in direct_query
result = db.direct_query(cls, dbid, **kwargs)
File "/opt/applications/myami/trunk/lib/python2.6/site-packages/sinedon/dbdatakeeper.py", line 59, in direct_query
result = self.dbd.multipleQueries(queryinfo, readimages=readimages)
File "/opt/applications/myami/trunk/lib/python2.6/site-packages/sinedon/sqldict.py", line 248, in multipleQueries
return multipleQueries(self.db, queryinfo, readimages)
File "/opt/applications/myami/trunk/lib/python2.6/site-packages/sinedon/sqldict.py", line 522, in init
self.execute()
File "/opt/applications/myami/trunk/lib/python2.6/site-packages/sinedon/sqldict.py", line 535, in execute
c = self._cursor()
File "/opt/applications/myami/trunk/lib/python2.6/site-packages/sinedon/sqldict.py", line 525, in _cursor
self.db.ping()
_mysql_exceptions.OperationalError: (2006, 'MySQL server has gone away')
Exception _mysql_exceptions.OperationalError: (2006, 'MySQL server has gone away') in <bound method CL2D.__del__ of <__main__.CL2D object at 0x1ca94d0>> ignored
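
The second failure is MySQL error 2006 ("MySQL server has gone away"), which typically means the job's database connection sat idle past the server's wait_timeout while the 21-minute stack merge ran, so the next query (getStackPixelSizeFromStackId) found a dead connection. A minimal sketch of the usual client-side guard follows, assuming MySQLdb is the driver underneath sinedon; the wrapper function is illustrative and not part of the sinedon API.

import MySQLdb

def query_with_reconnect(conn, sql, params=None):
    # Ping the server before querying; ping(True) asks the MySQLdb client
    # to reopen the connection if the server dropped it during a long
    # non-database step (here, the stack merge).
    try:
        conn.ping(True)
    except MySQLdb.OperationalError:
        pass  # if the ping itself fails, let the query below raise normally
    cursor = conn.cursor()
    cursor.execute(sql, params or ())
    return cursor.fetchall()

Raising wait_timeout on the MySQL server side is another common workaround for long-running jobs like this one.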
