Bug #997
openEMDB to Model memory error
Download the EMDB map, bin it by a factor using itp, convert the map back to MRC using tomrc, move the map to /ami/data00/temp, then use "Upload Model" in Appion.
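For reference, here is roughly the same workaround expressed as a Python sketch using numpy and the mrcfile package instead of the itp/tomrc tools named above; the accession, bin factor, and output names are placeholders, modern Python 3 is assumed, and this is not the Appion code itself.

import gzip
import shutil
import urllib.request

import mrcfile
import numpy as np

EMDB_ID = "1571"   # accession from the logs below; placeholder otherwise
BIN = 2            # hypothetical bin factor
URL = ("ftp://ftp.ebi.ac.uk/pub/databases/emdb/structures/"
       "EMD-%s/map/emd_%s.map.gz" % (EMDB_ID, EMDB_ID))
GZ_PATH = "emd_%s.map.gz" % EMDB_ID
MAP_PATH = "emd_%s.map" % EMDB_ID
OUT_PATH = "emd_%s_bin%d.mrc" % (EMDB_ID, BIN)

# 1. download the compressed map
urllib.request.urlretrieve(URL, GZ_PATH)

# 2. decompress to disk in chunks instead of reading it all into memory
with gzip.open(GZ_PATH, "rb") as src, open(MAP_PATH, "wb") as dst:
    shutil.copyfileobj(src, dst, length=16 * 1024 * 1024)

# 3. bin by block-averaging BIN x BIN x BIN voxels
with mrcfile.open(MAP_PATH, permissive=True) as m:
    vol = m.data.astype(np.float32)
    voxel = float(m.voxel_size.x)

nz, ny, nx = (d // BIN * BIN for d in vol.shape)
binned = (vol[:nz, :ny, :nx]
          .reshape(nz // BIN, BIN, ny // BIN, BIN, nx // BIN, BIN)
          .mean(axis=(1, 3, 5)))

# 4. write the binned map as MRC; the result is what gets moved to
#    /ami/data00/temp and uploaded with "Upload Model" in Appion
with mrcfile.new(OUT_PATH, overwrite=True) as out:
    out.set_data(binned.astype(np.float32))
    out.voxel_size = voxel * BIN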
Description
EMDB to Model fails to properly upload structures that are "very large". The job either crashes with a memory error, or the model density is inverted (i.e., the threshold is negative and the structure appears inside a box when viewed in Chimera).
Example output from /ami/data00/appion/10oct21z/models/emdb/emdb1571-10nov03o02/emdb1571-10nov03o02.appionsub.log:
!!! WARNING: Volume is very large
EMAN 1.9 Cluster ($Date: 2009/02/18 05:12:22 $).
Run '/usr/local/EMAN/bin/volume help' for detailed help.
Min=-2.739318 Max=4.777329 Mean=-0.000000 Sigma=1.000000 Threshold=1.000000
Threshold Mass
1.000000 25473.826080
1.000000 25473.826080
-2.750000 25473.826080
-2.750000 25473.826080
-2.750000 25473.826080
-2.750000 25473.826080
-2.750000 25473.826080
-2.750000 25473.826080
-2.750000 25473.826080
Image has been modified so threshold=1 is 25473.83 kDa
... finished scaling by mass in 1 min 55 sec
!!! WARNING: Volume is very large
Example command line output from /ami/data00/appion/10oct21z/models/emdb/emdb1571-10nov04k27/modelFromEMDB.log:
... retrieving emdb file: ftp://ftp.ebi.ac.uk/pub/databases/emdb/structures/EMD-1571/map/emd_1571.map.gz
Traceback (most recent call last):
File "/ami/sw/packages/myami-2.1/appion/bin/modelFromEMDB.py", line 214, in ?
emdbmodel.start()
File "/ami/sw/packages/myami-2.1/appion/bin/modelFromEMDB.py", line 181, in start
emdbfile = self.fetchEMDB(self.params['emdbid'], ccp4name)
File "/ami/sw/packages/myami-2.1/appion/bin/modelFromEMDB.py", line 133, in fetchEMDB
g = gzip.open(data,'r').read()
File "/usr/lib64/python2.4/gzip.py", line 218, in read
self._read(readsize)
File "/usr/lib64/python2.4/gzip.py", line 278, in _read
self._add_read_data( uncompress )
File "/usr/lib64/python2.4/gzip.py", line 295, in _add_read_data
self.extrabuf = self.extrabuf + data
MemoryError
Updated by Neil Voss about 14 years ago
It is failing on the part where it tries to set the threshold using the mass. I am not really set up to test this, but we have an 800^3 volume that it is trying to read into memory, which is 440 MiB zipped / 1.9 GiB unzipped... In this case it dies when it tries to unzip the file in memory, but reading it into memory alone should not kill the program; I assume we are running on guppy. So maybe we should not unzip it in Python and instead unzip it using the terminal. Later, I would still like to read it into memory to calculate some statistics.
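One possible shape for that change, sketched in modern Python rather than the actual modelFromEMDB.py code: decompress to disk, either by shelling out to gunzip ("using the terminal") or by streaming through gzip in chunks, then compute the statistics later from a memory-mapped view instead of a full in-memory copy. The function names and the use of the mrcfile package are assumptions, not part of Appion.

import gzip
import shutil
import subprocess

import mrcfile


def decompress_to_disk(gzpath, outpath, use_shell=False):
    # write the uncompressed map to disk without holding it all in memory
    if use_shell:
        # "unzip it using the terminal": let gunzip do the work
        with open(outpath, "wb") as dst:
            subprocess.check_call(["gunzip", "-c", gzpath], stdout=dst)
    else:
        # stream through gzip in 16 MiB chunks
        with gzip.open(gzpath, "rb") as src, open(outpath, "wb") as dst:
            shutil.copyfileobj(src, dst, length=16 * 1024 * 1024)


def map_statistics(mrcpath):
    # read min/max/mean/sigma from a memory-mapped view of the file,
    # so the 1.9 GiB volume never has to be fully resident at once
    with mrcfile.mmap(mrcpath, mode="r", permissive=True) as m:
        data = m.data
        return (float(data.min()), float(data.max()),
                float(data.mean()), float(data.std()))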
Also, maybe we should add a binning option to the program for people who want to use large models in a smaller form, but we still want to be able to work with the full-size ones too.
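A sketch of what such a binning option might look like, using argparse rather than Appion's own option handling; the --bin flag, the file arguments, and the mrcfile usage are all assumptions, and the block-averaging is the same reshape/mean trick as in the workaround sketch above.

import argparse

import mrcfile
import numpy as np


def bin_volume(vol, factor):
    # block-average the volume by an integer factor (factor=1 leaves it untouched)
    if factor <= 1:
        return vol
    nz, ny, nx = (d // factor * factor for d in vol.shape)
    return (vol[:nz, :ny, :nx]
            .reshape(nz // factor, factor, ny // factor, factor, nx // factor, factor)
            .mean(axis=(1, 3, 5)))


def main():
    parser = argparse.ArgumentParser(description="optionally bin a density map")
    parser.add_argument("mapfile")
    parser.add_argument("outfile")
    parser.add_argument("--bin", type=int, default=1,
                        help="integer binning factor; 1 keeps the full-size map")
    args = parser.parse_args()

    with mrcfile.open(args.mapfile, permissive=True) as m:
        vol = m.data.astype(np.float32)
        voxel = float(m.voxel_size.x)

    binned = bin_volume(vol, args.bin)

    with mrcfile.new(args.outfile, overwrite=True) as out:
        out.set_data(binned.astype(np.float32))
        out.voxel_size = voxel * max(args.bin, 1)


if __name__ == "__main__":
    main()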