How to add a new refinement method » History » Version 13

Dmitry Lyumkis, 08/01/2011 01:54 PM

h1. How to add a new refinement method

h2. Database architecture for refinement methods

The current database schema for every refinement method (both single-model and multi-model) is shown below:

"database architecture for refinements":http://emg.nysbc.org/attachments/955/database_scheme_diagram.pdf

For reference, below is a diagram of the modifications made to the refinement pipeline during the refactoring. The color coding is as follows:

"changes to the database architecture for refinements":http://emg.nysbc.org/attachments/954/database_scheme_diagram_changes.pdf

* all previous database tables / pointers that remained unchanged during the refactoring are blue
* database tables that are completely new are outlined AND filled in red
* database tables that already existed but were modified are outlined in red and filled in white; the new additions are highlighted
* new pointers to other database tables are red; unmodified pointers are blue
* pointers to other database tables are all grouped under "REFS"; if "REFS" is highlighted, new pointers have been added

h2. How to add a new refinement

# determine the name of the new table in the database. In most cases this will simply be called "ApYourPackageRefineIterData". Unless there are per-particle parameters that you would like to save, this table should contain all of your package-specific parameters.
# write a job setup script in python (see example below).
# write an upload script in python (see example below). Alternatively, write a script that converts your parameters into Appion / 3DEM format (see below), then upload the result as an external package.
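
As a sketch of step 1, the package-specific iteration table could be declared along the following lines. This is illustrative only: the base class here is a plain stub standing in for the real sinedon data class, and the two parameter fields are invented placeholders; consult an existing table such as ApXmippRefineIterData for the actual conventions.

```python
# Hypothetical sketch of a package-specific refinement table (step 1 above).
# "Data" is a stub standing in for the real sinedon base class, and the two
# parameter fields are invented placeholders, not real Appion columns.

class Data(object):
    """Stub base class for illustration only."""
    @classmethod
    def typemap(cls):
        return ()

class ApYourPackageRefineIterData(Data):
    @classmethod
    def typemap(cls):
        # one (column name, column type) pair per package-specific parameter
        return Data.typemap() + (
            ('yourAngularIncrement', float),
            ('yourNumberOfClasses', int),
        )
```

The real definition would live alongside the other Ap*Data table classes so that the upload script can reference it by name.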

h2. Upload refinement script in python

The script should be titled 'uploadYourPackageRefine.py'

This script performs all of the basic operations needed to upload a refinement to the database so that it can be displayed in AppionWeb. The bulk of the work is done by the ReconUploader.py base class, which is inherited by each new uploadYourPackageRefine.py subclass script. This means that the developer's job is simply to ensure that all of the particle / package parameters are passed in a specific format. Effectively, the only things that need to be written in this script are:

# define the basic operations that will be performed: this sets up basic package parameters and calls the converter functions. In the single-model refinement case (example: Xmipp projection-matching):
<pre>
def start(self):

    ### database entry parameters
    package_table = 'ApXmippRefineIterData|xmippParams'

    ### set projection-matching path
    self.projmatchpath = os.path.abspath(os.path.join(self.params['rundir'], self.runparams['package_params']['WorkingDir']))

    ### check for variable root directories between file systems
    apXmipp.checkSelOrDocFileRootDirectoryInDirectoryTree(self.params['rundir'], self.runparams['cluster_root_path'], self.runparams['upload_root_path'])

    ### determine which iterations to upload
    lastiter = self.findLastCompletedIteration()
    uploadIterations = self.verifyUploadIterations(lastiter)

    ### upload each iteration
    for iteration in uploadIterations:

        apDisplay.printColor("uploading iteration %d" % iteration, "cyan")

        ### set package parameters, as they will appear in database entries
        package_database_object = self.instantiateProjMatchParamsData(iteration)

        ### move FSC file to results directory
        oldfscfile = os.path.join(self.projmatchpath, "Iter_%d" % iteration, "Iter_%d_resolution.fsc" % iteration)
        newfscfile = os.path.join(self.resultspath, "recon_%s_it%.3d_vol001.fsc" % (self.params['timestamp'], iteration))
        if os.path.exists(oldfscfile):
            shutil.copyfile(oldfscfile, newfscfile)

        ### create a stack of class averages and reprojections (optional)
        self.compute_stack_of_class_averages_and_reprojections(iteration)

        ### create a text file with particle information
        self.createParticleDataFile(iteration)

        ### create mrc file of map for iteration and reference number
        oldvol = os.path.join(self.projmatchpath, "Iter_%d" % iteration, "Iter_%d_reconstruction.vol" % iteration)
        newvol = os.path.join(self.resultspath, "recon_%s_it%.3d_vol001.mrc" % (self.params['timestamp'], iteration))
        mrccmd = "proc3d %s %s apix=%.3f" % (oldvol, newvol, self.runparams['apix'])
        apParam.runCmd(mrccmd, "EMAN")

        ### make chimera snapshot of volume
        self.createChimeraVolumeSnapshot(newvol, iteration)

        ### instantiate database objects
        self.insertRefinementRunData(iteration)
        self.insertRefinementIterationData(package_table, package_database_object, iteration)

    ### calculate Euler jumps
    self.calculateEulerJumpsAndGoodBadParticles(uploadIterations)

    ### query the database for the completed refinements BEFORE deleting any files ... returns a dictionary of lists
    ### e.g. {1: [5, 4, 3, 2, 1]} means 5 iters completed for refine 1
    complete_refinements = self.verifyNumberOfCompletedRefinements(multiModelRefinementRun=False)
    if self.params['cleanup_files'] is True:
        self.cleanupFiles(complete_refinements)
</pre>

In the multi-model refinement case (example: Xmipp ML3D):
<pre>
def start(self):

    ### database entry parameters
    package_table = 'ApXmippML3DRefineIterData|xmippML3DParams'

    ### set ml3d path
    self.ml3dpath = os.path.abspath(os.path.join(self.params['rundir'], self.runparams['package_params']['WorkingDir'], "RunML3D"))

    ### check for variable root directories between file systems
    apXmipp.checkSelOrDocFileRootDirectoryInDirectoryTree(self.params['rundir'], self.runparams['cluster_root_path'], self.runparams['upload_root_path'])

    ### determine which iterations to upload
    lastiter = self.findLastCompletedIteration()
    uploadIterations = self.verifyUploadIterations(lastiter)

    ### create ml3d_lib.doc file; somewhat of a workaround, but necessary to make projections
    total_num_2d_classes = self.createModifiedLibFile()

    ### upload each iteration
    for iteration in uploadIterations:

        ### set package parameters, as they will appear in database entries
        package_database_object = self.instantiateML3DParamsData(iteration)

        for j in range(self.runparams['package_params']['NumberOfReferences']):

            ### calculate FSC for each iteration using split selfile (selfile requires root directory change)
            self.calculateFSCforIteration(iteration, j+1)

            ### create a stack of class averages and reprojections (optional)
            self.compute_stack_of_class_averages_and_reprojections(iteration, j+1)

            ### create a text file with particle information
            self.createParticleDataFile(iteration, j+1, total_num_2d_classes)

            ### create mrc file of map for iteration and reference number
            oldvol = os.path.join(self.ml3dpath, "ml3d_it%.6d_vol%.6d.vol" % (iteration, j+1))
            newvol = os.path.join(self.resultspath, "recon_%s_it%.3d_vol%.3d.mrc" % (self.params['timestamp'], iteration, j+1))
            mrccmd = "proc3d %s %s apix=%.3f" % (oldvol, newvol, self.runparams['apix'])
            apParam.runCmd(mrccmd, "EMAN")

            ### make chimera snapshot of volume
            self.createChimeraVolumeSnapshot(newvol, iteration, j+1)

            ### instantiate database objects
            self.insertRefinementRunData(iteration, j+1)
            self.insertRefinementIterationData(package_table, package_database_object, iteration, j+1)

    ### calculate Euler jumps
    self.calculateEulerJumpsAndGoodBadParticles(uploadIterations)

    ### query the database for the completed refinements BEFORE deleting any files ... returns a dictionary of lists
    ### e.g. {1: [5, 4, 3, 2, 1], 2: [6, 5, 4, 3, 2, 1]} means 5 iters completed for refine 1 & 6 iters completed for refine 2
    complete_refinements = self.verifyNumberOfCompletedRefinements(multiModelRefinementRun=True)
    if self.params['cleanup_files'] is True:
        self.cleanupFiles(complete_refinements)
</pre>
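
Note that both start() examples write their per-iteration results with the same naming convention: recon_&lt;timestamp&gt;_it&lt;iteration&gt;_vol&lt;reference number&gt;, with the numbers zero-padded to three digits. The small helper below makes the convention explicit; the helper itself is hypothetical, but the format string is copied from the examples above.

```python
# Illustrates the result-file naming convention shared by both start()
# examples. The function name is hypothetical; the format string is taken
# verbatim from the example code.

def resultFileName(timestamp, iteration, refnum=1, ext="mrc"):
    """Build a result file name such as recon_<timestamp>_it005_vol001.mrc."""
    return "recon_%s_it%.3d_vol%.3d.%s" % (timestamp, iteration, refnum, ext)
```

The single-model example always writes volume 001, while the multi-model example passes the reference number j+1 for each of the NumberOfReferences models.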

# write python functions that will convert the parameters. An example of these converters can be found in the python script below:

source:"http://emg.nysbc.org/svn/myami/trunk/appion/bin/uploadXmippRefine.py"

Below is a list of the necessary functions; everything else is optional:

* def __init__(): defines the name of the package
* def findLastCompletedIteration(): finds the last completed iteration in the refinement protocol
* def instantiateProjMatchParamsData(): this one is specific to projection-matching in Xmipp; an equivalent needs to be written for each package that is added
* def compute_stack_of_class_averages_and_reprojections(): creates .img/.hed files that show, for each angular increment, (1) the projection and (2) the class average corresponding to that projection
* def createParticleDataFile(): creates a .txt file that puts all parameters into Appion format. The information in this file is read by the ReconUploader.py class and uploaded to the database.
* def cleanupFiles(): removes all of the redundant or unwanted files created during the refinement procedure
* (optional) def some_function_for_computing_FSC_into_standard_format(): called in start(); it only needs to be written if the FSC file is not already in the specified format
* (optional) def parseFileForRunParameters(): this is a BACKUP. It parses the output files created by the refinement to determine the parameters that were specified. It is only needed if the parameters could not be found in the .pickle file created during job setup.
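
To make the list above more concrete, here is a rough sketch of what findLastCompletedIteration() could look like for the projection-matching directory layout used in the single-model example. The Iter_&lt;n&gt; directory and reconstruction file names are taken from that example code; everything else is an assumption, not the actual Appion implementation.

```python
import os
import re

def findLastCompletedIteration(projmatchpath):
    """Return the highest iteration number whose reconstruction exists.

    Assumes one Iter_<n> directory per iteration containing
    Iter_<n>_reconstruction.vol, as in the projection-matching example.
    """
    lastiter = 0
    for name in os.listdir(projmatchpath):
        match = re.match(r"Iter_(\d+)$", name)
        if match is None:
            continue
        iteration = int(match.group(1))
        # treat an iteration as complete only if its volume was written
        volume = os.path.join(projmatchpath, name,
                              "Iter_%d_reconstruction.vol" % iteration)
        if os.path.exists(volume) and iteration > lastiter:
            lastiter = iteration
    return lastiter
```

A multi-model package would use the same idea but check for all of its per-reference volumes before counting the iteration as complete.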