Feature #2839
Implement Iterative Stable Alignment and Clustering (ISAC) on a 2-D image stack
Status: Open (0% done)
Files
Updated by Neil Voss over 10 years ago
- File combine_generations.py added
- File sort_align_averages2.py added
Adding Neil's files.
Updated by Dmitry Lyumkis over 10 years ago
Neil, my initial code is here:
/ami/data16/appion/13feb21b/align/sxisac_test/old/align.py
It has been tested, but not completely thoroughly. You will need to modify lines 47-51 with the original particle numbers. You will also need to build in the combined class-average mapping, because at the moment I am aligning to each generation consecutively.
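For anyone picking this up, here is a minimal sketch of the generation-to-original-particle bookkeeping that the combined class-average mapping needs (illustrative Python, not the actual align.py or combine_generations.py code): each ISAC generation classifies only the particles left over from the previous generations, so its class membership lists are local indices that must be mapped back to original particle numbers.

# Illustrative sketch only, not the actual combine_generations.py code.
# Assumes each generation reports class membership as local 0-based indices
# into the particles that survived the previous generations.
def map_generations_to_original(generations, original_ids):
    remaining = list(original_ids)
    mapped = []
    for gen_classes in generations:
        # translate local indices to original particle numbers
        mapped.append([[remaining[i] for i in cls] for cls in gen_classes])
        # drop the particles accounted for in this generation before the next one
        accounted = sorted({i for cls in gen_classes for i in cls}, reverse=True)
        for i in accounted:
            del remaining[i]
    return mapped

# 6 particles; generation 1 classifies local indices 0,2 and 1,3;
# generation 2 then only sees the leftovers (original particles 5 and 6)
print(map_generations_to_original([[[0, 2], [1, 3]], [[0, 1]]], [1, 2, 3, 4, 5, 6]))
# -> [[[1, 3], [2, 4]], [[5, 6]]]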
Updated by Neil Voss over 10 years ago
http://longboard.scripps.edu/betamyamiweb/processing/alignlist.php?expId=12679
First alignment uploaded
Updated by Dmitry Lyumkis over 10 years ago
here's the command that I'm using to test:
rm /ami/data00/appion/12dec05a/align/isactest/* ; /home/dlyumkis/myami/appion/bin/runSparxISAC.py \
  --stack=355 --generations=2 --projectid=354 --num-part=100 \
  --remoterundir=/ami/data00/appion/12dec05a/align/isactest \
  --rundir=/ami/data00/appion/12dec05a/align/isactest \
  --nproc=8 --runname=isactest --localhost=guppy.scripps.edu \
  --jobtype=sparxisac
we still need to pack the results, move them over to remote rundir, and qsub the resulting .commands file on the cluster.
Dmitry
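For reference, a rough sketch of those three remaining steps (pack, transfer, submit) driven from Python; the host, paths, and file names are placeholders, and the real version belongs in the appion run/launch scripts.

# Rough sketch of the remaining steps: pack the results, move them to the
# remote rundir, and qsub the generated .commands file. All names here are
# illustrative placeholders.
import os
import subprocess
import tarfile

def pack_transfer_submit(rundir, remotehost, remoterundir, commands_file):
    # 1. pack the local results (tarball kept next to, not inside, the rundir)
    tarball = rundir.rstrip('/') + '-results.tar.gz'
    with tarfile.open(tarball, 'w:gz') as tar:
        tar.add(rundir, arcname=os.path.basename(rundir.rstrip('/')))
    # 2. move them over to the remote rundir
    subprocess.check_call(['rsync', '-av', tarball, '%s:%s/' % (remotehost, remoterundir)])
    # 3. qsub the .commands file on the cluster
    subprocess.check_call(['ssh', remotehost, 'cd %s && qsub %s' % (remoterundir, commands_file)])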
Updated by Sargis Dallakyan over 10 years ago
I have a test page for the runSparxISAC.py Launcher at http://cronus3/~sargis/myamiweb/processing/runISAC.php?expId=8556:
Just Show Command currently displays this:
/ami/data00/dev/sargis/appion runSparxISAC.py --description="test" --stack=129 --num-part=107 --lowpass=10 --highpass=2000 --bin=2 --nproc=8 --commit --nodes=2 --ppn=4 --walltime=240 --cput=2400 --rundir=/ami/data15/appion/zz07jul25b/align/ISAC41 --runname=ISAC41 --projectid=303 --expid=8556 --jobtype=sparxisac
Updated by Neil Voss over 10 years ago
- Status changed from New to Assigned
- Assignee set to Dmitry Lyumkis
you're in charge
Updated by Dmitry Lyumkis over 10 years ago
- Status changed from Assigned to New
- Assignee deleted (Dmitry Lyumkis)
Committing my revisions so far for running. To test:
runJob.py --stack=355 --generations=2 --projectid=354 --num-part=1000 --remoterundir=/ami/data00/appion/12dec05a/align/isactest2 --rundir=/ami/data00/appion/12dec05a/align/isactest2 --nodes=8 --ppn=4 --mem=48 --runname=isactest2 --localhost=guppy.scripps.edu --jobtype=sparxisac --ou=25 --expid=10755 --lp=20 --hp=400 --bin=2 --thld_err=5:10
The above command will create a jobfile in the rundir with all the relevant commands to pre-process the stack and run ISAC. It will then launch the jobfile. This needs to be synced with the webserver launching and with the upload, i.e. the webserver needs to pass the appropriate parameters, and the uploader needs to read in the pickle file that is generated by this command.
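The pickle mentioned here is just the run parameters serialized to disk; a minimal sketch of the hand-off between the run wrapper and the uploader, with an illustrative file name and keys since the real ones are whatever the run script writes into the rundir.

# Minimal sketch of the pickle hand-off between the run wrapper and the
# uploader. The file name and keys are illustrative placeholders.
import os
import pickle

def write_run_params(rundir, params):
    picklefile = os.path.join(rundir, 'isac-runparams.pickle')
    with open(picklefile, 'wb') as f:
        pickle.dump(params, f)
    return picklefile

def read_run_params(picklefile):
    with open(picklefile, 'rb') as f:
        return pickle.load(f)

# example parameter set, mirroring the test command above
params = {'stack': 355, 'generations': 2, 'ou': 25, 'lp': 20, 'hp': 400, 'bin': 2, 'thld_err': '5:10'}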
Updated by Dmitry Lyumkis over 10 years ago
post-Appion workshop todo, 2014.07.16:
- make sure that the stack is transferred to remoterundir by the webserver when the job is launched
- interconnect (1) webpage, (2) python run wrapper, and (3) python uploader
- take into account rotation angles in the database from the aligned stack and append those to the rotation angles from ISAC (useful for RCT; see the sketch after this list)
- upload webpage, make sure that the webpage knows when upload is ready (is this built in???)
- test with real RCT data and check whether rotation angles are uploaded correctly!
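On the rotation-angle item above, a hedged sketch of the composition: if ISAC runs on an already-aligned stack, the in-plane rotation stored in the database and the rotation reported by ISAC add (modulo 360). Mirrors and shifts are ignored here for clarity; the real uploader would need to compose the full 2D alignment parameters.

# Sketch only: combine the in-plane rotation from the aligned stack in the
# database with the rotation reported by ISAC. Mirrors and shifts ignored.
def combine_inplane_rotations(db_angles, isac_angles):
    # db_angles, isac_angles: {particle_number: rotation in degrees}
    combined = {}
    for partnum, isac_rot in isac_angles.items():
        combined[partnum] = (db_angles.get(partnum, 0.0) + isac_rot) % 360.0
    return combined

print(combine_inplane_rotations({17: 350.0}, {17: 25.0}))  # -> {17: 15.0}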
Updated by Dmitry Lyumkis over 10 years ago
Hi Neil, Sargis,
I am hoping that in the next few weeks we can finalize this issue and have a working version of ISAC that we can test. Let me know if you want me to put in any specific code, test things out, or help out in any way.
Dmitry
Updated by Neil Voss over 10 years ago
Hi Dmitry, last I checked I was still waiting for the launcher to be finished so I could check my uploader.
Updated by Gabriel Lander over 10 years ago
While we're waiting for the web GUI, is there a way to run this on the garibaldi cluster?
When I execute:
runJob.py --stack=90 --generations=2 --projectid=329 --num-part=7052 --remoterundir=/gpfs/group/em/appion/glander/14jul23a/align/isac1 --rundir=/gpfs/group/em/appion/glander/14jul23a/align/isac1 --nodes=8 --ppn=8 --mem=376 --runname=isac1 --localhost=garibaldi.scripps.edu --jobtype=sparxisac --ou=30 --bin=2 --expid=13753 --lp=20 --hp=300 --thld_err=10:20
I get the error:
... Looking up session, 13753
Traceback (most recent call last):
File "/gpfs/home/glander/myami/appion/bin/runJob.py", line 16, in <module>
agent.Main(sys.argv[1:])
File "/gpfs/home/glander/myami/appion/appionlib/apAgent.py", line 78, in Main
self.updateJobStatus(self.currentJob, hostJobId)
File "/gpfs/home/glander/myami/appion/appionlib/apAgent.py", line 134, in updateJobStatus
projDB = self.__initDB(jobObject, hostJobId)
File "/gpfs/home/glander/myami/appion/appionlib/apAgent.py", line 246, in __initDB
clustq['user'] = os.getlogin()
OSError: [Errno 2] No such file or directory
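For context on that traceback, os.getlogin() needs a controlling terminal, which batch/compute environments often lack, and it then raises exactly this OSError. A hedged sketch of a more forgiving lookup (not necessarily the fix that belongs in apAgent.py):

# os.getlogin() reads the controlling terminal and raises OSError when there
# is none (common in batch environments). A more forgiving lookup, sketch only:
import getpass
import os
import pwd

def get_username():
    try:
        return os.getlogin()
    except OSError:
        try:
            # fall back to the effective uid
            return pwd.getpwuid(os.getuid()).pw_name
        except KeyError:
            # last resort: environment variables (LOGNAME, USER, ...)
            return getpass.getuser()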
Updated by Gabriel Lander over 10 years ago
Ignore that last question; I've never used the "runJob.py" script before, and it seems it must be run on the head node.
We need to add an option to specify a specific queue on the cluster, but that's a different matter.
Updated by Dmitry Lyumkis over 10 years ago
Hi Neil,
I will test the launcher on one of my stacks later tonight, and then you should be able to just sync it with your upload based on the pickle file.
Gabe, I will post the command here. Note, however, that the transfer between clusters is still not working, as I believe that it is performed on the webserver side.
Dmitry
Updated by Amber Herold over 10 years ago
Gabe,
I was working on adding the option to specify a queue prior to the workshop, I'll try to finish that up today.
Updated by Amber Herold over 10 years ago
- Related to Feature #2869: Add an option to cluster parameters form to select which queue to submit a job to added
Updated by Amber Herold about 10 years ago
Hiya all,
Is this in a good place for me to write the GUI for the uploader?
Looks like it will be a bit of a project because there is currently an upload page in place, but it is very maxlikealign-specific. I will need to separate out the general align upload code from the creation of the specific upload command. (I'm assuming all align jobs that need to be uploaded will appear on a single page that is shown when the user clicks on the "# ready to upload" link. Correct me if I am wrong.)
I only have 9 more working days with the AMI lab, so I'll need to get started on this soon if I am going to do it.
Neil, can you point me to the command including parameters and validations required?
Updated by Neil Voss about 10 years ago
Hi Amber,
I was testing in four folders of Dmitry's (some have moved since workshop):
- little run:
- /ami/data16/appion/13nov14a/align/sxisac1
- run with skipped iterations:
- /ami/data16/appion/13sep27a/align/sxisac1
- test run:
- /ami/archive2/md0/appion/zz09apr14b/stacks/stack1/isac
- big run:
- /ami/data16/appion/13feb21b/align/sxisac1
A typical command was like (from /ami/data16/appion/13sep27a/align/sxisac1)
uploadSparxISAC.py \
  --projectid 224 --runname sxisac1 -d 'testing upload' \
  --timestamp 08nov27e54 --alignstackid 288 \
  --commit
I have not run ISAC from start to finish, but our plan was to use the same structure as the maxlike run, but create a new table ApISACJobData instead of ApMaxLikeJobData to check for active jobs. No progress was made on this front, but I think it should be very similar to maxlike in terms of validations.
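A minimal sketch of what that table could look like, following the sinedon Data-class pattern used for ApMaxLikeJobData in appionlib/appiondata.py; the field list is an assumption, not a final schema.

# Illustrative sketch only: a sinedon-style table definition mirroring
# ApMaxLikeJobData so the launcher and uploader can check for active ISAC
# jobs. Field names are assumptions; the real definition would live in
# appionlib/appiondata.py.
import sinedon.data
Data = sinedon.data.Data

class ApISACJobData(Data):
    def typemap(cls):
        return Data.typemap() + (
            ('runname', str),
            ('path', str),      # an ApPathData reference in the real appiondata module
            ('timestamp', str),
            ('finished', bool),
            ('hidden', bool),
        )
    typemap = classmethod(typemap)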
Updated by Dmitry Lyumkis about 10 years ago
Hi Amber,
For the web GUI, it should basically do the same thing as the standard aligners. There are several differences. First, one should be able to choose either the regular stack or an aligned stack. Second, once the stack is chosen, the data needs to be transferred over to a remote host so that it can be processed in a similar manner to the reconstructions. ISAC will always be run on garibaldi, so it should be compatible with that cluster. The command that I was using to test:
runJob.py --stack=355 --generations=2 --projectid=354 --num-part=1000 --remoterundir=/ami/data00/appion/12dec05a/align/isactest2 --rundir=/ami/data00/appion/12dec05a/align/isactest2 --nodes=8 --ppn=4 --mem=48 --runname=isactest2 --localhost=guppy.scripps.edu --jobtype=sparxisac --ou=25 --expid=10755 --lp=20 --hp=400 --bin=2 --thld_err=5:10
Those are the options that the web gui should be generating. Please take a look at the python side of the launcher to see what other options need to be there.
Dmitry
Updated by Amber Herold about 10 years ago
Is it possible to test on Guppy? I have not had any luck getting it to run today. A few parameters I am not sure about:
- generations
- ou
- thld_err
For each of these I need:
- label
- default value
- help text
- validations
Updated by Amber Herold about 10 years ago
on garibaldi I get:
Cound not find sxisac.py in your PATH
Do I need to include another module?
Updated by Dmitry Lyumkis about 10 years ago
I have not been getting the same results on guppy and garibaldi. The above command on guppy creates and launches the jobfile appropriately. It looks like garibaldi is using old runJob.py code, and the same command creates a different job file. Amber, perhaps you can coordinate with JC to update this.
I would strongly suggest testing on garibaldi. This job should never be run on guppy, as it is too computationally intensive. In doing that, we might be able to get through some of the other bugs (as per above) as well.
What I just did is launch /ami/data17/appion/12dec05a/align/isactest3 on guppy, then killed the job, changed up some paths for calling ISAC, and relaunched it on garibaldi. You can check the differences between the job file created by runjob.py (isactest3.appionsub.guppy.job) and the one that I submitted to garibaldi (isactest3.appionsub.garibaldi.job). The latter should work, which I'll find out once it has stopped running, hopefully tomorrow.
All the labels, defaults, help texts, etc. are already in apSparxISAC.py.
I have this line to load sparx/eman2: module load eman/2.04
All we really need to do now is sync the weblauncher so it successfully transfers the stack to a remote host and executes a command like the ones above, and then get the uploader to sync with the results.
Updated by Amber Herold about 10 years ago
Dmitry,
There are about 20 parameters in the python script in setIterationParamList. Are ou and thld_err the only ones that should be added to the launch GUI? Or is the launch GUI complete as is, and I should not add those params at all? It looks like you have set defaults for all of them on the python side.
Updated by Amber Herold about 10 years ago
r18535 Adds a launch GUI that should now work with Dmitry's apSparxISAC.py.
Still to do:
- Dmitry, should the ou param (outer radius) be set in the GUI to box/2 - 2 (see the sketch after this list), or left blank so that the param is not passed to the python side and you can set the default value there? I left the other advanced params blank so that the resulting command is more manageable.
- I've started on copying the stack to the remote cluster path, but have not completed that so it is not functional yet.
- I think I also need to add another stack selector for aligned stacks. Please confirm.
- database changes to the PHP side, plus new SQL
- added isac jobs to the pipeline menu tally
- created a new page to show all the alignments ready for upload, including isac and maxlikealign.
- complete ISAC upload page
- test the whole thing from start to end
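On the ou question in the first item, a tiny sketch of the box/2 - 2 default; whether the binning factor should be applied first is an assumption on my part, not something settled above.

# Sketch of the suggested ou (outer radius) default of box/2 - 2.
# Applying the binning factor first is an assumption.
def default_outer_radius(boxsize, binning=1):
    return (boxsize // binning) // 2 - 2

print(default_outer_radius(128))     # -> 62
print(default_outer_radius(128, 2))  # -> 30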
Updated by Sargis Dallakyan about 10 years ago
Thank you Amber. isacForm.inc nicely shows how to use generateAdditionalFormRight and generateAdditionalFormLeft with the new base form for appionloop web GUI forms.
Regarding "Still to do" item 3: yes indeed, it needs another stack selector. Neil showed me the myamiweb/processing/runMakeStack2.php page, which has an example of how to do it. I would be happy to continue working with you on this.
Updated by Clint Potter about 10 years ago
Dmitry says this needs the web launcher to transfer the stack. There are 2 ways to do this: #1, have the cluster read the input stack from the file server; #2, transfer in the stack during refinements. Needs a web GUI to set up a command and transfer the stack. ISAC runs at least on guppy but doesn't run on garibaldi (Dmitry thinks Amber is looking into this issue). Sargis will update the myami trunk on garibaldi.
Updated by Amber Herold about 10 years ago
Clint,
I have already started on the code to transfer the stack. I have NOT had a chance to look into why ISAC is not working on Garibaldi. It would be great if Sargis could give that a try while I'm out next week. Then I could try a complete run with upload perhaps Wednesday of the following week.
Sargis,
Thanks for the info regarding the stack selector. I can add it when I return, or if you have a chance before then to work on it, feel free.
Updated by Amber Herold about 10 years ago
r18573 adds many GUI changes to support ISAC. The Aligned Stack selector is now available. It also includes menu integration and a page to show isac jobs ready for upload.
TODO:
- Complete the upload GUI. Sargis, you said you might like to work on this. I realized I had already started on it, so I'll check in what I've got at the end of today and you can continue with it if you like.
- I did not complete the copy of the stack to a remote cluster. A function is in place for it but needs some guts. I'm not sure on the details of this. It looks like the python file is looking for the stack based on its location in the DB... Do we just need to copy the stack in the case where the cluster does not have access to the stack?
- Add localhost to the command - looks like this is needed for results rsync
- Every time the launch page is reloaded, the base file is added to the remoterundir, resulting in path/align/align (see the sketch after this list). I think this is also happening to refinements... path/recon/recon. It should be an easy enough bug fix, but keep an eye on your remoterundir before submitting.
- Probably need to add a DB update script to add isac run params field to ApAlignRunData in existing DBs.
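For the reload bug noted above, the guard is simple enough to sketch; the real fix belongs in the PHP form code, so this Python is only to illustrate the idea.

# Illustration of the guard for the path/align/align reload bug: only append
# the base name if the directory does not already end with it.
import os

def safe_rundir(basedir, runname):
    if os.path.basename(basedir.rstrip('/')) == runname:
        return basedir
    return os.path.join(basedir, runname)

print(safe_rundir('/path/align', 'align'))  # -> /path/align
print(safe_rundir('/path', 'align'))        # -> /path/align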
There are certainly bugs in r18573. Some are minor and noted in the code with TODOs, which I hope to address. The files edited are used by many other things, so keep an eye out for new bugs in old features.
Updated by Amber Herold about 10 years ago
r18576 adds a start on the Upload GUI.
Sargis, this page still needs quite a bit of work. Since ISAC is not really a refinement run and not really an alignment run but straddles both worlds, you'll need to take a close look at when info is added to the database and whether what you need for the upload command is available. The run step may need to add some info on the python side.
Also, we need to identify a directory with a completed run to see what the output files look like in terms of file names so the upload gui can list and display those.
Updated by Clint Potter about 10 years ago
- Assignee set to Amber Herold
Discussed during the Appion conference call. Dmitry needs to create a pickle file to pass parameters from ISAC. Amber will note details in the issue. Amber will change Neil's code to make this work for now.
Updated by Amber Herold about 10 years ago
- The head node of the cluster has access to the appion database
- The results have been copied to the local run directory during the run step using rsync and the local helper host and these results include a pickle file with all the run parameters.
- need to add localhost to the run launch page to use for the rsync. Dmitry, can you confirm that this is needed... this is just what I inferred from reading your code.
- I need a complete ISAC run that includes the results and the pickle file to test the uploader. So far I get stuck in the upload because the test run I have does not have the pickle file.
- Once the upload is run, I need to confirm that the status of the run is set correctly, namely that it is marked as uploaded and does not appear under the ready to upload menu item.
There are now three separate job-tracking tables:
- ApSparxISACJobData
- ApMaxLikeJobData
- ApTopolRepJobData
Perhaps these could be merged, or the fields added to ApAppionJobData. It is used to track whether a job is finished or hidden, and what the timestamp is that is appended to filenames. There should be a set procedure for how these remote jobs are set up and run, so that common tasks can be consolidated and we make sure things that need to be done are not forgotten. For example, ISAC jobs cannot be hidden with the current implementation, which is not ideal.
Updated by Dmitry Lyumkis about 10 years ago
Amber, as far as I can tell, your assumptions are fine.
From what I recall, if the cluster does not have access to the filesystem, then it requires --localhost and --remoterundir.
In the web launcher, I couldn't find a box for "threshold error" (that's the pixel error). That parameter, as well as the inner and outer radii, should not be advanced parameters.
I also can't launch an ISAC job anymore, neither using my old command nor from the webserver, so I'm not sure what is going on. Switching to longboard just gives me a blank page on the launch page.
Updated by Amber Herold about 10 years ago
r18586 adds localhost to the run command.
I'm unable to reproduce launch errors here with cronus3, longboard or my sandbox. Can troubleshoot Monday if needed.
Moving inner and outer radius to be non-Advanced and adding threshold error now. I see I missed that param entirely!
Updated by Amber Herold about 10 years ago
I ran a very small isac job on guppy to test the upload with.
The python upload script is not correctly finding the timestamp that is prepended to the filenames, so it could not find the pickle file.
I'll look into that Monday, unless you want to take a look before then Neil. I know you enjoy regular expressions!!!
Updated by Amber Herold about 10 years ago
OK, the timestamp was repeatedly incorrect; then I added a few debug lines, it started working, and when I removed them it was still working. Not sure what's up with that. glob seemed to have trouble finding the files.
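In case it helps the debugging, a hedged sketch of pulling an appion-style timestamp (e.g. 08nov27e54, as in the uploadSparxISAC.py command earlier in this issue) out of the run directory file names; the '*.pickle' glob pattern is an assumption about how the run script names its output.

# Sketch only: locate the params pickle by matching an appion-style timestamp
# (e.g. 08nov27e54) in the file names. The '*.pickle' glob is an assumption.
import glob
import os
import re

def find_timestamp(rundir):
    pattern = re.compile(r'\d{2}[a-z]{3}\d{2}[a-z]\d+')
    for fname in sorted(glob.glob(os.path.join(rundir, '*.pickle'))):
        match = pattern.search(os.path.basename(fname))
        if match:
            return match.group(0)
    return None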
Now I have run into another issue: the small test run I did does not have any class_averages_generation_*.hdf files. David said it looks like the run did not complete properly. So I need a good run to test the upload with. I've been working with this run: /ami/data00/appion/zz07jul25b/align/ISAC61
Updated by Neil Voss about 10 years ago
Hi Amber, I am following your posts, but have not been able to contribute. Yes, I like regular expressions. What is the name of the pickle file it is creating? I could adjust the regex to make sure it finds it.
Updated by Amber Herold about 10 years ago
- Status changed from New to Assigned
- Assignee changed from Amber Herold to Dmitry Lyumkis
Neil, I think the expression is fine, there was something else going on, just not sure what.
I'm assigning this back to Dmitry since he is the lead on this feature.
All the GUI components should be in place.
- Run this from start to end.
- Confirm that the run and upload scripts work together.
- Confirm that uploaded runs are shown as complete.
I've been stalled because I don't have a properly completed run with all required output files to upload. However, I have confirmed that the upload gui launches the upload script correctly and reads the pickle file.
Updated by Clint Potter almost 10 years ago
Discussed during the Appion conference call. Still in progress; waiting for the EMAN2 installation at Salk. Dmitry committed to doing this.
Updated by Dmitry Lyumkis over 9 years ago
This issue is on hold until we can get EMAN2 installed at Salk. We are running into python dependency issues and MPI issues.
Updated by Clint Potter over 9 years ago
Still on hold, waiting for EMAN2. Scott has some hints for EMAN2.