Feature #2836
openbake Appion processing recipes
Description
Add a function to save a stack or aligned stack as a recipe, so that the same processing steps can be rerun on future experiments.
Updated by Gabriel Lander over 10 years ago
First step toward saving recipes: stacks and aligned stacks can be saved to and removed from the database, and the recipes show up on the main page if they exist.
TO DO: update the database schema in:
- myami/myamiweb/xml/appion_extra.xml
- myami/dbschema/schema-new.py
- myami/dbschema/updatelib.py
Still have to figure out how this will be launched and executed.
Updated by Clint Potter about 10 years ago
8/27 Appion conference call. Need to generate a parent command that works on multiprocessor machines, which means knowing how many processors to ask for; we are not currently tracking the number of processors, so try with defaults for now. A new table stores the recipes.
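For illustration only, a sketch of what such a recipe table might store; sqlite3 stands in for the Appion MySQL database here, and the ApRecipeData table, its columns, and the step parameters are all hypothetical:

    import json
    import sqlite3

    # Hypothetical recipe table -- the name ApRecipeData and all columns are
    # illustrative, not the actual Appion schema. A recipe is an ordered list
    # of processing steps (command plus parameters) tied to its session.
    conn = sqlite3.connect("recipes_demo.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS ApRecipeData (
            DEF_id  INTEGER PRIMARY KEY,
            name    TEXT,
            session TEXT,
            steps   TEXT  -- JSON-encoded list of {command, params} objects
        )
    """)

    steps = [
        {"command": "dogPicker.py", "params": {"diam": 150}},
        {"command": "makestack2.py", "params": {"boxsize": 256}},
    ]
    conn.execute(
        "INSERT INTO ApRecipeData (name, session, steps) VALUES (?, ?, ?)",
        ("my_stack_recipe", "14aug27a", json.dumps(steps)),
    )
    conn.commit()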
Updated by Amber Herold about 10 years ago
Gabe,
The runJob.py command is being saved to a file in the run directory. In many cases you may be able to parse this file to find the number of nodes and processors requested, assuming the run directory still exists. Of course, it would be better to have this info in the DB, perhaps as another column in the existing run data table.
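A rough sketch of that parsing idea; the --nodes and --ppn flag names are assumptions about what the saved command contains and would need to match whatever runJob.py actually writes:

    import re

    def parse_resources(runjob_path):
        """Pull requested node/processor counts out of a saved runJob command
        file. The --nodes/--ppn flag names are guesses; adjust them to match
        what runJob.py actually records."""
        with open(runjob_path) as f:
            text = f.read()
        nodes = re.search(r"--nodes[= ](\d+)", text)
        ppn = re.search(r"--ppn[= ](\d+)", text)
        return (int(nodes.group(1)) if nodes else None,
                int(ppn.group(1)) if ppn else None)

    # e.g. nodes, ppn = parse_resources("/path/to/rundir/runjob_command.txt")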
Updated by Neil Voss about 10 years ago
- Related to Bug #2939: Appion Developers Conference Call Notes added
Updated by Clint Potter about 10 years ago
Discussed during the Appion conference call. Gabe has added a button to start; he doesn't have much time to work on this.
Updated by Neil Voss almost 9 years ago
- Assignee changed from Gabriel Lander to Neil Voss
I am going to take a look at this. Has anything been started on it? I assume it will use the Script tables in the Appion database to generate commands, or is it based solely on the runJob log files? I will have to look at new tables for this purpose.
The main problem I do not know the answer to is how we launch a new command after the first one completes, e.g., pick particles, then make a stack. (1) I could have the user click through the steps as they complete, but then it is not automated. (2) Or I could put multiple commands into the same job script, but different commands have different resource needs. Perhaps the new job submission setup can allow for delayed jobs or something. Any thoughts would be appreciated.
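For what it's worth, some schedulers already support delayed jobs natively. A minimal sketch using SLURM's sbatch with its real --dependency=afterok flag (whether the local queue runs SLURM is an open question, and the job-script names are hypothetical):

    import subprocess

    def submit(script, depends_on=None):
        """Submit a job script with sbatch, optionally held until an earlier
        job finishes successfully. Returns the new job id."""
        cmd = ["sbatch", "--parsable"]
        if depends_on is not None:
            cmd.append("--dependency=afterok:%s" % depends_on)
        cmd.append(script)
        # --parsable makes sbatch print just the job id (possibly "id;cluster")
        return subprocess.check_output(cmd).decode().strip().split(";")[0]

    # Chain the steps: stack creation is queued now but held until picking ends.
    pick_id = submit("pickParticles.job")
    stack_id = submit("makeStack.job", depends_on=pick_id)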
Updated by Gabriel Lander almost 9 years ago
I spent a lot of time trying to figure out what to do on this front. I think my plan was to generate all the necessary individual scripts, then have a master script that launches them. The master script would wait for them to complete before launching subsequent ones. Using this approach, the master script could simultaneously launch particle picking & CTF estimation, but wait for them both to finish before starting stack creation.
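A bare-bones sketch of that master-script approach, with hypothetical script names: picking and CTF estimation launch in parallel, and a barrier holds stack creation until both succeed.

    import subprocess

    # Launch particle picking and CTF estimation at the same time.
    picking = subprocess.Popen(["python", "pickParticles.py"])
    ctf = subprocess.Popen(["python", "estimateCtf.py"])

    # Barrier: wait for both to finish before starting stack creation.
    picking.wait()
    ctf.wait()
    if picking.returncode == 0 and ctf.returncode == 0:
        subprocess.check_call(["python", "makeStack.py"])
    else:
        raise RuntimeError("an upstream step failed; skipping stack creation")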
Updated by Bridget Carragher almost 9 years ago
Could we add a trigger (either a file that appears or a database entry) so that the next scripts in line can query for completion and then start once they get the go-ahead? They could query on a longish time scale...
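The trigger-file variant could be as simple as a polling loop like this one (the marker path, five-minute interval, and timeout are placeholders):

    import os
    import time

    def wait_for_trigger(path, poll_seconds=300, timeout_hours=48):
        """Block until a trigger file appears, checking on a longish interval
        as suggested. Gives up if the upstream step never finishes."""
        deadline = time.time() + timeout_hours * 3600
        while time.time() < deadline:
            if os.path.exists(path):
                return
            time.sleep(poll_seconds)
        raise RuntimeError("timed out waiting for %s" % path)

    # A downstream script waits for the picking step's marker, then proceeds.
    wait_for_trigger("/path/to/rundir/picking.DONE")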
Updated by Neil Voss over 8 years ago
- Blocked by Feature #3891: Distributed Resource Management Application API added