Index by title

Add a new option to a program

Add student's T distribution to maxlike alignment

Add an output directory to hierarchical classification


Create a program from scratch


Create an image uploader program from scratch

Problem statement:

We want to upload images, including defocal pairs, into the database.

Determine what the inputs are and what is required:

  1. folder with mrc images – only support mrc files
  2. pixel size, specified in meters
  3. tiltseries/defocal information (# per set) – cannot have both
    --images-per-tilt-series=3 --images-per-defocal-group=3
  4. two session modes, create new session or append to existing session
  5. input defocus and tilt angle:
  6. four data collection modes:
  7. override session name???
  8. require a description

Create the program file

Preliminary setup

Setup program to do the work

Test the program

Create web interface


Organization of the PHP programs


Organization of the Python programs

We have two major folders in the appion folder:

Other useful directories:

Most important python files:

Database mapping file

see also source:trunk/leginon/leginondata.py and source:trunk/leginon/projectdata.py

Appion Script hierarchy

basicScript

appionScript

appionLoop2

filterLoop

particleLoop2
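Assuming each class in the list above extends the one before it, the chain can be sketched as plain Python classes (the comments describing each level are simplified assumptions, not the real appionlib definitions):

```python
# Simplified sketch of the Appion script hierarchy listed above.
# The real classes live in appionlib (basicScript.py, appionScript.py, ...).
class BasicScript:                 # lowest level: option parsing, logging
    pass

class AppionScript(BasicScript):   # adds run directories, database commits
    pass

class AppionLoop(AppionScript):    # iterates over images (appionLoop2.py)
    pass

class FilterLoop(AppionLoop):      # per-image filtering (filterLoop.py)
    pass

class ParticleLoop(FilterLoop):    # per-image particle operations (particleLoop2.py)
    pass

# Method resolution order confirms the chain from particleLoop2 down to basicScript.
print([cls.__name__ for cls in ParticleLoop.__mro__[:-1]])
# -> ['ParticleLoop', 'FilterLoop', 'AppionLoop', 'AppionScript', 'BasicScript']
```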


2010 Appion Developer Workshop

Setting up a development environment

Subversion

svn checkout http://ami.scripps.edu/svn/myami/trunk/ myami/

Sinedon.cfg

At AMI, you do not normally need a sinedon.cfg file, but if you have your own testing environment you will need one.
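A minimal sinedon.cfg for a personal testing environment might look like the sketch below; the host, user, password, and database names are placeholders, and the exact section names should be checked against your sinedon installation:

```ini
[global]
host: your_db_host
user: usr_object
passwd: your_password

[leginondata]
db: leginondb

[projectdata]
db: projectdb
```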
Web page

At AMI, put myamiweb into your ami_html directory and it will be available on both cronus3 and fly as http://cronus3.scripps.edu/~username/myamiweb/

You will need to run Eric's web setup wizard to get it working: http://cronus3.scripps.edu/~username/myamiweb/setup/

Your PYTHONPATH should contain both the myami folder and the myami/appion folder.
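As a quick check, here is a small sketch of extending the module search path from within Python (the ~/myami checkout location is an assumed example; normally you would set PYTHONPATH in your shell profile instead):

```python
# Prepend the assumed myami checkout and its appion subfolder to sys.path,
# mirroring what setting PYTHONPATH would do; ~/myami is a placeholder path.
import os
import sys

myami = os.path.expanduser("~/myami")
for path in (os.path.join(myami, "appion"), myami):
    if path not in sys.path:
        sys.path.insert(0, path)

print(sys.path[0].endswith("myami"))                          # -> True
print(sys.path[1].endswith(os.path.join("myami", "appion")))  # -> True
```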


Add a new option to a program

2010 ADW/Add a new option to a program


Organization of the Python programs

2010 ADW/Organization of the Python programs


Organization of the PHP programs

2010 ADW/Organization of the PHP programs


Create an alignment program from scratch

2010 ADW/Create an alignment program from scratch


Create an image uploader program from scratch

2010 ADW/Create an image uploader program from scratch


2 Way Viewer

The 2 Way Viewer allows you to view the selected image in two image view panes side by side. The following example shows the original mrc image next to its Fourier transform. For more details see Image Viewer Overview.

2 Way Viewer Screen:


< Image Viewer | 3 Way Viewer >





3 Way Viewer

The 3 Way Viewer allows you to view the selected image in 3 adjacent Image View panes. The following example shows the original mrc image along with a heat map view and Fourier transform. For more details see Image Viewer Overview.

3 Way Viewer Screen


< 2 Way Viewer | Dual Viewer >



Ab Initio Reconstruction

This section contains procedures for calculating initial models from tilted and untilted datasets.

Tilted Data Procedures

  1. Random Conical Tilt Reconstruction (RCT Volume): Obtain a 3D structure from tilted data (typically -55,0).
  2. Orthogonal Tilt Reconstruction (OTR Volume): Obtain a 3D structure from data collected at 2 tilts (typically -45,0,45)

Un-Tilted Data Procedures

  1. EMAN Common Lines Reconstruction: Obtain a 3D structure from untilted 2D data containing randomly oriented molecules.
  2. IMAGIC Angular Reconstitution: Obtain a 3D structure from untilted 2D data containing randomly oriented molecules.

Appion SideBar Snapshot:

Notes, Comments, and Suggestions:


< Particle Alignment | Refine Reconstruction >



Ace for EMAN

Hi everybody,

I think we are close to getting a working version of ACEMAN -- ACE for EMAN.

Scott - If you want to test it you can copy the directory ace_cvs and run

aceman

from inside matlab. It basically reads in an imagic file and writes out another imagic file which has ctf information embedded in it. So all you need to do to check how good the fits are is do

ctfit outputfilename.hed

A typical fit looks like this

http://graphics.ucsd.edu/~spmallick/ctf/acemanfit.png

I have changed the way the envelope is calculated. See for example

http://graphics.ucsd.edu/~spmallick/ctf ... envfit.png

I have also gone through EMAN's source code to figure out how exactly the parameters of ACE and EMAN are related. There are a few things, though, which I do not understand yet -- noise_const is off by around 1% (I have hardcoded a compensation). Secondly, it is not clear to me whether we can embed the astigmatism parameter in the imagic files.

I will work on ACEMAN again this evening to find the story behind the 1% error.

Satya


Hello everybody,

I was wondering if people want to test an experimental version of ACEMAN -- ACE for EMAN. ACEMAN takes in a stack of picked particles in imagic (hed/img) format and embeds the ctf parameters into it. You could then use

ctfit output_file.hed

to see how good the ctfits are. ACE and EMAN define the Envelope function in a slightly different way and so I had to make some changes in the core ace function.

Here are the steps for installation

1. Change to your ace directory.

cd ace_directory

2. Download the aceman.tgz file into the ace directory:

wget http://graphics.ucsd.edu/~spmallick/ctf/aceman.tgz

3. Untar/unzip it

tar -zxvf aceman.tgz

A few files will be extracted into the ace directory.

4. Start MATLAB

5. Inside MATLAB do

aceman

There is no documentation yet, but if you have used acedemo, it should be straightforward. Please let me know if you get good/bad results.

Again, this is an experimental version, so I would not recommend it for real experiments yet.

Regards

Satya


Ace Estimation



Ace 2 Estimation

This algorithm is faster than ACE 1 and includes astigmatism estimation.

General Workflow:

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Select the leginon preset corresponding to the images you'd like to process. Generally "_en" images in leginon are the raw micrographs, but uploaded film data will have a different preset. Selecting "all" will simply process all images.
  3. Check boxes allow the option to run Ace 2 concurrently with data collection. "Wait for more images" will wait until collection times out before stopping ACE 2 processing. The "Limit" box allows restrictions on the number of images to process, which is useful when testing parameters initially.
  4. Radio buttons under "Images to Process" allow a level of pre-processing image filtering. Images that were rejected, hidden, or made exemplars in the image viewer can be included or excluded here.
  5. Radio buttons under "Image Order" set the order in which images are processed, and radio buttons under "Continuation" give the option of continuing or rerunning a previous ACE 2 run.
  6. If you have a previous ACE1 or Ace 2 run, check this box and set a confidence value to reprocess images that scored below that confidence value. Otherwise, leave unchecked.
  7. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  8. Click on "Run Ace 2" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  9. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Ace 2" submenu in the appion sidebar.
  10. Now click on the "1 Complete" link under the "Run Ace 2" submenu. This opens a summary of all CTF Estimation (Ace 1, Ace 2, CtfFind) runs with a summary histogram of confidence values.
  11. Clicking on "download ctf data" opens a dialog for exporting CTF Estimation results for use in another application.
  12. Clicking on the "acerun#" name opens a summary page of the parameters used for the particular run.
  13. The CTF parameters and astigmatism estimation determined by Ace 2 can be applied to particles during particle boxing in appion with the Create Particle Stack tool.

Notes, Comments, and Suggestions:

  1. Ace 2 is faster than ACE 1
  2. If multiple CTF estimation runs are performed for a single dataset, the Appion stack creation tool will select the CTF parameters determined with the highest confidence value (regardless of algorithm used) for a particle from a given micrograph.
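Note 2 can be illustrated with a toy selection sketch (the field names and confidence values below are made up for illustration, not the actual Appion database schema):

```python
# Toy illustration of note 2: per micrograph, keep the CTF estimate with the
# highest confidence, regardless of which algorithm produced it.
estimates = [
    {"micrograph": "img_001", "algorithm": "ace1",    "confidence": 0.82, "defocus": 1.9e-6},
    {"micrograph": "img_001", "algorithm": "ace2",    "confidence": 0.95, "defocus": 1.8e-6},
    {"micrograph": "img_001", "algorithm": "ctffind", "confidence": 0.91, "defocus": 1.8e-6},
    {"micrograph": "img_002", "algorithm": "ace2",    "confidence": 0.88, "defocus": 2.1e-6},
]

best = {}
for est in estimates:
    key = est["micrograph"]
    if key not in best or est["confidence"] > best[key]["confidence"]:
        best[key] = est

print(best["img_001"]["algorithm"])  # -> ace2
```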

< CTF Estimation | Create Particle Stack >



Ace Estimation

General Workflow:

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Select the leginon preset corresponding to the images you'd like to process. Generally "_en" images in leginon are the raw micrographs, but uploaded film data will have a different preset. Selecting "all" will simply process all images.
  3. Check boxes allow the option to run ACE concurrently with data collection. "Wait for more images" will wait until collection times out before stopping ACE processing. The "Limit" box allows restrictions on the number of images to process, which is useful when testing parameters initially.
  4. Radio buttons under "Images to Process" allow a level of pre-processing image filtering. Images that were rejected, hidden, or made exemplars in the image viewer can be included or excluded.
  5. Radio buttons under "Image Order" set the order in which images are processed, and radio buttons under "Continuation" give the option of continuing or rerunning a previous ACE run.
  6. Select the sample preparation method. Default parameters differ for ice and negative stain.
  7. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  8. Click on "Run ACE" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  9. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run ACE" submenu in the appion sidebar.
  10. Now click on the "1 Complete" link under the "Run ACE" submenu. This opens a summary of all CTF Estimation (Ace 1, Ace 2, CtfFind) runs with a summary histogram of confidence values.
  11. Clicking on "download ctf data" opens a dialog for exporting CTF Estimation results for use in another application.
  12. Clicking on the "acerun#" name opens a summary page of the parameters used for the particular run.
  13. The CTF parameters determined by ACE1 can be applied to particles during particle boxing in appion with the Create Particle Stack tool.

Notes, Comments, and Suggestions:

  1. ACE1 does not work on tilted images and can be quite slow.
  2. If multiple ACE1 runs are performed for a single dataset, the Appion stack creation tool will select the CTF parameters determined with the highest confidence value for a particle from a given micrograph.
  3. Script to run ACE
  4. ACE for EMAN

< CTF Estimation | Create Particle Stack >



Ace script

Hello everybody,

A few people had asked me for scripts (instead of the GUI) to run ace. So here it is:

http://graphics.ucsd.edu/~spmallick/ctf/acescript.m

I will add it to the next ace release if I do not hear any complaints. You will have to edit the script to use it. So start MATLAB and do

edit acescript.m

Read the comments, and if you have used acedemo, it should be fairly easy to follow. If you do

help acescript

it will give you the list of variables you might want to edit. If you do not understand what a particular variable means, just start acedemo and do a side by side comparison.

Regards

Satya


Ace Estimation



Anchi - please edit this



Adding refine job

Add job type to Agent.

After you have added a new refinement job class, it needs to be added to the job-running agent by editing the file apAgent.py in appionlib.

  1. Add the name of the module you created to the import statements at the top of the file.
  2. In the method createJobInst, add the new refinement job type to the conditional statements.
      Ex.
      elif jobType == "newJobType":
          jobInstance = newModuleName.NewRefinementClass(command)
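The dispatch described in step 2 can be sketched as a self-contained toy (all class and module names here are hypothetical stand-ins, not the real appionlib API):

```python
# Minimal stand-in for the createJobInst dispatch in apAgent.py; the real
# method lives in appionlib, and the class names below are hypothetical.
class RefinementJob:
    def __init__(self, command):
        self.command = command

class NewRefinementClass(RefinementJob):
    pass

def createJobInst(jobType, command):
    # Each supported job type gets its own branch, as in step 2 above.
    if jobType == "newJobType":
        return NewRefinementClass(command)
    raise ValueError("unknown job type: %s" % jobType)

job = createJobInst("newJobType", ["--runname=refine1"])
print(type(job).__name__)  # -> NewRefinementClass
```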
    


Additional Setup After Webserver initialization

If your webserver installation is successful, a number of tables will be created in the databases. Several options for setting up database user privileges were recommended in Database Server Installation. The following additional steps should be taken, depending on which option you previously used.


< Web Server Installation | Create a Test Project >



Add Instruments

  1. Go to the webpage `http://your_host/dbem_1_5_1/addinstrument.php`
  2. Create a fake TEM instrument like this:
name: my_scope
hostname: whatever
type: Choose TEM
  3. Create a fake CCDCamera instrument like this:
name: my_camera
hostname: whatever
type: Choose CCDCamera

[Note] If you use Leginon, and still want to upload non-Leginon images, make sure that you create a pair of fake instruments like these on a host used solely for uploading. It will be a disaster if you don't, as the pixel size of the real instrument pair will be overwritten by your upload.


Administration

The Administration tool allows you to:

Note: The Administration tool is only available to users who belong to a Group with administrative privileges.

  1. Groups
  2. Users
  3. Instruments
  4. Default Settings
  5. Applications
  6. Goniometer

User Management >



Administration Tools Guide

After a new installation, you will have to input Groups, Users, and Instruments into the database you have just created. Applications will need to be imported, too. These tasks can be performed through the web-based Administration Tools.

Recommendation for setup at a new institute

First user : administrator

The user named "administrator" is a special user in Leginon. Once the administrator defines the setting preferences in a node, all newly created users get these settings when they launch a node sharing the same class and alias, until they make changes themselves. This allows a faster setup per database (institute) for beginners. Therefore, the first user should be named "administrator", and it should not be used for routine work, so that these default preferences remain stable.

Additional users :

A Leginon user created in the administration tool defines his/her own preferences once they are changed from the "administrator" defaults above. It is also not related to the computer login user. Therefore, it is necessary to go through the steps outlined in the "<link linkend="admin_adduser">Set up for a new regular user</link>" section.

Steps involved in the installation

See <link linkend="Inst_Adm">Installation Troubleshooting</link> and Leginon Bulletin Board searching for "admin" if you run into problems.

Go to administration page

Open a web browser. Go to http://localhost/myamiweb/admin.php

Add a Group

Groups are used to associate users together. At the moment, Leginon does not use the group association for anything.

Add Administrator User

Add your microscope as a TEM instrument

Add your CCD camera as a CCDCamera instrument

See the section on <link linkend="instrument_names">Instrument Tool</link> for more details.

Load the default settings of Leginon nodes

Import Applications

<blockquote>

The most commonly used Leginon applications are included as part of the Leginon download. These XML files are in a subdirectory of your Leginon download and installation called "applications". The XML files should be imported using the web-based application import tool. Each application includes "(1.5)" in its name to indicate that it will work with this new version of Leginon. The applications that carry an older version name are compatible with the older Leginon.

To find Leginon installation path on Linux:

 >start-leginon.py -v

On Windows, you should find a shortcut to your Leginon installation folder in "Start Menu/All Programs/Leginon". If not, it is likely

C:\Python25\Lib\site-packages\Leginon\

</blockquote>

Proceed to First Leginon Test Run Chapter

<link linkend="runleg_chapter">Leginon test runs</link> test the TEM/CCD controls and network communications. The rest of this chapter is for reference.

Set up for a new regular user

A Leginon user created in the administration tool defines his/her own preferences once they are changed from the "administrator" defaults above. It is also not related to the computer login user. Therefore, it is necessary to go through the following steps to set up an existing computer user as a new Leginon user:

Add a User From Administration web page

Get a copy of the configuration files

Copy <link linkend="leginon_cfg">leginon.cfg</link> and <link linkend="sinedon_cfg">sinedon.cfg</link> (if not set globally for all users) from an existing user to the home directory of the new user.

Modify leginon.cfg

Modify the [user] "Fullname" field in leginon.cfg to correspond to the "full name" field in the Leginon Administration User Tools.

More about Groups

Groups are used to associate users together. At the moment, Leginon does not use the group association for anything.

Add/Edit a Group

Remove a Group

More about Instruments

This is used to add details about the microscope and CCD Leginon will be connected to. More than one instrument can be added with different configurations. The "import" function is useful if the instrument information has been stored on different machines in different Leginon databases.

Valid Instrument Names

TEM names: Camera names:

Add/Edit an Instrument

Remove an Instrument

Import an Instrument Pair

Applications

Applications define how nodes are linked together in order to form a specialized Leginon application or program. Because Leginon uses a nodal or modular architecture, multiple applications can be created by linking nodes together in different fashions suitable for the current experiment. Several default Leginon applications are distributed with the release. This section enables the Leginon user to import and export applications.

Import Applications online

Export Applications online

to another Host

to a Leginon application XML file

to the screen in XML format

to the screen in "easy-to-read" format

It should contain tables of Application Data, NodeSpec Data, and likely BindingSpec Data.

Calibrations

Good calibrations are absolutely essential to running Leginon. They can also be very time consuming. As a way of rudimentarily starting up without calibrating the current instrument specifically or perhaps to revert to a previously saved calibration, this import/export calibration tool can be quite useful.

Import Calibrations

Export Calibrations

to another Host

to a Leginon application XML file

to the screen in XML format

to the screen in "easy-to-read" format

Goniometer

The goniometer movement must be modeled for finer movements. Leginon calibrates this movement through the Gon Modeler node. Through this feature, the models for these movements can be viewed graphically.

View the Goniometer Model


Alignment SubStacks

Use this option if you want to create substacks of already aligned particles. This option is useful for cleaning your dataset by excluding bad class averages.

General Workflow:

Notes, Comments, and Suggestions:

<More Stack Tools | Particle Alignment >


Align and Edit Tilt Pairs

Use this tool if you want to correlate and box the particles of a Random Conical Tilt session manually. Before you can run this program, you need to pick the particles with one of the available picking tools: Dog Picking, Manual Picking, or Template Picking.

General Workflow:

  1. If you want to edit particle picks, select the desired run
  2. Enter a diameter
  3. Usually the default parameters work pretty well, but feel free to play around with them!
  4. Click on "Just Show Command", copy the command, and paste it into a Unix shell
  5. Pre-processing of the images can take a little while, but once they are done you can go through images pretty quickly. A window will pop up as shown below.

Notes, Comments, and Suggestions:

  1. We use the Auto Align Tilt Pairs function in Appion with great success.
  2. To speed up pre-processing, limit the number of images to process to about 10 at a time, and then use the "continue" option when you run again.
  3. The manual tilt-picker is quite stable, and can be closed at any time without concern. Appion will save the particle picks and image assessment that had been done up until that point, and these can be accessed via the "continue" option.

< Particle Selection|CTF Estimation >


Align tilt series

Currently, there are three working methods for automated or semi-automated alignment:

  1. using the correlation-based image alignment from the IMOD package as implemented in eTomo "Rough Alignment".
  2. using the iterative geometry-refinement/reconstruction method from proTomo.
  3. using the phase-correlated image alignment determined during Leginon data collection (available as the initial alignment when using the proTomo method).

General Workflow:

  1. Select the tiltseries to align.
  2. Check the runname and enter a description.
  3. Use the radio button to select Protomo Refinement or IMOD shift-only alignment.
  4. To submit the job to the cluster click the "Align Tilt Series" button. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  5. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Align tilt series" submenu in the appion sidebar.
  6. Now click on the "1 Complete" link under the "Align tilt series" submenu. This opens a summary of all tilt series alignment runs that have been completed for this dataset.
  7. Clicking on the "alignrun id" opens a summary page for the alignment cycles that were run.
  8. Clicking on the "refine cycle" number opens a report page for that cycle including input parameters and directory paths. Clicking on "Alignment Movie" opens a new web-browser with a movie of the aligned tilt series.
  9. The same alignment cycle can be repeated by clicking "Repeat Last Aligner Iteration" or another initiated by clicking "Setup Next Aligner Iteration" (See Step 11 below).
  10. In order to calculate a tomogram from the aligned tiltseries, click on the "Create full tomogram" link in the Tomography menu on the Appion Sidebar.
  11. If you selected "Repeat Last Aligner Iteration" or "Setup Next Aligner Iteration" in step 9: Enter the tilt image that protomo should use as a starting point for refinement during this next round; the default is to use the zero-degree image. Note that once you've entered this image number, the corresponding image is surrounded by a blue circle in the tilt-series graph. If the previous refinement run was good for only a subset of the tiltseries, then under the "Reset alignment outside the range.." caption, enter a range for the subset of images for which the refinement was good. A box will appear in the graph above, showing the subset of images for which the previous alignment parameters will be kept; the remaining images will retain their alignment parameters from the run before last. Note that a dropdown menu is available in case the alignment parameters to apply are from a run other than the previous one. The default is to include all images and to use the last iteration.
  12. Enter a description of this run. For example "repetition of alignrun id 5".
  13. Click "Align Tilt Series" to submit to the cluster. Alternatively, click "Just Show Command," to copy and paste the command into a unix shell.
  14. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Align tilt series" submenu in the appion sidebar.
  15. Now click on the "1 Complete" link under the "Align tilt series" submenu. This opens a summary of all tilt series alignment runs that have been completed for this dataset, including the latest refinement runs. Either proceed with more refinement as outlined in steps 9-14, or continue to calculate a tomogram from the aligned tiltseries.

Notes, Comments, and Suggestions:

For developers:
appiondata tables involved in this process


< Tomography | Create Full Tomogram >



AMI Eclipse Quick Start Guide

1 Get the Eclipse executable

First, you can try the executable that already has the plugins installed. Get eclipse.tar.gz. Just copy it to your machine, uncompress it, and the executable is available within the eclipse directory. No further installation steps are required.

If you want to install things yourself, this is what you need:

  1. Download eclipse from here.
  2. Get the plugins:
    1. Subversion
    2. PHP
    3. Subversion connector
    4. pydev
  3. Copy it to somewhere on your local machine.
  4. Double click the eclipse executable to open. If using a machine that was set up with the AMI kickstart program, you will need to update Java to use the 64-bit version.

See also: PHP: Debugging with Eclipse

2 Create a Workspace and get the MyAMI code from Subversion

  1. Double click the eclipse executable found in the eclipse directory.
  2. When it opens, it will prompt you to choose a workspace. This workspace will hold a local copy of the myami code for you to work on. A good workspace location is an amiworkspace directory in your home directory, that is, /home/username/amiworkspace.
  3. From the menu, select Window -> Open Perspective -> SVN Repository Exploring. This will open a view labeled SVN Repositories.
  4. Go to File -> New -> Repository Location.
  5. In the URL field type: http://ami.scripps.edu/svn/myami to get the Appion and Leginon code.
  6. Press the Finish button at the bottom of the dialog. A new repository will appear in the SVN Repositories view.
  7. Click on the arrow next to the repository icon to view the trunk, branches and tags associated with the repository.
  8. Click on the trunk to highlight it. Right click and select Checkout. This will get the code from the repo and put it in your workspace. When the operation completes, you will find a myami directory under amiworkspace in your home directory.

2.1 Import an existing project into a workspace

  1. open the PHP or pydev view
  2. right click and select import
  3. under General select Existing Projects into Workspace
  4. select the project
  5. after the workspace is loaded, if the project is not linked to SVN, select Team->Share Project and select the repository

3 Configure your Development Environment

There are 2 types of development that you will most often do with the MyAMI code, Python for core processing and PHP for the web interface.

1 Setup the Python environment

Go to Window -> Preferences -> PyDev -> Editor -> Interpreter – python. Press the Auto Config button then press OK.

2 Setup the PHP environment

There are two ways to view the web applications that you are developing in your home directory. If you are developing on a machine that does not have a local Apache server, you can use the Cronus3 web server. The advantage of this is that all the image processing plugins are already installed on Cronus3, so you don't have to worry about them or about making Apache work. If you do run Apache locally, you can take advantage of integrated debugging tools in Eclipse and learn more about how all the pieces of the project fit together, since you will have to set more things up.

Also note that the directions below will not get project_tools running. It is currently undergoing many changes and directions will be added when that process is complete.

Use Cronus3 or Fly to view your web app

Follow Use Cronus3 or Fly to view your web app

Create your config file

Use the setup wizard to create the config file by browsing to myamiweb/setup.
At AMI, you would go to cronus3/~YOUR_HOME_DIR/myamiweb/setup.

IMPORTANT: Never check your local copy of the config files into Subversion. We don't want to share our database user information with the world. You can right click on the config file and select team -> svn:ignore to tell svn to ignore this file.

  1. To edit the config file by hand:
    1. Change directories to the myamiweb project: $ cd ~/amiworkspace/myami/myamiweb
    2. Copy config.php.template to config.php: $ cp config.php.template config.php
    3. Open config.php for editing: $ vi config.php
    4. Make changes similar to this example config file, replacing "amber" with your home directory.

Use your local Apache server to view your web app (optional)

  1. Make sure you are logged in as root. ($ su)
  2. Make sure Apache is installed. There should be /etc/init.d/httpd on your machine. Start apache with $ /etc/init.d/httpd start.
  3. Point a web browser to http://localhost/ and make sure you see the apache test page.
  4. Make sure Apache is configured for php
  5. the Apache config file is at /etc/httpd/conf
  6. web files should be at /var/www/html
    1. create symbolic link to your workspace in /var/www/html
    2. Try to view at http://localhost/myamiweb
    3. If it is not working, you may need the files in phpami:
      1. go to phpami in your myami workspace and create symbolic links to each file that remove the .php file extension
      2. put a symbolic link to this folder in /usr/share ( $ ln -s /home/amber/amiworkspace/myami/phpami php )
  7. To debug PHP issues check http://localhost/myamiweb/info.php
  8. For everything to work, you need to install plugins like the MRC module

4 Get a local copy of the databases (optional)

If you want to work on the databases and you would prefer to have a local copy to play with, read How to set up a local copy of AMI databases.

You also have an option of creating a copy of the database that you wish to work with on the fly server. You will name your DB with your name prepended to the name of the DB that is copied. You will need to update your Config file accordingly. You can work with your DB without affecting formal testing on fly or the production databases.

5 Run a Python Script

  1. Change perspective to PyDev. (Window->Open Perspective)
  2. Right click on the myami project and choose Properties
  3. Select PyDev - PYTHONPATH
  4. Under Source Folders select Add source folder
  5. Make sure myami is selected and press OK
  6. Build numextension
    1. Right click on numextension/setup.py and select Properties
    2. Select Run/Debug settings
    3. Select Edit
    4. Select the Arguments tab
    5. Under program arguments type build
    6. Select OK
    7. Right click on setup.py and select Run As->Python Run
  7. Run Leginon
    1. Make sure the config files are set correctly (leginon.cfg, sinedon.cfg)
    2. Run leginon/syscheck.py by right clicking and selecting Run As->Python Run
    3. Right click on leginon/start.py and select Run As->Python Run

6 Merge a revision to another branch

  1. Commit your change and note the revision number.
  2. Go to the branch that you want to merge the code into and select Team->Merge.
  3. Under URL click Browse and select the trunk or branch that you already committed your changes to.
  4. In the revisions area, select the radio button next to Revisions: and click the Browse button.
  5. Select the revision that you wish to merge AND the revision immediately preceding yours.
  6. Select OK to close out the revision browser.
  7. Select the Preview Button to see what files will be merged.
  8. If the correct files are listed, select OK to close the preview, then select OK again to execute the merge.
  9. When it completes, Eclipse will open a synchronizing window where you can see the differences between your local, merged version of the files and the version held by SVN.
  10. If all the changes look correct, you may Commit the changes. Please include a comment in the commit message similar to the following:

Merge from trunk r14376 and r14383 to 2.0 branch, Fix for Post-processing does not work with FREALIGN jobs , refs #657

7 How to commit a change to svn

  1. In the Pydev Explorer or PHP explorer view, right click on the file, folder or project that you want to commit and select Team->Synchronize with repository. This will show you what files have been changed in your local sandbox.
  2. You may click on the changed files to view the specific differences that you will be checking in to ensure they are what you intended.
  3. Then right click and select Commit. Add a comment that references the redmine issue number (ex. refs #123). This will automatically link the revision number to the issue in redmine.

AMI Redmine Quick Start Guide

AMI is using Redmine for

1 Register as a user

If you have not registered, click on the Register link in the top right corner of the website. After submitting the information requested, an AMI administrator will be notified via email that your registration is pending approval. When you are approved, Sign in using the Sign in link at the top right corner of the website.

2 Post a question on the Forum

Once you have registered, you can post questions on the Forums.

3 Select a project to view

Select Projects from the top left corner of the website. You will see projects that you are a member of, as well as those that are public.

Although there is not a clear division in the software code between Appion and Leginon, we use these as project names to be consistent with how the products are presented to the rest of the world.
The single svn repository holding the code for both products is available from both Redmine projects.

Available Projects:

4 Add a new issue

When you are ready for the next level of issue settings check out the Issue Workflow Tutorial.

5 View issues

6 Edit a wiki page

To edit, click the Edit link at the top of the page.

7 Create a new wiki page

To create a new wiki page, edit an existing one and include double brackets around the name of the page that you wish to create like this:

[[My new wiki page]]

It will appear as a red link when it is saved.
Click on the link and add content to your new page.
Remember to save it before navigating away or you will lose all your hard work!
More information on Wiki creation is here.


An Introduction to Appion

  1. What is Appion?
  2. System Requirements
  3. Credits

Version Change Log >



Appion and Leginon Database Tools

  1. Administration
    1. Groups
    2. Users
    3. Instruments
    4. Applications
    5. Goniometer
  2. User Management
    1. Enable User Authentication
    2. New User Registration
    3. Retrieve Forgotten Password
    4. Modify Your Profile
    5. Logout
  3. Project Management
    1. View Projects
    2. Create New Project
    3. Edit Project Description
    4. Edit Project Owners
    5. Create a Project Processing Database
    6. Unlink a Project Processing Database
    7. Upload Images to a new Project Session
    8. Share a Project Session with another User
    9. View a Summary of a Project Session
    10. Grid Management
  4. Image Viewers
    1. Image Viewer Overview
    2. Image Viewer
    3. 2 Way Viewer
    4. 3 Way Viewer
    5. Dual Viewer
    6. RCT
  5. LOI - Leginon Observer Interface
  6. Tomography Tool
  7. Hole Template Viewer

Appion Processing >



Appion Citations

Appion includes software from the following packages (more information on most of these packages can be found in the Wikibook describing EM software packages):

ACE

Appion

CAN Reference-free Alignment

Chimera

Clustering by Affinity Propagation

CTFFind

CTFTilt

DogPicker

Ed's Iteration Alignment

EMAN

EM-BFACTOR

FindEM

Frealign

IMAGIC

IMOD

ProTomo

RMeasure

SIGNATURE

SPIDER

TiltPicker

Topology Alignment

XMIPP


  1. Enable the processing plug-in by uncommenting the following line in the file `myamiweb/config.php`:
    addplugin("processing");
    

     
  2. IMAGIC and Other features:
    // Check if IMAGIC is installed and running, otherwise hide all functions
    define('HIDE_IMAGIC', false);
    
    // hide processing tools still under development.
    define('HIDE_FEATURE', true);
    

     
  3. Add processing host information
     
    Appion version 2.2 and later:
    The following code should be added and modified for each processing host available.
    $PROCESSING_HOSTS[] = array(
    'host' => 'LOCAL_CLUSTER_HEADNODE.INSTITUTE.EDU', // for a single computer installation, this can be 'localhost'    
    'nproc' => 32,  // number of processors available on the host, not used
    'nodesdef' => '4', // default number of nodes used by a refinement job
    'nodesmax' => '280', // maximum number of nodes a user may request for a refinement job
    'ppndef' => '32', // default number of processors per node used for a refinement job
    'ppnmax' => '32', // maximum number of processors per node a user may request for a refinement job
    'reconpn' => '16', // recons per node, not used 
    'walltimedef' => '48', // default wall time in hours that a job is allowed to run
    'walltimemax' => '240', // maximum hours in wall time a user may request for a job
    'cputimedef' => '1536', // default cpu time in hours a job is allowed to run (wall time x number of cpu's) 
    'cputimemax' => '10000', // maximum cpu time in hours a user may request for a job
    'memorymax' => '', // the maximum memory a job may use
    'appionbin' => 'bin/', // the path to the myami/appion/bin directory on this host
    'appionlibdir' => 'appion/', // the path to the myami/appion/appionlib directory on this host
    'baseoutdir' => 'appion', // the directory that processing output should be stored in
    'localhelperhost' => '', // a machine that has access to both the web server and the processing host file systems to copy data between the systems
    'dirsep' => '/', // the directory separator used by this host
    'wrapperpath' => '', // advanced option that enables more than one Appion installation on a single machine, contact us for info 
    'loginmethod' => 'SHAREDKEY', // Appion currently supports 'SHAREDKEY' or 'USERPASSWORD' 
    'loginusername' => '', // if this is not set, Appion uses the username provided by the user in the Appion Processing GUI
    'passphrase' => '', // if this is not set, Appion uses the password provided by the user in the Appion Processing GUI
    'publickey' => 'rsa.pub', // set this if using 'SHAREDKEY'
    'privatekey' => 'rsa'      // set this if using 'SHAREDKEY'
    );
    

     
    Appion version 2.1 and prior:
    // --- Please enter your processing host information associate with -- //
    // --- Maximum number of the processing nodes                                    -- //
    // --- $PROCESSING_HOSTS[] = array('host' => 'host1.school.edu', 'nproc' => 4); -- //
    // --- $PROCESSING_HOSTS[] = array('host' => 'host2.school.edu', 'nproc' => 8); -- //
    
    // $PROCESSING_HOSTS[] = array('host' => '', 'nproc' => );
    

     
  4. Microscope spherical aberration constant
     
    Not needed for Appion version 2.2 and later. Version 2.1 and earlier only:
    $DEFAULTCS = "2.0";
    

     
  5. Redux server information
     
    This is a very new feature, post Appion 2.2, so you only need this if you are running the trunk.
    The Redux server replaces the old mrc_php module and will be released in Appion version 3.0.
    Information on redux parameters to add to the config.php

Appion Forums

Appion and Leginon use a shared forum

Appion is closely related to its sister product, Leginon.
They are built on the same code base, require a similar installation procedure, and share user, project and administration management tools.
Because of these similarities, Appion and Leginon also share a single user Forum.

The Forum can be found in the Leginon product's Forums tab.

To post or reply to a message, you must be logged into this site. If you have not yet registered, first go to the Registration page.


Appion Main Page

Overview

Appion is a "pipeline" for processing and analysis of EM images. Appion is integrated with "Leginon":http://leginon.org data acquisition, but can also be used stand-alone after uploading images (either digital or scanned micrographs) or particle stacks using a set of provided tools. Appion consists of a web-based user interface linked to a set of python scripts that control several underlying integrated processing packages. All data input and output within Appion is managed using tightly integrated SQL databases. The goal is to have all control of the processing pipeline managed from a web-based user interface and all output from the processing presented using web-based viewing tools.

The underlying packages integrated into Appion include EMAN, Spider, Frealign, Imagic, XMIPP, IMOD, ProTomo, ACE, CTFFind and CTFTilt, findEM, DogPicker, TiltPicker, RMeasure, EM-BFACTOR, and Chimera. These packages must be acknowledged by appropriate citations when used within Appion. Appropriate citations are provided on the individual pages in Appion as well as here.

Download Appion

Follow the Appion installation instructions to download and install Appion.

If you download Appion, we strongly encourage you to register as an Appion user.
This will allow us to keep you informed of new releases, bug fixes, and other useful information, and it also allows us to keep track of the user base, which is important to ensure future support of the software.

Appion User Manual

The Appion Manual includes:

Appion Developer's Guide

The developers guide is the primary resource for getting started with code development.
Appion is an open source project. You are free to contribute to it.

Publications

Primary Publications:

Other Citations:

Software Availability and Licensing Information

Appion is released under the Apache License, Version 2.0

Youtube videos

Citations

View the entire collection of Appion citations.

Contact information:

Please email with any questions.


Appion Manual

  1. An Introduction to Appion
    1. What is Appion?
    2. System Requirements
    3. Credits
       
  2. Version Change Log
     
  3. Complete Installation
     
  4. Upgrade Instructions
     
  5. Appion User Guide
    1. Appion and Leginon Database Tools - User management, Project management, Image Viewers, etc.
    2. Appion Processing - Image processing pipeline

Appion output

Viewing Appion Results


Appion Processing

  1. Terminology
  2. Common Workflow
  3. Processing Cluster Login
  4. Particle Selection
  5. CTF Estimation
  6. Stacks
  7. Particle Alignment
    1. Run Alignment
    2. Run Feature Analysis
    3. Run Particle Clustering
  8. Ab Initio Reconstruction
  9. Refine Reconstruction
  10. Helical Processing
  11. Tomography
    1. Align tilt series
    2. Create full tomogram
    3. Upload tomogram
    4. Create tomogram subvolume
    5. Average tomogram subvolumes
  12. Import Tools
  13. Image Assessment
  14. Region Mask Creation
  15. Synthetic Data



< Appion and Leginon Database Tools



Appion Testing

Index_of_test_scripts

related issues: #671

There is a framework in place to write a test script for a particular data set that can be run on demand. The tests may be launched from the website and results of each executed command may be viewed from the web.

The following steps should be taken to add a new test.
 
  1. Select a session to be used as the test dataset. At AMI, test data set projects are named with the following format: zz_parttype_partname
    The test session names begin with "zz_". Currently, each test session may have only one test class associated with it.
    See issue #1229 for more information on AMI's test data sets.
     
  2. Add the session name to the $TEST_SESSIONS array located in config.php in the myamiweb directory.
     
  3. Make a copy of test_zz07jul25b.py located in myami/appion/bin and rename it to match your session name with "test_" prepended.
     
  4. Edit the new "test_zz....py" file.
    1. Change the following class name at the beginning of the file to reflect your test session name:
       class Test_zz07jul25b(testScript.TestScript):

      This new class inherits all the properties and methods of the base class, TestScript, which is defined in myami/appion/appionlib/testScript.py. The TestScript class includes functions to create and execute commands for many of the Appion programs like dogpicker, pyace, maxlikeAlign, uploadTemplate, etc.
    2. At the bottom of the file replace the name of the class to be instantiated in the main function to match your new class name:
      if __name__ == "__main__":
          tester = Test_zz07jul25b()
      

      This main function will create an instance of your new class, run the start() function and finally run the close() function.
    3. Modify the start() function of your new class to run the commands relevant to your data set. Review the functions available in testScript.py to decide which parameters need to be set for each function. If you need to execute a function that is not available in the TestScript class, you can add the function to the base class (TestScript) so that other test classes can use it, or, if the function is very specific to the data set that you are testing, you can add a new function directly to your new class. If you add a new type of program/command/function to the TestScript class, you will want to ensure that the report page path is included in the buildJobReportPageLink() function in myami/myamiweb/processing/testsuiterunreport.php.
       
  5. Run your test
    1. Select your test session in the Appion Image Viewer and select Processing.
    2. Scroll down to the bottom of the menu and select "Run Test Script"
    3. Under "Limit: only process images" enter a small number between 1 and 10.
    4. Select "just show command" and copy the resulting command to paste into a terminal. See .... for more info on running from your sandbox.
    5. Once the test has run, click on the # complete link under the Testing Tools section of the menu. Notice your test run information.
    6. Select the test run to view a summary of the commands that were executed. Clicking on any of these will show more details of the run.

Appion tricks

How to move a Leginon session to a different Appion project (before any processing has been done)

  1. Find the projectexperiments table in the project database. This relates the Leginon sessions with appion projects and is the only place where a change needs to be made.
  2. Find the session id. In the web tools Project DB interface, browse to the project that currently holds the session and note the session id number of the session you wish to move.
  3. Find the old project id and the new project id. Look at the URL in the web browser while viewing the project. You should see "projectId=327" at the end of the url. The integer number is the project id to note. This number may also correspond to the appion database name (ex. ap327).
  4. In the projectexperiments table, search for the entry for your session and confirm the current project id is listed in its entry. Then edit the entry to include the new project id.
  5. Browse to the new project in the web interface and confirm the session is listed under the desired project.
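For illustration, the edit in step 4 amounts to a single UPDATE on the projectexperiments table. The sketch below uses sqlite3 so it can run standalone; the real change is made in the MySQL project database, and the column names (sessionId, projectId) and id values here are assumptions, not the actual schema:

```python
import sqlite3

# Illustrative only: mimic the projectexperiments table and move a session
# (id 501) from project 327 to project 400. Column names are assumptions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE projectexperiments (sessionId INTEGER, projectId INTEGER)")
cur.execute("INSERT INTO projectexperiments VALUES (?, ?)", (501, 327))

# Step 4: confirm the current project id for the session...
row = cur.execute("SELECT projectId FROM projectexperiments WHERE sessionId=?",
                  (501,)).fetchone()
assert row == (327,)

# ...then edit the entry to point at the new project id.
cur.execute("UPDATE projectexperiments SET projectId=? WHERE sessionId=? AND projectId=?",
            (400, 501, 327))
conn.commit()
```

After the update, browsing to the new project (step 5) should show the session, because this table is the only link between Leginon sessions and Appion projects.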

Appion User Guide

  1. Appion and Leginon Database Tools - User management, Project management, Image Viewers, etc.
  2. Appion Processing - Image processing pipeline

< Upgrade Instructions



Applications

Applications are for use with the Leginon image acquisition software.

If you are not using Leginon, you may ignore the Applications settings. If you are using Leginon, please refer to the Leginon user manuals section on Applications.


< Revert Settings | Goniometer >



Credits

Authors

Current Team

Bridget Carragher, Anchi Cheng, Amber Herold, Gabe Lander, Dmitry Lyumkis, Arne Moeller, Clint S. Potter, Jim Pulokas, Joel Quispe, Scott Stagg, Neil R. Voss, Craig Yoshioka, Lauren Fisher

Alumni

Jonathan Brownell, Satya Mallica, Sunita Nayak, Denis Fellmann, Eric Hou, Christopher Irving, Pick-wei Lau, Anke Mulder

Software packages

From the EM community

Appion exists to provide an integrated interface to the following Image Processing software packages:

Appion also depends on several community supported Open Source packages, including:

Funding

Appion is supported by funding from the National Institutes of Health National Center for Research Resources’ P41 program:

< System Requirements | Version Change Log >



Auto Align Tilt Pairs

Use this tool to automatically correlate and box the particles of a Random Conical Tilt session. Before you can run this program, you need to pick the particles with one of the available picking tools: Dog Picking, Manual Picking, or Template Picking.

General Workflow:

  1. Decide whether you want to box the particles on all tilt angles or just one (e.g. untilted images).
  2. Choose the picking run previously performed.
  3. Usually the default filter parameters work extremely well, but feel free to experiment with them.
  4. Submit the job.
  5. Once the job is finished, check the quality of tilt pair alignment using the Multi Image Assessment Tool.

Notes, Comments, and Suggestions:

  1. The Edit Particle Picks selection box indicates which particle runs to use for the alignment. You can use a single Dog pick run if all tilt angles were used during the run. Alternatively, you may select two particle selection runs, one for each tilt angle.

< Particle Selection|Multi Image Assessment >


Average tomogram subvolumes

Subvolume averaging as implemented here aligns the 3D subvolumes using:
  1. the 2D alignment of the particles picked from the Z-projection of the full tomogram, for alignment in the xy plane
  2. the center of mass of the central slice of each subvolume, for alignment in the z direction.
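The z-direction step can be sketched with a few lines of numpy. This is illustrative only, not Appion's implementation; it assumes subvolumes are arrays ordered (z, y, x) with positive density:

```python
import numpy as np

def z_center_shift(subvolume):
    """Return the integer z-shift that moves the center of mass to the middle.

    Integer shifts only, so no interpolation is needed.
    """
    profile = subvolume.sum(axis=(1, 2))          # 1D intensity profile along z
    com = (np.arange(len(profile)) * profile).sum() / profile.sum()
    return int(round(len(profile) / 2.0 - com))

# toy subvolume: density concentrated at z = 2 in a depth-10 volume
vol = np.zeros((10, 8, 8))
vol[2] = 1.0
shift = z_center_shift(vol)                       # positive: move density deeper
centered = np.roll(vol, shift, axis=0)            # density now at the middle slice
```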

General Workflow:

  1. Pick particles on the Z-projection images.
  2. Make a stack from these picked particles.
  3. Align and classify the stack in 2D.
  4. Select 2D classes to create a substack.
  5. Create tomogram subvolumes using the substack so that the alignment parameters are linked.
  6. Select "Average subvolumes" in the Appion sidebar under the Tomography submenu.
  7. Choose the substack.
  8. Enter a run name (the directory where it is saved is now determined automatically).

Notes, Comments, and Suggestions:


< Create Tomogram Subvolumes




Center Particles

This function centers the particles in a stack based on a radial average of all the particles in the stack. This program functions iteratively, using only integer shifts to avoid interpolation artifacts. Particles that do not consistently center are removed from the stack.

General Workflow:

  1. Check the run name and write a description for the new stack.
  2. Set the outer mask radius for centering (determined by particle or box size), and set a maximum number of pixels that any image can be shifted. If, in order to be centered, an image needs to be shifted more pixels than specified by the user, it will be eliminated from the stack.
  3. If you want to commit the new stack to the database, make sure this box is checked.
  4. Click "Center Particles" to submit to the cluster. Alternatively, click "Just Show Command" to obtain a command that can be copied and pasted in a unix shell.

Notes, Comments, and Suggestions:

<Filter by MeanStdev | Sort Junk >


CentOS6 installation

  1. Processing Host
    Note: You may not have permission to view the following pages.
     
    1. install SIMPLE
    2. install Free-Hand Test
    3. install CCP4
    4. install image2010
    5. install Protomo2
    6. install appion/leginon packages (myami-2.2)
       
  2. Issues related to CentOS6 install
    1. #2024
    2. #1989
    3. #1685
    4. #1492
    5. #1972
    6. #1971
    7. #1951
    8. #1833
    9. #1827
    10. #1685
    11. #1527
    12. #1525
    13. #1524
    14. #1516
    15. #1513
       
  3. Web server issues
    1. #1492
    2. #1271

Why CentOS?

If you have a new computer(s) for your Leginon/Appion installation, we recommend installing CentOS because it is considered to be more stable than other varieties of Linux.

CentOS is the same as Red Hat Enterprise Linux (RHEL), except that it is free and supported by the community.

We have the most experience installing on CentOS, and this installation guide has specific instructions for the process.

see Linux distribution recommendation for more.

Download the ISO disk of CentOS 5.x

Latest version tested at NRAMM: CentOS 5.8

Note: All formally released versions of Appion (versions 1.x and 2.x) run on CentOS 5.x. Appion developers, please note that the development branch of Appion is targeting CentOS 6.x and Appion 3.0 will run on CentOS 6.x.

  1. ISO files are available at
    1. http://wiki.centos.org/Download
    2. http://mirrors.kernel.org/centos/
  2. Click on i386 for 32bit machines or x86_64 for 64bit machines
  3. Pick a mirror and download the 'CentOS-5.8-i386-bin-DVD-1of2.iso' file

Confirm download went correctly

Perform a SHA1SUM confirmation:

sha1sum CentOS-5.8-i386-bin-DVD-1of2.iso

The result should be the same as in the sha1sum file provided by CentOS. This is found at the same location you downloaded the .iso file.
For example:
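The same check can be scripted in Python with hashlib, which is convenient when verifying several ISO files at once. The function below is an illustrative helper; the expected digest must still come from the sha1sum file provided by CentOS:

```python
import hashlib

def sha1_of_file(path, chunk=1 << 20):
    """Compute the SHA1 hex digest of a file, reading in 1 MB chunks."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            h.update(data)
    return h.hexdigest()

# usage (expected_digest taken from CentOS's sha1sum file):
#   sha1_of_file("CentOS-5.8-i386-bin-DVD-1of2.iso") == expected_digest
```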

Burn ISO file to DVD disk

Use dvdrecord in Linux to burn the disk.

dvdrecord -v -dao gracetime=10 dev=/dev/dvd speed=16 CentOS-5.8-i386-bin-DVD-1of2.iso 

Install CentOS with default packages

Add yourself to the sudoers file

Note: This step is optional, however you will need root access to complete the Appion Installation.

Make sure you have root permission.
Open the file in an editor. ex. vi /etc/sudoers
Look for the line: root ALL=(ALL) ALL.
Add this line below the root version:

your_username ALL=(ALL)       ALL

Logout and log back in with your username.

The CentOS installation is complete.


Check php information

Create the following info.php in your web server document root directory (/var/www/html on CentOS; /srv/www/htdocs on SuSE; you can find its location in httpd.conf, mentioned above, under the line starting with DocumentRoot).

sudo nano /var/www/html/info.php

Copy and paste the following code into info.php:

<?php
phpinfo();
?>

Restrict access to your info.php file.

sudo chmod 444 /var/www/html/info.php

Visit this page at http://HOST.INSTITUTE.EDU/info.php or http://localhost/info.php

You will see comprehensive tables of php and apache information, including the location of the additional .ini files, the include path, and which extensions are enabled.

Here is an example screen shot of the part of the info.php page that tells you where php.ini and other configuration files are. This information will be used while installing components of the Web Server.


< Install Apache Web Server | Download Appion and Leginon Files >



ClusterPhpSettings

A list of variables to set in your_cluster.php.

Introduction

These are variables you need to set in your_cluster.php, which you create based on the default_cluster.php we provide. The example in default_cluster.php should work if the appion processing disk can be accessed directly from the cluster.

Each define(var,value) either sets a number shown in your emanJobGen.php form or restricts the values allowed there, so that you do not accidentally request impossible numbers for your cluster when you use the form to submit a job in the future.

Details

Most variables end with _DEF or _MAX, where _DEF is the default value shown in the web form and _MAX is the physical limit, normally defined by your cluster hardware. The web form will complain if you enter a number larger than the maximum.

C_NAME means cluster name.

C_NODES means number of nodes used by your job.

C_PPN_DEF means the default number of processors per node shown on the web form.

C_PPN_MAX means the maximum number of processors per node.

C_RPOCS_DEF means the number of processors per node that the web form will enforce if you did not specify how many processors you want to use per node. This is most likely equal to either C_PPN_MAX or C_PPN_DEF; if you don't want people to waste processors, we recommend setting C_RPOCS_DEF to C_PPN_MAX.

C_WALLTIME (in hours) means the maximum real time your job is allowed to run. If your cluster is configured properly, it will suspend the job after that time so that it does not delay others.

C_CPUTIME (in hours) means the maximal cpu time your job is allowed to run.
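The _DEF/_MAX convention above can be sketched as a simple validation rule. The dictionary keys below mirror the define() constants, but this code is illustrative only, not part of Appion:

```python
# Illustrative sketch of the _DEF/_MAX convention: _DEF values pre-fill the
# web form, _MAX values bound what a user may request. Key names are
# modeled on the define() constants but this is not Appion code.
cluster = {
    "C_NODES_DEF": 4,     "C_NODES_MAX": 280,
    "C_PPN_DEF": 8,       "C_PPN_MAX": 8,
    "C_WALLTIME_DEF": 48, "C_WALLTIME_MAX": 240,
}

def validate_request(requested, key_prefix, settings):
    """Reject a requested value above the cluster's _MAX limit, like the web form."""
    if requested > settings[key_prefix + "_MAX"]:
        raise ValueError("%s request exceeds cluster maximum" % key_prefix)
    return requested

default_nodes = cluster["C_NODES_DEF"]       # what the form shows initially
validate_request(16, "C_NODES", cluster)     # allowed: 16 <= 280
```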


Code development process

  1. Developer writes code
  2. Developer checks code into Subversion
  3. Developer updates Redmine issue with:
    1. the svn revision number
    2. a description of the changes
    3. a reference to test cases or a description of how to test the changes
    4. set the Status to In Code Review
    5. assign the issue to another person to perform a code review
  4. Code reviewer receives an email that they have code to review
  5. Code reviewer inspects the changes to the code using the review guide.
    1. Revisions that involve complicated logic or widespread changes are better done in person. In this case the reviewer can ask the developer to do a walk through.
  6. If the reviewer finds a problem, the Redmine issue is updated with:
    1. a description of the problem
    2. the Assigned to field is set back to the developer and the process starts over.
  7. If no problems are found, the Redmine issue is updated with:
    1. The Status is set to In Test
    2. The Assigned to field is set to someone who can readily test it.
  8. The fly server is updated nightly with the latest code in SVN. The code may be tested at http://fly/myamiweb the day after the code is checked in.
  9. If the tester finds a problem, the Issue is reassigned to the developer and the process starts over.
  10. If the tester does not find a problem the Issue Status is set to Closed.

Code Review Guide


Code Standards

When everyone uses the same coding style, it is much easier to read code that someone else wrote. That said, style is not important enough to enforce during a code review. It is much more important to ensure that best practices are followed, such as implementing error handling.

Python

PHP

This one is a bit old, but it has lots of good material that goes beyond style. Some of it is questionable: I prefer getters/setters over Attributes as Objects (at least as the example shows it) to allow for better error handling, and I prefer no underscores in naming except for constants in all caps, but that is only a style issue.

PHP Coding Standard

From the Zend framework folks:
http://framework.zend.com/manual/en/coding-standard.html

An intro:
http://godbit.com/article/introduction-to-php-coding-standards

Nice Presentation:
http://weierophinney.net/matthew/uploads/php_development_best_practices.pdf

PHP Unit testing
http://www.phpunit.de/pocket_guide/

For automatically checking code against the Pear standards use CodeSniffer:
http://pear.php.net/package/PHP_CodeSniffer/

Best Practices:
http://www.odi.ch/prog/design/php/guide.php

JavaScript

Improved performance:
http://blog.monitis.com/index.php/2011/05/15/30-tips-to-improve-javascript-performance/

Any Language

Web performance best practices

Database


Coding Best Practices


Combine Stacks

Use this option if you want to combine stacks you already created (for example stacks from two different sessions). Simply select the stacks you want from the list and submit the job.

General Workflow:

  1. Check the new combined stack name, output directory path, and enter a description.
  2. Select the stacks to combine by checking the appropriate check-boxes. Stacks from different sessions can be combined, and there is not a limit on the number of different stacks that can be combined.
  3. Click on "Run Combine Stack" to submit to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be copied and pasted into a unix shell.
  4. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Stacks" submenu in the appion sidebar.
  5. Once the job is finished, an additional link entitled "1 complete" will appear under the "Stacks" tab in the appion sidebar. Clicking on this link opens a summary of all stack runs that have been done on this project.

Notes, Comments, and Suggestions:

  1. You cannot yet combine stacks of different pixel sizes. To do this, reset the pixel size and box size for your stack, then use the Upload Stack tool, followed by Combine Stacks.

<More Stack Tools | Particle Alignment >


Image Viewer Overview

Image Viewers allow you to view the images that are associated with a particular session (or experiment). You may select a project from a drop down list, then select a session in that project. The images belonging to that session appear in an Image List and the selected image is displayed in the Image View.

1 Anatomy of an Image Viewer

Primary Features:

Project Drop Down List: Projects that you own or that have been shared with you appear in the list. Select one to view.
Session Drop Down List: Sessions belonging to the currently selected Project appear in the list. Select one to view.
Image List: The images belonging to the currently selected Session appear in the Image List. The selected image appears in the Image View. The total number of images is displayed at the top of the list.
Image View: The selected image is displayed in the Image View. It may be configured using the Image Tools controls located directly above it.
Image Tools: Includes many basic image manipulation features such as filtering and Fourier Transform.

Image Viewer Screen:
Image Viewer Screen Marked

 

2 Features available with all viewers

2.1 From the header buttons

  1. View an aggregate summary of all the images in a session
  2. Launch the Appion Image processing application
  3. Launch a tool to create jpegs of all or many of the images in a session
     
    Image Viewer Header Buttons:

2.2 From the Image Tools panel

  1. View the mrc images associated with Projects and Sessions
  2. Adjust image properties
  3. Mark images as hidden or exemplar
  4. View a slideshow of the images in a session
  5. Download individual images in mrc, tiff or jpeg format
  6. View a detailed image report including mrc header information and calibrations
  7. View the Fourier Transform of the image
  8. View ACE graphs
  9. View Particle Picks
  10. View Leginon MSI focus and acquisition targets
  11. Overlay a scale ruler
     
    Features of the Image Tools Panel:

2.3 Dequeue Tool:

Used to remove queued targets on the image shown in the viewer, as well as queued targets chosen on its direct descendant images. Clicking on it shows the number of active queued targets, and the user can choose to remove them from the active list.

 

3 Chose the right viewer for the job

Image Viewer applications available in Appion and Leginon web tools:
Viewer Name Viewer Features
Image Viewer provides a single image pane
2 Way Viewer provides 2 image panes for viewing the same mrc file in different ways side by side
3 Way Viewer provides 3 image panes for viewing the same mrc file in different ways
Dual Viewer provides 2 image panes for viewing separate mrc images side by side
RCT provides 2 image panes for viewing Tomography tilt images

^ Image Viewers | Image Viewer >



Common variables used

variable dump

cd ~/myami/myamiweb/processing
cat *.php | grep '\$[A-Za-z]' | sed 's/\$_[A-Za-z]*//' | sed 's/[^$]*\(\$[A-Za-z0-9]*\)[^$]*/\1 \
/g' | sort | uniq -c | sort -rn | head -50

<# of occurrences> <variable name>

1066 $command 
1001 $particle 
 943 $ 
 854 $expId 
 630 $i 
 387 $formAction 
 385 $html 
 366 $javascript 
 349 $outdir 
 337 $projectId 
 327 $runname 
 326 $sessionId 
 299 $extra 
 213 $description 
 200 $graph 
 198 $stackid 
 198 $sessioninfo 
 186 $apix 
 180 $sessiondata 
 162 $display 
 160 $title 
 158 $templatetable 
 157 $user 
 136 $line 
 131 $javafunctions 
 127 $heading 
 126 $numpart 
 125 $jobinfo 
 117 $errors 
 114 $stackinfo 
 110 $t 
 110 $key 
 109 $s 
 108 $templateinfo 
 101 $sessionpath 
  98 $bin 
  96 $tomogram 
  96 $sub 
  96 $nproc 
  96 $filename 
  94 $stackId 
  91 $headinfo 
  90 $sessionname 
  90 $data 
  89 $j 
  89 $cmd 
  89 $box 
  89 $alignId 
  86 $r 


Common Workflow

Appion presents the user with a menu of image-processing options that is dynamically updated as each step is completed. When the user clicks on one of the menu options, Appion generates a new web page specific to the selected operation that requests inputs and allows the user to launch jobs on one of several processing machines or clusters.

Job progress is monitored by updates to the menu. The user can check the progress of the job through the logfile, accessible through the web pages after the job has been launched. Additionally, the user can kill the job from the webpage by clicking on the "kill job" button when viewing the logfile. (Note: if the job is manually killed from the terminal, the database does NOT get updated. The user must manually run the updateAppionDB.py script [updateAppionDB.py jobid status [projectid]], e.g. "updateAppionDB.py 1234 D 1".)

Once a completed job shows up in the menu, the user may click on its entry to generate a web page that reports on the results. Most input options are provided with defaults, and help for each input is provided as pop-ups on the Appion web pages. Detailed step-by-step instructions for most of the procedures are available within the Appion documentation.

1. Step by Step Guide to 3D Reconstruction in Appion

2. Quality Assessment and Processing Output pages in Appion

3. Random-Conical Tilt Reconstruction Workflow in Appion



< Terminology | Step by Step Guide >



Compile Ace2 from source

Install supporting packages

Name: Download site: yum package name SuSE rpm name
gcc-objc gcc-objc
fftw3-devel fftw3-devel
gsl-devel gsl-devel

Compile Ace 2 from source with FFTW 3.2 or later

It is recommended that you use FFTW version 3.2 or later, because optimizations in FFTW 3.2 make Ace2 run significantly faster than with FFTW 3.1 (the version distributed with CentOS). This route takes more work, however: you will need to install FFTW 3.2 from source code and then add the -DFFTW32 flag to the CFLAGS line in the Makefile.

Compile Ace 2 from source with FFTW 3.1

Install Ace2 ^



Compile FindEM

Install supporting packages:

Name: Download site: yum package name SuSE rpm name
compat-gcc-34-g77 compat-gcc-34-g77
gcc-gfortran gcc-gfortran

Test FindEM binary

Both 32 and 64 bit findem binaries are already available in the myami/appion/bin directory.
Test them by changing directories to myami/appion/bin and typing one of the following commands:

./findem64.exe         (64 bit version)

or

./findem32.exe         (32 bit version)

If the binary runs without crashing, you are good to go.

Install FindEM from source

If the binary included with Appion does not work, or you wish to compile it yourself, follow the instructions to install FindEM from source.


< Install Grigorieff lab software | Install Ace2 >



Compile Radermacher

Pre-install

sudo yum install libgomp

Build and install

$ cd myami/modules/radermacher
$ python ./setup.py build
$ sudo python ./setup.py install

Quick test to see if installed properly

$ python
>>> import radermacher
>>> <Ctrl-D>

< Compile Ace2 | Install Xmipp >



Complete Installation

There are three main components of the Appion system: a Database Server, a Processing Server and a Web Server. These may be installed on separate computers, or on the same computer. Several installation options are listed below. If you are unsure which installation option to choose for your situation, please inquire on the Software Installation Forum. There are also instructions to register for a Redmine account which is needed to make a Forum post.

Automatic Installation Script:

The Automatic Installation Script installs a fully functional demo version of Appion. This script is intended to be used on a single computer running a fresh installation of the CentOS operating system. The process is very quick and easy and includes GroEL images for you to begin processing right away.

  1. Install Appion and Leginon using the auto-installation tool

Manual Installation Instructions:

The Manual Installation Instructions are intended for a production system. We recommend using the CentOS operating system, but we include instructions for Fedora under the Alternative Options below. Also under Alternative Options you will find instructions for installing Appion with an existing Leginon installation.

  1. Select Linux distribution to use
  2. CentOS Installation:
    1. Instructions for installing CentOS on your computer
    2. Download additional Software (CentOS Specific)
  3. Database Server Installation
  4. File Server Setup Considerations
  5. Processing Server Installation
  6. Web Server Installation
  7. Additional Database Setup After Web Server Initialization
  8. Create a Test Project
  9. Setup Remote Processing
  10. Security Considerations

Alternative Options:

  1. Fedora Installation:
    1. Instructions for installing Fedora on your computer
    2. Download additional Software (Fedora Specific)
  2. Installing Appion with an existing Leginon installation

Troubleshooting your installation:

  1. Installation Troubleshooting Guide

< Version Change Log | Upgrade Instructions >



Configure .appion.cfg

  1. Create a hidden file called .appion.cfg in the myami directory, at the same level as the myami lib and bin folders.
    Note: This file may be added to a users home directory on the processing host to override the configuration found in the installation directory.
     
  2. Add the following contents:
    ProcessingHostType=Torque
    
    Shell=/bin/csh
    
    ScriptPrefix=
    
    ExecCommand=/usr/local/bin/qsub
    
    StatusCommand=/usr/local/bin/qstat
    
    AdditionalHeaders= -m e, -j oe
    
    PreExecuteLines=
    
  3. Modify the settings for your Processing Host


< Configure sinedon.cfg | Install External Packages >



Configure leginon.cfg

For older versions of Appion and Leginon (pre-2.2), please use the following instructions:
Instructions for Appion and Leginon versions prior to 2.2

The order in which Leginon/Appion looks for leginon.cfg

  1. individual user home directory
  2. $PYTHONSITEPKG/leginon
     
    Note: You can discover the $PYTHONSITEPKG path by starting python:
    python
    import sys
    sys.path
    

    The first path to site-packages should hold the config file.
     
  3. /etc/myami (on Unix)

Locate the search directories and the currently loaded cfg files by running the following python script in $PYTHONSITEPKG/leginon

configcheck.py
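The lookup order above can be sketched as a small Python helper. This is an illustrative sketch only, not Leginon's actual implementation; the function names here are hypothetical.

```python
# Illustrative sketch of the leginon.cfg lookup order described above.
# This is NOT Leginon's actual code; config_dirs() and find_config()
# are hypothetical helpers.
import os
import sys

def config_dirs():
    """Yield directories in the order Leginon/Appion searches them."""
    yield os.path.expanduser("~")             # 1. individual user home directory
    for p in sys.path:                        # 2. $PYTHONSITEPKG/leginon
        if os.path.basename(p) == "site-packages":
            yield os.path.join(p, "leginon")
            break
    yield "/etc/myami"                        # 3. system-wide location (Unix)

def find_config(name="leginon.cfg"):
    """Return the first matching config file, or None if none is found."""
    for d in config_dirs():
        path = os.path.join(d, name)
        if os.path.isfile(path):
            return path
    return None
```

A per-user file in the home directory therefore wins over the global copies, matching the override behavior described in these pages.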

Configuration file template

A skeleton (default) configuration file is available:

/path/to/myami/leginon/leginon.cfg.template

Create a global leginon.cfg

Copy leginon.cfg.template to leginon.cfg.

sudo cp -v /path/to/myami/leginon/leginon.cfg.template /etc/myami/leginon.cfg

Edit the newly created file and add a directory for images. Make sure you have permission to save files at this location. See File Server Setup Considerations for more details.

You may put in a fake path on the microscope PC installation and ignore the error message at the start of Leginon, provided you follow our general rule of never saving any image directly from the microscope PC:

[Images]
path: your_storage_disk_path/leginon

The rest of the configuration options are fine left at their defaults. For Appion purposes, leginon.cfg is not needed for individual users.


< Install Appion/Leginon Packages | Configure sinedon.cfg >



Configure leginon.cfg for Appion/Leginon v2.1 and lower

The order in which Leginon/Appion looks for leginon.cfg

  1. individual user home directory
  2. $PYTHONSITEPKG/leginon/config
     
    Note: You can discover the $PYTHONSITEPKG path by starting python:
    python
    import sys
    sys.path
    

    The first path to site-packages should hold the config file.

configuration file template

A skeleton (default) configuration file is available:

$PYTHONSITEPKG/leginon/config/default.cfg

where $PYTHONSITEPKG is your python site-packages directory

Create a global leginon.cfg

Copy default.cfg to leginon.cfg.

sudo cp -v $PYTHONSITEPKG/leginon/config/default.cfg $PYTHONSITEPKG/leginon/config/leginon.cfg

Edit the newly created file and add a directory for images. Make sure you have permission to save files at this location. See File Server Setup Considerations for more details.

You may put in a fake path on the microscope PC installation and ignore the error message at the start of Leginon, provided you follow our general rule of never saving any image directly from the microscope PC:

[Images]
path: your_storage_disk_path/leginon



Configure php.ini

Edit the following items in php.ini (found as /etc/php.ini on CentOS and /etc/php5/apache2/php.ini on SuSE)

sudo nano /etc/php.ini

so that they look like the following:

error_reporting = E_ALL & ~E_NOTICE & ~E_WARNING
 
display_errors = On
 
register_argc_argv = On
 
short_open_tag = On
 
max_execution_time = 300 ; Maximum execution time of each script, in seconds
max_input_time = 300 ; Maximum amount of time each script may spend parsing request data
memory_limit = 256M ; Maximum amount of memory a script may consume

You may want to increase max_input_time and memory_limit if the server is heavily used. At NRAMM, max_input_time=600 and memory_limit=4000M.

You should also set the timezone using one of the valid strings listed at http://www.php.net/manual/en/timezones.php, like this:

date.timezone = 'America/Los_Angeles'

< Install Web Server Prerequisites | Install Apache Web Server >



Configure sinedon.cfg

Sinedon is an object relational mapping library designed to interact with the Leginon and Appion databases.

For older versions of Appion and Leginon (pre-2.2), please use the following instructions:
Instructions for Appion and Leginon versions prior to 2.2

The order in which Leginon/Appion looks for sinedon.cfg

  1. individual user home directory
  2. $PYTHONSITEPKG/sinedon
     
    Note: You can discover the $PYTHONSITEPKG path by starting python:
    python
    import sys
    sys.path
    

    The first path to site-packages should hold the config file.
     
  3. /etc/myami (on Unix)

Locate the search directories and the currently loaded cfg files by running the following python script in $PYTHONSITEPKG/leginon

configcheck.py

Create a sinedon.cfg for all users

Note: If you are a developer, and you need to use sinedon.cfg settings that are different from the global settings, you may create your own sinedon.cfg file and place it in your home directory. This version will override the global version located in the site packages directory.

The Appion database is set dynamically through the project database, so no module entry is needed here.


< Configure leginon.cfg | Configure .appion.cfg >



Configure sinedon.cfg for Appion/Leginon versions pre-2.2

Sinedon is an object relational mapping library designed to interact with the Leginon and Appion databases.

Note: If you are a developer, and you need to use sinedon.cfg settings that are different from the global settings, you may create your own sinedon.cfg file and place it in your home directory. This version will override the global version located in the site packages directory.



Configure web server to submit jobs

This file is no longer needed as of Appion version 2.2. The information that was configured in the cluster.php file is now set in the main config.php file in the PROCESSING_HOST array. (instructions for editing the config.php file)

configuration at web server side

IMPORTANT: What we refer to here as your_cluster.php should not be taken literally. For example, if you access your cluster through the network with the name "bestclusterever", you should name your cluster configuration php file bestclusterever.php, not your_cluster.php.
  1. Go to your myamiweb/processing directory (on CentOS this may be in /var/www/html)
  2. Copy default_cluster.php to your_cluster.php
  3. Edit your_cluster.php to correspond to your cluster configuration.
  4. Run the setup wizard found at http://YOUR_SERVER/myamiweb/setup to register the your_cluster.php you just created.

Setup Remote Processing ^



Configure web server to submit job to local cluster

Option 1: Use the setup wizard and add the local cluster headnode

  1. Go to the setup wizard: http://localhost/myamiweb/setup or http://HOST.INSTITUTE.EDU/myamiweb/setup

Option 2: Manually edit the config.php and add the local cluster headnode to the config.php

Edit the file /var/www/html/myamiweb/config.php and ensure the following changes are made:

  1. Enable the processing plug-in by uncommenting the following line in the file `myamiweb/config.php`:
    addplugin("processing");
    

     
  2. IMAGIC and Other features:
    // Check if IMAGIC is installed and running, otherwise hide all functions
    define('HIDE_IMAGIC', false);
    
    // hide processing tools still under development.
    define('HIDE_FEATURE', true);
    

     
  3. Add processing host information
     
    Appion version 2.2 and later:
    The following code should be added and modified for each processing host available.
    $PROCESSING_HOSTS[] = array(
    'host' => 'LOCAL_CLUSTER_HEADNODE.INSTITUTE.EDU', // for a single computer installation, this can be 'localhost'    
    'nproc' => 32,  // number of processors available on the host, not used
    'nodesdef' => '4', // default number of nodes used by a refinement job
    'nodesmax' => '280', // maximum number of nodes a user may request for a refinement job
    'ppndef' => '32', // default number of processors per node used for a refinement job
    'ppnmax' => '32', // maximum number of processors per node a user may request for a refinement job
    'reconpn' => '16', // recons per node, not used 
    'walltimedef' => '48', // default wall time in hours that a job is allowed to run
    'walltimemax' => '240', // maximum hours in wall time a user may request for a job
    'cputimedef' => '1536', // default cpu time in hours a job is allowed to run (wall time x number of cpu's) 
    'cputimemax' => '10000', // maximum cpu time in hours a user may request for a job
    'memorymax' => '', // the maximum memory a job may use
    'appionbin' => 'bin/', // the path to the myami/appion/bin directory on this host
    'appionlibdir' => 'appion/', // the path to the myami/appion/appionlib directory on this host
    'baseoutdir' => 'appion', // the directory that processing output should be stored in
    'localhelperhost' => '', // a machine that has access to both the web server and the processing host file systems to copy data between the systems
    'dirsep' => '/', // the directory separator used by this host
    'wrapperpath' => '', // advanced option that enables more than one Appion installation on a single machine, contact us for info 
    'loginmethod' => 'SHAREDKEY', // Appion currently supports 'SHAREDKEY' or 'USERPASSWORD' 
    'loginusername' => '', // if this is not set, Appion uses the username provided by the user in the Appion Processing GUI
    'passphrase' => '', // if this is not set, Appion uses the password provided by the user in the Appion Processing GUI
    'publickey' => 'rsa.pub', // set this if using 'SHAREDKEY'
    'privatekey' => 'rsa'      // set this if using 'SHAREDKEY'
    );
    

     
    Appion version 2.1 and prior:
    // --- Please enter your processing host information associate with -- //
    // --- Maximum number of the processing nodes                                    -- //
    // --- $PROCESSING_HOSTS[] = array('host' => 'host1.school.edu', 'nproc' => 4); -- //
    // --- $PROCESSING_HOSTS[] = array('host' => 'host2.school.edu', 'nproc' => 8); -- //
    
    // $PROCESSING_HOSTS[] = array('host' => '', 'nproc' => );
    

     
  4. Microscope spherical aberration constant
     
    Not needed for Appion version 2.2 and later. Version 2.1 and earlier only:
    $DEFAULTCS = "2.0";
    

     
  5. Redux server information
     
    This is a very new feature, post Appion 2.2, so you only need this if you are running the trunk.
    The Redux server replaces the old mrc_php module and will be released in Appion version 3.0.
    Information on redux parameters to add to the config.php


< Install SSH module for PHP | Testing job submission >



Convert Mass into Volume

This page will help you convert the mass of a macromolecule into a diameter in a micrograph.

Densities

Starting from the relation between mass, density, and volume, we can convert the density into cubic Ångstroms per Dalton.

Converting into a diameter

Case 1: the particle is spherical.

Case 2: the particle is flattened like an M&M candy, i.e. an oblate spheroid.

Examples

400 kDa protein

Case 1: the particle is spherical.

Case 2: the particle is flattened.
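As a worked version of the spherical case, the arithmetic can be sketched in Python. The protein density of 1.35 g/cm³ used below is a typical literature value assumed here, not a number taken from this page.

```python
# Hedged sketch: convert a protein mass (Da) to an approximate spherical
# diameter (Angstroms). The density 1.35 g/cm^3 is an assumed typical
# value for protein, not a figure from this page.
import math

DA_IN_GRAMS = 1.66054e-24     # 1 Dalton in grams
CM3_IN_A3 = 1e24              # 1 cm^3 = 1e24 cubic Angstroms
PROTEIN_DENSITY = 1.35        # g/cm^3, assumed typical protein density

def mass_to_volume(mass_da):
    """Volume in cubic Angstroms for a given mass in Daltons (~1.23 A^3/Da)."""
    grams = mass_da * DA_IN_GRAMS
    return grams / PROTEIN_DENSITY * CM3_IN_A3

def sphere_diameter(volume_a3):
    """Diameter of a sphere with the given volume (Case 1)."""
    radius = (3.0 * volume_a3 / (4.0 * math.pi)) ** (1.0 / 3.0)
    return 2.0 * radius

if __name__ == "__main__":
    vol = mass_to_volume(400e3)          # the 400 kDa example
    print(round(sphere_diameter(vol)))   # roughly 98 A across
```

So a 400 kDa globular particle should appear roughly 10 nm across in a micrograph, which is a useful sanity check when setting particle diameters for picking.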

Convert Stack into Particle Picks

This option allows you to regenerate a stack from the original images. It is extremely useful if you want to change the box size of the particles, or use different filter or binning parameters.

General Workflow:

  1. Generate an initial dataset from the boxed particles (bin the particles to allow quick processing)
  2. After a couple of iterations and cleaning of the stack, use this option to regenerate an unbinned copy of the high-quality particles that remain.

Comments, Notes, and Suggestions:

<More Stack Tools | Particle Alignment >


Create Appion Session

If you want to test Appion processing, you need a session to work with. There is a script available to create a session for you, loaded with GroEL images.

Use appion/test/CreateTestSession.py found in the subversion repository.

Use -h to see help on how to use it.

You'll want to supply it with a project id and a run directory.

Test processing

testsuite.py - executes processing modules up to but not including reconstruction

Test Stacks

teststack.py - reads and writes stacks in all possible ways


Create a Processing Database for the Project

processing db: not set (create processing db) db name ap1
You can create the default numbered-style database (ap...) or give it a new name with the same prefix. If you want to specify a database name that does not use the default prefix, note that the db user specified in config.php in project_1_2 needs to have the necessary privileges for that database. You may additionally want to change the value assigned to $DEF_PROCESSING_PREFIX in project_1_2/config.php if you want to use your new prefix all the time.
processing db: ap1

See the next section on troubleshooting if you get the original page instead.

If you want all your processing databases combined in one single database (not recommended, as this becomes large very fast), just use the same name for all your projects.

The above procedure not only creates the database, but also creates some of the tables that you need to start processing.


Create a Project Processing Database

  1. Select the project DB tool from the Appion and Leginon Tools start page at http://YOUR_SERVER/myamiweb.
  2. Select your project by clicking on the project name.
  3. Click on the Create processing db button under the project Info section.

A name for the database is automatically assigned. You do not need to edit this name.

To change the database assigned to this project, select the unlink button.


< Edit Project Owners | Unlink a Project Processing Database >



Create a Test Project

Use the following guidelines for creating your first Appion project.

1 Go to the main web page

The url will vary based on your host name.

http://localhost/myamiweb/

2 Log into myamiweb

With the username "administrator" and the administrator password created in the wizard, log into myamiweb as shown below.
If you did not enable user login in the setup wizard, you will not be prompted for a password.


Default login screen


You will then see the default layout for the administrator.


Default layout for administrator


3 Add yourself as a user

  1. Select the Administration application.
  2. Select Users
  3. Select Add new user
  4. Complete the new user form
  5. There are four default groups available. You may also create a new group.
    1. administrators
    2. power users
    3. users
    4. guests
  6. Click on [add].

More about Users.

4 Create a project

  1. From the start page, select the ProjectDB application.
  2. Select "Add a new project"
  3. Complete the form
  4. Click on [add]
  5. Click "View Projects" to see your newly created project
  6. Select your new project to view it

5 Create the processing database for the project

Follow the instructions in Create a Processing Database. This will hold all the Appion processing data for your project.

6 Upload images to a new session

You can download sample images from here.
Then follow the steps in Upload Images.
You can use a pixel size of .83, binning of 1, magnification of 100,000, high tension of 120, and defocus of -0.89.

7 View the images

Once your images have been uploaded to a session, you can view them in the Image Viewer application.

  1. Return to ProjectDB and select your project
  2. Sessions are listed in a table, select the session that you just created.
  3. The image viewer will open with the first image in your session displayed.

8 Process Images

From the image viewer, click on the [processing] button at the top of the screen. This will open the Appion processing pipeline application.
From there, follow the directions in Process Images to confirm that your installation is functioning properly.


< Additional Database Server Setup | Setup Remote Processing >



Create full tomogram

General Workflow:

  1. Select the tiltseries for which a tomogram is to be calculated.
  2. Select the alignment run from Align tilt series to be used.
  3. Check the run name and enter a description.
  4. To submit the job to the cluster click the "Create Full Tomogram" button. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  5. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Create full tomogram" submenu in the appion sidebar.
  6. Now click on the "1 Complete" link under the "Create full tomogram" submenu. This opens a summary of all tomograms that have been calculated for this dataset.
  7. Clicking on the "id" opens a summary page for the calculated tomogram that includes input parameters and file directory paths.
  8. In order to upload this tomogram for further processing, click on the "Upload full tomogram" link in the Tomography menu on the Appion Sidebar.

Notes, Comments, and Suggestions:


< Align Tilt Series | Upload Tomogram >



Create New Project

To create a new project:
  1. Click on the Project DB icon in the Appion and Leginon Tools start page or browse to http://YOUR_HOST/myamiweb/project/project.php.
  2. Click on the Add a new project link just above the project table.
  3. Enter at least a Name and Short Description of the project as well as other relevant fields.
  4. Click on the add button
     
    New Project Screen

< View Projects | Edit Project Description >



Create Substack

This function allows creation of substacks.

General Workflow:

  1. Check run name and output directory.
  2. Enter a description for the substack
  3. Choose from three radio buttons to a) select a subset of particles, b) choose a certain number of particles to be sampled randomly, or c) split the current stack into several substacks.
  4. Make sure the "Commit to Database" box is checked if appropriate. Click the submit button to send the job to the cluster. Alternatively, click "Just Show Command" in order to copy and paste the command into a unix shell.

Notes, Comments, and Suggestions:

< Sort Junk | More Stack Tools >


Create tomogram subvolume

In order to extract subvolumes from a full-size tomogram, the user first needs to use the particle selection tool to pick the portions of the projected full tomogram on which the subvolumes are centered. Subtomograms of uniform size can then be extracted using the "create tomogram subvolumes" option under the Tomography menu.

General Workflow:

Select the center of the desired subtomogram as a particle

You are most likely to use "manual picking" for a unique object, which is what we outline here. If there are multiple copies of the particle projected onto the same plane and you plan to do 3D averaging of the subtomograms, you might be able to use other particle selection methods, too.

  1. In the Appion Sidebar, select the "manual picking" option under the "Particle Selection" submenu.
  2. Check the run name and output directory.
  3. Select the "zproj" preset from the dropdown menu. This means that you will be selecting portions of your tomogram using only these images.
  4. Set a diameter for the resulting images (won't affect actual boxing, just display).
  5. Check the commit to database box.
  6. Click the "Just Show Command" button.
  7. A new window will open with a command that can be copied and pasted into a unix terminal to run. NOTE: Manual picker cannot be run from the webpage!

Extract subvolumes of the full tomogram

  1. In the Appion Sidebar, select "create tomogram subvolumes" option under the "Tomography" submenu
  2. Choose to get the subvolume center from either a particle selection run or a stack of particles. (If the subvolumes will be averaged using Appion, a substack chosen from one or more classes of aligned particles from the 2D Z-projection images must be used here.)
  3. Select from the valid runs or stacks
  4. The default run name is based on the id of the particle selection or particle stack.
  5. Choose the size of the subvolume in pixels of the tilt series images.
  6. If the subvolume is offset in the Z direction, enter a non-zero subvolume center.
  7. Enter the binning to apply to the subvolumes.
  8. Invert the image density if this is a cryo-tomogram, since most rendering programs expect the object to be brighter than the background.
  9. Specify the full tomogram from which the subvolume is extracted.
  10. To average multiple tomogram subvolumes, click on the "Average tomogram subvolumes" link in the "Tomography" submenu on the Appion sidebar.

Notes, Comments, and Suggestions:


< Create Full Tomogram | Average Tomogram Subvolumes >



Crud Finding

General Workflow:

Notes, Comments, and Suggestions:


< Region Mask Creation | Manual Masking >



CtfFind Estimation

This algorithm works on tilted images.

General Workflow:

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Using the radio buttons select carbon or ice medium.
  3. Select the leginon preset corresponding to the images you'd like to process. Generally "_en" images in leginon are the raw micrographs, but uploaded film data will have a different preset. Selecting "all" will simply process all images.
  4. A dropdown menu allows the user to specify whether CtfFind will estimate CTF for all tilt angles, zero degree tilt angles, large tilt angles, or small tilt angles. The "Wait for more images" check box turns on the option that will wait until image collection is done before stopping CtfFind processing. The "Limit" box allows restrictions on the number of images to process, which is useful when testing parameters initially.
  5. Radio buttons under "Images to Process" allow a level of pre-processing image filtering. Images that were rejected, hidden, or made exemplars in the image viewer can be included or excluded here.
  6. Radio buttons under "Image Order" set the order in which images are processed, and radio buttons under "Continuation" give the option of continuing or rerunning a previous CtfFind run.
  7. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  8. Click on "Run CtfFind" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  9. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run CtfFind" submenu in the appion sidebar.
  10. Now click on the "1 Complete" link under the "Run CtfFind" submenu. This opens a summary of all CtfFind runs that have been done on this dataset, including a summary histogram of confidence values for all CTF estimation (Ace, Ace 2, and CtfFind) runs.
  11. Clicking on "download ctf data" opens a dialog for exporting CTF estimation results for use in another application.
  12. Clicking on the "CtfFind#" name opens a summary page of the parameters used for the particular run.
  13. The CTF parameters and micrograph tilt estimation determined by CtfFind can be applied to particles during particle boxing in appion with the Create Particle Stack tool.

Notes, Comments, and Suggestions:

  1. If multiple CTF estimation runs are performed for a single dataset, the Appion stack creation tool will select the CTF parameters determined with the highest confidence value (regardless of algorithm used) for a particle from a given micrograph.
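The selection rule in the note above can be sketched as follows (illustrative Python only, not Appion's actual implementation):

```python
# Sketch of the rule: for each micrograph, keep the CTF estimate with the
# highest confidence value, regardless of which algorithm produced it.
def best_ctf_per_image(estimates):
    """estimates: iterable of (image, method, confidence) tuples."""
    best = {}
    for image, method, confidence in estimates:
        if image not in best or confidence > best[image][1]:
            best[image] = (method, confidence)
    return best

runs = [("mic1", "ace2", 0.85), ("mic1", "ctffind", 0.91), ("mic2", "ace", 0.75)]
# best_ctf_per_image(runs) keeps the CtfFind result for mic1 and the Ace
# result for mic2, since each has the highest confidence for its image.
```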

< CTF Estimation | Create Particle Stack >



CTF Estimation

Image formation in EM is distorted by modulation of a contrast transfer function (CTF). The distortion depends on the physical parameters of the microscope, such as the accelerating voltage (keV) and lens aberrations. Correcting for these aberrations is done by comparing the experimentally observed power spectral density (PSD) of EM images to a theoretically generated CTF.
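As a rough sketch of what "a theoretically generated CTF" means, the standard weak-phase model produces a 1-D curve like the one below. The parameter defaults here are purely illustrative, not the values any Appion program uses:

```python
import numpy as np

def ctf_1d(defocus_um, cs_mm=2.0, kev=120.0, amp_contrast=0.07, n=512, apix=1.0):
    """1-D theoretical CTF under the standard weak-phase approximation."""
    volts = kev * 1e3
    # relativistic electron wavelength in Angstroms
    lam = 12.2643 / np.sqrt(volts * (1.0 + volts * 0.978466e-6))
    s = np.arange(n) / (2.0 * n * apix)   # spatial frequency, 1/Angstrom
    df = defocus_um * 1e4                 # defocus in Angstroms (underfocus > 0)
    cs = cs_mm * 1e7                      # spherical aberration in Angstroms
    # phase shift gamma(s), then the CTF itself
    gamma = np.pi * lam * df * s**2 - 0.5 * np.pi * cs * lam**3 * s**4
    return -np.sqrt(1.0 - amp_contrast**2) * np.sin(gamma) - amp_contrast * np.cos(gamma)
```

At zero frequency the curve equals minus the amplitude contrast; CTF estimation programs fit curves of this form to the observed PSD.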

Available CTF Procedures:

  1. Ace Estimation
  2. Ace 2 Estimation
  3. CtfFind Estimation

Appion SideBar Snapshot:

Notes, Comments, and Suggestions:

If your data span a large range of defocus, one parameter set, or even one estimation method, may not work for all images. In this case, you can start a separate run with different parameters that processes only the images with low-confidence results. Subsequent Appion processing (stack making) will then pick, image by image, the result with the highest confidence value. Note that Ace and Ace 2 produce generally comparable confidence values, while CtfFind values tend to be lower, which currently makes it harder to mix and match. See forum http://ami.scripps.edu/redmine/boards/13/topics/990



< Particle Selection | Create Particle Stack >



Database Server Installation

Install MySQL

The following is for the computer that hosts the databases. This involves installing MySQL server and creation/configuration of the leginondb and projectdb databases.

1 Install MySQL-Server and MySQL-Client

Note: You may already have MySQL Server and Client installed. Check by typing mysql at the command line.
If you see a MySQL prompt (mysql>), you may skip this step.

To install MySQL on Linux you have two options (the first is preferred):

  1. Use your package installer (yum, zypper, YaST, apt-get). For example:
    sudo yum install mysql mysql-server

    For Suse
    yast2 -i mysql mysql-client
  2. Download the latest MySQL-server package for Linux from http://www.mysql.com

2 Locate Example MySQL configuration files

They are usually located in /usr/share/mysql.

ls /usr/share/mysql/my*
    /usr/share/mysql/my-huge.cnf
    /usr/share/mysql/my-innodb-heavy-4G.cnf
    /usr/share/mysql/my-large.cnf
    /usr/share/mysql/my-medium.cnf
    /usr/share/mysql/my-small.cnf

If that does not work, try the locate command:
locate my | egrep "\.cnf$" 
    /etc/my.cnf
    /usr/share/mysql/my-huge.cnf
    /usr/share/mysql/my-innodb-heavy-4G.cnf
    /usr/share/mysql/my-large.cnf
    /usr/share/mysql/my-medium.cnf
    /usr/share/mysql/my-small.cnf

3 Configure my.cnf in /etc using my-huge.cnf as the template

  1. Copy my-huge.cnf to my.cnf
    sudo cp -v /usr/share/mysql/my-huge.cnf /etc/my.cnf
  2. Edit /etc/my.cnf to add or change query cache variables like these (be sure to place them under the [mysqld] section):
    query_cache_type = 1
    query_cache_size = 100M
    query_cache_limit= 100M
    
  3. Search for the text default-storage-engine in /etc/my.cnf. If it exists and is set to anything other than MyISAM, change it to:
    default-storage-engine=MyISAM
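For reference, MySQL reports these sizes in bytes when you check them with SHOW VARIABLES: 100M means 100 * 1024 * 1024 = 104857600. A small helper (illustrative only) for the conversion:

```python
# Convert my.cnf-style sizes ("100M", "1K", "8192") to bytes, the unit
# MySQL uses when reporting variables such as query_cache_size.
def cnf_size_to_bytes(value):
    units = {"K": 1024, "M": 1024**2, "G": 1024**3}
    suffix = value[-1].upper()
    if suffix in units:
        return int(value[:-1]) * units[suffix]
    return int(value)
```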

4 Start the MySQL Server

For CentOS/Fedora/RHEL system use the service command:

sudo /sbin/service mysqld start

For other Unix systems:

sudo /etc/init.d/mysqld start

or on some installations (Suse),

sudo /etc/init.d/mysql start

For future reference: start | stop | restart MySQL Server with similar commands:

For Centos, Fedora

sudo /etc/init.d/mysqld start
sudo /etc/init.d/mysqld stop
sudo /etc/init.d/mysqld restart

or
sudo /sbin/service mysqld start
sudo /sbin/service mysqld stop
sudo /sbin/service mysqld restart

or for Suse
sudo /etc/init.d/mysql start
sudo /etc/init.d/mysql stop
sudo /etc/init.d/mysql restart

5 Configure MySQL to start automatically at boot

sudo /sbin/chkconfig mysqld on

or for SuSe:
sudo /sbin/chkconfig --add mysql

6 For future reference, the database location will be:

ls /var/lib/mysql
    ibdata1  ib_logfile0  ib_logfile1  mysql  mysql.sock  test

7 Create the Leginon database, call it leginondb

sudo mysqladmin create leginondb

8 Create the Project database, call it projectdb

sudo mysqladmin create projectdb

9 Connect to mysql db

If starting from scratch, the mysql root user will have no password. This is assumed to be the case and we will set it later.

mysql -u root mysql

You should see a mysql prompt: mysql>

You can view the current mysql users with the following command.

select user, password, host from user;
      +------+----------+-----------+
      | user | password | host      |
      +------+----------+-----------+
      | root |          | localhost |
      | root |          | host1     |
      |      |          | host1     |
      |      |          | localhost |
      +------+----------+-----------+
      4 rows in set (0.00 sec)

10 Create user

Create and grant privileges to a user called usr_object for the databases on both the localhost and other hosts involved. For example, use wild card '%' for all hosts. You can set specific (ALTER, CREATE, DROP, DELETE, INSERT, RENAME, SELECT, UPDATE) privileges or ALL privileges to the user. See MySQL Reference Manual for details. The following examples demonstrate some of the options available.

11 Give create and access privileges for the processing databases which begin with "ap".

# if your web host is local
GRANT ALTER, CREATE, INSERT, SELECT, UPDATE ON `ap%`.* to usr_object@localhost; 
# for all other hosts if you are accessing the databases from another computer
GRANT ALTER, CREATE, INSERT, SELECT, UPDATE ON `ap%`.* to usr_object@'%.mydomain.edu';       
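In the GRANT statements above, `%` is SQL's multi-character wildcard, so `ap%` covers every database whose name begins with "ap" (Appion creates its processing databases with such names; treat the exact naming scheme as site-specific). The matching rule, sketched in Python:

```python
# Sketch of the `ap%` LIKE-pattern: `%` matches any (possibly empty) suffix,
# so the grant applies to every database whose name starts with "ap".
def covered_by_grant(dbname):
    return dbname.startswith("ap")
```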

12 Change Root password

To set the root password use the command:

sudo mysqladmin -u root password NEWPASSWORD

Or you can do it from within mysql

update user set password=password('your_own_root_password') where user="root";
Query OK, 2 rows affected (0.01 sec)
Rows matched: 2  Changed: 2  Warnings: 0

# run the flush privileges command to avoid problems
flush privileges;
^D or exit;

From now on, you will need to specify the password to connect to the database as root user like this:

mysql -u root -p mysql

13 Check MySQL variables

# at the command prompt, log into the leginon database

mysql -u usr_object -p leginondb

# At the mysql prompt show variables that begin with 'query'.
# Check that the changes you made to my.cnf are in place.

SHOW VARIABLES LIKE 'query%';
      +------------------------------+-----------+
      | Variable_name                | Value     |
      +------------------------------+-----------+
      | ft_query_expansion_limit     | 20        |
      | have_query_cache             | YES       |
      | long_query_time              | 10        |
      | query_alloc_block_size       | 8192      |
      | query_cache_limit            | 104857600 | ---This should correspond to your change
      | query_cache_min_res_unit     | 4096      |
      | query_cache_size             | 104857600 | ---This should correspond to your change
      | query_cache_type             | ON        | ---This should correspond to your change
      | query_cache_wlock_invalidate | OFF       |
      | query_prealloc_size          | 8192      |
      +------------------------------+-----------+
      10 rows in set (0.00 sec)

exit;
If you do not see your changes, try restarting mysql.
On centOS:
sudo /etc/init.d/mysqld restart

14 Make sure MySQL is running

mysqlshow -u root -p
      +--------------+
      | Databases    |
      +--------------+
      | mysql        |
      | leginondb    |
      | projectdb    |
      +--------------+

Run the following command from the command line:

Be sure to edit PASSWORD to the one you previously set for usr_object.

php -r "mysql_connect('localhost', 'usr_object', 'PASSWORD', 'leginondb'); echo mysql_stat();"; echo "" 

Expected output:

Uptime: 1452562 Threads: 1 Questions: 618 Slow queries: 0 Opens: 117 Flush tables: 1 Open tables: 106 Queries per second avg: 0.000

If there are any error messages, mysql may be configured incorrectly.

Note: If you do not have php and php-mysql packages installed you need to install them to run the above command. The yum installation is:

sudo yum -y install php php-mysql


< Download additional Software | File Server Setup Considerations >




Additional Setup After Webserver initialization

If your webserver installation is successful, a number of tables will be created in the databases. Several options for setting up database user privileges were recommended in Database Server Installation. The following additional steps should be taken, depending on which option you previously used.


DB Migration Process

The tables that will be affected are in the dbemdata database and the project database.
Migrate the user data from project to dbemdata because dbemdata is already in Sinedon format.

dbemdata

project

Future:
Eventually, we would like to have 3 databases, appion, leginon and project. The user related tables in dbemdata would be moved to project.
All the tables in project still need to be converted to Sinedon format.

1 Add new columns to UserData

Add:

Leave the existing columns as is. Use of "name" and "full name" (with a space) will be phased out.

2 Copy data to the UserData table

From users, copy username, firstname, lastname to UserData.

Update existing dbemdata.UserData entries with information from project.users when the names match.

UPDATE UserData, project.users, project.login
    SET UserData.username=project.users.username,
        UserData.firstname=project.users.firstname,
        UserData.lastname=project.users.lastname,
        UserData.email=project.users.email 
    WHERE UserData.`full name` like concat(project.users.firstname, ' ',project.users.lastname) 
          and project.login.userId = project.users.userId 
          and project.users.userId not in(63,211)
          and UserData.DEF_id != 54

Some names did not match exactly. Update these separately.

//Palida?
UPDATE UserData, projectdata.users
    SET UserData.username=projectdata.users.username,
        UserData.firstname=projectdata.users.firstname,
        UserData.lastname=projectdata.users.lastname,
        UserData.email=projectdata.users.email 
    WHERE projectdata.users.userId = 42
          AND UserData.DEF_id = 25

//Gabe?
UPDATE UserData, projectdata.users
    SET UserData.username=projectdata.users.username,
        UserData.firstname=projectdata.users.firstname,
        UserData.lastname=projectdata.users.lastname,
        UserData.email=projectdata.users.email 
    WHERE projectdata.users.userId = 65
          AND UserData.DEF_id = 29

//Edward Bridgnole
UPDATE UserData, projectdata.users
    SET UserData.username=projectdata.users.username,
        UserData.firstname=projectdata.users.firstname,
        UserData.lastname=projectdata.users.lastname,
        UserData.email=projectdata.users.email 
    WHERE projectdata.users.userId = 78
          AND UserData.DEF_id = 41

//Pickwei
UPDATE UserData, projectdata.users
    SET UserData.username=projectdata.users.username,
        UserData.firstname=projectdata.users.firstname,
        UserData.lastname=projectdata.users.lastname,
        UserData.email=projectdata.users.email 
    WHERE projectdata.users.userId = 122
          AND UserData.DEF_id = 57

//Mark Daniels
UPDATE UserData, projectdata.users
    SET UserData.username=projectdata.users.username,
        UserData.firstname=projectdata.users.firstname,
        UserData.lastname=projectdata.users.lastname,
        UserData.email=projectdata.users.email 
    WHERE projectdata.users.userId = 199
          AND UserData.DEF_id = 65

//Chris Arthur
UPDATE UserData, projectdata.users
    SET UserData.username=projectdata.users.username,
        UserData.firstname=projectdata.users.firstname,
        UserData.lastname=projectdata.users.lastname,
        UserData.email=projectdata.users.email 
    WHERE projectdata.users.userId = 35
          AND UserData.DEF_id = 67

//Fei Sun
UPDATE UserData, projectdata.users
    SET UserData.username=projectdata.users.username,
        UserData.firstname=projectdata.users.firstname,
        UserData.lastname=projectdata.users.lastname,
        UserData.email=projectdata.users.email 
    WHERE projectdata.users.userId = 233
          AND UserData.DEF_id = 76

//Chi-yu Fu
UPDATE UserData, projectdata.users
    SET UserData.username=projectdata.users.username,
        UserData.firstname=projectdata.users.firstname,
        UserData.lastname=projectdata.users.lastname,
        UserData.email=projectdata.users.email 
    WHERE projectdata.users.userId = 245
          AND UserData.DEF_id = 78

//Otomo Takanori    uId=79    puId=252
UPDATE UserData, projectdata.users
    SET UserData.username=projectdata.users.username,
        UserData.firstname=projectdata.users.firstname,
        UserData.lastname=projectdata.users.lastname,
        UserData.email=projectdata.users.email 
    WHERE projectdata.users.userId = 252
          AND UserData.DEF_id = 79

Insert the rest of the project.users entries into the dbemdata.UserData table.

This inserts users that have a corresponding project.login entry and have not already been merged into existing dbemdata.UserData entries.
NRAMM usernames with no login entry are not transferred.

INSERT INTO dbemdata.UserData (username, firstname, lastname, email)
SELECT projectdata.users.username,projectdata.users.firstname, projectdata.users.lastname,projectdata.users.email
FROM projectdata.users
WHERE projectdata.users.userId IN (SELECT projectdata.login.userId FROM projectdata.login)
AND (projectdata.users.userId NOT IN 
(
    SELECT projectdata.users.userId userId
    FROM dbemdata.UserData, projectdata.users, projectdata.login
    WHERE dbemdata.UserData.`full name` LIKE concat( projectdata.users.firstname, ' ', projectdata.users.lastname )
    AND projectdata.login.userId = projectdata.users.userId
)
AND projectdata.users.userId NOT IN ( 42, 65, 78, 122, 199, 35, 233, 245, 252, 63, 211 ))
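The WHERE logic of this INSERT can be restated as set operations (a Python sketch of the rule, not the migration code itself): a user is transferred only if they have a login entry, were not already merged by the full-name match, and are not among the manually handled userIds.

```python
def users_to_insert(all_user_ids, login_ids, merged_ids, manual_ids):
    """Mirror of the INSERT's WHERE clause: keep users with a login entry
    that were neither name-matched earlier nor handled manually."""
    return sorted(uid for uid in all_user_ids
                  if uid in login_ids
                  and uid not in merged_ids
                  and uid not in manual_ids)
```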

From project.login, copy user password to dbemdata.UserData.

UPDATE dbemdata.UserData, projectdata.login
SET dbemdata.UserData.password=projectdata.login.password
WHERE dbemdata.UserData.username = projectdata.login.username

3 Modify userdetails table

Remove the email column from the userdetails table.
From users, copy all needed fields.

Copy users from dbemdata.UserData to the project.userdetails table (inserts 188 rows).

INSERT INTO projectdata.userdetails 
  (`REF|leginondata|UserData|user`, 
   title, 
   institution, 
   dept, 
   address, 
   city, 
   statecountry, 
   zip, 
   phone, 
   fax, 
   url)
SELECT dbemdata.UserData.DEF_id, projectdata.users.title, projectdata.users.institution, 
  projectdata.users.dept, projectdata.users.address, projectdata.users.city, 
  projectdata.users.statecountry, projectdata.users.zip, projectdata.users.phone, 
  projectdata.users.fax, projectdata.users.url
FROM dbemdata.UserData, projectdata.users
WHERE dbemdata.UserData.username = projectdata.users.username
AND projectdata.users.userId NOT IN ( 216, 224, 107, 204, 219, 241, 261 )

ignore:
project.users.userId username
216 nramm_hetzer (dup w/less data)
224 nramm_hjing
107 nramm_jlanman
204 nramm_rkhayat
219 nramm_rkhayat
241 nramm_vinzenz.unger
261 nramm_vinzenz.unger

4 Create projectowner table

Move the data from pis table to a new projectowner table in the project database. This table will refer to users in the UserData table.
We will phase out use of the pis table.

Insert users that are project owners and do not have login info and do not have a dbem user name.
Set the passwords to the username.

Add the following project owners to dbemdata.UserData:
nramm_mbevans
nramm_erica
nramm_erwright
nramm_mgfinn
nramm_pucadyil
nramm_abaudoux
nramm_kuzman
nramm_my3r
nramm_liguo.wang
nramm_bbartholomew
nramm_cciferri
nramm_galushin
nramm_nachury
nramm_mfisher1
nramm_nicoles
nramm_gokhan_tolun
nramm_rkirchdo

INSERT INTO dbemdata.UserData (username, firstname, lastname, email, password)
SELECT projectdata.users.username,projectdata.users.firstname, projectdata.users.lastname, 
projectdata.users.email, projectdata.users.username
FROM projectdata.users
WHERE projectdata.users.username IN ("nramm_mbevans", "nramm_erica", "nramm_erwright", "nramm_mgfinn", 
"nramm_pucadyil", "nramm_abaudoux", "nramm_kuzman", "nramm_my3r", "nramm_liguo.wang", "nramm_bbartholomew", 
"nramm_cciferri", "nramm_galushin", "nramm_nachury", "nramm_mfisher1", "nramm_nicoles", "nramm_gokhan_tolun", 
"nramm_rkirchdo")

Add their details into the userdetails table

INSERT INTO projectdata.userdetails (`REF|leginondata|UserData|user`, title, institution, 
dept, address, city, statecountry, zip, phone, fax, url)
SELECT dbemdata.UserData.DEF_id, projectdata.users.title, projectdata.users.institution, 
projectdata.users.dept, projectdata.users.address, projectdata.users.city, projectdata.users.statecountry, 
projectdata.users.zip, projectdata.users.phone, projectdata.users.fax, projectdata.users.url
FROM dbemdata.UserData, projectdata.users
WHERE dbemdata.UserData.username = projectdata.users.username
AND projectdata.users.username IN ( "nramm_mbevans", "nramm_erica", "nramm_erwright", 
"nramm_mgfinn", "nramm_pucadyil", "nramm_abaudoux", "nramm_kuzman", "nramm_my3r", "nramm_liguo.wang", 
"nramm_bbartholomew", "nramm_cciferri", "nramm_galushin", "nramm_nachury", "nramm_mfisher1", "nramm_nicoles", 
"nramm_gokhan_tolun", "nramm_rkirchdo")

Update the pis table with the correct usernames.
The correct usernames are the ones that the users actually use to login to the system.
They have been found by manual inspection.

UPDATE projectdata.pis
SET projectdata.pis.username="chappie" 
WHERE projectdata.pis.username="nramm_chappie" 

UPDATE projectdata.pis
SET projectdata.pis.username="carthur" 
WHERE projectdata.pis.username="nramm_Christopher.Arthur" 

UPDATE projectdata.pis
SET projectdata.pis.username="cpotter" 
WHERE projectdata.pis.username="nramm_cpotter" 

UPDATE projectdata.pis
SET projectdata.pis.username="craigyk" 
WHERE projectdata.pis.username="nramm_craigyk" 

UPDATE projectdata.pis
SET projectdata.pis.username="dfellman" 
WHERE projectdata.pis.username="nramm_dfellman" 

UPDATE projectdata.pis
SET projectdata.pis.username="dlyumkis" 
WHERE projectdata.pis.username="nramm_dlyumkis" 

UPDATE projectdata.pis
SET projectdata.pis.username="southworth" 
WHERE projectdata.pis.username="nramm_dsouthwo" 

UPDATE projectdata.pis
SET projectdata.pis.username="fapalida" 
WHERE projectdata.pis.username="nramm_fapalida" 

UPDATE projectdata.pis
SET projectdata.pis.username="feisun" 
WHERE projectdata.pis.username="nramm_feisun" 

UPDATE projectdata.pis
SET projectdata.pis.username="glander" 
WHERE projectdata.pis.username="nramm_glander" 

UPDATE projectdata.pis
SET projectdata.pis.username="haoyan" 
WHERE projectdata.pis.username="nramm_hao.yan" 

UPDATE projectdata.pis
SET projectdata.pis.username="jaeger" 
WHERE projectdata.pis.username="nramm_jaeger" 

UPDATE projectdata.pis
SET projectdata.pis.username="koehn" 
WHERE projectdata.pis.username="nramm_koehn" 

UPDATE projectdata.pis
SET projectdata.pis.username="mmatho" 
WHERE projectdata.pis.username="nramm_mmatho" 

UPDATE projectdata.pis
SET projectdata.pis.username="moeller" 
WHERE projectdata.pis.username="nramm_moeller" 

UPDATE projectdata.pis
SET projectdata.pis.username="muldera" 
WHERE projectdata.pis.username="nramm_mulderam" 

UPDATE projectdata.pis
SET projectdata.pis.username="paventer" 
WHERE projectdata.pis.username="nramm_paventer" 

UPDATE projectdata.pis
SET projectdata.pis.username="rharshey" 
WHERE projectdata.pis.username="nramm_rasika" 

UPDATE projectdata.pis
SET projectdata.pis.username="nramm_langlois" 
WHERE projectdata.pis.username="nramm_rl2528" 

UPDATE projectdata.pis
SET projectdata.pis.username="rmglaeser" 
WHERE projectdata.pis.username="nramm_rmglaeser" 

UPDATE projectdata.pis
SET projectdata.pis.username="rtaurog" 
WHERE projectdata.pis.username="nramm_rtaurog" 

UPDATE projectdata.pis
SET projectdata.pis.username="sstagg" 
WHERE projectdata.pis.username="nramm_sstagg" 

UPDATE projectdata.pis
SET projectdata.pis.username="tgonen" 
WHERE projectdata.pis.username="nramm_tgonen" 

UPDATE projectdata.pis
SET projectdata.pis.username="vossman" 
WHERE projectdata.pis.username="nramm_vossman" 

UPDATE projectdata.pis
SET projectdata.pis.username="ychaban" 
WHERE projectdata.pis.username="nramm_ychaban" 

Add project co-owners (the people who actually access the project).
Many of the project owners do not actually access the data. Add the users who actually work with the project.

INSERT INTO projectdata.pis (projectId, username)
VALUES (200,"nramm_fazam"), (230,"glander"), (190,"jlee"), 
(231,"glander"), (203,"Ranjan"), (181,"kubalek"), (84,"strable"), 
(222,"nramm_barbie"), (199,"joelq")

Insert rows into projectowners.
All project owners now have usernames in dbemdata.UserData and all projects have an active owner in project.pis.

INSERT INTO projectdata.projectowners (`REF|projects|project`, `REF|leginondata|UserData|user`)
SELECT projectdata.pis.projectId, dbemdata.UserData.DEF_id
FROM dbemdata.UserData, projectdata.pis
WHERE dbemdata.UserData.username = projectdata.pis.username

5 Set groups and privileges

Set all null groups to 4 (users)

UPDATE dbemdata.UserData
SET dbemdata.UserData.`REF|GroupData|group`= 4
WHERE dbemdata.UserData.`REF|GroupData|group` IS NULL

Set all group privileges that are null to 3

UPDATE dbemdata.GroupData
SET dbemdata.GroupData.`REF|projectdata|privileges|privilege`=3
WHERE dbemdata.GroupData.`REF|projectdata|privileges|privilege` IS NULL

6 Update any NULL values in dbemdata.UserData

Set the full name in dbemdata.UserData.

UPDATE dbemdata.UserData
SET dbemdata.UserData.`full name` = concat(dbemdata.UserData.firstname, ' ', dbemdata.UserData.lastname)
WHERE dbemdata.UserData.`full name` IS NULL;
UPDATE dbemdata.UserData
SET dbemdata.UserData.username = dbemdata.UserData.name
WHERE dbemdata.UserData.username IS NULL;
UPDATE dbemdata.UserData
SET dbemdata.UserData.password = dbemdata.UserData.username
WHERE dbemdata.UserData.password IS NULL;
UPDATE dbemdata.UserData
SET dbemdata.UserData.firstname = "" 
WHERE dbemdata.UserData.firstname IS NULL;

update shareexperiments

UPDATE project.shareexperiments
SET project.shareexperiments.`REF|leginondata|SessionData|experiment` = project.shareexperiments.experimentId
WHERE project.shareexperiments.`REF|leginondata|SessionData|experiment` IS NULL;

add usernames where they are missing

UPDATE project.shareexperiments, project.users
SET project.shareexperiments.username = project.users.username 
WHERE project.users.userId = project.shareexperiments.userId 
AND project.shareexperiments.username IS NULL

update users who have a matching username in dbemdata

UPDATE project.shareexperiments, dbemdata.UserData 
SET project.shareexperiments.`REF|leginondata|UserData|user` = dbemdata.UserData.DEF_id 
WHERE dbemdata.UserData.username = project.shareexperiments.username 
AND project.shareexperiments.`REF|leginondata|UserData|user` IS NULL 

Default Groups and Privileges:

Group name     | Description                                                                            | Privilege
administrators | may view and modify all groups, users, projects and experiments in the system         | All at administration level
power users    | may view and modify anything not specifically owned by the default Administrator User | View all but administrate owned
users          | may view and modify projects that they own and view experiments shared with the user  | Administrate/view only owned projects and view shared experiments
guests         | may view projects owned by the user and experiments shared with the user              | View owned projects and shared experiments

Revert Settings

Revert Settings is a tool for use with the Leginon image acquisition software.

Leginon settings for the applications are saved in the database during the installation. When a user uses Leginon for the first time, the settings of the Appion/Leginon administrator user are loaded. The user can change them, and Leginon will remember the new values from then on.

In the case that a user incorrectly modifies Leginon application settings, the user or an administrator may revert all the settings of a specific user to the default values.

  1. Click on the Revert Settings icon
  2. Choose the user in the list, and click on the Revert button.
Expected result:

< Instruments | Applications >



Default users

Username | Firstname | Lastname | Displayed Group | Description
administrator | Leginon-Appion | Administrator | administrators | Default Leginon settings are saved under this user
anonymous | Public | User | guests | If you want to allow public viewing of a project or an experiment, assign it to this user

Developers guide

This guide is primarily intended to help newcomers, both to Appion and to programming in general, get up and running in the development environment we have created at AMI.
It is a good place to add notes, however basic, that may help someone else accomplish a task related to Appion software development.
Parts of this guide are specific to the machines and environment we have at AMI. Our apologies.

  1. System Overview
    1. Leginon
      1. Scope
      2. Windows Machine
    2. Appion
      1. Web Parts - web server
      2. Python Parts - processing server
      3. 3rd party apps
    3. Clusters
    4. Myami code module diagram
       
  2. Development Tools
    1. Redmine quick start - issue reporting and adding documentation
    2. Eclipse quick start - setting up an integrated development environment
    3. Subversion (SVN) - Tips for using our code repository
       
  3. Language and Technology Resources
    1. language specific tutorials, guides and tips
      1. HTML notes
      2. CSS notes
      3. PHP notes
      4. Python notes
      5. Javascript notes
    2. General Best Practices
    3. AMI's best practices
    4. Object Oriented Programming
    5. Useful shell commands
    6. Getting started with MySQL
       
  4. Installing Appion for development
    1. Running the code from your sandbox
       
  5. Adding a new program to the pipeline
    1. General Instructions
      1. Processing parts (Python)
        1. database access
      2. Web Parts (PHP)
        1. Adding an Appion job launch page
        2. reporting page
          1. Using basicReport.inc for very simple PHP report pages
        3. database access
    2. Adding a refinement method (single and multi model)
      1. Python wrapper for 3rd party programs (Anchi)
      2. Modifications to runJob.py (Christopher)
      3. Uploading results to the database (Dmitry)
      4. Adding the user interface (Amber)
         
  6. Testing
    1. Test datasets at AMI
    2. Create Appion Session for testing purposes
    3. How to set up AMI databases on your local machine: Handy if you want to play with the databases without affecting anyone else.
    4. Automated testing
    5. How to test upload images with your own sandbox
    6. How to run manual picking and mask making
       
  7. Error Handling
    1. Error handling guide
       
  8. Adding pop-up Help
    1. adding popup help
       
  9. Making changes to database tables
    1. Database change procedure
       
  10. Other stuff
    1. Where to find help
    2. Appion tricks
    3. Common variables used
    4. Appion Developer's Workshops
      1. 2010 Appion Developer Workshop
      2. 2011 Appion Developer Workshop
  11. Job submission vs direct Appion Script running from terminal: What are the differences in database logging and resource usage?

Differences between Linux flavors

Different Linux flavors often put web server and MySQL-related files in different locations, which can be confusing. From experience, we have found the CentOS and SuSE equivalents and list them here for reference. If your system uses different naming and you are willing to share your experience, please send us the list and we will add it here:

Table: Different file locations and commands on CentOS vs SuSE

File or Command | CentOS | SuSE
php.ini | /etc/ | /etc/php5/apache2/
httpd.conf | /etc/httpd/conf/ | /etc/php5/apache2/
default document_root | /var/www/html/ | /srv/www/htdocs/
apache start/stop/restart command head | /sbin/service httpd | /etc/init.d/apache2
mysql start/stop/restart command head | /sbin/service mysqld | /etc/init.d/mysql

For a more detailed comparison of Apache file layout on different Linux distributions, see http://wiki.apache.org/httpd/DistrosDefaultLayout


Install Web Server Prerequisites >



Doc test notes

Change from 5.4 to 5.5.
How can you tell what computer you have?

What is SHA1SUM confirmation and how do you find it?

Links not found:
* http://centos.mirrors.tds.net/pub/linux/centos/5.4/isos/x86_64/sha1sum.txt for 64bit
* http://centos.mirrors.tds.net/pub/linux/centos/5.4/isos/i386/sha1sum.txt for 32bit

Should I check the box that says "packages from CentOS extras" under "please select any additional repositories that you want to use for software installation"?

How do I add myself as a sudoer?
How do I make sure I have root permission?

There are many commands to know for doing the sudoers part; use a vi cheat sheet from Google.


Dog Picking

Use Dog Picker if you have no accurate idea of what your particle looks like or you simply want to pick everything (this will include blobs of noise).

General Workflow:

  1. Test first, then submit: Dog Picker will pick white blobs, so make sure to invert the density if necessary (usually for ice images).
    Test the settings on a characteristic image (simply paste the filename of a typical image into the test settings box). To get an idea of which parameters suit your data, mouse over the boxes; there you will find general estimates that you can use as a starting point. During testing, optimize one parameter at a time and then move on to the next. It is a good idea to optimize the settings on one image and then test them on a second image. Don't worry about the final box size or binning; these will be determined in the next step: stacks!
  2. Click on Run Dog Picker to submit the job: If you submit the job while you are still collecting data, use the option "wait for more images after finishing".
  3. Continue with stacks

Notes, Comments, and Suggestions:

  1. If you want to rerun the job with identical settings go to Repeat from other session and select the desired job
  2. Tilt images:

< Particle Selection | CTF Estimation >



Download additional Software (CentOS Specific)

Install the additional package repositories

There are several additional CentOS repositories that you can install. These repositories provide additional packages, such as patented software (MP3 players), closed source applications (Flash plugin, Adobe Acrobat Reader) and lesser used packages (python numpy, Gnu Scientific Library). But some repositories install packages over other packages, which can cause problems and conflicts (ATrpms is bad at this). So we recommend only installing EPEL and RPM Fusion. Read more here:
CentOS Additional Repositories
Particularly, pay attention to the note about protecting yourself from unintended updates from 3rd party packages. The following yum plugin may help you:
yum-priorities plugin

Extra Packages for Enterprise Linux (EPEL)

Download repository rpm and install

sudo rpm -Uvh http://dl.fedoraproject.org/pub/epel/5/`uname -i`/epel-release-5-4.noarch.rpm

or CentOS 6:

wget 'http://mirrors.cat.pdx.edu/epel/6/i386/epel-release-6-7.noarch.rpm'
sudo yum --nogpgcheck localinstall epel-release-6-7.noarch.rpm 
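In the CentOS 5 command above, the backticked `uname -i` substitutes the hardware architecture (e.g. x86_64 or i386) into the repository URL. A sketch of that substitution in Python (hypothetical helper, not part of Appion; platform.machine() is roughly `uname -m`, which is close enough here):

```python
import platform

def epel_release_url(arch=None):
    # Substitute the architecture into the URL, as the backticked
    # `uname -i` does in the shell command above.
    arch = arch or platform.machine()
    return ("http://dl.fedoraproject.org/pub/epel/5/%s/"
            "epel-release-5-4.noarch.rpm" % arch)

print(epel_release_url("x86_64"))
# http://dl.fedoraproject.org/pub/epel/5/x86_64/epel-release-5-4.noarch.rpm
```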

RPM Fusion (optional)

Download repository rpms and install

sudo rpm -Uhv http://download1.rpmfusion.org/free/el/updates/testing/5/`uname -i`/rpmfusion-free-release-5-0.1.noarch.rpm
sudo rpm -Uvh http://download1.rpmfusion.org/nonfree/el/updates/testing/5/`uname -i`/rpmfusion-nonfree-release-5-0.1.noarch.rpm

Update current packages

Update the updater to make life easier

sudo yum -y update yum*

Update all packages

sudo yum -y update

NOTE

Download was over 129 MB (in July 2009) and 333 MB (in May 2010). If you have a slow internet connection, you can set up presto/deltarpms; see this email and this email for more information.

NOTE

Sometimes I have problems with 32-bit packages, so I uninstall them:

rpm -qa --qf "%{NAME}.%{ARCH}\n" | grep i.86 | wc -l
sudo yum remove `rpm -qa --qf "%{NAME}.%{ARCH}\n" | grep i.86`
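The grep pattern `i.86` in the commands above matches all of the 32-bit architecture tags (i386, i486, i586, i686) in the `NAME.ARCH` lines that rpm prints. A small Python sketch of that filter, with hypothetical sample package names:

```python
import re

# Sample "NAME.ARCH" lines like those printed by:
#   rpm -qa --qf "%{NAME}.%{ARCH}\n"
packages = ["glibc.x86_64", "glibc.i686", "zlib.i386", "bash.x86_64"]

# grep 'i.86': the "." matches any single character,
# so i386/i486/i586/i686 all match, while x86_64 does not.
pattern = re.compile(r"i.86")
thirty_two_bit = [p for p in packages if pattern.search(p)]
print(thirty_two_bit)  # ['glibc.i686', 'zlib.i386']
```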

NOTE

You can also remove large packages like OpenOffice, Java, and GIMP to save space if you are just building a server:

sudo yum remove openoffice* gimp* java*

You will want to restart your computer when this completes.

sudo reboot

Install Complete list of additional packages:

General instructions for installation and configuration of some of these packages (such as mysql) are found later in this manual. It may be faster to install them now as a group rather than individually, but it is not necessary.

If you are using an RPM based system (e.g., SuSE, Mandriva, CentOS, or Fedora) this website is good for determining the exact package name that you need. For CentOS 5, just type:

sudo yum -y install \
python-tools python-devel python-matplotlib \
subversion ImageMagick grace gnuplot \
wxPython numpy scipy python-imaging \
gcc-gfortran compat-gcc-34-g77 \
gcc-objc fftw3-devel gsl-devel \
mysql mysql-server MySQL-python \
httpd php php-mysql phpMyAdmin  \
gcc-c++ openmpi-devel libtiff-devel \
php-devel gd-devel re2c fftw3-devel php-gd \
xorg-x11-server-Xvfb netpbm-progs \
libssh2-devel

If you have an nVidia video card and have set up RPM Fusion, install the nVidia binary driver; it will speed things up, especially for UCSF Chimera. This command works on Fedora:

sudo yum -y install nvidia-x11-drv

For CentOS, you will have to download and install the nVidia driver from the nVidia website.

Clean up packages to save drive space

sudo yum clean all

Re-index the hard drive; this will come in handy later:

sudo updatedb

Enable web and database servers on reboot

sudo /sbin/chkconfig httpd on
sudo /sbin/chkconfig mysqld on

You can further configure this with the GUI and turn off unnecessary items

system-config-services

Reboot the computer

sudo reboot

< Instructions for installing CentOS on your computer | Database Server Installation >



Download additional Software (Fedora Specific)

Install the additional package repositories

Unlike RHEL/CentOS, Fedora comes with an Extras repository by default that contains all of the open source software needed by Appion/Leginon.

That said, there are several additional Fedora repositories that you can install. These repositories provide additional packages that are not allowed in the default Fedora package list, such as patented software (MP3 and movie players) and closed source applications (nVidia video driver, Flash plugin, Adobe Acrobat Reader). Some repositories install packages over other packages, which can cause problems and conflicts (ATrpms is especially bad at this), so avoid those. We recommend installing only RPM Fusion.

RPM Fusion (optional)

Download repository rpms and install

sudo rpm -Uvh http://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-stable.noarch.rpm
sudo rpm -Uvh http://download1.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-stable.noarch.rpm

Update current packages

Update the updater to make life easier

sudo yum -y update yum

Update all packages

sudo yum -y update

You will want to restart your computer when this completes.

sudo reboot

Install Complete list of additional packages:

General instructions for installation and configuration of some of these packages (such as mysql) are found later in this manual. It may be faster to install them now as a group rather than individually, but it is not necessary.

If you are using an RPM based system (e.g., SuSE, Mandriva, CentOS, or Fedora), this website is good for determining the exact package name that you need. For Fedora, just type:

sudo yum -y install \
python-tools python-devel python-matplotlib \
subversion ImageMagick grace gnuplot \
wxPython numpy scipy python-imaging \
gcc-gfortran compat-gcc-34-g77 \
gcc-objc fftw3-devel gsl-devel \
mysql mysql-server MySQL-python \
httpd php php-mysql phpMyAdmin  \
gcc-c++ openmpi-devel libtiff-devel \
php-devel gd-devel re2c fftw3-devel php-gd \
xorg-x11-server-Xvfb netpbm-progs \
xorg-x11-drv-nvidia

If you have an nVidia video card and have set up RPM Fusion, install the nVidia binary driver; it will speed things up, especially for UCSF Chimera. This command works on Fedora:

sudo yum -y install nvidia-x11-drv

For CentOS, you will have to download and install the nVidia driver from the nVidia website.

Clean up packages to save drive space

sudo yum clean all

Re-index the hard drive; this will come in handy later:

sudo updatedb

Enable web and database servers on reboot

sudo /sbin/chkconfig httpd on
sudo /sbin/chkconfig mysqld on

You can further configure this with the GUI and turn off unnecessary items

system-config-services

Reboot the computer

sudo reboot

< Instructions for installing Fedora on your computer | Complete Installation ^



Download Appion and Leginon Files

If you have not already downloaded the Appion and Leginon files, download Myami 2.2 (which contains both Appion and Leginon) using one of the following options:

 
Option 1: SVN Stable Branch

This is a stable supported branch from our code repository.
Change directories to the location that you would like to checkout the files to (such as /usr/local) and then execute the following command:

svn co http://ami.scripps.edu/svn/myami/branches/myami-2.2 myami/

Note: If you are installing this file on a microscope Windows PC, you may use Tortoise SVN to checkout the files.
 

Option 2: Release Version as tar file - NOT AVAILABLE YET FOR 2.2

Option 3: SVN Development version

 
This contains features that may still be under development. It is not supported and may not be stable. Use at your own risk.

svn co http://ami.scripps.edu/svn/myami/trunk myami/

Note: If you are installing this file on a microscope Windows PC, you may use Tortoise SVN to checkout the files.


< Check php information | Install the MRC PHP Extension >



Download Leginon/Appion Files

Download Myami 2.2 (contains Appion and Leginon) using one of the following options:

 
Option 1: SVN Stable Branch

This is a stable supported branch from our code repository.
Change directories to the location that you would like to checkout the files to (such as /usr/local) and then execute the following command:

svn co http://ami.scripps.edu/svn/myami/branches/myami-2.2 myami/

Note: If you are installing this file on a microscope Windows PC, you may use Tortoise SVN to checkout the files.
 

Option 2: Release Version as tar file - NOT AVAILABLE YET FOR 2.2

Option 3: SVN Development version

 
This contains features that may still be under development. It is not supported and may not be stable. Use at your own risk.

svn co http://ami.scripps.edu/svn/myami/trunk myami/

Note: If you are installing this file on a microscope Windows PC, you may use Tortoise SVN to checkout the files.


< Install supporting packages | Perform system check >




Dual Viewer

The Dual Viewer splits the browser window to allow two instances of the Image Viewer to appear side by side. The following example shows images from two different Projects being displayed side by side. For more details see Image Viewer Overview.

Dual Viewer Screen:
Dual Viewer


< 3 Way Viewer | RCT >



Ed's Iteration Alignment

This method uses multiple iterations of the Spider AP SR (reference-free) and Spider AP SH (reference-based) commands to align your particles. It is the set of batch files used for analysis of conformational flexibility of fatty acid synthase during catalysis in Brignole et al., Nature Structural and Molecular Biology 16, 190-197 (2009).

General Workflow:

  1. Check the boxes of the templates to be used as references during alignment.
  2. Click on "use these templates."
  3. Make sure that appropriate run name is specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  4. Enter a description of your run into the description box.
  5. Make sure that appropriate directory tree is specified.
  6. Select the stack to align from the drop down menu. Note that stacks can be identified in this menu by stack name, stack ID, and that the number of particles, pixel and box sizes are listed for each.
  7. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  8. Double check that the templates are the ones you want to use.
  9. Click on "Run Ed-Iter Alignment" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  10. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Alignment" submenu in the appion sidebar.
  11. Once the job is finished, an additional link entitled "1 complete" will appear under the "Run Alignment" tab in the appion sidebar. This opens a summary of all alignments that have been done on this project.
  12. Click on the link next to "reference stack" to open a window that shows the class averages and that contains tools for exploring the result. Such tools include the ability to browse through particles in a given class, create templates for reference based alignment, substack creation, 3D reconstruction, etc.
  13. To perform a feature analysis, click on the grey link entitled "Run Feature Analysis on Align Stack ID xxx".

Notes, Comments, and Suggestions:

  1. This method sometimes produces a halo effect around particle averages. This is particularly noticeable for non-globular particles, due to the nature of the algorithm used to determine the radii of class averages output between alignment iterations. If this becomes an issue, we suggest trying an alternate alignment algorithm.
  2. In the parameters box on the right, under "Particle Params" the last and first ring radii refer to the inner and outermost rings along which alignment parameters will be determined. Good default values for a particle with a box size of 300 x 300 pixels are shown in the overview snapshot above.
  3. In the parameters box on the right, under "Alignment Params" the search range refers to the number of pixels that will be considered from the center of any given starting point during parameter determination. A step size of 1 means that every single ring between the first and last radii will be considered during the search. Good default values for a particle with a box size of 300 x 300 pixels are shown in the overview snapshot above.
  4. Clicking on "Show Composite Page" in the Alignment Stack List page (accessible from the "completed" link under "Run Alignment" in the Appion sidebar) will expand the page to show the relationships between alignment, feature analysis, and clustering runs.

< Run Alignment | Run Feature Analysis >



Edit Project Description

  1. Select the project DB tool from the Appion and Leginon Tools start page at http://YOUR_SERVER/myamiweb.
  2. Select the project you wish to edit by clicking on the project name.
  3. Select the <edit> button as shown below.
     
    Edit Project Button

     
  4. Modify the fields as desired and select the Update button.

< Create New Project | Edit Project Owners >



Edit Project Owners

  1. Select the project DB tool from the Appion and Leginon Tools start page at http://YOUR_SERVER/myamiweb.
  2. Select the project you wish to edit by clicking on the project name.
  3. Under the Info section, locate the Owners: list and select the edit link.
     
    Edit Owners Link

     
  4. Use the drop down list to choose a user to add as an owner
  5. Click the add button
     

< Edit Project Description | Create a Project Processing Database >



EMAN Common Lines

This method relies on the central section theorem, which permits identification of identical intersecting 1D lines for all combinatorial pairs of 2D projections to assign Euler angles needed for 3D reconstruction. This method is only applicable when the specimen does not exhibit preferred orientation.

General Workflow:

Note: EMAN Common Lines can be accessed directly from the Appion sidebar, or by clicking on the "Run Common Lines" button displayed above class averages generated through 2D Alignment and Classification

  1. Select the clustering run to use for class averages from the drop down menu.
  2. Use the radio buttons to select the symmetry appropriate to your particles. Note the links to EMAN manual pages.
  3. Enter the class averages to include or to exclude during reconstruction.
  4. Click "Continue to next step>>".
  5. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  6. Enter a description of your run into the description box.
  7. Check the "commit model to database" box.
  8. Specify a mask to use that is larger than your molecule, and the number of rounds to iterate.
  9. Click on "Create Model" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  10. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run RCT Volume" submenu in the appion sidebar. Once the job is finished, an additional link entitled "1 complete" will appear under the "Run EMAN Common Lines" tab in the appion sidebar. Clicking on this link opens a summary of all EMAN Common Lines runs that have been done on this project.
  11. Click on the EMAN common lines job run name to open a new window containing all relevant alignment and classification information.
  12. Click on the snapshot images to enlarge in a new window.
  13. Access the volume data file at this path
  14. Upload the volume to use as an initial model for refinement

Notes, Comments, and Suggestions:

  1. This method is not good if you have preferred orientation.

< Ab Initio Reconstruction | Refine Reconstruction >



EMAN Refinement

General Workflow:

  1. Select the CTF corrected stack of particles to use for refinement from the dropdown menu.
  2. Use the radio button to select the initial model to use for refinement. To upload an initial model click on the "Upload a new initial model" link or use one of the Import Tools in the Appion Sidebar.
  3. Click "Use this stack and model" to proceed to the next step.
  4. Select the cluster to which you would like to submit this job.
  5. Check the Run Name and Output Directory.
  6. Set cluster parameters as is appropriate.
  7. Double check the number of nodes and processors per node to be allocated for the reconstruction within the cluster, and check the wall and CPU time parameters. The "Recon procs per node" option refers to the number of EMAN reconstruction processors per node, NOT the total number of processors allocated to the job, which can be more.
  8. Set specific EMAN parameters. Use the mouse to point to an individual parameter, and a help-box will float above that option giving the required information for the user to choose the parameters. It is possible to set default parameters for the first iteration by clicking on the "Set Defaults for Iteration 1" box, then copy the iteration by clicking on the "copy" radio button and modifying the parameters for the subsequent iteration as necessary. Within NRAMM we have experimented with several default refinement schemes, and have provided standard parameters for GroEL-type molecules, icosahedral viruses, and mostly asymmetric molecules such as the ribosome. Specifying one of these options under the "Import Parameters" dropdown toolbox will set basic default parameters for similar molecules.
  9. Click "Create Job File". This opens another window containing directions for final submission of the refinement. Below the directions, a copy of the job file is displayed.
  10. Click "Put files in DMF". This opens a pop-up window containing DMF commands.
  11. Copy the dmf commands.
  12. Open a terminal, and paste the dmf commands.
  13. Now click on "Submit job to Cluster." You can setup your Appion account such that you get email updates for refinements.

    If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "# running" option under the "EMAN Refinement" submenu in the appion sidebar. Once the job is finished, an additional link entitled "# complete" will appear under the "EMAN Refinement" submenu in the appion sidebar.
  14. When the job is finished, the files need to be transferred back to the main filesystem in a similar manner as before.
  15. Copy the command and paste into terminal
  16. Once the files are transferred back, the "Upload EMAN results" button takes you to the page that uploads all the reconstruction results to the database
  17. For each refinement, the summary page automatically displays for each iteration information such as the FSC curve, Euler angle distributions, good and bad classes, 3D snapshots of the model, and allows for numerous post-processing procedures

Notes, Comments, and Suggestions:


< Refine Reconstruction | Quality Assessment >



EMDB to Model

The user is able to retrieve EMDB models from the Electron Microscopy Data Bank.

General Workflow:

To launch:

  1. The user enters the EMDB ID
  2. The user decides the resolution of the density model (typically filtered to 15 - 25 Angstrom resolution)
  3. The user selects the symmetry of the model

Output:

  1. Once the model is generated, it will be deposited under "3d Density Volumes"
  2. The user then assesses the quality of the model and decides whether to convert the volume into an initial model

Notes, Comments, and Suggestions:


< PDB to Model | Upload Particles >


Enable User Authentication

To enable or disable user authentication, run the setup wizard at http://YOUR_SERVER/myamiweb/setup.

If Appion/Leginon is configured to enable user authentication, the myamiweb user interface will allow users to:
  1. register a username and password
  2. login only with a valid username and password
  3. reset forgotten passwords
  4. modify their profile
  5. log out
  6. log in as an anonymous guest for viewing public data sets (version 2.1.0)

When a user points a browser to http://YOUR_SERVER/myamiweb, the following login screen is displayed:

Myamiweb Login Screen
Myami Login Screen


New User Registration >



Error Handling

  1. PHP error handling
  2. Python error handling

Example config file

The example config file below may be out of date.

<?php

/**
* The Leginon software is Copyright 2010
* The Scripps Research Institute, La Jolla, CA
* For terms of the license agreement
* see http://ami.scripps.edu/software/leginon-license
*
*/

/**
* Please visit http://yourhost/myamiwebfolder/setup
* to automatically set up this config file for the
* first time.
*/

require_once 'inc/config.inc';
define('WEB_ROOT',dirname(__FILE__));

// --- define myamiweb tools base --- //
define('PROJECT_NAME',"myamiweb");
define('PROJECT_TITLE',"Ambers Trunk Appion and Leginon Tools");

// --- define site base path -- //
// --- This should be changed if the myamiweb directory is located -- //
// --- in a sub-directory of the Apache web directory. -- //
// --- ex. myamiweb is in /var/www/html/applications/myamiweb/ then -- //
// --- change "myamiweb to "applications/myamiweb" -- //
define('BASE_PATH',"~amber/myamiweb");

define('BASE_URL',"/~amber/myamiweb/");
define('PROJECT_URL',"/~amber/myamiweb/project/");

// --- myamiweb login --- //
// Browse to the administration tools in myamiweb prior to
// changing this to true to populate DB tables correctly.
define('ENABLE_LOGIN', true);

// --- Administrator email title and email address -- //
define('EMAIL_TITLE',"asdfasf");
define('ADMIN_EMAIL',"amber@scripps.edu");

// --- When ENABLE_SMTP is set to true, email will be sent out -- //
// --- via ADMIN_EMAIL's SMTP server. --//
define('ENABLE_SMTP', false);
define('SMTP_HOST',"");

// --- Check this with your email administrator -- //
// --- Set it to true if your SMTP server requires authentication -- //
define('SMTP_AUTH', false);

// --- If SMTP_AUTH is not required (SMTP_AUTH set to false), -- //
// --- there is no need to fill in SMTP_USERNAME & SMTP_PASSWORD -- //
define('SMTP_USERNAME',"");
define('SMTP_PASSWORD',"");

// --- Set your MySQL database server parameters -- //
define('DB_HOST',"cronus4.scripps.edu");
define('DB_USER',"ask someone");
define('DB_PASS',"ask someone");
define('DB_LEGINON',"dbemdata");
define('DB_PROJECT',"project");

// --- default URL for project section --- //
define('VIEWER_URL', BASE_URL."3wviewer.php?expId=");
define('SUMMARY_URL', BASE_URL."summary.php?expId=");
define('UPLOAD_URL', BASE_URL."processing/uploadimage.php");

// --- Set cookie session time -- //
define('COOKIE_TIME', 0); // 0 means never expire.

// --- default user group -- //
define('GP_USER', 'users');

// --- XML test dataset -- //
$XML_DATA = "test/viewerdata.xml";

// --- Set Default table definition -- //
define('DEF_PROCESSING_TABLES_FILE', "defaultprocessingtables.xml");
define('DEF_PROCESSING_PREFIX',"ap");

// --- Set External SQL server here (use for import/export application) -- //
// --- You can add as many as you want, just copy and paste the block -- //
// --- to a new one and update the connection parameters -- //
// --- $SQL_HOSTS['example_host_name']['db_host'] = 'example_host_name'; -- //
// --- $SQL_HOSTS['example_host_name']['db_user'] = 'usr_object'; -- //
// --- $SQL_HOSTS['example_host_name']['db_pass'] = ''; -- //
// --- $SQL_HOSTS['example_host_name']['db'] = 'leginondb'; -- //

$SQL_HOSTS[DB_HOST]['db_host'] = DB_HOST;
$SQL_HOSTS[DB_HOST]['db_user'] = DB_USER;
$SQL_HOSTS[DB_HOST]['db_pass'] = DB_PASS;
$SQL_HOSTS[DB_HOST]['db'] = DB_LEGINON;

// --- path to main --- //
set_include_path(dirname(__FILE__).PATH_SEPARATOR
.dirname(__FILE__)."/project".PATH_SEPARATOR
.dirname(__FILE__)."/lib".PATH_SEPARATOR
.dirname(__FILE__)."/lib/PEAR");

// --- add plugins --- //
// --- uncomment to enable processing web pages -- //
addplugin("processing");

define('DEFAULT_APPION_PATH',"/ami/data00/appion/");

// --- Add as many processing hosts as you like -- //
// --- Enter each processing host's information along with the -- //
// --- maximum number of processing nodes available on it -- //
// --- $PROCESSING_HOSTS[] = array('host' => 'host1.school.edu', 'nproc' => 4); -- //
// --- $PROCESSING_HOSTS[] = array('host' => 'host2.school.edu', 'nproc' => 8); -- //

$PROCESSING_HOSTS[] = array('host' => 'guppy.scripps.edu',
                            'nproc' => 8,
                            'nodesdef' => '2',
                            'nodesmax' => '8',
                            'ppndef' => '8',
                            'ppnmax' => '8',
                            'reconpn' => '8',
                            'walltimedef' => '2',
                            'walltimemax' => '2',
                            'cputimedef' => '2',
                            'cputimemax' => '2',
                            'memorymax' => '30',
                            'appionbin' => '/opt/myamisnap/bin/appion/',
                            'baseoutdir' => DEFAULT_APPION_PATH,
                            'localhelperhost' => 'guppy.scripps.edu',
                            'dirsep' => '/' );

$PROCESSING_HOSTS[] = array('host' => 'garibaldi.scripps.edu', 
                            'nproc' => 8,
                            'nodesdef' => '16',
                            'nodesmax' => '280',
                            'ppndef' => '4',
                            'ppnmax' => '8',
                            'reconpn' => '4',
                            'walltimedef' => '240',
                            'walltimemax' => '240',
                            'cputimedef' => '240',
                            'cputimemax' => '240',
                            'memorymax' => '30',
                            'appionbin' => '~bcarr/appionbin/',
                            'baseoutdir' => '', // sends appion processing output to a location under the user's home directory on the remote host
                            'localhelperhost' => 'amibox03.scripps.edu',
                            'dirsep' => '/' );

// --- register your cluster configuration files below, e.g. (default_cluster) --- //
// --- $CLUSTER_CONFIGS[] = 'cluster1'; -- //
// --- $CLUSTER_CONFIGS[] = 'cluster2'; -- //

//$CLUSTER_CONFIGS[] = 'guppy_cluster';
//$CLUSTER_CONFIGS[] = 'garibaldi';
//$CLUSTER_CONFIGS[] = 'test1_cluster';
//$CLUSTER_CONFIGS[] = 'test2_cluster';

// --- Microscope spherical aberration constant --- //
// --- Example : 2.0 --- //
define('DEFAULTCS',"2.0");

// --- Restrict the file server if you want --- //
// --- Add your allowed processing directories as strings in the array --- //
$DATA_DIRS = array();

// --- Enable Image Cache --- //
define('ENABLE_CACHE', false);
// --- caching location --- //
// --- please make sure the apache user has write access to this folder --- //
// --- define('CACHE_PATH', "/srv/www/cache/"); --- //
define('CACHE_PATH',"");
define('CACHE_SCRIPT', WEB_ROOT.'/makejpg.php');

// --- define Flash player base url --- //
define('FLASHPLAYER_URL', "/flashplayer/");

// --- define python commands - path --- //

// to download images as TIFF or JPEG
// $pythonpath="/your/site-packages";
// putenv("PYTHONPATH=$pythonpath");

// To use mrc2any, you need to install the pyami package which is part
// of myami. See installation documentation for help.
// --- define('MRC2ANY', "/usr/bin/mrc2any" --- //
define('MRC2ANY',"/usr/bin/mrc2any");

// --- Check if IMAGIC is installed and running, otherwise hide all functions --- //
define('HIDE_IMAGIC', false);

// --- Check if MATLAB is installed and running, otherwise hide all functions --- //
define('HIDE_MATLAB', false);

// --- hide processing tools still under development. --- //
define('HIDE_FEATURE', false);

// --- temporary images upload directory --- //
define('TEMP_IMAGES_DIR',"/tmp");

// --- use appion wrapper -- //
define('USE_APPION_WRAPPER', true);
// --- define('APPION_WRAPPER_PATH', ""); --- //
define('APPION_WRAPPER_PATH',"/opt/myamisnap/bin/appion");

// --- sample tracking ---//
define('SAMPLE_TRACK', false);

// --- exclude projects in statistics. give a string with numbers separated by ',' ---//
// --- for example, "1,2" ---//
define('EXCLUDED_PROJECTS',"");

// --- hide processing tools still under development. --- //
define('HIDE_TEST_TOOLS', false);

$TEST_SESSIONS = array(
        'zz07jul25b'
        ,'zz06apr27c'
        ,'zz09feb12b'
        ,'zz09apr14b'
        ,'zz09feb18c'
    );

?>

Explanation of Sample Names

Your MySQL database username, your Leginon/Appion username, and your linux login username are all different. Each serves its own purpose.

Database names, user, and password need to be entered during web server and sinedon.cfg setup.
In our example, we have:

purpose                                            example name
database name for leginon parameters and metadata  leginondb
database name for project management               projectdb
database name prefix for appion processing         ap
database user name                                 usr_object
database user password                             (not set)

The username for Leginon image viewing and Appion processing/reporting from the web is the username registered for running Leginon. The full name entered in leginon.cfg is the first name plus the last name used in the registration.


Relevant Topics:
Database Server Installation
Configure sinedon.cfg
Web Server Installation



File Server Setup Considerations

A specific file tree structure is assumed by default in Appion/Leginon. Until the v2.2 release, this cannot be altered. Here is a description of it:

  1. The Leginon default image path is defined in leginon.cfg. The acceptable form is /your_file_server_mount_point/whatever/leginon. Images are saved by sessionname under a subdirectory of this directory, in a form such as
    /your_file_server_mount_point/whatever/leginon/sessionname/rawdata/.
  2. Appion processing defaults to an appion directory parallel to the leginon directory, i.e., /your_file_server_mount_point/whatever/appion. This relationship is hard-coded until the v2.2 release. Processing results are divided by sessionname, then processing type, then the runname of the process. Therefore, your processing results are saved in the following default appion path: /your_file_server_mount_point/whatever/appion/sessionname/processing_type/runname/

The following permission rules are required for multi-unix-user usage of Leginon/Appion:

For Leginon:

  1. The web server apache user needs read permission to all images in the rawdata directories in order to display them in the viewer.
  2. All users need permission to create directories in /your_file_server_mount_point/whatever/leginon.
  3. All users need read permission in any leginon rawdata directory that contains reference images.
  4. Users need write permission to the sessions they created, of course.

For Appion:

  1. Images uploaded to Appion are treated as a Leginon session. See above for its file tree structure.
  2. The web server apache user needs read permission to the processing directories to display the results.
  3. All users need permission to create directories in /your_file_server_mount_point/whatever/appion.
  4. If more than one user will be processing data in the same session, they all need write permission to the session directory /your_file_server_mount_point/whatever/appion/sessionname/.
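The permission rules above can be spot-checked from Python before starting a run. This is only an illustrative sketch using os.access, not an official Appion utility:

```python
import os

def check_session_dir(session_dir):
    """Spot-check the permissions described above: a session directory
    should be readable (so the web server can display results) and
    writable (for every user processing in that session)."""
    return {
        'readable': os.access(session_dir, os.R_OK),
        'writable': os.access(session_dir, os.W_OK),
        'traversable': os.access(session_dir, os.X_OK),
    }

# example: check the current directory
print(check_session_dir('.'))
```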


< Database Server Installation | Processing Server Installation >




Filter by MeanStdev

Filter particle stack by mean and stdev values.

General Workflow:

  1. Check run name, output directory location, and write a description for this filtering run.
  2. Input min and max values for the x-axis (mean particle intensity) and the y-axis (stdev).
  3. Click "Test selected points" to refresh the graph with a trapezoid that defines the particles to keep.
  4. Check the resulting trapezoid and repeat steps 2-3 until satisfied.
  5. Click "Create SubStack" to submit to the cluster. Alternatively, click "Just Show Command" to obtain a command that can be copied and pasted into a unix shell.
  6. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Stacks" submenu in the Appion Sidebar. When the job is finished, the tally for "X Complete" under the "Stacks" submenu in the Appion Sidebar will increase by one; the filtered stack can be accessed on the summary page by clicking on the "X complete" link.
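Conceptually, steps 2-4 select particles whose (mean, stdev) statistics fall inside chosen bounds. A simplified Python sketch (rectangular bounds instead of the tool's trapezoid; the function name is hypothetical):

```python
def filter_by_mean_stdev(particles, mean_min, mean_max, std_min, std_max):
    """Keep indices of particles whose (mean, stdev) fall inside the
    given rectangular bounds. Simplified sketch; the Appion tool cuts
    with a trapezoid rather than a rectangle."""
    keep = []
    for i, (mean, std) in enumerate(particles):
        if mean_min <= mean <= mean_max and std_min <= std <= std_max:
            keep.append(i)
    return keep

# hypothetical (mean, stdev) pairs for four particles
stats = [(0.1, 1.2), (0.5, 0.9), (2.0, 3.5), (0.4, 1.1)]
print(filter_by_mean_stdev(stats, 0.0, 1.0, 0.5, 1.5))
# [0, 1, 3]
```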

Notes, Comments, and Suggestions:

<View Stacks | Center Particles >


Frealign Refinement

General Workflow:

  1. Decide which EMAN reconstruction you want to use to import initial orientations. Note the ID number and stack info.
  2. Go to "Stack Creation" and follow the general workflow to create a new stack (http://ami.scripps.edu/redmine/projects/appion/wiki/Stack_creation), with the following exceptions:

Once the stack is created, go back to "Frealign Refinement":
  1. Select the non-CTF corrected, density inverted stack from the dropdown menu.
  2. Select the non-CTF corrected stack of particles to use for the final reconstruction from the dropdown menu.
  3. Use the radio button to select the initial model to use for refinement. To upload an initial model, click on the "Upload a new initial model" link or use one of the Import Tools in the Appion Sidebar. The initial model does NOT need to have the density inverted.
  4. Click "Use this stack and model" to proceed to the next step.
  5. Check the Run Name and Output Directory.
  6. Use the radio button to choose initial orientation parameters. These can be imported from a previously run EMAN reconstruction, or determined by Frealign.
  7. Set cluster parameters as is appropriate.
  8. Set specific Frealign parameters. Use the mouse to point to an individual parameter, and a help-box will float above that option giving the required information for the user to choose the parameters.
  9. Click "Prepare Frealign" to submit to the cluster. Alternatively, click "Just Show Command" in order to get a command that can be pasted into a unix shell. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Frealign Refinement" submenu in the appion sidebar.
  10. Once the job is finished, an additional link entitled "1 prepared" will appear under the "Frealign Refinement" tab in the appion sidebar. Click on this link.
  11. Use the radio button to choose the prepared frealign job you want to launch.
  12. Click "Use this prepared job" to proceed to the next step.
  13. Select the cluster to which you would like to submit this job.
  14. Check wall and CPU time parameters.
  15. Click "Show Job File". This opens another window containing directions for final submission of the refinement. Below the directions, a copy of the job file is displayed.
  16. Click "Put files in DMF". This opens a pop-up window containing DMF commands.
  17. Copy the dmf commands.
  18. Open a terminal, and paste the dmf commands.
  19. Now click on "Submit job to Cluster." You can set up your Appion account so that you get email updates for refinements.
  20. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Frealign Refinement" submenu in the appion sidebar. Once the job is finished, an additional link entitled "1 complete" will appear under the "Frealign Refinement" submenu in the appion sidebar.

Notes, Comments, and Suggestions:


< Refine Reconstruction|Quality Assessment >



Compiling Fast free-hand

Attached to this posting there are five scripts:

Compiling instructions

1. Copy the attached scripts to your working directory.

2. Open and edit lines 47 - 49 of f90c_ubuntu64simple.csh to include the path to the following MRC library commands:

47: /usr/local/image2010/lib/imlib2010.a
48: /usr/local/image2010/lib/misclib.a
49: /usr/local/image2010/lib/genlib.a

3. Compile:

./f90c_ubuntu64simple.csh fastfreehand_v1_01.f90
./f90c_ubuntu64simple.csh totsumstack.f90

These will compile into fastfreehand_v1_01.exe and totsumstack.exe, respectively.


General Best Practices

Any Language

Web performance best practices

Database

REST


Goniometer

Goniometer settings are for use with the Leginon image acquisition software.

If you are not using Leginon, you may ignore the Goniometer settings. If you are using Leginon, please refer to the Leginon user manuals section on Goniometer.


< Applications



Grid Management

The grid management tools handle registration, modification, and deletion of grid boxes and grids. A grid registered in the database can easily be associated with images. Currently, the management tool is primarily used for grids handled by grid insertion/extraction robots coupled with Leginon's "Robot-MSI screen" applications. Apart from that, only the "Manual" application has an interface for selecting grids from the database. Appion-only users can ignore this management tool.

This tool is located at http://your_myamiweb/project/gridtray.php

Grid registration starts with registering a new grid box.

1. Grid Boxes

The grids are located in a grid box.
  1. Click on the <new> tab under the heading "Grid Box".
  2. Type a label unique to the database.
    There are three types of grid boxes defined: 4-slot cryo grid box, typical 50-slot grid box, and 96-well robot grid tray.
  3. Select the type of grid box used.
  4. Click on "add" to register this grid box in the database.

2. Grids

  1. Click on the <new> tab under the heading "Grid" to register a new grid.

After the grid is added, it can be assigned to a location on an existing grid box by choosing the box and clicking on a location not yet occupied by a grid.

3. Upload Grids

This allows the user to register multiple grids on the same grid box. This is most useful for a robot grid tray. The above rules for the grid fields still apply to the uploaded grids. In addition, only grids from the same project can be uploaded in one grid file.
  1. Click on upload grid/tray.
  2. Create the grid file on your local computer. The fields should be separated by tabs.
  3. Select the project with which the grids in the file are associated.
  4. Type the grid box label into which the grids are inserted. If the grid box does not exist, a new robot tray is created.
  5. Browse to choose the grid file and upload it.
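The tab-separated grid file in step 2 can be written with Python's csv module. The column values below are purely illustrative; check the upload page for the exact fields it expects:

```python
import csv

# hypothetical grid records: (grid label, box location, note)
grids = [
    ('grid-A1', '1', 'holey carbon'),
    ('grid-A2', '2', 'continuous carbon'),
]

# write one tab-separated row per grid
with open('grids.txt', 'w', newline='') as f:
    writer = csv.writer(f, delimiter='\t')
    writer.writerows(grids)
```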

< View a Summary of a Project Session



Groups

Groups are used to associate Users with common privileges.
Several default groups are included with your installation and correspond to the available privilege levels.

Default Groups and Privileges:

Group name      Description                                                                                          Privilege
administrators  may view and modify all groups, users, projects and experiments in the system                        All at administration level
power users     may view and modify anything that is not specifically owned by the default Administrator User        View all but administrate owned
users           may view and modify projects that they own and view experiments that have been shared with the user  Administrate/view only owned projects and view shared experiments
guests          may view projects owned by the user and experiments shared with the user                             View owned projects and shared experiments

View all Groups

Groups may be viewed and managed within the Administration tool by clicking on the Groups Icon:

A list of the available groups is displayed.
Click on a group in the list to show the group information.

Add a new group

  1. Type a new Group Name in the Name field of the GroupData table
  2. Provide a description (optional)
  3. Select a privilege level
  4. Press the Save button

Modify a group

  1. Select the Group you wish to modify from the Group list
  2. Change the desired fields
  3. Press the Save button.

Remove a group

Note: This feature is currently disabled.


Users >



Helical Processing

Helical or tubular crystals, which often occur upon high-density reconstitution of membrane proteins into lipid membranes, offer some unique advantages over 2D crystals. A single image of a helical tube provides all of the information required for calculating a 3D map, and the inclusion of many tubes can be combined to improve the resolution without the need to tilt the sample in the microscope. Helical processing can be performed in real space or in Fourier space. Appion encourages the use of independent methods as every dataset is different and therefore responds differently to various protocols. In addition, the use of multiple packages can be a tool to improve reliability of reconstructions, as each method should converge on a similar result.

Helical Processing Options Available in Appion:

  1. Phoelix
  2. Refinement Post-processing Procedures

Appion Sidebar Snapshot:

Notes, Comments, and Suggestions:


< Stacks|Quality Assessment >



Helical Image Processing (PHOELIX)

General Workflow:

  1. Make sure that the appropriate run name and directory tree are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Enter the directory tree where the mandatory files are located. These files include llbo.sa, strong.sa, range.sa, cutfit1.dek, cutfit2.dek, cutfit3.dek, chop1.dek, chop2.dek, and template. If you do not have these files already, complete the steps for Running preHIP and follow the guided protocol to generate them.
  3. Enter a description of your run into the description box.
  4. Select the stack to process from the drop down menu. Note that stacks can be identified in this menu by stack name, stack ID, and that the number of particles, pixel and box sizes are listed for each.
  5. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked. This box is automatically deactivated when running preHIP).
  6. Enter the Filament Parameters or verify the default values.
  7. Enter the Processing Parameters or verify the default values.
  8. Click on "Run HIP" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.

Running PreHIP:

NOTE: Steps requiring user feedback/interaction are in BLUE, steps detailing what the program is doing are in GREEN, warning messages are in RED.

After completing steps 1-7 in the General Workflow complete these additional steps:
  1. Click the "Run PreHIP" checkbox which will activate the indexing parameters below it and display a warning message.
  2. Fill in the Indexing Parameters.
  3. Select either the "Use rise and twist" or "Use layer line/bessel order" radio button which will activate the parameters below it. You can only choose one of these options, therefore the input parameter boxes under the non-selected option become deactivated.
  4. Click "Just Show Command" to generate the UNIX shell command. Do NOT click "Run HIP"; it will not run, because the interactive GUI must be executed from a command line.
  5. Note that the "Commit to Database" box is deactivated. preHIP is used only for generating the mandatory files and therefore the preliminary results are not stored to the database.
  6. Copy and paste the command into a UNIX shell.
  7. Review the llbo.sa file when it is displayed and either approve or reject the LLBO assignments. If you reject it, you have the option of either restarting preHIP with different rise/twist/helical repeat values or supplying your own llbo.sa (if you choose this option make sure it is in the proper format! You can use the failed llbo.sa as a reference.)
  8. Scroll through the files using the Tkir viewer to see if the filaments were centered properly (files with extension .s are raw files and .sbut are filtered). When finished, close both Tkir windows. Enter either "yes" or "no" after the prompt in the terminal. If you enter "no", you will be prompted to enter a new filter value and the filaments will be recentered with this value. Continue this process until you find a filter value that works.
  9. All files are diffracted and summed together to find the strongest layer lines. Review the strong layer lines when they are displayed and either approve or reject them. If they are not correct you have the option to change, remove, or add lines. Follow the instructions as prompted until you are satisfied.
  10. The strong layer lines from step 9 are used to select ranges for out of plane tilt and shift correction. Review/adjust the ranges when they are displayed in the Tkll viewer. Right click to remove points, left click to add a point. Each layer line can only have 2 points and they must be ordered 0-1. If you modify the file you must save it (File > Save Data > Ok), do not change the filename. Close the Tkll window. Be sure to evaluate each layer line file and choose the best one to use as a template in step 11.
  11. Select the layer line file from the displayed list that you would like to use as a template for round 1 of averaging. Copy and paste the exact filename. Each subsequent round of averaging uses the previous round's average as the template. If you would like to use a different file as a template, make sure it has the same llbo combo as the current session, then enter the filename including the full directory path.
  12. Select a few layer lines for the cutfit1.dek file, which is used during averaging for fitting and scaling to the template. Right click to remove points, left click to add a point. Each layer line can only have 2 points and they must be ordered 0-1. Save the file when you are done, (File > Save Data > Ok), do not change the filename. Close the Tkll window. There are 3 iterations of fitting in each round of averaging. Repeat this process for cutfit2 and cutfit3, selecting a few more low-moderate resolution layer lines for each.
  13. Review/adjust chop1.dek and chop2.dek making sure the ranges are over the significant portion of each layer line, which is the area between the inner and outer radii of the helix. Sniffing searches a region around each predicted layer line location to find and extract the layer line with the lowest phase residual over the ranges specified here. Right click to remove points, left click to add a point. Each layer line can only have 2 points and they must be ordered 0-1. Every layer line needs a range, except for LL 0. Save the file when you are done, (File > Save Data > Ok), do not change the filename. Close the Tkll window.
  14. When preHIP finishes it will delete all generated files except for the indexing files required for running HIP. The final average will also be saved as "template" if you want to use it as the template for subsequent runs. It will display the directory containing these mandatory files and the optimized filter value.



Hierarchical or K-means Clustering

This method clusters particles using k-means or hierarchical ascendancy according to metrics obtained via feature analysis procedures such as correspondence analysis.
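As a toy illustration of the k-means idea (clustering scalar feature values; this is not the Appion implementation, which operates on factors from feature analysis):

```python
def kmeans_1d(values, k, iters=20):
    """Tiny k-means on scalar features: assign each value to its nearest
    centroid, recompute each centroid as its cluster mean, and repeat."""
    # seed centroids with evenly spaced sorted values
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for v in values:
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# hypothetical per-particle feature values with two obvious groups
feats = [0.1, 0.2, 0.15, 5.0, 5.2, 4.9]
cents, groups = kmeans_1d(feats, 2)
print(sorted(round(c, 2) for c in cents))
# [0.15, 5.03]
```

Hierarchical clustering instead builds a tree of merges (the dendrogram shown in step 2) and cuts it at the requested number of classes.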

General Workflow:

Note: "Run Particle Clustering" can be accessed directly from a feature analysis run, or via the "Run Particle Clustering" link in the Appion sidebar menu. In the latter case, you will be taken to the list of feature analyses that have been completed, where you can select the feature analysis run which you wish to cluster.

  1. Check that the appropriate alignment and feature analysis runs have been chosen.
  2. Click on the dendrogram to enlarge.
  3. Select the boxes beneath eigen images that you wish to use for clustering.
  4. Write in a list of numbers of classes to calculate. The default is to calculate 4, 16, and 64 classes.
  5. Use the radio button to select Hierarchical or K-means clustering.
  6. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  7. Click on "Run Cluster Coran" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  8. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Particle Clustering" submenu in the appion sidebar.
  9. Once the job is finished, an additional link entitled "1 complete" will appear under the "Run Particle Clustering" tab in the appion sidebar. Clicking on this link opens a summary of all clustering runs that have been done on this project.
  10. Click on the link corresponding to the number of class averages calculated to view, and/or click on the [variance] link to view 2D variance maps. A new tab will open with additional processing tools.
  11. In the further processing window, use the boxes and pull down menus to set the range, binning, quality, and info of images to display, and click "load" to affect setting changes.
  12. Change the mouse selection mode from exclude (red) to include (green), depending on your needs. Use your mouse to select images to include or exclude. Note that a list of included and excluded images is automatically generated.
  13. Select from the options to perform on selected/excluded images. A new tab will open for processing, and the current window will remain open so that you can come back and perform multiple operations.

Notes, Comments, and Suggestions:

  1. The interactive mode of our web pages does not work with the Safari web browser. Firefox works well.
  2. Clicking on "Show Composite Page" at the top of the Cluster Stack List page will expand the page to show the relationships between alignment runs and feature analysis runs.
  3. Clicking on "Show Composite Page" in the Cluster Stack List page (accessible from the "completed" link under "Run Particle Clustering" in the Appion sidebar) will expand the page to show the relationships between alignment, feature analysis, and clustering runs.

<Run Particle Clustering | Ab Initio Reconstruction >



Hole Template Viewer

The Hole Template Viewer tool allows Leginon users to view the templates used to find grid holes.


< Tomography Tool



How to add a job launch page to the Appion pipeline

  1. Add a php page with the basic appion template
     
    Most of our launch pages are PHP files with at least two functions: one to create a form for the user to fill out, and another to build a job command when the user submits the form. You can copy an existing PHP file such as runSimple.php to create your new launch page. To give your page the Appion processing page look and feel, with the header and side menu, be sure the functions processing_header($title,$heading,$javascript) and processing_footer() are called.
     
    A starting template with documentation is available here to get you started:
    cp -v runAppionScript.php.template runMyProgram.php
    
     
  2. Add a link to your page in the menuprocessing.php file or from another page
     
    The menuprocessing file is a bit tricky to work with.
     
  3. Create a new form class for your package specific parameters
     
    You can copy simpleParamsForm.inc as a template for your own form parameters. There are 2 primary functions to define.
    1. Define the constructor
      This is where all your parameters are listed. Values passed into the constructor become default values. Validations can be added to any of the parameters.
    2. Define the generateForm() function
      This function outputs html. There are many predefined parameter fields that can be used to build your form.
       
  4. Add pop-up help messages to help.js
     
    Located at myami/myamiweb/processing/js/help.js.
    1. Add a new namespace for your form; you can copy the 'simple' section. Don't forget any commas.
    2. Add a help string for each of the parameter keys in your form.
    3. Make sure $javascript .= writeJavaPopupFunctions(); is in the createForm() function of your php launch page, prior to the processing header function.
       
  5. Add a publication reference for the package you are using
     
    1. Edit /myami/myamiweb/processing/inc/publicationList.inc to include an entry for any references you need to add to your launch page.
    2. Publications can be added to a page with the following code:
          $pub = new Publication('appion'); 
          echo $pub->getHtmlTable(); //returns the html reference to the "appion" publication
      

       
  6. Use your new form class in your launch page
     
    1. Modify the launch page createForm() function
      The createForm() function outputs the html needed for your form. The myami/myamiweb/processing/inc/forms directory holds reusable form classes based on the basicForm.inc class. Any combination of these can be used to add parameters to your form with little knowledge of html. You may also create a new form class to define the parameters specific to your job command.
      1. Add a call to your form class generateForm() function, adding default values in the constructor.
            $simpleParamsForm = new SimpleParamsForm('','','','CHECKED','','','10','30','','20','100','2','2','10','','0.8','40','3','3');
            echo $simpleParamsForm->generateForm();
        
      2. Add a reference to your publication
            $pub = new Publication('appion');
            echo $pub->getHtmlTable();
        
    2. Modify the launch page createCommand() function
      1. instantiate and validate your form parameters
            $simpleParamsForm = new SimpleParamsForm();
            $errorMsg .= $simpleParamsForm->validate( $_POST );
        
      2. Create your new command
            /* *******************
             PART 2: Create program command
             ******************** */
            $command = "runSimpleCluster.py ";
        
            // add run parameters
            $command .= $runParametersForm->buildCommand( $_POST );
        
            // add simple parameters
            $command .= $simpleParamsForm->buildCommand( $_POST );
        
      3. Add a reference to your publication in the header info
      4. Change the jobtype passed in showOrSubmitCommand()

How to add a new refinement method

database architecture for refinement methods

The current database scheme for every refinement method (both single-model and multi-model) is shown below:

database architecture for refinements

For reference, below is a diagram of the modifications to the refinement pipeline that have been performed for the refactoring. Color coding is as follows:

changes to the database architecture for refinements

How to add a new refinement

  1. Determine the name of the new table in the database. In most cases this will simply be called "ApYourPackageRefineIterData". Unless there are per-particle parameters that you would like to save, this single table should contain all of your package-specific parameters.
  2. Write a refinement preparation script in python (see example below).

What's being done in the background

the ReconUploader base class takes care of many different functions, specifically:

Write refinement preparation script in python

Write refinement job script in python

Add job type to Agent.

After you have added the new refinement method's job class, it needs to be added to the job-running agent by editing the file apAgent.py in appionlib.

  1. Add the name of the module you created to the import statements at the top of the file.
  2. In the method createJobInst, add the new refinement job type to the conditional statements.
      Ex.
      elif "newJobType" == jobType:
                jobInstance = newModuleName.NewRefinementClass(command)
    
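The two steps above can be sketched in a self-contained way as follows. This is only an illustration of the dispatch pattern used by createJobInst(); in the real apAgent.py the job class is imported from your new module in appionlib, and "newJobType" / NewRefinementClass are placeholders for your own names.

```python
# Self-contained sketch of the createJobInst() dispatch pattern.
# In apAgent.py the class would be imported from your module in
# appionlib; here a stub class stands in so the sketch is runnable.
class NewRefinementClass(object):
    """Placeholder for your refinement job class."""
    def __init__(self, command):
        self.command = command

def createJobInst(jobType, command):
    # each known job type maps to a job class; add one elif per method
    if "newJobType" == jobType:
        return NewRefinementClass(command)
    else:
        raise ValueError("unknown job type: %s" % jobType)

job = createJobInst("newJobType", "runNewRefine.py --iter=5")
print(job.command)
```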

Upload refinement script in python

The script should be titled 'uploadYourPackageRefine.py'

This script performs all of the basic operations that are needed to upload a refinement to the database, such that it can be displayed in AppionWeb. The bulk of the job is performed by the ReconUploader.py base class, which is inherited by each new uploadYourPackageRefine.py subclass script. This means that the developer's job is simply to make sure that all of the particle / package parameters are passed in a specific format. Effectively, the only things that need to be written into this script are:

  1. define the basic operations that will be performed: this will set up basic package parameters and call on converter functions. The simplest case is the external refinement package uploader, in which case only the general refinement parameters are uploaded to the database:
def __init__(self):
    ###    DEFINE THE NAME OF THE PACKAGE
    self.package = "external_package" 
    super(uploadExternalPackageScript, self).__init__()

#=====================
def start(self):

    ### determine which iterations to upload; last iter is defaulted to infinity
    uploadIterations = self.verifyUploadIterations()                

    ### upload each iteration
    for iteration in uploadIterations:
        for j in range(self.runparams['numberOfReferences']):

            ### general error checking, these are the minimum files that are needed
            vol = os.path.join(self.resultspath, "recon_%s_it%.3d_vol%.3d.mrc" % (self.params['timestamp'], iteration, j+1))
            particledatafile = os.path.join(self.resultspath, "particle_data_%s_it%.3d_vol%.3d.txt" % (self.params['timestamp'], iteration, j+1))
            if not os.path.isfile(vol):
                apDisplay.printError("you must have an mrc volume file in the 'external_package_results' directory")
            if not os.path.isfile(particledatafile):
                apDisplay.printError("you must have a particle data file in the 'external_package_results' directory")                                        

            ### make chimera snapshot of volume
            self.createChimeraVolumeSnapshot(vol, iteration, j+1)

            ### instantiate database objects
            self.insertRefinementRunData(iteration, j+1)
            self.insertRefinementIterationData(iteration, j+1)

    ### calculate Euler jumps
    self.calculateEulerJumpsAndGoodBadParticles(uploadIterations)
In the single-model refinement case (example Xmipp projection-matching):
def __init__(self):
    ###    DEFINE THE NAME OF THE PACKAGE
    self.package = "Xmipp" 
    self.multiModelRefinementRun = False
    super(uploadXmippProjectionMatchingRefinementScript, self).__init__()

def start(self):

    ### database entry parameters
    package_table = 'ApXmippRefineIterData|xmippParams'

    ### set projection-matching path
    self.projmatchpath = os.path.abspath(os.path.join(self.params['rundir'], self.runparams['package_params']['WorkingDir']))

    ### check for variable root directories between file systems
    apXmipp.checkSelOrDocFileRootDirectoryInDirectoryTree(self.params['rundir'], self.runparams['cluster_root_path'], self.runparams['upload_root_path'])

    ### determine which iterations to upload
    lastiter = self.findLastCompletedIteration()
    uploadIterations = self.verifyUploadIterations(lastiter)    

    ### upload each iteration
    for iteration in uploadIterations:

        apDisplay.printColor("uploading iteration %d" % iteration, "cyan")

        ### set package parameters, as they will appear in database entries
        package_database_object = self.instantiateProjMatchParamsData(iteration)

        ### move FSC file to results directory
        oldfscfile = os.path.join(self.projmatchpath, "Iter_%d" % iteration, "Iter_%d_resolution.fsc" % iteration)
        newfscfile = os.path.join(self.resultspath, "recon_%s_it%.3d_vol001.fsc" % (self.params['timestamp'],iteration))
        if os.path.exists(oldfscfile):
            shutil.copyfile(oldfscfile, newfscfile)

        ### create a stack of class averages and reprojections (optional)
        self.compute_stack_of_class_averages_and_reprojections(iteration)

        ### create a text file with particle information
        self.createParticleDataFile(iteration)

        ### create mrc file of map for iteration and reference number
        oldvol = os.path.join(self.projmatchpath, "Iter_%d" % iteration, "Iter_%d_reconstruction.vol" % iteration)
        newvol = os.path.join(self.resultspath, "recon_%s_it%.3d_vol001.mrc" % (self.params['timestamp'], iteration))
        mrccmd = "proc3d %s %s apix=%.3f" % (oldvol, newvol, self.runparams['apix'])
        apParam.runCmd(mrccmd, "EMAN")

        ### make chimera snapshot of volume
        self.createChimeraVolumeSnapshot(newvol, iteration)

        ### instantiate database objects
        self.insertRefinementRunData(iteration)
        self.insertRefinementIterationData(package_table, package_database_object, iteration)

    ### calculate Euler jumps
    self.calculateEulerJumpsAndGoodBadParticles(uploadIterations)    

    ### query the database for the completed refinements BEFORE deleting any files ... returns a dictionary of lists
    ### e.g. {1: [5, 4, 3, 2, 1]} means 5 iters completed for refine 1
    complete_refinements = self.verifyNumberOfCompletedRefinements(multiModelRefinementRun=False)
    if self.params['cleanup_files'] is True:
        self.cleanupFiles(complete_refinements)

in the multi-model refinement case (example Xmipp ML3D):
def __init__(self):
    ###    DEFINE THE NAME OF THE PACKAGE
    self.package = "XmippML3D" 
    self.multiModelRefinementRun = True
    super(uploadXmippML3DScript, self).__init__()

def start(self):

    ### database entry parameters
    package_table = 'ApXmippML3DRefineIterData|xmippML3DParams'

    ### set ml3d path
    self.ml3dpath = os.path.abspath(os.path.join(self.params['rundir'], self.runparams['package_params']['WorkingDir'], "RunML3D"))

    ### check for variable root directories between file systems
    apXmipp.checkSelOrDocFileRootDirectoryInDirectoryTree(self.params['rundir'], self.runparams['cluster_root_path'], self.runparams['upload_root_path'])

    ### determine which iterations to upload
    lastiter = self.findLastCompletedIteration()
    uploadIterations = self.verifyUploadIterations(lastiter)                

    ### create ml3d_lib.doc file; somewhat of a workaround, but necessary to make projections
    total_num_2d_classes = self.createModifiedLibFile()

    ### upload each iteration
    for iteration in uploadIterations:

        ### set package parameters, as they will appear in database entries
        package_database_object = self.instantiateML3DParamsData(iteration)

        for j in range(self.runparams['package_params']['NumberOfReferences']):

            ### calculate FSC for each iteration using split selfile (selfile requires root directory change)
            self.calculateFSCforIteration(iteration, j+1)

            ### create a stack of class averages and reprojections (optional)
            self.compute_stack_of_class_averages_and_reprojections(iteration, j+1)

            ### create a text file with particle information
            self.createParticleDataFile(iteration, j+1, total_num_2d_classes)

            ### create mrc file of map for iteration and reference number
            oldvol = os.path.join(self.ml3dpath, "ml3d_it%.6d_vol%.6d.vol" % (iteration, j+1))
            newvol = os.path.join(self.resultspath, "recon_%s_it%.3d_vol%.3d.mrc" % (self.params['timestamp'], iteration, j+1))
            mrccmd = "proc3d %s %s apix=%.3f" % (oldvol, newvol, self.runparams['apix'])
            apParam.runCmd(mrccmd, "EMAN")

            ### make chimera snapshot of volume
            self.createChimeraVolumeSnapshot(newvol, iteration, j+1)

            ### instantiate database objects
            self.insertRefinementRunData(iteration, j+1)
            self.insertRefinementIterationData(package_table, package_database_object, iteration, j+1)

    ### calculate Euler jumps
    self.calculateEulerJumpsAndGoodBadParticles(uploadIterations)            

    ### query the database for the completed refinements BEFORE deleting any files ... returns a dictionary of lists
    ### e.g. {1: [5, 4, 3, 2, 1], 2: [6, 5, 4, 3, 2, 1]} means 5 iters completed for refine 1 & 6 iters completed for refine 2
    complete_refinements = self.verifyNumberOfCompletedRefinements(multiModelRefinementRun=True)
    if self.params['cleanup_files'] is True:
        self.cleanupFiles(complete_refinements)
  2. Write python functions that will convert parameters. Examples of these converters can be found in the python scripts below:

http://ami.scripps.edu/svn/myami/trunk/appion/bin/uploadXmippRefine.py (simplest)
http://ami.scripps.edu/svn/myami/trunk/appion/bin/uploadXmippML3DRefine.py (simple multi-model refinement case)
http://ami.scripps.edu/svn/myami/trunk/appion/bin/uploadEMANRefine.py (complicated, due to additional features / add-ons)
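While the exact converter functions depend on the package, the general shape is a mapping from package-specific parameter names onto the generic Appion refinement parameters. A minimal illustrative sketch follows; all key and function names here are hypothetical, not the actual ReconUploader API.

```python
# Hypothetical converter sketch: map package-specific keys onto generic
# Appion refinement keys. All key names are illustrative only.
def convertPackageParams(packageparams):
    mapping = {
        'ang_step': 'angularSamplingRate',
        'first_ring': 'innerAlignRadius',
        'last_ring': 'outerAlignRadius',
    }
    appionparams = {}
    for packagekey, appionkey in mapping.items():
        # only carry over parameters the package actually defined
        if packagekey in packageparams:
            appionparams[appionkey] = packageparams[packagekey]
    return appionparams

print(convertPackageParams({'ang_step': 5, 'last_ring': 64}))
```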

Below is a list of necessary functions; everything else is optional:

Appion parameter format

In order to utilize the base class ReconUploader.py to upload all parameters associated with the refinement, the following files must exist:

  1. an FSC file. Comment lines (which are not read) should begin with a "#". Otherwise, the first column must contain spatial frequencies in inverse pixels and the second column the Fourier shell correlation at that spatial frequency. You can have as many additional columns as you would like, but they will be skipped by ReconUploader.py
  2. .img/.hed files describing projections from the model and class averages belonging to those Euler angles. The format is as follows: image 1 - projection 1, image 2 - class average 1, image 3 - projection 2, image 4 - class average 2, etc., see below
  3. the 3D volume in mrc format
  4. a text file describing the parameters for each particle. NOTE: PARTICLE NUMBERING STARTS WITH 1, NOT 0. An example file is attached. The columns are as follows:
    1. particle number - starts with 1!!!
    2. phi Euler angle - rotation Euler angle around Z, in degrees
    3. theta Euler angle - rotation Euler angle around new Y, in degrees
    4. omega Euler angle - rotation Euler angle around new Z (in-plane rotation), in degrees
    5. shiftx - in pixels
    6. shifty - in pixels
    7. mirror - specify 1 if particle is mirrored, 0 otherwise. If mirrors are NOT handled in the package, and are represented by different Euler angles, leave as 0
    8. 3D reference # - 1, 2, 3, etc. Use 1 for single-model refinement case
    9. 2D class # - the number of the class to which the particle belongs. Leave as 0 if these are not defined
    10. quality factor - leave as 0 if not defined
    11. kept particle - specifies whether or not the particle was discarded during the reconstruction routine. If it was KEPT, specify 1, if it was DISCARDED, specify 0. If all particles are kept, all should have a 1.
    12. post Refine kept particle (optional) - in most cases just leave as 1 for all particles
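As a sketch of the two simplest file formats above, the following snippet writes a minimal FSC file and a minimal particle data file. File names and numeric values are made-up examples; the column order follows the list above.

```python
# Write a minimal FSC file: comment lines begin with '#'; column 1 is
# spatial frequency in inverse pixels, column 2 is the FSC value.
with open("recon_example_it001_vol001.fsc", "w") as f:
    f.write("# inverse_pixels fsc\n")
    for freq, fsc in [(0.01, 0.999), (0.10, 0.95), (0.25, 0.50)]:
        f.write("%.4f %.4f\n" % (freq, fsc))

# Write a minimal particle data file in the column order listed above.
# Particle numbering starts with 1, not 0.
particles = [
    # (phi, theta, omega, shiftx, shifty, mirror, 3d_ref, 2d_class, quality, kept, postRefineKept)
    (12.5, 45.0, 90.0, 1.2, -0.8, 0, 1, 3, 0, 1, 1),
    (200.0, 70.0, 10.0, -2.1, 0.4, 1, 1, 7, 0, 1, 1),
]
with open("particle_data_example_it001_vol001.txt", "w") as f:
    f.write("# part phi theta omega shiftx shifty mirror 3d_ref 2d_class quality kept postRefineKept\n")
    for i, p in enumerate(particles):
        f.write("%d " % (i + 1) + " ".join("%g" % v for v in p) + "\n")
```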

How to run manual picking and mask making

Manual picking and mask making use GUIs and require user interaction.
When running these at AMI, you can use amibox02 or amibox03.
When you ssh, you need to use the -X flag to tell the terminal to display the GUI.

ssh -X amibox02

Then put the correct path to appion in front of the command, such as /ami/sw/bin/appion.

/ami/sw/bin/appion makestack2.py --single=start.hed --selectionid=1002 --invert --normalized --maskassess=manualrun1 --boxsize=16 --description="test" --projectid=5 --preset=upload --session=10may13l35 --runname=stack7 --rundir=/ami/data00/appion/10may13l35/stacks/stack7 --no-rejects --no-wait --commit --reverse --limit=1 --continue

How to test upload images with your own sandbox

Just had to do this so taking some notes:

the objective

I have images that I want to upload located in my home directory.
I want to use my sandbox on the web side of appion.
I want to use the wrapper/appion snapshot/beta appion for the python parts. However you want to call it, I just need a recent version.
I want to upload the images to my own private database that is located on the fly server.
When the images are uploaded, I want them stored in my home directory rather than on /ami/data00.

setup

sinedon.cfg in home directory

Make sure sinedon.cfg is in your home directory and the host, user and password correspond to your database.
Make sure the projectdata and leginondata settings are set to the name of your db.
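For reference, a minimal sinedon.cfg might look like this; host, user, password, and database names are placeholders to adjust for your setup.

```ini
[global]
host: your.db.host
user: your_db_user
passwd: your_db_password

[projectdata]
db: your_project_db

[leginondata]
db: your_leginon_db
```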

leginon.cfg in home directory

Make sure leginon.cfg is in your home directory and the images path is set to a folder in your home directory.
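For reference, the relevant leginon.cfg entry might look like this; the path is a placeholder for a folder in your home directory.

```ini
[Images]
path: /home/yourname/leginon_images
```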

config.php located in your sandbox under myamiweb

Make sure the database information matches what is found in sinedon.cfg.

create the command

execute the command

Fly does not have the python parts installed.
Guppy does not have access to your home directory to get the images.

  1. ssh into amibox02 or amibox03 to run uploadimage.py
  2. Copy and paste the command that was provided by the web page into the terminal
  3. Add the path to the wrapper installation to make sure you are executing the most recent version of the python code.
    This is probably /opt/myamisnap/bin/appion and it should be pre-pended to the command.

example:

/opt/myamisnap/bin/appion uploadImages.py --projectid=268 --image-dir=/home/amber/uploadedimages/pairedimages --mpix=1E-09 --type=defocalseries --images-per-series=2 --defocus-list=-1E-10,-2E-10 --mag=50000 --kv=120 --description="defocal test" --jobtype=uploadimage


HTML notes


Image Assessment

Image Assessment Options Available in Appion:

  1. Web Img Assessment: Step through micrographs and their picks one at a time using a web interface.
  2. Multi Img Assessment: Go through multiple micrographs using the web interface.
  3. Run Image Rejector

Appion SideBar Snapshot:

Notes, Comments, and Suggestions:






Image Viewer

From the Appion and Leginon Tools start page, select Image Viewer to view images associated with your Project Sessions in a single viewing pane.
The following screen is displayed. For more details see Image Viewer Overview.

Image Viewer Screen:
Image Viewer Screen





Image Viewers

  1. Image Viewer Overview
  2. Image Viewer
  3. 2 Way Viewer
  4. 3 Way Viewer
  5. Dual Viewer
  6. RCT



IMAGIC Angular Reconstitution

Coming soon! Working out some bugs...

General Workflow:

Notes, Comments, and Suggestions:





IMAGIC Multi Reference Alignment

This method uses the IMAGIC M-R-A command to align your particles.

General Workflow:

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify their own names and directories.
  2. Enter a description of your run into the description box.
  3. Select the stack to align from the drop-down menu. Note that stacks are identified in this menu by stack name and stack ID, and that the number of particles, pixel size, and box size are listed for each.
  4. Select a template stack that will be used as references. If you do not have a template stack uploaded to the database, you will not be able to use this feature. To upload a template stack (which can be anything, e.g. raw particles, class averages, forward projections, etc., but must be in .img / .hed [IMAGIC] format), click on the "Template Stacks" option in the menu. Once that is complete, you will be able to use this alignment feature.
  5. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  6. Specify the number of processors to use (the feature currently only runs on a single node, but can do so on as many processors as that node has).
  7. Specify whether or not you want to filter the particle, under the "particle parameters" section
  8. Do the same in the "Reference Parameters" section. Note that you can threshold the pixel densities of the references to cut off negative (dark) values. If you do not want to threshold the pixel intensities, leave as -999.
  9. Specify the alignment procedure, with corresponding alignment parameters. The pop-up menus explain what each parameter means
  10. Click on "Run Multi Reference Alignment" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  11. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "# running" option under the "Run Alignment" submenu in the appion sidebar.
  12. Once the job is finished, an additional link entitled "# complete" will appear under the "Run Alignment" tab in the appion sidebar. Clicking on this link opens a summary of all alignments that have been done on this project.
    1. Click on the "alignstack.hed" link to browse through aligned particles.
    2. To perform a feature analysis, click on the grey link entitled "Run Feature Analysis on Align Stack Id xxx" within the box that summarizes this alignment run.

Notes, Comments, and Suggestions:

  1. brute force alignment: This alignment method, available in the pop-up menu, has produced very nice results in test runs. It is obviously very slow, but well worth experimenting with. The higher the value for "number of orientations", the more orientations the algorithm compares, and the slower the alignment (scales linearly).
  2. Clicking on "Show Composite Page" in the Alignment Stack List page (accessible from the "completed" link under "Run Alignment" in the Appion sidebar) will expand the page to show the relationships between alignment, feature analysis, and clustering runs.



IMAGIC Refinement

Coming Soon! Working out a few bugs...

General Workflow:

Notes, Comments, and Suggestions:





Import Tools

Appion provides various tools for importing 2D images as well as 3D volumes for various usage ranging from templates for alignment to 3D models for model refinement.

Available Import Tools:

  1. PDB to Model
  2. EMDB to Model
  3. Upload Particles
  4. Upload Template
  5. Upload Model
  6. Upload More Images
  7. Upload Stack

Appion Sidebar Snapshot:

Notes, Comments, and Suggestions:





Index of test scripts

| File Name | Path | Server | Test | Purpose | Install Script / Test Tool Wizard | Added to |
| syscheck.py | myami/leginon/ | Processing | Package Installation | tells you which versions of python and third party python packages you have installed | X | checkprocessingserver.py |
| check.sh | myami/appion/ | Processing | Appion Installation | imports Appion libraries and runs binaries | X X | checkprocessingserver.py |
| test1.py and test2.py | myami/leginon/ | | Networking | detect problems due to a firewall or host name resolution | X X | |
| createtestsession.py | myami/appion/test | Web | Image Viewer, Pipeline (Manual) | loads up a session filled with sample images for one to test with | X | |
| testsuite.py | myami/appion/test | Web | Pipeline | executes processing pipeline | X X | |
| teststack.py | myami/appion/test | Web | Pipeline | reads and writes stacks | X | |
| ex1.php, ex2.php, mymrc.mrc | myami/programs/php_mrc-5.3 | Web | MRC installation | show that mrc extensions have been correctly installed | X X | checkwebserver.php |
Desired Tests
| Server | Test | Purpose | Install Script / Test Tool Wizard | Added to... |
| DB | Package Installation | | X X | |
| DB | MySQL variables | make sure user modified my.cnf | X | |
| Processing | Scipy/Numpy | make sure scipy and numpy work correctly | X | checkProcessingServer.py |
| Processing | leginon.cfg | check that the file was created by user | X X | checkProcessingServer.py |
| Processing | sinedon.cfg | check that the file was created by user | X X | checkProcessingServer.py |
| Processing | EMAN Installation | check that help window is displayed | X | checkProcessingServer.py |
| Processing | Spider Installation | launch spider | X | |
| Processing | Xmipp Installation | launch Xmipp | X | checkProcessingServer.py |
| Web | Package Installation | | X X | checkwebserver.php |
| Web | mrc.so | check that it exists | X X | |
| Web | mrc tools | verify installation with info.php | X | checkwebserver.php |
| Web | config.php | ensure there are no extra lines at the end after the php tag | X X | checkwebserver.php |

Installation Troubleshooting Guide

You can run all of our troubleshooting scripts in a terminal.

cd /path/to/myami/install
python troubleshooter.py

Or, you can run the individual scripts described below.

Troubleshooting the Processing Server:

Perform system check

In addition to the downloads from our svn repository, there are several other requirements that you will get either from your OS installation source, or from its respective website. The system check in the Leginon package checks your system to see if you already have these requirements.

cd myami/leginon/ 
python syscheck.py

If python is not installed, this, of course, will not run. If you see any lines like "*** Failed...", then you have something missing. Otherwise, everything should result in "OK".

Test Appion

You need to edit leginon.cfg.

Note: check3rdPartyPackages.py is currently only available with a development svn checkout; it will be included in version 2.2.

Troubleshooting the Web Server:

Run the web server troubleshooter

A web server troubleshooting tool is available at http://YOUR_HOST/myamiweb/test/checkwebserver.php.
You can browse to this page from the Appion and Leginon Tools home page (http://YOUR_HOST/myamiweb) by clicking on [test Dataset] and then [Troubleshoot].

This page will automatically confirm that your configuration file and PHP installation and settings are correct and point you to the appropriate documentation to correct any issues.

Firewall settings

You may need to configure your firewall to allow incoming HTTP (port 80) and MySQL (port 3306) traffic:

$ system-config-securitylevel

Security-Enhanced Linux

Security-Enhanced Linux (SELinux) may be preventing your files from loading. To fix this, run the following command:

$ sudo /usr/bin/chcon -R -t httpd_sys_content_t /var/www/html/

See the SELinux documentation for more details.


Installing Appion with an existing Leginon installation

Make sure you have the latest version of Leginon installed. Then follow the steps below to install image processing packages on your Processing Server.

  1. Configure .appion.cfg
  2. Install External Packages
  3. Install EMAN
  4. Install EMAN2
  5. Install SPIDER
  6. Install Xmipp
  7. Install UCSF Chimera
  8. Install Grigorieff lab software
  9. Compile FindEM
  10. Install Ace2
  11. Install Imod
  12. Install Protomo
  13. Test Appion





Install Ace2

Test Ace 2 binary

The 64bit Ace2 binary is already available in the myami/appion/bin directory.
Test it by changing directories to myami/appion/bin and type the following commands:

./ace2.exe -h
./ace2correct.exe -h

If it is working, the help text will display.

Compile Ace 2 from source

It is highly recommended to use the Ace2 binary, if it works.

see Compile Ace 2 from source





Install Apache Web Server

1. Install the Apache Web Server with the YaST or yum utility.

2. Find "httpd.conf".
This is /etc/httpd/conf/httpd.conf on CentOS and /etc/apache2/httpd.conf on SuSE

sudo nano /etc/httpd/conf/httpd.conf

3. Edit the "httpd.conf" configuration file with the following:

 DirectoryIndex index.html index.php

 HostnameLookups On
 

Note: It may be possible to edit httpd.conf in YaST2 as well.

4. If you plan to enable the web interface user login feature, the ServerName directive should be set to a resolvable host name and UseCanonicalName should be turned on. This will ensure the link provided in the email to verify user registration is valid. Follow the example below, replacing YourServer.yourdomain.edu with your server's name.

     ServerName YourServer.yourdomain.edu  

     UseCanonicalName On
  

5. Restart the web server.

apachectl restart
     or
sudo /sbin/service httpd restart     (ON CentOS/RHEL/Fedora)
     or
/usr/sbin/rcapache2 restart   (ON SuSE)

If you want to start the web server automatically at boot:
sudo /sbin/chkconfig apache2 on  #SuSE
sudo /sbin/chkconfig httpd on  #CentOS/RHEL/Fedora




Quick Install of Appion and Leginon using the auto-installation tool

NOTICE: We are in the process of releasing a new version of the installation tool. There are a few bugs right now, so you might want to hold off on using it until this notice is removed.

As of Appion/Leginon 2.1.0, there is a quick installation script available for use. It currently installs both Appion and Leginon on a single computer running the CentOS Linux operating system. This is not really intended for production systems; it is instead perhaps a useful way to evaluate the software prior to undertaking a more complex installation across multiple computers. Please note that the auto-installation is intended for use with a clean installation of CentOS and might fail if you already have several other packages installed.

Follow the instructions below to install Appion and Leginon using the auto-installation script.

  1. Install CentOS.
    Disable SELinux during the installation. You can check to see if it is enabled after the CentOS installation with:
    /usr/sbin/selinuxenabled
    echo $?
    

    If SELinux is enabled, the return value is 0. If it is disabled, the return value is 1. If you need to disable SELinux, follow the instructions for disabling SELinux found in the troubleshooting section of this document.
    The Auto-Installer currently assumes that you have a fresh CentOS installation and that you have not set up a MySQL database.
     
  2. Login to CentOS as the root user
     
  3. Request a registration key, which will be needed during installation. We ask that you register so that we can inform the owners of processing packages such as EMAN, XMIPP and SPIDER how many people are receiving their software from us. These numbers help them gain funding to maintain the software.
     
  4. Download centosAutoInstallation.py
     
  5. Open a Terminal, go to the download directory and run:
    python centosAutoInstallation.py
    
  6. Enter an administrator's email address and root password when prompted.
     
  7. You may choose to install several of the third party software packages that Appion uses for image processing. If you do not allow the installer to install them at this time, you can follow the manual installation instructions found toward the end of this page.
     
  8. Wait. Depending on how many packages are updated during yum update, installation could take anywhere between 6 minutes and 2 hours.
     
  9. When the installation completes:
    1. A log file will be created called installation.log. You may want to view it and look for any errors.
    2. The Leginon start page may be displayed. If you are not using Leginon, close this window.
    3. A web browser will be launched with the Appion and Leginon tools loaded.
       
  10. Click on [test dataset]
     
  11. You should see two images in an image viewer. If they are not visible, run the troubleshooter by clicking on [Troubleshoot].
     
  12. Click on [Home] to return to the start page.
     
  13. Click on Project DB and you should find a GroEL Demo project that has been created for you.
     
  14. Return to the [Home] page.
     
  15. Click on [Image Viewer] and you should find the sample images that have been added to a Sample Session in the Demo Project.
     
  16. Click on [Processing] to begin running image processing jobs on the sample GroEL images.
     
  17. Following the directions in Process Images is a good test of the installation.
     
  18. You will want to continue by installing the third party processing packages which are required for many of the features supported by Appion:

  1. Configure .appion.cfg
  2. Install External Packages
  3. Install EMAN
  4. Install EMAN2
  5. Install SPIDER
  6. Install Xmipp
  7. Install UCSF Chimera
  8. Install Grigorieff lab software
  9. Compile FindEM
  10. Install Ace2
  11. Install Imod
  12. Install Protomo
  13. Test Appion

Troubleshooting

SELinux is Enabled

If your installation completes and you see a Welcome to Leginon window, a second Leginon window that appears blank, and a web page with the Web Tools Setup wizard in a System updating state as shown below, you have installed CentOS with Security-Enhanced Linux enabled. You need to disable SELinux:

  1. Edit file "/etc/selinux/config"
  2. Change "SELINUX=enforcing" to "SELINUX=disabled"
  3. Save the file
  4. Restart your computer

Once SELinux is disabled, the configuration will complete on its own.

You may need to reconfigure the database if the Tools web page is not displayed.

SELinux Enabled

 

A database was previously configured

If you see the errors shown in the screen below, you may have previously configured a database with mysql. You will need to re-install CentOS and run the installation script again.

Database already configured.


Install Appion Packages

Install the appion python package

Since the appion package includes many executable scripts, it is important that you know where they are being installed. To prevent cluttering up the /usr/bin directory, you can specify an alternative path, typically /usr/local/bin, or a directory of your choice that you will later add to your PATH environment variable. Install appion like this:

cd /path/to/myami-VERSION/appion
sudo python setup.py install --install-scripts=/usr/local/bin/appion
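If you install the scripts to a non-default directory as above, that directory must also be on your PATH. A quick bash check (the /usr/local/bin/appion path matches the --install-scripts value used above; adjust if you chose a different directory):

```shell
# Add the Appion script directory to PATH and confirm it is there.
# /usr/local/bin/appion matches the --install-scripts value used above.
export PATH=$PATH:/usr/local/bin/appion
echo "$PATH" | tr ':' '\n' | grep appion
```

To make this permanent, put the export line in your shell profile (e.g. ~/.bashrc) or a script under /etc/profile.d/, following the same pattern used for the other packages below.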

Install all the myami python packages except appion using the following script:

cd /path/to/myami-VERSION/myami
sudo ./pysetup.sh install

That will install each package, and report any failures. To determine the cause of failure, see the generated log file "pysetup.log". If necessary, you can enter a specific package directory and run the python setup command manually. For example, if sinedon failed to install, you can try again like this:

cd sinedon
sudo python setup.py install

python-site-package-path: where the installed python packages went:

The Python installer puts the packages you installed into its site-packages directory, so that all users on the same computer can access them. The easiest way to discover where Python loads an installed package from is to import a module from the package on the interactive command line, like this:

Start the python command line from shell:

python

Import a module from the package. Let's try sinedon here; all packages installed through the above setup.py script should go to the same place.
At the Python prompt (>>>) type:

import sinedon

If the module is loaded successfully, calling the module attribute __path__ (two underscores before "path" and two after) returns the location the module was loaded from:

sinedon.__path__
['/usr/lib/python2.4/site-packages/sinedon']

In this case, /usr/lib/python2.4/site-packages/ is your python-site-package-path. If you go to that directory, you will find all the packages you just installed.
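The same lookup can be done non-interactively. Here is the idea as a short script, using the standard-library json package as a stand-in for sinedon (for a stdlib package the parent directory is the standard library rather than site-packages, but the technique is identical):

```python
# Print the directory a package was loaded from, and its parent.
# json stands in for sinedon; any installed package works the same way.
import os
import json

package_dir = os.path.dirname(json.__file__)  # e.g. .../pythonX.Y/json
parent_dir = os.path.dirname(package_dir)     # site-packages for a pip/setup.py install
print(package_dir)
print(parent_dir)
```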

Save this value as an environment variable for use later, for bash:

export PYTHONSITEPKG='/usr/lib/python2.4/site-packages'

or C shell
setenv PYTHONSITEPKG '/usr/lib/python2.4/site-packages'

Finally, you will need to set the MATLABPATH environment variable in order to get the Appion utilities that use Matlab to work.
For bash:

export MATLABPATH=$MATLABPATH:<your_appion_directory>/ace

or C shell
setenv MATLABPATH $MATLABPATH:<your_appion_directory>/ace


< Perform system check | Configure leginon.cfg >




Install all the myami python packages except appion using the following script:

cd /your_download_area/myami
sudo ./pysetup.sh install

That will install each package, and report any failures. To determine the cause of failure, see the generated log file "pysetup.log". If necessary, you can enter a specific package directory and run the python setup command manually. For example, if sinedon failed to install, you can try again like this:

cd /your_download_area/myami/sinedon
sudo python setup.py install

Install the Appion python package

Important: You need to install the current version of Appion packages to the same location that you installed the previous version of Appion packages. You may have used a flag shown below (--install-scripts=/usr/local/bin) in your original installation. If you did, you need to use it this time as well. You can check whether you installed your packages there by browsing to /usr/local/bin and looking for ApDogPicker.py. If the file is there, you should use the flag. If the file is not there, you should remove the flag from the command to install Appion to the default location.
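The check described above can also be scripted. This sketch only reports which install command to use; the ApDogPicker.py path comes from the paragraph above:

```shell
# Detect whether the previous Appion install used --install-scripts.
if [ -e /usr/local/bin/ApDogPicker.py ]; then
    echo "previous install used --install-scripts=/usr/local/bin; use the flag again"
else
    echo "no scripts found in /usr/local/bin; omit the flag to install to the default location"
fi
```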

The pysetup.py script above did not install the appion package. Since the appion package includes many executable scripts, it is important that you know where they are being installed. To prevent cluttering up the /usr/bin directory, you can specify an alternative path, typically /usr/local/bin, or a directory of your choice that you will later add to your PATH environment variable. Install appion like this:

cd /your_download_area/myami/appion
sudo python setup.py install --install-scripts=/usr/local/bin 


Install EMAN

EMAN1 is a fundamental package used in Appion for general file conversion and image filtering.

Download latest EMAN1 (not EMAN2) binaries (currently version 1.9):

  1. Go to the EMAN webpage
  2. Click on (download) next to EMAN1
  3. Scroll to Download 1.9
  4. Click on Download 1.9 link
  5. Select the appropriate package

Install EMAN

  1. Go to download directory and unzip the tar file:
    tar -zxvf eman-linux-x86_64-cluster-1.9.tar.gz
  2. Move the unzipped folder to a global location
    sudo mv -v EMAN /usr/local/
  3. Run the EMAN installer; it sets up the EMAN python module (must be run from the EMAN directory)
    cd /usr/local/EMAN/
    ./eman-installer
    

Set environment variables

You may need to log out and log back in for these changes to take place.

Test EMAN install

Run proc2d

proc2d help

This should pop up a window displaying help for proc2d.


< Install External Packages | Install EMAN2 >



Install EMAN2/SPARX

It is best to install EMAN2/SPARX from source so that you do not have conflicts between two different versions of python on your system. The EMAN2/SPARX binaries all come with their own python pre-installed.

This documentation assumes you are using CentOS 6 (written as of CentOS 6.2)

Install required prerequisite packages for compiling EMAN2

yum based packages

bsddb3

Additionally, you need to install the python-bsddb3 library (not available via yum). The easiest route is the PyPI easy_install; yum will never know.

sudo easy_install bsddb3

Download the source

  1. To download the source code go to the link:
  2. Click on "Current stable version - direct link"
  3. Go under the heading "Source" at bottom of page
  4. Click to download the eman-source-2.xx.tar.gz file (as of August 2012, 2.xx is 2.06)

Work with the source

  1. go to the directory with the source code
  2. extract the archive:
    tar zxvf eman-source-2.06.tar.gz
    
  3. go into directory
    cd EMAN2/src/build
  4. start configure script:
    cmake ../eman2/
  5. start compiling:
    make
  6. install to directory:
    sudo make install

Set environment variables

bash

sudo nano /etc/profile.d/eman2.sh
export EMAN2DIR=/usr/local/EMAN2
export PATH=${EMAN2DIR}/bin:${PATH}
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:${EMAN2DIR}/lib
export PYTHONPATH=${EMAN2DIR}/lib:${EMAN2DIR}/bin

c shell

sudo nano /etc/profile.d/eman2.csh
setenv EMAN2DIR /usr/local/EMAN2
setenv PATH ${EMAN2DIR}/bin:${PATH}
setenv LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:${EMAN2DIR}/lib
setenv PYTHONPATH ${EMAN2DIR}/lib:${EMAN2DIR}/bin

Install MyMPI

Test to see if code works

see http://blake.bcm.edu/emanwiki/EMAN2/FAQ/EMAN2_unittest

cd EMAN2/test/rt
./rt.py

Install MyMPI for MPI functions

see http://sparx-em.org/sparxwiki/MPI-installation
or https://www.nbcr.net/pub/wiki/index.php?title=MyMPI_Setup

This fixes this problem:

    from mpi import mpi_init
ImportError: No module named mpi

This module was very difficult to get working; it seems to be a poorly supported Python wrapper for MPI. So, what we are going to do is compile the module, rename it, and create a wrapper. Essentially we are creating a wrapper around the wrapper. We can only hope they switch to mpi4py (http://mpi4py.scipy.org/) in the future.

Documentation


< Install EMAN 1 | Install SPIDER >



Install External Packages

Appion allows you to use and pass data between multiple image processing packages from one integrated user interface. The image processing packages must be installed on your computer so that Appion can interface with them. You do not need to have all the packages installed for Appion to run, but you must have the packages installed that support the specific operations you wish to execute.


< Configure sinedon.cfg | Install EMAN >



Install FindEM from source

If the binary included with Appion does not work, or you wish to compile it yourself, follow these instructions.

$ make
$ make test

WARNING
Only if the first part fails do you need to add the path to the libg2c.so library file.
Otherwise skip to the next section.

$ ls /usr/lib/gcc/`uname -i`-redhat-linux/3.4.6/libg2c.so
$ locate libg2c.so
EXLIBS=-L/usr/lib/gcc/i386-redhat-linux/3.4.6/ -lg2c

Install FindEM ^



Free-hand test within Appion

These scripts originally came from John Rubinstein at the University of Toronto and have been incorporated into Appion by Michael Cianfrocco with John's permission. Please get in touch with John for specific questions about compiling or running these scripts.

/home/acheng/michaelappion uploadstack.py --session=12aug15x --file=/home/acheng/myami/test_freeHand/stack00_dc4_sel.img --apix=6.02 --diam=380 --description="UNtilted stack" --commit --normalize --not-ctf-corrected --rundir=/ami/data00/appion/12aug15x/stacks/stack4 --runname=stack4 --projectid=371 --expid=10286 --jobtype=uploadstack --synctype=tilt --syncstack=2


Install Grigorieff lab software

The Grigorieff lab at Brandeis provides several individual programs that are used in Appion. See their main software page to download the software.

Software used in Appion:


< Install UCSF Chimera | Compile FindEM >



Install Imod

IMOD, used in Appion for tomography processing, is developed primarily by David Mastronarde, Rick Gaudette, Sue Held, Jim Kremer, and Quanren Xiong at the Boulder Laboratory for 3-D Electron Microscopy of Cells.

Go to http://bio3d.colorado.edu/imod/ for download and installation instruction.

To use it with Appion, its bin directory needs to be in the user's path and its lib directory in LD_LIBRARY_PATH, in addition to the other IMOD environment variables set by a typical installation.

Appion scripts create and run vms-style IMOD command files like those eTomo produces; they do not use the GUI tools (eTomo, 3dmod, etc.) from the IMOD package.
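Following the pattern used for the other packages in this guide, the bash setup might look like this. The /usr/local/IMOD install path is an assumption; adjust it to wherever your IMOD installation lives (a standard IMOD install script normally sets IMOD_DIR for you):

```shell
# Make IMOD binaries and libraries visible to Appion (bash).
# /usr/local/IMOD is an assumed install location - adjust as needed.
export IMOD_DIR=/usr/local/IMOD
export PATH=$PATH:${IMOD_DIR}/bin
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:${IMOD_DIR}/lib
```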


< Install Ace2 | Install Protomo >



Install myami-2.2 on guppy (work in progress...)

The following describes how we did the myami-2.2 installation on guppy running CentOS 6.

[root@guppy opt]# cd /opt
[root@guppy opt]# ln -s myami-2.2 myami

Install phpMyAdmin

You are not required to install phpMyAdmin for Appion or Leginon; however, it is a useful tool for interfacing with the mysql databases.

Install supporting packages

Name: Download site: yum package name SuSE rpm name
PHP http://php.net/downloads.php php
php-mysql php-mysql

Install phpMyAdmin

If you have not already installed phpMyAdmin, do so. The yum installation is:

sudo yum -y install phpMyAdmin

Configure phpMyAdmin

Edit the phpMyAdmin config file /etc/phpMyAdmin/config.inc.php and change the following lines:

sudo nano /etc/phpMyAdmin/config.inc.php
$cfg['Servers'][$i]['AllowRoot']     = FALSE;
$cfg['Servers'][$i]['host']          = 'mysqlserver.INSTITUTE.EDU';

Edit the phpMyAdmin apache config file /etc/httpd/conf.d/phpMyAdmin.conf and change the following lines:

sudo nano /etc/httpd/conf.d/phpMyAdmin.conf
<Directory /usr/share/phpMyAdmin/>
   order deny,allow
   deny from all
   allow from 127.0.0.1
   allow from YOUR_IP_ADDRESS
</Directory>

Note: If you want to access phpMyAdmin from another computer, you can also add it to this config file with an allow from tag

Restart Web Server

Next restart the web server to take on the new setting

sudo /sbin/service httpd restart

Test the configuration

To test the phpMyAdmin configuration, point your browser to http://YOUR_IP_ADDRESS/phpMyAdmin or http://localhost/phpMyAdmin and login with the usr_object user.

A common problem is that the firewall may be blocking access to the web server and mysql server. On CentOS/Fedora you can configure this with the system config:

system-config-securitylevel

Firewall configuration is specific to different Unix distributions, so consult a guide on how to do this on non-RedHat machines.


< Install the Web Interface | Potential Problems >



Install Protomo

The "protomo" package contains programs and shell scripts for electron tomography of thin specimens. Developed by H. Winkler et al. at Florida State University.

References:
H. Winkler and K.A. Taylor, Ultramicroscopy 106, 240-254, 2006.
K.A.Taylor, J.Tang, Y.Cheng and H.Winkler, J. Struct. Biol. 120, 372-386, 1997.

The package is available at

http://www.electrontomography.org/software.html

To use it with Appion, the PROTOMOROOT environment variable needs to be set to the installed package location, and its bin and (x86_64/bin or i686/bin) directories need to be in the user's path.

Appion scripts do not use the user interface that comes with the package, only the shell scripts and the programs they call.
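The environment setup described above might look like this in bash. The /usr/local/protomo install path is an assumption; substitute the directory where you unpacked the package, and use i686/bin instead of x86_64/bin on 32-bit systems:

```shell
# Protomo environment (bash). /usr/local/protomo is an assumed path.
export PROTOMOROOT=/usr/local/protomo
export PATH=$PATH:${PROTOMOROOT}/bin:${PROTOMOROOT}/x86_64/bin
```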


< Install Imod | Test Appion >



Install Protomo2

  1. download and unzip the protomo file from hanspeter - http://www.sb.fsu.edu/~winkler/protomo/protomo-2.2.0.tar.bz2
  2. download and unzip the deplib file from hanspeter - http://www.sb.fsu.edu/~winkler/protomo/deplibs.tar.bz2

Get the libs and devel of these packages:
Try yum:

yum search ...

test

tomoalign-gui -help

This is the current, but not yet officially released version:

Marker-free tilt series alignment:

http://www.sb.fsu.edu/~winkler/protomo/protomo-2.2.0.tar.bz2
http://www.sb.fsu.edu/~winkler/protomo/protomo-users-guide-2.0.12.pdf

Tutorial:

http://www.sb.fsu.edu/~winkler/protomo/protomo-tutorial-2.0.12.pdf
http://www.sb.fsu.edu/~winkler/protomo/protomo-tutorial-2.0.12.tar.bz2

3rd-party libraries:
http://www.sb.fsu.edu/~winkler/protomo/deplibs.tar.bz2


Install SIMPLE

mkdir /sw/packages/SIMPLE
cd /sw/packages/SIMPLE
mkdir downloads
cd downloads
wget http://simple.stanford.edu/binaries/simple_linux_120521.tar.gz
cd ..
tar zxf downloads/simple_linux_120521.tar.gz
cd simple_linux_120521
./simple_config.pl
type "local" when prompted.

This actually modifies the files in simple_linux_120521 to configure them with the correct path. The configuration cannot be changed later by running ./simple_config.pl again on the same directory, so if you want to install somewhere other than /sw/packages/SIMPLE/simple_linux_120521, you have to unpack a fresh copy from the tar.gz and run simple_config.pl once it is in the new location.

To set up your environment to use the executables in this new SIMPLE installation, you need both the "apps" and "bin" subdirectories in your PATH. For example:

export PATH=$PATH:/sw/packages/SIMPLE/simple_linux_120521/apps:/sw/packages/SIMPLE/simple_linux_120521/bin

WARNING: Some commands in the bin directory have fairly generic names, so be aware of this when adding it to PATH. For example, there is a command "align"; make sure no other "align" command on your system conflicts with it.
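One way to check for such name collisions before editing PATH is the shell's `type` builtin, which in bash lists every match for a command name in PATH lookup order:

```shell
# List every 'align' currently found on PATH (bash 'type -a' shows
# all matches in lookup order); report when the name is free.
type -a align 2>/dev/null || echo "no existing 'align' on PATH"
```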


Install SPIDER

Install documentation at wadsworth.org

The Wadsworth Institute provides detailed documentation on how to install SPIDER on various systems. Below we cover our way to get it working on your system.

Download SPIDER binary from wadsworth.org (250 MiB)

Most of our SPIDER scripts were originally designed around SPIDER v14 and v15, but we are diligently working toward compatibility with SPIDER v18. That said, you are probably best off using the newest version of SPIDER (v18.10 as of May 2010) and then reporting any bugs to us.

Extract the archive

tar -zxvf spiderweb.18.10.tar.gz

The archive will create 3 folders: spider, spire, and web. At this time only the spider program is used within Appion; you can safely ignore web and spire.

Install SPIDER

Set environment variables

For bash, create a spider.sh file and add the following lines:

export SPIDERDIR=/usr/local/spider
export SPMAN_DIR=${SPIDERDIR}/man/
export SPPROC_DIR=${SPIDERDIR}/proc/
export SPBIN_DIR=${SPIDERDIR}/bin/
export PATH=$PATH:${SPIDERDIR}/bin

For C shell, create a spider.csh file and add the following lines:

setenv SPIDERDIR /usr/local/spider
setenv SPMAN_DIR ${SPIDERDIR}/man/
setenv SPPROC_DIR ${SPIDERDIR}/proc/
setenv SPBIN_DIR ${SPIDERDIR}/bin/
setenv PATH $PATH:${SPIDERDIR}/bin

And then add it to the global /etc/profile.d/ folder

sudo cp -v spider.sh /etc/profile.d/spider.sh
sudo chmod 755 /etc/profile.d/spider.sh

-or-

sudo cp -v spider.csh /etc/profile.d/spider.csh
sudo chmod 755 /etc/profile.d/spider.csh

You may need to log out and log back in for these changes to take place.

Test SPIDER
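A minimal first check is that the spider binary is visible in a fresh shell (this assumes the spider.sh/spider.csh profile script created above):

```shell
# Verify SPIDER is on PATH after logging back in.
command -v spider && echo "SPIDER found at $(command -v spider)" \
    || echo "spider not on PATH - source /etc/profile.d/spider.sh and retry"
```

If the binary is found, running spider itself and stepping through its startup prompts confirms the rest of the environment variables.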


< Install EMAN | Install Xmipp >



Install SSH module for PHP

This installation occurs on the web server.

The ssh2 extension for php

First, install the following prerequisites:

Name: Download site: CentOS yum package name Fedora yum package name SuSE rpm name
php devel http://www.php.net php-devel php-devel
libssh2 devel http://www.libssh2.org libssh2-devel (found in epel repo) libssh2-devel
SSH PECL extension http://www.php.net/manual/en/ssh2.installation.php - php-pecl-ssh2

For newer systems the extension is available through the repository, e.g., on Fedora 12 type "sudo yum install php-pecl-ssh2"

CentOS 6: Install the ssh2 extension

sudo yum install php-pecl-ssh2

CentOS 5: Download and compile the ssh2 extension

This setup is almost identical to the Install the MRC PHP Extension. See http://www.php.net/manual/en/ssh2.installation.php for more information.

  1. Go the website http://pecl.php.net/package/ssh2
  2. Download the latest version of ssh2:
    wget http://pecl.php.net/get/ssh2-0.11.3.tgz
  3. Extract the tar ball
    tar zxvf ssh2-0.11.3.tgz
  4. Go into the directory
    cd ssh2-0.11.3
  5. Run phpize and standard make process
    phpize
    ./configure
    make
  6. Install the module:
    sudo make install
  7. Add a ssh2.ini to your php ini folder:
    sudo touch /etc/php.d/ssh2.ini
    sudo chmod 666 /etc/php.d/ssh2.ini
    echo "; Enable ssh2 extension module" > /etc/php.d/ssh2.ini
    echo "extension=ssh2.so" >> /etc/php.d/ssh2.ini
    sudo chmod 444 /etc/php.d/ssh2.ini
    cat /etc/php.d/ssh2.ini
    
  8. Restart httpd:
    sudo /sbin/service httpd restart

Test if the module is working:

  1. go to the info.php that was created earlier in the Install the MRC PHP Extension, http://localhost/info.php or http://HOST.INSTITUTE.EDU/info.php
  2. search for the ssh2 module, you should see this section:
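The extension can also be checked from the command line. On CentOS the PHP CLI reads the same /etc/php.d directory as Apache, so this is a quick proxy for the info.php check:

```shell
# Check whether the ssh2 extension is loaded in the PHP CLI.
php -m 2>/dev/null | grep -i ssh2 || echo "ssh2 extension not loaded"
```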

Still having problems?

  1. on one machine, restarting httpd was not enough; the entire system had to be restarted to get it working (even though it shows up under info.php)


< Setup job submission server | Configure web server to submit job to local cluster >





< Install the MRC PHP Extension | Install the Web Interface >



Install supporting packages

Using the Required Supporting Packages table below, install any missing prerequisite packages by following the instructions for your specific Linux distribution.

For example, SUSE users can use YaST to install them; RedHat and CentOS users can use yum; Debian and Ubuntu users can use apt-get.

We highly recommend using pre-built binary packages to install the programs. Installing from source can quickly become a nightmare! See also the previous page, Instructions_for_installing_CentOS_on_your_computer for Red Hat based systems.

Required supporting packages:

Name: Download site: yum package name SuSE rpm name
Python 2.4 or newer http://www.python.org python python-devel
wxPython 2.5.2.8 or newer http://www.wxpython.org wxPython python-wxGTK
MySQL Python client 1.2 or newer http://sourceforge.net/projects/mysql-python MySQL-python python-mysql
Python Imaging Library (PIL) 1.1.4 or newer http://www.pythonware.com/products/pil/ python-imaging python-imaging
NumPy 1.0.1 or newer http://numpy.scipy.org/ numpy numpy
SciPy 0.5.1 (tested, others may work)* http://www.scipy.org scipy python-scipy

If you use Python 2.4 or earlier, you also need to install the PyXML module. For more recent versions of Python, it is already included in the main Python package.

For CentOS, see Download additional Software page, if you have trouble finding these packages.

Scipy/Numpy

SUSE specific issues

You can test your numpy and scipy installs with their built-in test functions:

python -c 'import numpy; numpy.test(level=1)'
python -c 'import scipy; scipy.test(level=1)'

NumPy is more stable and its tests should be successful. Expect to see lots of errors with SciPy.

Mac OS X 10.5+:

You can successfully install most of these packages on a Mac by downloading DMG files and clicking on the install programs. This is not for the novice Mac user. Be warned: wxPython problems on Mac will make using the Leginon GUI difficult. Don't make your Mac the only processing server if Leginon is what you will mainly use.

  1. First, start by downloading the Mac OS X Installer Disk Image (v2.6 recommended) from http://python.org/download/ and install the newer version of python
  2. Download and install numpy and scipy DMG files from http://sourceforge.net/projects/numpy/files/ and http://sourceforge.net/projects/scipy/files/
  3. Download and install wxPython DMG file from http://www.wxpython.org/download.php
Other programs need more work:
  1. The Python Imaging Library (PIL) must be compiled from source, which also requires the Mac OS X Developer tools to be installed.


Download Appion/Leginon Files >




Install the MRC PHP Extension

MRC Tools is installed as a php extension and is required for displaying mrc files live in the web browser.

Note: The MRC PHP Extension is not compatible with PHP 5.3 and greater. For this reason, Appion/Leginon version 3.0 and greater no longer require the MRC PHP extension. Appion/Leginon versions older than 3.0 still require the MRC PHP extension as well as a PHP 5.2.x.

Make sure you have installed the prerequisite packages

You may find installation information for the following packages under Install Web Server Prerequisites.

php-devel

You can check whether php-devel is installed by typing:

phpize

Do not worry about any error message as long as the command is found.
 

php-GD/FFTW3-devel

Make sure that php-GD and FFTW 3 devel libraries are installed. Visit or refresh http://yourhost/info.php which you created earlier. It should have a section looking like this:

Note:
MRCtools are compiled and added as a php extension using the php-devel package. MRCtools use GD and FFTW3, whose development libraries must be present while the extension is compiled. If GD and FFTW3 were downloaded and compiled from source directly on your computer, these development files are included. If (as in most cases) GD and FFTW3 were installed from rpm, they are not. An error message will appear when you attempt to compile mrctools; in that case you need to separately download and install GD-devel and FFTW3-devel. Search http://rpmfind.net/linux/rpm2html/ for GD-devel and FFTW3-devel for the rpm distribution needed for your system. More information on the gd library can be found here. If you find that you can only view images as png instead of jpg, you may not have gd jpeg support installed.
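On an RPM-based system you can query for the devel packages directly before attempting the compile. The package names here (gd-devel, fftw-devel) are the usual CentOS/Fedora names and may vary slightly between distributions:

```shell
# Check for the GD and FFTW3 development packages (RPM-based systems).
rpm -q gd-devel fftw-devel 2>/dev/null \
    || echo "gd-devel/fftw-devel not found - install them before compiling mrctools"
```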

MRC Tools Installation

  1. Go to the myami/programs/php_mrc directory
    cd myami/programs/php_mrc
    
  2. Compile and install the MRC module
    phpize
    ./configure 
    make
    sudo make install
    

     
  3. Check that the mrc.so module exists in your php module directory
     
    (e.g., /usr/lib64/php/modules on 64bit CentOS/RHEL/Fedora). If you are unsure where the php module directory is, use http://localhost/info.php to find it under extension_dir.
     
    ls /usr/lib64/php/modules
      mrc.so
    

    For Suse
    ls /usr/lib64/php5/extensions
      mrc.so
    
  4. Edit your php configuration file, php.ini, so that it loads the mrc extension (extension=mrc.so).
     
    If your linux distro does not have a /etc/php.d/ or /etc/php.d/conf.d/ directory where other .ini files reside, you may need to follow the alternate instructions below titled: Alternative approach if mrc module does not show up in info.php output.
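On distros that do provide an /etc/php.d/-style directory, a minimal drop-in file is enough. The path and filename below are typical for CentOS and are only an example; adjust for your system:

```ini
; /etc/php.d/mrc.ini
extension=mrc.so
```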
     
  5. Restart your webserver
     
    Example commands:
    #SuSE
    /etc/init.d/apache2 restart 
    
    #CentOS
    sudo /sbin/service httpd restart 
    

     
    Note: Sometimes, the MRC module will not work even after restarting the webserver. Try restarting the whole computer:
    sudo reboot

     
  6. Verify the mrc tools installation
     
    Visit or refresh http://yourhost/info.php which you created earlier. It should have a section looking like this (The version should correspond to what you've just installed):
     

     
    If mrc is not listed, the extension was not loaded; you can also check from the command line with "php -m". Try the alternative approach below.

Alternative approach if mrc module does not show up in info.php output

1. Find the location of "additional .ini files parsed" in the first table of the info.php web page (such as /etc/php.d/conf.d/*).

2. Go to the directory and make a copy of any ini file to use as a template for mrc.ini

      >cd [additional_ini_directory]
      >cp gd.ini mrc.ini

3. Edit mrc.ini to the following

      ; comment out next line to disable mrc extension in php
      extension=mrc.so

4. Comment out the mrc extension from php.ini (found at /etc/php.ini on a typical PHP installation)

      ;extension=mrc.so

5. Restart your webserver

      > /etc/init.d/httpd restart

Test the MRC module installation

In the myami/php_mrc (or myami/programs/php_mrc if installing from trunk) directory, you will find two test scripts, "ex1.php" and "ex2.php", and a test MRC image, "mymrc.mrc".

Copy them to your top level web directory (for example on CentOS: /var/www/html/):

cd myami/programs/php_mrc
sudo cp ex1.php ex2.php mymrc.mrc /var/www/html/

Run the scripts with the following commands and visit the corresponding pages from the web server:
The expected results are shown below. If you get the same images, you've installed the extension properly.
Note: the "display" command is part of the ImageMagick package, which you may have to install.

web server: http://localhost/ex1.php

php -q ex1.php | display

web server: http://localhost/ex2.php

php -q ex2.php | display

Do the test files work but images are not showing up in the image viewers?
Here's one way this was fixed.


< Download Appion and Leginon Files | Install SSH module for PHP >



Install the Web Interface

Install Leginon and Appion web tools for viewing images and performing image processing through the web server.

1. Install pyami

TODO: put the prereqs for this in Web Preq page rather than linking to the processing page.

If you have not yet installed Leginon/Appion python packages on this server, the web interface will at least need the myami/pyami package to do MRC to JPEG conversion. First install the supporting packages. Then install myami/pyami as follows:

cd myami/pyami
sudo python setup.py install

This will install the script "mrc2any" into /usr/bin/mrc2any (on a typical Linux system). You can customize the location with options to the setup.py command.
To be sure where it was installed, run:
which mrc2any

You will need to know that location when configuring below.

2. Copy the myami/myamiweb directory to your Apache web directory

Example:

cd myami

#CentOS example
sudo cp -vr myamiweb /var/www/html/ 

#this is temporary for setup, revert to 755 when finished with this page
sudo chmod 777 /var/www/html/myamiweb  

#if you have SELinux enabled this command will help
sudo chcon -R --type=httpd_sys_content_t /var/www/html

3. Configure your installation

There is a setup wizard available to help you set the configuration parameters for your installation. If you prefer not to use the wizard, there are instructions for manually editing the configuration file. If this is your first time creating the web tool configuration file, we recommend using the setup wizard.

Configuration using the setup wizard

The setup wizard will check your database connection, create required database tables, and perform default data initialization.

Manual configuration instructions (Advanced User)

Go to Install the Web Interface Advanced for the advanced configuration.

4. Revert permissions

sudo chmod 755 /var/www/html/myamiweb

5. Test the installation

Visit http://yourhost/myamiweb or http://localhost/myamiweb to confirm functionality.
You may also browse to the automatic web server troubleshooter at: http://localhost/myamiweb/test/checkwebserver.php

6. Turn off error checking in php.ini

Edit the following items in php.ini (found as /etc/php.ini on CentOS and /etc/php5/apache2/php.ini on SuSE) so that they look like the following:

display_errors = Off


< Install SSH module for PHP | Install phpMyAdmin >



Install the Web Interface Advanced

Manual configuration instructions (Advanced User)

Note: You may skip this section if you configured your installation with the setup wizard at http://localhost/myamiweb/setup.

Copy config.php.template to config.php and edit the latter by adding these parameters:
"config.php" should be located in /var/www/html/myamiweb/ on CentOS and /srv/www/htdocs/myamiweb/ on SuSE.

  1. define site base path
    This should be changed if the myamiweb directory is located
    in a sub-directory of the Apache web directory.
    ex. myamiweb is in /var/www/html/applications/myamiweb/ then
    change to define('BASE_PATH',"applications/myamiweb");
     
    define('BASE_PATH',"myamiweb"); 
    

     
  2. myamiweb login system
    // Browse to the administration tools in myamiweb prior to 
    // changing this to true to populate DB tables correctly.
    define('ENABLE_LOGIN', false);    
    

     
  3. Administrator email title and email address
    define('EMAIL_TITLE', 'The name of your institute');
    define('ADMIN_EMAIL', "example@institute.edu");
    

     
  4. SMTP Server setup (not required but recommended):
    define('ENABLE_SMTP', false);
    define('SMTP_HOST', 'mail.institute.edu');    //your smtp host
    

     
    1. When SMTP server requires authentication
      // --- Check this with your email administrator -- //
      // --- Set it to true if your SMTP server requires authentication -- //
      define('SMTP_AUTH', false);
      
      // --- If SMTP_AUTH is not required(SMTP_AUTH set to false, -- //
      // --- no need to fill in 'SMTP_USERNAME' & SMTP_PASSWORD -- //
      define('SMTP_USERNAME', "");
      define('SMTP_PASSWORD', "");
      

       
  5. Setup your MySQL database server parameters:
    define('DB_HOST', "");        // DB Host name
    define('DB_USER', "");        // DB User name
    define('DB_PASS', "");        // DB Password
    define('DB_LEGINON', "");    // Leginon database name
    define('DB_PROJECT', "");    // Project database name
    

     
  6. Enable Image Cache
    If you want to use the caching function for faster image loading in the image viewer, change 'ENABLE_CACHE' to true.
    You can change 'CACHE_PATH' to another location, but make sure the apache user has write access to this folder.
    // --- Enable Image Cache --- //
    define('ENABLE_CACHE', true);
    // --- caching location --- //
    // --- please make sure the apache user has write access to this folder --- //
    define('CACHE_PATH', '/srv/www/cache/');
    define('CACHE_SCRIPT', WEB_ROOT.'/makejpg.php');
    

Additional parameters needed for Appion Installations only:

  1. Enable the processing plug-in by uncommenting the following line in the file `myamiweb/config.php`:
    addplugin("processing");
    

     
  2. IMAGIC and Other features:
    // Check if IMAGIC is installed and running, otherwise hide all functions
    define('HIDE_IMAGIC', false);
    
    // hide processing tools still under development.
    define('HIDE_FEATURE', true);
    

     
  3. Add processing host information
     
    Appion version 2.2 and later:
    The following code should be added and modified for each processing host available.
    $PROCESSING_HOSTS[] = array(
    'host' => 'LOCAL_CLUSTER_HEADNODE.INSTITUTE.EDU', // for a single computer installation, this can be 'localhost'    
    'nproc' => 32,  // number of processors available on the host, not used
    'nodesdef' => '4', // default number of nodes used by a refinement job
    'nodesmax' => '280', // maximum number of nodes a user may request for a refinement job
    'ppndef' => '32', // default number of processors per node used for a refinement job
    'ppnmax' => '32', // maximum number of processors per node a user may request for a refinement job
    'reconpn' => '16', // recons per node, not used 
    'walltimedef' => '48', // default wall time in hours that a job is allowed to run
    'walltimemax' => '240', // maximum hours in wall time a user may request for a job
    'cputimedef' => '1536', // default cpu time in hours a job is allowed to run (wall time x number of cpu's) 
    'cputimemax' => '10000', // maximum cpu time in hours a user may request for a job
    'memorymax' => '', // the maximum memory a job may use
    'appionbin' => 'bin/', // the path to the myami/appion/bin directory on this host
    'appionlibdir' => 'appion/', // the path to the myami/appion/appionlib directory on this host
    'baseoutdir' => 'appion', // the directory that processing output should be stored in
    'localhelperhost' => '', // a machine that has access to both the web server and the processing host file systems to copy data between the systems
    'dirsep' => '/', // the directory separator used by this host
    'wrapperpath' => '', // advanced option that enables more than one Appion installation on a single machine, contact us for info 
    'loginmethod' => 'SHAREDKEY', // Appion currently supports 'SHAREDKEY' or 'USERPASSWORD' 
    'loginusername' => '', // if this is not set, Appion uses the username provided by the user in the Appion Processing GUI
    'passphrase' => '', // if this is not set, Appion uses the password provided by the user in the Appion Processing GUI
    'publickey' => 'rsa.pub', // set this if using 'SHAREDKEY'
    'privatekey' => 'rsa'      // set this if using 'SHAREDKEY'
    );
    

     
    Appion version 2.1 and prior:
    // --- Please enter your processing host information associate with -- //
    // --- Maximum number of the processing nodes                                    -- //
    // --- $PROCESSING_HOSTS[] = array('host' => 'host1.school.edu', 'nproc' => 4); -- //
    // --- $PROCESSING_HOSTS[] = array('host' => 'host2.school.edu', 'nproc' => 8); -- //
    
    // $PROCESSING_HOSTS[] = array('host' => '', 'nproc' => );
    

     
  4. Microscope spherical aberration constant
     
    Not needed for Appion version 2.2 and later. Version 2.1 and earlier only:
    $DEFAULTCS = "2.0";
    

     
  5. Redux server information
     
    This is a very new feature, post Appion 2.2, so you only need this if you are running the trunk.
    The Redux server replaces the old mrc_php module and will be released in Appion version 3.0.
    Information on redux parameters to add to the config.php

We will not include the cluster registration now. It is covered in the last part of this document.


Go back to Install the Web Interface



Install UCSF Chimera

Versions of UCSF Chimera:

Chimera has two pages and five sections for downloading the UCSF Chimera program:

There are two main versions available for Linux: the normal version and the headless version.

Install UCSF Chimera for a desktop computer

If you plan to use the install computer also as a desktop computer (i.e., you want to open MRC files and manipulate them in UCSF Chimera), then you should install version 1.5.3, the version known to work both in desktop mode and in background mode for generating images without opening a window.

  1. Go to the download page at UCSF
  2. Download the Linux version for 32-bit machines or the Linux 64-bit version for 64-bit machines
  3. Save the file to your machine and go to the directory containing the file
  4. Make the downloaded file executable (newer versions of the installer end with .bin instead of .exe):
    chmod 755 chimera-1.5.3-linux_x86_64.exe
  5. Execute the UCSF Chimera installer
    ./chimera-1.5.3-linux_x86_64.exe
  6. Choose a location to install the files (e.g., /usr/local/chimera) and let it install all of its files
  7. Install Chimera globally by placing a symbolic link to the executable in /usr/local/bin
    ln -s /usr/local/chimera/bin/chimera /usr/local/bin/chimera
If the stable release does not work for background mode and produces snapshots that are badly offset, version 1.2509 is known to work; you may try that one.
  1. Go to the old downloads page at UCSF and search for 1.2509.

Install UCSF Chimera for a server

On May 6, 2010, the UCSF Chimera team released a working headless version of the program. The headless version runs the program but does not allow any interaction from the user. This version is ideal for servers, because it allows UCSF Chimera to create images of your molecule without having to install X Windows.

  1. Download the latest Daily Build of the headless version (hopefully a working release will be provided soon)
  2. Go to the main download page
  3. Select the Headless Linux 64-bit or Headless Linux version of UCSF Chimera under the Unsupported Releases section
  4. Download the executable
    http://www.cgl.ucsf.edu/chimera/cgi-bin/secure/chimera-get.py?file=linux_x86_64_osmesa/chimera-1.6.2-linux_x86_64_osmesa.bin
  5. Make the downloaded file executable:
    chmod 755 chimera-1.6.2-linux_x86_64_osmesa.bin
  6. Execute the UCSF Chimera installer
    ./chimera-1.6.2-linux_x86_64_osmesa.bin
  7. Choose a location to install the files (e.g., /usr/local/chimera) and let it install all of its files
  8. Install Chimera globally by placing a symbolic link to the executable in /usr/local/bin
    ln -s /usr/local/chimera/bin/chimera /usr/local/bin/chimera

Test UCSF Chimera

The only way to test if UCSF Chimera is working within Appion is to have Appion completely installed.


< Install Xmipp | Install Grigorieff lab software >



Install Web Server Prerequisites

The myamiweb files are mostly php scripts that run on the web server. The PHP, PHP-devel, gd, and fftw3 packages are required before installing myamiweb and the mrc extension that handles the display of mrc files. Some of these packages may be found on the SuSE Linux DVD or in common package repositories. MySQL and the Apache Web Server can be downloaded from their respective websites.

Install Supporting Packages

Prerequisite packages for myamiweb:

| Name | Download site | yum package name | SuSE rpm name |
|------|---------------|------------------|---------------|
| Apache | www.apache.org | httpd | apache2 |
| php | www.php.net | php | php |
| php-devel* | rpmfind.net/linux/RPM/Development_Languages_PHP.html | php-devel | php-devel |
| php-mysql* | rpmfind.net/linux/RPM/Development_Languages_PHP.html | php-mysql | php-mysql |
| php-gd | www.php.net/gd (use gd2) | php-gd | php-gd |
| fftw3 library (including development libraries and headers*) | www.fftw.org (use fftw3.x) | fftw3-devel | fftw3-devel |
| libssh2 development libraries | http://www.libssh2.org | libssh2-devel | |
| phpMyAdmin (optional) | http://www.phpmyadmin.net | phpMyAdmin | |
| GCC, the GNU Compiler Collection | http://gcc.gnu.org | gcc | |
| Apache SSL module | | mod_ssl | |
#CentOS example:
sudo yum install \
  php-gd gcc phpMyAdmin libssh2-devel php-pecl-ssh2 \
  mod_ssl httpd php-mysql php-devel php fftw3-devel

Note: There are additional requirements for the Redux image server

Notes:


< Differences between Linux flavors | Configure php.ini >



Install Xmipp

Install documentation at Xmipp

The Biocomputing Unit at the Spanish National Center of Biotechnology (CNB-CSIC) provides detailed documentation on how to install Xmipp on various systems. Below we cover our way of getting it working on your system.

Install supporting packages

Install these development packages (the package names are the same on CentOS, Fedora, and SuSE unless noted): gcc-c++, openmpi-devel, libtiff-devel, libjpeg-devel (libjpeg-turbo-devel on Fedora), zlib-devel.

Install Xmipp from source

We recommend installing Xmipp from source to properly use the openmpi libraries, which allow you to run on multiple processors.

Download source code

Alternatively, you may download from the svn repo:

svn co http://newxmipp.svn.sourceforge.net/svnroot/newxmipp/tags/release-2.4/xmipp/ Xmipp-2.4-src

As of Feb 2012, this was required to compile the 2.4 source code.

Prepare Xmipp make files

Note: If you can not find the openmpi directory, make sure you have installed the openmpi package. The installation on CentOS using yum is: yum -y install openmpi-devel.

Compile the source code

CentOS 5

export PATH=$PATH:/usr/lib64/openmpi/1.3.2-gcc/bin
./scons.configure \
 MPI_LIBDIR=/usr/lib/openmpi/1.2.7-gcc/lib/ \
 MPI_INCLUDE=/usr/lib/openmpi/1.2.7-gcc/include/ \
 MPI_LIB=mpi

CentOS 6

Note: If you are installing Xmipp on x86_64 CentOS 6, you can use the following commands instead.

export PATH=/usr/lib64/openmpi/bin:$PATH
./scons.configure \
 MPI_LIBDIR=/usr/lib64/openmpi/lib \
 MPI_INCLUDE=/usr/lib64/openmpi/include \
 MPI_LIB=mpi
The configure step should report "* Checking for MPI ... yes". Then run:
./scons.compile
sudo mv -v Xmipp-2.4-src /usr/local/Xmipp

Setup environment variables

You may need to log out and log back in for these changes to take effect, or source the environment script:

 source /etc/profile.d/xmipp.sh
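The contents of /etc/profile.d/xmipp.sh are not shown above; the following is a minimal sketch, assuming Xmipp was moved to /usr/local/Xmipp as in the compile step (the variable name and paths are assumptions, not the official script):

```shell
# write a candidate xmipp.sh to the current directory for inspection;
# copy it to /etc/profile.d/ once the paths are confirmed for your install
cat > xmipp.sh <<'EOF'
export XMIPPDIR=/usr/local/Xmipp
export PATH=$XMIPPDIR/bin:$PATH
export LD_LIBRARY_PATH=$XMIPPDIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
EOF
. ./xmipp.sh
echo "$XMIPPDIR"    # prints /usr/local/Xmipp
```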

Test Xmipp

Test Xmipp by running ml_align2d program

xmipp_ml_align2d -h

This result should appear
2104:Argument -i not found or invalid argument
File: libraries/data/args.cpp line: 502
Usage:  ml_align2d [options] 
   -i <selfile>                : Selfile with input images 
   -nref <int>                 : Number of references to generate automatically (recommended)
   OR -ref <selfile/image>         OR selfile with initial references/single reference image 
 [ -o <rootname> ]             : Output rootname (default = "ml2d")
 [ -mirror ]                   : Also check mirror image of each reference 
 [ -fast ]                     : Use pre-centered images to pre-calculate significant orientations
 [ -thr <N=1> ]                : Use N parallel threads 
 [ -more_options ]             : Show all possible input parameters 


< Install SPIDER | Install UCSF Chimera >



Instructions for installing CentOS on your computer

Why CentOS?

If you have a new computer(s) for your Leginon/Appion installation, we recommend installing CentOS because it is considered to be more stable than other varieties of Linux.

CentOS is the same as Red Hat Enterprise Linux (RHEL), except that it is free and supported by the community.

We have the most experience with installation on CentOS, and this installation guide has specific instructions for the process.

see Linux distribution recommendation for more.

Download the ISO disk of CentOS 5.x

Latest version tested at NRAMM: CentOS 5.8

Note: All formally released versions of Appion (versions 1.x and 2.x) run on CentOS 5.x. Appion developers, please note that the development branch of Appion is targeting CentOS 6.x and Appion 3.0 will run on CentOS 6.x.

  1. ISO files are available at
    1. http://wiki.centos.org/Download
    2. http://mirrors.kernel.org/centos/
  2. Click on i386 for 32bit machines or x86_64 for 64bit machines
  3. Pick a mirror and download the 'CentOS-5.8-i386-bin-DVD-1of2.iso' file

Confirm download went correctly

Perform a SHA1SUM confirmation:

sha1sum CentOS-5.8-i386-bin-DVD-1of2.iso

The result should be the same as in the sha1sum file provided by CentOS. This is found at the same location you downloaded the .iso file.
For example:
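Since the example output is not reproduced here, the verification pattern can be demonstrated with a throwaway file (demo.iso and sha1sum.txt are placeholders for the real ISO and CentOS's published checksum file):

```shell
# create a throwaway file and a checksum list, then verify it the same way
# you would verify the real ISO against the published sha1sum file
printf 'hello\n' > demo.iso
sha1sum demo.iso > sha1sum.txt
sha1sum -c sha1sum.txt    # prints "demo.iso: OK" when the hash matches
```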

Burn ISO file to DVD disk

Use dvdrecord in Linux to burn the disk.

dvdrecord -v -dao gracetime=10 dev=/dev/dvd speed=16 CentOS-5.8-i386-bin-DVD-1of2.iso 

Install CentOS with default packages

Add yourself to the sudoers file

Note: This step is optional, however you will need root access to complete the Appion Installation.

Make sure you have root permission.
Open the file in an editor. ex. vi /etc/sudoers
Look for the line: root ALL=(ALL) ALL.
Add this line below the root version:

your_username ALL=(ALL)       ALL

Logout and log back in with your username.

The CentOS installation is complete.


< Select Linux distribution to use | Download additional Software >



Instructions for installing Fedora on your computer

Why Fedora?

If you have a new computer(s) for your Leginon/Appion installation, Fedora is a cutting-edge system that has all the latest and greatest features.
While Fedora is great for a desktop computer, it can be a hassle for a server. Fedora recommends that you upgrade the computer every 6 months and all but requires an upgrade every year. With servers, you want things to remain the same for longer periods of time, and that is not Fedora: it always has the latest versions. If you want something more stable, we recommend installing CentOS. See Instructions for installing CentOS on your computer.

Fedora is a cutting edge distribution produced by a community of programmers that is maintained by Red Hat.

Download the DVD iso disk of Fedora 13

  1. DVD iso files are available at
    1. http://fedoraproject.org/en/get-fedora-options
  2. In the section, Fedora 13 DVD, click on i386 for 32bit machines or x86_64 for 64bit machines
  3. Download the 64bit 'Fedora-13-x86_64-DVD.iso' file
    wget -c http://download.fedoraproject.org/pub/fedora/linux/releases/13/Fedora/x86_64/iso/Fedora-13-x86_64-DVD.iso
    
  4. Download the 32bit 'Fedora-13-i386-DVD.iso' file
    wget -c http://download.fedoraproject.org/pub/fedora/linux/releases/13/Fedora/i386/iso/Fedora-13-i386-DVD.iso
    

Confirm download went correctly

Perform a SHA256SUM confirmation:

sha256sum Fedora-13-x86_64-DVD.iso
dab657e1475832129ea3af67fd1c25c91b60ff1acc9147fb9b62cef14193f8d2

The result should be the same as in the sha256sum file provided by Fedora.

Burn DVD iso file to DVD disk

Use dvdrecord in Linux to burn the disk.

dvdrecord -v -dao gracetime=10 dev=/dev/dvd speed=16 Fedora-13-x86_64-DVD.iso

/dev/dvd may not link to your DVD drive; on one machine it was called /dev/dvd1. Do an ls /dev/dvd* to look for alternate names.

Install Fedora 13 with default packages

Note: In one case we had to use the option "Install with basic video driver", because after doing the normal install the screen went blank and it was not usable.

Add yourself to the sudoers file

Make sure you have root permission.
Open the file in an editor. ex. vi /etc/sudoers
Look for the line:

root    ALL=(ALL)     ALL
Add this line below the root version:
username    ALL=(ALL)     ALL

Logout and log back in with your username.

Your Fedora installation is complete.


Download additional Software (Fedora Specific) >



Instruments

The Instruments tool is for use with Leginon, Appion's sister image capture software.
If you are using Leginon, you may find more information about Instruments in the Leginon user manual.


< Users | Default Settings >



Issue Workflow Tutorial

There are several types of Trackers available:

  1. Bug
  2. Feature
  3. Support
  4. Task
  5. Test Case

All fall under the general category of Issue. For each Tracker type, there is the ability to set a status. The status types available vary for each Tracker type.

Bug and Feature work flow

The normal status work flow for a Bug or Feature request is New -> Assigned -> In Code Review -> In Test -> Closed.

New - The issue has been created, and there is not a particular person assigned to address the issue. In this case, the Issue Administrator (currently Amber) will review the new issue and assign it to the appropriate person.

Assigned - The issue has been assigned to someone. That person is responsible for addressing the issue. If it is a Bug, this person will fix it. If it is a feature, this person will implement it. This person will also indicate in the issue how their changes should be tested.

In Code Review - The person responsible for fixing or implementing the feature has completed the job and has checked the code into subversion. It is now ready for a code review. The person the issue was assigned to selects another person to perform a code review. The Assigned To field of the issue is changed to the person who will perform the code review.

In Test - The code has been reviewed and any potential problems have been addressed. Someone other than the person who implemented the code change is assigned to test the change. The person who implemented the change should indicate which Test Cases can be used to test the code changes.

Closed - All testing of the code change is complete and successful.

There are several cases where the normal work flow will not apply:

Duplicate - indicates that there is already an issue addressing the same topic. In this case, be sure to make a reference to the existing issue.
Wont Fix / Wont Do - indicates that the bug or feature request will be ignored. Please provide a detailed explanation for making this choice.

Good guide from Drupal, we could incorporate


Javascript notes

Improved performance: Newbie Pitfalls:

Job submission vs direct Appion Script running from terminal

We have two schools of developers.
  1. Some have only a single-node computer and want to run an appionScript without going through a resource manager, constructing the Appion script and its options directly on the terminal. This means that appionScript needs to be able to manage the resource itself. These users would use Show Commands in myamiweb.
  2. Some want to move all Appion job running through job submission, so that resource managers are used on a large cluster of nodes. These users would use Submit Job in myamiweb.

This wiki page organizes my (Anchi's) findings on which modules do what, how options are passed and set, and what is logged into the database and when, in the two different ways of running Appion.

Show Command path (Bold are not performed in BasicScript)

"__init__()"

  1. createDefaultStats
  2. set self.functionname as basename of
    sys.argv[0]
  3. set self.appiondir
  4. setParams
    1. setupGlobalParseOptions (I have not found any appion script not doing this step even though it can be turned off)
    2. setupParserOptions (subclasses overwrite appionScript base class that adds stackid).
    3. convertOptions to self.params
  5. set processing database using projectid (don't see how we can have a situation without projectid any more).
  6. check conflicts
    1. checkConflicts from subclasses of appionScript
    2. checkGlobalConflicts
  7. set up run directory
  8. start pool of threads (set to 2 now and will time out if not used in 10 seconds)
  9. writeFunctionLog
  10. uploadScriptData
    1. ProgramName
    2. UserName
    3. Host
    4. ProgramRun (need cluster job id)
      1. query for ApAppionJobData by
        1. path = self.params['rundir']
        2. jobtype = self.functionname
      2. insert if not exist
      3. set ApAppionJobData to Run
    5. ParamName and ParamValues

start()

close()
  1. closeFunctionLog
  2. set ApAppionJobData to Done


Jumpers SubStack

This option allows you to exclude particles that are referred to as Euler jumpers. An Euler jumper is a particle whose orientation changes from iteration to iteration. Using this method you can clean your dataset of particles that do not converge to a single Euler angle assignment.

<More Stack Tools | Particle Alignment >


Linux distribution recommendation

We list our experience and current progress here.

Our Preference : CentOS 5.x

If you have a new computer(s) for your Leginon/Appion installation, we recommend installing CentOS because it is considered to be more stable than other varieties of Linux.

CentOS is the same as Red Hat Enterprise Linux (RHEL), except that it is free and supported by the community.

We have most experience in the installation of the supporting packages on CentOS and this installation guide has specific instruction for the process.

Start at Instructions for installing CentOS on your computer.

Other known cases of success:

Fedora 10

Start at Instructions for installing Fedora on your computer

SuSE, Ubuntu

MacOS


Logout

After logging into the system, from any page you may press the [Logout] button at the top left corner of the screen.
This will end your session with the Appion and Leginon Tools.


< Modify Your Profile | User Management ^



LOI - Leginon Observer Interface

The Leginon Observer Interface is a tool to view images being collected from a Microscope in real time. This is used for Leginon installations only and may be ignored in Appion only installations.


< Image Viewers | Tomography Tool >



Make a New Stack

This option allows you to create a stack of the picked particles. See Stack Creation.

<More Stack Tools | Particle Alignment >


Manual Masking

Manual masking is used to mask out regions of crud on micrographs so that particles picked in the masked-out regions will not be used in subsequent processing (e.g., stack creation).

General Workflow:

The user selects one of the following options:
  1. Assess run of the same name (default)
  2. Combined run with existing assessment run (rarely used now)
  3. Assess run under a new name
Next:
  1. The user clicks on either "show command" or "run manual masking", which will then show the command
  2. The user will need to launch the program from a terminal
  3. The interface of manual masking is similar to the manual picker or tilt picker, except that instead of picking particles, the user draws polygons to define the mask
  4. Another mask region can be selected after the first polygon region is added to the mask
  5. The mask is saved when the user clicks "Forward" to proceed to the next image

Output:

  1. The masking results can be used as a filter when making a stack of particles
  2. The user chooses the desired masking session to apply for filtering out unwanted particle picks

Notes, Comments, and Suggestions:



< Region Mask Creation



Manual Picking

The manual particle picker allows the user to select targets by eye. This can be extremely time consuming. However, if no starting model is available, or the desired particles are present at a very low concentration, it is sometimes worthwhile to spend some time selecting particles manually. After several hundred particles have been collected, a preliminary initial model or 2D averages can be generated and used as templates. Use this option if you want to select particles manually from scratch or to edit particle picks made by Dog Picking or Template Picking.

General Workflow:

  1. You have to run this job from a command line.
  2. The images will be preprocessed, making particle detection easier. Usually the defaults work extremely well!
  3. Simply click on the desired particles or remove a region of wrongly selected objects by clicking around the area.

Notes, Comments and Suggestions:

To run the user interface for manual picking, you will be asked to copy and paste the command into a terminal. If you are connecting to a processing server, you may need to ssh with a -X flag to enable display.


< Particle Selection | CTF Estimation >



Minor release update instructions

You can check the Files tab for updated minor release versions of your installed release. These will include any critical bug fixes that have been addressed since the original release.

You may update by either downloading a released tar file or doing an svn update if your original installation was via svn checkout. To do the svn update, simply change directories to your myami installation and run:

svn update

Update the Appion Packages

Install all the myami python packages except appion using the following script:

cd /your_download_area/myami
sudo ./pysetup.sh install

That will install each package, and report any failures. To determine the cause of failure, see the generated log file "pysetup.log". If necessary, you can enter a specific package directory and run the python setup command manually. For example, if sinedon failed to install, you can try again like this:

cd /your_download_area/myami/sinedon
sudo python setup.py install

Install the Appion python package

Important: You need to install the current version of the Appion packages to the same location where you installed the previous version. You may have used the flag shown below (--install-scripts=/usr/local/bin) in your original installation. If you did, you need to use it this time as well. You can check whether you installed your packages there by browsing to /usr/local/bin and looking for ApDogPicker.py. If the file is there, you should use the flag. If the file is not there, remove the flag from the command so Appion installs to the default location.

The pysetup.py script above did not install the appion package. Since the appion package includes many executable scripts, it is important that you know where they are being installed. To prevent cluttering up the /usr/bin directory, you can specify an alternative path, typically /usr/local/bin, or a directory of your choice that you will later add to your PATH environment variable. Install appion like this:

cd /your_download_area/myami/appion
sudo python setup.py install --install-scripts=/usr/local/bin 

Update the web interface

Copy the entire myamiweb folder found at myami/myamiweb to your web directory (ex. /var/www/html). You may want to save a copy of your old myamiweb directory first.

cp -rf myamiweb /var/www/html

Run Database Update Script

Running the following script will indicate if you need to run any database update scripts.

cd /your_download_area/myami/dbschema
python schema_update.py

This will print out a list of commands to paste into a shell which will run database update scripts.
You can re-run schema_update.py at any time to update the list of which scripts still need to be run.


Modify Your Profile

Once you have successfully logged into the system, you may edit your profile:
  1. click on the [Profile] button in the top left corner of the Appion and Leginon Tools start page as shown below:
     
    Edit Profile Button
     
  2. edit your information as shown in the following My Profile screen
  3. When changing your password, check the box that says Change Password.
  4. Press the Update button
     
    My Profile
    My Profile Screen

< Retrieve Forgotten Password | Logout >



More Stack Tools

Appion Tools for Manipulating Stacks:

  1. Make a New Stack
  2. Combine Stacks
  3. View Stacks
  4. Alignment SubStacks
  5. Jumpers SubStack
  6. Convert Stack into Particle Picks

Notes, Comments, and Suggestions:



<Stacks | Particle Alignment >



Mount NFS on a Mac

sudo mkdir -p /ami/amishare
sudo mount -o resvport -t nfs colossus.scripps.edu:/export/amishare /ami/amishare


Multi Img Assessment

The user is able to assess multiple images at one time on the web by using the "Multi Img Assessment" tool.

General Workflow:

Input:
  1. Like the single Img Assessment, the user assigns a status to the current image, whether to keep (check) the image or to reject (cross) it
  2. The difference is that the user is presented with multiple images at the same time
  3. Upon selecting "Keep", "Reject" or "None" (Default) for all images on the same page, the user will click the forward (arrow pointing to the right) button for the next set of images
  4. The user can assess the images based on particle picks or raw micrographs (the load time for raw micrographs is longer)
Output:
  1. Once the images are assessed, the result can be viewed by clicking the link right next to "Img Assessment"
  2. The result of the assessment can be used for limiting images used for subsequent processing (e.g. making stack)

Notes, Comments, and Suggestions:

  1. Assessment can be resumed from a previous session, and the user can choose to show only unassessed images.
  2. Multi Img Assessment is faster, and users usually use it for checking RCT tilt picking results.


< Web Img Assessment


Myami code diagram


WARNING: This is a preliminary document. Use at your own risk!

myami on Ubuntu

This is mainly a log of various experiences installing myami on Ubuntu, which will gradually evolve into a more formal document. It will attempt to demonstrate the installation of all of myami, including both Leginon and Appion, on a single Ubuntu host. This also includes running MySQL, Apache, Torque, etc. on the local host without needing to connect to any other host.

Installing a basic Ubuntu system

We use Ubuntu 12.04 LTS (Precise) Desktop 64 bit. Install a basic system from an image on CD or bootable USB drive. Use the default selections during the install process: no 3rd party repositories, no network configuration, and no updates during the installation. This makes it easier in the remainder of this document to know that we have started from a known base system. Reboot.

Following the initial reboot:

You now have a basic up-to-date Ubuntu system

Installing additional packages required by myami

There are many additional packages to install, but we can condense the list to the smallest set necessary to pass to the install command, which will then figure out all the other dependencies.

Here is a single command to install all the necessary packages:

sudo apt-get install subversion vim python-mysqldb python-wxgtk2.8 python-fs mysql-server php5 php5-mysql libssh2-php php5-gd

During installation, you will be prompted several times to create a MySQL root password. This can optionally be left blank at the expense of less security. Note: installing the vim text editor is my own preference... use an inferior text editor if you wish :)

The Scipy package is also required, but the current version 0.9.0 that comes with Ubuntu 12.04 is broken. You need to grab the more recent version of Scipy from Ubuntu Quantal (development version). To make a clean installation that will not confuse the package manager we have prepared a mini-repository that includes this new scipy. You can download and install scipy using the following set of commands copied into your terminal.

sudo mkdir /usr/local/share/myami

That should get your password into the sudo cache so you can copy the rest of these without entering a password:

cd /usr/local/share/myami
sudo wget http://ami.scripps.edu/redmine/attachments/download/1488/ubuntu-scipy-0.10.1.tar.bz
sudo tar jxf ubuntu-scipy-0.10.1.tar.bz
sudo sh -c 'echo "deb file:///usr/local/share/myami/ubuntu-scipy-0.10.1 ./" > /etc/apt/sources.list.d/myami.list'
sudo apt-get update
sudo apt-get install python-scipy

Make sure that last line executes (you may have to hit enter). Also, you may have to confirm installing without authentication.

Setting up MySQL databases

Setting up Apache web server

create /etc/php5/conf.d/myami.ini with the following contents:

error_reporting = E_ALL & ~E_NOTICE & ~E_WARNING
display_errors = On
register_argc_argv = On
short_open_tag = On
max_execution_time = 300
max_input_time = 300
memory_limit = 256M

Restart apache:

sudo service apache2 restart

Install myami

Optional additions


Mysql Nested Subqueries Problem

This is the query that I wanted to use because it is easy to understand (and therefore easier to maintain). Unfortunately, there is a bug in MySQL (http://bugs.mysql.com/bug.php?id=10312) which makes this query so slow that no one has seen it complete. The subsequent query is a bit more difficult to follow, but it works around the problem.

SELECT DB_PROJECT.projectexperiments.projectId 
FROM DB_PROJECT.projectexperiments 
WHERE DB_PROJECT.projectexperiments.name IN 
( 
    SELECT DB_LEGINON.SessionData.name 
    FROM DB_LEGINON.SessionData 
    WHERE DB_LEGINON.SessionData.`DEF_id` IN 
    ( 
        SELECT DB_PROJECT.shareexperiments.`REF|leginondata|SessionData|experiment` 
        FROM DB_PROJECT.shareexperiments 
        WHERE DB_PROJECT.shareexperiments.`REF|leginondata|UserData|user` = ".$userId." 
    )
);

The equivalent query rewritten with INNER JOINs:

SELECT DB_PROJECT.projectexperiments.projectId 
FROM DB_PROJECT.projectexperiments 
INNER JOIN 
(
    SELECT DB_LEGINON.SessionData.name AS SessionName 
    FROM DB_LEGINON.SessionData 
    INNER JOIN 
    (         
        SELECT DB_PROJECT.shareexperiments.`REF|leginondata|SessionData|experiment` AS SessionId 
        FROM DB_PROJECT.shareexperiments 
        WHERE DB_PROJECT.shareexperiments.`REF|leginondata|UserData|user` = ".$userId." 
    ) AS SessionIds 
    ON DB_LEGINON.SessionData.`DEF_id` = SessionIds.SessionId 
) AS SessionNames 
ON DB_PROJECT.projectexperiments.name = SessionNames.SessionName;

See another example here.


MySQL Notes

restart mysql on centOS

 /etc/init.d/mysqld restart 

Find out which installation is being used

which mysql

Get access to our databases

To get access to our databases, you need to ask Christopher (or someone else with the correct privileges) to add you as a user.
Our databases reside on Cronus4 (Appion and Leginon things), ami (websites and tools like Redmine), and Fly (a copy of Cronus4 used for testing).
Once added, you will need to change your password from the default to something all your own on each server that you are added to. Here's how:

At a terminal type:

mysql -h cronus4 -u [yourUserName] -p

You are prompted to type your default password. Then, change your password with:

set password = password("[new password]");

Other MySQL commands to try

show databases;
use [name of db]
show tables;
describe [name of table];

To stop a query that is taking too long:

mysql> show processlist;

mysql> kill [process Id number];

Mysql bug with nested subqueries

Mysql Nested Subqueries Problem


New User Registration

To register as a new user:
  1. Press the [register] button in the top left corner of the Login Screen
  2. Fill in all fields
  3. Press the Apply button

An email confirmation will be sent to the user.

Myamiweb Registration Screen
Myamiweb Registration Screen

NOTE:
If your Appion system is installed on multiple servers, each Appion user's Web Server and Processing Server user names and passwords should be identical.


< Enable User Authentication | Retrieve Forgotten Password >



Notes from Recon meeting

Moving forward, refinements will all be split into 2 steps, prep and run.

Prepare refine

When the user selects to prep a refinement, a web form is provided to select the:
  1. refinement method - eman, xmipp, frealign, etc...
  2. stack
  3. model
  4. run parameters - runname, rundir, description
  5. stack prep params - lp, hp, last particle, binning

The web server then calls prepRefine.py located on the local cluster to prepare the refinement.

Run Refine

When the user selects to run a prepared refinement, a web form is provided to select the:
  1. prepped refine
  2. cluster parameters - ppn, nodes, walltime, cputime, memory, mempernode
  3. refine params, both general and method specific
The web server will then:
  1. verify the cluster params by checking default_cluster.php
  2. if needed, copy the stack and model to a location that can be accessed by the selected cluster
  3. verify the user is logged into the cluster
  4. pass the refine command to runJob.py (extended from the Agent class), located on the remote cluster
runJob.py will:
  1. set the job type which was passed in the command
  2. create an instance of the job class based on the job type
  3. create an instance of the processing host class
  4. launch the job via the processing host
  5. update the job status in the appion database (do we have db access from the remote cluster?)

Object Model

Processing Host

Each processing host (eg. Garibaldi, Guppy, Trestles) will define a class extended from a base ProcessingHost class.
The extended classes know what headers need to be placed at the top of job files and they know how to execute a command based on the specific clusters requirements.
The base ProcessingHost class could be defined as follows:

class ProcessingHost(object):  # abstract base class
    def generateHeader(self, jobObject):  # abstract, extending classes define this; returns a string
    def executeCommand(self, command):  # abstract, extending classes define this
    def createJobFile(self, header, commandList):  # defined in base class; commandList is a 2D array, each row is a line in the job file
    def launchJob(self, jobObject):  # defined in base class; jobObject is an instance of the job class specific to the jobtype we are running
        header = self.generateHeader(jobObject)
        jobFile = self.createJobFile(header, jobObject.getCommandList())
        self.executeCommand(jobFile)
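As a concrete (and hypothetical) illustration of the design above, a Torque-style host might fill in the two abstract methods as sketched below. The class name TorqueHost and the header format are assumptions for illustration, not actual Appion code, and the base class is repeated in minimal form so the sketch is self-contained:

```python
class ProcessingHost(object):
    """Minimal stand-in for the base class described above."""
    def generateHeader(self, job):
        raise NotImplementedError
    def executeCommand(self, jobFile):
        raise NotImplementedError
    def createJobFile(self, header, commandList):
        # each row of the 2D commandList becomes one line of the job file
        return header + '\n'.join(' '.join(row) for row in commandList) + '\n'
    def launchJob(self, job):
        jobFile = self.createJobFile(self.generateHeader(job), job.getCommandList())
        return self.executeCommand(jobFile)

class TorqueHost(ProcessingHost):
    """Hypothetical PBS/Torque host: emits #PBS directives at the top of the job file."""
    def generateHeader(self, job):
        return ('#PBS -N %s\n#PBS -l nodes=%d:ppn=%d\n#PBS -l walltime=%s\n'
                % (job.name, job.nodes, job.ppn, job.walltime))
    def executeCommand(self, jobFile):
        # a real host would write the file to disk and run "qsub <file>";
        # returning the text keeps the sketch runnable without a cluster
        return jobFile
```

A different cluster would only need its own generateHeader() and executeCommand(); launchJob() stays in the base class.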

Job

Each type of appion job (eg Emanrefine, xmipprefine) will define a class that is extended from a base Job class.
The extending classes know parameters that are specific to the job type and how to format the parameters for the job file.
The base Job class could be defined as follows:

class Job(object):
    # attributes: commandList, name, rundir, ppn, nodes, walltime, cputime, memory, mempernode
    def __init__(self, command):  # constructor takes the command (runJob.py --runname --rundir ....)
        self.commandList = self.createCommandList(command)
    def createCommandList(self, command):  # defined by sub classes, returns a commandList which is a 2D array where each row corresponds to a line in a job file
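As a hypothetical illustration of a job subclass (the class name EmanJob and the command rows are assumptions, not actual Appion code), an EMAN-style job might build its 2D command list like this, with the base class repeated minimally so the sketch is self-contained:

```python
class Job(object):
    """Minimal stand-in for the base Job class described above."""
    def __init__(self, command):
        # subclasses supply createCommandList()
        self.commandList = self.createCommandList(command)

class EmanJob(Job):
    """Hypothetical EMAN refinement job."""
    def createCommandList(self, command):
        # each row becomes one line of the job file; a real implementation
        # would translate the parameters into the actual EMAN calls
        return [['cd', '$PBS_O_WORKDIR'],
                command.split()]
```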

Agent

There will be an Agent class that is responsible for creating an instance of the appropriate job class and launching the job.
It will be implemented as a base class, where sub classes may override the createJobInst() function. For now, there will be only one sub class defined
called RunJob. The same runJob.py will be installed on all clusters. This implementation will allow flexibility for the future.
The base Agent class may be defined as follows:

class Agent(object):
    def main(self, command):
        jobType = self.getJobType(command)
        job = self.createJobInst(jobType, command)
        processHost = ProcessingHost()
        jobId = processHost.launchJob(job)
        self.updateJobStatus()
    def getJobType(self, command):  # parses the command to find and return the jobtype
    def createJobInst(self, jobType, command):  # sub classes must override this to create the appropriate job class instance
    def updateJobStatus(self):  # not sure about how this will be defined yet

Sub classes of Agent will define the createJobInst() function.
We could create a single subclass that creates a job class for every possible appion job type.
(We could make a rule that job sub classes are named after the jobtype with the word "Job" appended; then this function would never need to be modified.)
A sample implementation is:

class RunJob(Agent):
    def createJobInst(self, jobType, command):
        if jobType == "emanrefine":
            job = EmanJob(command)
        elif jobType == "xmipprefine":
            job = XmippJob(command)
        return job
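The naming rule suggested above (job class name = jobtype with "Job" appended) would let createJobInst() replace the per-type branches with a generic lookup. A minimal sketch with stand-in job classes (the class names here are hypothetical):

```python
class EmanrefineJob(object):
    """Stand-in for an EMAN refinement job class."""
    def __init__(self, command):
        self.command = command

class XmipprefineJob(object):
    """Stand-in for an Xmipp refinement job class."""
    def __init__(self, command):
        self.command = command

def createJobInst(jobType, command):
    # "emanrefine" -> "EmanrefineJob"; adding a new job type needs no change here
    jobClass = globals().get(jobType.capitalize() + 'Job')
    if jobClass is None:
        raise ValueError('unknown job type: %s' % jobType)
    return jobClass(command)
```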


Object Oriented Programming

This page provides details of the major features of object oriented programming and definitions of terminology.

Objects

Encapsulation

Inheritance

Polymorphism

PHP example

interface IAnimal {
    function getName();
    function talk();
}

abstract class AnimalBase implements IAnimal {
    protected $name;

    public function __construct($name) {
        $this->name = $name;
    }

    public function getName() {
        return $this->name;
    }
}

class Cat extends AnimalBase {
    public function talk() {
        return 'Meowww!';
    }
}

class Dog extends AnimalBase {
    public function talk() {
        return 'Woof! Woof!';
    }
}

$animals = array(
    new Cat('Missy'),
    new Cat('Mr. Mistoffelees'),
    new Dog('Lassie')
);

foreach ($animals as $animal) {
    echo $animal->getName() . ': ' . $animal->talk();
}

Python Example

class Animal:
    def __init__(self, name):    # Constructor of the class
        self.name = name
    def talk(self):              # Abstract method, defined by convention only
        raise NotImplementedError("Subclass must implement abstract method")

class Cat(Animal):
    def talk(self):
        return 'Meow!'

class Dog(Animal):
    def talk(self):
        return 'Woof! Woof!'

animals = [Cat('Missy'),
           Cat('Mr. Mistoffelees'),
           Dog('Lassie')]

for animal in animals:
    print animal.name + ': ' + animal.talk()

# prints the following:
#
# Missy: Meow!
# Mr. Mistoffelees: Meow!
# Lassie: Woof! Woof!
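A side note on the Python example: raising NotImplementedError is the convention shown above, but modern Python (3.4+, an assumption beyond the Python 2 syntax used here) can enforce the same contract at instantiation time with the standard library's abc module:

```python
from abc import ABC, abstractmethod

class Animal(ABC):
    def __init__(self, name):
        self.name = name

    @abstractmethod
    def talk(self):
        """Subclasses must implement talk()."""

class Cat(Animal):
    def talk(self):
        return 'Meow!'

# Animal('x') would raise TypeError here, before talk() is ever called
```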

OTR Volume

The orthogonal tilt reconstruction method is an approach to generating single-class volumes with no missing cone for ab initio reconstruction of asymmetric particles (Leschziner & Nogales, 2005). The method involves collecting data at +45° and −45° tilts and only requires that particles adopt a relatively large number of orientations on the grid. One tilted data set is used for alignment and classification and the other set—which provides views orthogonal to those in the first—is used for reconstruction, resulting in the absence of a missing cone.

General Workflow:

1. Requirements:

  1. Run either Auto Align Tilt Pairs or Align and Edit Tilt Pairs
  2. Create two stacks: 1) Consist of only your first exposure particles and 2) Consist of only your second exposure particles
  3. Perform an alignment or classification run of your second-exposure particle stack

2. Step by Step Guide:

There are two general methods to run OTR Volume, just as one would run RCT Volume:

  1. From class averages after running alignment or classification:
    1. By selecting a desired class average (using the "select" selection mode) and then clicking on the button "Create OTR Volume", the user will be taken to the "Run OTR Volume" launch page.
    2. On this page, the user provides a brief description of the run, selects the stack corresponding to the one on which the alignment or classification was performed, and selects the mask radius.
    3. The other options are left at their defaults.
  2. From the menu tab under Ab Initio Reconstruction:
    1. By clicking on the OTR Volume link, the user will be taken to the "Run OTR Volume" launch page.
    2. In this case, since the desired class average is not defined, the user has to choose the alignment or classification run and specify the class average number from which to reconstruct the OTR volume.
    3. Once that is done, the rest of the options are exactly the same as when launching OTR Volume from class averages in Method 1.

3. Viewing OTR Volume Output Pages:

Notes, Comments, and Suggestions:


< Ab Initio Reconstruction | Refine Reconstruction >



Particle Alignment

Regardless of the method eventually utilized for 3D reconstruction, a good starting point for single particle EM investigations is 2D alignment and classification of the dataset. This type of analysis intimately acquaints the scientist with the types of particles, distribution of views, and the relative amount of "junk" contained in the dataset.

General Workflow:

  1. Run Alignment: Align particles using reference-free or reference-based approaches.
  2. Run Feature Analysis: Obtain a mathematical descriptor of the variance within the dataset.
  3. Run Particle Clustering: Sort the dataset into classes according to an interimage variance metric.

Appion SideBar Snapshot:

Appion Composite Output Snapshot:

Notes, Comments, and Suggestions:


< Stacks | Ab Initio Reconstruction >



Particle Selection

The first step in single particle analysis is to pick the particles within the micrographs. There are three main ways to do this, all of which are integrated within Appion. Based on the shape of the particle, the prior knowledge, and the amount of data collected, the user has to decide which approach is best, or use several approaches simultaneously and see which works best.

Available Particle Picking Tools:

  1. Template Picking
  2. Dog Picking
  3. Manual Picking
  4. Repeat from other Session
  5. Random Conical Tilt Picking
    1. Align and Edit Tilt Pairs
    2. Auto Align Tilt Pairs

Appion SideBar Snapshot:

Notes, Comments, and Suggestions:



< Processing Cluster Login | CTF Estimation >



PDB to Model

The user is able to retrieve pdb models from the Protein Data Bank and generate a 3D density volume from the atomic model.

General Workflow:

To launch:

  1. The user enters the PDB ID (and checks the "use biological unit" box if desired)
  2. The user decides the resolution of the density model (typically filter to 15 - 25 Angstrom resolution)
  3. Enter a low pass filter radius in Angstroms
  4. The user selects the symmetry of the model
  5. The user selects the pixel size for the model
  6. The user selects the box size. Note: the user needs to make sure that the box size is sufficiently large to accommodate the 3D volume to be generated.

Output:

  1. Once the model is generated, it will be deposited under "3d Density Volumes"
  2. The user then assesses the quality of the model and decides whether to convert the volume into an initial model

Notes, Comments, and Suggestions:


< Import Tools | EMDB to Model >






Perform system check

In addition to the downloads from our svn repository, there are several other requirements that you will get either from your OS installation source, or from its respective website. The system check in the Leginon package checks your system to see if you already have these requirements.

cd myami/leginon/ 
python syscheck.py

If python is not installed, this, of course, will not run. If you see any lines like "*** Failed...", then you have something missing. Otherwise, everything should result in "OK".


< Download Appion/Leginon Files | Install Appion/Leginon Packages >





PHP notes

This one is a bit old, but it has lots of good stuff that goes beyond style. Some things are questionable: I prefer getters/setters over "attributes as objects" (at least as the example shows it) to allow for better error handling, and I prefer no underscores in naming except for constants in all caps... but that is only a style issue.

PHP Coding Standard

From the Zend framework folks:
http://framework.zend.com/manual/en/coding-standard.html

An intro:
http://godbit.com/article/introduction-to-php-coding-standards

Nice Presentation:
http://weierophinney.net/matthew/uploads/php_development_best_practices.pdf

PHP Unit testing
http://www.phpunit.de/pocket_guide/

For automatically checking code against the Pear standards use CodeSniffer:
http://pear.php.net/package/PHP_CodeSniffer/

Best Practices:
http://www.odi.ch/prog/design/php/guide.php


Potential job submission problems


< Testing job submission | Setup Remote Processing ^



Troubleshooting the Web Server:

Run the web server troubleshooter

A web server troubleshooting tool is available at http://YOUR_HOST/myamiweb/test/checkwebserver.php.
You can browse to this page from the Appion and Leginon Tools home page (http://YOUR_HOST/myamiweb) by clicking on [test Dataset] and then [Troubleshoot].

This page will automatically confirm that your configuration file and PHP installation and settings are correct and point you to the appropriate documentation to correct any issues.

Firewall settings

You may need to configure your firewall to allow incoming HTTP (port 80) and MySQL (port 3306) traffic:

$ system-config-securitylevel

Security-enhanced linux

Security-enhanced linux may be preventing your files from loading. To fix this run the following command:

$ sudo /usr/bin/chcon -R -t httpd_sys_content_t /var/www/html/

see this website for more details on SELinux


< Install phpMyAdmin



Processing Cluster Login

If you want to run processing jobs directly from the Appion Data Processing interface, you must log into the processing server with the steps below. You may also choose not to log in; in that case you can copy and paste processing commands directly into an SSH session.

General Workflow:

  1. When you open the Appion Data Processing interface, notice the Username and Password fields in the upper right corner of the screen.
  2. Enter your username and password. This is NOT the same username and password that you use to enter the Appion and Leginon Web Tools interface. You should use appropriate credentials to log into the processing server.

Notes, Comments, and Suggestions:



< Common Workflow | Particle Selection >




Processing Server Installation

Appion and Leginon shared steps:

  1. Install supporting packages
  2. Download Appion/Leginon Files
  3. Perform system check
  4. Install Appion/Leginon Packages
  5. Configure leginon.cfg
  6. Configure sinedon.cfg

Continue with the following steps unique to Appion:

  1. Configure .appion.cfg
  2. Install External Packages
  3. Install EMAN
  4. Install EMAN2
  5. Install SPIDER
  6. Install Xmipp
  7. Install UCSF Chimera
  8. Install Grigorieff lab software
  9. Compile FindEM
  10. Install Ace2
  11. Install Imod
  12. Install Protomo
  13. Test Appion

Developers Only - CentOS6 specific instructions for running the svn trunk:

< File Server Setup Considerations | Web Server Installation >




Process Images

Testing simple appion processing job submission

Upload template is a good example to try:
  1. Create a small mrc image as your template to be uploaded.
    You may instead download a sample GroEL template
     
  2. Log into the Appion processing page.
     
  3. Select Upload template in Import tools in the Appion processing menu of an existing session.
     
  4. Type in the needed information.
    If using the sample GroEL template, the Diameter is 180 and the pixel size is 3.26.
     
  5. When you finish filling in the form, you may choose between Just show command and Upload template.
    If you choose Just show command, copy and paste the command into a terminal.
    You may also choose your processing host and run the job by clicking "Upload Template".
     
  6. For a simple process such as this, the web page will take a little while to refresh until the job is completed.
     
  7. If you get the message "Template is uploaded", the process was successful; if you refresh the page you will find the template in the available list.

Testing PBS-required appion processing job submission

Particle selection such as DogPicker is a good example to try:
  1. Click on DoG Picking under Particle Selection menu
  2. Enter the required parameters.
    If using the sample GroEL template, all you need to enter is the particle diameter of 180.
  3. When you finish filling in the form, choose your processing host and run the job. Pages for monitoring the job become available after the job is queued and subsequently begins running. If the status appears as "Running" or "Queued" at first, the setup is likely correct.
  4. After a while, the process will be completed and the status becomes "Done" when you click to update the Status on the page.
  5. If you receive the message "ERROR in job submission", check the cluster; Torque may not be set up correctly.

Project DB

The Project DB tool allows users to:

  1. View Projects
  2. Create New Project
  3. Edit Project Description
  4. Edit Project Owners
  5. Create a Project Processing Database
  6. Unlink a Project Processing Database
  7. Upload Images to a new Project Session
  8. Share a Project Session with another User
  9. View a Summary of a Project Session
  10. Grid Management

< User Management | Image Viewers >



Purpose of Code Reviews

  1. It takes far less time to find bugs during a code review than to test and debug code.
  2. The fact that a colleague will be viewing your code is a huge motivation for writing good code and bug free code from the start.
  3. By walking another person through your code, you are no longer the only living human that understands what you were thinking! (Good for maintenance)
  4. If there is a problem with your code, you share responsibility for the bug with the person who performed the review. (It's not ALL your fault.)
  5. It is extremely difficult and prohibitively time consuming to test every decision path of a function. A code review may be the only quality assurance available.
  6. Code reviews provide a valuable opportunity for sharing knowledge and exchanging coding techniques.

Python Coding Standards

This document is a list of python coding standards. To add a new standard copy the template below and modify it.

see also http://ami.scripps.edu/wiki/index.php/Appionscripts_formatting_rules


Name of Coding Standard

Definition

What is the coding standard

Justification

Why is the coding standard important

Example

GOOD:

this is a good example code

BAD:

this is a bad example code


Python Coding Standards for AMI


Use tabs instead of spaces

Definition

Use tabs instead of spaces for inline code

Justification

It is important to be consistent. People like different sizes of columns, some like 8 spaces, others 4, 3, or 2. With tabs each individual can customize their viewer.

Example

GOOD:

if True:
<tab>while True:
<tab><tab>print "tab" 
<tab><tab>break

BAD:

if True:
   while True:
        print "tab" 
        break


Checking beginning or ending of strings

Definition

Use ''.startswith() and ''.endswith() instead of string slicing to check for prefixes or suffixes.

Justification

startswith() and endswith() are cleaner and less error prone.

Example

GOOD:

if foo.startswith('bar'): 

BAD:

if foo[:3] == 'bar':
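Note that startswith() and endswith() also accept a tuple of alternatives (since Python 2.5), which keeps multi-suffix checks equally clean. A small sketch (the helper name and extensions are illustrative):

```python
def is_raw_image(filename):
    # one call replaces a chain of "or" comparisons
    return filename.endswith(('.mrc', '.dm3', '.tif'))
```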


Never use from module import *

Definition

Never use from module import *, use import module instead

Justification

It is hard to track where functions come from when import * is used

Example

GOOD:

import numpy
a = numpy.ones((3,3))

BAD:

from numpy import *
a = ones((3,3))


Appion image data naming conventions

Definition

  1. never use 'image' or 'img'
  2. use 'imgdict' for image dictionaries
  3. use 'imgarray' for image numarrays
  4. use 'imgname' for image filenames
  5. use 'imgtree' for the main list of image dictionaries
  6. use 'imglist' for a list of image data

Justification

If you are consistent with your names, people can read your code.

Example

GOOD:

for imgdict in imgtree:
    imgarray = imgdict['image']
    imgname = imgdict['filename']

BAD:

for image in imgs:
    array = image['image']
    name = image['filename']


Use descriptive variables

Definition

Use descriptive variable names; asdf is not a descriptive name.

Justification

Shorthand variable names are hard for anyone else to understand.

Example

GOOD:

imgarray = mrc.read('leginon_image.mrc')
particle1 = imgarray[47, 21]
particle2 = imgarray[10, 15]
stack = [particle1, particle2]

BAD:

i = mrc.read('x.mrc')
prtl1 = i[47, 21]
prtl2 = i[10, 15]
s = [prtl1, prtl2]


Location of functions

Definition

Functions that have a global use should go in the appionlib folder.
Functions that will only be used by a single program go into that program's file.
Database-upload functions are typically used by only a single program and should live within that program's file, not in appionlib.

Justification

Keeps the code clean and organized.

Example

GOOD:

from appionlib import commonFunctions

class AppionScript():
	def customUploadToDB(self):
		"""stuff""" 
	def run(self):
		commonFunctions.commonFunction()
		self.customUploadToDB()

BAD:

from appionlib import apUploadCustom

class AppionScript():
	def commonFunction(self):
		"""stuff""" 
	def run(self):
		self.commonFunction()
		apUploadCustom.customUploadToDB()


Python notes

Debugging


Quality Assessment

Purpose: Describe tools used for routine quality assessment in Leginon and Appion.

General Workflow:

  1. Assess the quality of automated data collection: In Leginon, the results of any operation can be tested within the interactive GUI prior to target queuing, as, for example, when assessing the hole-finder algorithm. This ensures that optimal criteria are set prior to data collection. Leginon displays output to the user when it encounters errors during data collection, as in the case when an autofocus procedure fails or when the specimen drifts for an extended period of time. At all times, the functional state of the software and all relevant data collection parameters are monitored and tracked using the database to ensure reproducibility in future sessions.
  2. Assess the quality of the collected data: Each Leginon session keeps track of statistics during data collection, which are used to assess the quality of the images. For each experiment and for each collected image, Leginon monitors: (a) all instrument parameters associated with the microscope, (b) the duration of collection, (c) camera readout statistics and electron dose, (d) the magnitude of specimen drift over time, (e) the ice thickness for each hole, (f) image and beam shift values, (g) the accuracy of the autofocus procedure, and several additional parameters. This type of record keeping allows us to track the state of the instruments and analyze the quality of the collected data. Leginon and Appion Database Tools.
  3. Eliminate undesired micrographs or regions: Images can be manually selected to be “hidden” from view using buttons on the Imageviewer. Note that images can also be marked as “Exemplars”. In Appion, “Junk Finding” and “Manual Masking” options exclude undesired areas of the micrograph for particle selection. The “Multi-Image Assessment” tool allows the user to reject entire micrographs, usually after a particle selection run, while the “Image Rejector” function automatically rejects images that either do not have CTF parameters, particle picks, or associated tilt pairs, or are outside a specified fitness factor or defocus range after CTF estimation. Images that have been “rejected” or “hidden” can be optionally excluded from processing runs. Processing runs can also be set to use only “exemplar” images, often useful for testing. Rejected or hidden images, or exemplars, can still be viewed in the Imageviewer by selecting them from the pulldown menu options. Hidden images can be restored while viewing them by selecting the hidden button again. Images are never permanently removed from the database. Appion Image Viewers.
  4. Assess the quality of processing algorithms: Appion’s imageviewer allows the user to assess some of the initial steps of image processing. The “ACE” button provides graphical displays of CTF estimation as well as fitting parameters and fitness values. The user can visualize images of the edges detected by the algorithm and compare them to the Thon rings present in the PSD of the image. The “P” button displays all particle picks from the current micrograph for the particle selection run specified by the user. Histograms of confidence scores and correlation values are displayed on the Appion summary pages for CTF estimation and particle picking. Appion Image Viewer Tools.
  5. Assess the quality of particle stacks: Stacks can be directly examined as a montage of particles. Summary pages provide graphical summaries of the intensity and standard deviation of the particle images and these outputs can be useful for cleaning up the stack by rejecting particles on the edge of a hole or over the carbon. The “Xmipp_sort_by_statistics” algorithm can also be used to identify junk particles. During subsequent image processing steps, particles may be rejected based on poor scores within 2D alignment and classification steps, high Euler jumper statistics, or poor correlations for class assignment during 3D refinement. Link to Appion Particle Stack Tools.
  6. Assess the accuracy of 2D alignment and classification: Appion report pages display Eigenimages, variance averages, and/or correlation histograms for each alignment and classification run; some outputs are algorithm dependent. All translations, rotations, and mirror reflections applied to the images, as well as the coordinates within the original micrograph, are stored in the database. Appion 2D Alignment and Classification Tools.
  7. Assess veracity of an electron density map: Appion displays a variety of output related to the reconstruction refinement as an aid in determining the consistency and veracity of the reconstructed map. Refinement runs automatically generate summary pages providing output at each iteration that includes: (a) Resolution derived from the Fourier Shell Correlation and Rmeasure criteria, (b) Distribution of Euler angles, (c) Euler angle differences between the current and previous iteration as well as the average median Euler difference for all iterations, (d) Side-by-side comparison of input projections with associated reprojections of the map, (e) Snapshots of the reconstructed model. Appion 3D Reconstruction Tools.
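The rejection criteria that the "Image Rejector" applies in step 3 can be sketched as follows. This is an illustration only, not Appion's actual implementation: the dictionary keys and thresholds are invented for the example.

```python
def should_reject(imgdict, min_confidence=0.8, defocus_range=(0.5, 3.0)):
	"""Illustrative only: reject an image that lacks CTF parameters,
	particle picks, or a tilt pair, or whose CTF fit falls outside the
	requested confidence / defocus (microns) window."""
	if imgdict.get('ctf') is None:
		return True
	if not imgdict.get('picks'):
		return True
	if imgdict.get('tiltpair') is None:
		return True
	defocus, confidence = imgdict['ctf']
	if confidence < min_confidence:
		return True
	low, high = defocus_range
	return not (low <= defocus <= high)
```

In the real pipeline these criteria are applied per processing run, so a "rejected" image is excluded from processing but never deleted from the database.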

RCT Reconstruction Example of Quality Assessment Pages:

A. RCT reconstructions Summary Page lists all RCT reconstructions completed for the particular dataset

B. Clicking on a particular reconstruction opens a summary page displaying all relevant information for processing steps including and leading up to the reconstruction.

C. A link is provided to a plot summarizing the quality of the 2D alignment preceding the RCT reconstruction.

D. A link to the 2D alignment output opens a page that allows further processing, including selecting a particular stack (green), and viewing its corresponding raw particles (purple). Alternatively, another RCT reconstruction can be calculated, raw particles viewed, or an alternate method utilized to calculate another 3D reconstruction.

E. The raw particles for any stack can be viewed in the web browser with further processing options such as creating a substack by clicking on particular images to include or exclude (red) and then selecting "create substack" (blue).

Notes, Comments, and Suggestions:



< Step by Step Guide to Appion | Processing Cluster Login >



Random-Conical Tilt (RCT) Reconstruction workflow

Purpose: Basic workflow for processing an RCT data set.

General Workflow:

  1. Pick particles: Pick particles on the tilted and untilted micrographs. This can be done in separate jobs (e.g. with multiple templates), or in one job. If your particles are small, one strategy might be to pick more particles than are clearly evident. The reason for this is that particle picks will get "filtered" (or thrown out) during the tilt-pair alignment if the tilt pairs don't match up, as well as during classification.
  2. Align tilted and untilted particles: The first step might be to try the Auto Align Tilt Pairs. If you have good overlap of your imaged area in your tilted and untilted micrographs, as well as picked particles on both tilt-pairs, then the auto-tilt alignment should work without problems. If, for example, you have little overlap or few particle picks, then the auto-tilt alignment might fail, or, worse, align the wrong tilted particle to its untilted counterpart. It is therefore very important to check the result of this step in the imageviewer. If the auto-tilt aligner fails, then the particles must be aligned manually using the manual Align and Edit Tilt Pairs function before proceeding to the next step.
  3. Create tilted and untilted stacks: Create (1) an untilted stack from the untilted particle picks and (2) a tilted stack from the tilted particle picks, referring to the picked set made using the alignment step above. The two stacks should have roughly the same number of particles. Note that they rarely have an identical number of particles, because some particles get rejected if they lie too close to the edge. This will not make a difference in the later stages of processing. The important thing is to have a tilted and untilted particle stack from the tilt-pair alignment step above. Once the tilted stack exists, you can leave it alone. You can manipulate the untilted particles as you wish (e.g. align, classify, remove particles, etc.). Since each untilted particle has a database record that corresponds to its tilt-pair, as long as the information stays within Appion, you can proceed without the need for any extra bookkeeping.
  4. Align (and classify) untilted particles: The untilted particle stack must be aligned. You can use any Particle Alignment available (however, note that in Xmipp 2.4, CL2D does not save alignment information into a docfile, and therefore CL2D cannot be directly used for RCT reconstruction). If the alignment procedure also contains a classification / clustering within it, then the resulting class averages can be directly used for RCT reconstruction. Otherwise, you must classify / cluster the aligned stack, e.g. using Correspondence analysis, followed by hierarchical ascendant classification, or, e.g., Xmipp Kerden Self-Organizing Map.
  5. Reconstruct each class: Each class of particles produced by the alignment and classification step can be used to create an RCT volume. If you display the resulting classes in the viewer, you can click on any number of them, then click on 'create RCT volume', which will take you to the RCT Volume launch page. Once there, fill in the required information to create a map. If you click on more than one class, then the resulting volume will be a combination of the particles in those classes. However, if the two are NOT aligned in-plane with respect to each other, then they should not be combined, because the Euler angles will be averaged out.
  6. Assess the accuracy of the classification: There are several ways to do this. First, one should make sure that the aligned particles actually belong to the class from which the volume was reconstructed. The majority of the particles should actually look like the class average. Otherwise, there is too much heterogeneity in the class, and the classification and RCT reconstruction should be redone. In general, it is a good idea to create as many class averages as possible to accurately capture all the different views in the data without sacrificing signal-to-noise ratio. This is not a straightforward task, but can be approximated fairly well. As a general rule of thumb, ~100 particles per class is sufficient to provide a class average with good signal. ~500 particles per class should provide enough particles for a 40Å RCT map.
  7. Assess veracity of the RCT map: RCT reconstruction is a very powerful strategy for reconstructing electron density maps, particularly of heterogeneous systems, but it also has its limitations. First, the resulting volumes contain a missing cone of information (Radermacher, M., T. Wagenknecht, et al. (1986). "A new 3-D reconstruction scheme applied to the 50S ribosomal subunit of E. coli." Journal of Microscopy 141: RP1-RP2 and Radermacher, M., T. Wagenknecht, et al. (1987). "Three-dimensional reconstruction from a single-exposure, random conical tilt series applied to the 50S ribosomal subunit of Escherichia coli." J Microsc 146(Pt 2): 113-136) and second, the map may be severely flattened (Cheng, Y. F., E. Wolf, et al. (2006). "Single particle reconstructions of the transferrin-transferrin receptor complex obtained with different specimen preparation techniques." Journal of Molecular Biology 355(5): 1048-1065). To minimize the effects of the missing cone and particle flattening, you can average different views of the same particle. The word "same" must be emphasized, because if your particles are either conformationally or compositionally heterogeneous, then you additionally have to determine whether or not distinct class averages arise from particle heterogeneity OR different views of an otherwise identical object. If they represent different views of an otherwise identical object, then, if you align and average in 3D the different resulting RCT reconstructions of distinct views, you should in theory come up with an improved result. If they represent heterogeneous specimens, then aligning the resulting volumes in 3D will only make things worse, because any information that can be potentially gained from distinguishing heterogeneity will be lost to averaging. Discriminating between these two possibilities is not trivial, but can be done with careful analysis of your data.
In the case of a single preferred orientation, the resulting map will be flattened, AND it will contain a missing cone of information. Although this cannot be avoided, this scenario may or may not affect the interpretation of the map, depending on what is desired.
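The ~100-particles-per-class rule of thumb from step 6 can be turned into a quick back-of-the-envelope estimate. This is a hypothetical helper, not an Appion function:

```python
def suggested_class_count(num_particles, particles_per_class=100):
	"""Rough estimate of how many classes keep ~100 particles each
	(the rule of thumb quoted in the workflow above)."""
	return max(1, num_particles // particles_per_class)
```

For example, a 2500-particle untilted stack would suggest about 25 classes; raise particles_per_class toward 500 when each class must support a 40Å RCT map on its own.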

< Step by Step Guide to Appion | Processing Cluster Login >



Random Conical Tilt Picking

If you ran a random conical tilt session, you must pick the particles on the tilt pairs and then correlate them (align particle pairs). To find the particles you can use Template Picking, Manual Picking or Dog Picking.

To align particle pairs:

  1. Align and Edit Tilt Pairs: manual alignment and editing of the tilt pairs.
  2. Auto Align Tilt Pairs: automated alignment and editing of the tilt pairs.

Notes, Comments, and Suggestions:

  1. We use the auto align option all the time, and it works well. Generally, after running auto align, we check the aligned picks using the Multi Img Assessment tool in Appion.



< Particle Selection | CTF Estimation >



RCT

The RCT viewer is used for viewing random conical tilt (RCT) or orthogonal tilt reconstruction (OTR) pairs of images.

Notes, Comments, and Suggestions:

  1. For examining particle selection results:

< Dual Viewer



RCT Volume

This method relies on physical tilting of the specimen in the microscope to obtain 2D projection views for samples with preferred orientation. Images are taken at 0 and 45-60 degrees. Alignment and classification of the 0 degree data determines orientation parameters to be applied to the tilted data. This method was originally described by Radermacher, M. et al., Journal of Microscopy 141: RP1-2 (1986).

General Workflow:

Note: RCT Volume can be accessed directly from the Appion sidebar, or by clicking on the "Create RCT Volume" button displayed above class averages generated through 2D Alignment and Classification

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify their own names and directories.
  2. Enter a description of your run into the description box.
  3. Select the aligned stack to use for tilted particle parameters from the drop down menu. Note that stacks can be identified in this menu by clustering or feature analysis run ID. Also note that there will be no drop down menu for this option if the RCT Volume page was accessed directly from the clustering run output webpage, because these parameters are then already known.
  4. Enter the class numbers to use for volume generation. To decide which classes to use, go look at your alignment/feature analysis/clustering output.
  5. Select the TILTED particle stack to use for the reconstruction.
  6. Specify a mask radius that is barely bigger than the radius of your molecule.
  7. Click on "Rct Volume" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  8. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run RCT Volume" submenu in the appion sidebar.
  9. Once the job is finished, an additional link entitled "1 complete" will appear under the "Run RCT Volume" tab in the appion sidebar. Clicking on this link opens a summary of all RCT volume runs that have been done on this project.
  10. Click on the RCT job run name to open a new window containing all relevant alignment and classification information.
  11. Click on the FSC graph to enlarge in a new window.
  12. Click on the appropriate links to access the raw, template, and aligned stacks of particles that went into this reconstruction.
  13. Click on individual snapshots to enlarge in a new window.

Notes, Comments, and Suggestions:

  1. Specifying multiple classes will result in them being combined into a single RCT volume. Capability to launch multiple, separate RCT reconstructions at once coming soon!
  2. The default parameters for median filter, low pass, and high pass can be changed. The "Number of particles" option uses only a subset of the data. The chimera settings only dictate the "look" of your output page; they do not alter the final outcome.

< Ab Initio Reconstruction | Refine Reconstruction >



Reconfigure the database

  1. Drop the databases if they exist
     
    mysql -h localhost -u root
    
    show databases;
    

    If you see:
    projectdb
    leginondb

    drop database projectdb;
    drop database leginondb;
    
  2. Create the databases
    create database projectdb;
    create database leginondb;
    
  3. Download config.php and put it in /var/www/html/myamiweb
     
  4. Change the permission of the config file
    chmod 777 /var/www/html/myamiweb/config.php
    
  5. Point your web browser to http://localhost/myamiweb/setup
    1. The database username is root
    2. There is no database password. Leave this field empty.
    3. Click Next on each page without making any changes.
    4. Eventually, you will see a button to Create Web Tools Config. Click on this button, and ignore the Apache User error.
    5. Click on the DB Initialization button.
    6. Click Next.
    7. Click Next to insert default values.
    8. You will need to enter information for the Administrator Appion user. You may enter any password and your email address.
    9. When this completes, you may start using Appion, or may continue with the following steps to insert a sample project into the database which can be used for processing with Appion.
       
  6. Delete the sample images folder
    cd /myamiImages/leginon
    rm -rf sample
    
  7. Point your browser to the following URL, but first replace your_root_password with the root password of your CentOS installation.
    http://localhost/myamiweb/setup/autoInstallSetup.php?password=your_root_password
    
  8. When this script completes, you will be redirected to the Appion and Leginon home page.

Reconstruction Parameters

Name a.k.a. Description Used By Make common to all? Notes
General Refinement Parameters
Outer Mask Radius Particle outer mask radius (RO), Outer Radius, mask radius from center of particle to outer edge (frealign in angstroms, spider in pixels), radius of external map in pixels frealign, spider, xmipp, eman, imagic y
Inner Mask Radius Inner Radius,Particle inner mask radius inner radius for alignment in pixels xmipp, frealign, spider y not xmipp
Outer Alignment Radius xmipp, eman, imagic y not eman, frealign
Inner Alignment Radius xmipp, eman, imagic y not eman, frealign
Symmetry Group sym ex. c1, c2... xmipp,eman,frealign,spider,imagic y
Number of iterations iteration number xmipp y
Angular Sampling Rate ang angular step for projections in degrees xmipp,EMAN y not frealign
Percentage of worst images to discard xmipp y not frealign
Filter reconstructed volume to estimated resolution flt3d y not frealign
Filter reconstructed volume to resolution computed by fsc Low Pass Filter the reference?, Constant to add to the estimated resolution y not frealign, eman
stack preparation parameters
Last particle to use frealign y specific to stack preparation, not refinement alg
lp filtering low pass filter in angstroms spider, imagic y specific to stack preparation, not refinement alg
hp filtering high pass filter in angstroms spider, imagic y specific to stack preparation, not refinement alg
Algorithm Specific Refinement Parameters
imask radius of internal mask (in pixels for eman and spider, Angstroms for frealign) EMAN
amask amask=[r],[threshold],[iter] eman
Mask Filename xmipp
Max angular change xmipp
max change offset xmipp
Search range for 5d transitional search xmipp
Reconstruction Method xmipp
Values of lambda for ART xmipp
Initial max frequency used by reconstruct fourier xmipp
Compute resolution? xmipp don't need this, should always be set to yes
maxshift max translation during image alignment in pixels eman
hard hard limit for make3d program eman
clskeep =[std dev multiplier] how many raw particles discarded for each class average eman
clsiter iterative alignment to each other eman
xfiles =[mass in kD] eman
shrink scale down by a factor of [n] before classification eman
euler2 =[oversample factor] eman
median use median value instead of average for each pixel eman
phscls use signal to noise ratio weighted phase residual eman
refine do subpixel alignment eman
tree decimate reference population eman
coran use coran algorithm eman
eotest use even odd test eman remove this, should always be yes, takes place of Compute Resolution?
amplitude contrast (WGH) frealign
Standard deviation filtering frealign
Phase B-factor weighting constant (PBC) frealign
B-factor offset (BOFF) frealign
Number of randomized search trials (ITMAX) frealign
Number of potential matches to refine (IPMAX) frealign
Target phase residual (TARGET) frealign
Worst phase residual for inclusion (THRESH) frealign
Resolution limit of reconstruction (RREC) (in Ångstroms; default Nyquist) frealign
Lower resolution limit or high-pass filter (RMAX1) (in Ångstroms) frealign ? is this the same as stack prep lp/hp filter
Higher resolution limit or low-pass filter (RMAX2) (in Ångstroms; default 2*Nyquist) frealign ? is this the same as stack prep lp/hp filter
B-factor correction (RBFACT) (0 = off) frealign
Only use CTFFIND values frealign
MSA - num class averages to produce from raw images imagic
MSA - num factors to use for classification imagic
MSA - percentage of worst class members to ignore after classification imagic
threed reconstruction object size, low-pass filter 3d volume imagic
Center stack prior to alignment imagic
Mirror references for alignment imagic
MRA - min radius for rotational alignment in pixels imagic
MRA - max radius for rotational alignment in pixels imagic
MSA - fraction of final class averages to keep imagic
MRA - angular increment of forward projections imagic
MRA - max radial shift compared to original images imagic
MRA - max radial shift during this iteration imagic
MSA - percentage of images to ignore when calculating eigenimages imagic
Angular reconstitution angular increment of forward projections, ang inc of euler search, fraction of best ordered images to keep imagic
firstring similar to alignment radius Any pixels this far away from center will not be used spider
lastring similar to alignment radius Only pixels this far away will be used spider
xysearch translational search during projection matching will be limited to this many pixels from the center of the image spider
angular increments Angular Sampling Rate list of angular increments for projection mapping spider
keep determines which particles are kept for back-projection, -1 is one standard deviation worse than mean spider
xyshift particles only allowed to shift this far from center spider
Approximate mass in Kd imagic

Refinement Package Status

Refinement Package Status
package | web launch | multi-node garibaldi | follow progress | DB upload
EMAN | Yes | Yes | Yes | Yes | Yes
FREALIGN | Yes | Yes | Yes | Yes | Almost
Xmipp | Broken | No | No | No | Yes
SPIDER | Broken% | No | No | No | No
IMAGIC | Broken* | ?? | No | ?? | Almost#
* data00 is hard-coded in, did not launch
# does not use refinement tables
% single iteration only

Other packages: EMAN2, SPARX, ...
Multi-model refinement: EMAN (Pick-wei)


Refinement Post-processing Procedures

For each refinement, the Reconstruction summary page automatically displays, for each iteration, information such as the FSC curve, Euler angle distributions, good and bad classes, and 3D snapshots of the model, and it allows for the following post-processing procedures.

Remove Jumpers

Remove Jumpers will remove particles with ambiguous orientation.

  1. Select the Remove Jumpers button found under the density column of your refinement iteration information.
  2. Change the value of the Average Jump field to the average median Euler jump. If this is left as 0, no particles will be removed.
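The idea behind Remove Jumpers can be sketched as follows. The data layout and function are invented for illustration; Appion's implementation differs, but the filtering logic is the same:

```python
def remove_jumpers(median_jumps, average_jump_threshold):
	"""median_jumps maps particle id -> median Euler jump (degrees).
	Returns the set of particle ids to KEEP. A threshold of 0 removes
	nothing, mirroring the note above about leaving the field at 0."""
	if average_jump_threshold == 0:
		return set(median_jumps)
	return set(p for p, jump in median_jumps.items()
	           if jump <= average_jump_threshold)
```

Setting the threshold to the average median Euler jump, as step 2 suggests, keeps the stable particles and drops the ones whose orientation assignment bounces between iterations.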


< Refine Reconstruction | Quality Assessment >



Refine Reconstruction

Most initial models establish nothing more than a preliminary sense of the overall shape of the biological specimen. In order to reveal structural information that can answer specific biological questions, the model requires refinement. In single particle analysis, a refinement is an iterative procedure that sequentially aligns the raw particles, assigns them appropriate spatial orientations (Euler angles) by comparing them against a model, and then back-projects them into 3D space to form a new model. Effectively, a full refinement takes as input a raw particle stack and an initial model, and is usually carried out until no further improvement of the structure can be observed, often measured by convergence to some resolution criterion.
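The iterative align/assign/back-project cycle described above can be illustrated with a deliberately tiny toy, where "rotation" is just a cyclic shift of a 1-D signal. This is a schematic sketch of the control flow only, not how EMAN, FREALIGN, or the other packages are implemented:

```python
def rotate(vec, angle):
	"""Toy stand-in for a 3-D rotation: cyclic shift of a 1-D signal."""
	n = len(vec)
	s = angle % n
	return vec[n - s:] + vec[:n - s]

def score(a, b):
	"""Cross-correlation score between two 1-D signals."""
	return sum(x * y for x, y in zip(a, b))

def refine(stack, model, num_iters=3, angles=range(8)):
	"""Each iteration (1) assigns every raw particle the orientation whose
	'projection' of the model matches it best, then (2) back-projects by
	averaging the particles with their rotations undone."""
	for _ in range(num_iters):
		assigned = [(p, max(angles, key=lambda a: score(rotate(model, a), p)))
		            for p in stack]
		unrotated = [rotate(p, -a) for p, a in assigned]
		model = [sum(col) / float(len(col)) for col in zip(*unrotated)]
	return model
```

In real packages the assignment step compares each particle against many projections of the 3-D model and the back-projection fills 3-D Fourier space, but the loop structure, and the convergence criterion that ends it, are the same.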

Refinement Options Available in Appion:

  1. EMAN Refinement
  2. Frealign Refinement
  3. SPIDER Refinement
  4. IMAGIC Refinement
  5. Refinement Post-processing Procedures

Appion Sidebar Snapshot:

Notes, Comments, and Suggestions:

Icosahedral conventions

When dealing with icosahedral particles such as viral capsids, particular attention should be given to properly orienting the model (the icosahedral axes) to match the specific conventions adopted by different software packages.

To convert a model from the Crowther orientation to the EMAN one, the following proc3d command is required:

proc3d model_Crowther.mrc model_EMAN.mrc icos2fTo5f

Please note that a box can be checked when importing a model within Appion to allow converting from the Viper to the EMAN orientation.

To visually assess the orientation of a model, the simplest way is to use UCSF Chimera. After opening the map, press the Orient button in the Volume viewer to orient the z axis perpendicularly to the screen. Then open a file (named axis.bild for example) containing the following lines:

.color red
.cylinder  500.0  0.0  0.0  -500.0  0.0  0.0 1
.color green
.cylinder  0.0  500.0  0.0  0.0  -500.0  0.0 1
.color blue
.cylinder  0.0  0.0  500.0  0.0  0.0  -500.0 1

This file will display the Cartesian axes x (red), y (green) and z (blue).

Below are the conventions used by the different software packages present within Appion to generate and refine reconstructions (most allow the use of different conventions).


Reconstruction Parameters


< Ab Initio Reconstruction|Quality Assessment >



Refinement Web User Interface documentation

Class Diagram

The following class diagram shows the BasicForm class with its extended classes, as well as the FormParameter class and its extended classes.
It also shows associations among the classes.
Notice that the specific refine parameter classes use polymorphism to override the validate() function. This allows the extended classes to provide more complex validations than a typical form requires.
Other forms, such as RunParameters and stack prep, just use the base FormParameters class that their parent, BasicForm, uses.

refine class diagram

Sequence Diagram

The following sequence diagram shows how the Form and Parameter classes work together to display a form, validate the user input, and create a command string.

GUI sequence diagram

Steps to add a new refinement method to the pipeline

  1. Create a new class which extends BasicRefineForm
  2. Create a new class which extends RefineFormParameters. This is optional, but recommended.
  3. Decide what information you need to collect from the user and add a form parameter for each item.
    1. example: adding parameters to a new parameter class
      1. note that the parent constructor is called first, passing any default values along. The RefineFormParameters base class defines parameters that are common to most refinement methods. You only need to add parameters that are not already defined in the base class.
      2. use the addParam(name, value, label) method to add parameters specific to your refine method to your form.
      3. if you find that the base class includes a parameter that your method does not need, you can remove the parameter from the form with the hideParam(name) method.
        class XmippParams extends RefineFormParameters
        {
            function __construct( $id='', $label='', $outerMaskRadius='', $innerMaskRadius='', $outerAlignRadius='', 
                                    $innerAlignRadius='', $symmetry='', $numIters='', $angSampRate='', $percentDiscard='',  
                                    $filterEstimated='', $filterResolution='', $filterComputed='', $filterConstant='',
                                    $mask='', $maxAngularChange='', $maxChangeOffset='', $search5DShift='', $search5DStep='',
                                    $reconMethod='', $ARTLambda='', $doComputeResolution='', $fourierMaxFrequencyOfInterest='' ) 
            {
                parent::__construct($id, $label, $outerMaskRadius, $innerMaskRadius, $outerAlignRadius, 
                                    $innerAlignRadius, $symmetry, $numIters, $angSampRate, $percentDiscard,  
                                    $filterEstimated, $filterResolution, $filterComputed, $filterConstant );
        
                $this->addParam( "mask", $mask, "Mask filename" );
                $this->addParam( "maxAngularChange", $maxAngularChange, "Max. Angular change " );        
                $this->addParam( "maxChangeOffset", $maxChangeOffset, "Maximum change offset " );
                $this->addParam( "search5DShift", $search5DShift, "Search range for 5D translational search " );
                $this->addParam( "search5DStep", $search5DStep, "Step size for 5D translational search " );
                $this->addParam( "reconMethod", $reconMethod, "Reconstruction method " );
                $this->addParam( "ARTLambda", $ARTLambda, "Values of lambda for ART " );
                $this->addParam( "doComputeResolution", $doComputeResolution, "Compute resolution? " );
                $this->addParam( "fourierMaxFrequencyOfInterest", $fourierMaxFrequencyOfInterest, "Initial maximum frequency used by reconstruct fourier " );
        
                // disable any general params that do not apply to this method
                $this->hideParam("innerMaskRadius");        
            }
        
            function validate() 
            {
                $msg = parent::validate();
        
                if ( !empty($this->params["mask"]["value"]) && !empty($this->params["outerMaskRadius"]["value"]) )
                    $msg .= "<b>Error:</b> You may not define both the outer mask radius and a mask file.";
        
                return $msg;
            }
        }
        
    2. example: adding parameters to the form constructor method
  4. define any restrictions that should be applied to the parameters.
  5. Override the advancedParamForm() function to add a user input field for each one.
  6. Override the buildCommand() function. There is a default implementation available which adds all form parameters as --<name>=<value>.
  7. create your new form type in selectPreparedRecon.php
    // based on the type of refinement the user has selected,
    // create the proper form type here. If a new type is added to
    // Appion, its form class should be included in this file
    // and it should be added to this function. No other modifications
    // to this file should be necessary.
    function createSelectedRefineForm( $method, $stacks='', $models='' )
    {
        switch ( $method ) {
            case 'emanrecon':
                $selectedRefineForm = new EmanRefineForm( $method, $stacks, $models );
                break;
            case 'frealignrecon':
                $selectedRefineForm = new FrealignRefineForm( $method, $stacks, $models );
                break;
            case 'xmipprecon':
                $selectedRefineForm = new XmippRefineForm( $method, $stacks, $models );
                break;
            case 'xmippml3drecon':
                $selectedRefineForm = new XmippML3DRefineForm( $method, $stacks, $models );
                break;
            default:
                throw new Exception("Error: Not Implemented - There is no RefineForm class available for method: $method");
        }        
    
        return $selectedRefineForm;
    }
    
  8. Add an entry to selectRefinementType.php for the new method.
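
The default buildCommand() behavior described in step 6 can be sketched as follows. This is shown in Python purely for illustration; the actual Appion implementation is in PHP, and the executable and parameter names here are hypothetical:

```python
def build_command(executable, params):
    """Assemble a command line by appending each set form parameter
    as --name=value, mirroring the default buildCommand() behavior
    described above. Names here are hypothetical examples."""
    parts = [executable]
    for name, value in params.items():
        if value is None or value == "":
            continue  # skip parameters the user left unset
        parts.append("--%s=%s" % (name, value))
    return " ".join(parts)

# hypothetical example invocation
cmd = build_command("runXmippRefine.py", {"outerMaskRadius": 64, "numIters": 10})
```

A method-specific form class would only need to override this when a parameter requires special formatting.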

Region Mask Creation

A region mask can be created on images, either manually or automatically. Automatically generated masks can also be assessed manually. During stack creation, particle selections that fall within the assessed masks will not be considered for creating the stack.

Available Method:

  1. Manual Masking

Notes, Comments, and Suggestions:



< Appion Processing|Manual Masking >



Request Software Registration Key

To request a software registration key, you must first register as an Appion/Leginon user.
If you have not already created an account on this website, please do so now.


Repeat from other session

Here you can find a list of all jobs previously run on this project. If you want to rerun a job from another session with identical settings on your current session (for example, to rerun a particle picker), click on the specific job.

Notes, Comments, and Suggestions:


< Particle Selection


Retrieve Forgotten Password

To reset a lost password:
  1. Press the [Lost Password] button on the Login Screen.
  2. Enter the username that you registered with
  3. Press the Send Password button
     
    An email will be sent to you with a new temporary password.
     
  4. After logging in to the system, change your password by editing your user profile

Lost Password Screen
Lost Password Screen


< New User Registration | Modify Your Profile >



Run Alignment

In order to extract quantitative information out of the inherently low SNR data obtained by EM, 2D averaging must be applied to homogeneous subsets of single particles. This requires the single particles to be brought into alignment with one another, so that the signal of common motifs is amplified. Alignment protocols typically operate by shifting, rotating, and mirroring each particle in the data set in order to find the orientation of particle A that maximizes a similarity function with particle B. Depending upon the existence of templates obtained from a priori information about the specimen, particle alignment algorithms are separated into reference-free and reference-based approaches.
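
As a toy illustration of the idea (restricted to 90-degree rotations and mirroring; real alignment protocols also search translations and finer rotation angles), assuming NumPy is available:

```python
import numpy as np

def best_rotation(particle, reference):
    """Try 90-degree rotations and mirrors of `particle` and keep the
    orientation that maximizes a simple similarity score against
    `reference`. A minimal sketch of the search, not a real aligner."""
    best, best_score = particle, -np.inf
    for mirror in (False, True):
        img = np.fliplr(particle) if mirror else particle
        for k in range(4):
            cand = np.rot90(img, k)
            score = float(np.sum(cand * reference))  # dot-product similarity
            if score > best_score:
                best, best_score = cand, score
    return best

# tiny demo: a bar rotated by 90 degrees is recovered
ref = np.zeros((4, 4))
ref[0, :] = 1.0              # horizontal bar along the top row
part = np.rot90(ref, 1)      # the same bar, rotated
aligned = best_rotation(part, ref)
```

Real implementations replace the exhaustive 90-degree search with polar resampling or FFT-based cross-correlation over all angles and shifts.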

Reference-Free Alignment

  1. Xmipp Maximum Likelihood Alignment
  2. Spider Reference-free Alignment

Reference-Based Alignment

  1. Spider Reference-based Alignment
  2. IMAGIC Multi Reference Alignment
  3. Ed's Iteration Alignment
  4. Xmipp Reference-based Maximum Likelihood Alignment

Notes, Comments, and Suggestions:


<Particle Alignment | Run Feature Analysis >



Run Database Update Script

Running the following script will indicate if you need to run any database update scripts.

cd /your_download_area/myami/dbschema
python schema_update.py

This will print out a list of commands to paste into a shell which will run database update scripts.
You can re-run schema_update.py at any time to update the list of which scripts still need to be run.


Run Feature Analysis

Feature analysis refers to systematic techniques for extracting features from a series of aligned particles with the intent of clustering images with similar features together. Feature analysis is closely related to multivariate statistics. All of these feature analysis techniques fall into two categories: principal component analysis (PCA) (Spider Coran and IMAGIC MSA) and neural networks (Xmipp KerDen SOM).
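
The PCA family of methods (Spider Coran, IMAGIC MSA) can be sketched generically in Python; this is an illustration of the underlying math, not the actual Spider or IMAGIC implementation:

```python
import numpy as np

def pca_features(images, n_components=2):
    """Flatten each aligned particle image into a vector, subtract the
    mean image, and project onto the top principal components found by
    SVD. Returns each particle's coordinates in feature space."""
    X = np.array([img.ravel() for img in images], dtype=float)
    X -= X.mean(axis=0)                        # center on the mean image
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T             # project onto top components

rng = np.random.default_rng(0)
imgs = [rng.standard_normal((8, 8)) for _ in range(5)]
coords = pca_features(imgs, n_components=2)
```

The rows of `Vt`, reshaped back to image dimensions, correspond to the eigenimages displayed in the results pages.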

Feature Analysis Procedures

  1. Spider Coran Classification
  2. Xmipp Kerden Self-Organizing Map
  3. Xmipp Rotational Kerden Self-Organizing Map

Notes, Comments, and Suggestions:


<Run Alignment | Run Particle Clustering >



Run Image Rejector

The user is able to globally reject images that do not meet certain criteria.

General Workflow:

The user chooses the options to reject "bad" images. This can be based upon different criteria:

  1. Defocus range
  2. Particle on the image
  3. CTF estimation results
  4. Tilt information
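
A hedged sketch of how such criterion-based rejection might look in Python (the image-record fields used here are hypothetical illustrations, not Appion's actual database schema):

```python
def reject_images(images, defocus_range=(0.5e-6, 3.0e-6)):
    """Keep only images whose defocus (in meters) lies inside the
    requested range. The `defocus` field is a hypothetical example of
    one rejection criterion; particle counts, CTF confidence, or tilt
    information could be filtered the same way."""
    lo, hi = defocus_range
    return [img for img in images if lo <= img["defocus"] <= hi]

records = [{"name": "img001", "defocus": 1.2e-6},
           {"name": "img002", "defocus": 4.0e-6}]
kept = reject_images(records)
```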

Notes, Comments, and Suggestions:


< Image Assessment



Run Particle Clustering

After feature analysis, particles are ordered and summed according to their relative similarity (proximity in reduced multidimensional image point space).
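
The clustering step can be illustrated with a tiny k-means sketch in Python (a generic illustration of grouping by proximity in feature space, not Appion's actual implementation):

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Group feature vectors by proximity: repeatedly assign each point
    to its nearest center and recompute each center as the mean of its
    members (the analogue of a class average)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to the nearest center
        d = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # recompute each center as the mean of its members
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
labels, centers = kmeans(pts, k=2)
```

Hierarchical clustering differs in that it merges the closest pairs bottom-up instead of fixing k in advance, which is why the dendrogram view is available for those runs.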

Particle Clustering Procedures

  1. Hierarchical or K-means Clustering

Notes, Comments, and Suggestions:


<Run Feature Analysis | Ab Initio Reconstruction >



Security Considerations

  1. Using SSL
  2. Creating a self signed certificate
  3. Forwarding http to https

Linux distribution recommendation

We list our experience and current progress here.

Our Preference : CentOS 5.x

If you have a new computer (or computers) for your Leginon/Appion installation, we recommend installing CentOS because it is considered to be more stable than other varieties of Linux.

CentOS is the same as Red Hat Enterprise Linux (RHEL), except that it is free and supported by the community.

We have the most experience installing the supporting packages on CentOS, and this installation guide has specific instructions for the process.

Start at Instructions for installing CentOS on your computer.

Other known cases of success:

Fedora 10

Start at Instructions for installing Fedora on your computer

SuSE, Ubuntu

MacOS


Instructions for installing CentOS on your computer >



Setup job submission server

In this case, we are setting up a job submission server that has all of the data directories mounted and the external packages (EMAN, Xmipp, etc.) installed on the compute nodes. Most institutions already have a job submission server, but the data is not accessible from it; Appion is not set up for that scenario except for large reconstruction jobs.


PBS and the Torque Resource Manager

PBS stands for Portable Batch System. It is a job submission system, meaning that users submit many jobs and the server prioritizes and executes each job as resources permit. Below we show how to install the popular open source PBS system called TORQUE.

A TORQUE cluster consists of one head node and many compute nodes. The head node runs the pbs_server daemon and the compute nodes run the pbs_mom daemon. Client commands for submitting and managing jobs can be installed on any host (including hosts not running pbs_server or pbs_mom). More documentation about Torque is available here.


Head node installation

Install Torque-server

TORQUE is available for Fedora and CentOS 5.4 (through EPEL). For YUM-based systems type:

sudo yum -y install torque-server torque-scheduler torque-client

Initialize Torque-server (because of the PATH setting you will need to become root)

Make sure the directory containing the pbs_server executable is in your PATH. For CentOS this is usually /usr/sbin.

sudo pbs_server -t create

Activate Torque-server

Enable the torque pbs_server and pbs_sched daemons on reboot and start them:

sudo /sbin/chkconfig pbs_server on
sudo /sbin/service pbs_server restart
sudo /sbin/chkconfig pbs_sched on
sudo /sbin/service pbs_sched start

Add nodes to Torque-server nodes file: /var/torque/server_priv/nodes

The format is:

node-name[:ts] [np=] [properties]

To add the localhost with two processors as a node, you would add:

localhost np=2

You should add every compute node to this file, e.g.,

node01.INSTITUTE.EDU np=2
node02.INSTITUTE.EDU np=4
node03.INSTITUTE.EDU np=2

Compute node installation

Install Torque-mom

TORQUE is available for Fedora and CentOS 5.4 (through EPEL). For YUM-based systems type:

sudo yum -y install torque-mom torque-client

Configure node to receive jobs from headnode:

see http://www.clusterresources.com/products/torque/docs/1.2basicconfig.shtml#initializenode for more details

Edit the /var/torque/mom_priv/config (CentOS 5) OR /var/lib/torque/mom_priv/config (CentOS 6) file:

$pbsserver  headnode.INSTITUTE.EDU   # hostname running pbs_server

For the localhost add:

$pbsserver  localhost   # hostname running pbs_server

Activate Torque-mom

Enable the torque pbs_mom daemon on reboot:

sudo /sbin/chkconfig pbs_mom on
sudo /sbin/service pbs_mom start

Munge

http://www.clusterresources.com/torquedocs/1.3advconfig.shtml

MUNGE is an authentication service used by TORQUE; together with the qmgr authorized_users setting, it lets you restrict which users on which hosts may submit jobs:

sudo create-munge-key
sudo /sbin/chkconfig munge on
sudo service munge start
sudo qmgr -c 'set server authorized_users=user01@host01'
sudo qmgr -c 'set server authorized_users=user01@host02'
sudo qmgr -c 'set server authorized_users=user01@*'

___

Test Torque Setup

On the head node, see if you can run a qstat:

qstat

To check the state of the compute nodes, type:

pbsnodes

On the head node, create a job and submit it:

echo "sleep 60" > test.job
echo "echo hello" >> test.job
qsub test.job
qstat

To list all server settings:

sudo qmgr -c 'list server'


^ Setup Remote Processing | Install SSH module for PHP >



Setup Local Databases

1 Make sure MySQL is installed

Follow the installation instructions.

Also install phpMyAdmin.
Note that phpMyAdmin version 2.11.10 works with older versions of PHP (that we happen to use).

2 Dump tables from cronus4 to a local file

This will grab the actual data that we use so you can play with it.
Log into cronus3 so that you can access cronus4.

$ ssh cronus3

Use mysqldump to get any table data that you want as in the example below.
Cronus4 is the host.
We do not lock the tables because we don't have permission to.
"project" is the name of the database and "login" is the name of the Table.
We make up a file name for the data to dump to.

$ mysqldump -h cronus4 -u usr_object --skip-lock-tables --extended-insert project login > ProjectLogin.sql
$ mysqldump -h cronus4 -u amber -p --skip-lock-tables --extended-insert project > Project.sql

The --extended-insert option causes mysqldump to generate multi-value INSERT commands inside the backup text file which results in the file being smaller and the restore running faster. ref

More info on mysqldump is here.

Exit cronus3 when you are done dumping tables and load the dump files into your database.
If you followed the instructions for setting up MySQL in the Leginon Install guide, you have already created dbemdata and projectdata databases.
If you don't have them, create them first.

mysql -u root projectdata < ProjectLogin.sql

3 Modify Config.php

This is the Myami config file. It is being changed right now, so this section is in flux and will be updated soon.

It should look like this:

// --- Set your leginon MySQL database server parameters

$DB_HOST        = "localhost";
$DB_USER        = "usr_object";
$DB_PASS        = "";
$DB             = "dbemdata";

// --- XML test dataset
$XML_DATA = "test/viewerdata.xml";

// --- Project database URL

$PROJECT_URL = "project";
$PROJECT_DB_HOST = "localhost";
$PROJECT_DB_USER = "usr_object";
$PROJECT_DB_PASS = "";
$PROJECT_DB = "projectdata";

4 Populate your databases automagically

Point your web browser to http://localhost/myamiweb/.
Navigate to the Administration page and then to the ProjectDB page.

Doing this will populate your database with the schema defined in myami/myamiweb/project/defaultprojecttables.xml.
If you need to repopulate tables, use phpMyadmin to empty the Install table in the project DB. Then repeat the steps above.


Setup Remote Processing

The web interface for Appion allows one to log in directly to a computer and process Appion jobs, but this requires a job submission system to be installed.

Types of clusters:

Note: The Local Cluster and Refine Reconstruction Cluster can be the same machine, but you will still need to perform all the setup instructions below for each type of cluster.

Local Cluster Appion processing setup

The following applies to both the web-server computer (set up earlier) and a job submission system on a local cluster. The job submission system usually consists of a head node (main computer) for receiving and scheduling jobs and individual processing nodes (slave computers) for running jobs. All of these systems CAN exist on a single computer.

  1. Setup job submission server
  2. Install SSH module for PHP
  3. Configure web server to submit job to local cluster
  4. Testing job submission
  5. Potential job submission problems

Appion Refinement Reconstruction Processing through ssh setup

  1. Edit the default_cluster.php file (Appion version 2.1 and earlier only)

< Create a Test Project



Share a Project Session with another User

  1. Select the project DB tool from the Appion and Leginon Tools start page at http://YOUR_SERVER/myamiweb.
  2. Select the Sharing link next to your project in the experiment table
     
    Sharing link

     
  3. On the Experiment Sharing page, use the drop down list of users to select a person to share the experiment with
  4. Click the add button to share the experiment with the selected user
     
    Experiment Sharing Page

< Upload Images to a new Project Session | View a Summary of a Project Session >



Sort Junk

Xmipp Sort by Statistics: This function sorts the particles in a stack by how closely they resemble the average. In general, this sorts the particles by how likely they are to be junk. After sorting, a new stack is created; you then select the point at which the junk starts and run Apply junk cutoff. The second function, Apply junk cutoff, creates a third stack with the junk removed.
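
The sort-by-statistics idea can be sketched generically in Python (an illustration of the principle, not the actual Xmipp implementation):

```python
import numpy as np

def sort_by_similarity_to_average(stack):
    """Score each particle by its correlation with the stack average and
    order the stack best-first, so likely junk (low correlation) ends
    up at the tail, where a cutoff can be applied."""
    avg = stack.mean(axis=0)
    scores = [np.corrcoef(p.ravel(), avg.ravel())[0, 1] for p in stack]
    order = np.argsort(scores)[::-1]     # highest correlation first
    return stack[order], order

# demo: four noisy copies of a signal plus one pure-noise "junk" particle
rng = np.random.default_rng(1)
signal = rng.standard_normal((8, 8))
good = np.stack([signal + 0.1 * rng.standard_normal((8, 8)) for _ in range(4)])
junk = rng.standard_normal((1, 8, 8))
stack = np.concatenate([good, junk])
sorted_stack, order = sort_by_similarity_to_average(stack)
```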

General Workflow:

  1. Check run name and enter a description.
  2. Make sure the "Commit to Database" box is checked if appropriate. Click "SortJunk" to submit to the cluster. Alternatively, click "Just Show Command" in order to copy and paste into a unix shell.
     

     
  3. After sort junk has run, return to the stack view and select [Apply Junk Cutoff] to select a cutoff point.
  4. Update the image range to show as many images as you would like to see. The best images appear first, the images most likely to be junk appear last. Press "Load" to view the images.
  5. Under "Selection mode" press the exclude button to toggle it to "select".
  6. Click on the last image that you would like to include in a new stack.
  7. Select "Apply junk cutoff"

     
  8. Enter a description in the next screen and select "Apply Junk Cutoff"

Notes, Comments, and Suggestions:

< Center Particles | Create Substack >


Spider Coran Classification

This method uses the Spider CA S command to run correspondence analysis (coran), a form of principal components analysis, and classify your aligned particles.

General Workflow:

Note: If you accessed "Run Feature Analysis" directly from an alignment run, you will be greeted by the screen displayed on the left below. Alternatively, if you accessed "Run Feature Analysis" from the Appion sidebar menu, you will be greeted by the screen displayed on the right below.

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Enter a description of your run into the description box.
  3. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  4. Check that the appropriate stack of aligned particles are being analyzed, or choose the appropriate stack from the drop-down menu. Note that stacks can be identified in this menu by alignment run name, alignment run ID, and that the number of particles, pixel and box sizes are listed for each.
  5. Click on "Run Spider Coran Classify" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  6. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Feature Analysis" submenu in the appion sidebar.
  7. Once the job is finished, an additional link entitled "1 complete" will appear under the "Run Feature Analysis" tab in the appion sidebar. Clicking on this link opens a summary of all feature analyses that have been done on this project.
  8. Click on the dendrogram to enlarge it in a new window.
  9. Click on individual eigen images to enlarge in a new window. Note that the % variance contribution of each eigen image is displayed underneath.
  10. To perform a clustering run, click on the grey link entitled "Run Particle Clustering on Analysis Id xxx" within the box that summarizes this alignment run.

Notes, Comments, and Suggestions:

  1. The color (extent of redness) of the percent variance contribution is determined by the ratio of the percent variance contribution from eigen image n to the contribution from the lowest factor. In other words, "redness" corresponds to the relative extent of contribution to variance.
  2. Clicking on "Show Composite Page" at the top of the Feature Analysis List page (accessible from the "completed" link under "Run Feature Analysis" in the Appion sidebar) will expand the page to show the relationships between alignment runs and feature analysis runs.

<Run Feature Analysis | Run Particle Clustering >



Spider Reference-based Alignment

This method uses the Spider AP MQ command to align your particles to the selected templates. Multiprocessing additions have made this extremely fast.

General Workflow:

  1. Check the boxes of the templates to be used as references during alignment.
  2. Click on "use these templates."
  3. Make sure that appropriate run name is specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  4. Enter a description of your run into the description box.
  5. Make sure that appropriate directory tree is specified.
  6. Select the stack to align from the drop down menu. Note that stacks can be identified in this menu by stack name, stack ID, and that the number of particles, pixel and box sizes are listed for each.
  7. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  8. Double check that the templates are the ones you want to use.
  9. Click on "Run Ref-Based Alignment" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  10. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Alignment" submenu in the appion sidebar.
  11. Once the job is finished, an additional link entitled "1 complete" will appear under the "Run Alignment" tab in the appion sidebar. This opens a summary of all alignments that have been done on this project.
  12. Click on the link next to "reference stack" to open a window that shows the class averages and that contains tools for exploring the result. Such tools include the ability to browse through particles in a given class, create templates for reference based alignment, substack creation, 3D reconstruction, etc.
  13. To perform a feature analysis, click on the grey link entitled "Run Feature Analysis on Align Stack ID xxx".

Notes, Comments, and Suggestions:

  1. In the parameters box on the right, under "Particle Params" the last and first ring radii refer to the inner and outermost rings along which alignment parameters will be determined. Good default values for a particle with a box size of 300 x 300 pixels are shown in the overview snapshot above.
  2. In the parameters box on the right, under "Alignment Params" the search range refers to the number of pixels that will be considered from the center of any given starting point during parameter determination. A step size of 1 means that every single ring between the first and last radii will be considered during the search. Good default values for a particle with a box size of 300 x 300 pixels are shown in the overview snapshot above.
  3. Clicking on "Show Composite Page" in the Alignment Stack List page (accessible from the "completed" link under "Run Alignment" in the Appion sidebar) will expand the page to show the relationships between alignment, feature analysis, and clustering runs.

<Run Alignment | Run Feature Analysis >



Spider Reference-free Alignment

This method uses the Spider AP SR command to align your particles.

General Workflow:

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Enter a description of your run into the description box.
  3. Select the stack to align from the drop down menu. Note that stacks can be identified in this menu by stack name, stack ID, and that the number of particles, pixel and box sizes are listed for each.
  4. Select a method for initializing reference-free alignment by activating one of the radio buttons. If you wish to use a template image to initialize alignment, you can locate the template ID number by clicking on "1 available" under "Upload template" in the "Import tools" submenu on the appion sidebar.
  5. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  6. Click on "Run Spider NoRef Alignment" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  7. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Alignment" submenu in the appion sidebar.
  8. Once the job is finished, an additional link entitled "1 complete" will appear under the "Run Alignment" tab in the appion sidebar. Clicking on this link opens a summary of all alignments that have been done on this project.
  9. Click on the "alignstack.hed" link to browse through aligned particles.
  10. To perform a feature analysis, click on the grey link entitled "Run Feature Analysis on Align Stack Id xxx" within the box that summarizes this alignment run.

Notes, Comments, and Suggestions:

  1. WARNING: this method is very quick (~few minutes), but also very sloppy and does not always do a great job. The only way to obtain decent results is to run several times and compare the results.
  2. Particle-specific Radii parameters: The default values are pretty good, but adjust as you see fit. Make sure that your particle radius is appropriate!
  3. Alignment-specific Radii parameters: These parameters allow you to specify the radius over which alignment will be done. Generally a small first ring radius is used, and a last ring radius that just encompasses the particle is used; however, special samples might require a larger initial ring.
  4. Clicking on "Show Composite Page" in the Alignment Stack List page (accessible from the "completed" link under "Run Alignment" in the Appion sidebar) will expand the page to show the relationships between alignment, feature analysis, and clustering runs.

<Run Alignment | Run Feature Analysis >



SPIDER Refinement

Coming Soon! Working out a few bugs...

General Workflow:

Notes, Comments, and Suggestions:


< Refine Reconstruction|Quality Assessment >



Stacks

After particle selection, individual particles are boxed out of the micrographs and placed into stack files for further processing.

Available Stack Tools:

Appion SideBar Snapshot:

Notes, Comments, and Suggestions:



< CTF Estimation | Particle Alignment >



Stack Creation

This procedure boxes out particles and is also able to apply CTF and astigmatism correction.

General Workflow:

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Select the particle selection run to use for boxing from the drop down menu.
  3. Enter a run description.
  4. Enter a box size '''How does one determine what box size to enter?'''
  5. Select the appropriate preset of images to search (default is en if you collected with leginon).
  6. From the drop down menu, select whether you want to box particles on tilted images as well (for RCT and OTR, make two separate stacks for tilted and untilted particles!)
  7. Check these boxes if you are creating this stack concurrently with data collection, particle selection, and CTF estimation. Appion will wait for more images to roll in.
  8. You can pre-filter particles in accordance with your rejection (hide) or exemplar decisions in the image viewer and/or using the image assessment tool.
  9. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  10. Check the "invert image density" box if you collected ice data and wish to perform 2D alignment and classification. Also, if you wish to normalize your stack to STDEV of 1.0, select the "Normalize Stack Particles" box.
  11. To apply CTF correction, select the "Ctf Correct Particle Images" box, and choose the CTF correction method from the drop down menu. We like the ACE2 Wiener Filter Whole Image corrector.
  12. Click on "Make Stack" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  13. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Stack creation" submenu in the appion sidebar.
  14. Once the job is finished, an additional link entitled "1 complete" will appear under the "Stack Creation" tab in the appion sidebar. This opens a summary of all stacks that have been created for this project.
  15. On the stack list page, click on the "start.hed" link to browse through montages of stack particles.
  16. A variety of tools are available for centering and clean-up of particle stacks. Also see more stack tools.
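
The Wiener-filter idea behind CTF correction (step 11) can be sketched with the standard textbook formula; this is a generic illustration, not the specific ACE2 implementation:

```python
def wiener_correct(value, ctf, snr=10.0):
    """Apply the textbook Wiener-filter formula to a single Fourier
    coefficient: divide by the CTF while damping frequencies where the
    CTF is near zero, instead of letting the division blow up.
    The SNR value is an assumed illustration parameter."""
    return value * ctf / (ctf * ctf + 1.0 / snr)

# near a CTF zero the output is damped rather than amplified
corrected_at_zero = wiener_correct(1.0, 0.0)
corrected_at_peak = wiener_correct(1.0, 1.0)
```

At CTF zeros the filter suppresses the (information-free) coefficient; where the CTF is strong it approximately divides by the CTF, restoring the particle's phases and amplitudes.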

Notes, Comments, and Suggestions:

  1. Default parameters work well generally.
  2. Remember that, if you low pass or high pass filter your stack particles, then any filters applied during subsequent processing will be in addition to this original filter. We tend to filter during alignment instead of during stack creation.

<Stacks | Particle Alignment >



Start with existing ANY Linux flavor OS

Appion Install Manual Referenced to Leginon Installation

=====================

The Appion Team

email: for any help you need.

1. Introduction

This document describes a general installation of Appion, concentrating on the installation and setup of the database and web servers. Most sections refer to the Leginon installation documentation, since the two share the same general architecture. If you want to run real Leginon on the microscope, you just need to follow the additional installation starting from the Processing Server Windows Installation chapter of the Leginon installation manual.

2 Setup MySQL databases

See the Leginon database server-side installation manual for details, including how to increase the cache size.

Preliminary steps if you have an existing project database from a previous installation

We need to remove a table in the project database called "install". This will allow the new default tables to be defined when we set up on the web-server side.

$ mysql projectdata -u usr_object
mysql> drop table install;
mysql> exit

Additional work

You need to decide what prefix you will use for the processing databases. We will be creating them on the fly later. Our default is ap followed by a project id number. More about this later.
At this point, you need to do the following to grant privileges to users for any database whose name starts with ap

$ mysql -u root -p

Note: If you didn't set a mysql root password, don't use -p option.

mysql> GRANT ALL PRIVILEGES ON `ap%`.* TO usr_object@"%";
mysql> exit

3 Web server side installation

Web Server Installation

TODO: The following steps are most likely no longer needed here:

Follow instructions in Leginon Database server-side installation manual.

Continue with the instructions in Leginon Web server set up and Installation.

Additional work

$DEF_PROCESSING_PREFIX = "ap";
addplugin("processing");
$PROCESSING_DB_HOST = "your_db_host";
$PROCESSING_DB_USER = "usr_object";
$PROCESSING_DB_PASS = ""; 
$PROCESSING_DB = "";
Remember that the last line should be kept empty as this will be set dynamically.

We will not include the processing host or cluster registration now. It is covered in the last part of this document.

4 Create a test project and processing database

Follow the instructions on how to create new project in the Leginon Manual.

Additional work

  1. Click on a project name on the webpage `http://your_host/project_1_2/project.php`. This will take you to a new webpage `http://your_host/project_1_2/getproject.php?pId=1`. The number following "pId=" depends on the project id automatically assigned to the project.
  2. At the end of the Info table, you should see:
processing db: not set (create processing db) db name ap1
You can create the default numbered-style database (ap...) or give it a new name with the same prefix. If you want to specify a database name that does not use the default prefix, note that the db user specified in config.php in project_1_2 needs the necessary privileges for that database. You may additionally want to change the value assigned to $DEF_PROCESSING_PREFIX in project_1_2/config.php if you want to use your new prefix all the time.
  1. Enter the processing database name and click "create processing db".
  2. The page should refresh and display the linked database like this:
processing db: ap1

See the next section on troubleshooting if you get the original page instead.

If you want all your processing databases combined in a single database (not recommended, as it becomes large very fast), just use the same name for all your projects.

The above procedure not only creates the database, but also creates some of the tables that you need to start processing.

Trouble Shooting

If the 'getproject.php' webpage remains unchanged, your processing database link was not accepted. This is usually
caused by an incorrect setting such as:
  1. The mysql user does not have the privileges to create the named database.
    See Section-5.3 for what to do. After correcting mysql, repeat Section-5.2 to allow reinitialization from the project_1_2 web page, then try Section-7.2 again.
  2. You have accessed an earlier version of the project web page after you reinitialized the install table in your existing project database.
    The install table in the project database is set to deny further changes once any project web page is accessed. As a result, required table property changes and new table insertions would fail.
    Here is how to fix it:
      1. Repeat Section-5.2 to allow reinitialization.
      2. Try Section-7.2 again.

5 Processing server side installation

Follow Leginon processing-server-side installation on Linux

Additional work

use_processingdb_table = True
[appionData]
user:    usr_object

[Note] The module names in brackets are case sensitive and need to be exact.
The user name needs to match the name for which privileges have been granted on the `ap%` databases.
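For reference, a fuller sinedon.cfg might look like the sketch below. The `use_processingdb_table` line and the `[appionData]` section with its `user` key come from the instructions above; the `[global]` section and its keys are assumptions drawn from a typical Leginon setup and should be checked against your own installation.

```
use_processingdb_table = True

[global]
host: your_db_host
user: usr_object
passwd: your_password

[appionData]
user:    usr_object
```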

Download and install pyappion

Install prerequisite processing packages that are not included in the Leginon installation

Find a list of these packages here.

Download pyappion

Compile and Setup Instructions

Install external packages

Follow instructions for the individual packages.
Instruction for compiling Xmipp for OpenMPI is here.

6 Initialize Leginon with its admin tools

DO NOT follow Leginon administration tool instructions. We only need to upload images for appion processing, which is much simpler.

  1. Go to the webpage `http://your_host/dbem_1_5_1/addinstrument.php`
  2. Create a fake TEM instrument like this:
name: my_scope
hostname: whatever
type: Choose TEM
  3. Create a fake CCDCamera instrument like this:
name: my_scope
hostname: whatever
type: Choose CCDCamera

[Note] If you use Leginon and still want to upload non-Leginon images, make sure that you create a pair of fake instruments like these on a host used solely for uploading. It will be a disaster if you don't: the pixel size of the real instrument pair will be overwritten by your upload.

7 Start your first Appion session from the web

Upload Images to a new session

  1. Go to webpage `http://your_host/dbem_1_5_1/` This is the general starting point for Leginon.
  2. Follow the link for the Project DB (bottom right)
  3. Select your test project by clicking on its name. This takes you to the same page you used to create the processing database.
  4. Below the experiment heading you will find a link that says "upload images to new session". This takes you to your first Appion processing page, where you can use the web gui and instructions to upload your images.
  5. Once all fields are filled in, click on "Just Show Command", which brings you to a page that displays a command line. Copy and paste this command into your text terminal and execute it; it uses the processing server programs and the database behind them.
    [Note] All images uploaded in an experiment session should have the same pixel size because they cannot easily be divided into groups during processing.

Good starting point for future reference

8 Remote Appion Processing through ssh

A more advanced way to run an appion script is through an ssh session. This is equivalent to sshing into a computer yourself and starting the appion processes.

There are two kinds of appion processes. The first is a single-node process that can run on a stand-alone workstation or on the head node of a computer cluster without PBS. The second is a multiple-node process that requires PBS to run on the cluster. The "Just Show Command" option always produces a single-node process, but a job run through ssh can be either, depending on the demands of the process. For example,
imageuploader.py always runs as a single-node process, while maxlikeAlignment.py runs either on a single node with "Just Show Command" or as a PBS job submission when run through ssh.
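The dispatch described above can be sketched as a simple lookup. This is a hypothetical illustration only, not Appion's actual launcher: the `PBS_SCRIPTS` set and the `qsub` wrapping are assumptions made for the example; only the two script names come from the text.

```python
# Which scripts need a PBS cluster when launched through ssh
# (illustrative assumption, not an exhaustive or official list).
PBS_SCRIPTS = {"maxlikeAlignment.py"}

def submission_command(script, args, via_ssh=False):
    """Return the command line a launcher might use for an appion script."""
    base = [script] + list(args)
    if via_ssh and script in PBS_SCRIPTS:
        # Multi-node jobs get wrapped for the PBS queue.
        return ["qsub"] + base
    # "Just Show Command" output and single-node scripts run directly.
    return base

print(submission_command("imageuploader.py", ["--help"]))
print(submission_command("maxlikeAlignment.py", [], via_ssh=True))
```

The point of the sketch is only that the launch mode depends on both the script and whether the job goes through ssh, exactly as the paragraph above states.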

Install and setup ssh2 extension for php at your web-server

This enables an ssh session initialized from appionweb.
  1. Download and install the ssh2 extension for php as instructed at its download site.

The extension module is added to php in the same way as the php-mrc module we distribute for viewing mrc images through php. To check whether it worked, and for an alternative way to make newer versions of php recognize the module, see http://ami.scripps.edu/documentation/leginon/bk02ch04s07.php under the sections "Check php information" and "Alternative approach if mrc module does not show up in info.php output".

Configuration at the web server side

IMPORTANT: What we refer to here as your_cluster.php should not be taken literally. For example, if you access your cluster over the network by the name "bestclusterever", you should name your cluster configuration php file bestclusterever.php, not your_cluster.php.
  1. Go to your_dbem_1_5_1/processing directory
  2. Copy default_cluster.php to your_cluster.php
  3. Edit your_cluster.php to correspond to your cluster configuration.
  4. Edit config_processing.php to add your cluster and, if desired, a stand-alone workstation as processing hosts, and register the your_cluster.php you just created.
$PROCESSING_HOSTS[]="your_stand-alone_processing_workstation";
$PROCESSING_HOSTS[]="your_cluster";
$CLUSTER_CONFIGS= array (
  'your_cluster'
);

Setting up PBS on processing server side

Follow these instructions to set up PBS on your processing server.

Testing

Testing php-ssh2 installation

Check your info.php as you did with the mrctool installation. A correctly installed extension should show up in the output of info.php. See http://ami.scripps.edu/documentation/leginon/bk02ch04s07.php under the sections "Check php information" and "Alternative approach if mrc module does not show up in info.php output".

Testing ssh log in

Use the top right form on the processing page to log in as if starting an ssh session. The page will acknowledge that you have been logged in if the setup is correct. While logged in, you will be able to edit the description of a run and hide failed runs. The option for submitting the job appears at the bottom of the processing form whenever available.

Testing simple appion processing job submission

Upload template is a good example to try:
  1. Create a small mrc image as your template to be uploaded.
  2. Select Upload template in Import tools in the appionweb processing menu of an existing session.
  3. Type in the needed information.
  4. When you finish filling the form, instead of clicking on "Just show command", choose your processing host and run the job by clicking "Upload Template".
  5. For a simple process such as this, the webpage will take a little while to refresh while the job completes.
  6. If you get the message "Template is uploaded", the process was successful; if you refresh the page, you will find the template in the available list.

Testing PBS-required appion processing job submission

Particle selection such as DogPicker is a good example to try:
  1. Click on DoG Picking under Particle Selection menu
  2. Enter the required parameters
  3. When you finish filling in the form for an appion process, choose your processing host and run the job. Pages for monitoring the job become available after the job is queued and subsequently begins running. If the status appears as "Running or Queued" at first, the setup is likely correct.
  4. After a while, the process will be completed and the status becomes "Done" when you click to have the Status updated on the page.

Check your_cluster.php setup

For reconstructions involving iterations of different parameters, such as EMAN reconstruction by refinement, your_cluster.php is used to generate the script. Examine the script created on the web form and modify your_cluster.php as needed. You can copy the script to your cluster and test-run and modify it until it is correct.


Start with existing CentOS 5.3 64-bit installation

1 Download additional software

If you are installing on CentOS, this section outlines and streamlines the installation of prerequisite packages.

Download additional Software


2 Database Server Installation

Setup MySQL database


3 Processing Server Installation

Setup Appion programs


4 Web Server Installation

Web Server Installation


5 Test the Installation

Create a Test Project


6 Setup Remote Appion Processing

Setup Remote Processing



Step by Step Guide to 3D Reconstruction in Appion

Purpose: Reconstruct a 3-dimensional electron density map in Appion using a streamlined protocol.

General Workflow:

  1. View the raw micrographs: Images may be acquired using Leginon or uploaded to the database using the “Upload images” functionality in Appion (Figure, c). Once images are uploaded they are automatically tracked by the database and can be viewed using a web-based Imageviewer. Clicking the “processing” button in the Imageviewer takes the user to the main processing page of Appion (Figure, b).
  2. Select particles: Appion provides several methods: “DoG Picker” is a reference-free approach, “Template Correlator” uses the reference-based approach in FindEM, and “Manual Picker” provides for interactive particle picking by the user. An example workflow for particle selection includes: (a) Selecting a subset of particles via DoG Picker or Manual Picker, (b) Aligning and classifying the subset to produce class averages, (c) Using selected class averages for Template Correlator. Users always have the option of cleaning up picks from any of the automated picking runs using Manual Picker. Overall results are provided on Appion summary pages or can be viewed overlaid on the individual images in the Image viewer by selecting the “P” button (Figure, a). As with CTF estimation, particle picking can be started concurrently with image acquisition.
  3. Estimate the CTF: ACE and ACE2 provide fairly robust algorithms for CTF estimation on untilted images that generally require no adjustments to the provided default settings. CTFTilt can be used for CTF estimates on tilted micrographs. A summary of results can be viewed by clicking on the "complete" CTF items in the Appion menu. Results for individual images can also be viewed on the Imageviewer pages by selecting the "ACE" button. Individual results include graphical overlays, estimated parameters, and associated fitness values. We generally find that fitness values of >0.8 are acceptable. CTF estimation can be started concurrently with image acquisition; the estimation program, once started, will keep querying for new images as they come in.
  4. Create a particle stack: The “Stack Creation” page is used to extract particles from the images based on the picks from a particle selection run, or the picks associated with a previously created and modified stack. Inputs include options for filtering, binning, CTF correction, etc. Particles can be rejected based on CTF fitness parameters, particle correlation values, or defocus range. Results pages provide a summary of the stack, a link to view the stack as individual particles, and further options to clean up the stack using a variety of filters.
  5. Align the raw particles: Reference-free procedures include Xmipp maximum-likelihood and SPIDER reference-free alignment, which can be used to create references for subsequent reference-based alignment. Reference-based procedures include Xmipp reference-based maximum likelihood, SPIDER multi-reference, IMAGIC multi-reference, and EMAN multi-reference alignments. Most procedures can be run initially using default input parameters. The aligned particles can be examined and manipulated further from the summary pages.
  6. Classify the aligned particles: As with the alignment routines, Appion provides several different classification options that can be applied to any alignment run. These include: SPIDER Correspondence Analysis, IMAGIC Multivariate Statistics Analysis, or the Xmipp Kerden Self-Organizing Map routine. The specified feature analysis routine locks the user into a clustering procedure, which generates summed class averages. Class averages can be viewed and manipulated further from the summary pages after requesting "View montage as a stack for further processing". Options include viewing the raw particles associated with each class, creating templates or substacks from selected classes, or running common lines procedures to create an initial model from selected classes.
  7. Generate an initial model: Many options are available. Models can be uploaded from the PDB or EMDB, read in from a file, or imported from previously reconstructed datasets. If tilted data has been acquired, it is possible to perform “one-click” RCT or OTR, and tomographic reconstructions from selected 2D class averages or Z-projected subtomogram averages. These options are presented when viewing the stacks or class averages of appropriate datasets. Other options include common-lines approaches either utilizing EMAN’s cross-common lines protocol or an automated version of IMAGIC’s angular reconstitution. These options are available when viewing class averages or from the Ab Initio Reconstruction menu option.
  8. Refine: Options include procedures using EMAN1, Frealign, or Spider application packages. Results can be viewed on summary pages and more detailed results that provide output for each iteration step including data and graphical output for Resolution curves, Euler angle distributions, snapshots of 3D maps, class averages and particles contributing to the map etc.

Notes, Comments, and Suggestions:



< Terminology | Quality Assessment >



Subversion

How to create a new branch in svn

Information on branching.

  1. Before creating the branch, you need to update myamiweb/xml/projectDefaultValues.xml which holds the version number.
    The version number is stored in the database at installation time.
     
  2. Create a new branch:
    $ svn copy http://ami.scripps.edu/svn/myami/trunk http://ami.scripps.edu/svn/myami/branches/myami-2.1 -m "Creating a branch for myami 2.1" 
    
    Committed revision 14869.
    
    

     
  3. If this is a release branch, ask Christopher to make sure only a couple of people are allowed to check in changes to this branch.

Synthetic Data

This section enables the user to create a synthetic projection dataset from an input 3D model, applying randomized rotations and translations, white Gaussian noise, and an envelope and contrast transfer function.

Synthetic Dataset Creation:

  1. Synthetic Dataset Creation: create a synthetic dataset either from evenly distributed or axially preferred projections of a selected 3D model.

Appion SideBar Snapshot:

Notes, Comments, and Suggestions:


< Appion Processing



Synthetic Dataset Creation

This method uses projections of a 3D model in order to create a synthetic dataset. Although it can be modified according to the options specified, the scheme consists of 12 basic steps, as shown and summarized below:

  1. a model is chosen from which projections are created
  2. the model is projected either in an even distribution or with axial preference
  3. the projections are randomly rotated in the XY plane
  4. the projections are randomly shifted in the XY plane
  5. white Gaussian noise is added
  6. a contrast transfer function is added according to the specified defocus parameter and the spherical aberration constant of the microscope
  7. an envelope function is added according to an experimentally determined decay function from 3000 real micrographs (see Voss, N, Lyumkis, D. et al, JSB (2010) 169, 3, 389-98).
  8. a second level of Gaussian noise is added, usually to bring the signal-to-noise ratio down to ~0.05, consistent with real ice data (see Baxter, WT, et al, JSB (2009) 166, 2, 126-32).
  9. (optional) the CTF is estimated by ACE2 and corrected
  10. (optional) the particle is band-pass filtered
  11. the final particle (in the example, a 50S ribosomal subunit) is produced with an SNR of 0.05
  12. particle is added to a growing stack of synthetic particles
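The two noise steps above amount to choosing a Gaussian standard deviation from the signal variance and the target SNR. As a minimal sketch of that arithmetic (not the actual Appion code, and assuming the common convention SNR = var(signal) / var(noise)):

```python
import math
import random
from statistics import pvariance

def noise_sigma(signal, snr):
    """Std. dev. of Gaussian noise that brings `signal` to the target SNR,
    with SNR taken as var(signal) / var(noise)."""
    return math.sqrt(pvariance(signal) / snr)

def add_noise(signal, snr, rng=random):
    """Return `signal` with white Gaussian noise added at the target SNR."""
    sigma = noise_sigma(signal, snr)
    return [s + rng.gauss(0.0, sigma) for s in signal]

# A target SNR of 0.05 (as in step 8 above) needs noise with 20x the
# signal variance, i.e. sigma = sqrt(var / 0.05).
signal = [0.0, 2.0] * 8                      # toy "image" with variance 1.0
print(round(noise_sigma(signal, 0.05), 3))   # 4.472
```

Under this convention, lowering the SNR target means a larger sigma, which is why the second noise level (step 8) dominates the appearance of the final particle.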

General Workflow:

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify their own names and directories.
  2. Enter a description of your run into the description box.
  3. Specify whether or not you want to commit the results to the database
  4. The model selected on the previous page is shown here
  5. The model parameters are shown here. NOTE: the boxsize of the model will be the boxsize of the resulting stack.
  6. The user can specify how the projections are to be carried out. In this example, the projections are evenly distributed at an angular increment of 5 degrees (note that, because "evenly distributed" is selected in the pulldown, the third option is blacked out). They are then randomly shifted by 5 pixels and randomly rotated in every possible direction. One can also specify the "axial preference" option, in which case the projections mainly revolve around the 3 axes with a specified standard deviation about the projection axis. This was initially used to test common lines routines, which usually perform better with highly variable views of the particle.
  7. The two levels of signal-to-noise are specified. In test cases, it was found that more realistic-looking particles are created when the 1st SNR level is 2x bigger than the second. For example, if you want the final SNR level to be 0.1, specify 0.2 in the first box, then 0.1 in the second.
  8. A contrast transfer function is applied according to the specified defocus in the X and Y directions, as well as the angle of astigmatism. If you choose to randomize the defocus values, they will also be perturbed according to the standard deviation. For example, if the defocus is -1.5 (X), -1.5 (Y) and the standard deviation is 0.3, then 67% of the defoci will fall into the range of -1.2 to -1.8 microns.
  9. Specify this option ONLY if you would like to correct for the applied CTF. In the example figure, the checkbox is not marked, and the options are blacked out. "Applied CTF" means that the CTF correction uses equivalent values as CTF application (essentially adding a Wiener filter to the projections). "Use ACE2 Estimate" uses the ACE2 program in order to estimate the CTF for synthetically created micrographs, which is usually quite robust unless the SNR is very low. "Perturb Applied CTF" attempts to simulate errors in CTF estimation algorithms by slightly perturbing the value of the corrected CTF compared to the applied CTF. For example, the applied CTF might be based on a defocus of - 1.5 microns, but, when this option is specified with a standard deviation of 0.05, then 67% of the corrected parameters will be off by +/- 0.05 microns or less.
  10. the final stack can be optionally band-pass filtered and normalized
  11. when the run is finished, the user is notified of its completion. NOTE: the synthetic dataset creation utility effectively creates a stack of particles. Therefore clicking either on the number completed within the "synthetic dataset creation" category OR within the stack category will take you to the same stack summary page. The former, however, will only display the synthetic stacks, while the latter will display all stacks.
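The defocus randomization described in step 8 is just a Gaussian perturbation of the nominal defocus, so roughly two thirds of the draws land within one standard deviation of it. A sketch of that idea (illustrative only; the function name and structure are assumptions, not Appion's implementation):

```python
import random

def perturb_defocus(defocus_um, std_um, rng=random):
    """Draw a perturbed defocus (in microns) from a Gaussian centered
    on the nominal value, as in the randomize-defocus option above."""
    return rng.gauss(defocus_um, std_um)

# With defocus -1.5 um and std 0.3 um, about two thirds of draws should
# fall between -1.8 and -1.2 um, matching the example in the text.
rng = random.Random(7)
draws = [perturb_defocus(-1.5, 0.3, rng) for _ in range(1000)]
within = sum(-1.8 <= d <= -1.2 for d in draws) / len(draws)
print(round(within, 2))  # roughly 0.68 (one standard deviation)
```

The same perturbation idea applies to the "Perturb Applied CTF" option in step 9, where the corrected defocus is jittered relative to the applied one.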

System Requirements

Any Linux flavor on which you can run your favorite 3DEM package will probably work, as long as you are also able to perform the less-standard database and web server functions. These include:
  1. starting up a web server (generally Apache)
  2. starting and configuring a MySQL database
  3. compiling and adding extensions to php
    Together, this is known as LAMP (Linux-Apache-MySQL-PHP)
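As an illustration of checking such prerequisites from the command line, a small helper (hypothetical, not part of the Appion installer) can report which commands are missing from PATH. Note the binary names vary by distribution (e.g. httpd vs apache2), so the list here is a guess:

```python
import shutil

def missing_commands(commands):
    """Return the subset of `commands` not found on PATH."""
    return [cmd for cmd in commands if shutil.which(cmd) is None]

# Example: the LAMP pieces mentioned above; adjust names for your distro.
lamp = ["httpd", "mysql", "php"]
print(missing_commands(lamp))  # lists whichever pieces are not installed
```

Running this before starting the installation gives a quick sense of which of the three LAMP services still need to be set up.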

Some flavors have good package installation management, so it is to your advantage to use one of them.
We have a step-by-step installation guide for CentOS. If you use another flavor, it is up to you to find the details.

Software requirements

General Packages you need to have

Check to see if you already have these packages; if not, download and install them with your package management program.

Packages included in the Leginon installation

You can find more detailed installation notes at: Complete Installation Chapter of the Leginon installation manual.

Processing server
Name: Download site
Python 2.4 or newer http://www.python.org
wxPython 2.5.2.8 or newer http://www.wxpython.org
MySQL Python client 1.2 or newer http://sourceforge.net/projects/mysql-python
Python Imaging Library (PIL) 1.1.4 or newer http://www.pythonware.com/products/pil/
Python XML module 0.8.3 or newer http://pyxml.sourceforge.net
NumPy 1.0.1 or newer http://numpy.scipy.org
SciPy 0.5.1 or newer http://www.scipy.org, http://repos.opensuse.org/science
Database server
Name: Download site
MySQL-Server 5.0 or higher http://www.mysql.com
MySQL-Client 5.0 or higher http://www.mysql.com
Web server
Name: Download site
Apache www.apache.org
php www.php.net
php-devel rpmfind.net/linux/RPM/Development_Languages_PHP.html
php-gd (including GD library, its development libraries and header) www.libgd.org (Use gd2)
fftw3-devel library (including development libraries and header) www.fftw.org (Use fftw3)

Packages not included in the Leginon installation

Name: which program needs it
ImageMagick appion stack creation
Grace appion summary reports
Matplot Lib appion summary reports
GNU Plot SPIDER
GCC Fortran95 FINDEM
GCC Fortran77 FINDEM
GCC Objective-C ACE2
GNU Scientific Library ACE2

Additional php extension needed by the web server

Name: where to get installation instructions
ssh2 extension for php http://us.php.net/manual/en/book.ssh2.php

Structure biology software packages that are used at NRAMM with the current Appion release

Since we are not up-to-date on all packages, we can't guarantee that the newest version you have will work.

Required
Package Version Notes
EMAN 1.9 cluster download binary
UCSF Chimera 1.2509 download v1.2509 binary
rmeasure 1.05 download binary
Recommended
Package Version Notes
SPIDER 15 download binary
Xmipp 2.3 download source
ctftilt ? download binary
FREALIGN 8.08 download binary
IMAGIC 5
EM-BFACTOR ? download binary
Optional
Package Version Notes
Matlab 7.5.0 for ACE1 but not ACE2

Known issues

Software from NRAMM

NRAMM software is available from two separate sites on our local servers


< What is Appion? | Credits >



Template Picking

Template picking is usually the most accurate and convenient way to extract particles. Once an initial model or 2D averages have been acquired they can be used as templates to identify similar particles within the micrograph.

General Workflow:

  1. Choose Templates: Templates are usually created from backprojections of a 3D model or from 2D averages of similar single particles (Upload Template). You can also create them from single 2D images or class averages: whenever you look at the results of a stack creation (single images) or classification (class averages), you have the option to select the images you like and click on a Create Templates button. Choose characteristic templates (something like a top view, a side view, and one or two tilted views) and assign an angular increment for the search (a cylindrical view probably needs no rotation at all; a rectangular view something between 0-90deg). Be as accurate as you like, but keep in mind that this decision is directly correlated with the processing time required.
  2. Test first, then submit: Choose a mask diameter (if you have no idea, make it big) and play around with the different parameters (simply paste the filename of a typical image into the test settings box). It is a good idea to optimize the settings on one image and then test a second image. Don't worry about the final boxsize or binning; these will be determined in the next step: stacks!
  3. Click on Run Template Picker to submit the job. If you submit the job while you are still collecting data, use the option "wait for more images after finishing".
  4. Continue with stacks

Notes, Comments, and Suggestions:

  1. If you want to rerun the job with identical settings go to Repeat from other session and select the desired job
  2. Image size must be a multiple of 2 after your binning for this function and all other functions that use FFT.
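The even-size requirement in note 2 can be enforced with simple integer arithmetic. This hypothetical helper (an illustration for this document, not an Appion function) bins an image dimension and rounds it down to a multiple of 2:

```python
def binned_even_size(size, binning):
    """Bin an image dimension and round down to a multiple of 2,
    as required by FFT-based functions such as the template picker."""
    return (size // binning) // 2 * 2

print(binned_even_size(4096, 4))  # 1024 (already even)
print(binned_even_size(4096, 3))  # 1364 (1365 rounded down)
```

If the binned dimension comes out odd (as with 4096 at binning 3), the image must be cropped or padded to the nearest even size before the FFT-based steps will accept it.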

< Particle Selection | CTF Estimation >


Terminology

Overview of Processing Pages:

A. Project name, EM Session name for current dataset, and directory path for images

B. Appion SideBar is where all jobs are launched and tracked. It consists of several drop-down menus, organized in accordance with image processing stage. Only the processing steps possible for a given project at a given time are displayed (i.e. if you are working with untilted data, all tilt-data processing options are hidden). The top of this menu contains options to hide, expand, or contract the side bar.

C. Submenus contain links for running a particular process and also keep track of the jobs completed, queued, or running at a given time. Clicking on the link for a particular procedure opens the options available for that procedure in a window next to the appion sidebar. If the procedure has several packages associated with it (such as alignment), an initial page opens with clickable links to the Appion processing pages for the various algorithms available.

D. The left side of processing pages display the minimal parameters that a user should check before running, and provides drop down menus where appropriate.

E. The right side of processing pages is a gray box containing parameters that more experienced users are familiar with. Floating help boxes appear when mousing over a particular operation to guide the user in appropriately setting these parameters. Default parameters are automatically entered.

F. To run a procedure the user can click "Run Command" to submit the job or "Just Show Command" to copy and paste the command into a unix terminal. If the "Commit to Database" box is checked, either method will track the process in the database and display the results in the appion processing pages.

G. Underneath user defined parameter boxes appion displays any additional information relevant to the process. In this case, the template that was selected for reference based alignment is displayed.

H. References to the particular software used for any given procedure are provided. Please let us know if we have missed or need to update a reference!

Appion SideBar and Processing Page Snapshot:

Notes, Comments, and Suggestions:



Common Workflow >



Testing job submission

Testing

Testing php-ssh2 installation

Check your info.php as you did with the mrctool installation. A correctly installed extension should show up in the output of info.php. See http://ami.scripps.edu/documentation/leginon/bk02ch04s07.php under the sections "Check php information" and "Alternative approach if mrc module does not show up in info.php output".

Testing ssh log in

Use the top right form on the processing page to log in as if starting an ssh session. The page will acknowledge that you have been logged in if the setup is correct. While logged in, you will be able to edit the description of a run and hide failed runs. The option for submitting the job appears at the bottom of the processing form whenever available.

Testing simple appion processing job submission

Upload template is a good example to try:
  1. Create a small mrc image as your template to be uploaded.
  2. Select Upload template in Import tools in the appionweb processing menu of an existing session.
  3. Type in the needed information.
  4. When you finish filling the form, instead of clicking on "Just show command", choose your processing host and run the job by clicking "Upload Template".
  5. For a simple process such as this, the webpage will take a little while to refresh while the job completes.
  6. If you get the message "Template is uploaded", the process was successful; if you refresh the page, you will find the template in the available list.

Testing PBS-required appion processing job submission

Particle selection such as DogPicker is a good example to try:
  1. Click on DoG Picking under Particle Selection menu
  2. Enter the required parameters
  3. When you finish filling in the form for an appion process, choose your processing host and run the job. Pages for monitoring the job become available after the job is queued and subsequently begins running. If the status appears as "Running or Queued" at first, the setup is likely correct.
  4. After a while, the process will be completed and the status becomes "Done" when you click to have the Status updated on the page.

Check your_cluster.php setup

For reconstructions involving iterations of different parameters, such as EMAN reconstruction by refinement, your_cluster.php is used to generate the script. Examine the script created on the web form and modify your_cluster.php as needed. You can copy the script to your cluster and test-run and modify it until it is correct.


< Configure web server to submit job to local cluster | Potential job submission problems >



Test Appion

You need to edit leginon.cfg.

Note: check3rdPartyPackages.py is currently available only with a development svn checkout; it will be included in version 2.2.


< Install Ace2 | Processing Server Installation ^



Test Appion

You need to edit leginon.cfg.

Note: check3rdPartyPackages.py is currently available only with a development svn checkout; it will be included in version 2.2.


Test datasets at AMI

The AMI database includes several test sessions that point to copies of collected images that can be used for testing purposes.
See issue #1229 for more information.


H1 TEST PAGE!!!!

Appion Processing

  1. Terminology
  2. Common Workflow
    1. Step by Step Guide
    2. Quality Assessment
  3. Processing Cluster Login
  4. Particle Selection
    1. Template Picking
    2. Dog Picking
    3. Manual Picking
    4. Repeat from other session
    5. Random Conical Tilt Picking
  5. CTF Estimation
    1. Ace Estimation
    2. Ace 2 Estimation
    3. CtfFind Estimation
    4. Repeat from other session
  6. Stacks
    1. Stack creation
    2. more stack tools
      1. Combine Stacks
      2. View Stacks
      3. Alignment SubStacks
      4. Jumpers SubStack
      5. Convert Stack into Particle Picks
  7. Particle Alignment
    1. Run Alignment
    2. Run Feature Analysis
    3. Run Particle Clustering
  8. Ab Initio Reconstruction
    1. RCT Volume
    2. OTR Volume
    3. EMAN Common Lines
    4. IMAGIC Angular Reconstitution
  9. Refine Reconstruction
    1. EMAN Refinement
    2. Frealign Refinement
    3. SPIDER Refinement
    4. IMAGIC Refinement
  10. Tomography
    1. Align tilt series
    2. Create full tomogram
    3. Upload tomogram
    4. Create tomogram subvolume
    5. Average subvolumes
  11. Import Tools
    1. PDB to Model
    2. EMDB to Model
    3. Upload Particles
    4. Upload Template
    5. Upload Model
    6. Upload More Images
    7. Upload Stack
  12. Image Assessment
    1. Web Img Assessment
    2. Multi Img Assessment
    3. Run Image Rejector
  13. Region Mask Creation
    1. Manual Masking
  14. Synthetic Data
    1. Synthetic Dataset Creation
  1. test page



< Appion and Leginon Database Tools



Tomography

This section contains the step-by-step procedures for calculating tomograms.

General Workflow:

  1. Align Tilt Series
  2. Create Full Tomogram
  3. Create Tomogram Subvolume

Appion SideBar Snapshot:

Notes, Comments, and Suggestions:


< CTF Estimation | Align Tilt Series >



Tomography Tool

For information about Tomography, see Tomography.


< LOI | Hole Template Viewer >



Tomo alignment appiondata and leginondata

An alignment run is one or more iterations of alignment of images from one or more tilt series covering the same area of interest.
A tilt series is defined as a group of images acquired during a single-axis tilt sequence.

Run identification:

Input information:

Alignment method parameter references:

appiondata.ApTomoAlignmentRunData has a reference to one of three tables, each of which stores the parameters for one alignment method.

Protomo iterative alignment parameters:

Alignment results:


Tom Goddard visit February 2010

What do we want Tom to do while he is here:


Troubleshooting Notes

The following is an internal e-mail describing a case in which we could run the php-mrc module from a text terminal with the php command, but could not see the images it produced through a network-mounted drive.
The bottom line is that it was a permission issue; the tests below eliminated the possibilities one by one.

Hi, Amber,

I got it to work.

The webserver user apache has no permission to serve files from
/home/linux_user/

What I did was:
(1) as root

$ cd /
$ mkdir data
$ chmod -R 755 data

This way, if you check with ls -l
you will get

drwxr-xr-x 5 root root 73 Dec 4 11:42 data

(2) change leginon.cfg in the installation

$ cd /usr/lib/python2.4/site-packages/leginon/config
$ vi leginon.cfg

[Images]
/data/leginon

This way, when I upload images to a new session, it will create a
directory under /data/leginon that is readable by everyone.

I figured it out by changing the user assigned
in /etc/httpd/conf/httpd.conf to linux_user, then, after restarting
apache, it could read the test images in /linux_user/Desktop/myami/myamiweb/test.

Then I realized that the system we use on linux_box allows read access
to all, and that a file is still not readable by others if its
parent directories are not readable by others.

It is probably something that we need to formulate better with
our system administrator. It is likely that we can do something more
acceptable by other groups. Apache has all kinds of permission
settings I didn't read through.

Anchi
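As the e-mail above illustrates, Apache can only serve a file if every parent directory along the path is readable and traversable by "others". A short script like the following sketch can audit that quickly; the demo path is /tmp so it runs anywhere, so substitute your own image directory (e.g. /data/leginon).

```python
import os
import stat

def world_accessible(path):
    """Walk from the root down to `path`, checking each directory for
    world read + execute permission (what the apache user needs to
    serve files from that path). Returns a list of (dir, ok) tuples."""
    results = []
    current = os.sep
    for part in os.path.abspath(path).split(os.sep):
        if not part:
            continue
        current = os.path.join(current, part)
        mode = os.stat(current).st_mode
        # "others" need read + execute on every directory in the chain
        ok = bool(mode & stat.S_IROTH) and bool(mode & stat.S_IXOTH)
        results.append((current, ok))
    return results

# Replace "/tmp" with your image path, e.g. "/data/leginon"
for directory, ok in world_accessible("/tmp"):
    if not ok:
        print("not world-readable:", directory)
```

Any directory it reports must be opened up (e.g. `chmod 755`) before the web server can show images beneath it.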


Troubleshooting the Web Server:

Run the web server troubleshooter

A web server troubleshooting tool is available at http://YOUR_HOST/myamiweb/test/checkwebserver.php.
You can browse to this page from the Appion and Leginon Tools home page (http://YOUR_HOST/myamiweb) by clicking on [test Dataset] and then [Troubleshoot].

This page will automatically confirm that your configuration file and PHP installation and settings are correct and point you to the appropriate documentation to correct any issues.

Firewall settings

You may need to configure your firewall to allow incoming HTTP (port 80) and MySQL (port 3306) traffic:

$ system-config-securitylevel

Security-enhanced linux

Security-Enhanced Linux (SELinux) may be preventing your files from loading. To fix this, run the following command:

$ sudo /usr/bin/chcon -R -t httpd_sys_content_t /var/www/html/

see this website for more details on SELinux


Unlink a Project Processing Database

  1. Select the project DB tool from the Appion and Leginon Tools start page at http://YOUR_SERVER/myamiweb.
  2. Select your project by clicking on the project name.
  3. Click on the unlink button found in the info section next to processing db:
     
    Unlink db button

Unlinking a project and a database does not delete the database.

To link to an existing db:
  1. Enter the db name
  2. Select Create processing db

< Create a Project Processing Database | Upload Images to a new Project Session >



Upgrade From 2.0.x

Download myami 2.2.x source code

Download Myami 2.2 (contains Appion and Leginon) using one of the following options:

 
Option 1: SVN Stable Branch

This is a stable, supported branch from our code repository.
Change directories to the location where you would like to check out the files (such as /usr/local), then execute the following command:

svn co http://ami.scripps.edu/svn/myami/branches/myami-2.2 myami/

Note: If you are installing this file on a microscope Windows PC, you may use Tortoise SVN to checkout the files.
 

Option 2: Release Version as tar file - NOT AVAILABLE YET FOR 2.2

Option 3: SVN Development version

 
This contains features that may still be under development. It is not supported and may not be stable. Use at your own risk.

svn co http://ami.scripps.edu/svn/myami/trunk myami/

Note: If you are installing this file on a microscope Windows PC, you may use Tortoise SVN to checkout the files.

Install Appion Packages

Install all the myami python packages except appion using the following script:

cd /your_download_area/myami
sudo ./pysetup.sh install

That will install each package, and report any failures. To determine the cause of failure, see the generated log file "pysetup.log". If necessary, you can enter a specific package directory and run the python setup command manually. For example, if sinedon failed to install, you can try again like this:

cd /your_download_area/myami/sinedon
sudo python setup.py install

Install the Appion python package

Important: You need to install the current version of the Appion packages to the same location as the previous version. You may have used the flag shown below (--install-scripts=/usr/local/bin) in your original installation. If you did, you need to use it this time as well. You can check whether you installed your packages there by browsing to /usr/local/bin and looking for ApDogPicker.py. If the file is there, you should use the flag. If the file is not there, you should remove the flag from the command so that Appion installs to the default location.

The pysetup.sh script above did not install the appion package. Since the appion package includes many executable scripts, it is important to know where they are being installed. To prevent cluttering up the /usr/bin directory, you can specify an alternative path, typically /usr/local/bin, or a directory of your choice that you later add to your PATH environment variable. Install appion like this:

cd /your_download_area/myami/appion
sudo python setup.py install --install-scripts=/usr/local/bin 

Update the web interface

Copy the entire myamiweb folder found at myami/myamiweb to your web directory (ex. /var/www/html). You may want to save a copy of your old myamiweb directory first.

Run Database Update Script

Running the following script will indicate if you need to run any database update scripts.

cd /your_download_area/myami/dbschema
python schema_update.py

This will print out a list of commands to paste into a shell which will run database update scripts.
You can re-run schema_update.py at any time to update the list of which scripts still need to be run.


Upgrade Instructions ^



Upgrade From 2.1.x

Download myami 2.2.x source code

Download Myami 2.2 (contains Appion and Leginon) using one of the following options:

 
Option 1: SVN Stable Branch

This is a stable, supported branch from our code repository.
Change directories to the location where you would like to check out the files (such as /usr/local), then execute the following command:

svn co http://ami.scripps.edu/svn/myami/branches/myami-2.2 myami/

Note: If you are installing this file on a microscope Windows PC, you may use Tortoise SVN to checkout the files.
 

Option 2: Release Version as tar file - NOT AVAILABLE YET FOR 2.2

Option 3: SVN Development version

 
This contains features that may still be under development. It is not supported and may not be stable. Use at your own risk.

svn co http://ami.scripps.edu/svn/myami/trunk myami/

Note: If you are installing this file on a microscope Windows PC, you may use Tortoise SVN to checkout the files.

Install Appion Packages

Install all the myami python packages except appion using the following script:

cd /your_download_area/myami
sudo ./pysetup.sh install

That will install each package, and report any failures. To determine the cause of failure, see the generated log file "pysetup.log". If necessary, you can enter a specific package directory and run the python setup command manually. For example, if sinedon failed to install, you can try again like this:

cd /your_download_area/myami/sinedon
sudo python setup.py install

Install the Appion python package

Important: You need to install the current version of the Appion packages to the same location as the previous version. You may have used the flag shown below (--install-scripts=/usr/local/bin) in your original installation. If you did, you need to use it this time as well. You can check whether you installed your packages there by browsing to /usr/local/bin and looking for ApDogPicker.py. If the file is there, you should use the flag. If the file is not there, you should remove the flag from the command so that Appion installs to the default location.

The pysetup.sh script above did not install the appion package. Since the appion package includes many executable scripts, it is important to know where they are being installed. To prevent cluttering up the /usr/bin directory, you can specify an alternative path, typically /usr/local/bin, or a directory of your choice that you later add to your PATH environment variable. Install appion like this:

cd /your_download_area/myami/appion
sudo python setup.py install --install-scripts=/usr/local/bin 

Update the web interface

Copy the entire myamiweb folder found at myami/myamiweb to your web directory (ex. /var/www/html). You may want to save a copy of your old myamiweb directory first.

Update configuration files

  1. Move your leginon.cfg file
    1. Ensure you have a leginon.cfg file in the proper location per these Leginon config instructions
  2. Add a .appion.cfg file
  3. Ensure your config.php file is up to date on the web server per these config.php instructions

Run Database Update Script

Running the following script will indicate if you need to run any database update scripts.

cd /your_download_area/myami/dbschema
python schema_update.py

This will print out a list of commands to paste into a shell which will run database update scripts.
You can re-run schema_update.py at any time to update the list of which scripts still need to be run.


Upgrade Instructions ^



Upgrade from pre-2.0

From a pre-2.0 Appion release

If you have a pre-2.0 Appion release and would like to upgrade to 2.0:

  1. Follow the
    Leginon upgrade instructions.
     
    Note: Ignore instructions to update the Microscope computer if you are not using the Leginon Application.
     
  2. Next, update the Processing Packages by following the instructions for
    Installing Appion with an existing Leginon installation

Upgrade Instructions ^



Upgrade Instructions

  1. Upgrade from pre-2.0 to the current release
  2. Upgrade From 2.0.x to the current release
  3. Upgrade From 2.1.x to the current release
  4. General Instructions for a minor release update (ex. 2.0.0 to 2.0.1)

< Complete Installation | Appion User Guide >



Upload tomogram

General Workflow:

  1. Click on the "Upload tomogram" link under the Tomography submenu in the Appion Sidebar. Enter the file name and directory path for the tomogram that is to be uploaded. This information can be found in the summary page for the tomogram:
    a. Click on "X complete" under the "Create full tomogram" menu in the appion sidebar.
    b. Click on the tomogram id corresponding to the tomogram that is to be uploaded.
    c. Copy the tomogram path name and paste into the box for main step 1.
    d. Copy the tomogram file name and paste into the box for main step 1. Add ".rec" to the end of the filename!!
  2. Select the orientation of your tomogram or subvolume. Default for a full tomogram collected in Leginon is XZY: left. See the floating help box to decide what is appropriate for your tomogram.
  3. Enter a description of the tomogram to be uploaded.
  4. Select the tiltseries corresponding to the tomogram. This will automatically fill out the remainder of the form.
  5. Click "Upload Tomogram" to submit to the cluster. Alternatively, click "Just Show Command" for a command that can be copied and pasted into a unix shell.
  6. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Upload Tomogram" submenu in the appion sidebar.
  7. Once the job is finished, an additional link entitled "X Complete" will appear under the "Upload Tomogram" tab in the appion sidebar. Click on this link, and a page will open with a summary of all the tomograms that have been uploaded for this session.
  8. Clicking on the "id" opens a summary page for the calculated tomogram that includes input parameters and file directory paths.
  9. The next step is to create a tomogram subvolume.

Notes, Comments, and Suggestions:


< Create Full Tomogram | Create Tomogram Subvolume >



Upload Images

Upload Images to a new session

  1. Go to the web page http://your_host/myamiweb. This is the general starting point for Appion.
  2. Follow the link for the Project DB (bottom right).
  3. Select your test project by clicking on its name. This takes you to the same page where you created the processing database.
  4. Below the experiment heading you will find a link that says "upload images to new session". This takes you to your first Appion processing page, where you can use the web GUI and instructions to upload your images.
  5. Once all fields are filled in, click "Just Show Command", which brings you to a page that displays a command line. Copy and paste this command into your text terminal and execute it; it runs the processing server programs against the database behind them.
    [Note] All images uploaded in an experiment session should have the same pixel size because they cannot easily be divided into groups during processing.

Good starting point for future reference


Upload Images to a new Project Session

  1. Select the project DB tool from the Appion and Leginon Tools start page at http://YOUR_SERVER/myamiweb.
  2. Select your project by clicking on the project name.
  3. Under the Experiments section, select Upload images to new session.
     
    Upload Images to new session button

     
  4. Follow the instructions provided in the mouse-over tool tips.
     
    Upload Images Screen

Tips

Change the cs value associated with an uploaded session

The cs value that you enter during upload should be the correct value for the scope that the images were collected on. It is highly recommended that you make an effort to discover this value prior to uploading images to Appion. If you have uploaded images with the incorrect cs value, a system administrator with access to the Appion and Leginon databases can assist with changing the cs value associated with your images.

These are the steps that need to be done to change the cs value (applies to version 2.2 and later):
  1. Note the session id that you wish to change.
  2. Go to the Leginon database (ex. dbemdata).
  3. Go to the InstrumentData table and check whether there is an instrument called AppionTEM with a cs value equal to the one you wish to change to. If it exists, note its DEF_id; if not, add one with the correct cs value and note its DEF_id.
  4. Go to the ScopeEMData table and find the entries for the session that you wish to change.
  5. Update REF|InstrumentData|TEM for those entries with the DEF_id of the AppionTEM with the correct cs value.
  6. Go to the PixelSizeCalibrationData table and find the entries for the session.
  7. Edit these entries, turning off the option to update the timestamp to the current time. Make sure the timestamp is not changed when editing this table. Update REF|InstrumentData|TEM with the DEF_id of the correct TEM.
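For reference, the steps above translate roughly into SQL along these lines. This is only a sketch: the session id and cs value are placeholders, and the exact `REF|...` column used to locate the session's rows is an assumption, so verify every query against your own schema before running anything.

```python
# Sketch of the SQL behind the cs-change steps above. Table and column
# names follow the instructions; session_id and correct_cs are
# placeholders, and the WHERE clause locating the session's rows is an
# assumption -- check it against your own schema first.

session_id = 1234      # placeholder: id of the session to change
correct_cs = 0.002     # placeholder: the correct cs value

# Step 3: look for an AppionTEM instrument with the desired cs value
find_tem = (
    "SELECT `DEF_id` FROM `InstrumentData` "
    "WHERE `name` = 'AppionTEM' AND `cs` = %s;" % correct_cs
)

# Step 5: point the session's ScopeEMData rows at that instrument
# (<tem DEF_id> is the id found or created in step 3)
update_scope = (
    "UPDATE `ScopeEMData` SET `REF|InstrumentData|TEM` = <tem DEF_id> "
    "WHERE `REF|SessionData|session` = %d;" % session_id
)

print(find_tem)
print(update_scope)
```

Run the SELECT first and inspect the result set by hand; only then apply the UPDATE, and remember that the PixelSizeCalibrationData edit (steps 6-7) must not touch the timestamp.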

< Unlink a Project Processing Database | Share a Project Session with another User >



Upload Model

The user is able to upload 3D models to be used in 3D refinement.

General Workflow:

To launch:

  1. The user prepares the model in MRC format (user provides the entire file path)
  2. The user provides a brief description for the model
  3. The user inputs the symmetry for the model
  4. The user inputs the resolution of the model
  5. The user inputs the pixel size for the model

Output:

  1. Once the model is uploaded, it will be viewable under "Upload Model".
  2. The model will be usable for 3D refinement

Notes, Comments, and Suggestions:


< Upload Template | Upload More Images >



Upload More Images

The user is able to upload raw micrographs to either an existing or a new session for use in Appion processing.

General Workflow:

To launch:

  1. The user provides the session name (new or existing) as well as a description of the session (if new)
  2. The user has the option to select the origin of the micrographs (e.g. tecnai2 on tietz camera)
  3. If it is a tilt series, the user will need to provide the number of images within the tilt series as well as the tilt information in the parameter file described below
  4. Following this, the user has the option to upload by specifying a parameter file or by entering the parameters manually.

Specify a Parameter File:

  1. This file must contain the following information:
    1. complete path and name of micrograph
    2. pixel size in meters [1.63e-10]
    3. binning in x [1]
    4. binning in y [1]
    5. nominal scope magnification [50000]
    6. intended defocus in meters [-2e-06]
    7. high tension in volts [200000]
    8. (optional) stage alpha tilt in degrees if loading tilt group [55]
  2. The values for each micrograph should be on a single line, separated by tabs:

/home/abc/xxx.mrc 1.63e-10 1 1 50000 -2e-06 200000 55

/home/abc/xyz.mrc 1.63e-10 1 1 50000 -2e-06 200000 10
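A file in this format can be produced with a short script. The sketch below writes one tab-separated line per micrograph in the column order listed above, appending the optional stage alpha tilt only when it is given; the example values mirror those in the text, and the dictionary key names are just this sketch's own choices.

```python
# Sketch: write an upload parameter file in the tab-separated format
# described above. One line per micrograph; the optional stage alpha
# tilt column is appended only when present.

def write_param_file(path, rows):
    with open(path, "w") as f:
        for row in rows:
            fields = [row["mrc"],                # complete path and name
                      "%g" % row["pixel_m"],     # pixel size in meters
                      str(row["binx"]),          # binning in x
                      str(row["biny"]),          # binning in y
                      str(row["mag"]),           # nominal magnification
                      "%g" % row["defocus_m"],   # intended defocus in meters
                      str(row["ht_v"])]          # high tension in volts
            if "tilt_deg" in row:                # optional stage alpha tilt
                fields.append(str(row["tilt_deg"]))
            f.write("\t".join(fields) + "\n")

rows = [
    {"mrc": "/home/abc/xxx.mrc", "pixel_m": 1.63e-10, "binx": 1, "biny": 1,
     "mag": 50000, "defocus_m": -2e-06, "ht_v": 200000, "tilt_deg": 55},
    {"mrc": "/home/abc/xyz.mrc", "pixel_m": 1.63e-10, "binx": 1, "biny": 1,
     "mag": 50000, "defocus_m": -2e-06, "ht_v": 200000, "tilt_deg": 10},
]
write_param_file("upload_params.txt", rows)
```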

Enter parameters manually:

  1. The user provides the path of the directory containing the images
  2. The user provides the file format
  3. The user provides the pixel size of the micrographs in Angstrom
  4. The user provides the binning in x
  5. The user provides the binning in y
  6. The user provides the magnification
  7. The user provides the defocus in micron
  8. The user provides the high tension of the microscope in kV

For a series of untilted images, it might be easier to enter parameters manually (assuming all your micrographs are in the same directory and all the other information is the same for each image).

For tilted images, the user will definitely need to create a parameter file to incorporate the stage alpha tilt information during upload.

Output:

  1. Once the micrographs are uploaded, they will appear in the viewer under the chosen session.
  2. The user can then start processing the micrographs using appion.

Notes, Comments, and Suggestions:


< Upload Model | Upload Stack >



Upload Particles

Users can upload particle picks made in a manual picking session outside of Appion using EMAN's boxer program.

General Workflow:

To launch:

  1. The user has to make sure that the particle selection file (*.box) has the same name as the micrograph the particles were picked from (e.g. if the name of the micrograph is test.mrc, make sure that the particle selection file is called test.box).
  2. Provide the full path where the selection file(s) are located (Note: wild cards are accepted).
  3. Provide the diameter of the particle
  4. If the images from which the particles are picked are binned relative to the images in the database, then provide the scaling value.
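If your picks exist only as particle center coordinates, the .box file can be generated with a few lines of Python. This sketch assumes the EMAN1 boxer convention of one "x y width height" line per particle, with x and y being the lower-left corner of the box; check the output against a file written by your own boxer version before uploading.

```python
def write_box_file(path, centers, boxsize):
    """Write an EMAN-style .box file from particle center coordinates.
    Assumes the EMAN1 convention: x, y of the box's lower-left corner,
    then the box width and height, tab separated."""
    half = boxsize // 2
    with open(path, "w") as f:
        for cx, cy in centers:
            f.write("%d\t%d\t%d\t%d\n" % (cx - half, cy - half,
                                          boxsize, boxsize))

# Picks from test.mrc go into test.box (same base name, see step 1)
write_box_file("test.box", [(512, 300), (1024, 768)], 128)
```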

Output:

  1. The output of the particle selection will be visible under the "Particle Selection" menu

Notes, Comments, and Suggestions:


< EMDB to Model | Upload Template >


Upload Refinement

How to upload a refinement that has been carried out in an external package, i.e. outside of Appion

  1. create a directory tree in the "recon" folder titled "Your_Refinement_Procedure/external_refinement_results", as for example: /ami/data17/appion/11jan11a/recon/external_package_test/external_package_results
  2. all files described below must reside in the "external_refinement_results" directory, otherwise the upload will not work.
  3. define a timestamp, e.g. "11jun03a" or "my_favorite_refinement_procedure_june03" or whatever you want.
  4. AT LEAST 2 files are needed per iteration per reference number for the refinement (the latter being the number of output models / references produced)
    1. A 3D MRC file titled "recon_'timestamp'_it#_vol#.mrc" (it# and vol# have 3 digits), as for example "recon_11jul18z_it001_vol001.mrc"
    2. A particle data file titled "particle_data_'timestamp'_it#_vol#.txt" (it# and vol# have 3 digits), as for example "particle_data_11jul18z_it001_vol001.txt". An example file is attached. This file MUST contain the following columns:
      1. particle number - !!! PARTICLE NUMBERING STARTS WITH 1 !!!
      2. phi Euler angle - rotation Euler angle around Z, in degrees
      3. theta Euler angle - rotation Euler angle around new Y, in degrees
      4. omega Euler angle - rotation Euler angle around new Z (in-plane rotation), in degrees
      5. shiftx - in pixels
      6. shifty - in pixels
      7. mirror - specify 1 if particle is mirrored, 0 otherwise. If mirrors are NOT handled in the package, and are represented by different Euler angles, leave as 0
      8. 3D reference # - 1, 2, 3, etc. Use 1 for single-model refinement case
      9. 2D class # - the number of the class to which the particle belongs. Leave as 0 if these are not defined
      10. quality factor - leave as 0 if not defined
      11. kept particle - specifies whether or not the particle was discarded during the reconstruction routine. If it was KEPT, specify 1, if it was DISCARDED, specify 0. If all particles are kept, all should have a 1.
      12. post Refine kept particle (optional) - in most cases just leave as 1 for all particles
  5. the optional additional files are:
    1. An FSC file titled "recon_'timestamp'_it#_vol#.fsc" (it# and vol# have 3 digits), as for example "recon_11jul18z_it001_vol001.fsc". Lines that should not be read must begin with "#". Otherwise, the first column must contain values in inverse pixels and the second column the Fourier shell correlation for that spatial frequency. You can have as many additional columns as you like, but they will be skipped.
    2. .img/.hed files containing projections from the model and class averages belonging to those Euler angles. The suggested format is as follows: image 1 - projection 1, image 2 - class average 1, image 3 - projection 2, image 4 - class average 2, etc., but you can use any ordering you like.
  6. run the command uploadExternalRefine.py, specifying, at a minimum, the following options:

An example command is:

uploadExternalRefine.py --rundir=/ami/data17/appion/11jan11a/recon/external_package_test --runname=external_package_test --description="testing out external upload on 11jan11 data, emanrecon11, first iteration" --projectid=224 --no-commit --expId=8397 --uploadIterations=1,2,3,4,5 --stackid=127 --modelid=19 --mass=3800 --apix=2.74 --box=160 --numberOfReferences=1 --numiter=1 --timestamp=11jul18z --symid=25
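Producing the particle data file from an external package's orientation output usually comes down to a small conversion script. The sketch below writes the 12 columns in the order listed above (particle numbering starts at 1) and fills the documented defaults for any column you do not supply; converting from your package's own angle and shift conventions is up to you, and the dictionary key names here are this sketch's own.

```python
# Sketch: write a particle_data file for uploadExternalRefine.py with
# the 12 columns listed above. `particles` holds one dict per particle;
# any column not supplied falls back to the documented default.

def write_particle_data(path, particles):
    defaults = {"mirror": 0, "ref3d": 1, "class2d": 0,
                "quality": 0, "kept": 1, "postrefine_kept": 1}
    with open(path, "w") as f:
        for i, p in enumerate(particles, start=1):  # numbering starts at 1
            row = dict(defaults, **p)
            f.write("%d\t%.3f\t%.3f\t%.3f\t%.3f\t%.3f\t%d\t%d\t%d\t%d\t%d\t%d\n"
                    % (i, row["phi"], row["theta"], row["omega"],
                       row["shiftx"], row["shifty"], row["mirror"],
                       row["ref3d"], row["class2d"], row["quality"],
                       row["kept"], row["postrefine_kept"]))

particles = [
    {"phi": 10.5, "theta": 85.0, "omega": 0.0, "shiftx": 1.2, "shifty": -0.8},
    {"phi": 200.0, "theta": 30.0, "omega": 15.0, "shiftx": 0.0, "shifty": 0.0,
     "mirror": 1},
]
write_particle_data("particle_data_11jul18z_it001_vol001.txt", particles)
```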


Upload Stack

The user is able to upload a particle stack to be used within Appion processing.

General Workflow:

To launch:

  1. The user starts from the session to which the stack will be added
  2. The user provides the full path and name of the stack (e.g. /home/abc/start.hed)
  3. The user provides a brief description for the stack
  4. The user inputs the particle diameter in Angstrom
  5. The user inputs the pixel size for the stack
  6. The user indicates whether the stack is CTF corrected (default: unchecked)
  7. The user indicates whether the particles should be normalized (default: checked)
  8. The user indicates whether the stack is committed to the database (default: checked)

Output:

  1. Once the stack is uploaded, it will appear under "stacks".
  2. The stack will be available for subsequent processing.

Notes, Comments, and Suggestions:

  1. Stack must be in IMAGIC format (.hed or .img)


< Upload More Images | Image Assessment >



Upload Template

The user is able to upload 2D templates to be used in template-based particle picking or reference-based alignment.

General Workflow:

To launch:

  1. The user prepares the template(s) in MRC format (the user provides the full file path; wild cards are accepted)
  2. The user inputs the diameter of the particle in Angstrom
  3. The user inputs the pixel size for the template(s)
  4. The user provides a brief description for the template(s)

Output:

  1. Once the template(s) are uploaded, they will be viewable under "Upload Template".
  2. The template(s) will be usable for template-based particle picking or reference-based alignment.

Notes, Comments, and Suggestions:


< Import Tools | Upload Model >



Useful shell commands

Link to top 10 cheat sheets.

make a folder writable

 chmod -R g+rw eman_recon14

change the owner of a folder and its contents

chown -R <username> <folder>

check the status of a job

  1. ssh to the processing server
  2. Run qstat -au YOUR_USER_NAME to list your jobs

check which nodes are currently being used on processing machine

qstat -an

check the status of each node on the cluster

  1. Print detailed information about each node, including its status:
    pbsnodes
    
  2. For a graphical version, use the following command (remember to use -X with
    ssh to display it back to your computer):
    xpbsmon
    

kill a process

  1. Log into the machine it is running on
    ssh fly
  2. Look for processes with your user name
    ps -ef |grep [your_username]
  3. Kill the process using the number in the first column after your username
    kill [process id]
  4. If the process was a copy, remove the destination folder
    rm [destination folder]
  5. list system stats
    top

submit a job to a job manager

qsub <jobfilename>

Kill a job running through the job manager

qdel <jobid>

Start an interactive session on a node

qsub -I

Check how much space is available on a data drive

df -h

see how large the files are in a directory

du
du -sch *

See what groups a user belongs to

id <username>

Find the version of CentOS installed

# cat /etc/*release*

Fix a "Stale NFS file handle" error

Do a lazy unmount followed by a remount:

umount -l <drive>
mount <drive>

See what modules are available on a cluster

On Garibaldi at least:

module avail

displays a list of all the installed modules that can be added to your .cshrc file.


Users

From the Administration tool, an administrator may:

Note: Currently, users may not be removed from the database.

The following tasks may be completed with the Project tool:

Default users

Username      | Firstname      | Lastname      | Displayed Group | Description
administrator | Leginon-Appion | Administrator | administrators  | Default leginon settings are saved under this user
anonymous     | Public         | User          | guests          | If you want to allow public viewing to a project or an experiment, assign it to this user

View all Users

Users may be viewed and managed within the Administration tool.

  1. Open a web browser. Go to 'http://yourhost/myamiweb/admin.php'.
  2. Click on the Users icon:
A list of the registered users is displayed with the following information:

You may sort the users by clicking on the column headers.

Add a User

  1. Open a web browser. Go to 'http://yourhost/myamiweb/admin.php'.
  2. Click on Users.
  3. Click on Add A New User (Only administrators can do this if the login feature is activated)
  4. Add a "username" (required) for the user. (This is like a one-word username that people use to log into a computer).
  5. Enter all required information.
  6. Add this user to a previously created group. Or, create a new group for this user. (required)
  7. Click on the Add button to add the new user to the database.

Modify a User's profile

  1. Open a web browser. Go to 'http://yourhost/myamiweb/admin.php'.
  2. Click on Users.
  3. Click on the edit icon (pencil) next to the user you wish to modify.
  4. Modify the desired fields.
  5. To change the password, you must check the box located above the password field.
  6. Click on the Update button to save your changes.


< Groups | Instruments >



User Management

  1. What to Expect if Authentication is Enabled
  2. New User Registration
  3. Retrieve Forgotten Password
  4. Modify Your Profile
  5. Logout

< Administration | Project DB >



Use Cronus3 or Fly to view your web app

(Fly is a good choice since it is not a production server. You can create your own databases on Fly as well.)

  1. Open a terminal and go to your home directory: $ cd ~
  2. Create a directory called "ami_html" if it does not exist: $ mkdir ami_html/
  3. Change directories to ami_html: $ cd ami_html
  4. Create a symbolic link to the web app directory in your workspace: $ ln -s /home/[username]/amiworkspace/myami/myamiweb
  5. Try to browse to the project in a web browser ( http://cronus3/~username/myamiweb ). You should see an error because config.php failed to open. Keep reading "Create your config file" to fix this.

Using basicReport.inc

basicReport.inc is a class that can be used to quickly display run information as well as parameters and results. It creates html tables by reading database tables.
The only inputs it needs are the expId, the jobtype, and a database table name. BasicReport is used for reporting results from automated testing.

This is a sample file using the class to display a list of all the makestack runs that have occurred for the session:

<?php
require_once "inc/basicreport.inc";

$expId = $_GET['expId'];

try {
    // Create an instance of the BasicReport class to display all the makestack runs from this session.
    $testSuiteReport = new BasicReport( $expId, "makestack2", "ApStackRunData");

    $summaryTable = "";
    if ($testSuiteReport->hasRunData()) {
        $runDatas = $testSuiteReport->getRunDatas(True);

        // For each testsuite run, set the URL for its report page and display its summary info.
        foreach ($runDatas as $runData) {
            $runReportPageLink = 'testsuiterunreport.php?expId='.$expId.'&rId='.$runData['DEF_id'];
            $summaryTable .= $testSuiteReport->displaySummaryTable($runData, $runReportPageLink);
        }

    } else {
        $summaryTable = "<font color='#cc3333' size='+2'>No Test Run information available</font>\n<hr/>\n";
    }

} catch (Exception $e) {
    $message = $e->getMessage();
    $summaryTable = "<font color='#cc3333' size='+2'>Error creating report page: $message </font>\n";
} 

// Display the standard Appion interface header
processing_header("Test Suite Results", "Test Suite Results", $javascript, False);

// Display the table built by the BasicReport class or errors
echo $summaryTable;

// Display the standard Appion interface footer
processing_footer();
?>

Here is a sample file using the class to display all the information from a single makestack run:

<?php

require_once "inc/basicreport.inc";

$expId = $_GET['expId'];
$runId = $_GET['rId'];

try {
    // Create an instance of the BasicReport class using the makeStack jobtype and DB table
    $testSuiteReport = new BasicReport( $expId, "makestack2", "ApStackRunData");

    // Get the run data for the specific test run we are reporting on
    $runData =$testSuiteReport->getRunData($runId);

    // The jobReportLink provides a link to another page for further information. 
    // If there is no more data to display, this should link back to the current page.
    // If the 3rd param to displaySummaryTable() is True, sub tables will be parsed and displayed.
    $jobReportLink = 'testsuiterunreport.php?expId='.$expId.'&rId='.$runData['DEF_id'];
    $summaryTable =  $testSuiteReport->displaySummaryTable($runData, $jobReportLink, True);

} catch (Exception $e) {
    $message = $e->getMessage();
    $summaryTable = "<font color='#cc3333' size='+2'>Error creating report page: $message </font>\n";
} 

// Display the standard Appion interface header
processing_header("MakeStack Run Report","MakeStack Run Report for $runData[stackRunName]");

// Display the table built by the BasicReport class or errors
echo $summaryTable;

// Display the standard Appion interface footer
processing_footer();

?>


Appion/Leginon 2.0.0

Appion/Leginon 2.0.0 will be the initial deployment of Appion. Work toward this milestone will focus on ease of installation and user friendliness, as well as robustness.
The version number is 2.0 so that Appion and Leginon may continue with the same versioning.

  1. Deployment
    1. Evaluate existing code and development tools (#31)
      1. Add a few features to aid process
    2. Evaluate and adopt issue tracking software (Redmine) (#32)
    3. Setup an IDE to increase developer productivity (#33)
    4. Regression Test Appion (#34)
      1. Setup test environment
      2. Define testing process
      3. Setup test case tracking
    5. Migrate production code that is not under source control (ProjectDB) (#7)
      1. Move the Login page to be the first thing user sees
      2. Use groups to give 3 permission levels
      3. Convert Project Sharing table to Sinedon format
      4. Move password code from Project_tools to DBEM
      5. Move Project_Tools directory under DBEM
      6. Create scripts to migrate data from old DB tables to new tables
      7. Try to fix our user database by hand, or just require users to re-register.
    6. Create a web interface for configuration (#35)
      1. Combine config files
      2. Tweak include paths
    7. Reduce number of repositories required for install (#36)
      1. Combine DBEM and Project_Tools so that web parts are in single dir (#7)
      2. Move all appion and leginon code into single repository (#36)
    8. Distribute compatible versions of dependencies from a single location (#37)
    9. Create tools for troubleshooting (#38)
      1. Tool to check the versions of dependencies are correct
    10. Make sure the mrc module installation works well (#34)
    11. Improve documentation (#40)
      1. Convert docbook to redmine
      2. Convert google wiki to redmine
      3. Change redmine URL
    12. Standardize the web interface of appion (#41)
      1. Make the first page a login page, then display what user can see (#7)



Appion/Leginon 2.1

Major Goals:

Establish a cohesive interface to the world

  1. Complete User Guides
    #40 #625 #622 #621 #617 #614 #613 #637 #622 #621 #613 #669 #655 #668
     
  2. Establish forum in Redmine
     
    1. Leginon forum migration #626
       
    2. Forum sharing in redmine #627
       
  3. Move components of ami website to redmine
     
  4. Establish mailing list #44

Streamline installation on CentOS

  1. Error Checking in the Setup Wizard #569
     
  2. Troubleshooting tools #38
     
  3. A single interactive setup.py script for CentOS that does everything. #597
     
  4. Change method for running scripts to appion wrapper #319, #661, #662, #675, #676



Appion/Leginon 2.2

Appion/Leginon 2.2 will focus on the Extensibility of Appion.
The goal is to make it easy for outside labs to add new processing modules to the Image Pipeline.


Appion/Leginon 2.3

Appion/Leginon 2.3 will expand the ways the user may interact with data. Add the ability to filter particles based on DB fields and create reports.

  1. Filtering
    1. Create a web interface for filtering
      1. Use Redmine interface as an example for filter setup
    2. Allow min/max ranges and outlier exclusion
  2. GUI Upgrade
    1. Redesign the image viewers
    2. Add some bells and whistles to data presentation
  3. Write a paper about our experience of setting up a software shop in an academic environment
    1. Create outline by May 2010
      1. What works or does not work
      2. Best tools to use
      3. Best processes



Version Change Log

 
Use the following links to view new features, bug fixes and known bugs for all versions of Appion and Leginon.

New Features
Bug Fixes
Known Bugs


< An Introduction to Appion | Complete Installation >



Version future

This is the destination for Features to implement or Bugs to fix that have no specific timeline.

  1. Automated testing #1005 (Amber) (2 weeks)
    1. Develop test scripts where possible #1007
    2. Establish a permanent test data set in Data00
       
  2. Expand Auto-Installer #1015 (Amber)
    1. add processing packages to installer (spider, frealign, eman)
       
  3. Add modules to pipeline
    1. Add Protomo2 (Amber, Eric) #1026
    2. Add new version of Chimera to existing code to learn what is involved #82 #25
    3. Chose other modules to add after code changes
    4. See if StokesLabProcedure will integrate their stuff as a Beta test
       
  4. Create a developers tutorial #1022
    1. Add a developers tab to the appion website with links to all the resources available #1021
    2. Define coding standards (#10)
      1. python doc string viewer/editor (#162)
      2. Edit several key files (such as often copied ones) to use standards rigorously as samples #1012
      3. PHP standards doc #1013
      4. python standards doc #231
         
  5. Features for public cluster (release with 2.1.1)
    1. Users need to be able to run imageuploader remotely. (#274)
    2. Need to be able to define max number of procs per node for each processing host - Advanced version (#366)
    3. Add single user sign-on functionality for SDSC roll-out (#1010)
    4. Investigate how data will move between AMI and SDSC (#1011)
    5. Having a different password for the imageviewer login and the cluster login is confusing. #364
       

Moved to low priority:

  1. Expand Auto-Installer #1015
    1. multiple servers #1016
    2. multiple platforms #1017 - Mac (highest priority), Fedora, Suse, Ubuntu
    3. more options (Advanced vs Novice user) #1018
    4. yum, rpm ? #1019
       
  2. Automated testing #1005 (Amber) (4 weeks)
    1. Develop unit tests where applicable #1006
    2. Look into automated GUI test apps #1008
    3. Static testing for code standards? #1009
       
  3. Better error reporting
    1. Biggest problem is jobs management which could be helped with an agent
    2. Show jobs that are errors (#603)
    3. Create an error log (#75)
    4. Remote cluster recons should not return as done if job failed (#531)
       
  4. Improve help tools
    1. Add links to redmine wiki help pages to appion pages (#666)
    2. Add pop up dialogs to report pages as well (#516)
    3. Image viewer tool tips
       
  5. Create or use an object-relational mapping for PHP #1020
    1. Look into Zend library
      1. example
         
  6. Discuss a strategy to modify the database schema without affecting external developers.
    1. we will wait and address this when someone is ready to make a change.
       
  7. clean up web interface
    1. Remove inconsistencies in web interface (#41 #670)
    2. Add session name to window so that window are not reloaded from another session (#512)
    3. Remove select project box in getproject page. (#14)
    4. Implement sorting algorithm into project management tool. (#13)
    5. Job Status updates missing on some tools (#994)
    6. Reorganize the last column in the view project page. (#15)
       
  8. Leginon
    1. Feature to measure focus change in a random direction. (#226)
    2. import preset by searching for session that uses an application (#654)
    3. Allow averaging of multiple focus measurements (#225)
    4. Leginon image viewer should cache the FFT images as well. (#217)
    5. target queue editor (#214)
       
  9. misc
    1. Data Location tool-find data and push it to external drive (#954)
    2. Snapshots of projection views for uploaded models. (#857)
    3. Put variables from config.php into the database (#699)

View a Summary of a Project Session

  1. Select the project DB tool from the Appion and Leginon Tools start page at http://YOUR_SERVER/myamiweb.
  2. Select the Summary link next to your project in the experiment table to display session details
     
    Session Summary link

     

< Share a Project Session with another User | Grid Management >



View Projects

To view a list of the Projects that you have permission to access:
  1. Click on the Project DB icon in the Appion and Leginon Tools start page or browse to http://YOUR_HOST/myamiweb/project/project.php.
  2. The following screen will be displayed:
     
    Projects Screen
    View Projects Screen
     
  3. Click on the name of the project you wish to view
  4. A simple view of the project will be displayed:
     

     
  5. To see more information about the project, click the <Detailed View> button shown above
     

Create New Project >



View Stacks

This option takes the user to the stack summary page, which is also accessible from the "X Complete" link under the "Stack" submenu in the Appion SideBar.

Description and Options Available:

Notes, Comments, and Suggestions:

<More Stack Tools | Particle Alignment >


Web Img Assessment

The user is able to assess individual images on the web by assigning one of three statuses:

  1. keep
  2. reject
  3. none

General Workflow:

Input:
  1. The user assigns a status to the current image, whether to keep (check) the image or to reject (cross) it
  2. Upon clicking the "check" or "cross" button, the next image will be presented
  3. If the user is undecided, he or she can just move forward without assigning a status to the image
  4. The user can assess the images based on particle picks or raw micrographs (the load time for raw micrographs is longer)

Output:

  1. Once the images are assessed, the result can be viewed by clicking the link right next to "Img Assessment"
  2. The result of the assessment can be used for limiting images used for subsequent processing (e.g. making stack)

Notes, Comments, and Suggestions:

  1. Assessment can be resumed from previous session and the user can choose to show only unassessed images.


< Image Assessment


Web Server Installation

The following applies to the computer that will host the web-accessible image viewers and project management tools. This also provides the main user interface for Appion.

  1. Differences between Linux flavors
  2. Install Web Server Prerequisites
  3. Configure php.ini
  4. Install Apache Web Server
  5. Check php information
  6. Download Appion and Leginon Files
  7. Install the MRC PHP Extension
  8. Install SSH module for PHP (Appion only)
  9. Install the Web Interface
  10. Install phpMyAdmin (optional)
  11. Troubleshooting


< Processing Server Installation | Additional Database Server Setup >



What does User Authentication do to myamiweb

The purpose of the optional user authentication system, in combination with Project Management in the Leginon/Appion Database Tools on the web server, is to provide different levels of user privileges at institutions where the web server is available to all. In addition, by assigning project owners, users in lower privileged groups will not see projects belonging to others, which makes finding an experiment session easier once data accumulate. Enabling the system is not required, but it is recommended if the web server can be accessed freely outside the intended group. Once enabled, no myamiweb page can be accessed without logging in at the required privilege level.

Four levels of group privileges are included with the complete installation of Leginon/Appion 2.0, and four user groups are created by default to reflect them:

Privilege level                                                     Default group name
All at administration level                                         administrators
View all but administrate owned                                     power users
Administrate/view only owned projects and view shared experiments   users
View owned projects and shared experiments                          guests

These four default groups will not appear in the database of a system upgraded to 2.0 from an earlier version. During the upgrade, the group that contains the "administrator" user is assigned the "All at administration level" privilege; all other groups are assigned the "Administrate/view only owned projects and view shared experiments" privilege. This can be changed after the database upgrade is complete.
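The four privilege tiers above might be sketched as follows. This is a hypothetical illustration only; the group names come from the table above, but the function names and rule logic are illustrative, not Appion's actual PHP implementation.

```python
# Hypothetical sketch of the four myamiweb privilege tiers described above.
PRIVILEGE = {
    "administrators": 4,  # all, at administration level
    "power users": 3,     # view all, administrate only owned
    "users": 2,           # administrate/view owned, view shared
    "guests": 1,          # view owned projects and shared experiments only
}

def can_view(group, owns_project, project_shared):
    # Administrators and power users can view everything;
    # everyone else sees only owned or shared projects.
    if PRIVILEGE[group] >= 3:
        return True
    return owns_project or project_shared

def can_administrate(group, owns_project):
    # Administrators administrate everything; power users and regular
    # users administrate only what they own; guests administrate nothing.
    level = PRIVILEGE[group]
    if level == 4:
        return True
    if level in (2, 3):
        return owns_project
    return False
```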

Rule examples:

To enable or disable user authentication, run the setup wizard at http://YOUR_SERVER/myamiweb/setup.


Related Topics:

Install the Web Interface
Leginon upgrade instruction
User Guide on User Authentication/Management



What is Appion?

Appion is a "pipeline" for single particle reconstruction. Appion is integrated with Leginon data acquisition but can also be used stand-alone after uploading images (either digital or scanned micrographs) or stacks using a set of provided tools. Appion consists of a web-based user interface linked to a set of python scripts that control several underlying integrated processing packages, including EMAN, Spider, Frealign, Imagic, XMIPP, FindEM, ACE, and Chimera. All data input and output is managed using tightly integrated MySQL databases. The goal is to have all control of the processing pipeline managed from the web-based user interface and all output from the processing presented using web-based viewing tools.
These notes are provided as a rough guide to using the pipeline but are not guaranteed to be up to date or accurate.

Appion users usually start off at a web page that presents them with a range of options for processing, reconstruction, analysis. This may look something like the following:

The user can select to proceed with any of the steps in the left-hand menu, but some of these may depend on earlier steps. For example, a stack cannot be made until particles have been selected. After any step has been run, the user can choose to view the results by clicking on the "completed" or "available" labels.

Appion and Leginon depend on the same basic architecture so you can install either one or both together with almost no extra effort. You will need to perform the same basic three parts of system installation for either or both packages. Following this basic installation, if you want to run Leginon on the microscope, you will need to perform a few additional steps, and instructions can be found in the Leginon Manual.

The four basic parts of Appion are :

Installation instructions for all of these parts are included in the Appion installation instructions.

In addition, Appion also needs:

All four servers can run on the same machine. However, for an installation where a high volume of data, processing, and users is anticipated, it is recommended that the first three parts of the system be installed on three separate computers.

System Requirements >



Wiki Tips

1 Add "breadcrumbs" to your wiki page

Breadcrumbs are links that appear at the top of a wiki page showing the page's location in the wiki hierarchy (its parent pages).
To add breadcrumbs you must set up parent/child relationships for wiki pages.
To set the parent of the page that you are currently viewing, select "Rename" at the top right.
Copy and paste the name of the parent page from the desired parent page's "Rename" section.

2 Add a Table of Contents to your wiki page

If your wiki page is long with multiple headings, you may want a table of contents.

{{toc}} adds it to the left hand side of your page.

{{>toc}} adds it to the right hand side.

3 Add links to other wiki pages

[[AMI Redmine Quick Start Guide]] displays a link to the page named 'AMI Redmine Quick Start Guide': AMI Redmine Quick Start Guide

[[AMI_Redmine_Quick_Start_Guide|AMI Redmine QSG]] displays a link to the same page but with a different text: AMI Redmine QSG

[[AMI_Redmine_Quick_Start_Guide#1-Register-as-a-user|How to register]] displays a link to the header on the same page with a different text: How to register

Note that when linking to a header on another wiki page, the header must be labeled h1., h2., or h3. (h4. will not work). Also, the header that you are linking to may not contain special characters such as period (.) or dash (-).

4 Add super cool images to a wiki page

At the bottom of the wiki page is an "upload file" link. Use this to upload your image file to Redmine.
Then right click on the link to the file and select "Copy Link Location".
Next, edit the wiki page and paste the link location. Put an exclamation point(!) at the start and end of the URL.
You can also just put the name of the file you uploaded between the exclamation points, as long as you are referring to an image that is attached to the specific page you are editing. The full URL form works on any wiki page.

Example: !image_file.png!

Move it to the right hand side of the page with a greater than symbol(>) after the first (!).

Example: !>image_file.png!

You can also turn the image into a link to a url by adding a colon (:) after the last (!) and then the url to link to.

Example: !image_file.png!:http://www.example.com

5 Upload an Apple Keynote file

Since a Keynote file is actually a folder, it will not upload properly in Redmine. You will need to put the Keynote into an archive like Zip and then upload the zip file.
Reference: http://www.redmine.org/boards/2/topics/992?r=1032#message-1032

6 Add a reference to a bug number from a subversion commit comment

If you add "refs" and the issue number to your subversion commit message, Redmine automatically links the commit to the issue, e.g., 'refs #139'.


Wrapper Testing

Issue #828

The following files contain a call to showOrSubmitCommand() or submitAppionJob(). These function calls need to be executed to verify the following:

1. Commands using showOrSubmitCommand() should prepend the Appion Wrapper Path defined in the config file when either Show Command or Run Command is selected by the user.
2. Commands using submitAppionJob() should prepend the path when Run Command is selected.
3. Commands using submitAppionJob() should NOT prepend the path when Just Show Command is selected. In this case, the user will need to manually modify the command prior to executing it to include the wrapper path.

The wrapped command should look like: "/opt/myamisnap/bin/appion prepFrealign.py --stackid=1337 --modelid=22 ..."

To get to the trunk installation: http://cronus3.scripps.edu/betamyamiweb/
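The three rules above can be summarized in a small sketch. This is a hypothetical Python illustration of the expected behavior; the real logic lives in the PHP helpers showOrSubmitCommand() and submitAppionJob(), and the wrapper path comes from the config file.

```python
# Hypothetical summary of the wrapper-path rules above; illustrative only.
WRAPPER_PATH = "/opt/myamisnap/bin/appion"  # example value from the config file

def wrap_command(command, uses_show_or_submit, run_selected):
    """Return the command as the web page should emit it."""
    if uses_show_or_submit:
        # Rule 1: showOrSubmitCommand() always prepends, for both
        # Show Command and Run Command.
        return WRAPPER_PATH + " " + command
    if run_selected:
        # Rule 2: old submitAppionJob() prepends only when actually running.
        return WRAPPER_PATH + " " + command
    # Rule 3: Just Show Command with submitAppionJob(): the user must
    # add the wrapper path by hand before executing.
    return command
```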

Filename uses new showOrSubmitCommand() uses old submitAppionJob() Show Command (pass/fail) Submit Command (pass/fail) Notes
alignSubStack.php X pass
applyJunkCutoff.php X
bootstrappedAngularReconstitution.php X
centerStack.php X
coranSubStack.php X
createmodel.php X
createSyntheticDataset.php X
emdb2density.php X
imagicMSA.php X pass pass
jumpSubStack.php X
makegoodavg.php X
multiReferenceAlignment.php X
pdb2density.php X
postproc.php X FAIL (adding "appionlib" to Appion_Lib_Dir var in wrapper fixes this) FAIL (same issue)
prepareFrealign.php X pass pass
runAppionScript.php.template X
runMaskMaker.php X pass pass
runMaxLikeAlign.php X pass pass
sortJunk.php X
subStack.php X pass
uploadFrealign.php X
uploadmodel.php X
uploadParticles.php X
uploadrecon.php X pass pass
uploadstack.php X
uploadtemplate.php X
uploadTemplateStack.php X
uploadtomo.php X pass
uploadXmippRecon.php X
imagic3d0.php X(Dmitry wants this left alone) not working
imagic3dRefine.php a bit complicated to test X (Dmitry wants this left alone) not working
manualMaskMaker.php r15438 pass n/a
runAce2.php r15424
runAffinityProp.php r15439
runClusterCoran.php r15459
runCombineStacks.php r15440
runCoranClassify.php r15461 pass
runCtfEstimate.php r15431
runDogPicker.php r15423 pass pass
runEdIterAlignment.php r15441 getting a file error getting a file error
runEmanRefine2d.php r15455
runImgRejector.php r15444
runJpgMaker.php r15490
runKerDenSom.php r15445 pass
runLoopAgain.php r15446
runMakeStack2.php r15435 pass pass
runOtrVolume.php r15492 need tilt pairs
runPyAce.php r15447 has errors with matlab stuff
runRctVolume.php r15493 need a tilted stack
runRefBasedAlignment.php r15451
runRefBasedMaxlikeAlign.php r15452
runRotKerDenSom.php r15448
runSignature.php r15491
runSpiderNoRefAlignment.php r15449
runStackIntoPicks.php r15450
runSubTomogram.php r15277 pass pass
runTemplateCorrelator.php r15475
runTiltAligner.php r15494
runTiltAutoAligner.php r15495
runTomoAligner.php r15275 pass pass
runTomoAverage.php r15277 pass pass
runTomoMaker.php r15277 pass pass
runTopolAlign.php r15454
runUploadMaxLike.php r15458 error with database - student key
uploadimage.php X pass pass if from expt;fail if from project (disabled, see #864)
imagicMSAcluster.php r15496

Xmipp Kerden Self-Organizing Map

KerDen SOM stands for 'Kernel Probability Density Estimator Self-Organizing Map'. It maps a set of high-dimensional input vectors (aligned particles) onto a two-dimensional grid, as described in Pascual-Montano et al., Journal of Structural Biology 133(2), 233-245 (2001). Note that this method combines feature analysis and clustering into a single step.
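To give a feel for the idea of mapping vectors onto a 2D grid, here is a toy self-organizing map update step. This is NOT the KerDenSOM algorithm Xmipp actually runs (which adds a kernel probability density term); it is just the basic SOM mechanism, with made-up dimensions.

```python
import numpy as np

# Toy SOM: a 3x4 grid of nodes, each a 16-dimensional vector, trained on
# random "particle" vectors. Illustrative sketch only.
rng = np.random.default_rng(0)
grid_h, grid_w, dim = 3, 4, 16
weights = rng.normal(size=(grid_h, grid_w, dim))

def som_update(weights, x, lr=0.1, sigma=1.0):
    # Find the best-matching unit (the node closest to input vector x).
    dists = np.linalg.norm(weights - x, axis=2)
    bi, bj = np.unravel_index(np.argmin(dists), dists.shape)
    # Pull every node toward x, weighted by its grid distance to the BMU.
    ii, jj = np.mgrid[0:weights.shape[0], 0:weights.shape[1]]
    g = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2.0 * sigma ** 2))
    return weights + lr * g[..., None] * (x - weights)

for x in rng.normal(size=(100, dim)):
    weights = som_update(weights, x)
```

After training, each grid node's vector is an "average" of the inputs that map near it, which is why the output can be shown as a montage of class averages.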

General Workflow:

Note: If you accessed "Run Feature Analysis" directly from an alignment run, you will be greeted by the screen displayed on the left below. Alternatively, if you accessed the "Run Feature Analysis Run" from the Appion sidebar menu, you will be greeted by the screen displayed on the right below.

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Enter a description of your run into the description box.
  3. Check and/or change the dimensions of the two-dimensional grid (SOM) of averages that will be the output.
  4. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  5. Check that the appropriate stack of aligned particles is being analyzed, or choose the appropriate stack from the drop-down menu. Note that stacks can be identified in this menu by alignment run name and alignment run ID, and that the number of particles, pixel size, and box size are listed for each.
  6. Click on "Run KerDen SOM" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  7. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Feature Analysis" submenu in the appion sidebar.
  8. Once the job is finished, an additional link entitled "1 complete" will appear under the "Run Feature Analysis" tab in the appion sidebar. Clicking on this link opens a summary of all feature analyses that have been done on this project.
  9. To view the resulting SOM grid, click on the "view particle clusters" link. This opens a new window showing a summary of all clustering runs that have been performed.
  10. Click on "view montage of self-organizing map" to enlarge the montage image.
  11. Click on "view montage as a stack for further processing" to access a new window with tools for sub-stack creation and 3D reconstruction.
  12. In the further processing window, use the boxes and pull-down menus to set the range, binning, quality, and info of images to display, and click "load" to apply the settings.
  13. Change the mouse selection mode from exclude (red) to include (green), depending on your needs. Use your mouse to select images to include or exclude. Note that a list of included and excluded images is automatically generated.
  14. Select from the options to perform on selected images. A new tab will open for processing, and the current window will remain open so that you can come back and perform multiple operations.
  15. Alternatively, select from the options to perform on excluded images.

Notes, Comments, and Suggestions:

  1. The interactive mode of our web pages does not work with the Safari web browser. Firefox works well.
  2. Clicking on "Show Composite Page" in the Feature Analysis List page (accessible from the "completed" link under "Run Feature Analysis" in the Appion sidebar) will expand the page to show the relationships between alignment, feature analysis, and clustering runs.

<Run Feature Analysis | Ab Initio Reconstruction >



Xmipp Maximum Likelihood Alignment

This method is unbiased and very thorough, but also the slowest of the methods (~days). Maximum likelihood also does a coarse search (integer pixel shifts and ~5 degree angle increments), so it is best to get templates with this method and then use reference-based alignment to obtain better alignment parameters.
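To illustrate what a coarse search over integer pixel shifts means, here is a toy exhaustive shift search scored by plain correlation. This is an illustration of the concept only; Xmipp's maximum-likelihood routine also scans rotations (~5 degree steps) and uses a likelihood score, not this simple correlation.

```python
import numpy as np

# Fabricate a "particle": a random template shifted by (3, -2) plus noise.
rng = np.random.default_rng(1)
template = rng.normal(size=(32, 32))
particle = np.roll(template, (3, -2), axis=(0, 1)) + 0.1 * rng.normal(size=(32, 32))

def best_integer_shift(img, ref, max_shift=5):
    # Exhaustively try every integer (dy, dx) shift and keep the one
    # whose shifted reference correlates best with the image.
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = np.sum(np.roll(ref, (dy, dx), axis=(0, 1)) * img)
            if score > best_score:
                best, best_score = (dy, dx), score
    return best

shift = best_integer_shift(particle, template)  # recovers the (3, -2) shift
```

The grid is coarse (whole pixels only), which is exactly why the class averages from this step are then fed into finer reference-based alignment.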

General Workflow:

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Enter a description of your run into the description box.
  3. Select the stack to align from the drop-down menu. Note that stacks can be identified in this menu by stack name and stack ID, and that the number of particles, pixel size, and box size are listed for each.
  4. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  5. Click on "Run Maxlike Alignment" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  6. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Alignment" submenu in the appion sidebar.
  7. Once the job is finished, an additional link entitled "1 ready to upload" will appear under the "Run Alignment" tab in the appion sidebar. Click on this link, and a page will open with a summary of the run output. Clicking on the link next to "reference stack" will open a new window that shows the class averages obtained via this analysis.
  8. If you are satisfied with the alignment and want to continue processing its output, click on "Upload Job". This shouldn't take too long to finish.
  9. Now click on the "1 Complete" link under the "Run Alignment" tab. This opens a summary of all alignments that have been done on this project.
  10. Click on the link next to "reference stack" to open a window that shows the class averages and that contains tools for exploring the result. Such tools include the ability to browse through particles in a given class, create templates for reference-based alignment, substack creation, 3D reconstruction, etc.
  11. To perform a feature analysis, click on the grey link entitled "Run Feature Analysis on Align Stack ID xxx".

Notes, Comments, and Suggestions:

  1. In the parameters box on the right, the most important one to play around with is "Number of References" under "Job parameters". This specifies the number of "seeds" (classes) that will be used for reference free alignment of your particles. A good starting place is 10 references.
  2. For the rest of the parameters, default values are a good place to start, but can be adjusted in accordance with the demands of your project.
  3. If you place your mouse cursor over a parameter name, a help box appears with a thorough description of it.
  4. Maximum likelihood usually generates a few "empty" classes that contain junk or no particles, as well as a few classes that appear to have two particles where one of them is on the edge. In our hands, such classes are eliminated using the sub-stack tools.
  5. We have not noticed a significant advantage to decreasing the angular increment, so we generally use maximum likelihood with a 5 degree angular increment to create templates for subsequent finer alignment via reference-based tools.
  6. Clicking on "Show Composite Page" in the Alignment Stack List page (accessible from the "completed" link under "Run Alignment" in the Appion sidebar) will expand the page to show the relationships between alignment, feature analysis, and clustering runs.

<Run Alignment | Run Feature Analysis >



Xmipp Reference-based Maximum Likelihood Alignment

This method is similar to reference-free maximum likelihood alignment, except that you select the templates first.

General Workflow:

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify proprietary names and directories.
  2. Enter a description of your run into the description box.
  3. Select the stack to align from the drop-down menu. Note that stacks can be identified in this menu by stack name and stack ID, and that the number of particles, pixel size, and box size are listed for each.
  4. Make sure that "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database this box can be unchecked).
  5. Click on "Run Maxlike Alignment" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  6. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Alignment" submenu in the appion sidebar.
  7. Once the job is finished, an additional link entitled "1 ready to upload" will appear under the "Run Alignment" tab in the appion sidebar. Click on this link, and a page will open with a summary of the run output. Clicking on the link next to "reference stack" will open a new window that shows the class averages obtained via this analysis.
  8. If you are satisfied with the alignment and want to continue processing its output, click on "Upload Job". This shouldn't take too long to finish.
  9. Now click on the "1 Complete" link under the "Run Alignment" tab. This opens a summary of all alignments that have been done on this project.
  10. Click on the link next to "reference stack" to open a window that shows the class averages and that contains tools for exploring the result. Such tools include the ability to browse through particles in a given class, create templates for reference-based alignment, substack creation, 3D reconstruction, etc.
  11. To perform a feature analysis, click on the grey link entitled "Run Feature Analysis on Align Stack ID xxx".

Notes, Comments, and Suggestions:

  1. It is still untested how much bias the reference introduces, but this method may be useful in some cases.
  2. When no particles align to a particular template, it turns black and goes unused in further iterations.
  3. Clicking on "Show Composite Page" in the Alignment Stack List page (accessible from the "completed" link under "Run Alignment" in the Appion sidebar) will expand the page to show the relationships between alignment, feature analysis, and clustering runs.

<Run Alignment | Run Feature Analysis >



Xmipp Rotational Kerden Self-Organizing Map

This function applies the KerDen SOM to rotationally symmetric particles after alignment. This is especially useful for classifying particles with different cyclic symmetries.

General Workflow:

Note: If you accessed "Run Feature Analysis" directly from an alignment run, you will be greeted by the screen displayed on the left below. Alternatively, if you accessed the "Run Feature Analysis Run" from the Appion sidebar menu, you will be greeted by the screen displayed on the right below.

  1. Make sure that appropriate run names and directory trees are specified. Appion increments names automatically, but users are free to specify their own names and directories.
  2. Enter a description of your run into the description box.
  3. Check and/or change the dimensions of the two-dimensional grid (the SOM) of class averages that will be output.
  4. Make sure that the "Commit to Database" box is checked. (For test runs in which you do not wish to store results in the database, this box can be unchecked.)
  5. Check that the appropriate stack of aligned particles is being analyzed, or choose the appropriate stack from the drop-down menu. Stacks in this menu are identified by alignment run name and run ID, and the number of particles, pixel size, and box size are listed for each.
  6. Click on "Run Spider Coran Classify" to submit your job to the cluster. Alternatively, click on "Just Show Command" to obtain a command that can be pasted into a UNIX shell.
  7. If your job has been submitted to the cluster, a page will appear with a link "Check status of job", which allows tracking of the job via its log-file. This link is also accessible from the "1 running" option under the "Run Feature Analysis" submenu in the appion sidebar.
  8. Once the job is finished, an additional link entitled "1 complete" will appear under the "Run Feature Analysis" tab in the appion sidebar. Clicking on this link opens a summary of all feature analyses that have been done on this project.

Notes, Comments, and Suggestions:

  1. Clicking on "Show Composite Page" at the top of the Feature Analysis Procedures page will expand the page to show the relationships between alignment runs and feature analysis runs.
  2. Clicking on "Show Composite Page" in the Feature Analysis List page (accessible from the "completed" link under "Run Feature Analysis" in the Appion sidebar) will expand the page to show the relationships between alignment, feature analysis, and clustering runs.

<Run Feature Analysis | Ab Initio Reconstruction >