Online reconstruction quick start guide

This is a stripped-down set of basic instructions for getting distributed spill transformation and online reconstruction up and running. For full details, see the pages under MAUS developer documentation - online reconstruction.

It assumes that you have already installed RabbitMQ, MongoDB, and ImageMagick.

Build MAUS and install Python libraries

  • Download and unpack MAUS to a directory, e.g. /home/mice/maus.
  • Build:
    cd maus
    source env.sh
    ./install_build_test.bash 
    
  • Install the MAUS web front-end dependencies:
    easy_install pil
    easy_install django
    easy_install magickwand
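
A quick way to confirm the installs succeeded is to check that the packages can be imported; a minimal sketch using the standard library (the module names PIL, django and magickwand are assumptions based on the easy_install package names above and may differ on your install):

```python
import importlib.util

def is_importable(module_name):
    """Return True if the named module can be found on sys.path."""
    return importlib.util.find_spec(module_name) is not None

# Module names are assumptions: the easy_install package "pil" is
# typically imported as "PIL", the others under their own names.
for module in ("PIL", "django", "magickwand"):
    status = "ok" if is_importable(module) else "MISSING"
    print(f"{module}: {status}")
```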
    

Download and configure the MAUS web front end

  • Open a new window:
    xterm &
    
  • Download and unpack the MAUS web front end to a directory, e.g. /home/mice/maus-apps.
  • Configure the web front end:
    cd /home/mice/maus
    source env.sh
    cd /home/mice/maus-apps
    ./configure --with-maus
    
  • Copy in the sample data:
    cp images/sample-* media/raw/
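
The copy step above can also be scripted; a minimal sketch with glob and shutil (the source and destination paths are illustrative stand-ins for images/ and media/raw/):

```python
import glob
import os
import shutil

def copy_sample_data(src_dir, dest_dir, pattern="sample-*"):
    """Copy every file matching pattern from src_dir into dest_dir."""
    os.makedirs(dest_dir, exist_ok=True)
    copied = []
    for path in glob.glob(os.path.join(src_dir, pattern)):
        shutil.copy(path, dest_dir)  # keeps the original file name
        copied.append(os.path.basename(path))
    return sorted(copied)

# e.g. copy_sample_data("images", "media/raw")
```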
    

EITHER set up the Django web server

  • Start up the web server:
    source env.sh
    python src/mausweb/manage.py runserver localhost:9000
    
  • Go to the web site http://localhost:9000/maus
  • You should see a MAUS page listing no histograms.
  • Type sample into the search form.
  • A new page should appear with two histograms.
  • Delete the images and the thumbnails:
    rm -rf media/thumbs/*
    rm -rf media/raw/*
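
If the browser check fails, it can help to poll the server from a script until it answers; a minimal sketch with urllib (the URL, attempt count and delay are illustrative):

```python
import time
import urllib.request
import urllib.error

def wait_for_server(url, attempts=5, delay=1.0):
    """Poll url until it responds; True on success, False if it never does."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2):
                return True
        except (urllib.error.URLError, OSError):
            time.sleep(delay)
    return False

# e.g. wait_for_server("http://localhost:9000/maus")
```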
    

OR set up the Apache 2 web server

  • Set up Apache 2 (as root):
    su
    chmod go+rx /home/mice
    emacs -nw /usr/local/apache2/bin/envvars
    
  • Add the following lines before the line containing "LD_LIBRARY_PATH":
    source /home/mice/maus/env.sh
    source /home/mice/maus-apps/env.sh
    
  • Restart Apache 2:
    /usr/local/apache2/bin/apachectl restart
    
  • Go to the web site http://localhost:80/maus
  • You should see a MAUS page listing no histograms.
  • Type sample into the search form.
  • A new page should appear with two histograms.
  • Delete the images and the thumbnails:
    rm -rf media/thumbs/*
    rm -rf media/raw/*
    

Start up a Celery worker

  • Open a new window:
    xterm &
    
  • Configure the environment:
    cd /home/mice/maus
    source env.sh
    
  • Start up a Celery worker that will use up to 8 cores:
    celeryd -c 8 -l INFO --purge
    

Check the Celery worker

  • Open a new window:
    xterm &
    
  • Configure the environment:
    cd /home/mice/maus
    source env.sh
    
  • Check that the Celery worker has spawned 8 sub-processes:
    ps -a
    
  • There should be 9 celeryd processes in total: the main worker plus its 8 sub-processes.
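
The process count can be checked from a script by counting celeryd entries in ps output; a minimal sketch (it assumes the process name appears in the command column, as in ps -a output):

```python
import subprocess

def count_processes(name, ps_output=None):
    """Count lines of ps output whose command column mentions name."""
    if ps_output is None:
        # Fall back to running ps directly when no output is supplied.
        ps_output = subprocess.run(
            ["ps", "-a"], capture_output=True, text=True
        ).stdout
    return sum(1 for line in ps_output.splitlines() if name in line)

# e.g. count_processes("celeryd") should report 9 after the worker starts
```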

Run a simple example

  • Run a sample histogram workflow:
    ./bin/examples/simple_histogram_example.py -type_of_dataflow=multi_process  
    
  • After 4 spills have been processed, the workflow will sit idle, so stop it with:
    CTRL-C
    
  • Check that 4 histograms have been output:
    ls -l
    
  • There should be 4 .eps and 4 .json files.
  • Check that the database contains the associated documents:
    ./bin/utilities/summarise_mongodb.py --database ALL 
    
  • mausdb should contain 4 spills.
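
The file check above can be scripted by counting output files per extension; a minimal sketch with pathlib (the directory is illustrative; after the run you would expect 4 .eps and 4 .json files):

```python
from collections import Counter
from pathlib import Path

def count_by_extension(directory):
    """Return a Counter mapping file extension (e.g. '.eps') to file count."""
    return Counter(p.suffix for p in Path(directory).iterdir() if p.is_file())

# e.g. counts = count_by_extension(".")
# expected here: counts[".eps"] == 4 and counts[".json"] == 4
```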

Run an offline reconstruction example

  • Open a new window:
    xterm &
    
  • Configure the environment:
    cd /home/mice/maus
    source env.sh
    
  • Edit bin/user/reconstruct_daq.py and ensure that the line
    # my_input = MAUS.InputCppDAQOnlineData()
    
    is commented out, and that the line
    my_input = MAUS.InputCppDAQOfflineData()
    
    is uncommented.
  • Start a client to read data and transform it:
    ./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_input_transform -daq_data_file="03386.000" -daq_data_path=/home/mice/data/ 
    
  • Open a new window:
    xterm &
    
  • Configure the environment:
    cd /home/mice/maus
    source env.sh
    source /home/mice/maus-apps/env.sh
    
  • Start a client to merge data and output it:
    ./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_merge_output
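
The split between the two clients above — one transforms spills independently, the other merges the results — can be sketched in miniature. This is a toy illustration of the pattern only; the spill layout and the histogram merge are invented for the example, and the real MAUS clients communicate via RabbitMQ and MongoDB rather than in-process:

```python
from collections import Counter

def transform(spill):
    """Toy transform stage: reduce one raw spill to per-detector hit counts."""
    return Counter(hit["detector"] for hit in spill["hits"])

def merge(transformed_spills):
    """Toy merge stage: fold per-spill counts into one running histogram."""
    total = Counter()
    for counts in transformed_spills:
        total.update(counts)
    return total

# transform() can run in parallel across spills; merge() runs sequentially.
spills = [
    {"hits": [{"detector": "tof0"}, {"detector": "tof1"}]},
    {"hits": [{"detector": "tof0"}]},
]
histogram = merge(transform(s) for s in spills)
```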
    

Run an online reconstruction example

  • Open a new window:
    xterm &
    
  • Configure the environment:
    cd /home/mice/maus
    source env.sh
    
  • Edit bin/user/reconstruct_daq.py and ensure that the line
    # my_input = MAUS.InputCppDAQOfflineData()
    
    is commented out, and that the line
    my_input = MAUS.InputCppDAQOnlineData()
    
    is uncommented.
  • Set the DAQ connection settings:
    export DATE_DB_MYSQL_DB=DATE_CONFIG
    export DATE_DB_MYSQL_USER=daq
    export DATE_DB_MYSQL_PWD=daq
    export DATE_DB_MYSQL_HOST=miceacq07
    export DATE_SITE=/dateSite
    export DATE_HOSTNAME=`hostname`
    
  • Start a client to read data and transform it:
    ./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_input_transform
    
  • Open a new window:
    xterm &
    
  • Configure the environment:
    cd /home/mice/maus
    source env.sh
    source /home/mice/maus-apps/env.sh
    
  • Start a client to merge data and output it:
    ./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_merge_output
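
Forgetting one of the DATE_* exports is an easy mistake; a minimal sketch that verifies the required names are set before launching the client (the variable list mirrors the exports above):

```python
import os

REQUIRED_DAQ_VARS = (
    "DATE_DB_MYSQL_DB", "DATE_DB_MYSQL_USER", "DATE_DB_MYSQL_PWD",
    "DATE_DB_MYSQL_HOST", "DATE_SITE", "DATE_HOSTNAME",
)

def missing_daq_vars(env=None):
    """Return the required DAQ variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_DAQ_VARS if not env.get(name)]

# e.g. before starting the client:
# if missing_daq_vars(): raise SystemExit("set the DATE_* variables first")
```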
    

Updated by Jackson, Mike almost 9 years ago · 4 revisions