h1. Online reconstruction quick start guide

This is a stripped-down set of basic instructions to get distributed spill transformation and online reconstruction up and running. For full details, see the pages under [[MAUSDevs#Online-reconstruction|MAUS developer documentation - online reconstruction]].

h2. Build MAUS and install Python libraries

* Download and unpack MAUS to a directory, e.g. @/home/mice/maus@.
* Build,
<pre>
cd maus
source env.sh
./install_build_test.bash
</pre>
* Install MAUS web front-end dependencies,
<pre>
easy_install pil
easy_install django
easy_install magickwand
</pre>
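* Optionally, check that the Python dependencies import correctly (a minimal sketch; @django@ and @PIL@ are the usual module names for these packages, and the check omits @magickwand@, whose module name may differ):
<pre>
python -c "import django, PIL; print('deps OK')"
</pre>
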
h2. Download and configure the MAUS web front-end

* Open a new window,
<pre>
xterm &
</pre>
* Download and unpack the MAUS web front-end to a directory, e.g. @/home/mice/maus-apps@.
* Configure the web front-end,
<pre>
cd /home/mice/maus
source env.sh
cd /home/mice/maus-apps
./configure --with-maus
</pre>
* Copy in sample data,
<pre>
cp images/sample-* media/raw/
</pre>
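* Optionally, confirm that the sample images were copied:
<pre>
ls media/raw/
</pre>
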
h2. EITHER set up Django web server

* Start up the web server:
<pre>
source env.sh
python src/mausweb/manage.py runserver localhost:9000
</pre>
* Go to the web site http://localhost:9000/maus (or try the command-line check sketched at the end of this section).
* You should see a MAUS page listing no histograms.
* Type @sample@ into the search form.
* A new page should appear with two histograms.
* Delete the images and the thumbnails:
<pre>
rm -rf media/thumbs/*
rm -rf media/raw/*
</pre>
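* If you prefer a quick command-line check that the server is up (a sketch; assumes @curl@ is installed and the server is running on port 9000), the following should print @200@ when the page is being served:
<pre>
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:9000/maus
</pre>
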
h2. OR set up Apache 2 web server

* Set up Apache 2,
<pre>
$ su
$ chmod go+rx /home/mice
$ emacs -nw /usr/local/apache2/bin/envvars
</pre>
* Add before the line "LD_LIBRARY_PATH" (a sketch of the edited file appears at the end of this section):
<pre>
source /home/mice/maus/env.sh
source /home/mice/maus-apps/env.sh
</pre>
* Restart Apache 2,
<pre>
/usr/local/apache2/bin/apachectl restart
</pre>
* Go to the web site http://localhost:80/maus
* You should see a MAUS page listing no histograms.
* Type @sample@ into the search form.
* A new page should appear with two histograms.
* Delete the images and the thumbnails:
<pre>
rm -rf media/thumbs/*
rm -rf media/raw/*
</pre>
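
After the edit, the relevant part of @/usr/local/apache2/bin/envvars@ should look roughly like this (a sketch only; the stock @LD_LIBRARY_PATH@ lines in your file may differ from those shown):
<pre>
source /home/mice/maus/env.sh
source /home/mice/maus-apps/env.sh
LD_LIBRARY_PATH="/usr/local/apache2/lib:$LD_LIBRARY_PATH"
export LD_LIBRARY_PATH
</pre>
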
h2. Start up Celery worker

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Start up a Celery worker that will use up to 8 cores:
<pre>
celeryd -c 8 -l INFO --purge
</pre>

h2. Check Celery worker

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Check that the Celery worker has spawned 8 sub-processes,
<pre>
ps -a
</pre>
* There should be 9 @celeryd@ processes in total (the parent worker plus its 8 sub-processes); a one-line count is sketched below.
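* A quick way to count them (a sketch; assumes the processes show up with @celeryd@ in their command line):
<pre>
ps -ef | grep -c "[c]eleryd"
</pre>
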
h2. Run a simple example

* Run a sample histogram workflow,
<pre>
./bin/examples/simple_histogram_example.py -type_of_dataflow=multi_process
</pre>
* After 4 spills have been processed it will just sit there, so press,
<pre>
CTRL-C
</pre>
* Check that 4 histograms have been output,
<pre>
ls -l
</pre>
* There should be 4 @eps@ and 4 @json@ files.
* Check that the database contains the associated documents,
<pre>
./bin/utilities/summarise_mongodb.py --database ALL
</pre>
* @mausdb@ should contain 4 spills.
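* If you want to look at the database directly, the MongoDB shell can list its collections (a sketch; assumes @mongo@ is on your path and the server is running on the default port):
<pre>
mongo mausdb --eval "printjson(db.getCollectionNames())"
</pre>
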
h2. Run an offline reconstruction example

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Edit @bin/user/reconstruct_daq.py@ and ensure that the line,
<pre>
# my_input = MAUS.InputCppDAQOnlineData()
</pre>
* is commented out, and the line,
<pre>
my_input = MAUS.InputCppDAQOfflineData()
</pre>
* is uncommented.
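* If you prefer to make this edit from the command line, a @sed@ one-liner can toggle the two lines (a sketch only; it assumes the lines appear in the file exactly as shown above):
<pre>
sed -i -e 's|^my_input = MAUS.InputCppDAQOnlineData()|# &|' \
       -e 's|^# *\(my_input = MAUS.InputCppDAQOfflineData()\)|\1|' \
       bin/user/reconstruct_daq.py
</pre>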
* Start a client to read data and transform it,
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_input_transform -daq_data_file="03386.000" -daq_data_path=/home/mice/data/
</pre>

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
source /home/mice/maus-apps/env.sh
</pre>
* Start a client to merge data and output it,
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_merge_output
</pre>

h2. Run an online reconstruction example

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Edit @bin/user/reconstruct_daq.py@ and ensure that the line,
<pre>
# my_input = MAUS.InputCppDAQOfflineData()
</pre>
* is commented out, and the line,
<pre>
my_input = MAUS.InputCppDAQOnlineData()
</pre>
* is uncommented.
* Set the DAQ connection settings:
<pre>
export DATE_DB_MYSQL_DB=DATE_CONFIG
export DATE_DB_MYSQL_USER=daq
export DATE_DB_MYSQL_PWD=daq
export DATE_DB_MYSQL_HOST=miceacq07
export DATE_SITE=/dateSite
export DATE_HOSTNAME=`hostname`
</pre>
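* Optionally, confirm that the DAQ connection settings are exported in this window:
<pre>
env | grep "^DATE_"
</pre>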
* Start a client to read data and transform it,
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_input_transform
</pre>

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
source /home/mice/maus-apps/env.sh
</pre>
* Start a client to merge data and output it,
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_merge_output
</pre>
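
Once both clients are running, spills read from the DAQ should start appearing in the database and histograms in the web front-end. You can re-run the database summary from earlier to confirm that spills are accumulating:
<pre>
./bin/utilities/summarise_mongodb.py --database ALL
</pre>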