h1. Online reconstruction quick start guide

{{>toc}}

This is a stripped-down set of basic instructions to get distributed spill transformation and online reconstruction up and running. For full details, see the pages under [[MAUSDevs#Online-reconstruction|MAUS developer documentation - online reconstruction]].

h2. Build MAUS and install Python libraries

* Download and unpack MAUS to a directory, e.g. @/home/mice/maus@.
* Build:
<pre>
cd maus
source env.sh
./install_build_test.bash
</pre>
* Install the MAUS web front-end dependencies:
<pre>
easy_install pil
easy_install django
easy_install magickwand
</pre>

h2. Download and configure the MAUS web front-end

* Open a new window:
<pre>
xterm &
</pre>
* Download and unpack the MAUS web front-end to a directory, e.g. @/home/mice/maus-apps@.
* Configure the web front-end:
<pre>
cd /home/mice/maus
source env.sh
cd /home/mice/maus-apps
./configure --with-maus
</pre>
* Copy in the sample data:
<pre>
cp images/sample-* media/raw/
</pre>

h2. EITHER set up Django web server

* Start up the web server:
<pre>
source env.sh
python src/mausweb/manage.py runserver localhost:9000
</pre>
* Go to the web site http://localhost:9000/maus
* You should see a MAUS page listing no histograms.
* Type @sample@ into the search form.
* A new page should appear with two histograms.
* Delete the images and the thumbnails:
<pre>
rm -rf media/thumbs/*
rm -rf media/raw/*
</pre>

h2. OR set up Apache 2 web server

* Set up Apache 2 (as root):
<pre>
su
chmod go+rx /home/mice
emacs -nw /usr/local/apache2/bin/envvars
</pre>
* Add the following before the line containing "LD_LIBRARY_PATH":
<pre>
source /home/mice/maus/env.sh
source /home/mice/maus-apps/env.sh
</pre>
* Restart Apache 2:
<pre>
/usr/local/apache2/bin/apachectl restart
</pre>
* Go to the web site http://localhost:80/maus
* You should see a MAUS page listing no histograms.
* Type @sample@ into the search form.
* A new page should appear with two histograms.
* Delete the images and the thumbnails:
<pre>
rm -rf media/thumbs/*
rm -rf media/raw/*
</pre>

h2. Start up Celery worker

* Open a new window:
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Start up a Celery worker that will use up to 8 cores:
<pre>
celeryd -c 8 -l INFO --purge
</pre>

h2. Check Celery worker

* Open a new window:
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Check that the Celery worker has spawned 8 sub-processes:
<pre>
ps -a
</pre>
* There should be 9 @celeryd@ processes in total (the parent worker plus its 8 sub-processes). A quick way to count them is sketched below.
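
If counting the @celeryd@ entries by eye is awkward, the following Python sketch counts them. It is not part of MAUS; it simply assumes the workers appear with the process name @celeryd@, as in the @ps@ output above.

<pre>
import subprocess

# List the command name of every running process (no header) and count
# how many are celeryd: expect 1 parent + 8 pool sub-processes = 9 for
# "celeryd -c 8".
ps_out = subprocess.Popen(["ps", "-e", "-o", "comm="],
                          stdout=subprocess.PIPE).communicate()[0]
count = sum(1 for name in ps_out.splitlines() if b"celeryd" in name)
print("%d celeryd processes running" % count)
</pre>
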
h2. Run a simple example

* Run a sample histogram workflow:
<pre>
./bin/examples/simple_histogram_example.py -type_of_dataflow=multi_process
</pre>
* After 4 spills have been processed the job will sit idle, so stop it with:
<pre>
CTRL-C
</pre>
* Check that 4 histograms have been output:
<pre>
ls -l
</pre>
* There should be 4 @eps@ and 4 @json@ files.
* Check that the database contains the associated documents:
<pre>
./bin/utilities/summarise_mongodb.py --database ALL
</pre>
* @mausdb@ should contain 4 spills. A way to inspect the database directly is sketched below.
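
For a closer look than @summarise_mongodb.py@ provides, the following sketch lists each MongoDB database, its collections and their document counts using pymongo (already required by the multi-process dataflow). It assumes MongoDB is running on its default port on localhost; no MAUS-specific collection names are assumed, and the exact pymongo calls may vary with the installed pymongo version.

<pre>
import pymongo

# Connect to the local MongoDB instance used by the multi-process dataflow.
# Older pymongo releases provide Connection rather than MongoClient.
try:
    client = pymongo.MongoClient("localhost", 27017)
except AttributeError:
    client = pymongo.Connection("localhost", 27017)

# Print every database (e.g. mausdb), its collections and document counts.
for db_name in client.database_names():
    db = client[db_name]
    for coll_name in db.collection_names():
        print("%s.%s: %d documents" % (db_name, coll_name,
                                       db[coll_name].count()))
</pre>
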
h2. Run an offline reconstruction example

* Open a new window:
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Edit @bin/user/reconstruct_daq.py@ and ensure that the line,
<pre>
# my_input = MAUS.InputCppDAQOnlineData()
</pre>
* is commented out, and the line,
<pre>
my_input = MAUS.InputCppDAQOfflineData()
</pre>
* is uncommented (a sketch of the edited lines is given at the end of this section).
* Start a client to read the data and transform it:
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_input_transform -daq_data_file="03386.000" -daq_data_path=/home/mice/data/
</pre>
* Open a new window:
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
source /home/mice/maus-apps/env.sh
</pre>
* Start a client to merge the data and output it:
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_merge_output
</pre>
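
For reference, after the edit above the input-selection lines in @bin/user/reconstruct_daq.py@ should look roughly as follows (only the two quoted lines are shown, with illustrative comments; the rest of the script is unchanged):

<pre>
# Online mode: read data live from the DAQ - disabled for offline running.
# my_input = MAUS.InputCppDAQOnlineData()

# Offline mode: read data from an archived DAQ file.
my_input = MAUS.InputCppDAQOfflineData()
</pre>
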
h2. Run an online reconstruction example

* Open a new window:
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Edit @bin/user/reconstruct_daq.py@ and ensure that the line,
<pre>
# my_input = MAUS.InputCppDAQOfflineData()
</pre>
* is commented out, and the line,
<pre>
my_input = MAUS.InputCppDAQOnlineData()
</pre>
* is uncommented (i.e. the reverse of the offline example above).
* Set the DAQ connection settings:
<pre>
export DATE_DB_MYSQL_DB=DATE_CONFIG
export DATE_DB_MYSQL_USER=daq
export DATE_DB_MYSQL_PWD=daq
export DATE_DB_MYSQL_HOST=miceacq07
export DATE_SITE=/dateSite
export DATE_HOSTNAME=`hostname`
</pre>
* Start a client to read the data and transform it:
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_input_transform
</pre>
* Open a new window:
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
source /home/mice/maus-apps/env.sh
</pre>
* Start a client to merge the data and output it:
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_merge_output
</pre>