h1. Online reconstruction quick start guide

{{>toc}}

This is a stripped-down set of basic instructions to get distributed spill transformation and online reconstruction up and running. For full details, see the pages under [[MAUSDevs#Online-reconstruction|MAUS developer documentation - online reconstruction]].

It assumes you have installed RabbitMQ, MongoDB and ImageMagick.

h2. Build MAUS and install Python libraries

* Download and unpack MAUS to a directory, e.g. @/home/mice/maus@.
* Build,
<pre>
cd maus
source env.sh
./install_build_test.bash
</pre>
* Install MAUS web front-end dependencies,
<pre>
easy_install pil
easy_install django
easy_install magickwand
</pre>

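* Optionally, check that the newly installed Python packages can be imported (a quick sanity check, not part of the original instructions; the import name of the ImageMagick binding varies between versions, so only PIL and Django are checked here):
<pre>
python -c "from PIL import Image; print('PIL OK')"
python -c "import django; print(django.get_version())"
</pre>
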
h2. Download and configure the MAUS web front-end

* Open a new window,
<pre>
xterm &
</pre>
* Download and unpack the MAUS web front-end to a directory, e.g. @/home/mice/maus-apps@.
* Configure the web front-end,
<pre>
cd /home/mice/maus
source env.sh
cd /home/mice/maus-apps
./configure --with-maus
</pre>
* Copy in sample data,
<pre>
cp images/sample-* media/raw/
</pre>

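* Optionally, confirm that the sample images are now in place (a quick check, not part of the original instructions):
<pre>
ls media/raw/
</pre>
* You should see the @sample-*@ files you just copied.
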
h2. EITHER set up Django web server

* Start up the web server:
<pre>
source env.sh
python src/mausweb/manage.py runserver localhost:9000
</pre>
* Go to the web site http://localhost:9000/maus
* You should see a MAUS page listing no histograms.
* Type @sample@ into the search form.
* A new page should appear with two histograms.
* Delete the images and the thumbnails:
<pre>
rm -rf media/thumbs/*
rm -rf media/raw/*
</pre>

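* If the page does not load in the browser, you can check from the command line that the server is responding (this assumes @curl@ is installed):
<pre>
curl -I http://localhost:9000/maus
</pre>
* The response should report a @200 OK@ status (or a redirect, depending on how the URLs are configured).
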
h2. OR set up Apache 2 web server

* Set up Apache 2,
<pre>
$ su
$ chmod go+rx /home/mice
$ emacs -nw /usr/local/apache2/bin/envvars
</pre>
* Add the following before the line that sets "LD_LIBRARY_PATH":
<pre>
source /home/mice/maus/env.sh
source /home/mice/maus-apps/env.sh
</pre>
* Restart Apache 2,
<pre>
/usr/local/apache2/bin/apachectl restart
</pre>
* Go to the web site http://localhost:80/maus
* You should see a MAUS page listing no histograms.
* Type @sample@ into the search form.
* A new page should appear with two histograms.
* Delete the images and the thumbnails:
<pre>
rm -rf media/thumbs/*
rm -rf media/raw/*
</pre>

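* If the page does not appear, the Apache error log is the first place to look (this path assumes the default layout of an Apache 2 built from source under @/usr/local/apache2@):
<pre>
tail -n 50 /usr/local/apache2/logs/error_log
</pre>
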
h2. Start up Celery worker

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Start up a Celery worker that will use up to 8 cores:
<pre>
celeryd -c 8 -l INFO --purge
</pre>

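* Optionally, confirm that the worker has connected to RabbitMQ and created its queue (this assumes you can run @rabbitmqctl@ as root; the default Celery queue is normally named @celery@):
<pre>
sudo rabbitmqctl list_queues
</pre>
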
h2. Check Celery worker

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Check that the Celery worker has spawned 8 sub-processes,
<pre>
ps -a
</pre>
* There should be 9 @celeryd@ processes in total (the parent process plus its 8 sub-processes).

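* If the process list is long, a quick way to count the @celeryd@ processes is (a convenience one-liner, not part of the original instructions):
<pre>
ps -ef | grep celeryd | grep -v grep | wc -l
</pre>
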
h2. Run a simple example

* Run a sample histogram workflow,
<pre>
./bin/examples/simple_histogram_example.py -type_of_dataflow=multi_process
</pre>
* After 4 spills have been processed the workflow will just sit there, so press,
<pre>
CTRL-C
</pre>
* Check that 4 histograms have been output,
<pre>
ls -l
</pre>
* There should be 4 @eps@ and 4 @json@ files.
* Check that the database contains the associated documents,
<pre>
./bin/utilities/summarise_mongodb.py --database ALL
</pre>
* @mausdb@ should contain 4 spills.

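* If you want to inspect or clear these spill documents directly, the MongoDB shell can be used (a sketch, assuming the default database name @mausdb@ on localhost; dropping the database is only needed if you want to start the next example from a clean slate):
<pre>
mongo mausdb --eval "printjson(db.getCollectionNames())"
mongo mausdb --eval "db.dropDatabase()"
</pre>
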
h2. Run an offline reconstruction example

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Edit @bin/user/reconstruct_daq.py@ and ensure that the line,
<pre>
# my_input = MAUS.InputCppDAQOnlineData()
</pre>
* is commented out, and the line,
<pre>
my_input = MAUS.InputCppDAQOfflineData()
</pre>
* is uncommented.
* Start a client to read data and transform it,
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_input_transform -daq_data_file="03386.000" -daq_data_path=/home/mice/data/
</pre>

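* The command above assumes the raw data file for the example run is available locally (the file name is formed from @-daq_data_path@ and @-daq_data_file@); you can confirm it is present with:
<pre>
ls -l /home/mice/data/03386.000
</pre>
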
* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
source /home/mice/maus-apps/env.sh
</pre>
* Start a client to merge data and output it,
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_merge_output
</pre>

h2. Run an online reconstruction example

* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
</pre>
* Edit @bin/user/reconstruct_daq.py@ and ensure that the line,
<pre>
# my_input = MAUS.InputCppDAQOfflineData()
</pre>
* is commented out, and the line,
<pre>
my_input = MAUS.InputCppDAQOnlineData()
</pre>
* is uncommented.
* Set the DAQ connection settings:
<pre>
export DATE_DB_MYSQL_DB=DATE_CONFIG
export DATE_DB_MYSQL_USER=daq
export DATE_DB_MYSQL_PWD=daq
export DATE_DB_MYSQL_HOST=miceacq07
export DATE_SITE=/dateSite
export DATE_HOSTNAME=`hostname`
</pre>
* Start a client to read data and transform it,
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_input_transform
</pre>

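* If the client fails to connect to the DAQ, first confirm that the DATE settings above are present in the shell that started it (a basic check, not part of the original instructions; the values shown are specific to the MICE online environment):
<pre>
env | grep '^DATE_'
</pre>
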
* Open a new window,
<pre>
xterm &
</pre>
* Configure the environment:
<pre>
cd /home/mice/maus
source env.sh
source /home/mice/maus-apps/env.sh
</pre>
* Start a client to merge data and output it,
<pre>
./bin/user/reconstruct_daq.py -type_of_dataflow=multi_process_merge_output
</pre>
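* To confirm that spills are flowing through the system, the MongoDB summary utility shown earlier can be re-run at any time:
<pre>
./bin/utilities/summarise_mongodb.py --database ALL
</pre>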