Support #817
Added by Dobbs, Adam almost 12 years ago.
Updated almost 9 years ago.
Start date:
08 December 2011
Description
I am trying to recreate the functionality of the old Scalers app from G4MICE. The way to do this seems to be to run the analyse_data_offline.py script over a particular run file. The resulting JSON document holds one spill per line. Reading this into Python, for each line, following the keys ['daq_data']['V830']['channels'] returns a Python dictionary holding the channels data, with the channel numbers in the form ch1 through ch31 as the keys and the scaler counts as the values.
So, two questions:
1. Is what I have said above correct?
2. Is there a table somewhere which says ch1 = trigger requests, ch2 = something else, etc?
The next step would then be to make a python script which reads in this channel data, spill by spill, and then dumps it in easily readable ascii or ROOT file.
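A minimal sketch of that script, assuming the key path ['daq_data']['V830']['channels'] described above (the function name and the sample spill are made up for illustration; spills lacking DAQ data are simply skipped):

```python
import io
import json

def dump_scalers(fin, fout):
    """Read one JSON spill per line from fin and write a tab-separated
    ASCII table of (spill number, channel, count) to fout."""
    for spill_number, line in enumerate(fin):
        spill = json.loads(line)
        daq = spill.get("daq_data") or {}
        channels = (daq.get("V830") or {}).get("channels")
        if not channels:
            continue  # skip spills that carry no scaler data
        # Channel keys are "ch1" .. "ch31"; sort numerically, not lexically,
        # so ch2 comes before ch10.
        for name in sorted(channels, key=lambda k: int(k[2:])):
            fout.write("%d\t%s\t%d\n" % (spill_number, name, channels[name]))

# Usage with a fabricated single-spill, two-channel document:
sample = '{"daq_data": {"V830": {"channels": {"ch1": 42, "ch2": 7}}}}\n'
out = io.StringIO()
dump_scalers(io.StringIO(sample), out)
print(out.getvalue())
```

Pointing dump_scalers at the real analyse_data_offline.py output file instead of the StringIO sample would give the ASCII dump described above; a ROOT dump would follow the same loop with a TTree fill in place of the write.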
- Assignee changed from Karadzhov, Yordan to Rogers, Chris
Excellent, thanks Yordan.
Chris, the mapper presently outputs to screen, whereas I want it sent to a nicely formatted text file. The easiest way to do this would be to edit the mapper directly, exchanging the print commands for fout.write or similar. I suspect, however, that the proper way is to make a new output mapper. What would you suggest?
Should be a reducer. A mapper has no internal state (so we can parallelise), whereas we want to take averages over a few spills and over the entire run. If you write to a file at the end of every run then we can figure out a way to get it into the online monitoring application. But there is this data structure discussion that I just threw out on email...
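To illustrate why this needs a reducer rather than a mapper: averaging requires carrying state between spills. A toy sketch of such stateful accumulation (this is not the MAUS reducer API, just the idea; the class and method names are invented):

```python
class ScalerAverager:
    """Toy stand-in for a reducer: keeps running sums per channel so it
    can report averages over all spills seen so far. A stateless mapper
    sees one spill at a time and could not do this."""

    def __init__(self):
        self.sums = {}
        self.n_spills = 0

    def process(self, channels):
        """Fold one spill's channel counts into the running totals."""
        self.n_spills += 1
        for name, count in channels.items():
            self.sums[name] = self.sums.get(name, 0) + count

    def averages(self):
        """Per-channel mean over the spills processed so far."""
        return {name: total / float(self.n_spills)
                for name, total in self.sums.items()}

# Usage with two fabricated spills:
averager = ScalerAverager()
averager.process({"ch1": 10, "ch2": 4})
averager.process({"ch1": 20, "ch2": 6})
print(averager.averages())  # {'ch1': 15.0, 'ch2': 5.0}
```

Writing averages() to a text file at end of run would then give the online monitoring something to pick up.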
For Mike Jackson's info, it would be a text table about 4 columns x 10 rows.
For now, if it's a text file, we can use e.g. "tail -f <filename>" command.
- Status changed from Open to Closed
- % Done changed from 0 to 100