Support #817

Scalers Analysis

Added by Dobbs, Adam almost 11 years ago. Updated about 8 years ago.

Status:
Closed
Priority:
Normal
Assignee:
Start date:
08 December 2011
Due date:
% Done:

100%

Estimated time:

Description

I am trying to recreate the functionality of the old Scalers app from G4MICE. The way to do this seems to be running the analyse_data_offline.py script over a particular run file. The resulting JSON document holds one spill per line. Reading this into Python and, for each line, indexing with the keys ['daq_data']['V830']['channels'] returns a Python dictionary holding the channels data, with the channel numbers (ch1 through ch31) as the keys and the scaler counts as the values.

So, two questions:

1. Is what I have said above correct?
2. Is there a table somewhere which says ch1 = trigger requests, ch2 = something else, etc?

The next step would then be to make a Python script which reads in this channel data, spill by spill, and then dumps it to an easily readable ASCII or ROOT file.
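As a starting point, the read-and-dump step described above might be sketched as below. This is a hypothetical example, not MAUS code: the file names are placeholders, and the key path ['daq_data']['V830']['channels'] is taken from the description. The gap handling assumes spills without scaler data should simply be skipped.

```python
# Sketch: read a MAUS offline-analysis output file (one JSON spill per
# line), pull out the V830 scaler channels, and dump them as one ASCII
# row per spill. File names are placeholders.
import json

def dump_scalers(infile="scalers.json", outfile="scalers.txt"):
    with open(infile) as fin, open(outfile, "w") as fout:
        for line in fin:
            spill = json.loads(line)
            try:
                channels = spill["daq_data"]["V830"]["channels"]
            except (KeyError, TypeError):
                continue  # spill carries no scaler data; skip it
            # channels is a dict like {"ch0": 1234, ..., "ch31": 0};
            # missing channels are written as 0
            row = " ".join(str(channels.get("ch%d" % i, 0))
                           for i in range(32))
            fout.write(row + "\n")
```

Each output row is then one spill, with columns in channel order, which is easy to post-process or load into ROOT later.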

#1

Updated by Karadzhov, Yordan almost 11 years ago

Have a look here:

http://bazaar.launchpad.net/~maus-release/maus/release/view/head:/src/map/MapPyScalersDump/MapPyScalersDump.py

ch0  - triggers
ch1  - trigger req.
ch2  - GVA
ch3  - TOF0
ch4  - TOF1

ch9  - LM 1&2
ch10 - LM 3&4
ch11 - LM 1&2&3&4
ch12 - 1 MHz clock
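For labelling output columns, the assignments listed in this comment can be kept as a Python dict. Only the channels named above are filled in here; the remaining assignments would have to be read off from MapPyScalersDump.py at the link given.

```python
# Channel-number to name map, transcribed from the list above.
# Channels not listed in the comment are deliberately omitted.
CHANNEL_NAMES = {
    "ch0": "triggers",
    "ch1": "trigger requests",
    "ch2": "GVA",
    "ch3": "TOF0",
    "ch4": "TOF1",
    "ch9": "LM 1&2",
    "ch10": "LM 3&4",
    "ch11": "LM 1&2&3&4",
    "ch12": "1 MHz clock",
}
```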

#2

Updated by Dobbs, Adam almost 11 years ago

  • Assignee changed from Karadzhov, Yordan to Rogers, Chris

Excellent, thanks Yordan.

Chris, the mapper presently outputs to screen, whereas I want it sent to a nicely formatted text file. The easiest way to do this is to edit the mapper directly, exchanging the print commands for fout.write or something similar. I suspect, however, that the proper way is to make a new output mapper. What would you suggest?

#3

Updated by Rogers, Chris almost 11 years ago

Should be a reducer. A mapper has no internal state (so we can parallelise), whereas we want to take averages over a few spills and over the entire run. If you write to a file at the end of every run then we can figure out a way to get it into the online monitoring application. But there is this data structure discussion that I just threw out on email...

For Mike Jackson's info, it would be a text table about 4 columns x 10 rows.

For now, if it's a text file, we can watch it with e.g. the "tail -f <filename>" command.
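The stateful accumulation a reducer would carry out (which a stateless mapper cannot) can be sketched as follows. This is illustrative only: the class and method names are made up and do not follow the actual MAUS reducer API.

```python
# Sketch of running per-channel averages over spills, the kind of
# internal state that belongs in a reducer rather than a mapper.
# Names here are illustrative, not MAUS API.
class ScalerAverager:
    def __init__(self):
        self.sums = {}      # per-channel running totals
        self.n_spills = 0   # spills accumulated so far

    def process(self, channels):
        """Accumulate one spill's channel counts (a dict ch -> count)."""
        self.n_spills += 1
        for ch, count in channels.items():
            self.sums[ch] = self.sums.get(ch, 0) + count

    def averages(self):
        """Per-channel mean over all spills seen so far."""
        if self.n_spills == 0:
            return {}
        return {ch: s / float(self.n_spills)
                for ch, s in self.sums.items()}
```

Writing averages() out to a text file at end of run would then give the table described above, ready for "tail -f" or the online monitoring.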

#4

Updated by Rogers, Chris about 8 years ago

  • Status changed from Open to Closed
  • % Done changed from 0 to 100
