OUT OF DATE. Use as a reference for what we plan on doing.
List of Program Components
MAUS follows a Map-Reduce dataflow, which is discussed in detail on the MAUS Design page. The core of MAUS will handle some of these steps (in the lingo: the partition and comparison functions), but it's up to the user to specify the following four things (a minimal end-to-end sketch follows the list):
- The input reader (ROOT file, socket, database, your favourite file format, etc.)
- The map workers, which specify what operation you want performed on each event
- The reduce workers, which perform operations on all events after the map step is done (histogramming, fitting, etc.)
- The output writer (same formats as the input reader)
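A minimal sketch of that division of labour in Python, with plain dicts standing in for spills; every name here is an illustrative assumption rather than the actual MAUS API:

```python
import json

def read_spills():
    """Input reader: yield one spill (here a plain dict) at a time."""
    for spill_number in range(3):
        yield {"spill_number": spill_number, "tracks": []}

def map_worker(spill):
    """Map worker: operate on a single spill and return the modified spill."""
    spill["tracks"] = [{"momentum_mev": 200.0}]  # stand-in for real reconstruction
    return spill

def reduce_worker(spills):
    """Reduce worker: run once over all spills after the map step (histogramming etc.)."""
    return {"n_spills": len(spills),
            "n_tracks": sum(len(spill["tracks"]) for spill in spills)}

def write_output(spills, summary):
    """Output writer: persist the processed spills and the reduce summary."""
    with open("maus_output.json", "w") as out:
        json.dump({"spills": spills, "summary": summary}, out, indent=2)

if __name__ == "__main__":
    processed = [map_worker(spill) for spill in read_spills()]
    write_output(processed, reduce_worker(processed))
```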
Input Readers
Input readers yield events to be processed.
| Reader type | Comments |
| BytestreamFile | Read prewritten DAQ bytestream data from a file |
| CouchDBServer | Read JSON documents that are spills from a CouchDB server |
| DATEServer | Read DAQ bytestream data that needs to be unpacked from the DATE DAQ |
| JSONFile | Read JSON documents that are spills sequentially from a file |
| ROOTFile | Read a ROOT file of a run |
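The JSONFile reader, for example, could be as simple as the following sketch, with one JSON spill document per line; the class name and method name are assumptions for illustration, not the confirmed interface:

```python
import json

class JSONFileInput:
    """Hypothetical JSONFile-style input reader."""

    def __init__(self, filename):
        self.filename = filename

    def emitter(self):
        """Yield spill documents one at a time so the map step can stream them."""
        with open(self.filename) as spill_file:
            for line in spill_file:
                line = line.strip()
                if line:                    # skip blank lines
                    yield json.loads(line)  # each non-blank line is one spill
```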
Map Workers
The map workers are broken into four categories (a generic worker sketch follows this list):
- MC
- Data
- Both MC and Data
- Testing
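Whatever the category, a map worker is handed one spill at a time and must return the (possibly modified) spill. The birth/process/death life cycle shown below is an assumption for illustration, not the confirmed MAUS interface:

```python
class ExampleMapWorker:
    """Hypothetical map worker with a birth/process/death life cycle."""

    def birth(self, config):
        """Called once before the first spill; read configuration here."""
        self.adc_threshold = config.get("adc_threshold", 10)

    def process(self, spill):
        """Called once per spill; must return the spill it was given."""
        spill["digits"] = [digit for digit in spill.get("digits", [])
                           if digit["adc"] > self.adc_threshold]  # drop low-ADC hits
        return spill

    def death(self):
        """Called once after the last spill; release any resources here."""
        pass
```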
MC
| Worker name | Comments |
| BeamMaker | Make an MC beam with certain properties |
| CovarianceEvolver | Evolve a covariance matrix. Either run the worker once per run, use a cached evolved covariance matrix, or recompute it per spill, since the currents arrive with each spill. |
| Digitization | Digitize MC into ADC/TDC counts |
| Simulation | Track particles and get the energy deposited by those particles. Don't run this in parallel (serial running is the default, so don't worry) unless you're really sure what you're doing with random number seeds. |
| Spill | Create a spill out of many triggers |
| Trigger | Trigger simulation: make triggers out of digitized MC information |
| TransferMatrices | Track particles with transfer matrices |
| TransferMatricesCreate | Create the transfer matrices from simulation events |
| VirtualPlanes | Generate virtual planes at certain points using MC tracks. This interpolates between steps assuming no material. |
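The interpolation done by VirtualPlanes can be pictured with the sketch below, which assumes straight-line motion between two recorded MC steps that straddle the plane (no material and, for simplicity here, no field); the function and field names are illustrative only:

```python
def interpolate_to_plane(step_before, step_after, z_plane):
    """Linearly interpolate position and time between two MC steps to a virtual plane."""
    z0, z1 = step_before["z"], step_after["z"]
    frac = (z_plane - z0) / (z1 - z0)  # fractional distance from the first step to the plane
    return {
        "x": step_before["x"] + frac * (step_after["x"] - step_before["x"]),
        "y": step_before["y"] + frac * (step_after["y"] - step_before["y"]),
        "z": z_plane,
        "t": step_before["t"] + frac * (step_after["t"] - step_before["t"]),
    }
```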
Data
| Worker name | Comments |
| DAQUnpacker | Unpack DAQ bytestream data |
| InstrumentalDataQuality | Perform low-level checks on the data, for example flagging dead channels or checking that the currents in the bytestream agree with EPICS |
Both MC and Data
| Worker name | Comments |
| ApplyCalibration | Apply the calibration to either MC or data to go from ADC and TDC counts to energy deposited and time, respectively. Specify whether the input is MC or data (a minimal calibration sketch follows this table). |
| Cut | Remove data based on a cut string |
| EPICSAlarm | Create an EPICS alarm when an alarm condition is detected and, if enabled, post it to the alarm handler |
| FitGlobalTrack | Fit a track using information from all detectors |
| FitSciFiTrack | Fit a track using just the SciFi tracker with RecPack |
| FitTOFTrack | Fit a track using just the TOF detectors with Mark Rayner's method |
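A calibration step like ApplyCalibration might, at its simplest, be a per-channel linear conversion such as the sketch below; the pedestal, gain, and TDC conversion constants are made-up illustrative numbers, not real calibration values:

```python
def calibrate_channel(adc, tdc, pedestal=20.0, gain=0.05, tdc_to_ns=0.025):
    """Convert raw ADC/TDC counts to energy (MeV) and time (ns) for one channel."""
    energy_mev = (adc - pedestal) * gain  # linear ADC calibration
    time_ns = tdc * tdc_to_ns             # constant TDC bin width
    return energy_mev, time_ns
```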
Testing
| Worker name | Comments |
| FakeDAQData | Spit out precomputed DAQ bytestream data |
| FakeFitTrack | Spit out precomputed tracks |
| FakeMCTruth | Spit out precomputed MC truth |
| FakeMCDigitized | Spit out precomputed digitized MC information |
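The pattern behind all of the Fake workers is the same: ignore the incoming spill content and substitute a precomputed document, so downstream workers can be tested in isolation. A minimal sketch, with an assumed process() interface:

```python
class FakeFitTrack:
    """Hypothetical testing worker that returns canned tracks."""

    def __init__(self, canned_tracks):
        self.canned_tracks = canned_tracks  # precomputed track list loaded by the test

    def process(self, spill):
        spill["tracks"] = self.canned_tracks  # overwrite with known output
        return spill
```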
Reduce Workers
Reduce workers produce whatever histograms, corrections, etc. you want from the full set of processed spills. Current MAUS effort is focused on the mappers, so only 'example' reducers will be in v1.0; one such example is sketched below.
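A minimal example reducer, assuming spills carry a list of tracks with a momentum_mev field (both names are assumptions for illustration):

```python
def reduce_momentum_histogram(spills, bin_width_mev=10.0):
    """Histogram track momenta across all spills after the map step."""
    histogram = {}
    for spill in spills:
        for track in spill.get("tracks", []):
            bin_index = int(track["momentum_mev"] // bin_width_mev)
            histogram[bin_index] = histogram.get(bin_index, 0) + 1
    return histogram
```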
Output Writers
| Writer type | Comments |
| CouchDBServer | Write JSON documents that are spills to a CouchDB server |
| JSONFile | Write JSON documents that are spills sequentially to a file |
| ROOTFile | Write a ROOT file of the run |
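The JSONFile writer could mirror the JSONFile reader, writing one spill document per line; as before, the class and method names are assumptions for illustration:

```python
import json

class JSONFileOutput:
    """Hypothetical JSONFile-style output writer."""

    def __init__(self, filename):
        self.spill_file = open(filename, "w")

    def save(self, spill):
        """Write one processed spill as a single line of JSON."""
        self.spill_file.write(json.dumps(spill) + "\n")

    def close(self):
        """Close the file once the run is finished."""
        self.spill_file.close()
```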