Support #1552
Step IV Physics Block Challenge
100%
Description
The challenge has been issued to simulate the nine proposed settings for the Step IV beam line using the full grid-based machinery. This comprises the following activities:
1. Defining the settings for the simulation.
2. Ensuring that the mechanics for the simulation are in place. This includes (but is probably not limited to) configuration database interfaces, running scripts, and a working MAUS release.
3. Testing the simulation.
4. Nursing the simulation runs through the grid.
5. Analyzing the resulting simulations for important figures of merit.
Details to follow.
Files
Updated by Rogers, Chris about 9 years ago
Will be discussed as part of geometry workshop on Wednesday
Updated by Blackmore, Victoria about 9 years ago
Uploading an OpenOffice spreadsheet giving approximate current densities for the cooling channel using "as built" dimensions, keeping beta ~42 cm at the centre of an LH2 absorber. Also included are the approximate diffuser configurations to give 3, 6 and 10 mm beams at 200 MeV/c, and the expected energy loss in them.
These won't give perfectly matched beams, and it's highly likely we can improve on them, but they should be sufficient to 'test the system' and get going.
Updated by Rogers, Chris about 9 years ago
Please see Geom_workshop_151014 for a detailed breakdown of the issue status to date.
Updated by Bayes, Ryan about 9 years ago
The first pass of a 6pi 200 MeV/c beam physics block simulation is located at
http://ppes8.physics.gla.ac.uk/~rbayes/MICE_6pi200_1
I have put all of the necessary information in a tar-ball for your convenience.
These simulations were generated using CDB geometry 43 with the g4beamline interface files provided, using the configuration settings in "config_G4BL_6pi200.py". Note that this first pass was run using a version of MAUS pulled from the trunk. The nearest release tag is 0.9.1, but a number of necessary additions to the tracker reconstruction and geometry handling were introduced to the MAUS trunk after that release. These simulations are therefore not suitable for publication, but they should provide a starting point for analysis. Also note that there are a number of failed runs that I was not able to weed out entirely, so I suggest either removing all runs with file sizes less than 14 kB, or writing your analysis scripts to reject zombie files (i.e. in Python, "if file.IsZombie(): continue") and to check that the spill object is defined (i.e. "if not spill: continue").
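The rejection suggested above can be sketched as a small pre-filter run before opening anything in ROOT. This is a sketch under assumptions: the 14 kB threshold comes from the comment above, while the helper name `good_run_files` is mine.

```python
import os

# Threshold below which an output file is assumed to be a failed run
# (14 kB, per the suggestion above -- an assumption, tune as needed).
MIN_GOOD_SIZE = 14 * 1024

def good_run_files(paths, min_size=MIN_GOOD_SIZE):
    """Return only the files large enough to plausibly contain events."""
    return [p for p in paths if os.path.getsize(p) >= min_size]

# Inside the ROOT event loop one would additionally guard against zombie
# files and missing spills, along the lines of:
#   root_file = ROOT.TFile(path)
#   if root_file.IsZombie():
#       continue
#   ...
#   if not spill:
#       continue
```

The size cut is cheap and removes most failed runs up front; the `IsZombie()` and spill checks then catch anything that slipped through.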
Updated by Dobbs, Adam about 9 years ago
I have the data at Imperial and would like to run an analysis on it for the tracker. The data seems to be split over hundreds of files ranging from a few tens of kB to a few tens of MB. The usual way to analyse data like this in ROOT is with a TChain; however, in MAUS we have our own irstream object for data access, which seems to take one file at a time. So... how best do I load all the data for an analysis? Could an example script be provided and added to MAUS?
Updated by Bayes, Ryan about 9 years ago
- File plot_virtuals.py plot_virtuals.py added
I loop over all of the files and run the analysis separately for each file.
It is slower and more pedestrian than a TChain, but it seems to work. I have attached an example analysis script (plot_virtuals.py) that runs over all of the virtual planes and histograms the position, magnetic field, and muon momentum as a function of z. It is currently set up to loop over the first-pass MC files contained in the parent directory relative to the working directory.
Note that some of the files are empty because they died prematurely for whatever reason (memory or wall time constraints). This script filters those files out of the analysis.
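The loop-over-files pattern described above (the TChain substitute) can be sketched as follows. The names `analyse_all` and `process_file` are hypothetical; `process_file` stands in for whatever opens a single file with irstream and fills histograms, here returning plain counts so that merging across files is trivial.

```python
import glob
import os

def analyse_all(pattern, process_file, min_size=14 * 1024):
    """Run `process_file` on every sufficiently large output file
    matching `pattern`, merging the per-file dictionaries of counts
    into one combined result (the moral equivalent of a TChain)."""
    totals = {}
    for path in sorted(glob.glob(pattern)):
        if os.path.getsize(path) < min_size:
            continue  # skip runs that died prematurely
        for key, count in process_file(path).items():
            totals[key] = totals.get(key, 0) + count
    return totals
```

Because each file is processed independently, a failed or empty file only costs one skipped iteration rather than poisoning the whole chain.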
Updated by Bayes, Ryan about 9 years ago
- File test_list_example.txt test_list_example.txt added
- File jsondocother11_100.txt jsondocother11_100.txt added
- File jsondocother11_101.txt jsondocother11_101.txt added
The simulations on the grid are in preparation. The local tests of the jobs need to be completed in a similar manner to how they would be run on the grid.
The idea is to post a list (such as the one attached) to the web somewhere (the CDB), containing references to all of the JSON documents to be used in the grid job; example documents, derived from G4beamline, are also attached. These documents should also live somewhere web-accessible. At submission, a number is fed into the simulation script; together with the web location of the JSON document list, it identifies a line in the list file. Each subjob of the submission first downloads the JSON document list and then the JSON document corresponding to its assigned line into the working space of the grid subjob. The output is then indexed by the list line number.
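The subjob-number-to-document mapping described above can be sketched as a small helper (the function name is mine; the real grid script would first download the list file from its web location before calling something like this):

```python
def input_for_subjob(list_text, job_number):
    """Map a grid subjob number to the JSON document named on the
    corresponding line of the downloaded list file (1-indexed,
    blank lines ignored)."""
    lines = [l.strip() for l in list_text.splitlines() if l.strip()]
    if not 1 <= job_number <= len(lines):
        raise IndexError("no list entry for subjob %d" % job_number)
    return lines[job_number - 1]
```

Each subjob would then fetch the named document into its working space and label its output with the same number, so the output stays indexed by list line.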
The issue that still needs to be sorted out is where to place the G4beamline interface files. The possibilities are being explored by John Nugent right now.
Updated by Bayes, Ryan about 9 years ago
The URL was changed on me. The correct URL is now:
http://www.ppe.gla.ac.uk/~rbayes/MICE_6pi200_1/
Updated by Pidcott, Celeste about 9 years ago
Thanks for that.
Also, out of interest, I've looked at the configuration file, and I was wondering why the muons seem to start at 12500 and the electrons and pions start at 1640?
Updated by Bayes, Ryan about 9 years ago
This is a red herring I am afraid. Those lines in the configuration file are ignored because MapPyBeamMaker was not used. Instead, a json input file supplies the beam which was generated using G4beamline. The interface point is roughly a meter upstream of Dipole 2.
Updated by Bayes, Ryan almost 9 years ago
I have re-run the simulation using maus v0.9.2. A zipped tar-ball of the maus output can be found at
http://www.ppe.gla.ac.uk/~rbayes/MICE_6pi200_1/pass2_simulation_mausv0p9p2.tar.gz
The beam interface files are the same, and CDB geometry ID 48 was used. The only significant changes were the removal of two spurious vacuum windows in the AFC, the removal of a spurious asymmetric window in the upstream spectrometer solenoid, and the re-positioning of the trackers 70 mm away from the absorber.
Updated by Bayes, Ryan almost 9 years ago
- File Analysis_20150219.pdf Analysis_20150219.pdf added
The pass2 simulation has been updated so that there are now 1280 good runs out of 1418 available G4BL interface files. The tar-ball has the same name and URL.
I have completed an analysis of the updated simulation, using it to evaluate the effect of a cut on single-particle amplitude on the measured emittance. My notes (in the form of a set of slides) are attached. The executive summary is that a cut accepting single-particle amplitudes (as defined in the slides) between 16 mm and 48 mm, in coordination with a cut rejecting pz > 210 MeV/c, yields the best emittance change (-0.23 ± 0.18 mm). Either an improvement in the emittance change (by reducing beta, if possible) or an order-of-magnitude improvement in statistics is required.
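In sketch form, the cut combination described above reduces to a simple predicate (the event representation and function names here are mine, not from the attached slides; the window values are from the text):

```python
def passes_cuts(amplitude_mm, pz_mev_c,
                amp_lo=16.0, amp_hi=48.0, pz_max=210.0):
    """Accept an event whose single-particle amplitude lies in the
    [16, 48] mm window and whose pz does not exceed 210 MeV/c."""
    return amp_lo <= amplitude_mm <= amp_hi and pz_mev_c <= pz_max

def select_events(events):
    """Filter a list of (amplitude, pz) pairs with the cuts above."""
    return [e for e in events if passes_cuts(*e)]
```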
Updated by Rogers, Chris almost 9 years ago
Ryan, I promised you a link to the beam selection routines I developed in 2008. The link is at:
http://accelconf.web.cern.ch/accelconf/e08/papers/tupc088.pdf
I see it is not a MICE note; I should correct that. The only comment I would add is that a Python API for the QHull program (see text) is also available via the Python packaging tool that comes with MAUS, easy_install:
https://pypi.python.org/pypi/pyhull/1.5.3
You can see the script MAUS uses for accessing easy_install packages at maus/third_party/bash/40python_extras.bash
Updated by Rogers, Chris almost 9 years ago
Uploaded as MICE Note 460.
I also note that David Adey did some work as part of his thesis... chapter 4, pages 116-143.
Updated by Rogers, Chris over 8 years ago
I put the voronoi weighting algorithm into xboa. It is in module xboa.bunch.weighting
in xboa-0.16.2. This is now the current version in MAUS, since the weekend.
Updated by Rogers, Chris over 8 years ago
- File extract_recon_data.py extract_recon_data.py added
Ryan, do you know which geometry and magnet settings were used for the "pass 2" dataset?
I ran an analysis against the reconstructed dataset in order to complete MPB action "Do a blind analysis ...". I ran the analysis against
http://www.ppe.gla.ac.uk/~rbayes/MICE_6pi200_1/pass2_simulation_mausv0p9p2.tar.gz
- I note that tracker reconstruction does not have any error matrix associated with the tracks.
- I note that TOF is uncalibrated (presumably we don't have a calibration appropriate for this TOF geometry)
I constrained myself to the recon_events branch in the data structure (hence "blind"). I tried to do PID using uncalibrated TOF vs tracker momentum to reject pion impurities. I then did a statistical weighting using the Voronoi algorithm in (x, px, y, py) to generate a matched 6 mm emittance beam, followed by a statistical weighting to select energy with a 5 MeV sigma Gaussian distribution and mean 226 MeV (200 MeV/c pz). I measured exactly no cooling! Results below:
Number upstream: 10060, weight 10268.6736441
Number downstream: 10060, weight 10268.6736441
Emittance upstream: 6.26319650991
[[ 1228.81324105   -71.05796649    76.69789367  -740.27516681]
 [  -71.05796649   691.50433735   638.7959465     76.89672699]
 [   76.69789367   638.7959465   1192.80588937   -64.05287378]
 [ -740.27516681    76.89672699   -64.05287378   839.24662183]]
Emittance downstream: 6.29066900614
[[ 1598.90347602  -292.16254374   213.90940464   926.62538153]
 [ -292.16254374   913.4983059  -1056.10643249   -55.95572244]
 [  213.90940464 -1056.10643249  1738.57475123   -69.65882614]
 [  926.62538153   -55.95572244   -69.65882614   838.63803593]]
Full report in the analysis meeting.
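For reference, a normalized 4D transverse emittance like the ones quoted above follows, under the usual definition, from the 4x4 (x, px, y, py) covariance matrix as eps = det(C)^(1/4) / m_mu. A minimal sketch, assuming that definition and the muon mass 105.658 MeV/c^2 (a pure-Python determinant is used so the snippet is self-contained):

```python
MUON_MASS = 105.658  # MeV/c^2

def det(m):
    """Determinant by cofactor expansion along the first row
    (fine for small matrices such as the 4x4 case here)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += ((-1) ** j) * m[0][j] * det(minor)
    return total

def emittance_4d(cov, mass=MUON_MASS):
    """Normalized 4D transverse emittance from an (x, px, y, py)
    covariance matrix, in the length units of the matrix."""
    return det(cov) ** 0.25 / mass
```

Feeding in the upstream and downstream matrices above would reproduce the corresponding emittance figures up to rounding, given the same definition.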
Updated by Bayes, Ryan over 8 years ago
The geometry used for the pass2 simulation is CDB ID 48, which is a preliminary Step IV geometry using an LiH absorber. The field settings are the 6-200 M0 tune, namely:
D2: 0.396 T
Q4: 0.908 T/m
Q5: -1.218 T/m
Q6: 0.808 T/m
Q7: 0.797 T/m
Q8: -1.205 T/m
Q9: 1.029 T/m
The simulation was generated using MAUS v0.9.3 with the default "simulate_mice.py" script.
Updated by Rogers, Chris over 8 years ago
What are the cooling channel SC coil currents? Are they hard coded?
Updated by Bayes, Ryan over 8 years ago
They are not "hard coded" per se, but they are set to the default settings contained in the Maus_Information file native to the geometry download.
The default currents are:
Upstream
End Coil 2: 134 A
Centre Coil: 147 A
End Coil 1: 131 A
Match Coil 2: 135 A
Match Coil 1: 113 A
Focus Coil US: 104 A
Focus Coil DS: -104 A
Downstream
Match Coil 1: -112 A
Match Coil 2: -140 A
End Coil 1: -131 A
Centre Coil: -147 A
End Coil 2: -134 A
To change from the default settings, the currents must be changed by hand for the time being. The code exists to alter the current densities from entries in the CDB, but it has not been exercised due to a lack of cooling-channel entries in the CDB. This can, however, be remedied quickly using the tags currently in the CDB.
Updated by Rogers, Chris over 8 years ago
Just to follow up on comments in the MAUS meeting: I am using the attached script to read the data, and I get output reconstructed up to slab hits but no space points. E.g. for "maus_output_root/maus_output_1.root" from Ryan's dataset:
number of tof0 sps: 0  number of tof1_sps: 0  number of tof0 slabs: 2  number of tof1 slabs: 2
(The record above repeats for essentially every event in the file: 2-3 slab hits in each of TOF0 and TOF1, zero space points. The single exception is one event with a TOF1 space point: number of tof0 sps: 0, number of tof1_sps: 1, number of tof0 slabs: 0, number of tof1 slabs: 2, number of tof0 digs: 0, number of tof1 digs: 4.)
Updated by Rajaram, Durga over 8 years ago
Puzzled. I'm getting TOF0,1 space points simulating with the latest MAUS.
In fact there's an integration test in MAUS, tests/integration/test_simulation/test_tof/test_tof.py, designed to catch anything off with the simulation or reconstruction, and it has been passing.
I wonder if it has anything to do with using a G4BL input, or is geometry-related. I can't imagine how, but those are the only things the integration test doesn't cover.
Silly, but can you confirm if the TOF1SlabHit->GetPlane() distribution shows both 0&1?
And that the TOF1SlabHit->GetCharge() is not empty?
[ or alternately, can I get access to one of the root outputs? Anything with a few tens of TOF1 slab hits should be fine. ]
These were with MAUS-v0.9.2 from ~a month ago?
Is there a way I can reproduce the root files you are using?
With for instance a smallish beam input file & whatever cards were used in producing these files?
Updated by Rogers, Chris over 8 years ago
- File maus_output_1.root maus_output_1.root added
Here is the first root file (of a thousand or so) so that you can have a play...
Updated by Rajaram, Durga over 8 years ago
Thanks.
Which geometry ID was used? And can I get the G4BL JSON input that produced this?
Updated by Bayes, Ryan over 8 years ago
The geometry ID used was 48. The G4BL interface file used can be found at http://www.ppe.gla.ac.uk/~rbayes/MICE_6pi200_1/1810_6200_pos/jsondocother11_1.txt.
I have actually been looking at the output myself. Only 1% or so of the TOF events are producing space points. I am not sure whether the file that Chris posted will produce any useful events.
The configuration file used the following arguments.
=================================
input_json_file_name = "1810_6200_pos/jsondocother11_XXX.txt"
input_json_file_type = "text"
output_json_file_name = "maus_output.json"
output_json_file_type = "text"
input_root_file_name = "maus_input.root"
output_root_file_name = "maus_output_XXX.root"
simulation_geometry_filename = "/data/neutrino04/common_SW/MICE/maus-v0.9.2/tmp/test_latest_geometry_48/geometry/ParentGeometryFile.dat"
====================================
Updated by Bayes, Ryan over 8 years ago
- File reduced_ensemble_analysis.tar reduced_ensemble_analysis.tar added
- File virtual_planes.pdf virtual_planes.pdf added
I have written an analysis program in C++, largely because I have been having problems with xboa in terms of processing time, and I did not find out about Adam's until after I had written it.
The attached tar file contains the program files. It should build in the MAUS environment with a simple make command after the tar file is unpacked.
The program uses a two-step process: it first identifies events in a set of MAUS output files and places them into a "reduced" tree file, which cuts the space requirement of the batch analysis from 16 GB to 1.8 GB. The resulting events are then passed through a simple muon PID selection algorithm based on the TOF and the particle momentum at the absorber (taken as the average momentum between the reference planes). The TOF selection is fixed so that the TOF1 - TOF0 difference must be between 42 ns and 48 ns. The momentum range is flexible in the software, but after discussion with Jaroslaw I have been using a range of 200 ± 5 MeV/c in my analyses. The results for the 6-200 MeV/c batch simulation are given below.
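The PID selection just described reduces, in sketch form, to a predicate like the following (the function and argument names are mine; the windows are from the text, and the momentum window is kept configurable as described):

```python
def is_muon_candidate(tof1_minus_tof0_ns, p_at_absorber_mev_c,
                      tof_lo=42.0, tof_hi=48.0,
                      p_centre=200.0, p_half_width=5.0):
    """Muon PID: a TOF1 - TOF0 window of 42-48 ns combined with a
    momentum window of 200 +/- 5 MeV/c at the absorber."""
    in_tof = tof_lo <= tof1_minus_tof0_ns <= tof_hi
    in_p = abs(p_at_absorber_mev_c - p_centre) <= p_half_width
    return in_tof and in_p
```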
Upstream tracker measurements
- Summary
Number of tracks = 3356
mean X  = 2.46407 ± 0.631982
mean Y  = -3.58828 ± 0.630931
mean Z  = 15049.7 ± 0.566714
mean Px = -5.3386 ± 0.452144
mean Py = 8.99147 ± 0.47886
mean Pz = 203.982 ± 0.110495
- Covariance
          X          Y         Px         Py          Z         Pz
X   1340.39    -44.3757   -35.8242  -736.47     -3.44362   46.1577
Y    -44.3757 1335.94     755.583    -30.4149   46.7451    13.1694
Px   -35.8242  755.583    686.08     -30.0475   42.0154     7.15832
Py  -736.47    -30.4149   -30.0475   769.555    31.429    -35.3736
Z     -3.44362  46.7451    42.0154    31.429  1077.83      12.8347
Pz    46.1577   13.1694     7.15832  -35.3736   12.8347    40.974
- Emittance(x,y) = 6.05213±0.104471
- Beta(x,y) = 427.457±7.37873
Downstream tracker measurements
- Summary
Number of tracks = 3356
mean X  = -10.367 ± 0.565674
mean Y  = 16.638 ± 0.633988
mean Z  = 18868.2 ± 0.839434
mean Px = -10.5996 ± 0.492107
mean Py = -5.8433 ± 0.476276
mean Pz = 187.887 ± 0.130534
- Covariance Matrix
          X          Y         Px         Py          Z         Pz
X   1073.88   -143.281     71.2935   677.234    78.629     33.7998
Y   -143.281  1348.91    -814.987   -117.019  -139.69     -23.9673
Px    71.2935 -814.987    812.72      45.0992  145.2       22.0473
Py   677.234  -117.019     45.0992   761.271   -22.1385    45.4622
Z     78.629  -139.69     145.2      -22.1385 2364.8        6.5534
Pz    33.7998  -23.9673    22.0473    45.4622    6.5534    57.1834
- Emittance(x,y) = 5.91149±0.102044
- Beta(x,y) = 365.188±6.30384
I have not applied any correction to the emittance result shown.
I have also extracted the emittance and beta from the virtual planes with the same analysis so that the behaviour of the true particles can be accessed away from the tracker reference planes.
I have used this virtual plane analysis to explore the effect of the momentum cut. This shows that relaxing the cut has a dramatic effect on the measured emittance, increasing both the overall emittance of the sample and the emittance change across the absorber. The results are shown in the attached file "virtual_planes.pdf".
Updated by Rogers, Chris about 7 years ago
- Status changed from Open to Closed
- % Done changed from 0 to 100
Now that we have data, this is no longer useful.