Trigger Vertex Study Recipe - by Steve Sekula
Instructions for running the Vertex/Pile-Up Study
Basic Information - Quickstart
RELEASE:
15.6.9.37
- tested with this release; later releases will likely work, but the tags of the packages below may vary.
PACKAGES:
cmt co -r TrigCostAthena-00-00-01 Trigger/TrigCost/TrigCostAthena
cmt co Trigger/TrigCost/TrigCostAlgs
cmt co Trigger/TrigCost/TrigCostData
cmt co Trigger/TrigCost/TrigCostBase
cmt co Trigger/TrigCost/TrigCostRate
COMMANDS:
To generate a small ntuple containing vertex information:
cd /tmp/<USERNAME>/
dq2-get -f RDO.139442._000690.pool.root.1 mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d307_tid139442_00
athena -c 'PoolRDOInput=["/tmp/<USERNAME>/mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d307_tid139442_00/RDO.139442._000690.pool.root.1"];setGEO="ATLAS-GEO-10-00-00";setMenu="MC_pp_v1";setDebug="true";eventMax=100' ./RunTrigCost.py >& ./RunTrigCost.log
To analyze the vertex information in the ROOT files resulting from the previous step:
./runRates.py . -k TrigCost -r cost_debug.root --test="algs vtx"
Setting up the Release
Create a location where your cmthome/ and test area directories will live. I have chosen the following subdirectory structure for that location:
cmthome/
code/
Here, code/ is my "testarea".
My requirements file is as follows:
#----------------------
set CMTSITE CERN
set SITEROOT /afs/cern.ch
macro ATLAS_DIST_AREA ${SITEROOT}/atlas/software/dist
macro ATLAS_TEST_AREA ${HOME}/<PATH TO TESTAREA>/
apply_tag runtime
apply_tag opt
#simple workarea directories
apply_tag oneTest
apply_tag setup
apply_tag 32
macro ATLAS_SETTINGS_AREA "$(ATLAS_DIST_AREA)" \
betaSettings "$(ATLAS_DIST_AREA)/beta"
use AtlasLogin AtlasLogin-* $(ATLAS_SETTINGS_AREA)
set CMTCONFIG i686-slc5-gcc43-opt
#---------------------
Set up the release as usual, with the "requirements" file in the cmthome/ subdirectory.
cd cmthome/
source /afs/cern.ch/sw/contrib/CMT/v1r20p20090520/mgr/setup.sh
cmt config
source ./setup.sh -tag=15.6.9.37,AtlasProduction
Packages for the Test Area
You need to check out the following to your test area (code/, in my case):
cd code/
svn co svn+ssh://svn.cern.ch/reps/penn/PhysicsNtuple/PhysicsBase/trunk PhysicsNtuple/PhysicsBase
svn co svn+ssh://svn.cern.ch/reps/penn/PhysicsNtuple/TrigMonCosts/trunk PhysicsNtuple/TrigMonCosts
svn co svn+ssh://svn.cern.ch/reps/penn/PhysicsNtuple/TriggerAlgs/trunk PhysicsNtuple/TriggerAlgs
svn co svn+ssh://svn.cern.ch/reps/penn/PhysicsNtuple/TriggerAthena/trunk PhysicsNtuple/TriggerAthena
svn co svn+ssh://svn.cern.ch/reps/penn/PhysicsNtuple/TriggerData/trunk PhysicsNtuple/TriggerData
cmt co -r TrigMonitoringEvent-00-00-10 Trigger/TrigEvent/TrigMonitoringEvent
cmt co -r TrigCostMonitor-00-00-59 Trigger/TrigMonitoring/TrigCostMonitor
(These last two tags are based on information from https://twiki.cern.ch/twiki/bin/view/Atlas/TrigCostMonitor and from Rustem, who created TrigCostMonitor-00-00-59 to incorporate the vertex study algorithm setup and to fix bugs that prevented the Trig::Vertex object from being written to the TrigRate*.root files.)
Then execute:
setupWorkArea.py
to get a WorkArea directory in code/, which is a handy place from which to build. Then build:
cd WorkArea/cmt/
cmt broadcast cmt config
cmt broadcast source setup.sh
cmt broadcast make -j2
If that works, you're ready to run.
Key Classes and Configuration Files
- TriggerData/TriggerData/Vertex.h
- a class, capable of being stored in a .root file, which contains information about a vertex (number of tracks, chi2, etc.). Defined in the "Trig" namespace.
- TriggerData/TriggerData/TriggerDataDict.h
- defines the Vertex class in the ROOT dictionary.
- TriggerData/selection.xml
- puts the Trig::Vertex and std::vector<Trig::Vertex> classes in the lcgdict.
- TriggerAthena/TriggerAthena/TrigNtVertexTool.h
- processes an event, retrieves vertex lists, and prepares a std::vector object for storage in a ROOT file. This algorithm is executed when the RunTrigCost.py or RunOfflineTrigCost.py jobOptions are executed in Athena.
- TriggerAthena/python/TriggerAthenaConfig.py
- contains python code (for Athena) that loads a python object pointing to the TrigNtVertexTool. Allows it to be configured when Athena runs.
- TriggerAthena/share/RunOfflineTrigCost.py
- top-level jobOptions for running TrigNtVertexTool in Athena (with offline reconstruction).
- TriggerAlgs/TriggerAlgs/StudyVertex.h
- processes the "vertices" folder, in which std::vector objects are stored in the ROOT files produced by RunTrigCost.py or RunOfflineTrigCost.py. Produces histograms, TTrees, and tables containing information about vertices.
- TriggerAlgs/config/StudyVertex.xml
- defines the structure of the histograms filled by StudyVertex.h.
- TriggerAlgs/python/TriggerAlgsConfig.py
- creates the instances of the StudyVertex algorithm that are run by runRates.py. Two instances are defined: one for L2 vertices and one for offline vertices. Also loads the StudyVertex.xml file into the run_module.
- TrigMonCosts/macros/runRates.py
- the script you execute to process the TrigRate.root file(s) produced after running "athena...RunOfflineTrigCost.py".
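For reference, the lcgdict selection in TriggerData/selection.xml follows the standard Reflex selection-file syntax. A minimal sketch of what it would contain for the classes described above (the actual file may select additional classes):

```xml
<lcgdict>
  <!-- the vertex class itself -->
  <class name="Trig::Vertex"/>
  <!-- the vector of vertices that is written to the ntuple -->
  <class name="std::vector<Trig::Vertex>"/>
</lcgdict>
```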
Processing Samples to Incorporate Vertexing
This step processes ESD or AOD files and produces lightweight TrigRate.root files, which can then be analyzed using runRates.py.
First locate the data containers you need. The MC containers are listed on the Trigger Rates
Twiki:
https://twiki.cern.ch/twiki/bin/view/Atlas/AtlasTriggerRates#Datasets
For instance:
mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d307 is the pile-up MinBias MC
mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d300 is the non pile-up MinBias MC
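Note that the two containers differ only in their final digitization tag (d307 vs d300). Dataset names are dot-separated, so the fields are easy to pull apart programmatically; here is an illustrative pure-Python sketch (the field labels are my own, not an official schema):

```python
def parse_dataset_name(name):
    """Split an ATLAS MC dataset name into its dot-separated fields."""
    project, run_number, physics_short, step, data_type, tags = name.split(".", 5)
    return {
        "project": project,            # e.g. mc09_7TeV
        "run_number": run_number,      # e.g. 105001
        "physics_short": physics_short,  # e.g. pythia_minbias
        "step": step,                  # e.g. digit
        "data_type": data_type,        # e.g. RDO
        "tags": tags.split("_"),       # e.g. ['e517', 's764', 's767', 'd307']
    }

fields = parse_dataset_name(
    "mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d307")
print(fields["tags"][-1])  # -> d307 (the tag that distinguishes pile-up from non pile-up)
```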
To process this MC on the GRID, first grab one of the files in the container off the GRID:
cd /tmp/${USER}/
source /afs/cern.ch/atlas/offline/external/GRID/ddm/DQ2Clients/setup.sh
dq2-ls -f mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d307
dq2-get -f RDO.139442._000690.pool.root.1 mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d307
Then run the following from something under your test area:
get_files -jo RunOfflineTrigCost.py
pathena -c 'PoolRDOInput=["/tmp/USERNAME/mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d307/RDO.139442._000690.pool.root.1"];setMenu="InitialBeam_v3"' ./RunOfflineTrigCost.py --inDS 'mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d307/' --outDS user10.YourUserName.AStringThatCharacterizesTheContainer --supStream=GLOBAL --nGBPerJob=1 --notSkipMissing
Or to run interactively over a few events:
athena -c 'PoolRDOInput=["/tmp/USERNAME/mc09_7TeV.105001.pythia_minbias.digit.RDO.e517_s764_s767_d307/RDO.139442._000690.pool.root.1"];setMenu="InitialBeam_v3";eventMax=100' ./RunOfflineTrigCost.py
Look for the files called TrigRate*.root in the output container (or in the current directory, depending on how you ran).
Once you have TrigRate*.root files, you are ready to run the StudyVertex algorithm on them. To do this, execute the following:
<PATH TO TESTAREA>/PhysicsNtuple/TrigMonCosts/macros/runRates.py <PATH TO TrigRate*.root FILES> -k TrigRate -r cost_debug.root --test=study
The above tells runRates.py to process TrigRate files, output the results to cost_debug.root (which will then contain histograms, TTrees, etc.), and execute the study algorithms (e.g. StudyVertex).
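The -k option appears to select input files by a key in the filename (TrigRate here, TrigCost in the quickstart). That filter can be sketched in plain Python; this is my assumption about the matching behavior for illustration, not the actual runRates.py code:

```python
import fnmatch

def select_input_files(filenames, key):
    """Keep ROOT files whose names contain the given key, mimicking
    (as an assumption) runRates.py's -k filename filter.  The trailing
    wildcard also accepts GRID-style suffixes such as .root.1."""
    return [f for f in filenames if fnmatch.fnmatch(f, "*%s*.root*" % key)]

files = ["TrigRate.root", "TrigRate_002.root", "cost_debug.root", "log.txt"]
print(select_input_files(files, "TrigRate"))  # -> ['TrigRate.root', 'TrigRate_002.root']
```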
--
StephenSekula - 24-Jul-2010