<!------------------------------------------------------------------------------------------------------------->
---+ Introduction
* This page describes commands for running the ttH analysis with the new analysis package: PhysicsAnpttH
* General tutorial for setting up the PhysicsAnpttH package: PhysicsAnpTutorial
* Git project for code that reads (D)xAOD and makes analysis ntuples: PhysicsAnpProd
* Git project for code to submit jobs: AnpBatch/macros
* References for the old ttH code: PhysicsttH SVN
* Instructions for running the old code: PhysicsLightttH
<!------------------------------------------------------------------------------------------------------------->
---+ How to set up PhysicsAnpttH with release 20.7 for the first time
* This package is still under development.
* If the "git clone" command below fails outside CERN, try obtaining a CERN Kerberos ticket first:
<verbatim>
kinit -f -r7d -A $USER@CERN.CH
</verbatim>
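A small hedged check avoids requesting a new ticket when one is still valid: `klist -s` exits non-zero when no valid ticket cache exists. This is only a convenience sketch, not part of the package:

```shell
# Sketch: request a new Kerberos ticket only when none is valid.
# Assumes the MIT Kerberos client tools (kinit/klist) are installed.
if command -v klist >/dev/null 2>&1 && klist -s 2>/dev/null; then
    msg="valid Kerberos ticket found"
else
    msg="no valid ticket; run: kinit -f -r7d -A \$USER@CERN.CH"
fi
echo "$msg"
```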
---++ Set up the ttH analysis package
* This section describes how to set up this code for the first time using release 20.7 of ATLAS analysis software
* Please run these commands only once, then exit the shell:
<verbatim>
$ mkdir -p ~/testarea/AnpttH
$ cd ~/testarea/AnpttH
$ git clone https://:@gitlab.cern.ch:8443/ustc/PhysicsAnpttH.git
$ source PhysicsAnpttH/macros/setup/first_setup_rel20.sh
$ exit
</verbatim>
---++ Set up the PhysicsAnpProd code for ntuple generation
* ustc/PhysicsAnpProd is a standalone package that reads (D)xAOD and produces ROOT ntuples with flat or vector branches
* This section describes how to set up this code for the first time using release 20.7 of ATLAS software
* Please run these commands only once, then exit the shell:
<verbatim>
$ mkdir -p ~/testarea/AnpProd20
$ cd ~/testarea/AnpProd20
$ git clone https://:@gitlab.cern.ch:8443/ustc/PhysicsAnpProd.git
$ source PhysicsAnpProd/macros/setup/first_setup_rel20.sh
$ exit
</verbatim>
---++ Set up the AnpBatch package for managing batch jobs
* AnpBatch contains scripts that help with managing batch jobs at CERN and USTC
* There are two main macros for managing jobs:
* subCERN.py prepares shell scripts for individual jobs and submits them to LXBATCH
* procJob.py copies a job's input files to the local disk on the worker node and copies the output ROOT files back
* Commands to check out this package:
<verbatim>
$ cd ~/testarea/
$ git clone https://:@gitlab.cern.ch:8443/ustc/AnpBatch.git
</verbatim>
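The stage-in/run/stage-out pattern that procJob.py implements can be sketched as follows. All file names here are illustrative stand-ins, not the actual AnpBatch interface:

```shell
# Hypothetical sketch of a procJob.py-style job wrapper:
# copy the input to local scratch, run the job there, copy the output back.
set -e
workdir=$(mktemp -d)                         # stand-in for the submission area
scratch=$(mktemp -d)                         # local disk on the worker node
echo "dummy event data" > "$workdir/input.root"
cp "$workdir/input.root" "$scratch/"         # stage input to local disk
( cd "$scratch" && cp input.root out.root )  # stand-in for the actual job
cp "$scratch/out.root" "$workdir/"           # copy the output ROOT file back
ls "$workdir"
```

Running the job on local scratch rather than over the network is what makes this pattern worthwhile on shared batch systems.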
<!------------------------------------------------------------------------------------------------------------->
---+ Work flow examples
---++ Examples of running ttH locally
* Running locally is useful for testing.
* Step 0: set up the environment.
<verbatim>
$ cd ~/testarea/AnpBase20/
$ source setup_atlas_analysis_release.sh
</verbatim>
* Step 1: make mini-ntuple.
<verbatim>
$ cd ${TestArea}/PhysicsAnpttH/
$ python macros/runttHMiniNtp.py ${path_of_input_ntuples} --do-flip-ntuple=2 --do-tau -o out_minintp.root -n 0
</verbatim>
* Step 2: event selection.
<verbatim>
$ python macros/runttH.py ${path_of_input_minintuples} --btag-wp=70 -o out.root --do-Zincl --noTrigSF -n 0
</verbatim>
* Step 3: make tables and stacked plots.
* This step needs both the data and MC output ROOT files from step 2.
<verbatim>
$ python macros/plotCand.py ${path_of_MC_dir} --data-file=${path_of_data.root}/data.root --xsec-list=data/plot/xsec_list.txt --get-regions --counts-dir=Counts --config-path=data/plot/plot_stack_2l.txt -r -s --do-fixrange -o plots
</verbatim>
---++ Examples of running ttH validation with batch jobs
* Before following the instructions below, set up your local environment:
<verbatim>
$ cd ~/testarea/AnpBase20/
$ source setup_atlas_analysis_release.sh
</verbatim>
---+++ Generate mini-ntuples for data and MC
* Prepare the input ntuple lists:
* subCERN.py helps with preparing and dividing the input data/MC ntuples for each job.
* The list of input Data/MC ntuples is needed.
<verbatim>
$ python makeFileList.py --eos-path=/eos/escience/UniTexas/HSG8/multileptons_ntuple_run2/25ns_v29/01/Nominal_PLI/ --eos-partition=public -o input_cern_tth_siml_v29.txt
$ python makeFileList.py --eos-path=/eos/escience/UniTexas/HSG8/multileptons_ntuple_run2/25ns_v29/01/Data --eos-partition=public -o input_cern_tth_data_v29.txt
</verbatim>
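Assuming each list file holds one ntuple path per line, standard tools can inspect a list and split it into per-job chunks. subCERN.py performs this division for real; the paths and file names below are made up for illustration:

```shell
# Hedged sketch: inspect and split a file list into per-job input lists.
printf '%s\n' /eos/a.root /eos/b.root /eos/c.root > input_example.txt
nfiles=$(wc -l < input_example.txt | tr -d ' ')  # number of input ntuples
split -l 2 input_example.txt job_                # two files per job: job_aa, job_ab
echo "$nfiles input files, $(ls job_* | wc -l | tr -d ' ') job lists"
```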
* Prepare the script that submits batch jobs:
* An example shell script prepares the batch jobs with subCERN.py.
* The input and output paths in this script should be changed to your local ones.
<verbatim>
$ source macros/tth/runCERN_base20_tth.sh mini v29
</verbatim>
* Submit batch jobs:
<verbatim>
$ cd ${TestArea}/PhysicsAnpttH/work/batch/tth/Hist/
$ source config-data-v29/submit_all.sh
$ source config-siml-v29/submit_all.sh
</verbatim>
---+++ Make stacked plots and tables
* This step uses the outputs from the previous step.
* The data files need to be merged with hadd into one file, data.root; the MC files can stay unmerged.
<verbatim>
$ cd ${TestArea}/PhysicsAnpttH/work/batch/tth/Hist/out/tth_hist_${date}_Anp_data_${version}
$ hadd data.root job_0*
</verbatim>
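Before running hadd it can be worth confirming that every batch job actually left an output file; the `job_0*` pattern matches the naming above, while the files created here are stand-ins for illustration:

```shell
# Hedged sketch: count job outputs before merging them with hadd.
tmpdir=$(mktemp -d) && cd "$tmpdir"
touch job_000.root job_001.root job_002.root   # stand-ins for real job outputs
n_out=$(ls job_0*.root | wc -l | tr -d ' ')
echo "found $n_out job output files"
```

If the count disagrees with the number of submitted jobs, resubmit the missing jobs before merging.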
* Make plots and tables.
<verbatim>
$ python macros/plotCand.py ${path_of_MC_dir} --data-file=${path_of_data.root}/data.root --xsec-list=data/plot/xsec_list.txt --get-regions --counts-dir=Counts --config-path=data/plot/plot_stack_2l.txt -r -s -o plots
</verbatim>
<!------------------------------------------------------------------------------------------------------------->
---+ Plans for migrating old ttH code to PhysicsAnpttH
* Migrate code to make mini-ntuples
* Ask Rhys for commands to make mini-ntuples with the old code
* Use new batch scripts to make new mini-ntuples
   * AnpBatch/macros
   * Example of using the new batch script for the RPC analysis
* Migrate code to select control regions
* Update code to make plots and tables
   * Plot making is already migrated by Rhys
* Update this TWiki with complete instructions
<!------------------------------------------------------------------------------------------------------------->
---+ Timeline
* New PhysicsAnpttH code working with release 20.7 ntuples - mid-January
* New PhysicsAnpttH code working with release 21 ntuples - late January to early February
* Validate PromptLeptonIso/Veto with release 21 data/MC - February
* Calibrate new muon working points - February to March
* Study old variables with detailed truth for prompt and non-prompt leptons - March
* Study new variables with detailed truth - April to July (possibly longer, depending on outcome)
* Start physics analysis project - June