Introduction
CoEPPNtupGen is a C++ package that incorporates all the necessary ATLAS tools to produce derived ntuples that can be easily analysed.
Layout:
- CoEPPNtupGen - header files
- config - base xml configuration stubs
- misc - miscellaneous files
- root - configuration root files (e.g. histograms for pileup reweighting)
- scripts - command line tools (added to path in setup.sh)
- share - python scripts designed to be modified for custom use (help in setting up sample configuration)
- src - source files
Installation
Follow these steps to install:
- Follow the instructions to set up the CoEPP suite here
- Set up athena (the release is not critical, although 17.0.2 is tested; in principle the package can be compiled and run against any recent version of ROOT compiled against python, without an athena install). If you haven't set up athena before, look here.
- In the CoEPPDir, check out the LATEST tag of CoEPPNtupGen (you should check which tag is latest in SVN):
-
svn co svn+ssh://svn.cern.ch/reps/atlasusr/wdavey/CoEPPNtupGen/tags/CoEPPNtupGen-XX-XX-XX CoEPPNtupGen
- Check out dependencies and build the package. This can all be done by issuing these commands:
Note: It is highly recommended that you set up ssh keys to svn.cern.ch (see here) before issuing the make command, to avoid entering your password twice for every checkout. Also note that the checkout WILL NOT WORK if your local username and lxplus username differ and you have not configured this in ~/.ssh/config. That file should have an entry something like this:
Host svn.cern.ch svn
Hostname svn.cern.ch
User wdavey
GSSAPIAuthentication yes
GSSAPIDelegateCredentials yes
Protocol 2
ForwardX11 no
where User is your lxplus username.
Running
The code is run with the executable:
./runAnalysis [options] XML_CONFIG
where XML_CONFIG is a mandatory xml config file. For a list of command line options, run:
./runAnalysis --help
Examples
As an example we will run over a skimmed D3PD, using skims from release 17 D3PDs with the new 'tau' TTree (as opposed to the old 'tauPerf' TTree). Either use the skim you made in CoEPPGridTools#Skimming_Example, or use dq2-get to retrieve
user.wdavey.Skim.Ztautau.r17default.TESSkim_v3/
from the grid (dq2 instructions here).
Single Local File Example
To run over a single file execute this command:
./runAnalysis --isMC --files <file> run-example.xml
This should produce an ntuple called CoEPPNtup.example.root.
Note: <file> can be a comma-separated list of input files. This is typically used for grid running; there are more convenient ways to configure running over multiple files locally.
Multiple Local File Example
The easiest way to run over multiple files locally is to generate an xml file with the list of input files. To do this, first make sure you have set up CoEPPGridTools. Then issue the following command:
genInputXML -o input.files.xml -c -t tau PATH/TO/FILES/*.root*
where the -c and -t tau options will engage checking of the tau TTree in the input files.
Note: PATH/TO/FILES/*.root* may look something like this:
/lustre/user/wedavey/data/Skims/TestTESSkims/user.wdavey.Skim.Ztautau.r17default.TESSkim_v3.111120200140/*.root*
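If genInputXML is not available, the file list it writes can be sketched in a few lines of python. This is a hypothetical stand-in, not the actual tool: it assumes the config expects one <In FileName="..." Lumi="..."/> element per file, matching the stanza used in run-example.xml.

```python
import glob

# Hypothetical sketch of the XML fragment genInputXML writes: one <In/>
# element per matching input file. The element name and attributes are
# assumed from the run-example.xml input stanza; the real tool also
# checks the requested TTree, which is not reproduced here.
def gen_input_xml(pattern, lumi=1.0):
    """Return an XML fragment listing every file matching `pattern`."""
    lines = ['<In FileName="%s" Lumi="%s"/>' % (f, lumi)
             for f in sorted(glob.glob(pattern))]
    return "\n".join(lines)

if __name__ == "__main__":
    print(gen_input_xml("PATH/TO/FILES/*.root*"))
```
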
Now you must include this list of files in the XML_CONFIG file. First, import the new file into the main config by making sure the top of your run-example.xml contains a line:
<!ENTITY input_files SYSTEM "input.files.xml">
that points to this new file.
Then include it in the main config by replacing:
<In FileName="PATH/TO/FILE/file.root" Lumi="1."/>
with
&input_files;
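Putting the two pieces together, the top of run-example.xml might look something like this. This is a minimal sketch only: the actual DOCTYPE name and the surrounding elements depend on the config schema shipped with the package, so use it as a guide to where the entity declaration and reference go, not as a literal template.

```xml
<?xml version="1.0"?>
<!-- "Config" is an assumed root element name; keep whatever your
     run-example.xml actually uses. -->
<!DOCTYPE Config [
  <!ENTITY input_files SYSTEM "input.files.xml">
]>
<Config>
  <!-- ... other configuration ... -->
  &input_files;
  <!-- ... -->
</Config>
```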
Now simply execute the new config like this:
./runAnalysis run-example.xml
Full Local Analysis
It would be possible to manually create input file configs for each of the datasets you wish to run over. However, there are a couple of scripts that make this task a lot easier. If you want to try this whole example, you can download the full TES analysis skim user.wdavey.Skim.r17default.TESSkim_v2/ (~80GB).
Generating the input file config with share/genSampleConfig.py (unfortunately this script is a little more complicated since there are both 'tau' and 'tauPerf' D3PDs, but this should be simplified in the future).
To configure share/genSampleConfig.py you should:
- set indir to the path where you downloaded the skims.
- make an output dir and set it as outdir (here it is just called runTES).
- comment out or include all the datasets you want to use.
Then from the CoEPPNtupGen top dir execute the script:
python share/genSampleConfig.py
This should generate all the input file configs in outdir.
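The core of what the script does can be sketched as a simple loop: for each dataset left un-commented, glob the downloaded skim files under indir and write one input-file config into outdir. The dataset names, directory layout, and <In/> element format below are assumptions based on the examples above, not the actual script.

```python
import glob
import os

# Illustrative settings, mirroring the indir/outdir/dataset-list knobs
# described above (paths are placeholders).
indir = "PATH/TO/SKIMS"   # where you downloaded the skims
outdir = "runTES"         # output dir for the generated configs
datasets = [
    "user.wdavey.Skim.Ztautau.r17default.TESSkim_v3",
    # "user.wdavey.Skim.SomeOther.Dataset",  # 'comment out' datasets like this
]

def write_dataset_configs(indir, outdir, datasets):
    """Write one input-file config per dataset into `outdir` (sketch)."""
    os.makedirs(outdir, exist_ok=True)
    for ds in datasets:
        # grid downloads usually land in a dir starting with the dataset name
        files = sorted(glob.glob(os.path.join(indir, ds + "*", "*.root*")))
        with open(os.path.join(outdir, "input.%s.xml" % ds), "w") as out:
            for f in files:
                out.write('<In FileName="%s" Lumi="1."/>\n' % f)

# write_dataset_configs(indir, outdir, datasets)
```
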
Generating Run XML with share/genRunConfig.py:
Again, this script is complicated by the fact that at the moment we have ZtautauAlpgen and Embedded samples with the old tauPerf tree. So if you prefer, you can simplify it by removing these. The most important things to configure are:
- indir - should be the dir where your input file configs are
- outdir - this is where your output ntuples will be saved (it is fine for this to be the same as the config dir)
- runtag - the run configs will have the form run.<runtag>.<dataset>.xml (useful if you have multiple configurations you are running)
- Then comment out / include the datasets so they match your input dataset configs.
Again, from the CoEPPNtupGen top dir execute the script:
python share/genRunConfig.py
This should generate all the run config files.
Executing Run XML:
If you have access to a PBS queue, you can easily launch all the jobs at once. First, edit the batch script batchexec.sh and change the following lines to perform the appropriate athena configuration on your platform:
## Athena Config
HOMEDIR=/lustre/user/wedavey
cd $HOMEDIR
source .bashrc
source setAth
Then launch the jobs like this:
launchPBS run.TES.*.xml
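Conceptually, launchPBS submits one batch job per matching run config. The loop below is a hypothetical sketch of that behaviour, not the actual script from scripts/: the qsub options and the CONFIG variable name are assumptions, so check launchPBS itself for the real submission command.

```shell
# Hypothetical sketch of the launchPBS loop: one PBS job per run config.
# QSUB can be overridden (e.g. QSUB=echo) to dry-run the submissions.
QSUB="${QSUB:-qsub}"

launch_pbs() {
    for cfg in "$@"; do
        [ -e "$cfg" ] || { echo "no such config: $cfg" >&2; continue; }
        # hand the config name to batchexec.sh via the job environment
        $QSUB -v CONFIG="$cfg" batchexec.sh
    done
}

# launch_pbs run.TES.*.xml
```
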
-- WillDavey - 20-Nov-2011