nTupler and Analysis Code

Getting Started

First we need to get Delphes. Commands to download and unpack the code:

wget http://cp3.irmp.ucl.ac.be/downloads/Delphes-3.0.10.tar.gz

tar -zxf Delphes-3.0.10.tar.gz

Commands to compile the code:

cd Delphes-3.0.10

make -j7
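
The build produces the shared library libDelphes.so in this directory; it is needed for the links below. A quick check:

ls libDelphes.so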

Go back to the directory in which Delphes was unpacked and use the following command to get the nTupler code from GitHub:

git clone https://www.github.com/trippk/nTupler_topness.git

cd nTupler_topness

To set up and compile the code, first create links to the Delphes classes, the ExRootAnalysis headers and the Delphes library:

ln -s ../Delphes-3.0.10/classes .

ln -s ../Delphes-3.0.10/external/ExRootAnalysis .

ln -s /afs/desy.de/user/t/trippk/dust/snowmass/Delphes-3.0.10/libDelphes.so .

The repository contains: Linkdef.h, Makefile, nTuplerCode.cpp, nTuplerCode.h and nTupler.cpp.
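
With the links in place, the code should build via the Makefile listed above (an assumption based on the file list; check the repository if the build is driven differently):

make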

To use the nTupler Code:

cd Batch

Background Samples

The background samples for our cut-and-count search in the Stau-Coannihilation models are top-antitop + jets (TTbar), single top production (TopJets), W+jets and Z+jets (BosonJets), and diboson production (DiBoson).

The background samples are stored at different places.

One location is red-gridftp11.unl.edu; the background samples there were produced for the Snowmass studies. There are several run scripts in Batch/, e.g. runTTBAR, to run the nTupler on these Snowmass samples:

./runTTBAR
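
The exact contents of these run scripts are not reproduced here; schematically they loop over the remote Snowmass files and call the nTupler on each. The sketch below only illustrates that pattern; the file list and the nTupler arguments are assumptions, see the actual runTTBAR script for the real version:

# illustration only: ttbar_files.txt and the nTupler argument order are guesses
for f in $(cat ttbar_files.txt); do
    ../nTupler "$f" Output/ttbar/$(basename "$f")
done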

The other location is on EOS at CERN.

To see the samples from lxplus:

eoscms ls /eos/cms/store/group/phys_higgs/upgrade/PhaseII/Configuration4v2/140PileUp/

The runEOS_* scripts are provided for these samples.
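
To inspect or copy single input files by hand (outside the runEOS_* scripts), the CMS EOS instance can also be reached via xrootd; the <sample> and <file> names below are placeholders:

xrdfs eoscms.cern.ch ls /eos/cms/store/group/phys_higgs/upgrade/PhaseII/Configuration4v2/140PileUp/

xrdcp root://eoscms.cern.ch//eos/cms/store/group/phys_higgs/upgrade/PhaseII/Configuration4v2/140PileUp/<sample>/<file>.root /tmp/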

Before running these scripts, make sure that the output directories exist. Otherwise they have to be created first, e.g.

mkdir -p Output/140PU/ttbar/0-600

Signal Samples

We have three different Stau-Coannihilation models. For details look at http://arxiv.org/abs/1307.8076

| Model name | Mass parameter / GeV | Stop1 mass / GeV | NLO xsec (pp, 14 TeV) / pb |
| STC5 | 500 | 416 | 3.3 |
| STC6 | 600 | 527 | 2.0 |
| STC8 | 800 | 736 | 1.6 |

For running:

./runSTC

Make sure that the output directories exist.
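
For example, assuming the same Output directory layout as for the backgrounds above (a guess; the exact subdirectories should be taken from the runSTC script), the signal output directories can be created with:

for model in STC5 STC6 STC8; do mkdir -p Output/140PU/$model; done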

Analysis Code

cd nTupler_topness/tupleAnalyzer

./mkLinks.sh

This script creates links to the storage locations of the nTuples (a rough sketch of what it does follows the list):

- Snowmass: /scratch/hh/dust/naf/cms/user/kruecker/nTuples/

- ECFA / CMS Upgrade Studies: /afs/desy.de/user/t/trippk/dust/snowmass/NewTupler_topness/Batch/Output
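
mkLinks.sh itself is not reproduced here; given the two locations above it presumably creates links along the lines of the sketch below (the link names are made up, the real ones are defined in the script):

ln -s /scratch/hh/dust/naf/cms/user/kruecker/nTuples/ snowmassTuples

ln -s /afs/desy.de/user/t/trippk/dust/snowmass/NewTupler_topness/Batch/Output ecfaTuples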

We investigate three toy searches:

| Search | Final state | Code name | Inspired by |
| Stop | Full-hadronic (no lepton) | AtlasH | ATLAS-CONF-2013-001 |
| Stop | Single lepton | SingleS | ATLAS-CONF-2013-001 |
| EWKino | Two same-sign leptons | EWKino | CMS-PAS-SUS-12-022 |

To start the analysis code just do:

ini ROOT534

./all.sh

In all.sh you have three commands:

./go.sh AtlasH

./go.sh SingleS

./go.sh EWKino

This is also the syntax for running a single analysis.

In go.sh you have to choose which background/signal samples and pileup scenarios you would like to use for your analysis. For our STC scenarios we have:

" NoPU_BosonJets 50PU_BosonJets 140PU_BosonJets NoPU_DiBoson 50PU_DiBoson 140PU_DiBoson NoPU_TopJets 50PU_TopJets 140PU_TopJets NoPU_TTbar 50PU_TTbar 140PU_TTbar NoPU_TDR4 50PU_TDR4 140PU_TDR4 NoPU_TDR5 50PU_TDR5 140PU_TDR5 NoPU_TDR6 50PU_TDR6 140PU_TDR6 NoPU_TDR8 50PU_TDR8 140PU_TDR8 "

In the runReader.py script the cross sections for the different samples have to be added (they are already included).

For the background samples, the cross sections are provided on the Snowmass homepage http://www.snowmass2013.org/tiki-index.php?page=Energy_Frontier_FastSimulation

The LO cross sections for the signal samples are taken as the mean of the cross sections reported by Pythia 8.176. The NLO cross sections were calculated with Prospino (already included in runReader.py).

The call chain is: all.sh -> go.sh -> readersScript.sh -> runReader.py -> readerAtlasH.C / readerSingleS.C / readerEWKino.C

The analysis code writes .root and .txt files as output.

./cutflow.py

gives a cutflow table of the cut-and-count analysis.
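
Assuming cutflow.py writes the table to standard output, it can be kept in a file while still being shown on screen:

./cutflow.py | tee cutflow_table.txt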
