Our Delphes Upgrades

This is a list of all the modules, readers, classes and external packages we have changed in order to produce an "upgraded" version of Delphes. Our Delphes has been designed and tested to compile correctly only inside CMSSW_6_X (in particular, we are using SLHC versions). Everything should be installed under $CMSSW_BASE/src following the instructions in this README.

Class Objects

  • classes/DelphesClasses.h and classes/DelphesClasses.cc :
  1. Electron, Muon and Photon objects : added isolation information as calculated in PF (charged, neutral, charged from PU, all particles and combined isolation).
  2. Jet : added substructure information, pileup jet ID and Q/G likelihood input variables.
  3. Candidate : updated in order to fill the new variables correctly inside the TreeWriter module. This object is used for internal purposes only, since it is the data format used by each module.

Fastjet and Contrib tools

  • external/fastjet : has been removed, and a standalone recipe for installing the fastjet and contrib packages has been added to this README. Having the possibility to use contrib classes in Delphes is useful for substructure analysis and pileup mitigation tools. The Makefile has been modified to accommodate this new structure of the external packages.

External Packages

  • external/MinBiasProduction/genMiniBias_14TeV.cpp : code used to generate Pythia8 Minimum Bias events. It is compiled by the new version of the Delphes Makefile. The output file of this code should be converted into a binary format usable by the PileUpMerger module, via converters/hepmc2pileup.cpp or converters/stdhep2pileup.cpp.

  • external/PUPPI : the package has been re-designed in order to keep the code as close as possible to the official one in CMSSW.
  1. external/PUPPI/RecoObj.hh : particle id convention moved to the same one used in the official code.
  2. external/PUPPI/puppiParticle.hh : new class, introduced for internal use only, in the puppi algorithm. It stores, for each particle, the metric value evaluated by each selected algorithm and the location of the particle in the original collection.
  3. external/PUPPI/puppiAlgoBin.hh : container for the parameters of each puppi algorithm. This object is built by reading the information parsed from the Delphes card.
  4. external/PUPPI/puppiCleanContainer.hh and external/PUPPI/puppiCleanContainer.cc : the object which runs the algorithm and gives back a new list of particles and weights. This code has been changed extensively with respect to the version by Seth Zenz. It can take into account more than one algorithm for the same eta bin, combining the weights, with different parameters for each alternative version inside the same eta region.
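As an illustration of the weight combination described in point 4, a minimal sketch in Python; the combination rule shown (a plain product of the per-algorithm weights) is an assumption, the actual rule lives in puppiCleanContainer.cc:

```python
def combine_puppi_weights(weights):
    """Combine the per-algorithm puppi weights computed for one particle.

    Hypothetical combination rule: multiply the weights of all the
    algorithms configured for the eta bin the particle falls in. An
    empty list (no algorithm applies) leaves the particle untouched.
    """
    w = 1.0
    for wi in weights:
        w *= wi
    return w
```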

Modules

  • modules/BTagging.cc : for each jet in an input collection, evaluates the flavour by taking all the partons and gluons matched inside a dR cone with respect to the jet axis. Different b-tagging flags and flavour values are stored, taking into account different possibilities, as done in the CMSSW jetFlavour tool. Example: loop on the partons inside the jet with status = 3; if the parton has no parton daughters, store the jet flavour as the flavour of the heaviest parton, or of the highest-pT one, or of the one closest to the jet axis.
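The matching logic above can be sketched as follows; the dict-based particle representation and the tie-breaking modes are illustrative, not the actual Delphes Candidate interface:

```python
from math import hypot, pi

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance in the (eta, phi) plane, with phi wrap-around."""
    dphi = abs(phi1 - phi2)
    if dphi > pi:
        dphi = 2 * pi - dphi
    return hypot(eta1 - eta2, dphi)

def jet_flavour(jet, partons, cone=0.5, mode="heaviest"):
    """Assign a flavour to a jet from the partons matched in a dR cone."""
    matched = [p for p in partons
               if delta_r(jet["eta"], jet["phi"], p["eta"], p["phi"]) < cone]
    if not matched:
        return 0
    if mode == "heaviest":
        # b (5) beats c (4) beats light quarks; gluons count as light here
        key = lambda p: abs(p["pdgId"]) if abs(p["pdgId"]) < 6 else 0
    elif mode == "highest_pt":
        key = lambda p: p["pt"]
    else:  # closest to the jet axis
        key = lambda p: -delta_r(jet["eta"], jet["phi"], p["eta"], p["phi"])
    return max(matched, key=key)["pdgId"]
```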

  • modules/FastJetFinder.cc : used to calculate Rho for an event or to cluster particles into a jet collection. Cleanup of the code has been done; added the evaluation of Rho by averaging the energy density in a fixed (eta,phi) grid, as an alternative to the default method, which evaluates Rho by clustering the event with the kT algorithm. When jets are clustered, the new contrib package is used to fill substructure information such as: number of subjets, kinematics of the subjets and of the groomed jets after trimming, pruning or soft drop, and N-subjettiness. Up to now the parameters are hardcoded in the code, but fixed to what is suggested by the POG, and these observables are calculated only for jets with pT > 200 GeV.
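The grid-based Rho evaluation can be sketched like this; the binning choices and the use of the median over cells are assumptions (the module itself relies on fastjet for the real estimate):

```python
import math
from statistics import median

def grid_rho(particles, eta_max=5.0, cell_size=0.5):
    """Estimate the event energy density rho from a fixed (eta, phi) grid.

    Sum the particle pT in each grid cell, divide by the cell area, and
    take the median over all cells (including the empty ones), which makes
    the estimate robust against a few hard jets.
    Each particle is a (pt, eta, phi) tuple.
    """
    n_eta = int(2 * eta_max / cell_size)
    n_phi = int(2 * math.pi / cell_size)
    area = cell_size * (2 * math.pi / n_phi)
    cells = [0.0] * (n_eta * n_phi)
    for pt, eta, phi in particles:
        if abs(eta) >= eta_max:
            continue
        ieta = int((eta + eta_max) / cell_size)
        iphi = int((phi % (2 * math.pi)) / (2 * math.pi) * n_phi)
        cells[ieta * n_phi + iphi] += pt
    return median(cells) / area
```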

  • modules/Isolation.cc : stores the isolation value split by particle type: energy in the cone due to all particles, due to neutrals only (hadrons + photons), due to charged hadrons, and due to charged particles from PU. Useful for tuning the isolation cut a posteriori (offline).
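As an example of the a-posteriori tuning that the split storage allows, a delta-beta-style combined isolation can be recomputed offline from the stored components; the 0.5 factor is the usual CMS convention, used here as an assumption:

```python
def combined_isolation(charged, neutral, charged_pu, dbeta=0.5):
    """Recompute a PF-style combined isolation from the stored cone sums:
    charged hadrons, neutrals (hadrons + photons) and charged particles
    from pileup. The pileup contamination of the neutral component is
    estimated as dbeta * charged_pu and clamped at zero.
    """
    return charged + max(0.0, neutral - dbeta * charged_pu)
```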

  • modules/JetPileUpSubtractor.cc : this module can be used in two different ways. When the flag doSafe4VAreaSubtraction is false, it takes as input a jet collection and a Rho value and applies the L1 pileup correction. On the other hand, when you want to apply safe subtraction, two things are done: Rho is re-evaluated by clustering all the particles in the event with an algorithm parsed from the card; after that, the whole event is re-clustered with another algorithm (commonly the one used to build up the jet collection) and the safe subtraction is applied at each stage of the clustering procedure. This is a sort of dynamic L1 correction that has to be done during the clustering stage.
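The simple branch (doSafe4VAreaSubtraction = false) amounts to the standard area-based subtraction, sketched here for the scalar pT case only (the module works on the full four-vector):

```python
def l1_correct(jet_pt, jet_area, rho):
    """Area-based L1 pileup subtraction: pT -> pT - rho * A, clamped at
    zero so that low-pT jets are not pushed to negative momenta.
    """
    return max(0.0, jet_pt - rho * jet_area)
```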

  • modules/PileUpJetID.cc : takes as input a jet collection and evaluates the pileup jet ID and Q/G likelihood input variables. Two ways are available: use the jet constituents, or match the particles in the event to the jet inside a fixed cone of chosen dR. A vertex collection is necessary to properly evaluate the d0, dZ and betaStar values. By default no jets are cut: only the input variables and a flag are stored for each jet; the flag is evaluated as a function of jet pT, eta and the tightness of the chosen working point. The cut values used to set the pileup jet ID flag are stored in the card. The CMSSW code for the input variable evaluation is located here: InputVariables, while the cut definition is located here: PileUpFlag.
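One of the pileup jet ID inputs, betaStar, can be sketched as the fraction of the jet's charged pT carried by tracks associated to pileup vertices; the (pt, vertex index) pair representation is illustrative, not the Delphes Candidate interface:

```python
def beta_star(charged_constituents, pv_index=0):
    """Fraction of the jet's charged pT from tracks not associated to the
    primary vertex. Each constituent is a (pt, vertex_index) pair.
    """
    total = sum(pt for pt, _ in charged_constituents)
    if total == 0.0:
        return 0.0
    pu = sum(pt for pt, vtx in charged_constituents if vtx != pv_index)
    return pu / total
```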

  • modules/RunPUPPI.cc : module used to run the puppi procedure on top of an input particle collection. Changed extensively in order to take into account many more input parameters. Once PUPPI has run, an output collection of Candidates is produced with a 4-momentum corrected by the PUPPI weights.
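The weight application itself is straightforward; a sketch, assuming a plain (px, py, pz, E) tuple for the four-momentum:

```python
def apply_puppi_weight(p4, w):
    """Rescale a particle four-momentum by its puppi weight w in [0, 1].
    Particles with w = 0 end up with a null four-momentum and are
    effectively discarded from the downstream clustering.
    """
    px, py, pz, e = p4
    return (w * px, w * py, w * pz, w * e)
```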

  • modules/TreeWriter.cc : changed in order to store in the output tree all the new variables added to the Delphes data format. This class is used to migrate information from the existing particle collections, stored in the event as Candidate objects, to dedicated classes such as Electron, Photon, Muon, Missing Energy, Rho, Jet, Track, etc.

Readers

  • readers/DelphesPythia8.cpp : code modified in order to take as input an LHE file, apply possible cuts defined by the user, run Pythia8 on top of the hard event and then run the Delphes simulation directly, in one shot, using a given card. The information of the LHE file is dumped into the output ROOT file and is available a posteriori. An example card with all the changes is committed here: JetStudies_Phase_I_50PileUp.tcl.
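A hypothetical LHE-level mjj filter in the spirit of the user-defined cuts (see the mjjcut option below); the pairing strategy is an assumption, not the reader's actual logic:

```python
import math
from itertools import combinations

def mjj(p1, p2):
    """Invariant mass of two partons given as (px, py, pz, E) tuples."""
    px = p1[0] + p2[0]; py = p1[1] + p2[1]
    pz = p1[2] + p2[2]; e = p1[3] + p2[3]
    m2 = e * e - px * px - py * py - pz * pz
    return math.sqrt(max(0.0, m2))

def pass_mjj_cut(outgoing_partons, cut):
    """Keep the event if any pair of outgoing partons exceeds the mass cut."""
    return any(mjj(a, b) > cut for a, b in combinations(outgoing_partons, 2))
```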

Submit Jobs

  • There is a generic script for DelphesPythia8 submission here on github: SubmissionScript. The only caveat is that you must mount EOS locally before running the script, via:
         eosmount $PWD/eos
       

  • This is an example of how you can run it from the Delphes directory:
    python scripts/makeDelphesPythia8Jobs.py --inputdir /store/caf/user/rgerosa/TPSAMPLES_14TEV/LHE_FILES/gen_TP_uvev_126/ --lheKey phamom.dat --workdir SS_EWK_uvev --configCard Cards/TP_CARDS/CMS_Phase_I_50PileUp_Tracker2p5.tcl --outputname outputtree --inputPUdir /store/caf/user/rgerosa/TPSAMPLES_14TEV/MINIBIAS_PYTHIA8/ --eosdir /store/caf/user/rgerosa/TPSAMPLES_14TEV/DELPHES_TREES --executable DelphesPythia8 --eventsPerJob 10000 --mjjcut 10 --filter 1 --queue 8nh --submit
        
  • Main Options:
  1. inputdir -> input folder on EOS containing the unzipped LHE files.
  2. lheKey -> string to match in order to pick up only the LHE files in the inputdir and its subdirectories.
  3. workdir -> name of the directory where the jobs will be created. The same name is used to create an output folder on EOS, so that existing files are not overwritten.
  4. configCard -> path to the Delphes card to be used (it will be copied and modified, so you can use a committed one).
  5. outputname -> name of the output ROOT file (just the stem of the name).
  6. inputPUdir -> directory on EOS where the PU files are stored.
  7. eosdir -> output directory path on EOS.
  8. executable -> name of the executable to run (DelphesPythia8).
  9. executableDumper -> name of the ntuple-dumper executable. If not specified, the full Delphes tree will be stored.
  10. njobmax -> maximum number of jobs to create and submit.
  11. eventsPerJob -> how many events you want per job.
  12. mjjcut -> cut on mjj at LHE level.
  13. filter -> filter out fully hadronic events at LHE level.
  14. submit -> submit the jobs instead of only creating them (first create, then submit).
  15. queue -> lxbatch queue to use.
  • It works in the following way: for each LHE file in the folder it creates a number of jobs according to how many events you want per job. The pileup file for each job is picked by throwing a uniform random number in (0, number of pileup files). This procedure is robust as long as we generate a large number of pileup files with ~100k events per file while each job runs over 10k events.
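The pileup-file choice described above can be sketched as (file names are placeholders):

```python
import random

def pick_pileup_file(pileup_files, rng=random):
    """Pick the pileup file for one job by throwing a uniform random
    index in [0, number of pileup files)."""
    return pileup_files[rng.randrange(len(pileup_files))]
```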

  • Pileup files and LHE files are copied to the node before execution.

-- PietroGovoni - 2014-11-21

Topic revision: r4 - 2014-12-05 - JasperLauwers