Getting and Running the HLT


Introduction

The HLT runs off RAW data files, specifically off the single FEDRawDataCollection data structure; it can NOT be run off RECO or AOD files, which do not keep this data structure. The HLT algorithms perform (regional) reconstruction as needed, and make trigger decisions.

The HLT is conceptually no different from any other CMSSW job: it is a series of modules written in C++ which are configured via a python config file. A CMSSW release can perform any standard workflow: there is no special HLT, Offline, or Analysis release. A CMSSW job (executable: cmsRun) is entirely defined by a cfg (configuration) file, which specifies the configuration of the C++ modules in the release.

The HLT is most similar to the CMSSW (offline) reconstruction job, i.e. it reconstructs a series of quantities in the detector starting from a RAW data file. However, a key difference is that there is a single reconstruction config for a given release (although, through the use of modifiers, this config can be altered for different data-taking scenarios as needed). This means the reconstruction config can be integrated into the release; it typically consists of thousands of python files which can be included as needed.

However, there is not a single HLT for a given release. This is because the menu evolves during data taking, and different menus exist for different needs (e.g. collisions vs cosmics vs special runs). Also, users may be developing their own paths and thus have different versions of the menu. While a snapshot of the HLT is put in the release for Monte Carlo production, the HLT must usually be taken from an external source. Currently we use ConfDB to manage and maintain the HLT configs and to swap paths between them.

This twiki describes how to access the HLT config, either the one in the release or one downloaded from ConfDB, and how to properly run it on samples. Each HLT menu is coupled to a specific release and global tag. While we try to maintain backwards compatibility (that is, newer releases can still run older menus), this is not guaranteed to work, and there is even less guarantee that the results will be sensible.

Trigger development for Run-3

This part of the wiki describes how to develop new triggers and test them. If you want your development integrated into the online CMS trigger, pay attention to the workflow for achieving this goal, documented in detail here.

Important Note : Run-3 menus (which are all menus in 11_X or higher unless specifically noted in the name) must be customised when running on Run-2 data.

ConfDB Layout

There are several folders in ConfDB containing menus.

  • dev: this is where the latest integrated menus are located. Every major CMSSW version has its own directory (/dev/CMSSW_13_N_0/). Each directory features the combined menu (HLT) and its subtables (GRun, HIon, PIon, PRef), which are derived from the combined menu. The GRun subtable corresponds to the menu for standard pp data-taking.
  • frozen: this contains snapshots of GRun (and other menus as appropriate) corresponding to major releases of the HLT menu (e.g. 2018 v4.1 etc).
  • online: this contains the same menus as frozen but with the changes necessary to run online. This is mostly stripping out the MC Paths and adding online features such as the internal HLT DQM histograms. These menus are then copied over to the P5 database to be used for data-taking.
  • users: this is where CMS members develop their HLT paths prior to integration into the combined menu

In general, the latest GRun menu of the latest release is the starting point for development.

Getting the Menu From ConfDB

There are two tools to download a menu from ConfDB: hltConfigFromDB and hltGetConfiguration.

They both use the same code under the hood; however, hltConfigFromDB is more of a direct dump, while hltGetConfiguration customises the menu to make it appropriate for offline analysis work. Below, we focus on hltGetConfiguration, which is the tool we expect most users to use.

hltGetConfiguration

General Usage

This tool downloads an HLT menu from ConfDB, producing a python cfg file with proper customisations for re-running the HLT "offline". The main things to specify are the menu and the global tag.

hltGetConfiguration [MENU LOCATION IN CONFDB] \
   --globaltag [NAME OF GLOBALTAG] \
   --mc # for real data, replace --mc with --data

To remove the HLT prescales, use --unprescale.

For the output options, you can use one of the following four options:

  • --output all : default, all output modules are retained
  • --output none : all output modules are removed
  • --output minimal : all output modules are removed, a single one is added which just keeps trigger-related outputs (e.g. HLT results, L1T objects); see the sketch after this list
  • --output full : all output modules are removed, a single one is added which saves everything
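For illustration, the single output module added by "--output minimal" looks roughly like the following (a sketch: the exact module label and list of kept products are generated by hltGetConfiguration and may differ between menus):

process.hltOutputMinimal = cms.OutputModule( "PoolOutputModule",
    fileName = cms.untracked.string( "output.root" ),
    outputCommands = cms.untracked.vstring(
        'drop *',
        'keep edmTriggerResults_*_*_*',    # HLT path decisions
        'keep triggerTriggerEvent_*_*_*',  # HLT trigger-object summary
        'keep *_hltGtStage2Digis_*_*'      # unpacked L1T objects
    )
)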

Important Note : The "--output full" option is currently broken (see cmssw#37207 and CMSHLT-3147). You can fix it by appending the following lines to your python cfg file:

process.hltOutputFull.outputCommands += [
    'drop *_hltHbherecoLegacy_*_*',
    'drop *_hlt*Pixel*SoA*_*_*',
]

User-defined customisation functions can also be appended to the menu by adding

--customise [CUSTOMISATION_1],[CUSTOMISATION_2]
For example, one can use such a customisation to further customise the EventContent of the output for one's own tests.
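For instance, a customisation function extending the EventContent of the minimal output could look like the following (a minimal sketch: the file name and the module label hltMyProducer are hypothetical):

# e.g. in HLTrigger/Configuration/python/customizeHLTforMyStudy.py (hypothetical file)
import FWCore.ParameterSet.Config as cms

def customiseEventContent(process):
    # also keep the products of an additional (hypothetical) module in the minimal output
    if hasattr(process, 'hltOutputMinimal'):
        process.hltOutputMinimal.outputCommands += ['keep *_hltMyProducer_*_*']
    return process

which would then be applied via --customise HLTrigger/Configuration/customizeHLTforMyStudy.customiseEventContent.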

Important Note : for running over 2018 data in CMSSW_11_1+, it is mandatory to apply the following customisation (needed to adapt to the un-upgraded HCAL barrel and the old way of calibrating pixel clusters).

--customise HLTrigger/Configuration/customizeHLTforCMSSW.customiseFor2018Input

Using Services/PSets/ESModules from another Menu

While a given HLT path is self-contained, actually running it requires the Services, ESModules, PSets, and ESSources globally attached to the menu. It is currently difficult (although this is changing!) to determine the exact subset of modules a path requires without trial and error, so usually all existing ones are imported. To save you importing and maintaining all these files in your python file, hltGetConfiguration can download them from another area and put them in a separate setup file:

--setup /dev/CMSSW_14_0_0/GRun

would take all the Services, ESModules, PSets and ESSources from the CMSSW_14_0_0 GRun menu and put them into a separate config file, which is then included in the HLT config file.

Getting a configuration fragment for including in another config (e.g. cmsDriver.py)

It can be useful to have the HLT in a format in which it can be included in another config, rather than downloading a full standalone config. For example, it can be easier to write a small config file which loads the menu from such a configuration fragment and then applies any needed modifications than to directly edit a 100k-line python file.

--cff

Config file fragments are best placed in a Package/Subpackage/python directory. By doing this, they are automatically included in the search path python uses to look for modules, which ensures consistent loading of the config file fragment. The fragment will also be automatically shipped with CRAB and similar tools. It is best to put them in HLTrigger/Configuration/python, as this makes them easier to use with cmsDriver.

So first check out the package and run scram b, which will set up the appropriate symbolic links so that the directory is added to the python include path. Note that this only needs to be done once after checking out the package: you do not need to run scram b again after adding a file to that directory, although doing so has the advantage of compiling your python fragment to check that it is valid.

git cms-addpkg HLTrigger/Configuration
scram b -j 4

Then to get the GRun menu you can do

hltGetConfiguration /dev/CMSSW_14_0_0/GRun \
   --globaltag auto:phase1_2024_realistic \
   --mc \
   --unprescale \
   --cff > "${CMSSW_BASE}"/src/HLTrigger/Configuration/python/HLT_User_cff.py
scram b -j 4

Then, to include it in your config, simply load it as a normal python module via

process.load("HLTrigger.Configuration.HLT_User_cff")

To use this in cmsDriver use the option

--step=HLT:User

Note that HLT:{menu} means "use the menu in HLTrigger/Configuration/python/HLT_{menu}_cff.py". You can use any name other than User if you wish, but the menu must be located in that area and its file name must start with HLT_ and end with _cff.py.

Note: while running HLT plus RECO/PAT/AOD in the same cmsDriver command technically works, it has not been validated for Physics usage. The issue here is the loading of possibly conflicting ES module configurations by HLT and by RECO, where one overwrites the other.

Proxy Support for Downloading Menus Outside of CERN

hltGetConfiguration has been upgraded to be able to use a socks5 proxy like the GUI does. This feature is available since CMSSW_12_2_0_pre2, CMSSW_12_1_1, and CMSSW_12_0_4. If you are using older releases and need this new feature, please follow ConfDBRun3 for further details.

Once hltGetConfiguration proxy support has been added via these recipes, you can enable it via

--dbproxy
which uses the defaults of hostname "localhost" and port "8080". To adjust these, simply do

--dbproxy --dbproxyhost localhost --dbproxyport 8080

replacing localhost and 8080 with your hostname and port of choice. Finally, as a reminder, to open a socks proxy simply do

ssh -f -N -D 8080 guest@lxplus.cern.ch

Note that this currently works only for the offline database; support for the other databases is being added (see PR: 40436), and this page will be updated once that is complete.

Setup for Running on MC and Data

This section provides examples of how to run the HLT menus maintained by TSG in recent CMSSW releases.

Before testing, please read the following important notes.

General Notes.

  • This section only aims to provide technical examples. Recipes for physics studies (to estimate HLT rates, HLT timing, and trigger efficiencies) are provided by the STEAM subgroup of TSG.
  • TSG is currently developing HLT menus for the 13_3_X release cycle. This means that updates to the menus in ConfDB have to be done using HLT configurations for CMSSW_13_3_X in ConfDB (check the latest HLT combined table in ConfDB for the exact name of the correct 'release template'). HLT menus in ConfDB for earlier cycles (e.g. 13_2_X) are no longer under development.
  • At the same time, the latest CMSSW release cycle is 14_0_X. This is the CMSSW release cycle open for development ("pull requests", PRs, to CMSSW must first go to the master branch, which presently corresponds to the CMSSW_14_0_X branch). PRs that are necessary for HLT-menu development must be backported by HLT developers to the release cycle presently used for HLT-menu development (after being merged in the master branch).
  • For testing a HLT menu in a given CMSSW release cycle (e.g. CMSSW_X_Y_*), the recommendation is to use the latest release available for that cycle (e.g. the highest Z in CMSSW_X_Y_Z), unless there are specific reasons not to do so.
  • To correctly rerun the HLT reconstruction, it is important to be consistent in choosing (1) the HLT menu, (2) the CMSSW release, (3) the input EDM files, (4) the GlobalTag, and possibly (5) the L1T menu (when not using the one from the GlobalTag). Inconsistent choices can cause the job to fail at runtime (or at worst, to silently return incorrect results). Dependencies between these 5 items are not always strict, so recommendations can vary depending on the use case. A few general guidelines are listed below.
    • The HLT menu must be compatible with the CMSSW release. In general, a HLT menu based on the ConfDB release template "CMSSW_X_Y_Z" will work when used with the release CMSSW_X_Y_Z, might work in "higher" releases (esp. those not much 'higher' than CMSSW_X_Y_Z), and will likely not work in "lower" releases.
    • The HLT menu must be compatible with the L1T menu.
    • The GlobalTag must be compatible with the CMSSW release.
    • The GlobalTag should be compatible with the EDM input file(s). For Run-2 and Run-3 data, it is currently sufficient to use the latest HLT online GT prepared by the AlCaDB group for a given CMSSW release (examples in the subsections below). For MC, the choice of GT might be less obvious depending on the MC samples one needs to analyse; for example, it is important to make sure that the MC GT contains a BeamSpot tag (used for the BeamSpot reconstruction) which is consistent with the BeamSpot parameters used in the GEN-SIM step of the MC sample used as input.

Note on not using auto: GlobalTags on real data.

  • When testing on real data, the recommendation is to use the latest HLT online GT of the relevant CMSSW release cycle.
  • Currently, this (i.e. the latest HLT online GT) can be selected only using the explicit name of the GT, e.g. 132X_dataRun3_HLT_v2.
  • CMSSW provides the option to choose GTs via the so-called "auto:" keyword, e.g. auto:run3_hlt. The important point is that currently, for HLT (or Prompt) GTs, these auto:* aliases do not point to the HLT (or Prompt) GTs themselves: they point to a "frozen" version of those GTs. Here, "frozen" means that those GTs are not updated with new payloads, so all time-dependent calibrations are frozen at the values they had around the time the frozen GT was created.
  • The recommendation is not to use auto:* GTs when running on real data, because using a frozen GT on data taken after that frozen GT was created can result in using outdated conditions.

Note on global tag compatibility for MC

  • There are some records that must be identical to those used when producing the RAW data the HLT is running on; typically this is because the inverse of some correction applied to the RAW data is used in the simulation and/or digitisation of the MC events.
    • A classic example is the ECAL laser corrections: their inverse is applied to the MC digis when making the RAW. If a different set of corrections is applied at the HLT, the results will be biased, as the corrections are no longer the inverse of those applied when making the RAW. A similar thing happens with the pixel charge.
  • Therefore it is absolutely essential to compare the GT you are currently using with the GT the RAW was produced with, to ensure that they are compatible.
  • Only an expert can truly tell you if they are compatible, but if the following tags change, they are NOT compatible (unless the tags are just renamings of the same payload). Note that this list is not exhaustive; if you know of any other tags that must not change, please add them to this list.
    • EcalLaserAPDPNRatiosRcd
    • SiPixel*TemplateRcd
    • SiPixelGenErrorDBObjectRcd
    • SiStripApvGainSimRcd
    • SiStripLorentzAngleSimRcd

Notes on the re-emulation of the L1 Trigger.

  • hltGetConfiguration provides the option to include the re-emulation of the L1Trigger when rerunning a HLT menu.
  • This is controlled by the options --l1-emulator [X] (enables the L1T re-emulation of type [X]) and --l1 [Y] (loads the L1T menu tag [Y] from the conditions database, instead of using the L1T-menu tag available in the GT). One can use the option --l1Xml [XML FILE] instead of --l1 [Y] to specify the L1T menu via a local .xml file, rather than a payload in the conditions database.
  • The L1 Trigger can be re-emulated at different levels, and this is controlled by the string [X] in --l1-emulator [X]. More information on the L1T re-emulation can be found in this Twiki maintained by the L1T group and this presentation from the 2021 L1T tutorial.
  • When --l1-emulator [X] is specified, it is also necessary to specify --eras Run3 (for technical reasons). When doing so, any additional customisations given to hltGetConfiguration via --customise should not make use of "era modifiers". In case of doubts or problems related to this, please ask in the "HLT User Support" channel of the "TSG" Team on Mattermost.
  • The addition of the L1T re-emulation will increase the execution time of the job. For the correct recipe to make measurements of HLT timing with the latest HLT menus, please contact the STEAM subgroup.

Notes on GPU offloading.

  • The latest Run-3 HLT menus include modules for offloading parts of the HLT reconstruction to GPUs (starting from /dev/CMSSW_12_3_0/GRun/V78).
  • GPU offloading can be controlled from the configuration file via the object process.options.accelerators of type cms.untracked.vstring.
    • Currently, options.accelerators is not explicitly set in the HLT menus maintained by TSG in ConfDB. When absent, this parameter defaults to ['*']. This means that cmsRun will offload to GPU if the machine has a GPU; otherwise, only the CPU will be used.
    • If options.accelerators = ['cpu'] is specified, cmsRun will not offload to GPU, even if the machine has a GPU.
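For example, to force CPU-only processing, one can append the following at the end of the HLT config file (a minimal sketch):

# restrict cmsRun to the CPU, even on a machine with a GPU
process.options.accelerators = ['cpu']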

CMSSW_14_0_X (presently used for HLT-menu development)

Setup:

cmsrel CMSSW_14_0_7_MULTIARCHS
cd CMSSW_14_0_7_MULTIARCHS/src
cmsenv
git cms-init
scram build -j 4

Example to run on events of a Run-3 2024 MC sample [*].

hltGetConfiguration /dev/CMSSW_14_0_0/GRun \
   --globaltag auto:phase1_2024_realistic \
   --mc \
   --unprescale \
   --output minimal \
   --max-events 100 \
   --input /store/mc/Run3Winter24Digi/TT_TuneCP5_13p6TeV_powheg-pythia8/GEN-SIM-RAW/133X_mcRun3_2024_realistic_v8-v2/80000/dc984f7f-2e54-48c4-8950-5daa848b6db9.root \
   --eras Run3 --l1-emulator uGT --l1 L1Menu_Collisions2024_v1_2_0_xml \
   > hltMC.py

cmsRun hltMC.py >& hltMC.log

Example to re-run the HLT on RAW data from 2023 pp-collisions runs [*].

hltGetConfiguration /dev/CMSSW_14_0_0/GRun \
   --globaltag 140X_dataRun3_HLT_for2024TSGStudies_v1 \
   --data \
   --unprescale \
   --output minimal \
   --max-events 100 \
   --eras Run3 --l1-emulator uGT --l1 L1Menu_Collisions2024_v1_2_0_xml \
   --input /store/data/Run2023D/EphemeralHLTPhysics0/RAW/v1/000/370/293/00000/2ef73d2a-1fb7-4dac-9961-149525f9e887.root \
   > hltData.py

cmsRun hltData.py >& hltData.log

[*] Notes.

  • For MC, the choice of the GT depends on the input sample. For 2023 MC samples, a GT for 2023 conditions should be used (examples: auto:phase1_2023_realistic, 133X_mcRun3_2023_realistic_v*). For 2024 MC samples, a GT for 2024 conditions should be used (examples: auto:phase1_2024_realistic, 133X_mcRun3_2024_realistic_v*). The name of the input dataset can help identify the conditions with which it was produced (e.g. /*/*mcRun3_2024*/*RAW* for RAW datasets of 2024 MC samples).

  • For Data, an online (or, HLT) GT must be used. This example uses 140X_dataRun3_HLT_for2024TSGStudies_v1, which is a special online GT recommended for 2024 trigger studies, as it uses the latest conditions for the HCAL-energy-response corrections, PF-hadron calibrations and HLT jet-energy-scale corrections (all the other conditions are the same as 140X_dataRun3_HLT_v1). For the latest settings to be used for HLT-rates and HLT-timing studies, please see here and/or contact the STEAM group.

CMSSW_13_3_X

Please use CMSSW_14_0_X to develop changes to the HLT menus maintained by TSG. The instructions for older release cycles will be removed in the near future.

Setup:

cmsrel CMSSW_13_3_1_patch1
cd CMSSW_13_3_1_patch1/src
cmsenv
git cms-init
scram build -j 4

Example to run on events of a Run-3 2023 MC RelVal sample [*].

hltGetConfiguration /dev/CMSSW_13_3_0/GRun \
   --globaltag auto:phase1_2023_realistic \
   --mc \
   --unprescale \
   --output minimal \
   --max-events 100 \
   --input /store/relval/CMSSW_13_3_0_pre4/RelValTTbar_14TeV/GEN-SIM-DIGI-RAW/133X_mcRun3_2023_realistic_v1_Standard_13_3_0_pre4-v1/2590000/4c47f6d7-9938-4c87-b795-ece3aa6d3d22.root \
   --eras Run3 --l1-emulator uGT --l1 L1Menu_Collisions2023_v1_3_0_xml \
   > hltMC.py

cmsRun hltMC.py >& hltMC.log

Example to re-run the HLT on RAW data from 2023 pp-collisions runs [*].

hltGetConfiguration /dev/CMSSW_13_3_0/GRun \
   --globaltag 133X_dataRun3_HLT_for2024TSGStudies_v1 \
   --data \
   --unprescale \
   --output minimal \
   --max-events 100 \
   --eras Run3 --l1-emulator uGT --l1 L1Menu_Collisions2023_v1_3_0_xml \
   --input /store/data/Run2023D/EphemeralHLTPhysics0/RAW/v1/000/370/293/00000/2ef73d2a-1fb7-4dac-9961-149525f9e887.root \
   > hltData.py

cmsRun hltData.py >& hltData.log

[*] Notes.

  • For MC, the choice of the GT depends on the input sample. For 2023 MC samples, a GT for 2023 conditions should be used (examples: auto:phase1_2023_realistic, 133X_mcRun3_2023_realistic_v*). For 2024 MC samples, a GT for 2024 conditions should be used (examples: auto:phase1_2024_realistic, 133X_mcRun3_2024_realistic_v*). The name of the input dataset can help identify the conditions with which it was produced (e.g. /*/*mcRun3_2024*/*RAW* for RAW datasets of 2024 MC samples).

  • For Data, an online (or, HLT) GT must be used. This example uses 133X_dataRun3_HLT_for2024TSGStudies_v1, which is a special online GT recommended for 2024 trigger studies, as it uses the latest conditions for the HCAL-energy-response corrections, PF-hadron calibrations and HLT jet-energy-scale corrections (all the other conditions are the same as 133X_dataRun3_HLT_v2). For the latest settings to be used for HLT-rates and HLT-timing studies, please see here and/or contact the STEAM group.

CMSSW_13_2_X

Please use CMSSW_14_0_X to develop changes to the HLT menus maintained by TSG. The instructions for older release cycles will be removed in the near future.

Setup:

cmsrel CMSSW_13_2_8
cd CMSSW_13_2_8/src
cmsenv
git cms-init
scram build -j 4

Example to run on Run3Winter23 MC ("forPU65"):

hltGetConfiguration /dev/CMSSW_13_2_0/GRun \
   --globaltag 126X_mcRun3_2023_forPU65_forBTag_v1 \
   --mc \
   --unprescale \
   --output minimal \
   --max-events 100 \
   --input /store/mc/Run3Winter23Digi/TT_TuneCP5_13p6TeV_powheg-pythia8/GEN-SIM-RAW/GTv4BTagDigi_126X_mcRun3_2023_forPU65_forBTag_v1_ext2-v2/60000/ae2ab9cc-64d7-40ff-a73f-bae4a7a17cf4.root \
   --eras Run3 --l1-emulator FullMC --l1 L1Menu_Collisions2023_v1_3_0_xml \
   > hltMC.py

cmsRun hltMC.py >& hltMC.log

Example to re-run the HLT on RAW data from 2023 HI-collisions runs:

hltGetConfiguration /dev/CMSSW_13_2_0/HIon \
   --globaltag 132X_dataRun3_HLT_v2 \
   --data \
   --unprescale \
   --output minimal \
   --max-events 100 \
   --eras Run3 --l1-emulator uGT --l1 L1Menu_CollisionsHeavyIons2023_v1_1_5_xml \
   --input /store/hidata/HIRun2023A/HIEphemeralHLTPhysics/RAW/v1/000/375/823/00000/cba08fac-450b-431d-88a0-aa9244776c42.root \
   > hltData.py

cmsRun hltData.py >& hltData.log

CMSSW_13_0_X

Please use CMSSW_14_0_X to develop changes to the HLT menus maintained by TSG. The instructions for older release cycles will be removed in the near future.

Setup:

cmsrel CMSSW_13_0_14
cd CMSSW_13_0_14/src
cmsenv
git cms-init
git cms-merge-topic silviodonato:customizeHLTFor2023 # to download extra customisation functions for offline studies
scram build -j 4

Note : the latest GRun menu is compatible with the latest 2023 (v1.3.0) L1T menu, and incompatible with the 2022 L1T menu(s). The last GRun menu compatible with the 2022 L1T menu is /dev/CMSSW_13_0_0/GRun/V24. If you need to run the latest GRun menu using the (last) 2022 L1T menu, you can use the customisation function customizeHLTFor2022L1TMenu described in this README (this will undo the L1T-seed changes that broke backward compatibility). Similarly, one can use the customisation function customizeHLTFor2023L1TMenu_v1_0_0 to make the GRun menu compatible with the first L1T menu of 2023, i.e. "2023-v1_0_0".

Example to run on Run3Winter23 MC ("forPU65"):

hltGetConfiguration /dev/CMSSW_13_0_0/GRun \
   --globaltag 126X_mcRun3_2023_forPU65_forBTag_v1 \
   --mc \
   --unprescale \
   --output minimal \
   --max-events 100 \
   --input /store/mc/Run3Winter23Digi/TT_TuneCP5_13p6TeV_powheg-pythia8/GEN-SIM-RAW/GTv4BTagDigi_126X_mcRun3_2023_forPU65_forBTag_v1_ext2-v2/60000/ae2ab9cc-64d7-40ff-a73f-bae4a7a17cf4.root \
   --eras Run3 --l1-emulator FullMC --l1 L1Menu_Collisions2023_v1_3_0_xml \
   > hltMC.py

cmsRun hltMC.py &> hltMC.log

Example to re-run the HLT on RAW data from 2023 pp-collisions runs:

hltGetConfiguration /dev/CMSSW_13_0_0/GRun \
   --globaltag 130X_dataRun3_HLT_v2 \
   --data \
   --unprescale \
   --output minimal \
   --max-events 100 \
   --eras Run3 --l1-emulator uGT --l1 L1Menu_Collisions2023_v1_3_0_xml \
   --input /store/data/Run2023D/EphemeralHLTPhysics0/RAW/v1/000/370/293/00000/2ef73d2a-1fb7-4dac-9961-149525f9e887.root \
   > hltData.py

cmsRun hltData.py &> hltData.log

Developing a Menu in ConfDB

All paths must ultimately be added to ConfDB in order to be included in the HLT menu. The easiest way to do this is by creating a new menu in your ConfDB area (/users/, create it if it does not exist) and importing the path you wish to start from from the latest GRun menu. You can optionally import all the Services, PSets, ESModules and ESSources into your new menu as well, but you may wish to simply retrieve them from the GRun menu using the "--setup" feature of hltGetConfiguration.

It is likely that your new path may require new C++ code. CMS code policy is that all code must be integrated into the development release (currently 13_3_X) before being backported as needed. There is also a no-change policy for the RECO output of older releases, so care must be taken with any backports. In practice, it is unlikely you will need to backport to older releases at this time, as the release for data taking is still under development.

You may find it easier to test in older releases, such as 11_3_X, which offer a more stable base, but ultimately the trigger must be tested and run in the 13_3_X series.

When changing C++ code, particular care must be taken to preserve the older behaviour if at all possible; that is, it is desirable that an existing config, when run with your patch, gives the same output.

An important issue arises when the C++ code requires changes to the HLT configuration template, either through the addition of a new module or through adding/removing parameters of existing C++ modules.

ConfDB depends on the HLT configuration template of a given release to tell it which C++ modules exist and which parameters (along with their default values) exist for a given module. If a module does not exist in the HLT configuration template it can not be added, and if a parameter of a module does not exist in the HLT configuration template it can not be set. When migrating between templates, ConfDB will automatically remove parameters/modules which have been removed from the release, and automatically add any new parameters to all module instances with their values set to the defaults.

ConfDB HLT configuration templates are generated using the fillDescriptions() method. Any module used at the HLT should define a fillDescriptions() method, and all parameters should have default values; otherwise ConfDB will be unable to see them. This is mandatory for new modules. Certain legacy modules are instead parsed from their "_cfi.py" files; when adding parameters to one of these, please consider adding a fillDescriptions() first, and only if this is not possible add the new parameters to the _cfi.py file. Modules should not change behaviour depending on whether a given parameter is present or not; if such behaviour is needed, it should be achieved via the parameter's default value. Finally, when adding new parameters, please set the default values in such a way that the old behaviour of the module is reproduced.

Using your new or modified modules with a changed interface is tricky: the code must be integrated into a release before a ConfDB template can be parsed from it. Therefore, for much of your development you can not use ConfDB to add your new module or alter your parameters. For adding a new module type, we suggest making a dummy placeholder module in your config with the correct name but an arbitrary module type. You can then override this in your downloaded HLT config, e.g.

process.myNewModule = cms.EDProducer("MyNewModule",....)

For the case of added parameters, you should simply set them in your config, such as:

process.existingModule.newParameter = cms.string("newParamValue")

Place these customisations manually at the end of your config or, better yet, define a customisation function which does this and include it via the "--customise" option of hltGetConfiguration.
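Such a customisation function could look like the following (a sketch: the module labels and parameters are the hypothetical ones from the snippets above):

import FWCore.ParameterSet.Config as cms

def customiseForMyDevelopment(process):
    # replace the dummy placeholder with the real type of the new module
    process.myNewModule = cms.EDProducer('MyNewModule',
        src = cms.InputTag('rawDataCollector')  # hypothetical parameter
    )
    # set a parameter that does not yet exist in the ConfDB template
    process.existingModule.newParameter = cms.string('newParamValue')
    return process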

Note that this is only for testing and development until your new code is parsed into ConfDB; the final config submitted to STORM must be fully self-contained.

Submission for integration into CMSSW (CMSHLT-JIRA tickets)

Follow the instructions given here to get your paths integrated in the HLT menu. Ultimately the menu must be integrated in the latest combined menu, so ensure your path cleanly imports into that menu and that its release tag is migrated to the current release.

All paths must pass the integration tests listed in that twiki.

How to run the production HLT within CMSSW

This part of the wiki describes how to run the HLT contained in the CMSSW release you are using; no access to ConfDB is required. This mode is typically used for large-scale Monte-Carlo event production.

The menus in ConfDB are regularly dumped as config file fragments into HLTrigger/Configuration/python.

They are

  • HLT_GRun_cff.py : the GRun menu, i.e. the standard "pp" menu
  • HLT_Fake[12]?_cff.py : pass-through menus for the legacy L1, stage-1 L1, and stage-2 L1 triggers respectively
  • HLT_HIon_cff.py : the heavy-ion menu
  • HLT_PIon_cff.py : the proton-ion menu
  • HLT_PRef_cff.py : the menu for the HIon-era proton-proton reference run
  • HLT_Special_cff.py : the superset of all the special menus used by FOG outside of physics production
  • HLT_Full_cff.py : the combined menu containing all integrated paths

Additionally, depending on the release, there may be frozen menus used for MC production. For example CMSSW_10_2_3, used for the 2018 MC production, has HLT_2018v32_cff.py, which was the version frozen for MC production.

You can always find out where a menu is located in ConfDB by looking at the start of the file, which specifies its ConfDB location in the first few lines, both as a comment

# /dev/CMSSW_12_0_0/HLT/V2 (CMSSW_12_0_0_pre1)
and as a PSet
fragment.HLTConfigVersion = cms.PSet(
  tableName = cms.string('/dev/CMSSW_12_0_0/HLT/V2')
)
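For example, the ConfDB table name of the GRun fragment shipped with the release can be checked directly from python (a sketch):

# print the ConfDB table a release menu fragment corresponds to
import HLTrigger.Configuration.HLT_GRun_cff as hltMenu
print(hltMenu.fragment.HLTConfigVersion.tableName.value())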

You can include these menus via --step=HLT:{menu}, e.g.

--step=HLT:GRun

will run the GRun menu. Finally, production cmsDriver commands make use of autoHLT.py to map common workflows to actual HLT menus: for example, in CMSSW_12_0_0 "HLT:@relval2021" maps to GRun, while "HLT:@relval2018" maps to Fake2.
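Schematically, autoHLT.py is just a python dictionary mapping these shorthand keys to menu names; based on the CMSSW_12_0_0 example above, it contains entries like the following (a sketch, not the full dictionary):

# Configuration/HLT/python/autoHLT.py (schematic excerpt)
autoHLT = {
    'relval2021': 'GRun',
    'relval2018': 'Fake2',
}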

Legacy HLT menus and older releases

CMSSW_13_2_X (2023 HIon data taking release)

2023 Run-3 development HLT menus: GRun for pp (25ns), HIon for PbPb, PRef for pp5TeVref, and PIon for pPb.

setenv SCRAM_ARCH el8_amd64_gcc11
cmsrel CMSSW_13_2_9
cd CMSSW_13_2_9/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_13_0_X (2023 pp data taking release)

2023 Run-3 development HLT menus: GRun for pp (25ns), HIon for PbPb, PRef for pp5TeVref, and PIon for pPb.

setenv SCRAM_ARCH el8_amd64_gcc11
cmsrel CMSSW_13_0_17
cd CMSSW_13_0_17/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_12_5_X (2022 HIon test data taking release)

2022 Run-3 development HLT menus: GRun for pp (25ns), HIon for PbPb, PRef for pp5TeVref, and PIon for pPb.

setenv SCRAM_ARCH el8_amd64_gcc10
cmsrel CMSSW_12_5_5_patch1
cd CMSSW_12_5_5_patch1/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_12_4_X (2022 bulk pp data taking release - HLT V1.1 and higher)

2022 Run-3 development HLT menus: GRun for pp (25ns), HIon for PbPb, PRef for pp5TeVref, and PIon for pPb.

setenv SCRAM_ARCH el8_amd64_gcc10
cmsrel CMSSW_12_4_12
cd CMSSW_12_4_12/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_12_3_X (2022 initial pp data taking release - HLT V1.0 menu)

2022 Run-3 development HLT menus: GRun for pp (25ns), HIon for PbPb, PRef for pp5TeVref, and PIon for pPb.

setenv SCRAM_ARCH slc7_amd64_gcc10
cmsrel CMSSW_12_3_7_patch1
cd CMSSW_12_3_7_patch1/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_10_3_X (2018 HIon data taking release)

2018 Run-2 development HLT menus: GRun for pp (25ns), HIon for PbPb, PRef for pp5TeVref, and PIon for pPb.

setenv SCRAM_ARCH slc6_amd64_gcc700
cmsrel CMSSW_10_3_3_patch1
cd CMSSW_10_3_3_patch1/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_10_1_X (2018 pp data-taking release)

2018 Run-2 development HLT menus: GRun for pp (25ns), HIon for PbPb, PRef for pp5TeVref, and PIon for pPb.

setenv SCRAM_ARCH slc6_amd64_gcc630
cmsrel CMSSW_10_1_11_patch1
cd CMSSW_10_1_11_patch1/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_9_2_X (2017 collision-data-taking release)

2017 Run-2 development HLT menus: GRun for pp (25ns), HIon for PbPb, PRef for pp5TeVref, and PIon for pPb.

setenv SCRAM_ARCH slc6_amd64_gcc530
cmsrel CMSSW_9_2_15
cd CMSSW_9_2_15/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_8_0_X (2016 pp and pPb collision-data-taking release for HLT)

2016 Run-2 development HLT menus: GRun for pp (25ns), HIon for PbPb, PRef for pp5TeVref, and PIon for pPb.

setenv SCRAM_ARCH slc6_amd64_gcc530
cmsrel CMSSW_8_0_31_patch1
cd CMSSW_8_0_31_patch1/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_7_5_X (2015 PbPb collision-data-taking release for HLT)

2015 Run-2 HLT menus

setenv SCRAM_ARCH slc6_amd64_gcc491
cmsrel CMSSW_7_5_9
cd CMSSW_7_5_9/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_7_4_X (2015 pp collision-data-taking release for HLT)

2015 Run-2 HLT menus

setenv SCRAM_ARCH slc6_amd64_gcc491
cmsrel CMSSW_7_4_16_patch2
cd CMSSW_7_4_16_patch2/src
cmsenv

git cms-addpkg HLTrigger/Configuration
git cms-merge-topic cms-tsg-storm:74XHLTppRefMenu # for the ppRef5TeV menu

scram build -j 4
cd HLTrigger/Configuration/test

CMSSW_5_3_X (Legacy release for Run-I data taking period - 2011/2012/2013)

2013 PIon, 2012 GRun, 2011 HIon, and 2011 resurrected HLT menus

setenv SCRAM_ARCH slc6_amd64_gcc472
cmsrel CMSSW_5_3_38
cd CMSSW_5_3_38/src
cmsenv

git cms-addpkg HLTrigger/Configuration

scram build -j 4
cd HLTrigger/Configuration/test

Useful links

Test Files (available in root://eoscms.cern.ch/)

Run3Winter24

/store/mc/Run3Winter24Digi/TT_TuneCP5_13p6TeV_powheg-pythia8/GEN-SIM-RAW/133X_mcRun3_2024_realistic_v8-v2/80000/dc984f7f-2e54-48c4-8950-5daa848b6db9.root

Run3Winter23 ("forPU65")

/store/mc/Run3Winter23Digi/TT_TuneCP5_13p6TeV_powheg-pythia8/GEN-SIM-RAW/GTv4BTagDigi_126X_mcRun3_2023_forPU65_forBTag_v1_ext2-v2/60000/ae2ab9cc-64d7-40ff-a73f-bae4a7a17cf4.root

Run3Summer22EE

/store/mc/Run3Summer22EEDR/TT_TuneCP5_13p6TeV-powheg-pythia8/GEN-SIM-RAW/Poisson70KeepRAW_124X_mcRun3_2022_realistic_postEE_v1-v1/2820000/9d9729c8-b83e-4714-8c6c-56aeca948934.root

Run3Summer22

/store/mc/Run3Summer22DRPremix/QCD_PT-120to170_TuneCP5_13p6TeV_pythia8/GEN-SIM-RAW/124X_mcRun3_2022_realistic_v12-v2/80000/0e091590-ec1b-422f-81c0-e2e4f792dd90.root

Run3Summer21

/store/mc/Run3Summer21DRPremix/TT_TuneCP5_14TeV-powheg-pythia8/GEN-SIM-DIGI-RAW/120X_mcRun3_2021_realistic_v6-v2/2540000/b354245e-d8bc-424d-b527-58815586a6a5.root

2018 Data

/store/data/Run2018D/EphemeralHLTPhysics7/RAW/v1/000/323/790/00000/B543D251-40F1-CB46-A6A1-046CF3D78D6D.root
