-- MatthieuPierreMarionneau - 11-Apr-2010

A-Z Ntuple Production Guide

The CMSSW NtupleMaker

Several tags of the NtupleMaker are available for different CMSSW releases. The HEAD tag in CVS may be under development, so check this page to see which tag should be used.
If no specific tag is indicated, use the HEAD version.
Currently, the v05-01-01 tag is safe.

The data can be processed with a CMSSW_3_8_X release.
Type the following in your working directory:
scramv1 project CMSSW CMSSW_3_8_X
cd CMSSW_3_8_X/src
cmsenv
cvs co -r <goodtag> -d GhmAnalysis UserCode/GautierHdeM/src/GhmAnalysis
addpkg CalibCalorimetry/EcalLaserCorrection
cp ~ferriff/public/PourMatthieu/EcalLaserDbService.h CalibCalorimetry/EcalLaserCorrection/interface/
cp ~ferriff/public/PourMatthieu/EcalLaserDbService.cc CalibCalorimetry/EcalLaserCorrection/src/

The python file used to run the NtupleMaker is GhmAnalysis/GhmEWKAnalysis/ghmAnalysis_7TeV.py.
The complete list of NtupleMaker options is detailed in GhmAnalysis/GhmEWKAnalysis/python/ghmZeeAnalysis_cfi.py.
Apart from the filling options, the isolation switch and the level of cuts, no other configuration change is usually necessary.
The ghmAnalysis_7TeV.py configuration file provided by default works with cmsRun without any changes.
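
If one of these settings does need changing, it is usually easier to override it at the end of ghmAnalysis_7TeV.py than to edit the cfi file. The lines below are only a sketch: maxEvents and source.fileNames are standard CMSSW process parameters, myRecoFile.root is a placeholder, and the ghmNtupleMaker label with its fillIsolation switch is a hypothetical illustration, so check ghmZeeAnalysis_cfi.py for the real module and parameter names.

# Appended at the end of ghmAnalysis_7TeV.py; cms is already imported there
# as "import FWCore.ParameterSet.Config as cms"
process.maxEvents = cms.untracked.PSet(input = cms.untracked.int32(1000))  # limit the number of events
process.source.fileNames = cms.untracked.vstring('file:myRecoFile.root')   # placeholder local input file
# Hypothetical NtupleMaker switch, for illustration only
process.ghmNtupleMaker.fillIsolation = cms.bool(False)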

Several filters are available, such as an L1 filter, a collision filter, HLT filters, and lepton or composite-candidate filters. Make sure the filters are consistent with the data being processed and with the aim of the analysis; a sketch of removing a filter from the path is given below.
By default, the collision filter and a lepton filter are applied during processing.
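
As a sketch only, assuming the filters appear as ordinary modules in the cmsRun path (the module labels below are hypothetical; the real ones are defined in the GhmAnalysis python files), dropping the lepton filter would amount to redefining the path without it at the end of ghmAnalysis_7TeV.py:

# Hypothetical module labels, for illustration only; check the real path definition in the configuration
process.p = cms.Path(process.collisionFilter * process.ghmNtupleMaker)  # lepton filter removed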

To run the NtupleMaker interactively, just compile and launch CMSSW:
scramv1 b -j 8
cd GhmAnalysis/GhmEWKAnalysis
cmsRun ghmAnalysis_7TeV.py

How to run on the GRID, using CRAB

CRAB is a convenient interface between the CMSSW user and the GRID. Running on the GRID with CRAB requires a GRID certificate.
Several useful links are listed below:
CRAB Prerequisites
CRAB Guide
CRAB FAQ
CRAB Exit Codes
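
Before creating a task, the CRAB environment and a valid GRID proxy have to be set up on top of the CMSSW environment. A typical sequence on lxplus is shown below; the AFS path of the crab.sh setup script is the usual CERN location and may differ at other sites:

cd CMSSW_3_8_X/src
cmsenv
source /afs/cern.ch/cms/ccs/wm/scripts/Crab/crab.sh
voms-proxy-init -voms cms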

CRAB works with a configuration file, named crab.cfg by default, which contains several parameters. A typical file is detailed below:

# CMSSW parameters
[CMSSW]
pset=ghmZeeAnalysis_data.py #The CMSSW configuration file
datasetpath=/MinimumBias/Commissioning10-CollisionRecoSequence-v7/RECO #The DBS name of the dataset

#Event numbering
total_number_of_events=-1 # Total number of events to process; -1 means all available events
events_per_job=10000 #Average number of events per job

runselection=132440 #For data, selects the runs to be processed
output_file=Ntuple.root #The name of the output file

#User parameters
[USER]
return_data=0 #Return the output file to the submission directory; do not use it for large tasks if your storage area is limited
email=matthieu.marionneau@cea.fr #The task status is sent by mail when the task ends; works only in server mode
ui_working_dir = run132440 #Name of the directory used to identify the task
copy_data = 1 #Copy the output file to the storage directory given by the following parameters
storage_element = node12.datagrid.cea.fr # GRID storage element where data are copied
storage_port = 8446 #Storage element port, depends on SRM manager version
storage_path=/srm/managerv2?SFN=/dpm/datagrid.cea.fr/home/cms/trivcat/store/user/mmarionn/ # General path where output files are copied
user_remote_dir = MinBias/run132440_bis # Specific path where the data are stored; storage_path+user_remote_dir is the full destination directory

#CRAB parameters
[CRAB]
scheduler=glite #Use glite by default; needs to be changed for the CAF
jobtype=cmssw #Type of job
server_name = slc5cern # Submission server name; use slc5cern for CMSSW_3_5_X, cern or bari for an SLC4 job

#GRID parameters, for blacklisting and whitelisting storage or computing elements; do not use them even if a specific SE/CE does not work at all
[GRID]
## By ComputingElement
#ce_black_list = storm.ifca.es
#ce_white_list = cmsdcache.pi.infn.it
## By StorageElement
#se_black_list = storm.ifca.es
#se_white_list = cmsdcache.pi.infn.it

Submitting a CRAB task is quite easy.
Use the command crab -create -cfg crab.cfg to create the CRAB task.
Use the command crab -submit -c <taskname> to submit the jobs to the GRID.

There are several ways to monitor a task.
Interactively, use the command crab -status -c <taskname>. Via the web, which is more convenient for following several tasks, go to the http://dashb-cms-job-task.cern.ch/dashboard/request.py/taskmonitoring page.
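
Once the jobs are done, the job output and the log files can be retrieved, and failed jobs resubmitted, with the standard CRAB commands (see the CRAB Guide for the full list):

crab -getoutput -c <taskname>
crab -resubmit <list-of-jobs> -c <taskname>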

How to run on CAF, using CRAB

Some information is given in the CRAB Guide.
Under development
