-- SebastianOlivares - 4-Jan-2013

TauTriggerValidationTutorial-D3PD

Validation Task

The Tau Trigger Validation is a regular task (on average two per month) in which you have to compare two datasets with somewhat different characteristics and check whether the behaviour of the Tau Trigger shows any discrepancy between them. You should report your results to the Validation team by the given deadline, and to the Tau Slice group on the first Monday after the report.

eGroup

To receive emails from the Validation and Tau Trigger teams, you should subscribe at http://cern.ch/groups to the following e-groups:

- "atlas-tau-trigger (Automatic Mailing list for tau trigger E-log)" <atlas-tau-trigger@cern.ch>

- "hn-atlas-tapm-tau-trigger (TAPM Tau Trigger)" <hn-atlas-tapm-tau-trigger@cern.ch>

- "Trigger Validation <hn-atlas-triggerReleaseValidation@cern.ch>"

Common Mail

The Validation team sends general emails to all the trigger slices requesting specific tasks. A single validation generally consists of three or more tasks, in which the team needs to check whether changes in some samples affect the functioning of the trigger. For example, this is an extract from a previous validation:

(2) Task : Validation of the 64-bit identifiers, problem observed with electrons during the last validation will be hopefully gone.

Test: s1469_r3758 (17.2.0.2, 17.3.3.1) Reference a) s1469_r3799 (17.2.0.2,17.2.3.6)

This is a validation of a new version of the software made with 64-bit identifiers (we commonly use 32-bit identifiers).

The first thing you need to do is to search for the appropriate tau datasets with the requested sample tags. Following the validation example, we will search for the tags s1469_r3758 and s1469_r3799.

Searching Datasets (AMI)

Given that we would like to see the behaviour of the tau trigger over the whole energy range, we will use two MC datasets with different energies: Ztautau (a Z boson decaying to two taus) for low pT, and Atautau (a CP-odd Higgs of 800 GeV decaying to two taus) for high pT.

AMI (ATLAS Metadata Interface) is a generic cataloguing framework used in ATLAS for dataset searches (it also hosts the Tag Collector and a couple of other services).

1. The first thing you have to do is enter the AMI web page (you need a grid certificate correctly loaded into your browser): http://ami.in2p3.fr:8080/AMI/servlet/net.hep.atlas.Database.Bookkeeping.AMI.Servlet.Command?linkId=512

2. You will be asked for your username and password. If you have never entered this web page before, you should create an account.

3. In the first text box, enter the tag of the dataset you want to look for; following the example case, we will first search for s1469_r3758.

4. At the top of the page (just under Home) you will find a clickable box with the kinds of datasets available for searching; in this case we will use valid_001-production.

5. In the third column (dataType), click on the Groupby button (left icon) and select to browse only the AODs.

6. In the second column (logicalDatasetName), click on the filter button and type tau.

7. Copy the names of the low and high pT datasets into a plain text file, since we will need them in the next step. In our example we would copy:

 valid1.106052.PythiaZtautau.recon.AOD.e1127_s1469_r3758 
 valid1.106573.PythiabbAtautauMA800TB35.recon.AOD.e1127_s1469_r3758 

Requesting D3PD (Savannah)

Since we do not have authorization to produce the Tau D3PDs ourselves, we have to request their production from the Tau working group.

1. Enter the TauD3PD Production Savannah web page: https://savannah.cern.ch/bugs/?group=taucpd3pd

2. Click on Login in the left column. You will be asked for your username and password. If you have never entered this web page before, you should create an account.

3. In the top menu, hover the mouse over "Bugs" and click on Submit.

4. Set the title of the topic to "Validation Task 'Month and Day of the validation'" and request the datasets you previously copied from AMI.

Note: Please check whether the dataset has already been requested in a previous validation, in order not to request the same production twice.

Downloading D3PD

1. Go to http://panda.cern.ch and click on "Task" in the left column.

2. Click on Search and enter the configuration tag used to make the D3PD (e.g. p1130) in the "Configuration Tag" box.

3. Confirm the search by clicking on "Query Submit".

4. Look for the requested D3PD in the list and check its status in the "State" column.

5. If it is done, click on the number in the "Task ID" column and copy the name of the D3PD from the "Out" row, for example:

 valid2.106052.PythiaZtautau.merge.NTUP_TAU.e1127_s1517_r3935_p1130_tid00984200_00 

Downloading the dataset with DQ2

Note that you first need to set up the grid environment, which is explained in the next chapter.

1. Go to your temporary folder:

cd /tmp/username

2. Run dq2-get on the dataset name you just copied from Panda and wait until the download is done:

dq2-get valid2.106052.PythiaZtautau.merge.NTUP_TAU.e1127_s1517_r3935_p1130_tid00984200_00
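
Before downloading, you may want to check that the dataset is registered and see which files it contains. A minimal sketch (the exact dq2-ls options may differ between DQ2 client versions):

 # List the dataset and, with -f, the files it contains (sketch)
 dq2-ls valid2.106052.PythiaZtautau.merge.NTUP_TAU.e1127_s1517_r3935_p1130_tid00984200_00
 dq2-ls -f valid2.106052.PythiaZtautau.merge.NTUP_TAU.e1127_s1517_r3935_p1130_tid00984200_00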

The Basics

Logging in to your lxplus account

The first thing we have to do is log in to lxplus; in this case we add the -X -Y options at the end of the command to enable X11 forwarding for graphical applications (the ROOT framework and emacs):

 ssh yourlogin@lxplus.cern.ch -X -Y

It is assumed that you already have an account on LXPLUS and have about 100 MB of spare quota. You can check this by typing:

fs lq
The result is given in kilobytes.
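
The output typically looks something like the following (illustrative numbers only, not from a real account):

 Volume Name                    Quota       Used %Used   Partition
 user.yourlogin               2000000     350000   18%         42%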

Setting up Athena

1. If you have never run any Athena release before, you need to create the following file in your home area:

  emacs -nw .asetup

2. Copy the following lines in the file and save it:

 [defaults]
 default32 = True
 briefprint = True

3. Set up the needed Athena release (you should set up the newest release among all the datasets of the validation):

 asetup <athena_release>,here
 e.g.
 asetup 17.2.1,here

* You need to run steps 1-2 only once, but step 3 needs to be repeated every time you log in to lxplus to work on a validation.
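
As a quick sanity check (assuming the standard environment variables that asetup defines), you can verify which release and platform are active:

 echo $AtlasVersion   # e.g. 17.2.1
 echo $CMTCONFIG      # e.g. i686-slc5-gcc43-opt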

Setting up Grid environment

1. Create a directory called myscripts in your home directory:

  mkdir myscripts
  cd myscripts

2. Make your grid requirements file:

  emacs -nw grid_env.sh

3. Copy the following lines in the file and save it:

  source /afs/cern.ch/project/gd/LCG-share/current_3.2/etc/profile.d/grid-env.sh
  voms-proxy-init --voms atlas
  source /afs/cern.ch/atlas/offline/external/GRID/ddm/DQ2Clients/setup.zsh

* You should change the ending of the last line (setup.zsh or setup.sh) according to your shell.

4. Set up the grid environment by sourcing the file (you will be asked for your grid certificate password):

  source grid_env.sh

Note: You only need to source the grid environment file when you want to download something from the grid.
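
If you are not sure whether your grid proxy is still valid (by default it expires after about 12 hours), you can inspect it before starting a download:

 # Show the proxy lifetime and the attached VOMS attributes
 voms-proxy-info --all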

D3PD Validation package Installation

Downloading the package

It is recommended that you always work with the latest version of the D3PD Validation package. You can cross-check your current version against the ChangeLog of the trunk package in SVN (https://svnweb.cern.ch/trac/atlasoff/browser/Trigger/TrigAnalysis/TrigTauAnalysis/TrigTauD3PDValidation/trunk).

To download the HEAD (latest) version of the package:

  cmt co Trigger/TrigAnalysis/TrigTauAnalysis/TrigTauD3PDValidation 

Compiling the package

Compile the package by doing:

 cd Trigger/TrigAnalysis/TrigTauAnalysis/TrigTauD3PDValidation/cmt
 gmake
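
After compiling, the usual CMT workflow is to source the package setup script from the same cmt directory, so that the environment picks up the freshly built library (a sketch of the standard CMT step; use setup.csh instead if you work in a C shell):

 source setup.sh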

Running the package

You can run the code that produces the core plots of the validation by doing:

 cd ../python
 root -b -q 'StartingScript.C("/directory/TestD3PDname/*.root*","MC",2012,"/directory/ReferenceD3PDname/*.root*")'

The first argument is the D3PD path, the second is the option "MC" or "data", and the third describes the set of cut variables used. If you add a second D3PD path, the package will also compare the cut variables and some turn-on curves of the test and reference D3PDs.
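
For instance, if you only want the distributions of the test D3PD without a reference comparison, the same macro can be run with just the first three arguments (assuming the reference path is optional, as the description above implies):

 root -b -q 'StartingScript.C("/directory/TestD3PDname/*.root*","MC",2012)'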

You will get the following output files:

CutVariables.eps - cut variable distributions of the test D3PD.

TurnOnCurves.eps - some turn-on curves of the test D3PD.

ReferencePlots.eps - comparison of the cut variable distributions between the test and reference D3PDs.

ReferenceTurnOnCurves.eps - comparison of some turn-on curves between the test and reference D3PDs.
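
For the report it is convenient to merge these .eps files into a single PDF. One possible way, assuming epstopdf and ghostscript are available on lxplus, is:

 # Convert each .eps output to .pdf and merge everything into one report file
 for f in CutVariables TurnOnCurves ReferencePlots ReferenceTurnOnCurves; do
   epstopdf ${f}.eps
 done
 gs -dBATCH -dNOPAUSE -sDEVICE=pdfwrite -sOutputFile=Validation.pdf \
    CutVariables.pdf TurnOnCurves.pdf ReferencePlots.pdf ReferenceTurnOnCurves.pdf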

Results

TurnOnCurves (Efficiency plots)

Variable plots

Explaining differences

In order to better understand why we see differences between the samples, we can compare which package versions are used in the Athena releases we are interested in. To get the tag differences, we first set up either the test or the reference Athena release and then ask for the tag differences between the reference ("ref") and the test, or check ("chk"). For example,

 asetup AtlasProduction,17.2.6.2
 get-tag-diff.py --ref=AtlasProduction,17.2.1.4 --chk=AtlasProduction,17.2.6.2

finds that there are 506 tags which are different, but we are only interested in the tau tag differences, so we list only the tau-related packages (a simple way to filter them is sketched after the table). The output is:

 AtlasProduction/17.2.6.2 with platform i686-slc5-gcc43-opt
 at /cvmfs/atlas.cern.ch/repo/sw/software/i686-slc5-gcc43-opt/17.2.6
 ::: setup reference env. [AtlasProduction,17.2.1.4]...
 AtlasSetup(WARNING): CMTROOT has been changed to /cvmfs/atlas.cern.ch/repo/sw/software/i686-slc5-gcc43-opt/17.2.1/CMT/v1r24
 ::: setup check env. [AtlasProduction,17.2.6.2]...
 ::: found [506] tags which are different
 ref             ref-project     | chk             chk-project     | pkg-name                                     
 ------------------------------------------------------------------------------------------------------------------------
 01-00-05-01     Simulation      | 01-00-07-01     Simulation      | External/Tauolapp                             
 01-00-83        Simulation      | 01-00-85        Simulation      | Generators/Tauola_i                          
 00-00-06        Simulation      | 00-00-09        Simulation      | Generators/Tauolapp_i                        
 01-08-10        Production      | 01-08-10-01     Analysis        | PhysicsAnalysis/D3PDMaker/TauD3PDMaker       
 00-02-03        Analysis        | 00-02-06        Analysis        | PhysicsAnalysis/D3PDMaker/TrigTauD3PDMaker   
 01-07-12-01     Production      | 01-07-24        Production      | PhysicsAnalysis/TauID/TauDiscriminant        
 00-01-09        Analysis        | 00-01-11        Analysis        | PhysicsAnalysis/TauID/TauTagTools            
 00-06-01        Analysis        | 00-06-04        Analysis        | PhysicsAnalysis/TauID/TauTools               
 00-06-00        Production      | 00-06-03        Analysis        | PhysicsAnalysis/TauID/TauValidation           
 00-01-61        Reconstruction  | 00-01-61-00     Production      | Reconstruction/MuonIdentification/MuGirlStau 
 00-06-04        Event           | 00-06-05        Event           | Reconstruction/tauEvent                      
 00-02-12        Analysis        | 00-02-19        Analysis        | Reconstruction/tauMonitoring                 
 04-02-01        Production      | 04-02-14        Production      | Reconstruction/tauRec                        
 00-06-68        Trigger         | 00-06-70        Trigger         | Trigger/TrigAnalysis/TrigTauAnalysis/TrigTauPerformAthena
 00-00-42        Trigger         | 00-00-46        Trigger         | Trigger/TrigMonitoring/TrigTauMonitoring  
 

We add this tau differences table at the beginning of the validation PDF file for every validation task.
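
Since the full list contains hundreds of entries, a simple way to keep only the tau-related lines is to filter the command output directly, for example (a sketch; matching "tau" also picks up the Tauola generator packages, as in the table above):

 get-tag-diff.py --ref=AtlasProduction,17.2.1.4 --chk=AtlasProduction,17.2.6.2 | grep -i tau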


Major updates:
-- Main.Sebastian Olivares - 4 Jan 2013

RESPONSIBLE SebastianOlivares
REVIEW Never reviewed
