On This Page

- MTCC CSC MC and Data Information

- CSC Calibration Project

- Calibration Documentation

- Helpful CRAB Information

- Automatic Creation of .cfg files for running multiple jobs

- MTCC CSC Data Analysis and MC Comparison

MTCC CSC MC and Data Information

The MC files can be found at /castor/cern.ch/user/a/aroe/MC3. Each of the 50 files has 20k events, giving a total of 1 million events. The directory contains the files after reconstruction; /MC2 has the files before reconstruction, but the LCT information there is wrong. See below for a complete description of the production process. A third sample of 10 million events is in production right now at /MC5, again in files of 20k events each.

The data files can be found at /castor/cern.ch/user/a/aroe/4188 and at /castor/cern.ch/user/a/aroe/4386. For each run, each of 10 files contains ~20k events, giving a total of ~200k events per run. The directories contain the files both before and after the reconstruction code. Run 4188 uses CSCRecHitB and CSCSegment as tagged for CMSSW_1_5_2. Run 4386 uses, at Dominique's suggestion, the head versions as of August 15. See below for a complete description of the production process.
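
To pull a few of these files down for local inspection, the standard castor tools work; here is a minimal sketch (the /tmp staging area is my own choice, not part of the layout above):

#stage all files for run 4188 into a local scratch area
nsls /castor/cern.ch/user/a/aroe/4188 > Files4188.txt
cat Files4188.txt | while read f
do
    rfcp "/castor/cern.ch/user/a/aroe/4188/$f" "/tmp/$USER/$f"
done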

A few "tricks" for Reconstruction and Emulation

- The global vectors are well documented here, but other links are missing crucial information. Also helpful is the CMS Conventions page for geometry.

- In the L1 Trigger there are a handful of parameters used to make the MC match the data for the MTCC. They are all important!

- In the unpacker for reco, there is a flag, isMTCCData. This fixes a wire-group mapping issue found during the MTCC. The unpacker configuration should read:

        include "EventFilter/CSCRawToDigi/data/cscUnpacker.cfi"
        replace cscunpacker.isMTCCData = true

MC

I produced a Monte Carlo sample of 1 million events using CMSCGEN on the GRID computing system via CRAB. This sample was created to simulate the 2006 MTCC: it has B=4T and is on the surface at SX5. I am currently looking into producing a larger sample, as the statistics here are low.

After the standard Gen-Sim-Digi chain, the events are processed with the L1 Emulator code to find LCTs, and the sample is then run through CSCRecHitB. All of this has been done in CMSSW_1_5_2. The final root file thus contains both the LCT and the rec hit information.

For more information about the generator that I used, see CMSCGEN. The .cfg file which I used, which also ran the L1Emulator code, is attached as EvtGen+DetSim+Digi+CscLCTs.cfg. The CSCRecHitB config file for MC is CSCRecHit2DProducer_MTCC_MC.cfg. For running over multiple files, I automated the process using a perl script to create the .cfg, and a shell script which copies the files, executes the perl script, and runs the new .cfg. The shell script takes a list of root files to run over as input.
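
For a single file, the chain described above boils down to two cmsRun steps with the attached configurations; a minimal sketch (assuming the rec hit .cfg reads the output of the first step through its PoolSource):

#generation, simulation, digitization, and LCT emulation
cmsRun EvtGen+DetSim+Digi+CscLCTs.cfg
#CSCRecHitB reconstruction on the resulting file
cmsRun CSCRecHit2DProducer_MTCC_MC.cfg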

After speaking with Slava, I found that the wrong parameters had been used for the L1 Emulator. I also noticed a parameter in my CSCRecHitB configuration that differed from Dominique's. I therefore ran a script that dropped the branches and re-ran the L1Emulator, CSCRecHitB and CSCSegment code. Those files can be found at /castor/cern.ch/user/a/aroe/MC3. The configuration used here is found here.

Trigger Parameters used for L1 Emulator
The following parameters were used to mimic the data taken during MTCC phase II. These are used in the files found in /MC3.
    # Parameters for ALCT processors: MTCC-II
    PSet alctParam = {
        uint32 alctFifoTbins   = 16
        uint32 alctFifoPretrig = 10
        uint32 alctBxWidth     =  6
        uint32 alctDriftDelay  =  3
        uint32 alctNphThresh   =  2
        uint32 alctNphPattern  =  4
        uint32 alctTrigMode    =  2
        uint32 alctMode        =  0
        uint32 alctL1aWindow   =  3
        untracked int32 verbosity = 0
    }

    # Parameters for CLCT processors: MTCC-II
    PSet clctParam = {
        uint32 clctBxWidth     =  6
        uint32 clctDriftDelay  =  2
        uint32 clctHsThresh    =  4
        uint32 clctDsThresh    =  4
        uint32 clctNphPattern  =  1
        uint32 clctFifoTbins   = 12
        uint32 clctFifoPretrig =  7
        untracked int32 verbosity = 0
    }

Data

While we are waiting for the official reprocessing of MTCC data for use in CMSSW_1_5_x and later, I have reprocessed some files according to Alex Tumanov's instructions. I chose these files based on the recommendations of people in the group, as well as on the CSC ELOG, Dayong's extremely helpful run summary page, and the CMS Run Summary Page. I have only processed events from runs 4188 and 4386. Both have B=4T and used the LCT Singles trigger.

I automated the first step of the process (running stream.cfg) to make it easier to run over multiple files, using a simple shell script. This works by calling a perl script which creates a new stream.cfg. It needs an input text file containing the names of the files to run over.
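
The logic of that shell script is roughly the following sketch (the name of the input list file is my own placeholder; the real scripts are attached as RunStream.sh.txt and ConfigStream.pl.txt):

#!/bin/bash
#sketch of the stream.cfg automation
cat FileList.txt | while read f
do
    perl ConfigStream.pl "$f"   #writes a fresh stream.cfg for this file
    cmsRun stream.cfg
done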

These files were then processed with CSCRecHitB, using a .cfg file such as this one for run 4188. An example of the .cfg file used for run 4386 is here. This process has also been automated, using these perl and shell scripts.

Unexpected Peaks in Data
This is "under investigation" now. In run 4188 (processed with RecHitB from 1_5_2), were unexpected peaks found at:

- ME+1/2: -83.6263

- ME+3/1: -86.282

- ME+3/1: -86.2819

Dominique thought he had fixed this "feature". Accordingly, I processed run 4386 with his latest version, rev 1.34. It has not been fixed: peaks filled with exactly the same values exist in the same chambers.

The attached files show the rec hit distributions for ME+3/1/14-17 in run 4386. The first file has the peak un-suppressed, while the second suppresses it.

recYME31_un.pdf

recYME31.pdf

MTCC CSC Data Analysis and MC Comparison

I am currently working on a comparison of MC with the data taken during the 2006 MTCC. Using the MC and data files processed as described above, I am doing basic distribution comparisons.

Stay tuned - more to come! I will present my findings at the DPG meeting on August 23rd.

CSC Calibration Project

CSC Software Working Page

CSC Monitoring Graphs

Calibration Documentation

CVS Calibration Executables

CRAB Links:

Here is some helpful CRAB information.

A newer, very helpful page can be found at the Crab TWiki page.

To get your GRID certification, follow these directions.

The following directions are for running CRAB from lxplus, after the setup explained in the above documents is done. You do not have to do this through lxplus; it can be done locally, but the process is different in that case.

Environment Setup

**note: This cannot be set up through tcsh: on lxplus the environment path becomes too long and gets truncated. I run this in bash. After setting it up in bash, you should be able to switch back into tcsh.

>bash
>source /afs/cern.ch/cms/LCG/LCG-2/UI/cms_ui_env.sh
>cd CMSSW_x_y_z/src; ev ; cd -
>source /afs/cern.ch/cms/ccs/wm/scripts/Crab/crab.sh

**note: the order here is crucial!

Getting your certificates straightened out:

**note: this was as of December 2006; the process may have changed.

Make 2 copies of your browser certificate. I did an scp from my non-lxplus node into my lxplus account, and this seems to have worked. Once you have your .p12 certificate in your lxplus account, execute the two following commands, taken from this site. Before you execute them, check whether the hidden directory .globus exists; if not, create it:

>mkdir .globus

Then:

>openssl pkcs12 -in export.p12 -clcerts -nokeys -out $HOME/.globus/usercert.pem

This will ask you for a password, which is the password you chose when you obtained your certificate.

>openssl pkcs12 -in export.p12 -nocerts -out $HOME/.globus/userkey.pem

This will ask you for the same password, and then for a pass phrase that you create and verify now. You will need this pass phrase when submitting jobs through CRAB.

The userkey has to be readable only by you for the authentication to pass:

>chmod 0400 $HOME/.globus/userkey.pem

You may also have to execute the following command when you begin:

>voms-proxy-init
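
Once the proxy is valid, a typical CRAB session looks like the following (CRAB 1.x command-line syntax; this assumes your crab.cfg has already been set up as described in the linked pages):

>crab -create      # create the jobs defined in crab.cfg
>crab -submit      # submit them to the GRID
>crab -status      # monitor the jobs
>crab -getoutput   # retrieve the output once they are done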

Helpful Tutorials:

CRAB/GRID tutorials from the December '06 CMS Week

MTCC MC info

CSC Calibration Documentation

Calibration runs are dedicated runs taken on the EMU Local DAQ system in the form of raw files. Each run has one DDU (i.e. 9 chambers) present, so a complete set of MTCC '06 data has four runs per test. The tests themselves are described on Oana's TWiki page. Running in real time requires copying from a local directory, such as /data/tmp/localdaq on emuslice06 or emuslice12. The files have an extension (Crosstalk, Gains, etc.) depending on the test.

Taking a calibration run does not require a special setup, other than being in local running mode. This is an expert-level process which Martin Von Der Mey has created. Once a run is taken, it is transferred to Castor. For post-run viewing, all files are found in the directory /castor/cern.ch/cms/emuslice/2006/.

Calibration code should be run from the special lxplus account, csccalib. The output of all calibration analysis code can be viewed at this site. There will only be an "automatic" link between the code and the website if it is run in the csccalib account. Everything contained in the csccalib account is inside the CVS packages CondFormats/CSCObjects and OnlineDB/CSCCondDB, so it can (in theory) be run outside of the csccalib account.
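
For running outside the csccalib account, here is a sketch of checking those packages out into your own release area (this assumes the standard CMSSW CVS repository settings of the time; the release version is only an example):

>export CVSROOT=:gserver:cmscvs.cern.ch:/cvs_server/repositories/CMSSW
>cd CMSSW_1_1_1/src
>cvs co CondFormats/CSCObjects
>cvs co OnlineDB/CSCCondDB
>scramv1 b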

The following tests are complete. Each has at least six files (a .cc in CSCCondDB/src, a .h in CSCCondDB/interface, and a .cfg, .C, .pl and .sh in CSCCondDB/test).

- CFEB Gain

- CFEB Crosstalk

- CFEB Saturation

- CFEB Noise Matrix

- CFEB Connectivity

- AFEB Connectivity

- AFEB DAC

Running An Analyzer-Macro Chain

This should be done from the csccalib account. First, log in and set up the environment:

> ssh csccalib@lxplus
> cd /scratch0/CMSSW_1_1_1/src/
> eval `scramv1 runtime -csh`
> cd OnlineDB/CSCCondDB/test
> source exec.sh

The GetRunsTEST_NAME.sh scripts can almost be used out of the box. You should only have to call one shell script explicitly, e.g. GetRunsCrosstalk.sh; as explained below, it will call the other necessary scripts. What you may need to edit in the shell script is what you grep for, i.e. which runs you want to process. Whatever ends up in GoodTEST_NAMERuns.txt will be looped over by the script.

>./GetRunsCrosstalk.sh

Example: Running over all Crosstalk Runs in raw format

#lists all runs in the castor directory
nsls /castor/cern.ch/cms/emuslice/2006/ > Runs.txt
#takes all crosstalk runs
grep -e Crosstalk_ Runs.txt > AllCrosstalkRuns.txt
#filters for only the main file from each run
grep -e "RUI" AllCrosstalkRuns.txt > GoodCrosstalkRuns_.txt
grep -e ".raw" GoodCrosstalkRuns_.txt > GoodCrosstalkRuns.txt

Example: to run over only the crosstalk runs between 201 and 209, replace

grep -e ".raw" GoodCrosstalkRuns_.txt > GoodCrosstalkRuns.txt

With:

grep -e "csc_0000020*" GoodCrosstalkRuns_.txt > GoodCrosstalkRuns.txt

This is very important in order to run over a subset of calibration data without re-processing everything that we have ever taken.

Explanation of Other Scripts

For each test, there is a set of files in CSCCondDB/test. For the Crosstalk test, for instance, these are:

- GetRunsCrosstalk.sh

- CreateConfigCrosstalk.pl

- CSCxtalk.cfg

- xTalkMacro.C

GetRunsCrosstalk.sh executes an entire series of processes, each of which is described below. It finds the appropriate runs in castor. For each run, it will:

- copy the run file(s) from castor to the local /tmp/csccalib directory

- create new .cfg file using the .pl script

- execute a cmsRun job using the .cfg file, which outputs a .root file

- execute the .C root Macro to process the .root file and create .gif images

After the entire loop is finished, it executes a different perl script, in a different directory, to put the newly processed runs on the web.
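
Put together, the loop amounts to something like this sketch (the root invocation, the clean-up step, and the location of the web-publishing script are my assumptions; the real logic lives in GetRunsCrosstalk.sh):

#!/bin/bash
#sketch of the GetRunsCrosstalk.sh processing loop
cat GoodCrosstalkRuns.txt | while read run
do
    #copy the run file from castor into the local scratch area
    rfcp "/castor/cern.ch/cms/emuslice/2006/$run" "/tmp/csccalib/$run"
    #create a fresh CSCxtalk.cfg pointing at this file
    perl CreateConfigCrosstalk.pl
    #run the analyzer, which outputs a .root file
    cmsRun CSCxtalk.cfg
    #run the root macro in batch mode to create the .gif images
    root -b -q xTalkMacro.C
    #clean up so the macro sees exactly one root file next iteration
    rm -f /tmp/csccalib/*.raw /tmp/csccalib/*.root
done
#publish the newly processed runs on the web
perl CreateTree_Items.pl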

CreateConfigCrosstalk.pl takes all of the files for a given run (which by now are in the /tmp/csccalib directory) and creates a .cfg file based on them. For Gains and Saturation the script is more complicated, because there are multiple files per run. The other tests use a simple, straightforward perl script.

CSCxtalk.cfg does not have to be edited manually, because it is recreated by the perl script (though of course it can be). The only thing that changes from run to run is the input, i.e. the file path.

xTalkMacro.C is a root macro; the various macros range in how complicated they are. xTalkMacro.C is the most complicated, and so it has the most documentation. A cleaner version with essentially the same functionality is gainsMacro.C. I will sketch out the components here, but for full documentation, see the code. The code is modular, so the "main" loop calls several functions written later in the same file, as well as some functions contained in GenFuncMacro.C, which is common to all the other root macros.

xTalkMacro.C is set to parse the /tmp/csccalib directory for any root files and process them. The program will have trouble if there is more than one root file in the directory. After opening the file, it processes it accordingly. Each macro loops differently. Most (including xTalkMacro) have a function that loops at both the chamber level (over all strips and layers) and at the layer level (each data point represents the data from one strip). In some cases there are pre-made graphs in the root tuple, so rather than drawing (usually with the "Project" command), it uses "Get".

The macro output is complicated. I have handwritten a function called PrintAsGif(): ROOT cannot print .gif files in batch mode, for technical reasons, so the macro executes a roundabout process to get them. This takes a long time, but is much quicker than not using batch mode, and other file types (e.g. .eps) would be too large for quick web-based access. In effect, the macro creates a linux directory hierarchy for each run, to organize the many .gif files. The hierarchy looks like this: "Images" -> Test_Name -> Run_Name -> Folder -> Sub_Folder
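
The idea behind the workaround is the usual one: print a format ROOT can write in batch mode and convert it afterwards. A sketch (whether PrintAsGif() really goes through .eps and an external converter like ImageMagick's convert is my assumption, not something the macro documentation states):

#inside the macro, the canvas is printed to a batch-safe format (e.g. hist.eps);
#outside ROOT, that file is converted and the intermediate removed
convert hist.eps hist.gif   #ImageMagick eps-to-gif conversion (assumed tool)
rm hist.eps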

For xTalkMacro.C, for instance, the Test_Name is "Crosstalk", which contains all of the various runs. Each run has some top-level graphs, e.g. error flags, and then various folders, some of which have further sub-folders.

This complex directory structure is mimicked on the web: the entire structure is written out in the directory ******* which is linked to the histogram web page.

Web Site Details

The web site is driven by JavaScript. Big thanks to Khristian Khortov for helping me get started here. The html pages call a .js script, which comes out of the box from here. The main changes that have to be made are in tree_items.js, which the browser generator takes as input.

A perl script, CreateTree_Items.pl, parses the hierarchy in "Images" and puts it into the proper syntax to be read by the browser. Executing this script updates the web page: anything found in the subdirectories of "Images" should appear on the website as soon as CreateTree_Items.pl is run, without delay. NOTE: if there is a lag time using Mozilla, clear the cache. If it persists, you may have to set the browser not to retain anything in the cache.

Automatic Creation of .cfg files for running multiple jobs in succession

This is a helpful hack for running multiple cmsRun jobs in succession. Since the framework does not provide a way for a .cfg file to take an argument, this automatically generates the .cfg file. What follows are two versions of a perl script; either can be put inside a shell script for batch running.

Simple Version: Single Input File

To execute the following script, type:

>perl CreateConfigSimple.pl "myPath/my_run_name.raw"

#!/usr/local/bin/perl
 
#set your variable. "$ARGV[0]" is the first argument given to the script when executed, i.e. "myPath/my_run_name.raw"
$demoVar =
"process Demo =  {
        include \"FWCore/MessageLogger/data/MessageLogger.cfi\"
       source = PoolSource
      {
           untracked vstring fileNames = {\"$ARGV[0]\"}
      }
      path p = {demo}
}"; 

#print your new variable to the screen
print "$demoVar\n";

#open demo.cfg for writing (the ">" is important)
open(CONFIGFILE, ">demo.cfg");
#overwrite it with your new variable
print CONFIGFILE "$demoVar";
#close it
close(CONFIGFILE);

This is relatively straightforward. To make it useful in a loop, you can do something like this:

#!/bin/bash

nsls /My/Castor/Path/To/Run/Directory/ > Runs.txt
grep -e "MyRunTypeName" Runs.txt > MyRunType.txt
cat MyRunType.txt | while read line
do
  rfcp "/My/Castor/Path/To/Run/Directory/$line" "/tmp/MyUserName/$line"
   #call the script shown above
  perl CreateConfigSimple.pl "/tmp/MyUserName/$line"
  cmsRun demo.cfg
done
 
#clean out the .txt files
rm Runs.txt
rm MyRunType.txt

Complicated Version: Multiple Input Files

The principle is the same as in the simple version, but the inner workings are messier. The shell script used here will depend very much on how your runs are organized. What I show here is a script that takes every run from a given castor directory and inputs it into a single .cfg file. This can be used, for instance, if a given run is in one folder but has many data files.

#!/usr/local/bin/perl

#take MyRunType.txt as input, read it into an array (@runs), and close the file.
open(RUNFILE, "MyRunType.txt");
@runs = readline(RUNFILE);
close(RUNFILE);
 
#loop over each member of the array, put them into a single variable
foreach $run (@runs){
#the members have trailing blank space; cut off the extra (blank) characters by keeping only the first 40.
$runName = substr($run,0,40); 
$runName = "/tmp/MyUserName/" . $runName;
$list = "$list" . "\"" . "$runName" . "\",";
}

#cut off the last comma from the run list.
$list = substr($list,0,-1);
 
$demoVar =
 "process Demo =  {
        include \"FWCore/MessageLogger/data/MessageLogger.cfi\"
       source = PoolSource
      {
           untracked vstring fileNames = {$list}
      }
      path p = {demo}
}"; 

print "$demoVar\n";
 
#open demo.cfg for writing (the ">" is important)
open(CONFIGFILE, ">demo.cfg");
#overwrite it with your new variable
print CONFIGFILE "$demoVar";
#close it
close(CONFIGFILE);

In order to use this script, your shell script will also have to be a bit more complicated:

#!/bin/bash

nsls /My/Castor/Path/To/Run/Directory/ > Runs.txt
grep -e "MyRunTypeName" Runs.txt > MyRunType.txt
  
#loop over all runs that you have grepped for.
cat MyRunType.txt | while read line
do
     rfcp "/My/Castor/Path/To/Run/Directory/$line" "/tmp/MyUserName/$line"
done
#Create your .cfg file. Notice that this does not explicitly take any arguments
perl CreateConfigComplicated.pl;
cmsRun demo.cfg; 
 
rm Runs.txt
rm MyRunType.txt

Topic attachments:

- CSCRecHit2DProducer_MTCC.cfg (cfg, 4.3 K, 2007-08-10, AdamRoe)
- CSCRecHit2DProducer_MTCC_MC.cfg (cfg, 3.3 K, 2007-08-10, AdamRoe)
- CompRecDists.ps (ps, 16.0 K, 2007-08-12, AdamRoe)
- ConfigStream.pl.txt (txt, 0.6 K, 2007-08-10, AdamRoe)
- EvtGen+DetSim+Digi+CscLCTs.cfg (cfg, 4.9 K, 2007-08-10, AdamRoe)
- RecHitConfig.pl.txt (txt, 3.5 K, 2007-08-10, AdamRoe)
- RecHitConfigMC.pl.txt (txt, 3.6 K, 2007-08-10, AdamRoe)
- RecHitMTCCDataAuto_4386.cfg (cfg, 4.7 K, 2007-08-16, AdamRoe)
- RecHitMTCCMCAuto.cfg (cfg, 3.9 K, 2007-08-15, AdamRoe)
- RunRecHit.sh.txt (txt, 0.2 K, 2007-08-10, AdamRoe)
- RunStream.sh.txt (txt, 0.3 K, 2007-08-10, AdamRoe)
- index.html (html, 0.3 K, 2006-09-04, AdamRoe)
- recYME31.pdf (pdf, 5.4 K, 2007-08-16, AdamRoe)
- recYME31_un.pdf (pdf, 5.2 K, 2007-08-16, AdamRoe)
- recYME31_un.ps (ps, 18.9 K, 2007-08-16, AdamRoe)