RStDScGd18NovAthenaGen

[lxplus]lcg-infosites --vo atlas se
 
**************************************************************
These are the related data for atlas: (in terms of SE)
**************************************************************
 
Avail Space(Kb) Used Space(Kb)  Type    SEs
----------------------------------------------------------
1               1               disk    se2.itep.ru
1094713344      0               disk    se3.itep.ru
2641463808      3214702080      disk    lcgce.ijs.si
1445562064      14675672        disk    seitep.itep.ru
1000000000000   500000000000    disk    castorsrm.pic.es
340884448       389841440       mss     se01.lip.pt
1000000000000   500000000000    disk    castorgrid.pic.es
53431688        197728          disk    g03n05.pdc.kth.se
15655816        130364620       disk    se00.inta.es
1000000000000   500000000000    disk    castorsrmsc.pic.es
751320000       56010000        disk    se.grid.kiae.ru
145365976       654460          rfio    grid-se2.desy.de
999999999       111111111       disk    grid05.lal.in2p3.fr
1401004360      59196764        disk    grid100.kfki.hu
45144312        14157440        mss     grid11.lal.in2p3.fr
60687336        9444760         disk    ingvar-se.nsc.liu.se
1470000000      80000000        disk    lxfs04.jinr.ru
1               1               mss     lxn1180.cern.ch
3417160011      26349401                globe-door.ifh.de
1               1                       sc4.triumf.ca
1552290056      9353848                 se.polgrid.pl
20487252        14056728                se001.imbm.bas.bg
2181115232      4061786784              teras.sara.nl
437180816       854036                  fornax-se.itwm.fhg.de
1               1                       pc55.hep.ucl.ac.uk
100000000000    50000000000             sc.cr.cnaf.infn.it
894837016       178872040               se-a.ccc.ucl.ac.uk
34834548        36183604                se001.grid.bas.bg
2181115232      4061786784              teras.sara.nl
9024895232      105120640               cluster.pnpi.nw.ru
95042800        14473928                mu2.matrix.sara.nl
13568284        4607784                 node05.datagrid.cea.fr
435910604       53452                   pcncp22.ncp.edu.pk
107235524       2281204                 se-01.cs.tau.ac.il
103051820       5497004                 se-egee.bifi.unizar.es
25898424        8540804                 se01.grid.acad.bg
2044936288      878233120               se1.pp.rhul.ac.uk
902568052       13470928                filippos.it.uom.gr
1196651008      559962484               grid01.uibk.ac.at
355107136       377293660               gridba6.ba.infn.it
342780536       112588                  ise.prd.hp.com
95042800        14473928                mu2.matrix.sara.nl
37639720        12161300                se1.egee.fr.cgg.com
1413313224      487999192               dgse0.icepp.jp
1000000000000   500000000000            castorsrm.ific.uv.es
1206878228      897486540               cclcgseli01.in2p3.fr
99999999999999  0                       cclcgseli02.in2p3.fr
520547720       57979308                clrlcgse02.in2p3.fr
1               1                       cmsstore.pi.infn.it
14769440        19774540                grid2.fe.infn.it
355784764       324215236               lcgse01.triumf.ca
71753320        380381972               lcgse02.triumf.ca
10283103216     454306784               wormhole.westgrid.ca
1               1                       SE.pakgrid.org.pk
1000000000000   500000000000            castorgrid.ific.uv.es
1630000000      130000000               epgse1.ph.bham.ac.uk
1               1                       grid006.mi.infn.it
66348199        760665                  grid008.to.infn.it
436949960       1655298152              grid009.to.infn.it
671646592       15395112                se101.grid.ucy.ac.cy
888073176       27981160                se43.hep.ntua.gr
1000000000000   500000000000            castorgrid.cern.ch
177902668       46559496                gridstore.cs.tcd.ie
1469615314      9286677294              dcache.gridpp.rl.ac.uk
573543292       183603960               grid002.fi.infn.it
1336036352      473301312               grid007g.cnaf.infn.it
905938312       57197080                lcg-se.its.uiowa.edu
492109100       9844140                 node004.grid.auth.gr
94596344        17570848                se.phy.bg.ac.yu
23404108        88769848                se.ui.savba.sk
204406684       349552                  se.ulakbim.gov.tr
1               1                       t2-se-01.mi.infn.it
794701844       1310828796              t2-se-03.lnl.infn.it
41207828        33569652                boalice1.bo.infn.it
464303920       27524116                grid003.ca.infn.it
60552           708991072               gridit002.pd.infn.it
1464576664      157220796               lcgse0.shef.ac.uk
478325328       1422987096              tbn15.nikhef.nl
0               0                       tbn18.nikhef.nl
666205184       64764928                clrlcgse01.in2p3.fr
974070788       6255996                 ibelieve-i.hpc2n.umu.se
1152547512      173865280               atlasse.lnf.infn.it
100000000000    50000000000             castorsrm.cr.cnaf.infn.it
426856960       11213656                dgc-grid-34.brunel.ac.uk
1915461012      513148                  gallows.dur.scotgrid.ac.uk
159135224       753466480               griditse01.na.infn.it
5748988         3819432                 hepgrid3.ph.liv.ac.uk
2576660         2321772                 lcg-se01.usatlas.bnl.gov
366187792       122093472               pccmsgrid09.pi.infn.it
25484952        9059028                 pccmsgrid11.pi.infn.it
4252048096      142130464               prod-se-01.pd.infn.it
4325009920      69168640                prod-se-02.pd.infn.it
99628684        7885584                 se.keldysh.ru
16760000000     1430000000              se01.esc.qmul.ac.uk
1484495808      1456686496              t2-se-01.roma1.infn.it
10000           5000                    castorgrid.cr.cnaf.infn.it
10255884        20527324                ce1.egee.unile.it
9914004         4781404                 grid02.cslab.openu.ac.il
669909100       619468572               se-iep-grid.saske.sk
3954136         10394828                se.cc.ncu.edu.tw
4161843200      980740096               se01.isabella.grnet.gr
209475696       1937851792              zeus03.cyf-kr.edu.pl
292711136       18232984                gw38.hep.ph.ic.ac.uk
3513132729      1694023                 se.bfg.uni-freiburg.de
1251660632      572315432               se01-lcg.projects.cscs.ch
528434088       1345721296              se1-gla.scotgrid.ac.uk
2490000000      140000000               se2-gla.scotgrid.ac.uk
14495487000     27624                   dcache-tape.gridpp.rl.ac.uk
2453240000      75280000                dpm.epcc.ed.ac.uk
1090849800      905713944               fal-pygrid-03.lancs.ac.uk
61271647501     114088691               fal-pygrid-20.lancs.ac.uk
8038969033      1511735                 gfe02.hep.ph.ic.ac.uk
3037391328      769848864               golias26.farm.particle.cz
3974800         4025264                 grid002.ics.forth.gr
676620396       788432660               lcgrid.dnp.fmph.uniba.sk
329153857       981566143               lcgse01.nic.ualberta.ca
3000000000      2100000000              pccms2.cmsfarm1.ba.infn.it
76100000        200000                  pccms5.cmsfarm1.ba.infn.it
56969228        9483892                 spaci02.na.infn.it
15538859755     12996038933             srm.epcc.ed.ac.uk
156002952       12689936                wipp-se.weizmann.ac.il
1               1                       grid-cert-03.roma1.infn.it
0               0                       lcg13.sinp.msu.ru
172610000       10770000                lcgse01.phy.bris.ac.uk
3037391328      769848864               skurut18.cesnet.cz
792621984       135918700               se001.m45.ihep.su
689604316       451152228               lcg-se.lps.umontreal.ca
40453676        22077352                melon035.ngpp.ngp.org.sg
1500000000      80000000                t2se01.physics.ox.ac.uk
1203781944      402350076               t2se02.physics.ox.ac.uk
1               1                       castorsc.grid.sinica.edu.tw
159135224       753466480               griditse01.na.infn.it
103525204       4977836                 lcg00123.grid.sinica.edu.tw
59869860        6054536                 se-lcg.sdg.ac.cn
3403796         22399272                se01.novsu.ac.ru
0               0                       serv02.hep.phy.cam.ac.uk
13568284        4607784                 node05.datagrid.cea.fr
1               1                       node12.datagrid.cea.fr
105747072       3238176                 castor.grid.sinica.edu.tw
433241576       1007036316              marseillese01.mrs.grid.cnrs.fr
123003543245    345345345               srm-dcache.desy.de
1546959616      8960                    storage3.bluesmoke.nsc.liu.se
54612389189     550048443               cmssrm.fnal.gov
790542552       34988012                grid-se.physik.uni-wuppertal.de
956670312       648465536               bigmac-lcg-se.physics.utoronto.ca
58784396        633298888               testbed003.phys.sinica.edu.tw
1000000000000   500000000000            castorsrm.cern.ch
5889419269      754358267               grid002.ft.uam.es
569930000       212940000               se002.m45.ihep.su
17534572        16078524                iuatlas.physics.indiana.edu
1000000000000   500000000000            castorgridsc.cern.ch
[lxplus]
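When picking a destination SE, the listing above can be filtered and ranked directly in the shell. A minimal sketch (rank_ses is an illustrative helper name, run here on a few sample rows copied from the table rather than on live lcg-infosites output):

```shell
# Rank storage elements by available space (column 1), keeping only rows
# that carry an explicit "disk" type; print "avail SE" for the top entries.
rank_ses() {
  awk 'NF == 4 && $3 == "disk" { print $1, $4 }' | sort -rn | head -5
}

# Sample rows from the listing above, in place of a live query:
rank_ses <<'EOF'
1094713344      0               disk    se3.itep.ru
2641463808      3214702080      disk    lcgce.ijs.si
340884448       389841440       mss     se01.lip.pt
53431688        197728          disk    g03n05.pdc.kth.se
EOF
# → 2641463808 lcgce.ijs.si   (largest available space first)
```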
[lxplus]edg-job-submit -vo atlas -o jid1 genscotgrid.jdl
 
Selected Virtual Organisation name (from --vo option): atlas
Connecting to host gdrb01.cern.ch, port 7772
Logging to host gdrb01.cern.ch, port 9002
 
================================ edg-job-submit Success =====================================
 The job has been successfully submitted to the Network Server.
 Use edg-job-status command to check job current status. Your job identifier (edg_jobId) is:
 
 - https://gdrb01.cern.ch:9000/SK7Sar3RRKTeHFoMhkS5Fw
 
 The edg_jobId has been saved in the following file:
 /afs/cern.ch/user/s/stdenis/testarea/10.0.1/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-02-02/share/jid1
=============================================================================================
 
[lxplus]
[lxplus]cat genscotgrid.jdl
############# Athena #################
Executable = "athena_gen.sh";
StdOutput = "athena_gen.out";
StdError = "athena_gen.err";
InputSandbox = {"athena_gen.sh","jobOptions.pythia.vbf.py"};
OutputSandbox = {"athena_gen1.out","athena_gen1.err", "CLIDDBout.txt"};
Requirements = Member("VO-atlas-release-10.0.1", other.GlueHostApplicationSoftwareRunTimeEnvironment);
Requirements = other.GlueCEUniqueID=="ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque";
######################################
[lxplus]
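One caveat with genscotgrid.jdl as shown: JDL is a classad, so assigning Requirements twice does not AND the two conditions — only one assignment can take effect (in practice the duplicate is overridden or flagged by the submit tool), which means the VO-atlas-release check is likely being dropped. To enforce both the release tag and the target CE, they would need to be combined in a single expression, along these lines:

```
Requirements = Member("VO-atlas-release-10.0.1",
                      other.GlueHostApplicationSoftwareRunTimeEnvironment)
            && other.GlueCEUniqueID == "ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque";
```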
[lxplus]cat athena_gen.sh
#!/bin/bash
source $VO_ATLAS_SW_DIR/software/10.0.1/setup.sh
source $SITEROOT/dist/10.0.1/Control/AthenaRunTime/*/cmt/setup.sh
cp $SITEROOT/dist/10.0.1/InstallArea/share/PDGTABLE.MeV .
# Run the job:
athena.py jobOptions.pythia.vbf.py
ls -l
hostname
lcg-cr -v --vo atlas -d dcache.gridpp.rl.ac.uk
   -l lfn:stdenis_vbf_001.pool.root file://`pwd`/pythia.pool.root
[lxplus]

[lxplus]cat jobOptions.pythia.vbf.py
###############################################################
#
# Job options file
#
#==============================================================
#--------------------------------------------------------------
# General Application Configuration options
#--------------------------------------------------------------
theApp.setup( MONTECARLO )
 
include( "PartPropSvc/PartPropSvc.py" )
 
#--------------------------------------------------------------
# Private Application Configuration options
#--------------------------------------------------------------
theApp.Dlls  += [ "TruthExamples", "Pythia_i" ]
theApp.TopAlg = ["Pythia","DumpMC"]
theApp.ExtSvc += ["AtRndmGenSvc"]
# Set output level threshold (2=DEBUG, 3=INFO, 4=WARNING, 5=ERROR, 6=FATAL )
MessageSvc = Service( "MessageSvc" )
MessageSvc.OutputLevel               = 2
#--------------------------------------------------------------
# Event related parameters
#--------------------------------------------------------------
# Number of events to be processed (default is 10)
theApp.EvtMax = 5
#--------------------------------------------------------------
# Algorithms Private Options
#--------------------------------------------------------------
AtRndmGenSvc = Service( "AtRndmGenSvc" )
AtRndmGenSvc.Seeds = ["PYTHIA 4789899 989240512", "PYTHIA_INIT 820021 2347532"]
# AtRndmGenSvc.ReadFromFile = true;
Pythia = Algorithm( "Pythia" )
Pythia.PythiaCommand = ["pysubs msel 0","pysubs msub 124 1",
                        "pydat2 pmas 25 1 160",
                        "pypars mstp 61 2",
                        "pypars mstp 71 1",
                        "pypars mstp 81 0",
                        "pypars mstp 111 1",
                        "pydat3 mdme 190 1 0",
                        "pydat3 mdme 191 1 0",
                        "pydat3 mdme 192 1 0",
                        "pydat3 mdme 194 1 0",
                        "pydat3 mdme 195 1 0",
                        "pydat3 mdme 196 1 0",
                        "pydat3 mdme 198 1 0",
                        "pydat3 mdme 199 1 0",
                        "pydat3 mdme 200 1 0",
                        "pydat3 mdme 206 1 2",
                        "pydat3 mdme 207 1 3",
                        "pydat3 mdme 208 1 0",
                        "pydat3 mdme 210 1 0",
                        "pydat3 mdme 211 1 0",
                        "pydat3 mdme 212 1 0",
                        "pydat3 mdme 213 1 0",
                        "pydat3 mdme 214 1 0",
                        "pydat3 mdme 215 1 0",
                        "pydat3 mdme 218 1 0",
                        "pydat3 mdme 219 1 0",
                        "pydat3 mdme 220 1 0",
                        "pydat3 mdme 222 1 0",
                        "pydat3 mdme 223 1 0",
                        "pydat3 mdme 224 1 0",
                        "pydat3 mdme 225 1 0",
                        "pydat3 mdme 226 1 1",
                        "pydat3 mdme 174 1 0",
                        "pydat3 mdme 175 1 0",
                        "pydat3 mdme 176 1 0",
                        "pydat3 mdme 177 1 0",
                        "pydat3 mdme 178 1 0",
                        "pydat3 mdme 179 1 0",
                        "pydat3 mdme 182 1 0",
                        "pydat3 mdme 183 1 0",
                        "pydat3 mdme 184 1 0",
                        "pydat3 mdme 185 1 0",
                        "pydat3 mdme 186 1 0",
                        "pydat3 mdme 187 1 0"
                        ]
#---------------------------------------------------------------
# Ntuple service output
#---------------------------------------------------------------
#==============================================================
#
# End of job options file
#
###############################################################
[lxplus]


[lxplus]edg-job-status `cat jid1 | grep http`
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/SK7Sar3RRKTeHFoMhkS5Fw
Current Status:     Scheduled
Status Reason:      Job successfully submitted to Globus
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 18:41:30 2005
*************************************************************
 
[lxplus]edg-job-status `cat jid1 | grep http`
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/SK7Sar3RRKTeHFoMhkS5Fw
Current Status:     Done (Success)
Exit code:          127
Status Reason:      There were some warnings: some file(s) listed in the outputsandbox were not available and were ignored
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 18:48:08 2005
*************************************************************
 
[lxplus]edg-job-get-output -dir . https://gdrb01.cern.ch:9000/SK7Sar3RRKTeHFoMhkS5Fw
 
Retrieving files from host: gdrb01.cern.ch ( for https://gdrb01.cern.ch:9000/SK7Sar3RRKTeHFoMhkS5Fw )
 
*********************************************************************************
                        JOB GET OUTPUT OUTCOME
 
 Output sandbox files for the job:
 - https://gdrb01.cern.ch:9000/SK7Sar3RRKTeHFoMhkS5Fw
 have been successfully retrieved and stored in the directory:
 /afs/cern.ch/user/s/stdenis/testarea/10.0.1/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-02-02/share/stdenis_SK7Sar3RRKTeHFoMhkS5Fw
 
*********************************************************************************
 
[lxplus]ls stdenis_SK7Sar3RRKTeHFoMhkS5Fw
CLIDDBout.txt
[lxplus]cat stdenis_SK7Sar3RRKTeHFoMhkS5Fw/CLIDDBout.txt
2101 EventInfo
2102 PileUpEventInfo
2221 Gen_HEPEVT
2411 IdDictManager
6166 Atlas_HEPEVT
133273 McEventCollection
61780915 TagInfo
77641104 SG::Folder
164875623 AtlasDetectorID
220174395 MergedEventInfo
222376821 DataHeader
[lxplus]


[lxplus]/afs/cern.ch/atlas/offline/external/DQClient/dms3/dms3.py search "*stdenis*"
Using home grid:  ['http://atlfarm009.mi.infn.it:11122/']
Using all grids:  ['http://atlfarm009.mi.infn.it:11121/']
No matching LPN on home grid. Will try remote grids...
No matching LPN.
[lxplus]


Tried to use a different name in the JDL for the output files, which failed. Fix this: redirect the job's stdout and stderr explicitly to one file while it runs, and let the wrapper's own stdout/stderr go to another file.
[lxplus]edg-job-submit -vo atlas -o jid1 genscotgrid.jdl
 
Selected Virtual Organisation name (from --vo option): atlas
Connecting to host gdrb01.cern.ch, port 7772
Logging to host gdrb01.cern.ch, port 9002
 
================================ edg-job-submit Success =====================================
 The job has been successfully submitted to the Network Server.
 Use edg-job-status command to check job current status. Your job identifier (edg_jobId) is:
 
 - https://gdrb01.cern.ch:9000/9_WgNfZvQ9bxPppHkhRG7g
 
 The edg_jobId has been saved in the following file:
 /afs/cern.ch/user/s/stdenis/testarea/10.0.1/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-02-02/share/jid1
=============================================================================================
 
[lxplus]

[lxplus]edg-job-status https://gdrb01.cern.ch:9000/9_WgNfZvQ9bxPppHkhRG7g
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/9_WgNfZvQ9bxPppHkhRG7g
Current Status:     Scheduled
Status Reason:      Job successfully submitted to Globus
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 19:03:32 2005
*************************************************************
 *************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/9_WgNfZvQ9bxPppHkhRG7g
Current Status:     Done (Success)
Exit code:          127
Status Reason:      Job terminated successfully
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 19:15:19 2005
*************************************************************
export JOB=https://gdrb01.cern.ch:9000/9_WgNfZvQ9bxPppHkhRG7g

[lxplus]edg-job-get-output -dir . $JOB                                                                                
Retrieving files from host: gdrb01.cern.ch ( for https://gdrb01.cern.ch:9000/9_WgNfZvQ9bxPppHkhRG7g )
 
*********************************************************************************
                        JOB GET OUTPUT OUTCOME
 
 Output sandbox files for the job:
 - https://gdrb01.cern.ch:9000/9_WgNfZvQ9bxPppHkhRG7g
 have been successfully retrieved and stored in the directory:
 /afs/cern.ch/user/s/stdenis/testarea/10.0.1/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-02-02/share/stdenis_9_WgNfZvQ9bxPppHkhRG7g
 
*********************************************************************************
 
[lxplus]

[lxplus]ls stdenis_9_WgNfZvQ9bxPppHkhRG7g
athena_gen.err  athena_gen.out  athena_result.out  CLIDDBout.txt

[lxplus]more stdenis_9_WgNfZvQ9bxPppHkhRG7g/athena_gen.out
total 252
-rw-r--r--    1 atlas022 atlas           0 Nov 19 19:05 athena_gen.err
-rw-r--r--    1 atlas022 atlas           0 Nov 19 19:05 athena_gen.out
-rwxr-xr-x    1 atlas022 atlas         389 Nov 19 19:05 athena_gen.sh
-rw-r--r--    1 atlas022 atlas      199235 Nov 19 19:05 athena_result.out
-rw-r--r--    1 atlas022 atlas          54 Nov 19 19:05 AtRndmGenSvc.out
-rw-r--r--    1 atlas022 atlas         224 Nov 19 19:05 CLIDDBout.txt
-rw-r--r--    1 atlas022 atlas        3840 Nov 19 19:05 jobOptions.pythia.vbf.py
-rwxr-xr-x    1 atlas022 atlas       32957 Nov 19 19:05 PDGTABLE.MeV
node065
[lxplus]

[lxplus]more stdenis_9_WgNfZvQ9bxPppHkhRG7g/athena_gen.err
usage: lcg-cr [--config config_file] [-d dest_file | dest_host] [-g guid]
        [-h | --help] [-i | --insecure] [-l lfn] [-n nbstreams] [-P relative_path]
        [-t timeout] [-v | --verbose] [--vo vo] src_file
./athena_gen.sh: line 10: -l: command not found

* athena_result19nov051333.out: So there was an error in the usage of lcg-cr, but the job itself ran fine. The error is a missing \ (line continuation) at the end of the lcg-cr line in athena_gen.sh. Fixed; try again.
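The exit code 127 reported earlier by edg-job-status fits this diagnosis: 127 is the shell's "command not found" status. The failure mode reproduces locally without any grid setup (the file name and arguments below are just for the demo):

```shell
# Recreate the missing-backslash bug: without a trailing "\" the second
# line is parsed as a separate command, so the shell tries to run "-l".
cat > /tmp/demo_nocont.sh <<'EOF'
echo lcg-cr-stand-in --vo atlas -d some.se.example
   -l lfn:demo file:///tmp/demo.root
EOF
bash /tmp/demo_nocont.sh
echo "exit status: $?"   # → exit status: 127 (command not found)
```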
Sat Nov 19 20:38:20 CET 2005
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/q8IwwlhXLgSKoNoKzKL6mQ
Current Status:     Ready
Status Reason:      unavailable
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 19:37:53 2005
****
Sat Nov 19 20:39:21 CET 2005
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/q8IwwlhXLgSKoNoKzKL6mQ
Current Status:     Scheduled
Status Reason:      Job successfully submitted to Globus
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 19:38:23 2005
*************************************************************
 

Sat Nov 19 20:44:24 CET 2005
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/q8IwwlhXLgSKoNoKzKL6mQ
Current Status:     Running
Status Reason:      Job successfully submitted to Globus
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 19:43:41 2005
*************************************************************
 
Sat Nov 19 20:45:26 CET 2005
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/q8IwwlhXLgSKoNoKzKL6mQ
Current Status:     Done (Success)
Exit code:          1
Status Reason:      Job terminated successfully
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 19:44:59 2005
*************************************************************

[lxplus]edg-job-get-output -dir . $JOB                                                                                
Retrieving files from host: gdrb01.cern.ch ( for https://gdrb01.cern.ch:9000/q8IwwlhXLgSKoNoKzKL6mQ )
 
*********************************************************************************
                        JOB GET OUTPUT OUTCOME
 
 Output sandbox files for the job:
 - https://gdrb01.cern.ch:9000/q8IwwlhXLgSKoNoKzKL6mQ
 have been successfully retrieved and stored in the directory:
 /afs/cern.ch/user/s/stdenis/testarea/10.0.1/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-02-02/share/stdenis_q8IwwlhXLgSKoNoKzKL6mQ
 
*********************************************************************************
 
[lxplus]
[lxplus]
[lxplus]ls stdenis_q8IwwlhXLgSKoNoKzKL6mQ
athena_gen.err  athena_gen.out  athena_result.out  CLIDDBout.txt
[lxplus]

[lxplus]more stdenis_q8IwwlhXLgSKoNoKzKL6mQ/athena_gen.out
total 252
-rw-r--r--    1 atlas022 atlas           0 Nov 19 19:40 athena_gen.err
-rw-r--r--    1 atlas022 atlas           0 Nov 19 19:40 athena_gen.out
-rwxr-xr-x    1 atlas022 atlas         390 Nov 19 19:40 athena_gen.sh
-rw-r--r--    1 atlas022 atlas      199235 Nov 19 19:41 athena_result.out
-rw-r--r--    1 atlas022 atlas          54 Nov 19 19:41 AtRndmGenSvc.out
-rw-r--r--    1 atlas022 atlas         224 Nov 19 19:41 CLIDDBout.txt
-rw-r--r--    1 atlas022 atlas        3840 Nov 19 19:40 jobOptions.pythia.vbf.py
-rwxr-xr-x    1 atlas022 atlas       32957 Nov 19 19:41 PDGTABLE.MeV
node065
Using grid catalog type: edg
[lxplus]more stdenis_q8IwwlhXLgSKoNoKzKL6mQ/athena_gen.err
lcg_cr: No such file or directory
[lxplus]


This time the job ran, but lcg-cr failed with "No such file or directory": no pythia.pool.root was produced, because the job options file never writes a POOL output stream. That needs a different pythia options file.
[lxplus]cat genscotgrid.jdl
############# Athena #################
Executable = "athena_gen.sh";
StdOutput = "athena_gen.out";
StdError = "athena_gen.err";
InputSandbox = {"athena_gen.sh","jobOptions.pythia.vbf-out.py"};
OutputSandbox = {"athena_gen.out","athena_gen.err","athena_result.out", "CLIDDBout.txt"};
Requirements = Member("VO-atlas-release-10.0.1", other.GlueHostApplicationSoftwareRunTimeEnvironment);
Requirements = other.GlueCEUniqueID=="ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque";
######################################
[lxplus]cat athena_gen.sh
#!/bin/bash
source $VO_ATLAS_SW_DIR/software/10.0.1/setup.sh
source $SITEROOT/dist/10.0.1/Control/AthenaRunTime/*/cmt/setup.sh
cp $SITEROOT/dist/10.0.1/InstallArea/share/PDGTABLE.MeV .
# Run the job:
athena.py jobOptions.pythia.vbf-out.py > athena_result.out 2>&1
ls -l
hostname
lcg-cr -v --vo atlas -d dcache.gridpp.rl.ac.uk \
   -l lfn:stdenis_vbf_001.pool.root file://`pwd`/pythia.pool.root
[lxplus]cat jobOptions.pythia.vbf-out.py
###############################################################
#
# Job options file
#
#==============================================================
#--------------------------------------------------------------
# General Application Configuration options
#--------------------------------------------------------------
theApp.setup( MONTECARLO )
 
include( "PartPropSvc/PartPropSvc.py" )
 
# Add POOL persistency
 
include( "AthenaPoolCnvSvc/WriteAthenaPool_jobOptions.py" )
include( "GeneratorObjectsAthenaPool/GeneratorObjectsAthenaPool_joboptions.py" )
 
# 2101 = EventInfo
# 133273 = MCTruth (HepMC)
Stream1.ItemList += [ "2101#*", "133273#*" ]
include("AthenaSealSvc/AthenaSealSvc_joboptions.py" )
AthenaSealSvc.CheckDictionary = TRUE
Stream1.OutputFile = "vbf.pool.root"
 
 
#--------------------------------------------------------------
# Private Application Configuration options
#--------------------------------------------------------------
theApp.Dlls  += [ "TruthExamples", "Pythia_i" ]
theApp.TopAlg = ["Pythia","DumpMC"]
theApp.ExtSvc += ["AtRndmGenSvc"]
# Set output level threshold (2=DEBUG, 3=INFO, 4=WARNING, 5=ERROR, 6=FATAL )
MessageSvc = Service( "MessageSvc" )
MessageSvc.OutputLevel               = 4
#--------------------------------------------------------------
# Event related parameters
#--------------------------------------------------------------
# Number of events to be processed (default is 10)
theApp.EvtMax = 5
# Set run number (default 0 causes problems)
 
EventSelector = Service("EventSelector")
EventSelector.RunNumber = 1337
#
#--------------------------------------------------------------
# Algorithms Private Options
#--------------------------------------------------------------
AtRndmGenSvc = Service( "AtRndmGenSvc" )
AtRndmGenSvc.Seeds = ["PYTHIA 4789899 989240512", "PYTHIA_INIT 820021 2347532"]
# AtRndmGenSvc.ReadFromFile = true;
Pythia = Algorithm( "Pythia" )
Pythia.PythiaCommand = ["pysubs msel 0","pysubs msub 124 1",
                        "pydat2 pmas 25 1 160",
                        "pypars mstp 61 2",
                        "pypars mstp 71 1",
                        "pypars mstp 81 0",
                        "pypars mstp 111 1",
                        "pydat3 mdme 190 1 0",
                        "pydat3 mdme 191 1 0",
                        "pydat3 mdme 192 1 0",
                        "pydat3 mdme 194 1 0",
                        "pydat3 mdme 195 1 0",
                        "pydat3 mdme 196 1 0",
                        "pydat3 mdme 198 1 0",
                        "pydat3 mdme 199 1 0",
                        "pydat3 mdme 200 1 0",
                        "pydat3 mdme 206 1 2",
                        "pydat3 mdme 207 1 3",
                        "pydat3 mdme 208 1 0",
                        "pydat3 mdme 210 1 0",
                        "pydat3 mdme 211 1 0",
                        "pydat3 mdme 212 1 0",
                        "pydat3 mdme 213 1 0",
                        "pydat3 mdme 214 1 0",
                        "pydat3 mdme 215 1 0",
                        "pydat3 mdme 218 1 0",
                        "pydat3 mdme 219 1 0",
                        "pydat3 mdme 220 1 0",
                        "pydat3 mdme 222 1 0",
                        "pydat3 mdme 223 1 0",
                        "pydat3 mdme 224 1 0",
                        "pydat3 mdme 225 1 0",
                        "pydat3 mdme 226 1 1",
                        "pydat3 mdme 174 1 0",
                        "pydat3 mdme 175 1 0",
                        "pydat3 mdme 176 1 0",
                        "pydat3 mdme 177 1 0",
                        "pydat3 mdme 178 1 0",
                        "pydat3 mdme 179 1 0",
                        "pydat3 mdme 182 1 0",
                        "pydat3 mdme 183 1 0",
                        "pydat3 mdme 184 1 0",
                        "pydat3 mdme 185 1 0",
                        "pydat3 mdme 186 1 0",
                        "pydat3 mdme 187 1 0"
                        ]
#---------------------------------------------------------------
# Ntuple service output
#---------------------------------------------------------------
#==============================================================
#
# End of job options file
#
###############################################################
[lxplus]

[lxplus]edg-job-submit -vo atlas -o jid1 genscotgrid.jdl
 
Selected Virtual Organisation name (from --vo option): atlas
Connecting to host gdrb03.cern.ch, port 7772
Logging to host gdrb03.cern.ch, port 9002
 
================================ edg-job-submit Success =====================================
 The job has been successfully submitted to the Network Server.
 Use edg-job-status command to check job current status. Your job identifier (edg_jobId) is:
 
 - https://gdrb03.cern.ch:9000/D4ownFh81uH4DuQJckBpkQ
 
 The edg_jobId has been saved in the following file:
 /afs/cern.ch/user/s/stdenis/testarea/10.0.1/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-02-02/share/jid1
=============================================================================================

[lxplus]cat jid1
###Submitted Job Ids###
https://gdrb01.cern.ch:9000/SK7Sar3RRKTeHFoMhkS5Fw
https://gdrb01.cern.ch:9000/9_WgNfZvQ9bxPppHkhRG7g
https://gdrb01.cern.ch:9000/q8IwwlhXLgSKoNoKzKL6mQ
https://gdrb03.cern.ch:9000/D4ownFh81uH4DuQJckBpkQ
[lxplus]export JOB=https://gdrb03.cern.ch:9000/D4ownFh81uH4DuQJckBpkQ
[lxplus]cat ckjob
#!/bin/sh
while [ 1 ]; do
  date
  edg-job-status $JOB
  sleep 60
done
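ckjob polls forever and has to be killed by hand. A variant that stops once the job reaches a terminal state could look like this (watch_job and the status_cmd indirection are illustrative, not part of the session above; the indirection only exists so the loop can be exercised without a grid connection):

```shell
# Poll a job's status until a terminal state is reported, then stop.
# By default this shells out to edg-job-status on $JOB, as ckjob does;
# a different status command can be passed in as $1 for testing.
watch_job() {
  status_cmd=${1:-"edg-job-status $JOB"}
  while true; do
    date
    out=$($status_cmd)
    echo "$out"
    case $out in
      *"Done"*|*"Aborted"*|*"Cancelled"*) break ;;
    esac
    sleep 60
  done
}
```

Used in place of ./ckjob: export JOB as before, then run watch_job with no arguments.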
 
[lxplus]./ckjob
Sat Nov 19 21:35:06 CET 2005
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb03.cern.ch:9000/D4ownFh81uH4DuQJckBpkQ
Current Status:     Scheduled
Status Reason:      Job successfully submitted to Globus
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 20:32:39 2005
*************************************************************
Sat Nov 19 21:37:07 CET 2005
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb03.cern.ch:9000/D4ownFh81uH4DuQJckBpkQ
Current Status:     Running
Status Reason:      Job successfully submitted to Globus
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 20:37:02 2005
*************************************************************
 Sat Nov 19 21:41:02 CET 2005
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb03.cern.ch:9000/D4ownFh81uH4DuQJckBpkQ
Current Status:     Done (Success)
Exit code:          1
Status Reason:      Job terminated successfully
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 20:40:21 2005
*************************************************************
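
Note that the bookkeeping reads Done (Success) while the exit code is 1: the scheduler reports that the wrapper ran to completion, not that every step inside it worked. To catch this in a script, the numeric code can be scraped from the status output; a hedged sketch (the helper name `get_exit_code` is mine):

```shell
# Extract the number from the "Exit code:" line of an edg-job-status
# report; prints nothing if the line is absent (e.g. job still running).
get_exit_code() {
  printf '%s\n' "$1" | sed -n 's/^Exit code:[[:space:]]*\([0-9][0-9]*\).*/\1/p'
}
```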
 [lxplus]edg-job-get-output -dir . $JOB                                                                                      
Retrieving files from host: gdrb03.cern.ch ( for https://gdrb03.cern.ch:9000/D4ownFh81uH4DuQJckBpkQ )
 
*********************************************************************************
                        JOB GET OUTPUT OUTCOME
 
 Output sandbox files for the job:
 - https://gdrb03.cern.ch:9000/D4ownFh81uH4DuQJckBpkQ
 have been successfully retrieved and stored in the directory:
 /afs/cern.ch/user/s/stdenis/testarea/10.0.1/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-02-02/share/stdenis_D4ownFh81uH4DuQJckBpkQ
 
*********************************************************************************
 
[lxplus]


 [lxplus]ls stdenis_D4ownFh81uH4DuQJckBpkQ
athena_gen.err  athena_gen.out  athena_result.out  CLIDDBout.txt
[lxplus]more stdenis_D4ownFh81uH4DuQJckBpkQ/athena_gen.out
total 348
-rw-r--r--    1 atlas022 atlas           0 Nov 19 20:35 athena_gen.err
-rw-r--r--    1 atlas022 atlas           0 Nov 19 20:35 athena_gen.out
-rwxr-xr-x    1 atlas022 atlas         394 Nov 19 20:34 athena_gen.sh
-rw-r--r--    1 atlas022 atlas      178316 Nov 19 20:35 athena_result.out
-rw-r--r--    1 atlas022 atlas          54 Nov 19 20:35 AtRndmGenSvc.out
-rw-r--r--    1 atlas022 atlas         224 Nov 19 20:35 CLIDDBout.txt
-rw-r--r--    1 atlas022 atlas        4347 Nov 19 20:34 jobOptions.pythia.vbf-out.py
-rwxr-xr-x    1 atlas022 atlas       32957 Nov 19 20:35 PDGTABLE.MeV
-rw-r--r--    1 atlas022 atlas         325 Nov 19 20:35 PoolFileCatalog.xml
-rw-r--r--    1 atlas022 atlas      103198 Nov 19 20:35 vbf.pool.root
node065

[lxplus]more stdenis_D4ownFh81uH4DuQJckBpkQ/athena_gen.err
lcg_cr: No such file or directory
[lxplus]




Ugh — the wrong file name was specified in the store step, which is why the `lcg_cr: No such file or directory` error appears above. The corrected wrapper script:

[lxplus]cat athena_gen.sh
#!/bin/bash
source $VO_ATLAS_SW_DIR/software/10.0.1/setup.sh
source $SITEROOT/dist/10.0.1/Control/AthenaRunTime/*/cmt/setup.sh
cp $SITEROOT/dist/10.0.1/InstallArea/share/PDGTABLE.MeV .
# Run the job:
athena.py jobOptions.pythia.vbf-out.py > athena_result.out 2>&1
ls -l
hostname
lcg-cr -v --vo atlas -d dcache.gridpp.rl.ac.uk \
   -l lfn:stdenis_vbf_001.pool.root file://`pwd`/vbf.pool.root
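
One fragility in the store step is that `lcg-cr` is handed a `file://` URL built with a bare `pwd`; if the Athena job failed to produce `vbf.pool.root`, the error only surfaces at copy time. A sketch that fails before the copy instead — the helper name `source_url` is hypothetical, not a grid UI command:

```shell
# Check the Athena output file exists and build an absolute file://
# source URL for lcg-cr from it; fail early if the file is missing.
source_url() {
  f=$1
  [ -f "$f" ] || { echo "missing output file: $f" >&2; return 1; }
  echo "file://$(cd "$(dirname "$f")" && pwd)/$(basename "$f")"
}

# In the wrapper this would replace the bare file://`pwd`/... argument:
# URL=$(source_url vbf.pool.root) || exit 1
# lcg-cr -v --vo atlas -d dcache.gridpp.rl.ac.uk \
#    -l lfn:stdenis_vbf_001.pool.root "$URL"
```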
[lxplus]
[lxplus]edg-job-submit -vo atlas -o jid1 genscotgrid.jdl
                                                                                     
Selected Virtual Organisation name (from --vo option): atlas
Connecting to host gdrb01.cern.ch, port 7772
Logging to host gdrb01.cern.ch, port 9002
 
================================ edg-job-submit Success =====================================
 The job has been successfully submitted to the Network Server.
 Use edg-job-status command to check job current status. Your job identifier (edg_jobId) is:
 
 - https://gdrb01.cern.ch:9000/gWbamlt2omASqUdgkTHL1w
 
 The edg_jobId has been saved in the following file:
 /afs/cern.ch/user/s/stdenis/testarea/10.0.1/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-02-02/share/jid1
=============================================================================================

[lxplus]export JOB=https://gdrb01.cern.ch:9000/gWbamlt2omASqUdgkTHL1w
[lxplus]./ckjob
Sat Nov 19 21:47:16 CET 2005
 
 
*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/gWbamlt2omASqUdgkTHL1w
Current Status:     Scheduled
Status Reason:      Job successfully submitted to Globus
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 20:46:57 2005
*************************************************************


*************************************************************
BOOKKEEPING INFORMATION:
 
Status info for the Job : https://gdrb01.cern.ch:9000/gWbamlt2omASqUdgkTHL1w
Current Status:     Done (Success)
Exit code:          0
Status Reason:      Job terminated successfully
Destination:        ce1-gla.scotgrid.ac.uk:2119/jobmanager-lcgpbs-dque
reached on:         Sat Nov 19 21:01:08 2005
*************************************************************
 
 
[lxplus]edg-job-get-output -dir . $JOB
 
Retrieving files from host: gdrb01.cern.ch ( for https://gdrb01.cern.ch:9000/gWbamlt2omASqUdgkTHL1w )
 
*********************************************************************************
                        JOB GET OUTPUT OUTCOME
 
 Output sandbox files for the job:
 - https://gdrb01.cern.ch:9000/gWbamlt2omASqUdgkTHL1w
 have been successfully retrieved and stored in the directory:
 /afs/cern.ch/user/s/stdenis/testarea/10.0.1/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-02-02/share/stdenis_gWbamlt2omASqUdgkTHL1w
 
*********************************************************************************
 
[lxplus]ls stdenis_gWbamlt2omASqUdgkTHL1w
athena_gen.err  athena_gen.out  athena_result.out  CLIDDBout.txt
[lxplus]more stdenis_gWbamlt2omASqUdgkTHL1w
 
*** stdenis_gWbamlt2omASqUdgkTHL1w: directory ***
 
[lxplus]more stdenis_gWbamlt2omASqUdgkTHL1w/athena_gen.out
total 348
-rw-r--r--    1 atlas022 atlas           0 Nov 19 20:49 athena_gen.err
-rw-r--r--    1 atlas022 atlas           0 Nov 19 20:49 athena_gen.out
-rwxr-xr-x    1 atlas022 atlas         391 Nov 19 20:49 athena_gen.sh
-rw-r--r--    1 atlas022 atlas      178316 Nov 19 20:49 athena_result.out
-rw-r--r--    1 atlas022 atlas          54 Nov 19 20:49 AtRndmGenSvc.out
-rw-r--r--    1 atlas022 atlas         224 Nov 19 20:49 CLIDDBout.txt
-rw-r--r--    1 atlas022 atlas        4347 Nov 19 20:49 jobOptions.pythia.vbf-out.py
-rwxr-xr-x    1 atlas022 atlas       32957 Nov 19 20:49 PDGTABLE.MeV
-rw-r--r--    1 atlas022 atlas         325 Nov 19 20:49 PoolFileCatalog.xml
-rw-r--r--    1 atlas022 atlas      103174 Nov 19 20:49 vbf.pool.root
node065
Using grid catalog type: edg
Source URL: file:///tmp/WMS_node065_026909_https_3a_2f_2fgdrb01.cern.ch_3a9000_2fgWbamlt2omASqUdgkTHL1w/vbf.pool.root
File size: 103174
VO name: atlas
Destination specified: dcache.gridpp.rl.ac.uk
Destination URL for copy: gsiftp://gftp0447.gridpp.rl.ac.uk:2811//pnfs/gridpp.rl.ac.uk/data/atlas/generated/2005-11-19/filea2e8d45a-c1e1-4d70-b35a-a59339acda2c
# streams: 1
# set timeout to 0 seconds
Alias registered in Catalog: lfn:stdenis_vbf_001.pool.root
 
Transfer took 3160 ms
Destination URL registered in Catalog: srm://dcache.gridpp.rl.ac.uk/pnfs/gridpp.rl.ac.uk/data/atlas/generated/2005-11-19/filea2e8d45a-c1e1-4d70-b35a-a59339acda2c
guid:04b65d22-83df-4c4c-b90b-5ff3e72d0225
[lxplus]!/af
/afs/cern.ch/atlas/offline/external/DQClient/dms3/dms3.py search '*stdenis*'
Using home grid:  ['http://atlfarm009.mi.infn.it:11122/']
Using all grids:  ['http://atlfarm009.mi.infn.it:11121/']
 
LPN: /stdenis_vbf_001.pool.root
GUID: 04b65d22-83df-4c4c-b90b-5ff3e72d0225
File size: None
MD5SUM: None
Creation date: None
Replicas:
  srm://dcache.gridpp.rl.ac.uk/pnfs/gridpp.rl.ac.uk/data/atlas/generated/2005-11-19/filea2e8d45a-c1e1-4d70-b35a-a59339acda2c
[lxplus]
[lxplus]more stdenis_gWbamlt2omASqUdgkTHL1w/athena_gen.err
       103174 bytes     48.02 KB/sec avg     48.02 KB/sec inst
[lxplus]

The log for the generation is athena_result18nov051830.txt, so this worked.


Major updates:
-- RichardStDenis - 19 Nov 2005


Topic revision: r2 - 2005-11-20 - RichardStDenis
 