ATLAS Software Workshop Minutes

CERN, February 23 - 27, 1998

These minutes are still under construction; the final version is bound to differ.

Day Time    
Mon 09:00 Topic S. Fisher: StP tutorial
Background
References Slides: HTML
Summary
Tue 09:00 Topic J. Knobloch: Introduction, Agenda
Background
References N/A
Summary
  • Main topics: LCB workshop, software process, world-wide computing, data base, control domain and reconstruction, simulation, LHC++, planning
  • D. Candlin: Try to have the DIG report on Tuesday
  • L. Perini: It could be more useful to have dedicated working meetings on OO software in the first half of the week, with overview talks in the second half
  • Discussion will be held on Friday
Tue 09:20 Topic J. Knobloch: LCB workshop in Barcelona
Background
References N/A
Summary
  • Concentrate on software
  • Framework will be important topic
  • Invite "experts" (G. Pawlitzek - DLR, S. Bandinelli - ESI, J. Sventek - HP, J. Walton - NAG)
  • Aim: foster discussion about future software, in particular OO migration
  • Participants: ~ 15 each from IT, Alice, LHCB, ~ 20 each from Atlas, CMS
  • Session topics: Architecture
  • Session convener, some talks per session. The program is currently biased towards a conference format, with too many talks by some people under Alice's umbrella; this will be corrected this week (final agenda by the end of the week)
  • Try to have one talk about basic choices if they are the same across experiments
  • Update on the agenda on Friday this week
  • Discussion of possible Atlas contributions to the LCB workshop
Tue 09:50 Topic L. Tuura: Naming conventions
Background
References N/A
Summary
  • Namespaces: Don't use them for now; once they can be used, they must use the package names. In general there should be one level of namespace, although some domains (e.g. reconstruction) may want more. For now, use the (possibly abbreviated) package name as the initial part of the class name (see the sketch after this list)
  • The 'using' directive is forbidden; the 'using' declaration is permitted only in implementation files
  • Package names should follow domain names
  • Use Atlas or subdetector defined names
  • Class names: Acronyms should be all uppercase, with underscores for separation if necessary. Plain English words should be capitalised.
  • Names of all other entities: all lowercase, with underscores for separation
  • It would be desirable to have a general scheme for the naming of 'physical' quantities such as hits, digis, clusters etc.
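A minimal sketch of what these conventions might look like in practice; the package name 'Reco', the namespace 'atlas', and all identifiers below are invented for illustration and are not part of the proposal:

    // Illustration only: 'Reco' is a hypothetical package whose name forms
    // the initial part of the class name, since namespaces are not to be
    // used for now. The acronym 'TRT' is all uppercase, with an underscore
    // for separation; plain English words are capitalised.
    class RecoTRT_Cluster {
    public:
        // Names of all other entities: all lowercase, with underscores.
        double drift_radius() const { return m_drift_radius; }
        void   set_drift_radius(double radius) { m_drift_radius = radius; }
    private:
        double m_drift_radius; // the 'm_' prefix is an assumption, not in the proposal
    };

    // Once namespaces become usable (illustration only), in implementation files:
    // using atlas::RecoTRT_Cluster;  // 'using' declaration: permitted there
    // using namespace atlas;         // 'using' directive: forbidden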
Discussion
  • The proposal will be summarized and widely distributed, inviting comments. The final decision will be taken in the next workshop.
  • The information must be checked automatically as much as possible. Easy access through a Web page is required.
Tue 10:25 Topic J. Hrivnac: Graphics
Background
References N/A
Summary
  • HEPVis workshop: Java highlights: Wired, Java Analysis Studio, HEP-Java mailing list (Tony Johnson)
  • Analysis: LHC++, Root (many talks: Comparison to LHC++, Atlfast++, GH2Root, UI based on X95)
  • Atlantis: Well received; simple Atlas interface soon, full one: this summer
  • Geant4: difficult to use G4 visualisation packages outside G4
  • HEPVis library (HEP extensions to OpenInventor, OI); three free implementations of OI
  • Agreement on terminology is required
Discussion Scripting languages; discuss general user requirements on Wednesday morning, after Lassi Tuura's overview of the control domain
Wed 09:00 Topic P. Hendriks: Arve introduction
Background
References Slides: HTML, gzipped PS
Summary
  • Developed by Toby Burnett
  • Source code: /afs/cern.ch/atlas/software/dist/00-00-00/arve
  • More info: http://wwwinfo.cern.ch/b/burnett/www/arve/
  • Object oriented, C++
  • A framework, not a complete reconstruction program; algorithms can be plugged in
  • Targeted at developers of code for now
  • Runs on PC and Unix (gcc, aCC)
  • Domains: Subsystems/modules, control, graphics, geometry and detector description, simulation based on Gismo
  • Subsystems correspond to physical subdetectors; modules are actions such as combined reconstruction
  • Example: 'Slug' as a subsystem which inherits from SubSystem and implements the execute() and clear() methods (see the sketch after this list)
  • main() also needs to be implemented by the user
  • Control domain: control of console and graphics window, defines menus and commands; going to be replaced soon
  • Geometry: base classes, derived entities, geometrical operations
  • Arve includes a very old version of CLHEP
  • Graphics: follows the Atlas graphics standard; basic geometry entities are implemented, but display of tracks etc. still needs code
  • Example: Test-display, implements two boxes with their representations. 3D view comes for free with Arve
  • Detector description: Medium has detectors, materials, shapes, magnetic fields. CompositeMedium inherits from Medium and implements complex structures
  • Example: Ecal. The EcalBarrel constructor was discussed; it implements the barrel as a vacuum tube and four cylindrical detector layers
  • Hits are stored in a detector hierarchy
  • Example event with TRT and ECAL
  • Conclusion: Building an application is simple; the rest is up to the user.
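A minimal sketch of the subsystem pattern described above, assuming a base class SubSystem with virtual execute() and clear() methods; the actual Arve signatures and event loop may differ:

    // Illustrative only: the real Arve base-class interface may differ.
    class SubSystem {
    public:
        virtual ~SubSystem() {}
        virtual void execute() = 0; // called once per event
        virtual void clear()   = 0; // reset per-event state afterwards
    };

    // A user subsystem in the style of 'Slug':
    class MySubSystem : public SubSystem {
    public:
        virtual void execute() { /* run this subdetector's reconstruction */ }
        virtual void clear()   { /* drop transient per-event data */ }
    };

    // main() also has to be supplied by the user; this trivial event loop
    // is an assumption about how the framework drives its subsystems.
    int main() {
        MySubSystem sub;
        for (int event = 0; event < 10; ++event) {
            sub.execute();
            sub.clear();
        }
        return 0;
    }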
Discussion
  • CLHEP should be removed from the Arve code base; Arve should use the latest version of the CLHEP library instead
  • Is it worth implementing the full geometry for the Gismo simulation if we are going to read from Geant3 files soon? Do we get the geometry description from the Geant3 files?
  • RD: Event access will be available by April, geometry access via the events too
  • Helge: April framework prototype will be based on this version of Arve. Main areas of work: documentation, event access to Geant3 Zebra files
  • A port to aCC is required; Onne Peters has already done it successfully - get his changes in order to avoid duplication of work
Wed 10:25 Topic L. Tuura: Control domain issues
Background
References N/A
Summary
  • First version of component-based control by the end of this week
  • Documentation still in rudimentary shape
  • No graphical editor for networks yet, but a simple C++-like syntax
  • Integration with Arve is still missing, will start in March
  • Need to know which parts of Arve are really required by current Arve users
  • Component control is data driven (see the sketch after this list)
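A toy sketch of what 'data driven' could mean here: a component fires only once all of its named inputs exist in a common store. All names are invented for illustration; this is not the actual control-domain interface:

    #include <map>
    #include <string>

    // Invented illustration: a scheduler repeatedly offers the data store
    // to each component; a component runs only when its inputs are present.
    class Component {
    public:
        virtual ~Component() {}
        virtual bool run(std::map<std::string, double>& store) = 0;
    };

    class Summer : public Component {
    public:
        virtual bool run(std::map<std::string, double>& store) {
            if (!store.count("a") || !store.count("b"))
                return false;                    // inputs missing: do not fire
            store["sum"] = store["a"] + store["b"];
            return true;                         // output now available downstream
        }
    };

Note that a super-component could itself be a Component wrapping an internal network, which is the point made in the discussion below.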
Discussion
  • Why do we want this flexibility? Isn't there a danger that people build the wrong network? Networks can be wrapped into super-components, so the situation is not really different from today's
Wed 11:10 Topic M. Stavrianakou: Scripting language requirements
Background
References N/A
Summary
  • Wanted: Analysis scenarios, compare CERNLIB with LHC++
  • Test case: alignment (TRT testbeam straw tube alignment)
  • CERNLIB: interactively with PAW, Sigma, Comis; batch: PAW and kumacs, or in Fortran using HBOOK/HPLOT
  • LHC++: interactively: Iris Explorer: make and save a map by connecting existing modules (lots of clicking), or an Iris Explorer script (Scheme-like, not what physicists usually like), or encapsulate HEP C++ code in an extra module. Batch: C++ program, ...
  • Simple to write an Iris Explorer module in C++, comprehensive example of the TRT testbeam straw tube alignment shown
  • How can one obtain the C++ code (or script) out of the clicking to link the modules?
Discussion
  • What do we want the scripting language to do? Probably need to distinguish journaling from scripting; we want the latter
  • Many of the problems are actually triggered by our usage of Iris Explorer
  • Should we distinguish command languages from scripting ones?
  • Other use cases to be considered: Analysis on an event-by-event basis run on N-tuple like data store; graphical representations
Decisions
  • Collect more scripts and use cases
  • Ask control domain to follow it up
  • Report back at next workshop
Wed 11:55 Topic H. Meinhard: DIG decisions
Background
References Slides: HTML
Summary
  • Reviews: All developers are encouraged to submit their stuff for review early, even if further development is foreseen. Reviews should be considered a helpful hand rather than policing.
  • Event domain: All counting of channels must start at 0 for all subsystems.
  • Graphics domain: Work ongoing on Atlantis, Wired, and an OpenInventor based approach
  • Muons: Need proper description of passive material in detector description database
  • Magnetic field: Map implemented, next tracking in B field
  • Reconstruction: Track classes being redesigned
  • Compilers etc.: April release with HP-UX 10.20 with aCC 1.12 and WNT 4.0/x86 with VC++ 5.0/SP3; AIX and DUX later; GNU cc is being observed
  • Differences in compiler capabilities and behaviour should be expressed as such and figured out by autoconf (see the sketch after this list)
  • gcc and aCC versions of CLHEP will be prepared by Atlas and immediately fed back into the central repository
  • Naming convention proposal will be sent out
  • StP: The current release (2.4.2) is the production one on PCs, although it may take a while until it becomes available on HP
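A small sketch of the autoconf point above: differences are expressed as named capabilities probed at configure time, not as tests on the compiler's identity. The macro HAVE_TYPENAME is an invented example, not one actually generated for Atlas:

    // autoconf would define HAVE_TYPENAME in a generated config.h if the
    // compiler accepts the 'typename' keyword; here we default to 'class'
    // so the sketch stays self-contained.
    #if defined(HAVE_TYPENAME)
    #  define TEMPLATE_PARM typename
    #else
    #  define TEMPLATE_PARM class
    #endif

    template <TEMPLATE_PARM T>
    T twice(T x) { return x + x; }

    int main() { return twice(0); }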
Thu 09:00 Topic J. Pater: ID software
Background
References Slides: HTML, gzipped PS
Summary
  • Simulation virtually unchanged since 97_3
  • Services included in simulation
  • Reminder of geometry
  • Pixel simulation includes Lorentz angle effect, diffusion, threshold fluctuation, dead channels, electronics noise, controllable via data cards
  • Changes since 97_3: Changed sign of tilt angle (minimising cluster width), services, slight material increases. Test version available, see note in http://marpix1.in2p3.fr/Pixel/dice/intro_dice.html
  • SCT simulation includes Lorentz angle effect, diffusion, electronics crosstalk, random dead channels, noise, digital readout
  • Changes since 97_3: Inner ring from GaAs to Si, material to follow TDR, services
  • TRT: Each straw is a Geant tube, detailed description of material
  • More realistic TRT geometry is available, but not integrated yet with reconstruction. No sizeable impact on performance
  • TRT digitisation: rather detailed
  • Test version, still private: Changes in geometry, digitisation model, bug fixes, occupancy calculations improvements
  • Service changes prepared
  • Future changes: update material, include squirrel cage (conflicting designs, need further study)
  • Pattern recognition: iPatRec, xKalman, PixlRec, Astra
  • iPatRec: starting from space points in SCT and pixels, road with beam spot, TRT hits added later
  • Released version unchanged since 97_3; 40000 lines of Fortran, procedural C++, and OO C++; fairly slow; works in the Slug framework
  • September 97: Geometry and decoder converted to OO C++, which led to a significant speed increase; included a data base of dead modules and readout chips; per-module misalignment capability
  • November 97: Track extension to TRT replaced by OO C++, reproduced TDR results well
  • Since then, track finder rewritten, now down to 15 s per 2-jet event with pileup. Can handle realistic B field. Version being tested, needs cleanup
  • Track fitter being extended to be useful to Muons, too
  • Plan to finish OO conversion by summer, integrate with other pattern recognition packages, reviews, documentation, make it work in Arve
  • xKalman: start with histogramming in TRT, local track search in precision layers
  • Heavily used for ID TDR, current release entirely Fortran, works in Slug and Atlsim, user guide and algorithm description exist
  • Being converted to OO C++; track reconstruction converted already, starting on the road definition code
  • Future plans: Finish OO conversion, put into framework, test and reproduce TDR results, reviews, documentation, ...
  • Pixlrec: Start in pixel layer, add points in successive precision layer, add TRT information as in xKalman
  • First release in September 96, exploits fine granularity of pixel detectors, but very slow, used to understand ID performance
  • Entirely Fortran for now, user guide and algorithm description available
  • Astra: OO track finder using histogramming in precision layers to find space points belonging to tracks; groups these space points to reduce combinatorial background, hence fast (see the sketch after this list)
  • Depends on vertex knowledge, which however can be determined with a filter
  • Started conventionally in 1995 with similar goals as Pixlrec
  • Only a track finder, no detector constants, no track fitting
  • Being developed in Atlantis, one event at a time
  • Future: Higher statistics test, run in Slug, tune parameters, reviews, use TRT information
  • Work going on toward modular pattern recognition code: common code (with priorities): Interfaces to data bases and magnetic field; clustering; track fitting; output (track class structure)
  • Other ID code: Conversion reconstruction, standard cuts to select good tracks, additional smearing of high-pT tracks from the solenoidal field, use of the calorimeter position as an extra hit, production of analytic curves of muon track resolution
  • Summary: Released software very stable, many developments going on
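A toy illustration of the histogramming idea behind Astra (and xKalman's TRT step): project hits into a histogram of a track parameter and take populated bins as candidates. Variables, binning, and threshold are all invented; the real algorithms are far more elaborate:

    #include <cmath>
    #include <cstdio>

    int main() {
        const double pi = 3.14159265358979;
        const int nbins = 100;
        int histo[nbins] = {0};

        // Invented space points (x, y) from three precision layers:
        double x[6] = {1.0, 2.0, 3.0, -1.0, -2.0, -3.0};
        double y[6] = {1.0, 2.1, 2.9,  1.0,  2.0,  3.1};

        // Fill a histogram in the azimuthal angle phi of each space point.
        for (int i = 0; i < 6; ++i) {
            double phi = std::atan2(y[i], x[i]);        // range -pi..pi
            int bin = int((phi + pi) / (2 * pi) * nbins);
            if (bin >= 0 && bin < nbins) ++histo[bin];
        }

        // Bins above threshold group space points into track candidates,
        // strongly reducing the combinatorial background.
        for (int b = 0; b < nbins; ++b)
            if (histo[b] >= 3)
                std::printf("track candidate near phi bin %d\n", b);
        return 0;
    }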
Discussion
  • G4 will implement transition radiation facilities; the standalone program should then be dropped
  • Coordinate errors: related to clustering, being studied
  • Is the slow consolidation a problem for ID? Atlsim is getting harder and harder to use. Possible recommendation: Use Arve as the main development framework
Thu 10:05 Topic M. Stavrianakou: LHC++ status and evaluation
Background
References Slides: HTML, gzipped PS
Summary
  • Goals: mid 98: consistent release of all components
  • mid 99: Full production release
  • Old releases will disappear soon
  • New versions of commercial components expected by March; HEP adaptations need to be done
  • Objy 5: Sun and NT in 2 weeks time, other platforms later
  • Licensing: Purchases through Cocotime for the whole CERN programme; outside labs will have to sign an agreement
  • Mathlib: Some functions currently in section "C" of CERNLIB are not yet available
  • CLHEP 1.2: Ported to WNT, no gcc support by LHC++ team
  • CLHEP future: could be extended, changes to be expected with STL introduction
  • Gemini: Unified C++ interface to Minuit, NagC, ...
  • HepExplorer: Migrate to Objy 5, STL. HistOOgram frozen for now, ported to NT, DEC, AIX; work on design of new version using templates etc.
  • Future releases of HepExplorer, HepInventor: What do we want to be implemented for the June release? People want contour and Lego 2D plot, plot annotations, simple picking
  • Iris Explorer: Scripting language survey
  • Geant4, Pythia7
  • Evaluation of LHC++ functionality: Testbeam straw tube alignment of TST (see Wednesday's discussion on scripting languages)
  • Improvements needed: Hbook2Objy (histogram names), histogram operations, batch fitting methods, histogram factories, improved graphics, PostScript output
Discussion
  • Licensing: Only for CERN experiments
  • Linux: Must be a real request by the collaborations. Vendors of commercial components can probably do it by the end of this year. Need to agree on kernel version and/or distribution, compiler, ...
  • Iris Explorer: Still worries whether it's the correct way to go. Evaluation will carry on
Decision Further discussion of Iris Explorer will be done in the following way:
  • A mailing list will be set up, initially comprising all Atlas participants in the Explorer training. The mailing list will be open; all Atlas members who have experience with Explorer are invited to subscribe
  • The examples given in the course will be made available on the Atlas WGS
  • A questionnaire will be sent to the mailing list, asking for people's experience with Explorer
  • A discussion will be scheduled for the next workshop in May.
Thu 11:05 Topic S. Giani: Geant4 status
Background
References Slides: HTML, gzipped PS
Summary
  • 100 collaborators, 40 institutes, 10 experiments
  • History: R&D project for four years (until end 98)
  • Alpha01 (spring 97): STEP-compliant geometry, HEPEVT input, EM showers, hadronic showers, calorimeter and tracker hits, ODBMS persistency, visualisation, 9 user classes
  • Milestones by end 97: Fast MC and shower parametrisations, persistency of full events, multiple visualisation and user interface drivers
  • Milestones for 1998: Mid 98: open beta release, including documentation, examples and tutorials, persistency of geometry, extension of physics processes, performance improvements. End 98: First production version
  • Current status: Approved documentation structure, persistency class category, muons up to PeV, transition radiation, geometry performance optimisation
  • Current developments: examples; geometry: booleans etc; physics: em processes from 1 eV to 10 TeV; tracking optimisation
  • Future: put a new management structure in place for the post-R&D period
Discussion
  • Support: favoured model is decentralised, based on MoU
  • Tuning with physics data has been done
  • Radiation background in LHC caverns is simulated well (was actually a milestone)
  • Physics processes very complete, nevertheless easy to add more if necessary
  • No particular preference for first 'production user'
  • No user support for using beta version in production mode
Thu 11:40 Topic M. Stavrianakou: MC consolidation
Background
References Slides: HTML, gzipped PS
Summary
  • Suggestion to downgrade the original aims; complete reverse engineering of Atlsim proved very difficult
  • Suggestion is now to implement a subset of interactive facilities into Slug/Dice
  • More specifically: Interactive event generation and file I/O, interactive event reconstruction, ...
  • Will use only the most essential parts of Atlsim; migrate from FFREAD to KUIP
  • Deadline: April 10, 1998; implement pile-up and geometry versioning later, consider alternative solutions if project does not converge by April
Discussion
  • Need a 'frozen' version of Atlsim anyway whatever the outcome of the consolidation
Fri 09:00 Topic J. Knobloch: Software releases and production
Background
References N/A
Summary
  • Problems encountered with the software: incomplete feedback from physics software, late delivery in spite of deadlines, poor testing, lack of personpower for production
  • Solutions: Stick to deadlines, system to contribute to testing, look for people for production
  • Solutions already advocated to Atlas Plenary in November 1994
  • Progress: Testing team has been set up with people from Lar, ID, Muon, and Tile
  • Production runs only if the physics need is confirmed, production and bookkeeping person(s) are known, production resources are available, and the production is followed up
  • Many means in place already: Web pages
Discussion
  • Software releases, test area, SRT: SRT may allow the test area to be abandoned; this is actually being proposed
  • Main stumbling blocks for the cvs/SRT migration: Dice, Atrecon; may require another CMZ release; conceivable to have everything in by the May workshop
Decisions
  • cvs/SRT is the recommended work tool now
  • Test area will be abandoned
  • Prepare proposal for how to use cvs/SRT, send it around, decision with developers on how to use cvs/SRT during next workshop?
Fri 09:30 Topic RD Schaffer: Summary of data base / WWCG meeting
Background
References Slides: HTML, gzipped PS
Summary
  • Objy V5: mid March ... mid April
  • Atlas data base server should be available soon
  • Finished review of detector description requirements
  • Milestones: G3 digits by 01/98, first version of detector description database by 06/98, access to parts of G4 events by 12/98
  • Access to digits: ready for ID. Added to model: Identifiers, detector hierarchy, ...
  • Identifiers: Proposal by C. Arnault (see the sketch after this list)
  • Clustering strategies for overlapping subsets (M. Schaller)
  • CDF Objy performance (S. Rolli): Found that VArrays of floats perform much better than mapping YBOS banks onto objects in Objy
  • DOE Grand Challenge (D. Malon): optimising storage and access using Objy/HPSS; particularly interesting: Query estimator, Storage manager, Order optimised iterator
  • Computing model analysis results (L. Perini): Where to put data, CPU. Start from CTP scenarios, with more variations. Further understanding needed
  • PAP for LCB project for data management (K. Sliwa): Well received, but need to define project more precisely. Relationship with RD45? Submission of PAP to LCB by June
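C. Arnault's proposal itself is not reproduced in these minutes; the sketch below only shows the general kind of packed identifier such schemes are usually built on, with invented field widths (note that channel counting starts at 0, per the DIG decision above):

    // Generic sketch of a packed detector identifier; the field layout
    // (8/8/16 bits) is invented, not C. Arnault's actual proposal.
    class Identifier {
    public:
        Identifier(unsigned det, unsigned layer, unsigned channel)
            : m_id((det << 24) | (layer << 16) | channel) {}
        unsigned detector() const { return (m_id >> 24) & 0xff;   }
        unsigned layer()    const { return (m_id >> 16) & 0xff;   }
        unsigned channel()  const { return  m_id        & 0xffff; } // counting starts at 0
    private:
        unsigned m_id;
    };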
Discussion
  • Can we meet the milestone of 1 TB in Objy by end 1998?
  • Prepare discussion during May workshop on how we go ahead with this milestone
Fri 09:55 Topic G. Poulard: Summary of reconstruction meeting
Background
References Slides: HTML
Summary
  • Subjects: Track classes, Atrecon, Move to OO
  • Atrecon: interface to AMDB, fix to calorimeter, updates of xKalman and Pixlrec, structures moved to +KEEP sequences, Muonbox included
  • Problems: compiler on the Atlas WGS with optimisation; crash when reconstructing muons of 100 GeV pT. The cvs migration is awaiting stabilisation of the code; what should the package structure look like?
  • Combined reconstruction not yet discussed; people want common N-tuple formats for the various pattern recognition packages
  • There are still two fairly disjoint communities for the Fortran and OO based developments
  • Track classes: Thinking about a common structure for ID and Muons, common to the various pattern recognition packages; looked at BaBar solutions. Difficult to agree because people are talking about two things: tracks as working storage during reconstruction, and tracks as objects finally written out. Agreement on a preliminary design of the latter; more input is needed from users on their requirements (see the sketch after this list)
  • A. Poppleton's proposal will be implemented in next version of iPatRec
  • OO: Common code: Interface to DB, magnetic field, clustering (need to identify somebody to work on it), utilities (idem), track fitting
  • Integration into Arve: Which Arve version? All integration work has been done with the previous version
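A bare-bones sketch of an 'output' track class of the kind under discussion, i.e. the object finally written out rather than the working storage; the perigee-style parameter set and all names are assumptions, not the agreed design:

    #include <vector>

    // Assumed parameter set (perigee-style); not the agreed Atlas design.
    class Track {
    public:
        Track(double d0, double z0, double phi, double cot_theta, double q_over_pt)
            : m_d0(d0), m_z0(z0), m_phi(phi),
              m_cot_theta(cot_theta), m_q_over_pt(q_over_pt) {}
        void add_hit(unsigned id) { m_hits.push_back(id); } // identifier of a hit used
        double q_over_pt() const  { return m_q_over_pt; }
    private:
        double m_d0, m_z0, m_phi, m_cot_theta, m_q_over_pt;
        std::vector<unsigned> m_hits; // hit content, usable by both ID and Muons
    };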
Discussion
  • Need to become very clear soon on what to do for the Arve release
Fri 10:35 Topic J. Knobloch: Barcelona workshop agenda
Background
References N/A
Summary
  • Architecture, components and re-use, integration technology, domain decomposition, migration strategy (software process, environments for software development), planning, training, conclusions etc.
  • Some discussion about additional talks proposed by Alice
  • Discussion with experts on March 5th, interested parties invited to attend
Fri 10:50 Topic Discussion on future software weeks
Background
References N/A
Discussion
  • Low attendance, few people from the conventional Fortran software. Propose to concentrate presentations into half a week and have working meetings in the other half; the week as it is now is too diluted for many people
  • Not everybody agrees with this proposal
  • A clash with system weeks may have kept attendance low, as may skiing holidays
  • Publicity could be improved by sending dedicated mails to mailing lists
  • Bad choice to have workshops adjacent to Atlas weeks
  • Good thing to have flexibility in the program
  • Ask mailing list for feedback on what people want changed
  • Should we have a dedicated meeting during the Atlas week?
  • Need to start planning the workshop earlier
  • Collection of background information is considered useful
  • Computing coordinator reports: Appreciated, should continue
Fri 11:10 Topic H. Meinhard / J. Knobloch: Summary of decisions, planning of next cycles
Background
References Slides: HTML
Summary N/A
Discussion
  • Where can we be in April/May?
  • Event and DB: all digits available from Zebra; review of design; revised user requirements, detector description by next cycle
  • Control: documentation for Lassi's component model, case example; integration of RD's event access as component, some graphics components as well if ready (depend in turn on event input). Network editor: perhaps design.
  • Graphics: Updated design, code prepared for review, interface to Atlantis. Object browser?
  • Muons: Integrate with events from Zebra for barrel, track fit in some magnetic field, reconstruction of muon spectrometers started
  • Reconstruction: Preliminary version of track class, implement Fortran developments in context of combined performance TDR
  • Analysis: More use cases; call LHC++ components from Arve
  • Tools: CodeCheck integration in SRT, start evaluating case tools, cvs/SRT migration completed
  • Framework milestone: Will be organised, weekly reviews
  • Geometry domain kick-off?


Jürgen Knobloch, Lassi A. Tuura, Helge Meinhard / February 1998
Last update: $Id: minutes.html,v 1.8 1998/03/14 21:02:50 helge Exp $