Piotr Krzysztof SKOWRONSKI at CERN.CH

Links

  My galleries
  PhD thesis pdf ps.gz
  WAW→GVA by car (pl)

Pages of my friends

  Seb
  Kubus
  Jarek Polok
  Michal Kwiatek
  Rafal Otto
  Adam Kisiel
Welcome,

I am an applied physicist working at CERN. I was born, grew up, and studied in Warsaw.

This web page is meant for non-professional content, although below you can find short descriptions of what I am working on at the moment, what I did before, and where and what I studied.


Contact me by e-mail:

  or send me an SMS via my gateway



Shortcuts

  Old Alice-HBT pages
    2005 - NOW

I started working for CTF3 (the CLIC Test Facility 3) in November 2005. My responsibilities cover:

  • Machine modelling: the description of the machine in a language understandable to beam-optics programs (in our case MAD-X)
  • Development of applications needed for machine operation and measurements
  • CTF3 commissioning and operation itself, in which I am deeply involved.
I have become one of the developers of the MAD-X code, which is at the core of the CTF3 online model. In particular, I have been involved in the integration of the PTC (Polymorphic Tracking Code) library into MAD-X. MAD-X is a worldwide-established standard code for particle accelerator design and simulation. It was created a few decades ago as a small, fast program in Fortran 77. Since then it has constantly grown in functionality and, partially rewritten in C along the way, it has reached a high level of complexity. It has several well-known flaws and drawbacks, for example a badly constructed input language. However, it is still the reference code, and new codes always compare their results against experiment and against MAD.

Another very important MAD-X flaw is the limitation of the order of its calculations (second order). In studies of particle accelerators it is often required to account for higher-order terms when calculating beam dynamics; they can drastically change the beam properties or lead to instabilities (resonances). That is why it was decided to employ PTC and give the user the possibility of using a more precise, but much slower, algorithm.
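To make the limitation concrete, in the standard TRANSPORT/MAD notation (generic notation, not a quote from the MAD-X manual) the classic modules propagate particle coordinates with first-order coefficients R and second-order coefficients T, and everything beyond is dropped:

$$x_i^{\mathrm{out}} = \sum_{j=1}^{6} R_{ij}\, x_j^{\mathrm{in}} + \sum_{j \le k} T_{ijk}\, x_j^{\mathrm{in}} x_k^{\mathrm{in}}, \qquad i = 1,\dots,6,$$

where (x_1, ..., x_6) are the six phase-space coordinates. PTC, by contrast, can carry the expansion to an arbitrary user-chosen order.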

PTC is a Fortran 95 library which enables the user to define the most complicated accelerator configurations (e.g. recirculators) and calculate particle parameters with theoretically unlimited precision. Theoretically, because as the requested precision grows, the computation time and the required computer memory increase factorially.
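A rough illustration of this growth, assuming the usual 6-dimensional phase space: a truncated Taylor map of order n stores, for each of its six components, one coefficient per monomial of degree up to n,

$$N(n) = \binom{n+6}{6}, \qquad N(2) = 28, \quad N(5) = 462, \quad N(10) = 8008,$$

so raising the order quickly multiplies both the memory footprint and the number of terms evaluated per element.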

I have implemented the following functionalities in MAD-X which use PTC (a minimal input sketch follows the list):

  • Ray tracking with acceleration
  • Twiss parameter calculation with acceleration
  • 3D layout visualization with ROOT
  • Parametric knobs: all Twiss parameters are given as functions (polynomials) of the knob(s)
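For flavour, here is a minimal MAD-X input sketch of how such a PTC run is typically set up (the file and sequence names are placeholders and the attribute values are purely illustrative):

  call, file="machine.madx";                  ! hypothetical lattice description
  beam, particle=electron, energy=0.2;        ! beam definition (energy in GeV)
  use, sequence=linac;

  ptc_create_universe;                        ! initialize PTC
  ptc_create_layout, model=2, method=6, nst=10, exact;
  ptc_twiss, table=ptc_twiss, icase=6, no=4;  ! Twiss from the PTC map
  ptc_start, x=1e-3, px=0;                    ! one initial particle for tracking
  ptc_track, icase=6, element_by_element;     ! ray tracking through the lattice
  ptc_track_end;
  ptc_end;

The no attribute of ptc_twiss is what selects the order of the underlying Taylor map, the very point where PTC goes beyond the second-order limit described above.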



Links

  CTF3 model CVS repo
  CTF3 in 2007 technical drawing
  MAD keywords
  My PTC commands

Results

  Response Matrix Dec08
  CTF3 tracking with PTC

Publications

  EPAC08 Proc MOPP010
  EPAC08 Proc MOPP011
  PAC07 Proc THPMN063
  PAC07 Proc THPAN070
  ICAP06 Proc WEPPP12
  ICAP06 Proc WEPPP14
 

Presentations

  CTF3 ColTech Meeting Jan 09
  CARE'08 Nov 08
  CLIC Workshop Oct 08
  CTF3 ColTech Meeting Jan 08
      1999 - 2005

My adventure with the ALICE experiment at CERN started during the 4th year of my studies at the Faculty of Applied Mathematics and Technical Physics. I joined a scientific student association called CAMAC, where students could work on small but real projects, mainly in the field of computer-aided experimentation. It was run by Dr Wiktor Peryt, who was, and still is, a member of the Nuclear Physics Division, and naturally most of the projects were connected to the experiments in which the Division was involved: ALICE at CERN and STAR at BNL. For example, we set up a test stand for the silicon strip detectors developed at our collaborating institute, Subatech in Nantes. These detectors are now used in both STAR and ALICE. We used the electron beam of the Van de Graaff accelerator at the IChTJ institute; the detector was controlled by a PC via a GPIB card.

Within the collaboration with Subatech, every summer we could go to the Ecole des Mines de Nantes, where we worked on somewhat bigger projects. For most of us these projects became, at least partly, the subjects of our MSc theses. Mine was written 100% in Nantes. I worked on the reconstruction software for the Silicon Strip Detectors. The sensors of these detectors are parallel strips. Both sides of the detector carry such strips, but one side is inclined with respect to the other by a small angle of 30 mrad (1.72 degrees), so seen from the top they form a mesh of rhombuses. If only one particle passes through the detector, the reconstruction is trivial: one takes the point where the fired strips cross. But when 50 particles go through, the problem is no longer obvious, because already two pairs of non-adjacent strips on each side create 4 crossings, and more complicated ambiguities can also occur. My thesis task was to write an algorithm that finds the largest number of (genuine) points, implement it, and check it with simulations.
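A back-of-the-envelope count (mine, not a result from the thesis) shows how quickly the ambiguity grows: N particles firing N strips on each side produce up to

$$N_{\mathrm{crossings}} = N^2, \qquad N_{\mathrm{ghosts}} = N^2 - N,$$

so for N = 50 there are up to 2500 candidate crossings, of which 2450 are ghosts; the two-particle case mentioned above is just the N = 2 instance with 4 crossings.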

After the defence I got an offer of a PhD position in the Division of Nuclear Science. My tutor was the head of the division himself, Prof. Jan Pluta. I dealt with the preparation of the ALICE experiment for particle correlation analyses, colloquially called Hanbury Brown-Twiss (HBT) interferometry due to its close similarity to that astrophysics technique. In both cases the principle is the same: measure the size of an object that emits particles via the correlations coming from quantum statistics. There are, of course, subtle differences between measuring a star's diameter with photons and measuring the size of a system created in a nuclear collision with the particles emerging from it.
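The standard Gaussian parametrization makes the principle concrete (a generic textbook form, not the specific fits used in the analyses):

$$C(q) = \frac{P(p_1,p_2)}{P(p_1)\,P(p_2)} \approx 1 + \lambda\, e^{-q^2 R^2},$$

where q is the relative momentum of the particle pair, lambda the correlation strength, and R the radius of the source: the width of the enhancement at small q is inversely proportional to the size of the emitting system.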

My work included

  • Creation of a simulation package for the correlation signal, a so-called particle generator
  • An analysis package, including the implementation of the different corrections that need to be applied
  • A study of the detector performance with massive simulations on the Grid, which included:
    1. Generation of events with the particle generator
    2. Calculation of how the created particles propagate through the detector, how they decay, and how much energy they deposit in each piece of the detector
    3. Calculation of the detector response to those particles
    4. Reconstruction of the particles from the detector signals
    5. A thorough study of how the simulated signal compares with the reconstructed one, of the resolution of the signal detection, and of the maximum source size one can reconstruct
    6. etc.
Since my work required large extensions of the ALICE offline framework, AliRoot, which in turn required close cooperation with many collaborators, in 2001 I was invited to CERN as an associate, and later I became a member of the EP/AIP group under the supervision of Federico Carminati, in the frame of the Doctoral Student Programme. Within the work on the framework I designed and implemented:
  • The so-called NewIO project: an insulation layer between user code and ROOT I/O (files and trees).

    The main trigger for this development was the way ALICE data used to be stored. All types of data were placed in a single file, in different TTrees. For large central events the file could easily exceed 2 GB, which was the limit of the Linux file system at the time. Even though this limitation of Linux was later removed, having such big files was still not very practical, especially since in most cases a user needed only a small subset of the data. On top of that, analyses were often performed on the Grid, which involved copying these large files over the network.

    I implemented an interface that manages data retrieval and storage in a way that requires minimal attention and coding from a physicist writing his or her simulation and analysis code. The main design difficulty lay in the high heterogeneity of the data: for each sub-detector there are at least 4 different types of data (hits, digits, clusters, reconstructed points, tracks, etc.), they can differ from sub-detector to sub-detector, and the optimal way of storing each of them is different. The implementation was based on utility classes called Loaders. See chapter 2, section 3 of my PhD thesis for details.

  • A foundation library for analyses.

    I was the first person in the experiment to write dedicated analysis software, and there was no underlying infrastructure in place. Hence, I needed to design and implement one.

In the middle of my studentship it turned out that, for the professors at my university, there was not enough physics content in the work I was doing. I needed to add something containing a dozen pages of integrals, differentials and other snake-like symbols. It was bizarre to me, since I was doing my PhD at a technical university, so the work was supposed to be about technology; but since the professors came from theoretical physics departments, no discussion was possible.

Together with my tutor I contacted the ALICE Physics Coordinator at the time, Guy Paic. And he likes jets. No, not the airplanes: here we mean jets of particles emerging from a collision when a quark, gluon, gamma or something else gets kicked so strongly that it has enough energy to decay into several particles. Since all the offspring come from something with a lot of momentum, they all go in more or less the same direction, and their traces form a geometrical jet-like shape. So, I was into HBT, he was into jets, and we found it interesting to see what happens to HBT signals when a lot of jets pop out, which is the case when the collision energy is large. It was an especially hot subject at the time due to the unexplained HBT results from the RHIC experiments, baptized the HBT puzzle.

First we looked into heavy-ion collisions. We started working on the subject together with Boris Tomasik, who was a fellow in the theory department at the time. We presented the results at a workshop on particle correlations in Warsaw and published a paper in Nukleonika. In the course of the work on this paper we started looking into HBT in proton-(anti)proton collisions, since we needed it as an input to our model: we assumed that the HBT signal from jets is the same as in pp. It turned out that there are only a few measurements, mostly one-dimensional, a couple two-dimensional, and that the two big experiments at the Tevatron had for some reason not published their results. All of them measured HBT radii increasing with the number of observed particles.

In the literature we found a few theoretical explanations of this effect, but let's say that none of them sounded plausible to us. In our understanding, the time needed for parton hadronization obviously depends on the number of decays it goes through, since each particle in the chain has its own proper decay time. And everything is boosted due to the high energy the particles carry: the decay time we observe in the lab frame is dilated by the factor gamma = E/E0, where E is the total energy of the particle and E0 = mc^2 is its rest energy. Of course, the region of space in which the process occurs is enlarged accordingly, because everything flies at nearly the speed of light. So the more particles there are in a jet, the more decay levels (branchings) it goes through, and the more time and space it takes. We built the model, ran the calculations, and got very nice agreement between the obtained correlation functions and the measured ones. We published the paper in Journal of Physics G (the preprint is freely available on arXiv).
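In formulas, the argument reads (standard special relativity, merely restating the text above):

$$t_{\mathrm{lab}} = \gamma\,\tau, \qquad \gamma = \frac{E}{E_0} = \frac{E}{mc^2}, \qquad l_{\mathrm{lab}} \approx \beta c\, t_{\mathrm{lab}} = \gamma\beta c\,\tau,$$

where tau is the proper lifetime of a particle in the decay chain: every additional branching adds another boosted decay length, so both the duration and the spatial extent of hadronization grow with the particle multiplicity of the jet.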

All of this is described in detail in my PhD thesis, which I defended in January 2006.


ALICE Experiment
ALICE Off-Line

Publications

  J. Phys. G 31 (2005) 1045-1054
          [arXiv:hep-ph/0504051]

  Nukleonika 49 (2004) 89
          [arXiv:nucl-th/0403007]

  arXiv:physics/0306111
  Nukleonika 49 (2004) 103

Presentations

  PWG2 2.06.05
  Alice-Week 24.06.04
  hbt.alice-week 16.06.03
  HIF 18.10.02
  its week 19.05.03
  offline week 11.06.02
  offline-week 11.06.03
  offline-week 6.03.03
  offline-week 14.09.04
  offline-week 30.06.04
  offline-week 8.03.04
  offline-week 10.03.04
  jets-hbt
  ppr-hbt 13.12.02
  video-conference 25.02.04
  wawa hbt
  Warsaw 10.05.02


        What and where I studied

After finishing primary school no. 121 in Warsaw (1991) I got into XIV LO in Warsaw (a high school with a strong mathematics profile, 1995). Already in primary school I had decided which university I wanted to attend: Technical Physics at the Faculty of Applied Mathematics and Technical Physics. Although I had very bad luck with physics teachers in both primary and high school, I did not change my mind. In the first case it was a teacher of Russian by education who, since the demand for that language dropped drastically in Poland after 1989, turned herself into a physics teacher. She was pressing formulas into our heads without understanding them herself. Later, in high school, we got a crazy guy who terrorized the pupils and taught us to memorize rules, not even formulas, by heart. The most important thing was to have nice notes. Never mind; I could write a small book about his craziness.

Nevertheless, I still liked physics, but I needed to read Resnick and Halliday regularly to understand what this guy was on about in his classes. And I needed to understand it well, since my notes were quite ugly. On the other hand, we had a great mathematics teacher, Halina Gozelnik. She was tough, even very tough, but always fair. Thanks to her, half of my university maths came for free, because most of it had already been covered in high school, at a much higher level.

I entered the Warsaw University of Technology in 1995. When I was in my 4th year, my Faculty of Applied Mathematics and Technical Physics was divided into Mathematics and Information Sciences, and Physics. The faculty was really cool. It was studying, not rote learning like in other schools: we needed to understand, not to memorize. The exam questions were about how to develop a theory or a law starting from basics, rather than about giving a precise final formula.

Thanks to that I could start a second course of university studies. I always liked biology, especially its "chemistry" part. In 1997 I was accepted into the College of Inter-Faculty Individual Studies in Mathematics and Natural Sciences. It is a cool faculty that lets its students pick any lecture from 7 different faculties, allowing "mixed specializations" like bio-informatics, bio-mathematics, psycho-chemistry, or any other combination of these 7 sciences. I followed the Molecular Biology path and courses provided by the Interdisciplinary Centre for Mathematical and Computational Modelling.

Unfortunately, the Biology Faculty at Warsaw University was not as cool as my primary faculty at WUT. It was like an army compared to a hippie kindergarten. Again formulas and definitions by heart without understanding them, 3 predefined examination sessions, each 2 weeks long, during which you had to pass all your exams. I managed to pass them all, but it was disappointing to lose points because a definition you gave had two words swapped. Arrrgh. It was rare to find an open-minded scientist in that department. Ah, I generalized too much: certainly the guys from Microbiology and Genetics were really cool.

In the course of my MSc thesis preparation at WUT, I was twice invited to come to CERN for 1 or 2 weeks. But it was completely unacceptable to skip more than one lab in Biology. So, after 2 years I had to stop the adventure with the second degree. It was a pity, but I do not regret it: thanks to this choice I have been here at CERN since 2001.

Univ Links