What is HPSS?

HPSS (High Performance Storage System) allows you to store and retrieve large files in a relatively simple way. Instead of writing your files to tape yourself, you store them in your HPSS home directory, without having to worry about tape bookkeeping.
To the user it looks like a file system: you can create your own directories and store your files in them. With RFIO commands like "rfcp", "rfmkdir", "rfrm" etc. you can easily manage your files.
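For example, a minimal session could look like this (a sketch only: the directory and file names are placeholders, and rfdir is assumed to be available alongside the commands listed above):

      # create a subdirectory in your HPSS home directory
      rfmkdir $HPSS_HOME/dst

      # copy a local file into it
      rfcp ww98.1.dst $HPSS_HOME/dst/ww98.1.dst

      # list the directory contents
      rfdir $HPSS_HOME/dst

      # remove a file you no longer need
      rfrm $HPSS_HOME/dst/oldfile.dst
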
Storing large files in HPSS has some obvious advantages over the other storage possibilities you have:
  • No "garbage collecting" runs on HPSS. Little used files are moved from disk to tape, but this is transparant to the end user.
  • You can access other people's files. This can be very useful for group Ntuples and privately produced Monte Carlo sets. "pubarch" does not have this possibility.
  • No manual intervention is needed to stage in the files. Private user tapes normally are manually mounted, which can cause long delays outside office hours.
    More information can be found here.

    How can DELPHI users benefit from HPSS?

    Currently, there are already many ways to store, archive and/or back up DELPHI user files. HPSS is intended for large files (> 100 MB); DSTs and large Ntuples in particular are good candidates for storage in HPSS.
    To access your HPSS files, you have to stage them in, with a command like
      delstage -M $HPSS_HOME/myfile.dst
    
    As with ordinary tape staging, the contents of your file are copied onto the disk server, and a symbolic link to the staged copy is created in your current directory.
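
    For example, staging in the DST that is also used in the example below (a sketch only; where the staged copy lives on the disk pool depends on the stager setup):

      # stage in the file: its contents are copied to the disk server
      # and a symbolic link appears in the current directory
      delstage -M $HPSS_HOME/dst/ww98.1.dst

      # the link points to the staged copy on the disk pool
      ls -l ww98.1.dst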

    Practical points


    Example

    This part of a user script:
    # create PDLINPUT
    cat > PDLINPUT << EOF
    FILE=${HPSS_HOME}/dst/ww98.1.dst
    EOF
    
    # run the executable
    ./myprogram.exe
    
    results in:
     PHDST 3.13/01               IHEP/Protvino team              
     Compiled 990712.2145  Today is 990714.1419
    
      "FILE=hpsssrv1:/hpss/cern.ch/user/v/vaneldik/dst/ww98.1.dst"
    
     PHDST. "delstage -M hpsssrv1:/hpss/cern.ch/u" -
    
     PHDST. "ser/v/vaneldik/dst/ww98.1.dst"
     Wed Jul 14 14:19:40 MET DST 1999 delstage -M hpsssrv1:/hpss/cern.ch/user/v/vaneldik/dst/ww98.1.dst
     Wed Jul 14 14:19:40 MET DST 1999: DELSTAGE - Calling stagein to stage the tape with WAIT option
     DELSTAGE: Files staged, return code = 0
    
    ww98.1.dst
     PHDST. Open file hpsssrv1:/hpss/cern.ch/user/v/vaneldik/dst/ww98.1.dst
    etc...
    
    

    Group files

    If you want to give write permission for your HPSS files to other DELPHI users, you should create an HPSS subdirectory with the -m option:
         rfmkdir -m 775 $HPSS_HOME/groupfiles
    
    All DELPHI users will be able to create (and delete!) files in this directory.
    For more details: man rfmkdir.
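
    For instance, a possible workflow (directory and file names are examples only; rfdir is assumed to be available in addition to the commands mentioned on this page):

         # create a group-writable subdirectory
         rfmkdir -m 775 $HPSS_HOME/groupfiles

         # copy a file into it (other DELPHI users would refer to it
         # by its full HPSS path instead of $HPSS_HOME)
         rfcp ww98.ntuple $HPSS_HOME/groupfiles/ww98.ntuple

         # list the directory contents
         rfdir $HPSS_HOME/groupfiles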

    Removing staged files

    When you update a file in your HPSS home directory, you have to make sure that you remove the old copy from the disk server! If the old version of your file is staged in, it will not be replaced automatically by the new version! To avoid this confusion, just issue:
         delstage -C -M $HPSS_HOME/myfile.paw
    
    whenever you update your file $HPSS_HOME/myfile.paw.
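
    Putting the two steps together, a typical update could look like this (a sketch; it assumes the new version simply overwrites the old HPSS file with rfcp):

         # copy the new version of the file into HPSS ...
         rfcp myfile.paw $HPSS_HOME/myfile.paw

         # ... and clear the old staged copy from the disk server
         delstage -C -M $HPSS_HOME/myfile.paw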

    Copying HPSS files to/from non-CERN machines

    Currently there is no mechanism to easily transfer HPSS files to/from machines outside CERN. Direct ftp or rfcp is not possible.

    We suggest the following scheme, using the large /tmp disk on dxplus or lxplus (note, however, that the /tmp area on hpplus and rsplus is very small!). The steps are sketched as a command sequence after the list:

  • rlogin dxplus
  • cd /tmp
  • rfcp ${HPSS_HOME}/myfile /tmp/myfile
  • use ftp to transfer the file back home
  • rm /tmp/myfile
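
    As a sketch, the whole sequence once logged in on dxplus (the file name and the home machine are placeholders):

      cd /tmp

      # copy the file out of HPSS onto the local /tmp disk
      rfcp ${HPSS_HOME}/myfile /tmp/myfile

      # transfer it home with ftp (interactive session)
      ftp myhome.institute.edu

      # clean up behind you
      rm /tmp/myfile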

    WARNING!! Do not try to rfcp your HPSS file to an AFS area, such as your $SCRATCH!!
    This file transfer will most likely fail:

       dxplus05 [13] rfcp ${HPSS_HOME}/myfile ${AFS_SCRATCH}/myfile
       rfio_readlist() : I/O error
       copyfile_hpss(): -1 bytes out of 12877824 transfered
       rfcp : I/O error
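
    If you do need the file on an AFS area, a possible workaround is to go via the /tmp scheme described above (a sketch only):

       # first copy the file out of HPSS onto the local /tmp disk ...
       rfcp ${HPSS_HOME}/myfile /tmp/myfile

       # ... then copy it to AFS with an ordinary cp, and clean up
       cp /tmp/myfile ${AFS_SCRATCH}/myfile
       rm /tmp/myfile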
    

    Problems

  • Currently, HPSS is being tested. It may happen that CERN decides not to use it for LHC data storage and discontinues the project. Although the user files will not be lost, it may mean that we have to switch to another way of storing data. With $DELPHI_PAW, $CORE_SCRATCH, $AFS_SCRATCH, $SCRATCH_WEEK, pubarch, user tapes, ... we have quite enough possibilities already!
  • Avoid the use of shortcuts like ${HPSS_HOME}/../../p/pubxx when you want to stage other people's HPSS files: resolving this kind of name confuses the stager! (See the example after this list.)
  • We (DELPHI support) do not have a lot of experience with this system ourselves. The recommendations made on this page reflect our knowledge, but it may well be that users will run into problems that we have not realized. Please give us feedback!
  • IT has developed a wrapper around the RFIO commands, called hsm.
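
    For example, instead of a relative path, give the absolute HPSS path of the other user's file (the user name "pubxx" and the file name below are hypothetical; the path follows the pattern visible in the PHDST output above):

      # do NOT do this, it confuses the stager:
      #    delstage -M ${HPSS_HOME}/../../p/pubxx/dst/somefile.dst
      # spell out the absolute HPSS path instead:
      delstage -M hpsssrv1:/hpss/cern.ch/user/p/pubxx/dst/somefile.dst
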
    As always, questions, complaints, remarks:
    delphi-core@cern.ch
    Jan van Eldik, May 5, 1999