This page provides a TMVA plugin for the Versatile Artificial Neural Network Package JetNet, as well as some basic instructions on how to use it.
JetNet is a FORTRAN package from the University of Lund (Sweden) that provides Artificial Neural Networks as a multivariate analysis method. The version 3.5 used for the plugin dates from 16.04.1997 and was modified by Christian Weiser to enable weighted training. For more information about JetNet, see CERN-TH-7135-94.
In addition, a C++ interface for JetNet written by Giacinto Piacquadio is used as the layer on top of which the plugin is built. The existing NeuroBayes interface for TMVA served as a baseline for the plugin. More information on TMVA can be found here.
The Plugin Code
The plugin code can be downloaded from SVN. To check out the code, do
#check out from svn
svn co svn+ssh://svn.cern.ch/reps/atlas-bmoser/tmva-jetnet/trunk
Quick Start
For a quick start with the plugin, you can run it on one of CERN's lxplus machines. To do this, please log in to a clean session. You can then copy the code below to set things up.
#set up the atlas environment
setupATLAS
#setup ROOT (the plugin has been tested with ROOT 5.34 and ROOT 6.04)
localSetupROOT
#change ROOT version to 5.34.25 if needed
#localSetupROOT --rootVersion=5.34.25-x86_64-slc6-gcc48-opt
(note: up to now, the plugin has only been tested with ROOT versions 5.34/25 and 6.04/14)
#if not already done, create a target directory and check out the plugin from SVN
mkdir targetDir
cd targetDir
svn co svn+ssh://svn.cern.ch/reps/atlas-bmoser/tmva-jetnet/trunk
#create the shared libraries needed
make
(note: if one gets stuck in rootcint, just type .q to exit)
#go to the example directory
cd example
#run the example script
root -l -b runTraining.C
The training settings for TMVA are set in "training.C". As usual in TMVA, one uses the option string in "factory->BookMethod()" to set the JetNet parameters; a sketch of such a call is given below. The example script trains on a dummy tree ("sampleTree.root") with Gaussian-distributed signal and background events. If the script is run as provided, it splits the sample tree into two halves, trains on the first half, tests the performance on the second half, and then swaps the two. The outputs are therefore labelled "Output_JetNet_0of2.root" and "Output_JetNet_1of2.root".
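For orientation, a minimal booking of the JetNet method in a training macro might look like the sketch below. This is illustrative only: the tree names, variable names, output file name and option values are assumptions, and the plugin registration is assumed to be handled as in "runTraining.C", which remains the actual reference.

// sketch of booking JetNet via the TMVA factory (illustrative only;
// tree names, variable names and option values are assumptions)
#include "TFile.h"
#include "TTree.h"
#include "TMVA/Factory.h"
#include "TMVA/Types.h"

void sketchTraining() {
   TFile* input  = TFile::Open("sampleTree.root");
   TTree* sig    = (TTree*)input->Get("signal");       // assumed tree name
   TTree* bkg    = (TTree*)input->Get("background");   // assumed tree name
   TFile* output = TFile::Open("Output_JetNet_sketch.root", "RECREATE");

   TMVA::Factory factory("TMVAClassification", output, "!V:AnalysisType=Classification");
   factory.AddVariable("var1");                        // assumed variable names
   factory.AddVariable("var2");
   factory.AddSignalTree(sig);
   factory.AddBackgroundTree(bkg);
   factory.PrepareTrainingAndTestTree("", "SplitMode=Random");

   // the JetNet parameters listed in the table below go into the option string;
   // booking via kPlugins assumes the plugin handler was registered beforehand
   factory.BookMethod(TMVA::Types::kPlugins, "JetNet",
      "HiddenLayers=N+2,N+1:LearningRate=0.05:NMaxTrainingIter=100");

   factory.TrainAllMethods();
   factory.TestAllMethods();
   factory.EvaluateAllMethods();
   output->Close();
}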
To use the TMVA GUI to view the output, do the following in the "example" directory:
root -l
#if one uses ROOT 6.04
TMVA::TMVAGui("Output_JetNet_0of2.root")
#if one uses ROOT 5.34
.L gui.C
gui("Output_JetNet_0of2.root")
An example of how to read back a trained network and propagate data through it is given in "readNetwork.cxx", which can be run using
root -l -b runRead.C
This code propagates the part of the tree that was used as the test sample in the first training through the network optimised in that training.
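Reading back a trained TMVA method typically follows the TMVA::Reader pattern sketched below. The variable names and the weight-file path are assumptions (the monitoring files in "weights/" suggest the naming scheme, but "readNetwork.cxx" is the actual reference).

// sketch of reading back a trained network with TMVA::Reader
// (variable names and weight-file path are assumptions)
#include "TMVA/Reader.h"
#include <cstdio>

void sketchRead() {
   float var1 = 0.5f, var2 = -1.2f;                    // example input pattern
   TMVA::Reader reader("!Color:!Silent");
   reader.AddVariable("var1", &var1);                  // must match the training variables
   reader.AddVariable("var2", &var2);
   reader.BookMVA("JetNet",
      "weights/TMVAClassification_JetNet_0of2.weights.xml"); // assumed file name
   double response = reader.EvaluateMVA("JetNet");
   printf("network response: %f\n", response);
}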
Tested Platforms
In principle, the plugin should also work on local machines. Up to now it has been tested on Linux machines with the following setups:
- ROOT version 5.34.25
- g++ 4.8.1
- GNU Fortran 4.8.1
or
- ROOT version 6.04.14
- g++ 4.9.3
- GNU Fortran 4.9.3
Performance Monitoring
A lot of performance monitoring can be done using the "Output_JetNet_*of2.root" files created by TMVA. However, the training and test errors are not written there by TMVA; they are written out directly by the plugin and stored in "weights/TMVAClassification_JetNet_*of2.Monitoring.root".
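To take a quick look at the training and test errors, one can simply open the monitoring file in an interactive ROOT session and inspect its contents; since the names of the objects inside are not documented here, listing the file is the safest first step:

// open the monitoring file written by the plugin and list its contents
// (the object names inside are not documented here)
TFile* mon = TFile::Open("weights/TMVAClassification_JetNet_0of2.Monitoring.root");
mon->ls();   // shows the stored objects, e.g. error-vs-epoch graphs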
Parameters for JetNet Option String
This section lists the parameters that can be passed to JetNet via the option string, together with their default values and the corresponding parameters in CERN-TH-7135-94.
| Name | Explanation | Default Value | Corresponding Parameter |
| ErrorFunc | type of the error function | 0 | MSTJN(4) |
| Momentum | optional momentum for the training | 0 | PARJN(2) |
| WeightUpdate | weight update after this number of patterns | 3 | MSTJN(2) |
| NMaxTrainingIter | maximum number of training iterations if the minimum is not reached before | 100 | no corresponding parameter |
| NMaxTrainRiseErr | number of training iterations with rising error after which the training is aborted and the configuration with minimum error is chosen (if -1, training always runs up to the maximum number of iterations) | -1 | no corresponding parameter |
| UpdatingMethod | training method (default: backpropagation) | 0 | MSTJN(5) |
| LearningRate | learning rate | 0.05 | PARJN(1) |
| InitialWeightsWidth | width of the randomly set initial weights | 1.0 | PARJN(4) |
| LearningRateDecrease | additional decrease factor for the learning rate (1.0 means no decrease) | 1.0 | PARJN(11) |
| ActivationFunction | type of activation function for the neurons | 1 | MSTJN(3) |
| HiddenLayers | structure of the hidden layers (e.g. "N+2,N+1" for two hidden layers, with N the number of input variables) | N+2 | MSTJN(1) and MSTJN(10+) |
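As an illustration, an option string combining several of these parameters could look like the line below; the chosen values are arbitrary examples, not recommendations.

// example option string (arbitrary values): two hidden layers with N+2 and
// N+1 nodes, momentum 0.5, learning rate 0.01 shrinking by 1% per update
const char* jetnetOptions =
   "HiddenLayers=N+2,N+1:Momentum=0.5:LearningRate=0.01:"
   "LearningRateDecrease=0.99:NMaxTrainingIter=200:NMaxTrainRiseErr=10";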
About the Plugin
Here is a short process diagram of how the plugin actually works and of what is done by TMVA and what by JetNet.
A simplified class diagram with all important classes can be seen here.
If you need additional information that you can't find here or in the given links, feel free to ask.