I have found strange behavior of TTreeFormula in ROOT 2.00/09 under IBM/AIX and
DEC-Alpha/OSF1.
If I read a TTree object from a ROOT file and feed it into a TTreeFormula, its
EvalInstance() function works as expected. But if I create the TTree object
myself in memory, the strange behavior appears.
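For reference, the case that works looks roughly like this (the file name and
the tree name are just placeholders for my real ones):

root [0] TFile *f = new TFile("example.root");
root [1] TTree *t1 = (TTree*)f->Get("t1");
root [2] t1->GetEntry(0);
root [3] TTreeFormula *f_cut1 = new TTreeFormula("cut1","x>1998",t1);
root [4] .p f_cut1->EvalInstance(0);
// prints the expected 0 or 1 here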
For the failing case, I enter the following sequence of commands under interactive ROOT:
root [0] TTree *tree = new TTree("t1","an example");
root [1] int x,y;
root [2] tree->Branch("x",&x,"x");
root [3] tree->Branch("y",&y,"y");
root [4] x=19981; y=10;
root [5] char *cut1 = "x>1998";
root [6] TTreeFormula *f_cut1=new TTreeFormula("cut1",cut1,t1);
root [7] .p f_cut1->EvalInstance(0);
(Double_t)0.000000000000e+00 // under IBM/AIX
*** Break *** floating point exception // under Alpha/OSF1
If I change x and y to type float, then everything works (the float variant is
appended below for reference). So, what is the difference between a TTree
object created by myself and one read from a file? How can the type of the
variables make TTreeFormula::EvalInstance() behave differently?
Any hints? Thanks.
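For completeness, the float variant I tried is essentially the same sequence
with only the declarations changed (a sketch, passing the tree pointer directly):

root [0] TTree *tree = new TTree("t1","an example");
root [1] float x,y;
root [2] tree->Branch("x",&x,"x");
root [3] tree->Branch("y",&y,"y");
root [4] x=19981; y=10;
root [5] TTreeFormula *f_cut1 = new TTreeFormula("cut1","x>1998",tree);
root [6] .p f_cut1->EvalInstance(0);
// here it evaluates without problems and returns the expected result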
--Shuwei