Data and Procedures
General Information
This website
provides access to a database in which measurements of nuclear
properties of hypernuclei such as hyperon binding energies,
lifetimes, or excitation energies are collected. A systematic
approach is followed in the treatment of the data set. In some
previous data sets, systematic uncertainties of measurements, a
limited validity of averaging procedures, or bias effects in the
selection of data may have been underestimated. In a few cases,
inconsistent or unresolvably discrepant measurements exist.
Reference
Please cite
hypernuclei.kph.uni-mainz.de when using generated plots or
computed values. The concept of this database, its statistical
procedures, and first implementations have been discussed in P.
Achenbach,
AIP Conf. Proc. 2130 (2019). Details
of the systematic treatment have been published in P. Eckert & P.
Achenbach et al.,
Rev. Mex. Fis. Suppl. 3, 0308069 (2022).
Storage and availability of the data
While the website reads the data from an XML file, which provides a flexible,
tag-based format well suited for exchange between systems, the data is
additionally stored in a SQL database for efficient querying.
A mirror site is being set up at CERN.
Procedures of averaging
The standard averaging procedure used in this database is the variance-weighted
arithmetic mean. Measurements with asymmetric uncertainties are
combined using the maximum likelihood method described in R. Barlow,
"Asymmetric statistical errors" (2004),
arXiv:physics/0406120
and first applied to hypernuclear data in C. Rappold et al.
Phys. Lett. B 728, 543 (2014). We use Barlow's
linear σ model as the probability density function for the hypernuclear data,
together with a solver that was checked for convergence and robustness.
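The variance-weighted mean and Barlow's linear-σ combination described above can be sketched roughly as follows. This is an illustrative Python sketch only: the function names and the simple grid-scan minimizer are our own, not the database's actual solver.

```python
import math

def weighted_mean(values, errors):
    """Variance-weighted arithmetic mean with weights 1/sigma^2."""
    w = [1.0 / e ** 2 for e in errors]
    mean = sum(wi * x for wi, x in zip(w, values)) / sum(w)
    return mean, math.sqrt(1.0 / sum(w))

def barlow_chi2(mu, data):
    """Chi-square-like sum for Barlow's linear-sigma model.

    data: list of (x, sig_plus, sig_minus) with asymmetric errors;
    the width varies linearly with the deviation u = mu - x.
    """
    total = 0.0
    for x, sp, sm in data:
        s0 = 2.0 * sp * sm / (sp + sm)        # width at u = 0
        sprime = (sp - sm) / (sp + sm)        # linear variation of the width
        sigma = s0 + sprime * (mu - x)
        total += ((mu - x) / sigma) ** 2
    return total

def combine_asymmetric(data, lo, hi, steps=10000):
    """Combine asymmetric measurements by a grid scan for the minimum.

    The scan range [lo, hi] should be chosen so that sigma stays positive.
    """
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(grid, key=lambda mu: barlow_chi2(mu, data))
```

For symmetric errors (σ⁺ = σ⁻) the linear-σ model reduces to the ordinary Gaussian case, and the scan reproduces the variance-weighted mean.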
Treatment of errors and scaling of the overall uncertainty
Errors on a value
x that appear
in the form
x ± δx were treated as if they were Gaussian.
Under this assumption,
δx is one
standard deviation (σ), and the range
x ± δx
corresponds to a 68.3% confidence interval about the central value
x.
Statistical and systematic errors were considered separately and
used to construct a total uncertainty for
x by adding all
contributions in quadrature. If errors on a value are dominated by
systematic effects that tend to be non-Gaussian, an unconstrained
average value including that uncertainty could be severely affected.
Following the Particle Data Group, Review of Particle Physics, e.g.
Prog.
Theor. Exp. Phys. 2020, 083C01 (2020) or other editions, a scaling
factor
S = √(χ²/ndf)
is calculated when the χ² sum is larger than the number of
degrees of freedom (ndf = number of contributing measurements
– 1). For 1 <
S < 2, the data with their total
uncertainties are considered mildly discrepant but still consistent, and
the overall uncertainty of the average is scaled by
S.
This procedure is also known as the Birge ratio algorithm and should be
used when handling discrepant data according to B. Taylor, "Numerical
Comparisons of Several Algorithms for Treating Inconsistent Data in
a Least-Squares Adjustment of the Fundamental Constants", U.S.
National Bureau of Standards (1982),
NBSIR 81-2426. For
S > 2, we do not
recommend an average value as we assume that unknown uncertainties
prevent its reliable determination. The scaling procedure for the
overall uncertainties does not affect the averages.
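The averaging and scaling steps above can be sketched as follows. This is an illustrative Python sketch under the stated rules; the names are ours and the actual database implementation may differ in detail.

```python
import math

def scaled_average(values, stat_errs, sys_errs):
    """Weighted average with PDG-style scale factor S = sqrt(chi2/ndf)."""
    # total uncertainty per measurement: stat and sys added in quadrature
    tot = [math.sqrt(s * s + y * y) for s, y in zip(stat_errs, sys_errs)]
    w = [1.0 / t ** 2 for t in tot]
    mean = sum(wi * x for wi, x in zip(w, values)) / sum(w)
    err = math.sqrt(1.0 / sum(w))
    chi2 = sum(((x - mean) / t) ** 2 for x, t in zip(values, tot))
    ndf = len(values) - 1
    scale = math.sqrt(chi2 / ndf) if ndf > 0 else 1.0
    if scale > 2.0:
        return mean, err, scale, False   # no average recommended
    if scale > 1.0:
        err *= scale                     # Birge-ratio scaling; the mean is unchanged
    return mean, err, scale, True
```

Note that the scaling enlarges only the overall uncertainty of the average; the central value itself is never modified.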
Exclusion of measurements
In cases where
measurements have been reanalysed or recalibrated, all values are
listed, but only the more precise ones are included in the averaging
procedure. Preliminary results from the arXiv are listed but not
included. If there is substantial reason in the literature that a
measurement is superseded, it is excluded with a reference to the
publication with the discussion. If a scaling factor of 1 <
S
< 2 indicates inconsistent measurements, those which
contribute to an average with a weight of less than 1.5% are
automatically excluded from the averaging procedure. The same holds
for measurements with up to 5% weight if their χ² value is
unreasonably high. Excluded measurements appear gray in the data
table. Any exclusion of measurements can be controlled manually.
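The automatic weight-based exclusion might be sketched like this. This is an illustrative Python sketch; the 1.5% and 5% thresholds come from the text above, while the cut on the χ² contribution is an assumed placeholder, since the text only says "unreasonably high".

```python
def auto_exclude(errors, chi2_terms, low=0.015, mid=0.05, chi2_cut=4.0):
    """Flag measurements for exclusion from the averaging procedure.

    Excludes measurements whose weight in the variance-weighted mean is
    below 1.5%, and those with up to 5% weight whose chi-square
    contribution exceeds chi2_cut (an assumed placeholder value).
    """
    w = [1.0 / e ** 2 for e in errors]
    total = sum(w)
    flags = []
    for wi, chi2 in zip(w, chi2_terms):
        frac = wi / total
        flags.append(frac < low or (frac < mid and chi2 > chi2_cut))
    return flags
```

In the real database such flags would only grey out the rows, and any exclusion remains under manual control.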
Shared and recommended systematic errors
Following the Particle Data Group, e.g.
Prog.
Theor. Exp. Phys. 2020, 083C01 (2020) or other editions, for
averaging measurements that share systematic errors, a modified
systematic error is computed that has the advantage that each
measurement may be treated as independent when averaging with
other data. Its value is indicated in brackets behind the systematic
error.
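One standard way such a treatment works, sketched here under the assumption that the shared systematic is fully correlated between the measurements (the database's modified-error bookkeeping may differ in detail), is to average the measurements using only their independent errors and restore the common error on the result:

```python
import math

def average_with_common_sys(values, indep_errs, common_sys):
    """Average measurements sharing one fully correlated systematic error.

    The shared part is removed before averaging, so the measurements can
    be treated as independent, and is added back in quadrature afterwards.
    """
    w = [1.0 / e ** 2 for e in indep_errs]
    mean = sum(wi * x for wi, x in zip(w, values)) / sum(w)
    err_indep = math.sqrt(1.0 / sum(w))
    return mean, math.sqrt(err_indep ** 2 + common_sys ** 2)
```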
As suggested by D. H. Davis in
Nucl. Phys. A 547, 369c (1992), a systematic error of
40 keV should be assumed for any emulsion measurement performed
before 1990 for which no systematic error was published, to avoid an
underestimation of errors. This recommendation is applied by
checking the box below the measurements of binding energies or masses.
Sorting of measurements
Measurements are
listed in order of the year of their publication.
Measurements not assigned to a hypernucleus
Measurements not assigned to a specific hypernucleus, e.g. lifetime
results from delayed fission of heavy hypernuclei, can be found
in the hypernuclear chart at an approximate proton and neutron number
Z and N. Such hypernuclei are labeled as "u" (for unassigned) with an approximate mass number A.
Typically, these data are for a group of different hypernuclei
with some dispersion in the (A, Z) distribution.
Probability density functions and ideograms
The overall experimental situation is
depicted in the form of an ideogram showing the combined probability
density function of a measured property compared to the evaluated
average and its overall uncertainty. Each measurement contributes to
the ideogram by a continuous probability density curve with a
central value equal to the measured value, a width equal to the total
uncertainty σ, and an area proportional to 1/σ. With
this choice for the area, the center of gravity of the ideogram
corresponds to an average of the measurements that would use weights
equal to 1/σ rather than the (1/σ)² used in the
arithmetic mean. This approach reflects the assumption that large systematic
errors are more common than large statistical fluctuations. The 1986
edition of the Particle Data Group,
Phys. Lett. 170B, 1 (1986) has a detailed
discussion on the use of ideograms.
When a part
of the probability density function lies in an unphysical region
(e.g., negative binding energy or lifetime), the whole function is
shown because only unbiased results can be combined meaningfully.
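The construction of such an ideogram can be sketched as follows. This is an illustrative Python sketch; the choice of evaluation grid is left to the caller.

```python
import math

def ideogram(values, sigmas, grid):
    """Sum of Gaussians: one per measurement, centred on the measured
    value, with width sigma and area proportional to 1/sigma."""
    curve = []
    for g in grid:
        total = 0.0
        for x, s in zip(values, sigmas):
            gauss = math.exp(-0.5 * ((g - x) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))
            total += gauss / s   # area weight 1/sigma instead of unit area
        curve.append(total)
    return curve
```

Because each Gaussian carries area 1/σ rather than unit area, the centre of gravity of the resulting curve corresponds to a 1/σ-weighted average, as described above.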
Used particle masses (all values in MeV/c²):

| p  | 938.272088 | π0 | 134.9766 | Λ  | 1115.683 |
| n  | 939.565420 | π± | 139.5702 | Σ- | 1197.449 |
| Ξ- | 1321.71    |    |          |    |          |