Chart of Hypernuclides ‒ General Information
The database for hypernuclei
This website provides access to an SQL database in which measurements of nuclear properties of
hypernuclei such as hyperon binding energies, lifetimes, or excitation energies are collected.
A systematic approach is followed in the treatment of the data set.
In some previous data sets, systematic uncertainties of measurements, the limited
validity of averaging procedures, or biases in the selection of data may have been
underestimated. In a few cases, inconsistent or irreconcilably discrepant measurements exist.
Please cite this website when using its generated plots or computed values. The concept of this
database, its statistical procedures, and first implementations have been discussed in P. Achenbach,
AIP Conf. Proc. 2130 (2019).
Some more details of the systematic treatment have been published in
P. Eckert, P. Achenbach et al.,
Rev. Mex. Fis. Suppl. 3, 0308069 (2022).
Shared and recommended systematic errors
Following the Particle Data Group, e.g.
Prog. Theor. Exp. Phys. 2020, 083C01 (2020)
or other editions, when averaging measurements that share systematic errors, a modified systematic error is computed that has the
advantage that each measurement may be treated as independent when averaged with other data.
As suggested by D.H. Davis in
Nucl. Phys. A 547, 369c (1992), a systematic error of 40 keV should be assumed
for any emulsion measurement performed before 1990 that was published without a systematic error, in order to
avoid an underestimation of errors. This recommendation is applied when the corresponding box is checked.
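As an illustration, the 40 keV default and the quadrature combination of statistical and systematic errors can be sketched in Python (the function name and interface are illustrative, not the site's implementation):

```python
import math

def total_uncertainty(stat, syst, year, is_emulsion, davis_default=True):
    """Combine statistical and systematic errors in quadrature (MeV).

    If an emulsion measurement from before 1990 reports no systematic
    error, a default of 40 keV (0.040 MeV) is assumed, following
    D.H. Davis, Nucl. Phys. A 547, 369c (1992).
    """
    if davis_default and syst is None and is_emulsion and year < 1990:
        syst = 0.040  # 40 keV default systematic error
    syst = syst or 0.0
    return math.sqrt(stat**2 + syst**2)
```

For example, a pre-1990 emulsion value with a 30 keV statistical error and no published systematic error receives a 50 keV total uncertainty.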
Procedures of averaging
The standard averaging procedure used in this database is the variance weighted arithmetic mean.
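The standard procedure can be sketched as follows (an illustrative helper, not the site's code):

```python
import math

def weighted_average(values, errors):
    """Variance-weighted arithmetic mean with weights w_i = 1/sigma_i^2.

    Returns (mean, uncertainty of the mean).
    """
    weights = [1.0 / e**2 for e in errors]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    # uncertainty of the weighted mean: 1/sqrt(sum of weights)
    return mean, math.sqrt(1.0 / wsum)
```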
Measurements with asymmetric uncertainties are combined using the maximum likelihood method described in
R. Barlow, "Asymmetric statistical errors" (2004),
arXiv:physics/0406120 and first applied to hypernuclear data in C. Rappold et al.
Phys. Lett. B 728, 543 (2014). We use Barlow's linear σ model for the probability density function
as it was found to be the most robust model for the hypernuclear use cases. Measurements with a strongly asymmetric uncertainty are
treated with a modified version of the σ function that prevents non-converging probability density
functions and numerical problems.
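A minimal sketch of Barlow's linear-σ model for a single measurement with asymmetric errors −σ⁻/+σ⁺ is given below; the parametrization σ(u) = σ + σ′u with σ = 2σ⁺σ⁻/(σ⁺+σ⁻) and σ′ = (σ⁺−σ⁻)/(σ⁺+σ⁻) follows Barlow's paper, while the modified function used here for strongly asymmetric errors is not reproduced:

```python
def barlow_nll(mu, x, sig_minus, sig_plus):
    """-ln L for one measurement x with errors -sig_minus/+sig_plus,
    using Barlow's linear-sigma model (arXiv:physics/0406120):
        sigma(u) = sigma + sigma' * u,  u = mu - x.
    The combined average minimizes the sum of these terms over all
    measurements (minimizer not shown).
    """
    u = mu - x
    sig = 2.0 * sig_plus * sig_minus / (sig_plus + sig_minus)
    sigp = (sig_plus - sig_minus) / (sig_plus + sig_minus)
    s = sig + sigp * u
    return 0.5 * (u / s)**2
```

For symmetric errors (σ⁺ = σ⁻) this reduces to the usual Gaussian −ln L = u²/2σ².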
Treatment of errors and scaling of the overall uncertainty
Errors on a value
x that appear in the form
x ± δx are treated as if they were Gaussian.
Therefore, when such an error is indeed Gaussian,
δx is one standard deviation (σ),
for which the range
x ± δx corresponds to a 68.3% confidence interval about the central value
x. Statistical and systematic errors are considered separately and used for constructing
a total uncertainty for
x by adding all contributions in quadrature.
If errors on a value are dominated by systematic effects that tend to be non-Gaussian, an unconstrained average
value including that uncertainty could be severely affected.
Following the Particle Data Group, Review of Particle Physics, e.g.
Prog. Theor. Exp. Phys. 2020, 083C01 (2020)
or other editions, a scaling factor
S = (χ² / ndf)^(1/2)
is calculated when the χ² sum is larger than the number of degrees of freedom (ndf = number of contributing
measurements – 1). For 1 < S < 2 the data with their total uncertainties are considered mildly discrepant
but still consistent, and the overall uncertainty of the average is scaled by
S. According to
B. Taylor, "Numerical Comparisons of Several Algorithms for Treating Inconsistent Data in a Least-Squares
Adjustment of the Fundamental Constants", U.S. National Bureau of Standards (1982),
NBSIR 81-2426, this procedure is also known as the Birge ratio
algorithm and should be used when handling discrepant data. For
S > 2, we do not recommend an average value, as we
assume that unknown uncertainties prevent its reliable determination. The scaling procedure for the overall uncertainties does not affect
the averages.
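The scale-factor computation can be sketched as follows (illustrative; note that the average itself is left unchanged and only its uncertainty is scaled):

```python
import math

def scaled_average(values, errors):
    """Variance-weighted mean with PDG/Birge-ratio scale factor
    S = sqrt(chi2/ndf), applied to the uncertainty only when S > 1.

    Returns (mean, scaled uncertainty, S).  For S > 2 no average
    should be recommended (decision left to the caller).
    """
    w = [1.0 / e**2 for e in errors]
    mean = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    chi2 = sum(((v - mean) / e)**2 for v, e in zip(values, errors))
    ndf = len(values) - 1
    S = math.sqrt(chi2 / ndf) if ndf > 0 else 1.0
    err = math.sqrt(1.0 / sum(w))
    return mean, err * max(S, 1.0), S
```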
Exclusion of measurements
Only if
S > 1 are measurements that contribute to an average with a weight of less than 1.5% excluded from the averaging
procedure. The same holds for measurements with a weight of up to 5% if their χ² value is unreasonably high.
Excluded measurements appear gray in the data table. Any exclusion of measurements can be controlled manually.
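A sketch of this exclusion rule follows; the χ² cut for "unreasonably high" is an illustrative assumption, not the site's actual threshold:

```python
def exclusion_candidates(errors, S, chi2_terms,
                         low_weight=0.015, mid_weight=0.05, chi2_cut=4.0):
    """Indices of measurements to drop from the average, applied only
    when S > 1: relative weight below 1.5%, or up to 5% combined with
    an individual chi^2 term above chi2_cut (assumed value).
    """
    if S <= 1.0:
        return []
    w = [1.0 / e**2 for e in errors]
    wsum = sum(w)
    out = []
    for i, (wi, c2) in enumerate(zip(w, chi2_terms)):
        rel = wi / wsum  # relative weight of measurement i
        if rel < low_weight or (rel < mid_weight and c2 > chi2_cut):
            out.append(i)
    return out
```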
Sorting of measurements
Measurements included in the averaging procedure are listed in order of the year of their publication.
Excluded measurements follow below in order of the year of their publication.
Ideograms
The overall experimental situation is depicted in the form of an ideogram showing the combined probability
density function of a measured property compared to the evaluated average and its overall uncertainty.
Each measurement contributes to the ideogram by a
continuous probability density curve with a central value equal to the measured value, a width equal to the total
uncertainty σ, and an area proportional to 1/σ. With this choice for the area, the center
of gravity of the ideogram corresponds to an average of the measurements that would use weights equal to 1/σ
rather than the (1/σ)² used in the arithmetic mean. This approach assumes that large systematic errors are
not as infrequent as large statistical fluctuations. The 1986 edition of the Particle Data Group,
Phys. Lett. 170B, 1 (1986) has a detailed discussion on the use of ideograms.
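The construction described above can be sketched as a sum of Gaussians, each scaled so that its area is proportional to 1/σ (illustrative code, not the site's plotting routine):

```python
import math

def ideogram(x_grid, values, errors):
    """Evaluate the ideogram on x_grid: one Gaussian per measurement,
    centred on the measured value, width = total uncertainty sigma,
    and area proportional to 1/sigma.
    """
    out = []
    for x in x_grid:
        y = 0.0
        for v, s in zip(values, errors):
            # unit-area Gaussian scaled by 1/s -> area proportional to 1/s
            y += (1.0 / s) * math.exp(-0.5 * ((x - v) / s)**2) \
                 / (s * math.sqrt(2.0 * math.pi))
        out.append(y)
    return out
```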
Used particle masses
(all values in MeV/c²):

| p  | 938.272088 |
| n  | 939.565420 |
| π0 | 134.9766 |
| π± | 139.5702 |
| Λ  | 1115.683 |
| Σ- | 1197.449 |
| Ξ- | 1321.71 |
About this site
Devised and maintained by Patrick Achenbach.
Contact and support via
e-mail.
This project was supported by the European Union's Horizon 2020 research and innovation programme under grant agreement No. 824093.