From Wikipedia, the free encyclopedia
In cosmology, the cosmological constant problem or vacuum catastrophe is the substantial disagreement between the observed values of vacuum energy density (the small value of the cosmological constant) and the much larger theoretical value of zero-point energy suggested by quantum field theory.
Depending on the Planck energy cutoff and other factors, the quantum vacuum energy contribution to the effective cosmological constant is calculated to be between 50 and 120 orders of magnitude greater than observed,[1][2] a state of affairs described by physicists as "the largest discrepancy between theory and experiment in all of science"[1] and "the worst theoretical prediction in the history of physics".[3]
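The upper end of this range can be reproduced with a rough back-of-the-envelope estimate: one Planck energy per Planck volume, compared against the observed dark energy density. This is only an illustrative sketch (constants are rounded CODATA-style values, and factors of order unity are ignored):

```python
import math

# Rounded physical constants (SI units)
c = 2.998e8                # speed of light, m/s
planck_mass = 2.176e-8     # kg
planck_length = 1.616e-35  # m

# Naive vacuum energy density with a Planck-scale cutoff:
# roughly one Planck energy per Planck volume
rho_planck = planck_mass * c**2 / planck_length**3  # J/m^3

# Observed vacuum (dark) energy density, from Planck 2015 measurements
rho_obs = 5.36e-10  # J/m^3

discrepancy = math.log10(rho_planck / rho_obs)
print(f"Planck-cutoff estimate: {rho_planck:.2e} J/m^3")
print(f"Discrepancy: ~{discrepancy:.0f} orders of magnitude")
```

This reproduces the often-quoted figure of roughly 120 orders of magnitude; lower cutoffs (e.g. the electroweak or QCD scale) give the smaller discrepancies at the other end of the range.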
The basic problem of a vacuum energy producing a gravitational effect was identified as early as 1916 by Walther Nernst.[4][5][6] He predicted that the value had to be either zero or very small. In 1926, Wilhelm Lenz concluded that "If one allows waves of the shortest observed wavelengths λ ≈ 2 × 10⁻¹¹ cm, ... and if this radiation, converted to material density (u/c² ≈ 10⁶), contributed to the curvature of the observable universe – one would obtain a vacuum energy density of such a value that the radius of the observable universe would not reach even to the Moon."[7][6]
After the development of quantum field theory in the 1940s, the first to address contributions of quantum fluctuations to the cosmological constant was Yakov Zeldovich in the 1960s.[8][9] In quantum mechanics, the vacuum itself should experience quantum fluctuations. In general relativity, those quantum fluctuations constitute energy that would add to the cosmological constant. However, this calculated vacuum energy density is many orders of magnitude bigger than the observed cosmological constant.[10] Original estimates of the degree of mismatch were as high as 120 to 122 orders of magnitude;[11][12] however, modern research suggests that, when Lorentz invariance is taken into account, the degree of mismatch is closer to 60 orders of magnitude.[12][13]
With the development of inflationary cosmology in the 1980s, the problem became much more important: as cosmic inflation is driven by vacuum energy, differences in modeling vacuum energy lead to huge differences in the resulting cosmologies. Were the vacuum energy precisely zero, as was once believed, then the expansion of the universe would not accelerate as observed, according to the standard ΛCDM model.[14]
The calculated vacuum energy is a positive, rather than negative, contribution to the cosmological constant because the existing vacuum has negative quantum-mechanical pressure, while in general relativity, the gravitational effect of negative pressure is a kind of repulsion. (Pressure here is defined as the flux of quantum-mechanical momentum across a surface.) Roughly, the vacuum energy is calculated by summing over all known quantum-mechanical fields, taking into account interactions and self-interactions between the ground states, and then removing all interactions below a minimum "cutoff" wavelength to reflect that existing theories break down and may fail to be applicable around the cutoff scale. Because the energy is dependent on how fields interact within the current vacuum state, the vacuum energy contribution would have been different in the early universe; for example, the vacuum energy would have been significantly different prior to electroweak symmetry breaking during the quark epoch.[12]
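For a single massless field, the cutoff calculation described above amounts to summing the zero-point energies ½ħω of every mode up to a maximum wavenumber k_max (a standard sketch; numerical factors of order unity vary between treatments):

```latex
\rho_{\mathrm{vac}}\, c^2 \;\approx\; \int_0^{k_{\max}} \frac{4\pi k^2 \, dk}{(2\pi)^3}\, \frac{1}{2}\hbar c k \;=\; \frac{\hbar c \, k_{\max}^4}{16\pi^2}
```

Because the result scales as the fourth power of the cutoff, taking k_max at the Planck scale yields a density of order 10¹¹³ J/m³, which is the source of the enormous discrepancies quoted above.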
The vacuum energy in quantum field theory can be set to any value by renormalization. This view treats the cosmological constant as simply another fundamental physical constant not predicted or explained by theory.[15] Such a renormalization constant must be fine-tuned very precisely because of the many-orders-of-magnitude discrepancy between theory and observation, and many theorists consider this ad hoc constant equivalent to ignoring the problem.[1]
The vacuum energy density of the Universe based on 2015 measurements by the Planck collaboration is ρ_vac = 5.96×10⁻²⁷ kg/m³ ≘ 5.3566×10⁻¹⁰ J/m³ = 3.35 GeV/m³[16][note 1] or about 2.5×10⁻⁴⁷ GeV⁴ in geometrized units.
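The chain of unit conversions in this figure can be checked directly (an illustrative sketch; c, the electron-volt, and ħc are standard rounded values):

```python
# Convert the measured vacuum energy density between the quoted units.
c = 2.998e8           # speed of light, m/s
eV = 1.602e-19        # electron-volt, J
hbar_c = 1.973e-16    # hbar * c in GeV * m, for natural-unit conversions

rho_kg = 5.96e-27                 # kg/m^3 (Planck 2015 measurement)
rho_J = rho_kg * c**2             # mass density -> energy density, J/m^3
rho_GeV = rho_J / (1e9 * eV)      # J/m^3 -> GeV/m^3
rho_GeV4 = rho_GeV * hbar_c**3    # GeV/m^3 -> GeV^4 (natural units)

print(f"{rho_J:.4e} J/m^3")      # ~5.36e-10
print(f"{rho_GeV:.2f} GeV/m^3")  # ~3.34
print(f"{rho_GeV4:.1e} GeV^4")   # ~2.5e-47
```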
One assessment, made by Jérôme Martin of the Institut d'Astrophysique de Paris in 2012, placed the expected theoretical vacuum energy scale around 10⁸ GeV⁴, for a difference of about 55 orders of magnitude.[12]
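The "55 orders of magnitude" figure follows directly from the two densities quoted above (a one-line check):

```python
import math

theory = 1e8        # GeV^4, Martin's 2012 theoretical vacuum energy scale
observed = 2.5e-47  # GeV^4, measured value quoted above

# log10 of the ratio gives the number of orders of magnitude of mismatch
print(f"~{math.log10(theory / observed):.0f} orders of magnitude")  # ~55
```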
Some proposals involve modifying gravity to diverge from general relativity. These proposals face the hurdle that the results of observations and experiments so far have tended to be extremely consistent with general relativity and the ΛCDM model, and inconsistent with the modifications proposed thus far. In addition, some of the proposals are arguably incomplete, because they solve the "new" cosmological constant problem by proposing that the actual cosmological constant is exactly zero rather than a tiny number, but fail to solve the "old" cosmological constant problem of why quantum fluctuations seem to fail to produce substantial vacuum energy in the first place. Nevertheless, many physicists argue that, due in part to a lack of better alternatives, proposals to modify gravity should be considered "one of the most promising routes to tackling" the cosmological constant problem.[17]
Bill Unruh and collaborators have argued that when the energy density of the quantum vacuum is modeled more accurately as a fluctuating quantum field, the cosmological constant problem does not arise.[18] Going in a different direction, George F. R. Ellis and others have suggested that in unimodular gravity, the troublesome contributions simply do not gravitate.[19][20] More recently, a fully diffeomorphism-invariant action principle has been proposed that gives the equations of motion for trace-free Einstein gravity, in which the cosmological constant emerges as an integration constant.[21]
Another argument, due to Stanley Brodsky and Robert Shrock, is that in light-front quantization, the quantum field theory vacuum becomes essentially trivial. In the absence of vacuum expectation values, there is no contribution from quantum electrodynamics, the weak interactions, or quantum chromodynamics to the cosmological constant, which is thus predicted to be zero in a flat spacetime.[22][23] From this light-front perspective, the cosmological constant problem is traced back to unphysical non-causal terms in the standard calculation, which lead to an erroneously large value of the cosmological constant.[24]
In 2018, a mechanism for cancelling out Λ was proposed, using a symmetry-breaking potential in a Lagrangian formalism in which matter exhibits a non-vanishing pressure. The model assumes that standard matter provides a pressure which counterbalances the action of the cosmological constant. Luongo and Muccino have shown that this mechanism permits taking the vacuum energy to be as large as quantum field theory predicts, with the huge magnitude removed by a counterbalancing term due to baryons and cold dark matter only.[25]
In 1999, Andrew Cohen, David B. Kaplan and Ann Nelson proposed that correlations between the UV and IR cutoffs in effective quantum field theory are enough to reduce the theoretical cosmological constant down to the measured cosmological constant due to the Cohen–Kaplan–Nelson (CKN) bound.[26] In 2021, Nikita Blinov and Patrick Draper confirmed through the holographic principle that the CKN bound predicts the measured cosmological constant, all while maintaining the predictions of effective field theory in less extreme conditions.[27]
Some propose an anthropic solution,[28] and argue that we live in one region of a vast multiverse that has different regions with different vacuum energies. These anthropic arguments posit that only regions of small vacuum energy, such as the one in which we live, are reasonably capable of supporting intelligent life. Such arguments have existed in some form since at least 1981. Around 1987, Steven Weinberg estimated that the maximum allowable vacuum energy for gravitationally bound structures to form is problematically large, even given the observational data available in 1987, and concluded that the anthropic explanation appears to fail; however, more recent estimates by Weinberg and others, based on other considerations, find the bound to be closer to the actual observed level of dark energy.[29][30] Anthropic arguments gradually gained credibility among many physicists after the discovery of dark energy and the development of the theoretical string theory landscape, but are still viewed with skepticism by a substantial portion of the scientific community because they are difficult to verify. Proponents of anthropic solutions are themselves divided on multiple technical questions about how to calculate the proportion of regions of the universe with various dark energy constants.[29][17]