For the first time, DES scientists can combine measurements of the distribution of matter, galaxies, and galaxy clusters to advance our understanding of dark energy.
The universe is expanding at an ever-increasing rate, and while no one is sure why, researchers with the Dark Energy Survey (DES) at least had a strategy for figuring it out: They would combine measurements of the distribution of matter, galaxies, and galaxy clusters to better understand what’s going on.
Reaching that goal turned out to be pretty tricky, but now a team led by researchers at the Department of Energy's SLAC National Accelerator Laboratory, Stanford University, and the University of Arizona has come up with a solution. Their analysis, published recently in Physical Review Letters, yields more precise estimates of the average density of matter as well as its propensity to clump together – two key parameters that help physicists probe the nature of dark matter and dark energy, the mysterious substances that make up the vast majority of the universe.
“It is one of the best constraints from one of the best data sets to date,” says Chun-Hao To, a lead author on the new paper and a graduate student at SLAC and Stanford working with Kavli Institute for Particle Astrophysics and Cosmology Director Risa Wechsler.
An early goal
When DES set out in 2013 to map an eighth of the sky, the goal was to gather four kinds of data: the distances to certain types of supernovae, or exploding stars; the distribution of matter in the universe; the distribution of galaxies; and the distribution of galaxy clusters. Each tells researchers something about how the universe has evolved over time.
Ideally, scientists would put all four data sources together to improve their estimates, but there’s a snag: The distributions of matter, galaxies, and galaxy clusters are all closely related. If researchers don’t take these relationships into account, they will end up “double counting,” placing too much weight on some data and not enough on others, To says.
To avoid mishandling all this information, To, University of Arizona astrophysicist Elisabeth Krause and colleagues have developed a new model that could properly account for the connections in the distributions of all three quantities: matter, galaxies, and galaxy clusters. In doing so, they were able to produce the first-ever analysis to properly combine all these disparate data sets in order to learn about dark matter and dark energy.
Adding that model into the DES analysis has two effects, To says. First, measurements of the distributions of matter, galaxies, and galaxy clusters tend to introduce different kinds of errors. Combining all three measurements makes it easier to identify any such errors, making the analysis more robust. Second, the three measurements differ in how sensitive they are to the average density of matter and its clumpiness. As a result, combining all three can improve the precision with which the DES can measure dark matter and dark energy.
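The "double counting" problem described above can be illustrated with a toy calculation. This is my own illustrative sketch, not the DES pipeline: it shows that combining two correlated measurements as if they were independent produces an error bar that is too small, while a joint covariance matrix gives the honest uncertainty.

```python
# Toy illustration (not the DES analysis): two probes measure the same
# parameter, each with uncertainty 1.0, but they share 90% of their
# information (correlation rho = 0.9).
sigma1, sigma2, rho = 1.0, 1.0, 0.9

# Naive combination treats the probes as independent, "double counting"
# the shared information: inverse variances simply add.
var_naive = 1.0 / (1.0 / sigma1**2 + 1.0 / sigma2**2)

# Proper combination uses the joint 2x2 covariance matrix C and the
# generalized-least-squares variance 1 / (1^T C^-1 1).
c11, c12, c22 = sigma1**2, rho * sigma1 * sigma2, sigma2**2
det = c11 * c22 - c12**2
# Elements of C^-1, written out explicitly for the 2x2 case.
i11, i12, i22 = c22 / det, -c12 / det, c11 / det
var_joint = 1.0 / (i11 + 2 * i12 + i22)

print(f"naive combined variance: {var_naive:.3f}")   # 0.500 -- overconfident
print(f"joint combined variance: {var_joint:.3f}")   # 0.950 -- honest
```

With strongly correlated probes, the naive combination claims nearly twice the constraining power actually available; a model that tracks the cross-correlations, like the one described here, avoids that overconfidence.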
In the new paper, To, Krause and colleagues applied their new methods to the first year of DES data and sharpened the precision of previous estimates for matter’s density and clumpiness.
Now that the team can incorporate matter, galaxies, and galaxy clusters simultaneously in their analysis, adding in supernova data will be relatively straightforward, since that kind of data is not as closely related to the other three, To says.
“The immediate next step,” he says, “is to apply the machinery to DES Year 3 data, which has three times larger coverage of the sky.” This is not as simple as it sounds: While the basic idea is the same, the new data will require additional efforts to improve the model to keep up with the higher quality of the newer data, To says.
“This analysis is really exciting,” Wechsler says. “I expect it to set a new standard in the way we are able to analyze data and learn about dark energy from large surveys, not only for DES but also looking forward to the incredible data that we will get from the Vera Rubin Observatory’s Legacy Survey of Space and Time in a few years.”
Reference: “Dark Energy Survey Year 1 Results: Cosmological Constraints from Cluster Abundances, Weak Lensing, and Galaxy Correlations” by C. To et al. (DES Collaboration), 6 April 2021, Physical Review Letters.
The research was a collaborative effort within the Dark Energy Survey and was supported by the National Science Foundation and the Department of Energy’s Office of Science.
Another way to explain Dark Energy is suggested by String Theory. All matter and energy, including photons (light), have vibrating strings as their basis.
String and anti-string pairs are speculated to be created in the quantum foam, a roiling energy field suggested by quantum mechanics, and they immediately annihilate each other. If light passes near these string/anti-string annihilations, perhaps some of that annihilation energy is absorbed by the string in the light. Then the Fraunhofer lines in that light will move a bit towards the blue and away from the red shift. As this continues in an expanding universe we get the same curve displayed by Perlmutter and colleagues at their Nobel Prize lecture, without the need for Dark Energy.
This speculation has the universe behaving in a much more direct way. Specifics can be found in my YouTube https://www.youtube.com/watch?v=epk-SMXbu1c
You know the drill: that is self-promotion and linking to pseudoscience.
No other theory can touch the general relativistic ΛCDM model. The recently completed SDSS-IV extended Baryon Oscillation Spectroscopic Survey has released several results, but the strongest may be that dark energy is now tested beyond reasonable doubt.
“By combining BAO and RSD, the team confirmed the existence of dark energy to a stunning confidence level of 11-Sigma. Typically, a scientific result to 5-Sigma is taken as confirmation. A result at 11-Sigma is so strong it is about as close to certainty that we can get. Dark energy and the accelerating expansion it drives is definitely real.”
[ https://www.universetoday.com/151042/11-sigma-detection-of-dark-energy-comes-from-measuring-over-a-million-extremely-distant-galaxies/ ]
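For context on those sigma levels, here is a short sketch of my own (standard library only) converting a Gaussian significance to a one-sided tail probability, which shows how extreme 11 sigma is compared to the conventional 5-sigma discovery threshold:

```python
import math

def one_sided_p_value(n_sigma: float) -> float:
    """Gaussian upper-tail probability for an n-sigma result."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

# 5 sigma: the conventional discovery threshold, p ~ 2.9e-7,
# i.e. roughly a 1-in-3.5-million chance of a fluke.
p5 = one_sided_p_value(5.0)

# 11 sigma: the eBOSS dark-energy detection quoted above;
# the p-value is smaller by more than 20 orders of magnitude.
p11 = one_sided_p_value(11.0)

print(f"5-sigma  p ~ {p5:.1e}")
print(f"11-sigma p ~ {p11:.1e}")
```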
Also notable is that another paper nearly closes the remaining observational room for spatial curvature.
Its preprint version notes that the BAO data “allow constraints on curvature that are now roughly one order of magnitude within the detectable limit of σ(Ω_k) ∼ 0.0001”.
And they see Weinberg’s multiverse, which is a natural consequence of eternal inflation of the type supported by the Planck collaboration’s 2018 cosmic microwave background spectra, as a strong candidate for predicting the observed flatness of space and the value of the vacuum energy:
“Nevertheless, the observed consistency with flat ΛCDM at the higher precision of this work points increasingly towards a pure cosmological constant solution, for example, as would be produced by a vacuum energy finetuned to have a small value. This fine-tuning represents a theoretical difficulty without any agreed-upon resolution and one that may not be resolvable through fundamental physics considerations alone (Weinberg 1989; Brax & Valageas 2019). This difficulty has been substantially sharpened by the observations presented here.”
“Completed SDSS-IV extended Baryon Oscillation Spectroscopic Survey: Cosmological implications from two decades of spectroscopic surveys at the Apache Point Observatory” by Shadab Alam et al., Phys. Rev. D 103, 083533, published 28 April 2021.
What I find fascinating with this study, apart from the other technical advances, is that they find it an improvement to add the small neutrino masses to the basic ΛCDM model, yielding a νΛCDM model.