Expansion of Universe accelerating


Ask Ethan: Does Dark Energy Mean We’re Losing Information About The Universe?

“The universe’s expansion means our visible horizon is retreating; things faraway are vanishing continuously. (Albeit slowly, right now.) This would seem to imply we are losing information about the universe. So why is it the idea of losing information in a black hole’s event horizon is so controversial, if we’re constantly losing information to another horizon?”

As you look to greater and greater distances, you’re looking further back in time in the Universe. But thanks to dark energy, what we can see and access today won’t always be accessible. As galaxies grow more distant with the accelerating expansion of the Universe, they eventually recede faster than the speed of light. At present, 97% of the galaxies in the Universe aren’t reachable by us, even at the speed of light. But that isn’t the same as losing information. As a galaxy crosses over the horizon, its information never entirely disappears from the Universe connected to us. Instead, it gets imprinted on the cosmic horizon, the same way that information falling into a black hole gets imprinted on its event horizon. But there’s a fundamental difference between a black hole’s decaying horizon and the cosmic horizon’s eternal persistence, and that makes all the difference.
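For the curious, here is a minimal sketch of where a figure like that 97% comes from, assuming a flat ΛCDM universe with illustrative round parameters (the exact percentage depends on the values chosen): compare the comoving distance a light signal sent today can ever cover with the comoving distance to the edge of the observable Universe, and cube the ratio to get a volume fraction.

```python
import numpy as np
from scipy.integrate import quad

H0 = 70.0              # assumed Hubble constant, km/s/Mpc
c = 299792.458         # speed of light, km/s
Om, OL = 0.3, 0.7      # assumed matter and dark-energy fractions (flat)

def H(a):
    """Expansion rate at scale factor a for flat Lambda-CDM."""
    return H0 * np.sqrt(Om / a**3 + OL)

def comoving_distance(a_min, a_max):
    """Comoving distance in Mpc covered by light between scale factors."""
    val, _ = quad(lambda a: c / (a**2 * H(a)), a_min, a_max)
    return val

reachable = comoving_distance(1.0, np.inf)   # light we send from now on
observable = comoving_distance(1e-8, 1.0)    # light that has reached us

print(f"still reachable: {(reachable / observable)**3:.1%} of the"
      " observable volume")   # roughly 4-5%, i.e. ~95-97% is out of reach
```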

Come learn why even with dark energy, we don’t lose information about the Universe, but why the black hole information paradox is real!

Cartography of the Cosmos

There are hundreds of billions of stars in our own Milky Way galaxy. Estimates indicate a similar number of galaxies in the observable universe, each with its own large assemblage of stars, many with their own planetary systems. Beyond and between these stars and galaxies is all manner of matter in various phases, such as gas and dust. Another form of matter, dark matter, exists in a very different and mysterious form, announcing its presence indirectly only through its gravitational effects.

This is the universe Salman Habib is trying to reconstruct, structure by structure, using precise observations from telescope surveys combined with next-generation data analysis and simulation techniques currently being primed for exascale computing.



STAR MERGERS: A NEW TEST OF GRAVITY, DARK ENERGY THEORIES

When scientists recorded a rippling in space-time, followed within two seconds by an associated burst of light observed by dozens of telescopes around the globe, they had witnessed, for the first time, the explosive collision and merger of two neutron stars.

The intense cosmological event observed on Aug. 17 also had other reverberations here on Earth: It ruled out a class of dark energy theories that modify gravity, and challenged many others.

Dark energy, which is driving the accelerating expansion of the universe, is one of the biggest mysteries in physics. It makes up about 68 percent of the total mass and energy of the universe and functions as a sort of antigravity, but we don’t yet have a good explanation for it. Simply put, dark energy acts to push objects away from one another, while gravity acts to pull them together.
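In the language of general relativity, this tug-of-war is captured by the second Friedmann equation (standard cosmology, not a result of this study): the expansion accelerates only if some component has sufficiently negative pressure,

$$\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right),$$

so a fluid with p < −ρc²/3 makes the right-hand side positive. A cosmological constant, with p = −ρc², does exactly that.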

The neutron star merger created gravitational waves – a squiggly distortion in the fabric of space and time, like a tossed stone sending ripples across a pond – that traveled about 130 million light-years through space, and arrived at Earth at almost the same instant as the high-energy light that jetted out from this merger.

The gravitational-wave signature was detected by a network of Earth-based detectors called LIGO and Virgo, and the first intense burst of light was observed by the Fermi Gamma-ray Space Telescope.

That nearly simultaneous arrival time is a very important test for theories about dark energy and gravity.
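To see why, it helps to run the numbers (rough values quoted in the article, used here for a back-of-the-envelope estimate): a roughly two-second offset accumulated over a roughly 130-million-light-year journey pins the speeds of gravity and light together to about one part in 10^15.

```python
SECONDS_PER_YEAR = 3.156e7               # seconds in a year
travel_time = 130e6 * SECONDS_PER_YEAR   # ~130 million years, in seconds
delay = 2.0                              # observed gamma-ray delay, seconds

# Any fractional speed difference would have stretched or shrunk the delay:
print(f"|v_gw - c| / c  <~  {delay / travel_time:.1e}")   # ~5e-16
```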

“Our results make significant progress to elucidate the nature of dark energy,” said Miguel Zumalacárregui, a theoretical physicist who is part of the Berkeley Center for Cosmological Physics at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley.

“The simplest theories have survived,” he said. “It’s really about the timing.”

He and Jose María Ezquiaga, who was a visiting Ph.D. researcher in the Berkeley Center for Cosmological Physics, participated in this study, which was published Dec. 18 in the journal Physical Review Letters.

The 100-year-old “cosmological constant” theory introduced by Albert Einstein in his work on general relativity, along with some other theories derived from this model, remains a viable contender. These theories propose that dark energy is constant in both space and time: gravitational waves and light waves are affected in the same way by dark energy, and thus travel at the same speed through space.

“The favorite explanation is this cosmological constant,” he said. “That’s as simple as it’s going to get.”

There are some complicated and exotic theories that also hold up to the test presented by the star-merger measurements. Massive gravity, for example – a theory of gravity that assigns a mass to a hypothetical elementary particle called a graviton – still holds a sliver of possibility if the graviton has a very slight mass.

Some other theories, though, predicted that the gravitational waves and the light signature of the star merger would arrive separated in time by far longer periods – stretching up to millions of years. Those theories don’t explain what was seen, and must be modified or scrapped.

The study notes that a class of theories known as scalar-tensor theories is particularly challenged by the neutron-star merger observations, including Einstein-Aether, MOND-like (relating to modified Newtonian dynamics), Galileon, and Horndeski theories, to name a few.

With tweaks, some of the challenged models can survive the latest test by the star merger, Zumalacárregui said, though they “lose some of their simplicity” in the process.

Zumalacárregui joined the cosmological center last year and is a Marie Sklodowska-Curie global research fellow who specializes in studies of gravity and dark energy.

He began studying whether gravitational waves could provide a useful test of dark energy following the February 2016 announcement that LIGO (the Laser Interferometer Gravitational-Wave Observatory), a pair of gravitational-wave detectors, had captured the first confirmed measurement of gravitational waves. Scientists believe those waves were created when two black holes merged to form a larger black hole.

But those types of events do not produce an associated burst of light. “You need both – not just gravitational waves to help test theories of gravity and dark energy,” Zumalacárregui said.

Another study, which he published with Ezquiaga and others in April 2017, explored the theoretical conditions under which gravitational waves could travel at a different velocity than light.

Another implication for this field of research is that, by collecting gravitational waves from these and possibly other cosmological events, it may be possible to use their characteristic signatures as “standard sirens” for measuring the universe’s expansion rate.

This is analogous to how researchers use the similar light signatures of certain objects – including exploding stars known as Type Ia supernovae and pulsating stars known as Cepheids – as “standard candles” to gauge their distance.

Cosmologists use a combination of such measurements to build a so-called distance ladder for gauging how far away a given object is from Earth, but there are some unresolved discrepancies that are likely due to the presence of space dust and imperfections in calculations.

Gathering more data from events that generate both gravitational waves and light could also help resolve different measurements of the Hubble constant – a popular gauge of the universe’s expansion rate.

The Hubble rate calibrated with supernovae distance measurements differs from the Hubble rate obtained from other cosmological observations, Zumalacárregui noted, so finding more standard sirens like neutron-star mergers could possibly improve the distance measurements.
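The arithmetic behind a standard siren is strikingly simple. As an illustration (with round numbers approximating those reported for the August event; they are assumptions here, not figures from this article), the gravitational waveform yields a distance directly, the host galaxy’s redshift yields a recession velocity, and their ratio is the Hubble constant:

```python
distance_mpc = 43.0       # luminosity distance inferred from the waveform, Mpc
recession_kms = 3000.0    # host galaxy's Hubble-flow velocity, km/s

H0 = recession_kms / distance_mpc   # Hubble's law: v = H0 * d
print(f"H0 ~ {H0:.0f} km/s/Mpc")    # ~70, consistent with other measurements
```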

The August neutron star merger event presented an unexpected but very welcome opportunity, he said.

“Gravitational waves are a very independent confirmation or refutation of the distance ladder measurements,” he said. “I’m really excited for the coming years. At least some of these nonstandard dark energy models could explain this Hubble rate discrepancy.

“Maybe we have underestimated some events, or something is unaccounted for that we’ll need to revise the standard cosmology of the universe,” he added. “If this standard holds, we will need radically new theoretical ideas that are difficult to verify experimentally, like multiple universes – the multiverse. However, if this standard fails, we will have more experimental avenues to test those ideas.”

New instruments and sky surveys are coming online that also aim to improve our understanding of dark energy, including the Berkeley Lab-led Dark Energy Spectroscopic Instrument project that is scheduled to begin operating in 2019. And scientists studying other phenomena, such as optical illusions in space caused by gravitational lensing – a gravity-induced effect that causes light from distant objects to bend and distort around closer objects – will also be useful in making more precise measurements.

“It could change the way we think about our universe and our place in it,” Zumalacárregui said. “It’s going to require new ideas.”


TOP IMAGE….Artist’s illustration of two merging neutron stars. The rippling space-time grid represents gravitational waves that travel out from the collision, while the narrow beams show the bursts of gamma rays that are shot out just seconds after the gravitational waves. Swirling clouds of material ejected from the merging stars are also depicted. The clouds glow with visible and other wavelengths of light. (Credit: NSF/LIGO/Sonoma State University/A. Simonnet)

LOWER IMAGE….Data from the neutron star merger observed Aug. 17 disfavor a range of theories, including many based around quintic Galileon cosmologies. This graph shows about 300 of these Galileon variants, with the green-shaded ones disfavored by the observed merger event. (Credit: Berkeley Lab, Physical Review Letters)


ARTIFICIAL INTELLIGENCE ANALYZES GRAVITATIONAL LENSES 10 MILLION TIMES FASTER

Synopsis: SLAC and Stanford researchers demonstrate that brain-mimicking ‘neural networks’ can revolutionize the way astrophysicists analyze their most complex data, including extreme distortions in spacetime that are crucial for our understanding of the universe.

Researchers from the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University have for the first time shown that neural networks – a form of artificial intelligence – can accurately analyze the complex distortions in spacetime known as gravitational lenses 10 million times faster than traditional methods.

“Analyses that typically take weeks to months to complete, that require the input of experts and that are computationally demanding, can be done by neural nets within a fraction of a second, in a fully automated way and, in principle, on a cell phone’s computer chip,” said postdoctoral fellow Laurence Perreault Levasseur, a co-author of a study published today in Nature.

Lightning Fast Complex Analysis

The team at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC), a joint institute of SLAC and Stanford, used neural networks to analyze images of strong gravitational lensing, where the image of a faraway galaxy is multiplied and distorted into rings and arcs by the gravity of a massive object, such as a galaxy cluster, that’s closer to us. The distortions provide important clues about how mass is distributed in space and how that distribution changes over time – properties linked to invisible dark matter that makes up 85 percent of all matter in the universe and to dark energy that’s accelerating the expansion of the universe.
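The connection between the observed arcs and the lensing mass is textbook gravitational lensing rather than anything specific to this paper: for a point-like lens of mass M, the characteristic angular radius of the ring – the Einstein radius – is

$$\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{ls}}{D_l\,D_s}},$$

where D_l, D_s and D_ls are the distances to the lens, to the source, and between the two. Fitting the observed rings and arcs for θ_E and its distortions is what lets researchers reconstruct the mass distribution of the lens.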

Until now this type of analysis has been a tedious process that involves comparing actual images of lenses with a large number of computer simulations of mathematical lensing models. This can take weeks to months for a single lens.

But with the neural networks, the researchers were able to do the same analysis in a few seconds, which they demonstrated using real images from NASA’s Hubble Space Telescope and simulated ones.

To train the neural networks in what to look for, the researchers showed them about half a million simulated images of gravitational lenses for about a day. Once trained, the networks were able to analyze new lenses almost instantaneously with a precision that was comparable to traditional analysis methods. In a separate paper, submitted to The Astrophysical Journal Letters, the team reports how these networks can also determine the uncertainties of their analyses.
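As a rough illustration of the approach – a minimal sketch with assumed layer sizes and parameter counts, not the authors’ actual architecture – a convolutional network can be trained to regress lens-model parameters directly from images of simulated lenses:

```python
import torch
import torch.nn as nn

class LensNet(nn.Module):
    def __init__(self, n_params=5):        # e.g. Einstein radius, ellipticity...
        super().__init__()
        self.features = nn.Sequential(     # layers extract progressively
            nn.Conv2d(1, 16, 5, stride=2), nn.ReLU(),   # larger-scale features
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, n_params)  # regression, not classification

    def forward(self, x):
        return self.head(self.features(x))

model = LensNet()
loss_fn = nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a batch: simulated images -> their known parameters.
images = torch.randn(8, 1, 96, 96)       # stand-in for simulated lens images
true_params = torch.randn(8, 5)          # stand-in for their lens parameters
opt.zero_grad()
loss = loss_fn(model(images), true_params)
loss.backward()
opt.step()
```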

Prepared for Data Floods of the Future

“The neural networks we tested – three publicly available neural nets and one that we developed ourselves – were able to determine the properties of each lens, including how its mass was distributed and how much it magnified the image of the background galaxy,” said the study’s lead author Yashar Hezaveh, a NASA Hubble postdoctoral fellow at KIPAC.

This goes far beyond recent applications of neural networks in astrophysics, which were limited to solving classification problems, such as determining whether an image shows a gravitational lens or not.

The ability to sift through large amounts of data and perform complex analyses very quickly and in a fully automated fashion could transform astrophysics in a way that is much needed for future sky surveys that will look deeper into the universe – and produce more data – than ever before.

The Large Synoptic Survey Telescope (LSST), for example, whose 3.2-gigapixel camera is currently under construction at SLAC, will provide unparalleled views of the universe and is expected to increase the number of known strong gravitational lenses from a few hundred today to tens of thousands.

“We won’t have enough people to analyze all these data in a timely manner with the traditional methods,” Perreault Levasseur said. “Neural networks will help us identify interesting objects and analyze them quickly. This will give us more time to ask the right questions about the universe.”

A Revolutionary Approach

Neural networks are inspired by the architecture of the human brain, in which a dense network of neurons quickly processes and analyzes information.

In the artificial version, the “neurons” are single computational units that are associated with the pixels of the image being analyzed. The neurons are organized into layers, up to hundreds of layers deep. Each layer searches for features in the image. Once the first layer has found a certain feature, it transmits the information to the next layer, which then searches for another feature within that feature, and so on.

“The amazing thing is that neural networks learn by themselves what features to look for,” said KIPAC staff scientist Phil Marshall, a co-author of the paper. “This is comparable to the way small children learn to recognize objects. You don’t tell them exactly what a dog is; you just show them pictures of dogs.”

But in this case, Hezaveh said, “It’s as if they not only picked photos of dogs from a pile of photos, but also returned information about the dogs’ weight, height and age.”

Although the KIPAC scientists ran their tests on the Sherlock high-performance computing cluster at the Stanford Research Computing Center, they could have done their computations on a laptop or even on a cell phone, they said. In fact, one of the neural networks they tested was designed to work on iPhones.

“Neural nets have been applied to astrophysical problems in the past with mixed outcomes,” said KIPAC faculty member Roger Blandford, who was not a co-author on the paper. “But new algorithms combined with modern graphics processing units, or GPUs, can produce extremely fast and reliable results, as the gravitational lens problem tackled in this paper dramatically demonstrates. There is considerable optimism that this will become the approach of choice for many more data processing and analysis problems in astrophysics and other fields.”

TOP IMAGES….KIPAC researchers used images of strongly lensed galaxies taken with the Hubble Space Telescope to test the performance of neural networks, which promise to speed up complex astrophysical analyses tremendously. (Yashar Hezaveh/Laurence Perreault Levasseur/Phil Marshall/Stanford/SLAC National Accelerator Laboratory; NASA/ESA)

LOWER IMAGE….Scheme of an artificial neural network, with individual computational units organized into hundreds of layers. Each layer searches for certain features in the input image (at left). The last layer provides the result of the analysis. The researchers used particular kinds of neural networks, called convolutional neural networks, in which individual computational units (neurons, gray spheres) of each layer are also organized into 2-D slabs that bundle information about the original image into larger computational units. (Greg Stewart/SLAC National Accelerator Laboratory)

anonymous asked:

Hey. I was wondering if you could answer a few questions about a project I'm doing for chemistry. We got to choose a topic, and it's basically a lab report that goes with our title. My title is How Much Has The Universe Expanded Over Time, and I'm not sure how to go about the materials needed for finding that, as well as the procedural steps, control, independent, and dependent variables. Anything you have to say would be a big help.

Hmm, since I don’t exactly know the specifics of your project, my answers may not be too helpful. With that in mind, and judging by the title you’ve chosen, I would probably talk about the Hubble Constant. The Hubble Constant describes the current rate of expansion of the universe, usually quoted in kilometers per second per megaparsec. I’d also talk about how the expansion was decelerating for roughly the first half of the universe’s history, and how it is accelerating in our present time.
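For instance, here is the kind of simple calculation you could build the report around (using an assumed round value for the constant): Hubble’s law says recession velocity equals the Hubble Constant times distance, so measuring both for many galaxies and fitting the slope gives the expansion rate.

```python
H0 = 70.0            # Hubble Constant, km/s per megaparsec (approximate)
distance = 100.0     # distance to a galaxy, in megaparsecs

velocity = H0 * distance                # Hubble's law: v = H0 * d
print(f"expected recession velocity: {velocity:.0f} km/s")   # 7000 km/s
```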

There is quite a lot of information you could cover in your lab report about the expansion of the universe. Also, since this is all heavy in astronomy and physics, I’m not exactly sure how much chemistry you have to toss into it. Nevertheless, here are a few other resources related to what I’m talking about:

Hubble Law and the Expanding Universe
Universe Expansion
Evidence for an Accelerating Universe

Hope this helps! If you have any other questions please let me know.


How long has the Universe been accelerating?

“The Universe has been accelerating for the past six billion years, and if we had come along sooner than that, we might never have considered an option beyond the three possibilities our intuition would have led us to. Instead, we get to perceive and draw conclusions about the Universe exactly as it is, and that’s perhaps the greatest reward of all.”

One of the biggest surprises in our understanding of the Universe came at the end of the 20th century, when we discovered that the Universe wasn’t just expanding, but that the expansion was accelerating. That means the fate of our Universe is a cold, lonely and isolated one, but it’s a fate that we wouldn’t have uncovered if we were born when the Universe was just half its current age. By understanding the Universe’s expansion history and determining what the different components are that it’s made of, we can figure out exactly how long the Universe has been accelerating. We find that dark energy rose to prominence some 7.8 billion years ago, and the Universe has been accelerating for the last 6 billion years. As the acceleration continues, more and more galaxies become unreachable from our perspective, even at the speed of light; that number’s already up to 97% of the galaxies in our visible Universe.
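Where does that 6-billion-year figure come from? A minimal sketch, assuming flat ΛCDM with illustrative round densities: acceleration begins once the thinning matter density drops below twice the (constant) dark-energy density.

```python
Om, OL = 0.3, 0.7   # assumed matter and dark-energy density fractions today

# Matter density scales as (1+z)^3; acceleration starts when
# Om * (1+z)^3 = 2 * OL (from the second Friedmann equation).
z_accel = (2 * OL / Om) ** (1 / 3) - 1
print(f"acceleration began at redshift z ~ {z_accel:.2f}")   # z ~ 0.67
# ...which corresponds to a lookback time of roughly 6 billion years.
```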


Could a new type of supernova eliminate dark energy?

“Imagine you had a box of candles that you thought were all identical to one another: you could light them up, put them all at different distances, and immediately, just from measuring the brightness you saw, know how far away they are. That’s the idea behind a standard candle in astronomy, and why type Ia supernovae are so powerful.

But now, imagine that these candle flames aren’t all the same brightness! Suddenly, some are a little brighter and some are a little dimmer; you have two classes of candles, and while you might have more of the brighter ones close by, you might have more of the dimmer ones far away. That’s what we think we’ve just discovered with supernovae: there are actually two separate classes of them, where one’s a little brighter in the blue/UV, and one’s a little brighter in the red/IR, and the light curves they follow are slightly different. This might mean that, at high redshifts (large distances), the supernovae themselves are actually intrinsically fainter, and not that they’re farther away.”

Back in the 1990s, scientists were quite surprised to find that when they measured the brightnesses and redshifts of distant supernovae, the supernovae appeared fainter than expected, leading to the conclusion that the Universe’s expansion was accelerating, pushing them farther away. But a 2015 study put forth a possibility that many scientists dreaded: that perhaps these distant supernovae were intrinsically different from the ones we had observed nearby. Would that potentially eliminate the need for dark energy altogether? Or would it simply change, ever so slightly, the amount and properties of dark energy we require to explain modern cosmology? A full analysis shows that dark energy is here to stay, regardless of the supernova data.