A 2030s Supernova May Hold The Key To The Age Of The Entire Universe

A long time ago, in a galaxy far away, a star much like our own sun entered the final stage of its life. Its hydrogen and helium had been consumed, and all that was left was a dense, high-gravity white dwarf star (via BBC). Unlike our star, this one had a twin, which was being voraciously consumed by its sibling's intense gravity. As the white dwarf continued to feed, its mass and the crushing force of its gravity grew until it reached a critical limit, and the inward pull of gravity overcame the outward pressure of the densely packed matter within. The result was as catastrophic as it was brilliant (via National Schools' Observatory).

As the star collapsed under its own weight, it created a stellar explosion — a supernova — that shone so brightly that its light survived a 10-billion-year journey across the vast reaches of the universe to reach us in 2016. Along the way, its light was bent and split by the gravity of intervening matter, so we didn't see it just once — we saw it three times at once. And, if researchers are right, we'll see it at least one more time in the 2030s, which could lead to a better understanding of the age of the universe (via Sci.News).

The age of the Universe

For most of human history, the age of the universe was derived from religion and mythology. Up until the 18th century, the best minds of Europe estimated that everything had been created as early as 5500 B.C. or as late as 3928 B.C. (via Natural Environment Research Council). Based upon a growing — but incomplete — understanding of natural processes, scientists began to rethink the age of the Earth, which would give a minimum baseline for the age of the universe. Some considered the tilt of subsurface strata and the rate of sediment deposition to arrive at an answer. Others postulated that the Earth was born molten and calculated how long it would take to cool enough to form a solid crust. The general consensus was that the Earth was around 100 million years old (via Scientific American).

In the early 20th century, researchers were beginning to understand the science of radioactive decay. By measuring how quickly unstable elements like uranium decay, scientists could estimate the age of rock and mineral samples. In 1913, British scientist Arthur Holmes was the first to put this technique into practice, estimating that the Earth was 1.6 billion years old (via American Museum of Natural History).
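
As a rough illustration of how dating by radioactive decay works (a simplified sketch, not Holmes' actual calculation), the ratio of surviving radioactive "parent" atoms to accumulated "daughter" atoms, combined with the parent's known half-life, gives the time elapsed since a mineral formed:

```python
import math

# Simplified sketch of radiometric dating (illustrative numbers, not Holmes'
# actual data). If a mineral holds P atoms of a radioactive parent and D atoms
# of its stable daughter product, the decay law N(t) = N0 * 2^(-t / half_life)
# gives the elapsed time, treating the decay chain as a single step.
def age_in_years(parent_atoms, daughter_atoms, half_life_years):
    # Originally there were P + D parent atoms; solve P = (P + D) * 2^(-t/T) for t.
    return half_life_years * math.log2(1 + daughter_atoms / parent_atoms)

# Hypothetical example: uranium-238 (half-life about 4.5 billion years) with a
# daughter-to-parent ratio of 0.28 dates the sample to roughly 1.6 billion years.
print(f"{age_in_years(1.0, 0.28, 4.5e9) / 1e9:.1f} billion years")
```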

A static Universe

Around the same time, astronomers were refining their view of the universe. The leading view was that the universe was unchanging and eternal; it didn't have a beginning, nor would it have an end. All that was — the Earth, the sun, everything visible in the night sky — was encapsulated within the luminous confines of the Milky Way (via ESSAI).

In 1916, an American astronomer named Harlow Shapley began studying globular clusters: densely packed groups of stars that formed around the same time. Shapley theorized that globular clusters were evenly distributed throughout the galaxy and that, by measuring their positions, he could get a rough idea of the shape of the Milky Way and our place in it (via American Institute of Physics). To achieve this, he pioneered the use of standard candles — stars whose intrinsic brightness is known — to measure interstellar distances. His observations dislodged the notion that the Earth was at the center of the galaxy and relegated us to "the unfashionable end of the western spiral arm of the galaxy." Central to his calculations was a special type of star whose absolute brightness could be predicted to a very high degree.

A standard candle in the void

In the late 18th century, astronomers first observed a new type of star that changed in size and brightness at regular intervals. These stars were named after one of the constellations they were initially discovered in: Cepheus, near the north celestial pole. Over 100 years later, Henrietta Swan Leavitt was studying a group of Cepheid variable stars in the Small Magellanic Cloud (via American Institute of Physics). Over the course of her observations, she noticed a relationship between how often a star's brightness changed and its absolute magnitude (how bright it would appear when viewed from 10 parsecs). That meant that if you knew how often the star pulsed, you could determine how intrinsically bright it was. Comparing that intrinsic brightness with how bright the star appears from Earth then reveals how far away it is.
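
A rough sketch of how that measurement works is below. The period-luminosity coefficients and the example star are illustrative placeholders, not Leavitt's or Shapley's actual numbers; the point is simply that a pulsation period yields an absolute magnitude, and the gap between absolute and apparent magnitude (the distance modulus) yields a distance.

```python
import math

def cepheid_distance_parsecs(period_days, apparent_mag):
    """Estimate the distance to a Cepheid from its pulsation period and its
    apparent magnitude, using an approximate period-luminosity relation."""
    # Illustrative Leavitt-law coefficients; real calibrations vary by waveband.
    absolute_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    # Distance modulus: apparent_mag - absolute_mag = 5 * log10(distance / 10 pc)
    return 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# Hypothetical star: a Cepheid pulsing every 30 days that appears at magnitude 19
# works out to roughly 700,000 parsecs, a bit over 2 million light years away.
print(f"{cepheid_distance_parsecs(30.0, 19.0):,.0f} parsecs")
```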

This was the yardstick by which Harlow Shapley measured the galaxy and, so he thought, the universe. The prevailing idea of the time was that the Milky Way was the sum of the universe. Some speculated that the so-called spiral nebulae were galaxies unto themselves. Shapley thought that couldn't be true because if his estimate of the size of the galaxy were correct (it was pretty close) and the spiral nebulae were galaxies of comparable size, they would have to be millions of light years away to appear as small as they do in the sky. And that would be absurd, right?

The universe is big

In 1923, Edwin Hubble was observing the Andromeda Galaxy. At the time, it was known as the Andromeda Nebula, one of the poorly understood spiral nebulae populating the night sky. In the course of his observations, he discovered a Cepheid. Calculating its distance the same way Harlow Shapley had a few years earlier, Hubble estimated that this variable star was nearly one million light years away (it's actually closer to 2.5 million), farther than the Milky Way is wide. This proved without a shadow of a doubt that spiral nebulae were, in fact, galaxies in their own right (via NASA).

A decade earlier, Vesto Slipher was studying the spectra of spiral nebulae (which we now know are galaxies) and found that they were moving faster than any celestial object yet measured: over 180 miles per second (per Royal Observatory Edinburgh). What turned out to be more interesting was that most of these galaxies were moving away from us (via Lowell Observatory). In 1929, Hubble and his assistant Milton Humason expanded on Slipher's survey of galaxies and discovered that the farther away a galaxy was, the faster it was moving away from us. Hubble calculated that nearly every extragalactic object in the sky was receding from us, and that its recession speed grew by over 300 miles per second for every megaparsec (about 3.26 million light years) of distance (via University of Chicago). The relationship between velocity and distance is now known as Hubble's Law, and the ratio of speed to distance is known as the Hubble constant.
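
In modern units, Hubble's Law is usually written as v = H0 × d, with velocity in kilometers per second and distance in megaparsecs. The sketch below uses made-up galaxy measurements (not Hubble's 1929 data) just to show that the ratio of speed to distance comes out roughly the same for every galaxy, and that ratio is the Hubble constant.

```python
# Illustrative only: hypothetical distances (megaparsecs) and recession
# velocities (km/s), not Hubble and Humason's actual measurements.
observations = [
    (20.0, 1_450),
    (80.0, 5_500),
    (300.0, 21_200),
]

# Hubble's Law, v = H0 * d, means velocity divided by distance should be
# roughly constant; averaging that ratio gives an estimate of H0.
ratios = [velocity / distance for distance, velocity in observations]
hubble_constant = sum(ratios) / len(ratios)
print(f"Estimated Hubble constant: {hubble_constant:.0f} km/s per megaparsec")
```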

An expanding Universe

Edwin Hubble's observations showed unequivocally that the universe is expanding in all directions. By measuring the speed of expansion, it's possible to work backward in time to establish the moment everything began expanding — the Big Bang (via the American Institute of Physics). The problem is that we're still trying to figure out exactly what the Hubble constant is. One of the ways to do that is to find something far away and then calculate its distance and speed. Speed can be determined simply by measuring the Doppler shift of the light. Figuring out the distance of a far-away object is a little bit trickier (via the University of Chicago).
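
To first approximation, the arithmetic behind "working backward" is simple: if a galaxy's recession speed has stayed roughly constant, the time it needed to reach its current distance is d / v = 1 / H0, so the reciprocal of the Hubble constant sets the rough age of the universe. A back-of-the-envelope sketch, assuming a modern value of about 70 km/s per megaparsec:

```python
# Rough sketch: the "Hubble time" 1/H0 approximates the age of the universe,
# ignoring how the expansion rate has changed over cosmic history.
H0 = 70.0                      # km/s per megaparsec (approximate modern value)
KM_PER_MEGAPARSEC = 3.086e19   # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7

hubble_time_seconds = KM_PER_MEGAPARSEC / H0
hubble_time_years = hubble_time_seconds / SECONDS_PER_YEAR
print(f"1/H0 is roughly {hubble_time_years / 1e9:.0f} billion years")
```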

One way to do it is to look at a distant event like a supernova whose light reaches us at two or more different times because it has been bent along different paths by the gravity of an intervening massive object. By comparing the arrival times of the images, the distances involved can be inferred (via SciTechDaily). If and when the light reaches us again around 2037, cosmologists should be able to pin down its distance from us, which — coupled with the recession speed of its host galaxy — will allow us to home in on a more precise value for the Hubble constant. A better handle on the ratio between an object's recession speed and its distance will give us a more precise date for the Big Bang that started it all (via Science News).
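
The basic logic can be sketched in a few lines. For a fixed model of the lensing mass, the predicted delay between the lensed images scales as 1 / H0, so comparing the delay a model predicts (under an assumed Hubble constant) with the delay actually observed rescales that assumed value. Every number below is a hypothetical placeholder, not a measured value for this supernova.

```python
# Hedged sketch of time-delay cosmography. For a fixed lens model the image
# delay scales as 1/H0, so: H0_inferred = H0_assumed * (predicted / observed).
# All values are hypothetical placeholders.
ASSUMED_H0 = 70.0             # km/s per megaparsec used when computing the model delay
PREDICTED_DELAY_YEARS = 21.0  # delay the lens model predicts at the assumed H0
OBSERVED_DELAY_YEARS = 20.0   # delay measured once the final image arrives

inferred_h0 = ASSUMED_H0 * (PREDICTED_DELAY_YEARS / OBSERVED_DELAY_YEARS)
print(f"Inferred Hubble constant: {inferred_h0:.1f} km/s per megaparsec")
```

In this toy example, a shorter observed delay than the model predicts corresponds to a larger Hubble constant, and hence a somewhat younger universe.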
