Radiation bombardment from ancient supernovae may have triggered climate change
11/07/2016
Shakespeare said that the fault lies not in the stars, but in ourselves. Maybe, but scientists at the University of Kansas say that the stars might have some explaining to do. According to computer models, a pair of supernovae that exploded about 300 light years away between 1.7 and 8.7 million years ago could have released radiation that seriously affected life on prehistoric Earth and may even have triggered an ice age.

Previous research published in April on ancient seabed deposits of iron-60 isotopes produced what the Kansas team called "slam dunk" evidence that two supernovae exploded 300 light years from Earth. A supernova is a star that explodes with such energy that it can outshine all the stars in a galaxy 20 times over. It is caused by a white dwarf star accreting so much gas from a neighboring star that it detonates, by a massive star that uses up most of its hydrogen and helium and collapses in on itself, or by two white dwarfs colliding. Isotopes like iron-60 are traces of such events.
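Iron-60 works as a marker because it is radioactive yet long-lived on geological timescales. A minimal sketch below, assuming the commonly cited iron-60 half-life of roughly 2.6 million years (a figure not stated in the article), shows how much of an original seabed deposit would still be around today for the event dates discussed here:

```python
# Illustrative only: assumed Fe-60 half-life of ~2.6 million years,
# which is not a number taken from the article itself.
HALF_LIFE_MYR = 2.6

def remaining_fraction(age_myr: float) -> float:
    """Fraction of the original Fe-60 still undecayed after age_myr million years."""
    return 0.5 ** (age_myr / HALF_LIFE_MYR)

# Ages bracketing the two explosions discussed in the article.
for age in (1.7, 3.2, 6.5, 8.7):
    print(f"{age:>4.1f} Myr ago: {remaining_fraction(age):.1%} of the Fe-60 remains")
```

Even for the older explosion, a measurable fraction of the isotope would survive, which is why seabed cores can still record it.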

In this case, one star exploded 1.7 to 3.2 million years ago and the other 6.5 to 8.7 million years ago. This would have produced dazzling blue-lit nights that would have disrupted animal sleep and other nocturnal patterns, but the big problem would have been a huge jump in cosmic radiation that would have affected not only land animals but shallow-dwelling sea life as well.

"The big thing turns out to be the cosmic rays," says professor of physics Adrian Melott. "The really high-energy ones are pretty rare. They get increased by quite a lot here — for a few hundred to thousands of years, by a factor of a few hundred. The high-energy cosmic rays are the ones that can penetrate the atmosphere. They tear up molecules, they can rip electrons off atoms, and that goes on right down to the ground level. Normally that happens only at high altitude."

According to the Kansas team, the supernovae would have subjected the Earth to 20 times the normal dosage of muons. These subatomic particles strike the Earth all the time, but cause little damage. Melott says that despite being hundreds of times heavier than an electron, muons can penetrate hundreds of meters of solid rock without interacting, but in large enough doses the effect can be significant.

In the case of the supernovae, Melott estimates that they increased radiation doses by a factor of three. This would have been enough to increase the frequency of cancer and the rate of mutation. It might also have triggered climate change about 2.59 million years ago by ionizing the lower levels of the atmosphere, much like a laboratory cloud chamber, seeding extra cloud cover that cooled the Earth and increasing the amount of cloud-to-ground lightning.
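A rough back-of-the-envelope sketch shows how a 20-fold jump in muons could plausibly triple the total dose. The baseline figures below are assumptions chosen for illustration (a normal annual background dose of about 3.0 mSv, of which muons contribute about 0.3 mSv); they are not values taken from the study:

```python
# Assumed baseline numbers, for illustration only.
BASELINE_TOTAL_MSV = 3.0   # assumed normal annual background dose, mSv
BASELINE_MUON_MSV = 0.3    # assumed muon share of that background, mSv
MUON_MULTIPLIER = 20       # factor quoted by the Kansas team

other_sources = BASELINE_TOTAL_MSV - BASELINE_MUON_MSV
boosted_total = other_sources + MUON_MULTIPLIER * BASELINE_MUON_MSV

print(f"Boosted annual dose: {boosted_total:.1f} mSv "
      f"(~{boosted_total / BASELINE_TOTAL_MSV:.1f}x normal)")
# With these assumed baselines, a 20x muon flux works out to roughly a 3x
# total dose, consistent with Melott's factor-of-three estimate.
```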

"There was climate change around this time," says Melott. "Africa dried out, and a lot of the forest turned into savannah. Around this time and afterwards, we started having glaciations — ice ages — over and over again, and it's not clear why that started to happen. It's controversial, but maybe cosmic rays had something to do with it."

