by Govert Schilling on 26 May 2011, 12:44 PM
BOSTON—For decades, astronomers and climatologists have debated whether a prolonged 17th century cold spell, best documented in Europe, could have been caused by erratic behavior of the sun. Now, an American solar physicist says he has new evidence to suggest that the sun was indeed the culprit.
The sun isn’t as constant as it appears. Instead, its surface is regularly beset by storms of swirling magnetic fields. As a result, like a teenager plagued with acne, the face of the sun often sprouts relatively dark and short-lived “sunspots,” which appear when strong magnetic fields inhibit the upwelling of hotter gas from below. The number of those spots waxes and wanes regularly in an 11-year cycle. However, even that cycle isn’t immutable.
In 1893, English astronomer Edward Maunder, studying historical records, noted that the cycle had essentially stopped between 1645 and 1715, a period during which the sun was almost devoid of sunspots. In 1976, American solar physicist John “Jack” Eddy suggested there might have been a causal link between this “Maunder Minimum” in the number of sunspots and the contemporaneous Little Ice Age, when average temperatures in Europe were a degree Celsius lower than normal.
One might expect the absence of dark spots to make the sun slightly brighter and hotter. But the absence of other signs of magnetic activity, such as bright patches of very hot gas known as faculae, more than compensates for this effect. So in fact, the total energy output of the sun is lower during a solar minimum. If the minimum is prolonged, as it was in the second half of the 17th century, the dip in output might indeed affect Earth’s climate.
However, scientists have debated whether the effect could have been large enough. For instance, in a recent paper in Geophysical Research Letters, solar physicist Karel Schrijver of the Lockheed Martin Advanced Technology Center in Palo Alto, California, and his colleagues argue that during the Maunder Minimum, the sun couldn’t have dimmed enough to explain the Little Ice Age. Even during a prolonged minimum, they claim, an extensive network of very small faculae on the sun’s hot surface remains to keep the energy output above a certain threshold level.
Not so, says Peter Foukal, an independent solar physicist with Heliophysics Inc. in Nahant, Massachusetts, who contends that Schrijver and his colleagues are “assuming an answer” in a circular argument. According to Foukal, who presented his work yesterday here at the summer meeting of the American Astronomical Society, there is no reason to believe that the network of small faculae would persist during long periods of solar quiescence. In fact, he says, observations between 2007 and 2009, when the sun was spotless for an unusually long time, reveal that all forms of magnetic activity diminished, including the small-faculae network.
What’s more, detailed observations from orbiting solar telescopes have shown that the small faculae pump out more energy per unit surface area than the larger ones already known to disappear along with the sunspots. So if the small faculae start to fade, too, that would have an even stronger effect on the total energy production of the sun. “There’s tantalizing evidence that [during the Maunder Minimum] the sun may have actually dimmed more than we have thought until now,” Foukal says.
Even so, Foukal concedes that other factors, such as enhanced volcanic activity around the globe, may also have played a role in causing Europe’s Little Ice Age. Meanwhile, the biggest worry for solar physicists—and for society—is that no one knows what caused the sun’s prolonged quiescence in the first place. For all anybody knows, a repeat of the Maunder Minimum could start within a few years with the next dip in the number of sunspots.