Total solar output is now measured to vary (over the last two 11-year sunspot cycles) by less than 0.1%, or about 1 W/m² peak-to-trough of the 11-year sunspot cycle.

The above quote ignores that, since 1850, solar output is up several times that.
For a long time -- in fact, up until late in the 20th century -- it was thought that more sunspots (which are cooler than the rest of the Sun) meant less total output. We now know that solar faculae -- bright areas associated with sunspots -- mean that the Sun's net output increases when there are more sunspots. Even Wikipedia, if you follow the links, ultimately admits this -- just not on the same page:
Faculae are produced by concentrations of magnetic field lines, and are most commonly found in the vicinity of sunspots; this is why the Sun is actually brighter when sunspots are more numerous.

If you'd restricted yourself to the main page on Solar Variation, all you would learn is:
The Sun's surface is also the most active when there are more sunspots, although the luminosity does not change much due to an increase in bright spots (faculae). (Emphasis added -- LH)

Who defines "much"? Under some definitions, the Sun does not change much, ever. But despite reinforcing further down that "the variation during recent cycles has been 0.1%", it does let slip that "Since the Maunder Minimum, over the past 300 years there probably has been an increase of 0.1 to 0.6%, with climate models often using a 0.25% increase."
I have seen no papers supporting the 0.1% number for that period, and others report as much as 1.25%. However, even the 0.6% number has mostly happened in the last half-century or so, as we will see.
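Those competing percentages are easier to compare when converted to absolute irradiance. A minimal sketch of the arithmetic, assuming the modern satellite-era total solar irradiance of roughly 1361 W/m² (my illustrative figure, not a value from any of the sources above):

```python
# Convert percentage changes in total solar irradiance (TSI) to W/m^2.
# TSI value is the approximate modern satellite-era figure at 1 AU;
# the percentages are the ones quoted in the text above.
TSI = 1361.0  # W/m^2, assumed for illustration

for pct in (0.1, 0.25, 0.6, 1.25):
    delta = TSI * pct / 100.0
    print(f"{pct:>5.2f}% of TSI = {delta:.1f} W/m^2")
```

Even the low-end 0.1% figure is on the order of 1 W/m², which is why the choice between 0.1% and 0.6% matters so much to the climate-model inputs mentioned above.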
Wikipedia's main article includes this handy graphic, showing 11,000 years of sunspot activity:
Note that for the last two millennia, the trend has been downward -- fewer sunspots, thus less total energy from the Sun reaching the Earth. When you look at this chart, you can immediately discard the notion of the Sun being involved in any recent warming trends.
The text nearby, though, tells you that something doesn't jibe:
The level of solar activity during the past 70 years is exceptional - the last period of similar magnitude occurred over 8,000 years ago. The Sun was at a similarly high level of magnetic activity for only ~10% of the past 11,400 years, and almost all of the earlier high-activity periods were shorter than the present episode.

Why doesn't the graph look like that? Because the graph was constructed by a Wikipedia member, who apparently did not want to use the scarier graph built by NOAA. Here's the NOAA data that he used, if you want to plug it in yourself. The data supplied by NOAA stops at 1899, though you have to read carefully to realize this. (The "55" in the data means "the decade centered 55 years before 1950.")
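Decoding those decade labels is simple once you know the convention: subtract the offset from 1950. A quick sketch (the function and variable names are mine, not NOAA's):

```python
# Decode the NOAA decade labels: a value such as 55 means
# "the decade centered 55 years before 1950", i.e. centered on 1895.
REFERENCE_YEAR = 1950

def decade_center(offset_before_1950):
    """Calendar year at the center of a labelled decade."""
    return REFERENCE_YEAR - offset_before_1950

print(decade_center(55))   # prints 1895
print(decade_center(105))  # prints 1845
```

This is also how you can verify that the dataset really does stop at 1899: the smallest offset in the file corresponds to a decade centered in the 1890s.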
I added data to 1995 from this source, and produced the chart below. The overlap from 1645 to 1895 is good -- but NOW you can see how dramatic the last few decades have been. Note the data in red, making the chart rather different from the cut-down Wikipedia version above.
What effect has the dramatic increase in solar activity had on Earth's air temperature? It's politically correct to say "very little" -- but I cannot see support for this position in the actual data.
Solar activity to 1995
An update of the chart in Wikipedia, which cuts off in 1895. That original "short" chart is at: http://en.wikipedia.org/wiki/Image:Suns
My chart, the one you see here, is at:
(Edit: In the Y axis legend, I wrote "Average sunspots per decade" -- a better expression would have been "Average sunspots per year, averaged by decade." This is the same method that the Wikipedia chart used; my phrasing was imprecise.)
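The averaging method described in that edit note can be sketched in a few lines: take annual sunspot counts and average them within each decade. The sample data below is made up purely to show the method, not taken from the NOAA file:

```python
# Average annual sunspot counts by decade -- the method used for the
# chart's Y axis.  The annual counts here are invented sample data.
annual = {1980: 155, 1981: 140, 1982: 116, 1983: 67, 1984: 46,
          1985: 18, 1986: 13, 1987: 29, 1988: 100, 1989: 158}

by_decade = {}
for year, count in annual.items():
    decade = (year // 10) * 10          # 1983 -> 1980, etc.
    by_decade.setdefault(decade, []).append(count)

for decade, counts in sorted(by_decade.items()):
    avg = sum(counts) / len(counts)
    print(f"{decade}s: average {avg:.1f} sunspots per year")
```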
===|==============/ Level Head