Wednesday, July 31, 2013

New paper supports planetary theory of solar variation

A new paper by Dr. Nicola Scafetta & Dr. Richard Willson, published in Astrophysics & Space Science, finds additional evidence supporting the planetary theory of solar variation: the hypothesis that gravitational effects from the planets modulate solar cycles. Prior analysis has shown that planetary harmonics correlate with solar activity and with subsequent climate change via a variety of solar amplification mechanisms.

Empirical evidences for a planetary modulation of total solar irradiance and the TSI signature of the 1.09-year Earth-Jupiter conjunction cycle

Nicola Scafetta, Richard C. Willson


Abstract: The time series of total solar irradiance (TSI) satellite observations since 1978 provided by ACRIM and PMOD TSI composites are studied. We find empirical evidence for planetary-induced forcing and modulation of solar activity. Power spectra and direct data pattern analysis reveal a clear signature of the 1.09-year Earth-Jupiter conjunction cycle, in particular during solar cycle 23 maximum. This appears to suggest that the Jupiter side of the Sun is slightly brighter during solar maxima. The effect is observed when the Earth crosses the Sun-Jupiter conjunction line every 1.09 years. Multiple spectral peaks are observed in the TSI records that are coherent with known planetary harmonics such as the spring, orbital and synodic periods among Mercury, Venus, Earth and Jupiter: the Mercury-Venus spring-tidal cycle (0.20 year); the Mercury orbital cycle (0.24 year); the Venus-Jupiter spring-tidal cycle (0.32 year); the Venus-Mercury synodic cycle (0.40 year); the Venus-Jupiter synodic cycle (0.65 year); and the Venus-Earth spring tidal cycle (0.80 year). Strong evidence is also found for a 0.5-year TSI cycle that could be driven by the Earth’s crossing the solar equatorial plane twice a year and may indicate a latitudinal solar-luminosity asymmetry. Because both spring and synodic planetary cycles appear to be present and the amplitudes of their TSI signatures appear enhanced during sunspot cycle maxima, we conjecture that on annual and sub-annual scales both gravitational and electro-magnetic planet-sun interactions and internal non-linear feedbacks may be modulating solar activity. Gravitational tidal forces should mostly stress spring cycles while electro-magnetic forces could be linked to the solar wobbling dynamics, and would mostly stress the synodic cycles. The observed statistical coherence between the TSI records and the planetary harmonics is confirmed by three alternative tests.
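The synodic and spring-tidal periods listed in the abstract follow directly from the planets' sidereal orbital periods: the synodic period of two planets satisfies 1/P_syn = |1/P1 - 1/P2|, and the spring-tidal period is half the synodic period because the tidal alignment recurs at both conjunction and opposition. A quick sketch (standard orbital periods in years are assumed; this is not the authors' code) reproduces the cited values:

```python
# Reproduce the planetary periods cited in the abstract from standard
# sidereal orbital periods (in years). Synodic period: 1/P_syn = |1/P1 - 1/P2|;
# the spring-tidal period is half the synodic period because the tidal
# configuration repeats at both conjunction and opposition.
P = {"Mercury": 0.2408, "Venus": 0.6152, "Earth": 1.0, "Jupiter": 11.862}

def synodic(p1: float, p2: float) -> float:
    return 1.0 / abs(1.0 / p1 - 1.0 / p2)

def spring(p1: float, p2: float) -> float:
    return synodic(p1, p2) / 2.0

print(round(synodic(P["Earth"], P["Jupiter"]), 2))   # 1.09-yr Earth-Jupiter conjunction cycle
print(round(spring(P["Mercury"], P["Venus"]), 2))    # 0.20-yr Mercury-Venus spring-tidal cycle
print(round(synodic(P["Mercury"], P["Venus"]), 2))   # 0.40-yr Venus-Mercury synodic cycle
print(round(spring(P["Venus"], P["Jupiter"]), 2))    # 0.32-yr Venus-Jupiter spring-tidal cycle
print(round(synodic(P["Venus"], P["Jupiter"]), 2))   # 0.65-yr Venus-Jupiter synodic cycle
print(round(spring(P["Venus"], P["Earth"]), 2))      # 0.80-yr Venus-Earth spring-tidal cycle
```

Each computed period matches the value quoted in the abstract to within about 0.01 year.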

Related: Analysis finds the Sun explains climate change, not CO2

New paper finds N. Atlantic ocean heat content & sea levels controlled by the natural Atlantic Multidecadal Oscillation [AMO]

A paper published today in the Journal of Geophysical Research Oceans finds ocean heat content and sea levels in the Northern North Atlantic are associated with the cycles of the natural Atlantic Multidecadal Oscillation [AMO, also referred to as AMV]. The authors find the warming evident in sea levels and ocean heat content in the N. Atlantic during the satellite era [since 1979] "represents transition of the AMV from cold to warm phase" and note "an abrupt change 2009–2010 [in N. Atlantic sea levels] reaching a new minimum in 2010." 

The AMO, also sometimes referred to as Atlantic Multidecadal Variability [AMV], is a natural climate cycle of roughly 70 years that is highly correlated with Northern Hemisphere and global temperature change. Increases in greenhouse gases cannot possibly explain these shifts in ocean heat content and sea levels in the N. Atlantic.

Northern North Atlantic sea surface height and ocean heat content variability

Sirpa Häkkinen, Peter B. Rhines, Denise L. Worthen


Abstract: The evolution of nearly 20 years of altimetric sea surface height (SSH) is investigated to understand its association with decadal to multidecadal variability of the North Atlantic heat content. Altimetric SSH is dominated by an increase of about 14 cm in the Labrador and Irminger seas from 1993 to 2011, while the opposite has occurred over the Gulf Stream region over the same time period. During the altimeter period the observed 0–700 m ocean heat content (OHC) in the subpolar gyre mirrors the increased SSH by its dominantly positive trend. Over a longer period, 1955–2011, fluctuations in the subpolar OHC reflect Atlantic multidecadal variability (AMV) and can be attributed to advection driven by the wind stress “gyre mode” bringing more subtropical waters into the subpolar gyre. The extended subpolar warming evident in SSH and OHC during the altimeter period represents transition of the AMV from cold to warm phase. In addition to the dominant trend, the first empirical orthogonal function SSH time series shows an abrupt change 2009–2010 reaching a new minimum in 2010. The change coincides with the change in the meridional overturning circulation at 26.5°N as observed by the RAPID (Rapid Climate Change) project, and with extreme behavior of the wind stress gyre mode and of atmospheric blocking. While the general relationship between northern warming and Atlantic meridional overturning circulation (AMOC) volume transport remains undetermined, the meridional heat and salt transport carried by AMOC's arteries are rich with decade-to-century timescale variability.

New papers call into question the global sea surface temperature record

Two new companion papers published in Ocean Science call into question the data and methods used to construct global sea surface temperature records of the past 150 years. The authors find that measurements taken from ship engine cooling intakes can be "overly-warm by greater than 0.5°C on some vessels," which by way of comparison is about the same magnitude as the alleged global sea surface temperature warming since 1870. 

Furthermore, the authors "report the presence of strong near-surface temperature gradients day and night, indicating that intake and bucket measurements cannot be assumed equivalent in this region. We thus suggest bucket and buoy measurements be considered distinct from intake measurements due to differences in sampling depth. As such, we argue for exclusion of intake temperatures from historical SST datasets and suggest this would likely reduce the need for poorly field-tested bucket adjustments. We also call for improvement in the general quality of intake temperatures from Voluntary Observing Ships... We suggest that reliable correction for such warm errors is not possible since they are largely of unknown origin and can be offset by real near-surface temperature gradients."
Data sets combining ship intake and bucket measurements show ~0.5°C warming since 1870, but this new paper argues that the two types of measurement are from different sampling depths and should not be combined. Graph source: Bob Tisdale via WUWT
For more on the ship intake vs. buckets issue and the questionable adjustments involved, see these posts at WUWT & links to Climate Audit:

Historical Sea Surface Temperature Adjustments/Corrections aka “The Bucket Model”…


Buckets, Inlets, SST’s and all that – part 1



Full papers available here:

Comparing historical and modern methods of sea surface temperature measurement – Part 1: Review of methods, field comparisons and dataset adjustments

J. B. R. Matthews
School of Earth and Ocean Sciences, University of Victoria, Victoria, BC, Canada
Abstract. Sea surface temperature (SST) has been obtained from a variety of different platforms, instruments and depths over the past 150 yr. Modern-day platforms include ships, moored and drifting buoys and satellites. Shipboard methods include temperature measurement of seawater sampled by bucket and flowing through engine cooling water intakes. Here I review SST measurement methods, studies analysing shipboard methods by field or lab experiment and adjustments applied to historical SST datasets to account for variable methods. In general, bucket temperatures have been found to average a few tenths of a °C cooler than simultaneous engine intake temperatures. Field and lab experiments demonstrate that cooling of bucket samples prior to measurement provides a plausible explanation for negative average bucket-intake differences. These can also be credibly attributed to systematic errors in intake temperatures, which have been found to average overly-warm by >0.5 °C on some vessels. However, the precise origin of non-zero average bucket-intake differences reported in field studies is often unclear, given that additional temperatures to those from the buckets and intakes have rarely been obtained. Supplementary accurate in situ temperatures are required to reveal individual errors in bucket and intake temperatures, and the role of near-surface temperature gradients. There is a need for further field experiments of the type reported in Part 2 to address this and other limitations of previous studies.

Comparing historical and modern methods of sea surface temperature measurement – Part 2: Field comparison in the central tropical Pacific

J. B. R. Matthews (1) and J. B. Matthews (2)
(1) School of Earth and Ocean Sciences, University of Victoria, Victoria, BC, Canada
(2) Dr. J. B. Matthews Consulting, Tennis Road, Douglas, Isle of Man, British Isles

Abstract. Discrepancies between historical sea surface temperature (SST) datasets have been partly ascribed to use of different adjustments to account for variable measurement methods. Until recently, adjustments had only been applied to bucket temperatures from the late 19th and early 20th centuries, with the aim of correcting their supposed coolness relative to engine cooling water intake temperatures. In the UK Met Office Hadley Centre SST 3 dataset (HadSST3), adjustments have been applied over its full duration to observations from buckets, buoys and engine intakes. Here we investigate uncertainties in the accuracy of such adjustments by direct field comparison of historical and modern methods of shipboard SST measurement. We compare wood, canvas and rubber bucket temperatures to 3 m seawater intake temperature along a central tropical Pacific transect conducted in May and June 2008. We find no average difference between the temperatures obtained with the different bucket types in our short measurement period (∼1 min). Previous field, lab and model experiments have found sizeable temperature change of seawater samples in buckets of smaller volume under longer exposure times. We do, however, report the presence of strong near-surface temperature gradients day and night, indicating that intake and bucket measurements cannot be assumed equivalent in this region. We thus suggest bucket and buoy measurements be considered distinct from intake measurements due to differences in sampling depth. As such, we argue for exclusion of intake temperatures from historical SST datasets and suggest this would likely reduce the need for poorly field-tested bucket adjustments. We also call for improvement in the general quality of intake temperatures from Voluntary Observing Ships. Using a physical model we demonstrate that warming of intake seawater by hot engine room air is an unlikely cause of overly warm intake temperatures. 
We suggest that reliable correction for such warm errors is not possible since they are largely of unknown origin and can be offset by real near-surface temperature gradients.

Monday, July 29, 2013

The EPA's secret 'settled science'

The EPA's Game of Secret Science

The agency pursues rules that will cost billions but refuses to reveal its research. Maybe a subpoena will be needed.


By LAMAR SMITH

WSJ.COM 7/29/13: As the Environmental Protection Agency moves forward with some of the most costly regulations in history, there needs to be greater transparency about the claimed benefits from these actions. Unfortunately, President Obama and the EPA have been unwilling to reveal to the American people the data they use to justify their multibillion-dollar regulatory agenda.

To cite a few examples of where the EPA would like to take the country, the agency is moving forward with strict new limits on ozone that by its own estimates will cost taxpayers $90 billion per year, which would make the regulation the most costly in history. Other examples include a Mercury and Air Toxics Standard for power plants (previously known as "Utility MACT") that the EPA estimates could cost up to $10 billion a year. Yet more than 99% of the EPA's health-based justifications for the rule are derived from scientific research that the EPA won't reveal. Taxpayers are supposed to take on faith that EPA policy is backed by good science.

We know this much: Virtually every major EPA air-quality regulation under President Obama has been justified by citing two sets of decades-old data from the Harvard Six Cities Study and the American Cancer Society's Cancer Prevention Study II. The EPA uses the data to establish an association between fine-particulate emissions and mortality.
Gina McCarthy (photo: Associated Press)

For two years, the House Science, Space and Technology Committee, of which I am the chairman, has sought to make this information available to the public. But the EPA has obstructed the committee's request at every step. To date, the committee has sent six letters to the EPA and other top administration officials seeking the data's release.

In September 2011, the EPA's then-Assistant Administrator Gina McCarthy committed to provide these data sets to the committee. But the data still remain out of sight. Ms. McCarthy was recently confirmed by the Senate as administrator of the EPA. Now that she leads the agency, Ms. McCarthy has no excuse not to make these taxpayer-funded studies public.

Simple transparency is not the only reason this information should be released. The costs of these rules will be borne by American families. They deserve to know what they are paying for. Time is almost up. If the administration does not provide this data by the end of July, the science committee will force its release through a subpoena.

The federal government has no business justifying regulations with secret information. This principle has been supported by two of the president's own science and technology advisers, John Holdren and Deborah Swackhamer. "The data on which regulatory decisions and other decisions are based should be made available to the committee and should be made public," said Dr. Holdren in testimony before the committee last year. Executive-branch rules dating to the Clinton administration require that federally funded research data be made publicly available, especially if it is used for regulatory purposes.

The data in question have not been subjected to scrutiny and analysis by independent scientists. And the EPA does not subject its cost-benefit claims to peer review. This means we have no way of evaluating the quality of the science being used to justify the agency's claims.

The withholding of information is troubling—and not just because it is being done by "the most transparent administration in history," as the president boasted in February. The National Academy of Sciences declared in 2004 that the data the EPA is using is of "little use for decision-making." Similarly, President Obama's Office of Management and Budget recently acknowledged that "significant uncertainty remains" about the EPA's claims based on its data sets, saying that the claims "may be misleading" and should be treated with caution.

Yet the EPA presses on: The same data are used to justify the agency's claims about the health benefits of recent proposals to limit emissions for refineries and vehicles. The agency is also poised to use the data to justify its expensive new ozone standards—the EPA's Regulatory Impact Analysis estimated that lowering the ozone standard to 60-70 parts per billion would cost up to $90 billion per year in compliance costs. The regulation could force large areas of the country into non-attainment, a designation that would drastically limit economic growth. Inevitably, the costs would be borne by working families and would include higher gasoline and electricity prices.

The administration's reliance on secret science doesn't stop there. President Obama's ambitious and costly new climate agenda is backed by a finding from a federal interagency working group regarding the "social cost of carbon." How that "social cost" was determined remains unclear. This new justification for economy-wide regulations was developed without public comment or peer review.

The U.S. saw dramatic improvements in air quality well before the Obama administration came to Washington, yet the White House has upped the ante, launching an aggressive anti-fossil-fuel, regulatory assault on affordable energy—while refusing to reveal the scientific basis for the campaign. The EPA should reveal the research it uses and let the American people decide whether the agency's costly regulations are justified.
Rep. Lamar Smith represents the 21st District of Texas and is chairman of the House Committee on Science, Space and Technology.

New paper finds climate change over decades primarily determined by the oceans

A new paper published in Nature finds climate change over timescales longer than 10 years is "primarily determined by the ocean," which skeptics, including famed Professor Emeritus of Atmospheric Science Dr. William Gray, have been saying for years. According to the paper, "the ocean significantly affects long term climate fluctuations, while the seemingly chaotic atmosphere is mainly responsible for the shorter-term, year-to-year changes." 

According to the authors, "Our findings suggest that the predictability of mid-latitude North Atlantic air–sea interaction could extend beyond the ocean to the climate of surrounding continents," corroborating the many papers which have demonstrated that ocean oscillations control land-based climate as well. Ocean oscillations, in turn, have been correlated to solar activity. 


North Atlantic region, dark blue area was used for temperature data, red area for the heat flux. (Credit: C. Kersten, GEOMAR)
Deciphering the Air-Sea Communication: Ocean Significantly Affects Long-Term Climate Fluctuations

July 25, 2013 — Why does hurricane activity vary from decade to decade? Or rainfall in the Sahel region? And why are the trans-Atlantic changes frequently in sync? A German-Russian research team has investigated the role of heat exchange between ocean and atmosphere in long-term climate variability in the Atlantic. The scientists analyzed meteorological measurements and sea surface temperatures over the past 130 years. It was found that the ocean significantly affects long term climate fluctuations, while the seemingly chaotic atmosphere is mainly responsible for the shorter-term, year-to-year changes.


The study appears in the current issue of the journal Nature, and provides important information on the predictability of long-term climate fluctuations.

How do the ocean and atmosphere communicate? What information do they exchange, and what are the results? These are questions that climate scientists must ask, especially if they want to understand the cause of natural climate fluctuations of varying duration. These fluctuations are superimposed on the general global warming trend since the beginning of industrialization and thus complicate the accurate determination of the human influence on the climate. The causes and mechanisms of natural climate variability, however, are poorly understood. A study led by scientists at the GEOMAR Helmholtz Centre for Ocean Research Kiel shows that ocean currents influence the heat exchange between ocean and atmosphere and thus can explain climate variability on decadal time scales.

The idea that such predictability exists has been around for more than half a century. In 1964, the Norwegian climate researcher Jacob Bjerknes postulated different causes of climate variability on different time scales. While the atmosphere mainly drives climate variations on shorter time scales, from months to years, the longer-term fluctuations, such as those on decadal time scales, are primarily determined by the ocean. The first part of this hypothesis has been well studied by now, but the second part still required verification. "In the current study, we can utilize a new analysis of shipboard measurements, taken since the end of the 19th century, to verify the second part of the Bjerknes hypothesis," says Prof. Mojib Latif of GEOMAR, co-author of the study. "In particular, for the long-term climate variability in the Atlantic sector, the Gulf Stream circulation is of vital importance," said Latif.

Ocean currents affect the surface temperature of the oceans and thus the heat exchange with the atmosphere -- eventually causing climate variations on the adjacent continents. The most evident is an oscillation with a period of 60 years. "Such decadal climate fluctuations are superimposed on the general warming trend, so that at times it seems as if the warming trend slowed or even stopped. After a few decades, it accelerates once again," explains Prof. Latif. "It is important for us to understand these natural cycles, so that we can finally provide better climate predictions as well." One of the major problems, as Latif explained, is that there are just very few long-term oceanic measurements, thereby complicating the analysis and interpretation of climate change signals. Therefore, scientists are using increasingly refined statistical methods to extract more and more information from the available data sets.

"We need both, realistic model simulations and long-term data records, and really sophisticated analysis methods to produce reliable climate predictions. Our work is an additional piece in the giant puzzle of global climate variability, but I am confident that we will be able to extract the secrets underlying the natural climate fluctuations," says Prof. Latif.

Journal Reference:



North Atlantic Ocean control on surface heat flux on multidecadal timescales

Nature 499, 464–467 (25 July 2013), doi:10.1038/nature12268
Nearly 50 years ago Bjerknes suggested that the character of large-scale air–sea interaction over the mid-latitude North Atlantic Ocean differs with timescales: the atmosphere was thought to drive directly most short-term—interannual—sea surface temperature (SST) variability, and the ocean to contribute significantly to long-term—multidecadal—SST and potentially atmospheric variability. Although the conjecture for short timescales is well accepted, understanding Atlantic multidecadal variability (AMV) of SST remains a challenge as a result of limited ocean observations. AMV is nonetheless of major socio-economic importance because it is linked to important climate phenomena such as Atlantic hurricane activity and Sahel rainfall, and it hinders the detection of anthropogenic signals in the North Atlantic sector. Direct evidence of the oceanic influence of AMV can only be provided by surface heat fluxes, the language of ocean–atmosphere communication. Here we provide observational evidence that in the mid-latitude North Atlantic and on timescales longer than 10 years, surface turbulent heat fluxes are indeed driven by the ocean and may force the atmosphere, whereas on shorter timescales the converse is true, thereby confirming the Bjerknes conjecture. This result, although strongest in boreal winter, is found in all seasons. Our findings suggest that the predictability of mid-latitude North Atlantic air–sea interaction could extend beyond the ocean to the climate of surrounding continents.
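The timescale split at the heart of the Bjerknes conjecture can be illustrated with a simple low-pass filter: a running mean longer than a decade isolates the slow, ocean-driven component of a temperature series from year-to-year atmospheric noise. A minimal sketch on synthetic data (illustrative only; the paper's actual analysis of surface heat fluxes is far more involved):

```python
import math
import random

# Illustrative sketch of the Bjerknes timescale split (synthetic data,
# not the paper's method): an 11-yr running mean recovers a slow,
# AMV-like oscillation buried in year-to-year "atmospheric" noise.
rng = random.Random(42)
years = 130
slow = [math.sin(2.0 * math.pi * t / 65.0) for t in range(years)]  # multidecadal component
sst = [s + rng.gauss(0.0, 0.5) for s in slow]                      # plus interannual noise

half = 5  # 11-yr centered window
smooth = [sum(sst[t - half:t + half + 1]) / 11.0 for t in range(half, years - half)]

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# The low-passed series tracks the true slow component closely,
# while the raw series is dominated by the noise.
r = corr(smooth, slow[half:years - half])
print(r > 0.85)
```

Averaging 11 years suppresses the noise standard deviation by a factor of about sqrt(11) while barely attenuating a 65-year oscillation, which is why the correlation with the true slow component is high.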

Friday, July 26, 2013

New paper finds the same climate model produces different results when run on different computers

As if climate models didn't already have enough problems, a paper published today in the Monthly Weather Review finds that the same climate model run on different computer hardware and operating systems produces different results, "primarily due to the treatment of rounding errors by the different software systems," and that these errors propagate over time. According to the authors, "The [hardware & software] system dependency, which is the standard deviation of the 500-hPa geopotential height [areas of high & low pressure] averaged over the globe, increases with time." The authors also find that "the ensemble spread due to the differences in software system is comparable to the ensemble spread due to the differences in initial conditions that is used for the traditional ensemble forecasting." Many papers have already shown that small differences in initial conditions produce significantly different climate projections.

Could climate catastrophe be due to a rounding error?


An Evaluation of the Software System Dependency of a Global Atmospheric Model

Song-You Hong (1), Myung-Seo Koo (1), Jihyeon Jang (1), Jung-Eun Esther Kim (2), Hoon Park (1,3), Min-Su Joh (4), Ji-Hoon Kang (4), and Tae-Jin Oh (5)
(1) Department of Atmospheric Sciences, Yonsei University, Seoul, Korea
(2) National Oceanic and Atmospheric Administration (NOAA)/Earth System Research Laboratory (ESRL), Boulder, Colorado, USA
(3) Numerical Weather Prediction Center, Korea Meteorological Administration, Seoul, Korea
(4) Supercomputer Center, Korea Institute of Science and Technology Information, Daejeon, Korea
(5) Korea Institute of Atmospheric Prediction Systems, Seoul, Korea


Abstract
This study presents the dependency of the simulation results from a global atmospheric numerical model on machines with different hardware and software systems. The global model program (GMP) of the Global/Regional Integrated Model system (GRIMs) is tested on 10 different computer systems having different central processing unit (CPU) architectures or compilers. There exist differences in the results for different compilers, parallel libraries, and optimization levels, primarily due to the treatment of rounding errors by the different software systems. The system dependency, which is the standard deviation of the 500-hPa geopotential height averaged over the globe, increases with time. However, its fractional tendency, which is the change of the standard deviation relative to the value itself, remains nearly zero with time. In a seasonal prediction framework, the ensemble spread due to the differences in software system is comparable to the ensemble spread due to the differences in initial conditions that is used for the traditional ensemble forecasting.
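The mechanism the authors describe, rounding differences that grow over time, is easy to demonstrate: IEEE-754 floating-point addition is not associative, and any chaotic iteration amplifies a last-bit discrepancy exponentially. A toy sketch (a logistic map standing in for the atmospheric model; nothing here is from the GRIMs code):

```python
# Toy illustration of the paper's mechanism (not the GRIMs model itself):
# two algebraically identical expressions can round differently under
# IEEE-754 arithmetic, and a chaotic system amplifies that last-bit
# difference until the two "runs" no longer agree, analogous to the same
# model diverging across compilers and optimization levels.
a = (0.1 + 0.2) + 0.3   # association order changes the rounding:
b = 0.1 + (0.2 + 0.3)   # a and b differ by ~1e-16
print(a == b)           # False

# Seed two chaotic logistic-map trajectories with the two sums
# and track how far apart they drift.
xa, xb = a / 2.0, b / 2.0
max_gap = 0.0
for _ in range(100):
    xa = 3.9 * xa * (1.0 - xa)
    xb = 3.9 * xb * (1.0 - xb)
    max_gap = max(max_gap, abs(xa - xb))
print(max_gap > 1e-6)   # True: the ~1e-16 seed difference has grown by many orders of magnitude
```

The growth is exponential at the map's Lyapunov rate, which is the same reason the paper finds the system dependency of the 500-hPa height field "increases with time."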

New paper finds no increase of climate variability over recent decades

A paper published today in Nature finds that there has been no increase of global temperature variability over the past few decades. According to the authors, "Our findings contradict the view that a warming world will automatically be one of more overall climatic variation" and "Evidence from Greenland ice cores shows that year-to-year temperature variability was probably higher in some past cold periods." Furthermore, the paper finds, "Many climate models predict that total variability will ultimately decrease under high greenhouse gas concentrations..." The paper corroborates other peer-reviewed papers, the IPCC SREX Report, and the Congressional testimony of Dr. Roger Pielke, Jr. & Dr. John Christy finding no evidence of any increase of extreme weather or climate variability with global warming. 


No increase in global temperature variability despite changing regional patterns

Nature (2013), doi:10.1038/nature12310
Evidence from Greenland ice cores shows that year-to-year temperature variability was probably higher in some past cold periods, but there is considerable interest in determining whether global warming is increasing climate variability at present. This interest is motivated by an understanding that increased variability and resulting extreme weather conditions may be more difficult for society to adapt to than altered mean conditions. So far, however, in spite of suggestions of increased variability, there is considerable uncertainty as to whether it is occurring. Here we show that although fluctuations in annual temperature have indeed shown substantial geographical variation over the past few decades, the time-evolving standard deviation of globally averaged temperature anomalies has been stable. A feature of the changes has been a tendency for many regions of low variability to experience increases, which might contribute to the perception of increased climate volatility. The normalization of temperature anomalies creates the impression of larger relative overall increases, but our use of absolute values, which we argue is a more appropriate approach, reveals little change. Regionally, greater year-to-year changes recently occurred in much of North America and Europe. Many climate models predict that total variability will ultimately decrease under high greenhouse gas concentrations, possibly associated with reductions in sea-ice cover. Our findings contradict the view that a warming world will automatically be one of more overall climatic variation.
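The paper's central diagnostic, a time-evolving standard deviation of temperature anomalies, separates a shifting mean from changing variability. A sketch on synthetic data (illustrative numbers, not the authors' dataset or code) shows how a series can warm steadily while its detrended year-to-year variability stays flat:

```python
import random

# Illustrative sketch (not the paper's code): a series with a steady
# warming trend but constant noise shows stable variability once the
# trend is removed within each window.
rng = random.Random(7)
temps = [0.01 * t + rng.gauss(0.0, 0.2) for t in range(100)]  # 0.1 C/decade trend, constant noise

def window_std(x):
    """Std dev of a window after removing a linear fit, so the trend
    itself is not counted as variability."""
    n = len(x)
    tm = (n - 1) / 2.0
    xm = sum(x) / n
    slope = (sum((t - tm) * (v - xm) for t, v in enumerate(x))
             / sum((t - tm) ** 2 for t in range(n)))
    resid = [v - xm - slope * (t - tm) for t, v in enumerate(x)]
    return (sum(r * r for r in resid) / n) ** 0.5

# Standard deviation in successive 20-yr windows stays near the true
# noise level (0.2) despite the warming trend.
stds = [window_std(temps[i:i + 20]) for i in range(0, 100, 20)]
print([round(s, 2) for s in stds])
```

The windowed values all sit near 0.2, the imposed noise level: the mean warms, but the variability does not change, which is the distinction the paper draws for the observed global record.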

Claim: 300,000 years required for climate to cool after CO2 emissions cease

The author of a new paper published in Nature Geoscience claims "we can imagine a scenario in which, after human CO2 emissions ceased, the planet's climate would start to recover and cool down. The bad news is that it's likely this would take around 300,000 years." This is absurd for several reasons, including: 1) the residence time and lifetime of CO2 in the atmosphere are only 5–10 years; 2) atmospheric CO2 is controlled by temperature; 3) many other factors, including solar activity and ocean oscillations, are far more important determinants of planetary temperature than CO2; and 4) the next ice age is due sometime within the next 2,000 years, or may already be overdue.

Why should humanity bother to cease CO2 emissions if the benefits won't occur for another 300,000 years?

Rocks Can Restore Our Climate ... After 300,000 Years

ScienceDaily  July 26, 2013 — A study of a global warming event that happened 93 million years ago suggests that the Earth can recover from high carbon dioxide emissions faster than thought, but that this process takes around 300,000 years after emissions decline. Scientists from Oxford University studied rocks from locations including Beachy Head, near Eastbourne, and South Ferriby, North Lincolnshire, to investigate how chemical weathering of rocks 'rebalanced' the climate after vast amounts of carbon dioxide (CO2) were emitted during more than 10,000 years of volcanic eruptions.


In chemical weathering CO2 from the atmosphere dissolved in rainwater reacts with rocks such as basalt or granite, dissolving them so that this atmospheric carbon then flows into the oceans, where a large proportion is 'trapped' in the bodies of marine organisms.

The team tested the idea that, as CO2 warms the planet, the reactions involved in chemical weathering speed up, causing more CO2 to be 'locked away', until, if CO2 emissions decline, the climate begins to cool again. The Oxford team looked at evidence from the 'Ocean Anoxic Event 2' in the Late Cretaceous when volcanic activity spewed around 10 gigatonnes of CO2 into the atmosphere every year for over 10,000 years. The researchers found that during this period chemical weathering increased, locking away more CO2 as the world warmed and enabling the Earth to stabilise to a cooler climate within 300,000 years, up to four times faster than previously thought.
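The weathering feedback described above behaves, to first order, like a relaxation process: removal scales with the excess CO2, so the excess decays exponentially with some e-folding time tau set by weathering rates. A toy one-box sketch (tau = 100,000 years is an assumed round number for illustration, not a value from the study) shows why a ~300,000-year recovery corresponds to roughly three e-foldings:

```python
# Toy one-box model of the weathering feedback described above
# (illustrative: tau is an assumed round number, not from the study).
# Excess atmospheric CO2 decays as weathering removal scales with the
# excess: dC/dt = -C/tau, so 300 kyr is ~3 e-foldings for tau = 100 kyr.
tau = 100_000.0   # assumed weathering e-folding time, years
dt = 100.0        # integration time step, years
excess = 1.0      # normalized CO2 excess at the moment emissions stop

t = 0.0
while t < 300_000.0:
    excess += dt * (-excess / tau)  # explicit Euler step of dC/dt = -C/tau
    t += dt

print(round(excess, 2))  # ~0.05: about 95% of the excess removed after 300 kyr
```

After three e-foldings the remaining excess is exp(-3) ≈ 5%, which is the sense in which the climate has "stabilised" on that timescale.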

A report of the research is published in Nature Geoscience.

'Looking at this event is rather like imagining what the Earth would be like if humans disappeared tomorrow,' said Dr Philip Pogge von Strandmann of Oxford University's Department of Earth Sciences, who led the research. 'Volcanic CO2 emissions in this period are similar to, if slightly slower than, current manmade emissions so that we can imagine a scenario in which, after human CO2 emissions ceased, the planet's climate would start to recover and cool down. The bad news is that it's likely this would take around 300,000 years.'

Reconstructing a record of past chemical weathering is challenging because of how plants and animals take carbon out of the environment. To get around this the team used a recently-developed technique involving studying lithium isotopes in marine limestone (this lithium could only come from weathering and is not changed by biological organisms).

The Ocean Anoxic Event 2 is believed to have been caused by a massive increase in volcanic activity in one of three regions: the Caribbean, Madagascar, or the Solomon Islands. The event saw the temperature of seawater around the equator warm by about 3 degrees Celsius. It is thought that this warming caused around 53% of marine species to go extinct. Animals like turtles, fish, and ammonites were amongst those severely affected.

'Everyone remembers the mass extinction of land animals caused by the K-T meteorite impact 30 million years later, thought to be responsible for the demise of the dinosaurs, but in many ways this was just as devastating for marine life,' said Dr Pogge von Strandmann. 'Whilst nutrients from weathering caused a population boom of some species near the surface of the oceans, it also led to a loss of oxygen to the deeper ocean, killing off over half of all marine species and creating a 'dead zone' of decaying animals and plants. It's a scenario we wouldn't want to see repeated today.

'Our research is good news, showing that the Earth can recover up to four times faster than we thought from CO2 emissions, but even if we stopped all emissions today this recovery would still take hundreds of thousands of years. We have to start doing something soon to remove CO2 from the atmosphere if we don't want to see a repeat of the kind of mass extinctions that global warming has triggered in the past.'

The research was supported by the UK's Natural Environment Research Council.

New book says Earth is too cold for maximum development of the biosphere

A new book by Professor of Earth Sciences Toby Tyrrell finds that the Earth is actually too cold for the maximum development of the biosphere. A review of the book published today in Nature Climate Change states, "Using net primary production and biodiversity as metrics, Tyrrell finds that the Earth is actually too cold for the maximum development of the biosphere" and "the climate and biogeochemical cycles of the Holocene [the past ~11,000 years] have been unusually stable."

Requiem for a grand theory

Nature Climate Change 3, 697 (2013)
doi:10.1038/nclimate1953

On Gaia: A Critical Investigation of the Relationship between Life and Earth

Toby Tyrrell
Princeton Univ. Press: 2013. 311 pp. $35.00

Gaia, the brainchild of James Lovelock, was born in 1972. The historical constancy of Earth's chemical conditions and climate seemed just too much for chance alone. In Gaia it is postulated that the Earth's conditions were determined by the biosphere and regulated for the further benefit of life's persistence and activity. Gaia has motivated a huge number of biogeochemists to think about ecology on the planetary scale, and to examine what causes the movement and transformation of elements in global cycles. The theory has survived withering criticism and numerous international conventions — some to extend its reach and others to bandage a wounded Gaia with modifications and caveats. Even now, when students read Lovelock's first book on Gaia they engage with his insights enthusiastically.
In this book, Toby Tyrrell — a professor of Earth science at the University of Southampton — offers a systematic, dispassionate, retrospective examination of Gaia. It will be hard to ignore the flaws in Gaia, illustrated nicely in a table showing the success and failure of Gaia relative to some alternative theories based on the geosciences and coevolution. In the face of data, Gaia fails in its idea that the Earth is held at conditions optimal for life. Using net primary production and biodiversity as metrics, Tyrrell finds that the Earth is actually too cold for the maximum development of the biosphere. Gaia also fails in its postulate that the Earth is held at relatively stable conditions. True, the climate and biogeochemical cycles of the Holocene [the past ~11,000 years] have been unusually stable, but over longer periods of time the biosphere has been buffeted by events that have dealt it quite a blow. What is remarkable is that life persisted at all — a statement of the power of evolution to rebuild the biosphere everywhere as long as life has endured somewhere...

Thursday, July 25, 2013

Report debunks alarmist claptrap from 'climate expert' Katharine Hayhoe

A new report from SPPI:

[Illustrations, footnotes and references available in PDF version]
Excerpts:

During this hot, wet summer, a “national climate expert” recently told Delawareans that they can expect even hotter summers – with a climate like Savannah, Georgia’s – by the end of the century. The culprit, naturally: runaway global warming.
Savannah residents are long accustomed to their climate and, thanks to air conditioning and other modern technologies, are better able to deal with the heat and humidity. Nevertheless, the impact on Delaware will be disastrous, Dr. Katharine Hayhoe claims. Nonsense.
Her forthcoming report promises to be no different than other proclamations that persistently predict dire consequences from climate change – and then present taxpayers with a hefty bill. In this scenario, the State’s Department of Natural Resources and Environmental Control (DNREC) paid $46,000 for her report, presumably to suggest that “independent scholars” support the state’s positions.
The preliminary release of her report reads like the script from a bad disaster movie – think The Day After Tomorrow and An Inconvenient Truth. Like them, it also plays fast and loose with the facts.
Dr. Hayhoe bases her extreme scenarios on climate models – the same models that have predicted major temperature trends that have not materialized; greatly exaggerated short-term trends in rainfall, droughts and violent storms; and failed to predict the lack of warming since 1998. So why should we believe them now?
The real reason behind this report is to provide the State with the justification to enact draconian measures to control Delawareans’ energy use and provide major subsidies for “alternative” and “renewable” energy projects.

Armed with this new “scientific” report, what draconian measures might Mr. O’Mara and the Markell Administration have in store for the citizens of Delaware? Time alone will tell. However, given their track record thus far, Delawareans are going to get burned – and not by global warming.

Even worse, the same sneaky shenanigans are being played out in other states, in Washington, and all over the world, through the UN, EU and environmentalist pressure groups – in the name of saving the planet from computer model and horror movie disasters. These are bigger power grabs than anything King George III tried. We the People need to take notice, and take action.

Review finds increased CO2 will lead to increased crop yields, even if warming resumes

A new review paper from SPPI & CO2 Science finds "future increases in the atmosphere's CO2 concentration will likely lead to increased crop growth and yield production, even in areas where reduced soil moisture availability produces significant plant water stress."

[Illustrations, footnotes and references available in PDF version]
Excerpts:

As the air's CO2 content continues to rise, nearly all of Earth's plants will exhibit increases in photosynthesis and biomass production; but climate alarmists periodically claim that elevated concentrations of atmospheric CO2 will lead to more droughty conditions in many parts of the world and thereby significantly reduce or totally negate these CO2-induced benefits. To help determine what degree of validity this claim has, we here review and summarize the results of numerous CO2-enrichment studies that were designed and conducted in such a way as to reveal the various means by which atmospheric CO2 enrichment may actually help a number of important food crops to successfully cope with this potential problem of more frequent periods of less-than-optimal water availability.
If atmospheric CO2 enrichment allows plants to maintain a better water status during times of water stress, it is only logical to expect that they should exhibit greater rates of photosynthesis than plants growing in similarly-water-deficient soil in non-CO2-enriched air.
Where water availability is a prime limiting economic resource, it can be distributed more effectively under higher CO2 conditions
Average wheat yields are likely to increase by 1.2 to 2 t/ha (15-23%) by the 2050s because of a CO2-related increase in radiation use efficiency.

To briefly summarize the findings of this review of the effects of water insufficiency on the productivity of the world's major agricultural crops, the earlier optimistic conclusions of Idso and Idso (1994) are found to be well supported by the recent peer-reviewed scientific literature, which indicates that the ongoing rise in the air's CO2 content will likely lead to substantial increases in the photosynthetic rates and biomass production of the world's major agricultural crops, even in the face of the stressful conditions imposed by less-than-optimum soil moisture conditions. Therefore, future increases in the atmosphere's CO2 concentration will likely lead to increased crop growth and yield production, even in areas where reduced soil moisture availability produces significant plant water stress.
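The wheat-yield excerpt above pairs absolute gains (1.2–2 t/ha) with relative gains (15–23%). Inverting each pair gives the baseline yield those figures imply; this arithmetic is a consistency check of my own, not a result from the review:

```python
# Baseline wheat yield implied by the paired absolute and relative gains
# quoted in the excerpt: baseline = absolute gain / fractional gain.
pairs = [(1.2, 0.15), (2.0, 0.23)]  # (gain in t/ha, gain as fraction)
for gain_t_ha, gain_frac in pairs:
    baseline = gain_t_ha / gain_frac
    print(f"{gain_t_ha} t/ha at {gain_frac:.0%} implies a baseline of "
          f"~{baseline:.1f} t/ha")
```

Both pairs imply a baseline of roughly 8–9 t/ha, so the quoted absolute and percentage ranges are mutually consistent.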

World energy consumption to grow 56% by 2040, mostly from fossil fuels

Federal report: World energy consumption to grow 56 percent by 2040

By Ben Geman - 07/25/13 The Hill

Global energy consumption will increase 56 percent between 2010 and 2040, and fossil fuels will keep providing the lion’s share of supply even as green energy sources quickly expand.

Those are some of the conclusions in the federal Energy Information Administration's (EIA) big new "International Energy Outlook," which examines estimated supply, consumption and emissions trends over the next three decades.

China and India will together account for half the increase in global energy use, according to the EIA, which is the Energy Department's independent statistical arm; more broadly, the developing world will drive the vast bulk of the increase.

Natural gas, oil and coal will remain the primary energy sources, even as renewables like wind and solar are expanding.

“Renewable energy and nuclear power are the world's fastest-growing energy sources, each increasing by 2.5 percent per year. However, fossil fuels continue to supply almost 80 percent of world energy use through 2040. Natural gas is the fastest-growing fossil fuel in the outlook,” the report states.

Greenhouse gas emissions will keep rising globally, increasing by nearly 50 percent, a level that scientists warn will lead to dangerous climatic changes.

“Given current policies and regulations limiting fossil fuel use, worldwide energy-related carbon dioxide emissions rise from about 31 billion metric tons in 2010 to 36 billion metric tons in 2020 and then to 45 billion metric tons in 2040, a 46-percent increase,” the report states.
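A quick check of the quoted EIA figures; because the tonnages are rounded to whole billions of metric tons, the computed increase comes out slightly below the report's stated 46 percent (which presumably uses unrounded values):

```python
# Growth in energy-related CO2 emissions from the rounded figures
# quoted in the EIA outlook (billion metric tons CO2).
emissions_gt = {2010: 31, 2020: 36, 2040: 45}
increase = emissions_gt[2040] / emissions_gt[2010] - 1
print(f"2010 -> 2040 increase from rounded figures: {increase:.1%}")
```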

From the Report:

"Coal use grows faster than petroleum and other liquid fuel use until after 2030, mostly because of increases in China's consumption of coal and tepid growth in liquids demand attributed to slow growth in the OECD regions and high sustained oil prices."

New paper finds another amplification mechanism by which the Sun controls climate

A paper published today in Quaternary Research implies solar activity caused changes in the Asian monsoon, which in turn causes large-scale climate change over much of the globe. The paper adds to many others finding amplification mechanisms by which small changes in solar activity can have large amplified effects on climate: via ocean oscillations such as the North Atlantic Oscillation; via atmospheric oscillations such as the Madden-Julian Oscillation, Quasi-biennial Oscillation, Aleutian Low, Eurasian pattern, & Asian monsoon; and via stratospheric ozone, the tropospheric jet stream, and sunshine hours/clouds.

Centennial-scale Asian monsoon variability during the mid-Younger Dryas from Qingtian Cave, central China


  • a College of Geography Science, Nanjing Normal University, Nanjing 210023, China
  • b Department of Geology and Geophysics, University of Minnesota, Minneapolis, MN 55455, USA
  • c Institute of Global Environmental Change, Xi'an Jiaotong University, Xi'an 710049, China

Abstract

The regional climate correlation within the Northern Hemisphere in the cold/dry mid-Younger Dryas event (YD) remains elusive. A key to unraveling this issue is sufficient knowledge of the detailed climate variability at the low latitudes. Here we present a high-resolution (3-yr) δ18O record of an annually laminated stalagmite from central China that reveals a detailed Asian monsoon (AM) history from 13.36 to 10.99 ka. The YD in this record is expressed as three phases, characterized by gradual onsets but rapid ends. During the mid-YD, the AM [Asian monsoon] variability exhibited an increasing trend superimposed by three centennial oscillations, well-correlated to changes in Greenland temperatures. These warming/wetting fluctuations show a periodicity of ~ 200 yr, generally in agreement with centennial changes in cosmogenic nuclides indicated by the 10Be flux [a proxy of solar activity] from the Greenland ice. This relationship implies that centennial-scale climate changes during the mid-YD are probably caused by solar output and rapidly transported over broad regions through atmosphere reorganization.