Old climate models are often evaluated on whether they made correct predictions of global warming. But if the old models were missing processes that we now know to be important, any correctness of their predictions would have to be attributed to a fortuitous compensation of errors, creating a paradoxical situation. Climate models are also tested for falsifiability by using them to predict the impact of short-term events like volcanic eruptions. But climate models do not exhibit the numerical convergence to a unique solution characteristic of small-scale computational fluid dynamics (CFD) models, such as those that simulate flow over a wing. Compensating errors may obscure the convergence of individual components of a climate model. This lack of convergence suggests that climate modeling is facing a reducibility barrier, or perhaps even a reducibility limit.
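To make the idea of numerical convergence concrete, here is a minimal sketch (in Python, not from the book) of a refinement test for a simple, well-posed problem: forward Euler applied to du/dt = -u. Halving the step size roughly halves the error at t = 1, so the discrete solution converges to the unique exact solution exp(-t); this is the behaviour the text attributes to small-scale CFD models and which full climate models do not display as a whole. The function name euler_error is purely illustrative.

```python
# Illustrative sketch of "numerical convergence" for a well-posed toy problem:
# forward Euler applied to du/dt = -u, u(0) = 1, integrated to t = 1.
import math

def euler_error(n_steps: int) -> float:
    """Absolute error at t = 1 of forward Euler for du/dt = -u, u(0) = 1."""
    dt = 1.0 / n_steps
    u = 1.0
    for _ in range(n_steps):
        u += dt * (-u)          # forward Euler update
    return abs(u - math.exp(-1.0))

# Refine the step size and watch the error fall by roughly a factor of two
# each time: the discrete solution converges to the unique exact solution.
for n in (10, 20, 40, 80, 160):
    print(f"steps={n:4d}  error={euler_error(n):.6f}")
```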
Global warming became a growing public concern following Jim Hansen’s US Senate testimony in 1988 asserting that the warming was already happening. The Intergovernmental Panel on Climate Change (IPCC) was formed in response to this concern, and it issues periodic assessments summarizing recent scientific developments relating to climate change. Climate models were used to attribute global warming to increasing concentrations of carbon dioxide and other greenhouse gases, and certain types of extreme weather can also be probabilistically attributed to these causes. The effects of aerosols and stochastic variability on the past global warming signal are described. The IPCC projects the global warming signal into the future using a range of carbon dioxide emission scenarios, resulting in different degrees of predicted warming. The importance of regional climate change, and the difficulty of predicting it, are also discussed.
The fundamental difference between weather prediction and climate prediction is explained using a “nature versus nurture” analogy. To predict weather, we start from initial conditions of the atmosphere and run the weather forecast model. To predict climate, the initial conditions matter less, but we need boundary conditions, such as the angle of the sun or the concentration of carbon dioxide in the atmosphere, which controls the greenhouse effect. Charles David Keeling began measuring atmospheric carbon dioxide in the late 1950s and found that its concentration was steadily increasing. Carbon dioxide concentrations for the past 800,000 years can also be measured using ice cores that contain trapped air; these ice core data show that the rise measured by Keeling was unprecedented. Manabe and another scientist, Jim Hansen, used climate models to predict that increasing carbon dioxide could cause global warming.
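The initial-conditions versus boundary-conditions (“nature versus nurture”) distinction can be illustrated with a toy chaotic system. The sketch below (Python, not from the book) uses the standard Lorenz-63 equations with the usual parameters sigma = 10 and beta = 8/3: two runs started from states differing by one part in a million end up in completely different places, which is the weather-prediction problem, yet the long-run mean of one variable is far less sensitive to that perturbation and instead shifts when the forcing parameter rho is changed, which here stands in for a boundary condition such as the carbon dioxide concentration. The function lorenz_run and the parameter choices are illustrative assumptions, not the book’s model.

```python
# Illustrative sketch: weather (sensitivity to initial conditions) versus
# climate (dependence of statistics on a forcing parameter) in Lorenz-63.
def lorenz_run(x, y, z, rho, n_steps=200_000, dt=0.001,
               sigma=10.0, beta=8.0 / 3.0):
    """Integrate Lorenz-63 with forward Euler; return final state and mean of z."""
    z_sum = 0.0
    for _ in range(n_steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        z_sum += z
    return (x, y, z), z_sum / n_steps

# "Weather": a tiny perturbation of the initial state gives a very different
# end state after the same integration time.
state_a, mean_a = lorenz_run(1.0, 1.0, 1.0, rho=28.0)
state_b, mean_b = lorenz_run(1.0 + 1e-6, 1.0, 1.0, rho=28.0)
print("end states:", state_a, state_b)

# "Climate": the time-mean of z is much less sensitive to that perturbation
# than the end state is, but it shifts when the forcing parameter rho changes.
_, mean_c = lorenz_run(1.0, 1.0, 1.0, rho=35.0)
print("mean z (rho=28, rho=28 perturbed, rho=35):", mean_a, mean_b, mean_c)
```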