
ISBN: B01HEILCWA

ISBN13: 9780262518635

A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming

(Part of the Infrastructures Series)




Format: Paperback

Condition: New

$51.46
50 Available
Ships within 2-3 days

Book Overview

The science behind global warming, and its history: how scientists learned to understand the atmosphere, to measure it, to trace its past, and to model its future.

Global warming skeptics often fall back on the argument that the scientific case for global warming is all model predictions, nothing but simulation; they warn us that we need to wait for real data, "sound science." In A Vast Machine Paul Edwards has news for these skeptics:...

Customer Reviews

1 rating

Understand the Roots of Our Understanding

Understanding how we know about climate, and even what it means to know about climate and climate change, is essential if we are to have an informed debate. This is far and away the best book I have read on the infrastructure behind our knowledge of climate change, how that infrastructure developed, and how it shapes our understanding.

The story begins in the 1600s, when systematic collection of weather data began (at least in the modern period; other cultures such as the Chinese have older records, and it would be interesting to unearth these, although the data normalization issues would be extreme). It picks up speed in the 19th century with global trade and then the telegraph. The more data is collected and exchanged, the more important it becomes to normalize it for comparison. Normalization requires some form of data model, a theory that makes the data meaningful. Indeed, this is Edwards's point: all data about weather and climate becomes meaningful only in the context of a model (this is of course generally true).

Work accelerated during WW2 and then exploded in the '50s and '60s as computers became more available. The role played by John von Neumann in this is fascinating, as is the nugget that his second wife, Klara von Neumann, taught early weather scientists how to program (there is a whole hidden history of the role of women in developing computer programming that needs to be written; if you know of one, please add it to the comments of this review or tweet it to me @StevenForth). Edwards also introduces some useful concepts such as data friction and computational friction. I think my company can apply these in its own work, so for me this has been a very practical text.

Modern climate models are complex and are growing more so; they have to be in order to integrate data from multiple sources. One of the main lines of evidence for climate change is that data from many different sources are converging to suggest that climate change is a real and accelerating phenomenon. One can meaningfully ask whether this convergence is an artifact of the models, although that appears unlikely given the diversity of the data and models. But Edwards shows that it is idiotic to claim that the data and the models can be meaningfully separated. This is true in all science, not just climate science. A theory is a model that normalizes and integrates data and that uncovers and makes meaningful the relations between disparate data. That these models are now expressed numerically in computations, rather than as differential equations, sentences in a human language, or drawings, is one of the major shifts of the information age. It will be interesting to dig deeper into the formal relations between these different modeling languages.
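The reviewer's remark that normalization "requires some form of data model" is easiest to see in code. The sketch below is not from the book; it is a minimal, hypothetical Python illustration of mapping two invented observation formats (a Fahrenheit station log and a SYNOP-style report in tenths of a degree Celsius) onto one shared schema, which is the step at which the readings become comparable at all.

```python
# Toy sketch (not from A Vast Machine): normalizing heterogeneous weather
# records into one shared schema. Both source formats below are invented.
from dataclasses import dataclass
from datetime import date


@dataclass
class Observation:
    """The shared 'data model': one normalized surface-temperature reading."""
    station_id: str
    day: date
    temp_c: float


def from_us_log(record: dict) -> Observation:
    # Hypothetical source A: station name, ISO date string, temperature in °F.
    return Observation(
        station_id=record["station"].strip().upper(),
        day=date.fromisoformat(record["date"]),
        temp_c=(record["temp_f"] - 32.0) * 5.0 / 9.0,
    )


def from_synop(record: dict) -> Observation:
    # Hypothetical source B: numeric WMO id, separate year/month/day fields,
    # temperature reported in tenths of a degree Celsius.
    return Observation(
        station_id=f"WMO{record['wmo_id']:05d}",
        day=date(record["year"], record["month"], record["day"]),
        temp_c=record["t_tenths_c"] / 10.0,
    )


if __name__ == "__main__":
    a = from_us_log({"station": "  blue hill ", "date": "1891-07-04", "temp_f": 72.5})
    b = from_synop({"wmo_id": 3772, "year": 1891, "month": 7, "day": 4, "t_tenths_c": 181})
    # Only once both records sit in the same schema is a comparison meaningful.
    print(a, b, sep="\n")
```

Every mapping of this kind embeds assumptions about what the observations mean, which is the sense in which data and model cannot be cleanly separated.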