Seeing the Climate Change Signal in Big Problems

We’ve been seeing correlations between climate change and localized biological events for many years. Now we’ve begun to see research linking climate change to regional and even global outcomes. In the last few months there have been separate studies suggesting a global warming driver behind extreme rainfall events, flooding, and now international food prices.

These are all interesting and alarming findings on their own. It’s also interesting that some combination of the increasing magnitude of climate change and the increasing intrepidness of research methodology is facilitating continent-scale climate outcome analysis. It’s one thing to identify a general trend of change in the climate. It’s another thing to move on from averages to spotting trends in extreme moments and changes in the frequencies of outlier events. It’s another thing again to credibly link those trends and variances to specific outcomes big enough for people to care about. Continental weather patterns are complicated systems with multi-step chains of causality. That’s hard to see through, especially when you’re stacking a layer of economics on top of geophysical systems, as in the case of food prices. But that doesn’t mean climate change won’t have serious outcomes at the local, regional and global levels, and that means we very much need to try to spot those as soon as we can.

It’s also interesting to consider what effect these kinds of studies might have on opinion and policy, if science and media can get along well enough to effectively articulate them to the public and to governments. The likelihood of climate change hasn’t been enough to motivate us to prevent it. Maybe the identifiable presence of the consequences of climate change in our everyday life will be. That’s not just a science problem, although it’s surely that. It’s also very much a communications problem. But I’m glad the science is being done.

As the Japanese Reactors Go, So Goes the Climate?

As I write this the news about the Japanese nuclear plant emergencies seems to be getting cautiously worse. Morning reports described a single reactor that was receiving insufficient cooling due to power loss. Now the news says there’s a second reactor with similar problems, and possibly a third with a fire on site. It’s a race now. The main power is cut to the reactor cores, the secondary diesel generators have failed somehow, and the tertiary battery-powered systems are apparently unable to pump enough cooling fresh water through the hot rods to keep them from turning the water that is there into radioactive, pressurized steam. If the diminished flow of water is less than the amount being boiled off, the rods will eventually be exposed to the air, at which point they will melt. A ‘nuclear meltdown’.

That probably won’t happen. Batteries are being delivered to the site to maintain the lessened flow. (A task currently being handled by US Air Force jets. How does a military jet deliver a battery to a power plant, I wonder?) Eventually the secondary or primary power will come on and complete cooling will happen. Right?

Already one of the reactors has had to have some amount of steam vented into the open air, and residents within 6 kilometers are being advised to stay indoors. I have no idea what the impacts to residents in the region might be, either in the best or the worst case scenarios.

But the impacts to the climate are necessarily bad, even in the best case. In the worst case they might be terrible.

Increased build-out of nuclear power is likely a necessary but not sufficient condition for preventing worldwide climatic catastrophe. Wind, solar, algae and geothermal are of course superior energy generation technologies, but they are relatively immature practices unlikely to be able to deliver the several terawatts of power needed to supplant fossil fuels anytime in the near or even mid-term future. Absent a conservation revolution, the practical alternative is that coal and petroleum plants that should have been mothballed twenty years ago will continue to empty their respiratory-clogging, climate-destabilizing waste into the air at a vast daily rate for decades to come. Nuclear power plants are at least technically able to be deployed at large scale within a few quick years. Siting a nuclear plant takes much longer than that in practice, but principally because residents are deeply suspicious of having the plants’ ugly, threatening bulks lurking on the skyline. In the last few years there seems to have been a significant shift in the affections of green thinkers, and that shift seemed plausibly destined to filter down through the larger populace into actual power reactors getting actually built and plugged into the grid.

That perceptual shift has limits (as we’re presently witnessing with the resistance to the shipping of surplus nuclear parts through the St. Lawrence). It doesn’t matter that these nuclear installations just absorbed the largest Japanese earthquake in recorded history. It doesn’t matter that they were built by GE in 1971 using rods-in-a-pool technology that is only slightly related to the relatively self-correcting closed-container systems that could be erected tomorrow. People are going to look at what happens now and in the next few hours, and they are quite reasonably going to ask: do I want to receive a 3 kilometer evacuation warning of my own?

The primary safety systems failed. The secondary systems failed (I think). The tertiary systems turn out to be insufficient. All of which is happening in a country with a disaster readiness culture, no lack of forewarning about the possibility of earthquakes, and engineering standards as high as anywhere in the world. By late tonight we might just find out if popular opinion is going to turn against what is possibly the only bridge energy source we have available to keep our climate predictable and stable.

Technical updates are available at the Union of Concerned Scientists website. I will probably edit this post tomorrow to be less embarrassingly panicky.

(update 14.3.11: I didn’t. I’m still panicky.)

Degrees of Change: The Good, the Bad and the Blind Spot

There’s been plenty of media coverage of the Degrees of Change report just released by the National Round Table on the Environment and the Economy. The report predicts impacts on Canadian regions, with an economic focus.

Climate change: Is this what the future will look like? – National Post

Study seeks silver lining in climate change’s clouds — Globe and Mail

Don’t accentuate the positive on climate change – Globe and Mail

Global warming will vastly change Canada: study – CTV

How a 2-degree climate change would hit Canada — CBC

Both the report and the coverage it has generated are certainly interesting. There’s a little bad and a lot of good. But I think what may be most important is what’s been left out.

The Bad

The news media has always struggled with climate change reporting, in part because of the tendency to report both sides of the story, even if one “side” is the consensus opinion of the credible components of the scientific institution and the other “side” is a slim minority of tangentially expert, corporate-linked reactionaries. Any report that includes positive as well as negative predictions will inevitably play hard into that journalistic norm. So it’s not surprising that some of the coverage is giving as much weight to the potential positive impacts as to the negatives, regardless of their actual proportionality.

The Good

That said, there is so much that is good about this report, and the coverage.

Climate change is a global phenomenon to be sure, but the effects will be felt in highly regionally specific ways. This report gets that aspect very right, and that’s great to see.

By giving the issue a fresh framing, I think it will help nudge folks towards considering climate change as an actual impending event, rather than another political scuffle. Advocating for preparation for both the positive and the negative impacts will motivate people to think about preparation at all, and we’ve been sorely lacking a Canadian focus on the adaptation term of the prevention/mitigation/adaptation equation.

(As a rather lurid example of that, Canada just announced the specific allocation of our $400 million Copenhagen commitment to international climate change aid, and the adaptation portion — arguably the most important, in the context of aid to developing nations — works out to an even 5%.)

Even better, I suspect that people aren’t just going to be scared of the predicted negative impacts, they’re going to be scared of the positive changes as well. I just don’t think people like any kind of large scale lifestyle-affecting change at all. Especially the kind of small-c conservative folks who haven’t been much interested in climate change so far.

But let’s suppose that some people do get excited about the possible benefits to their region. People who want to log spruce trees in the north, or fish more cod, or (yes) golf in places they haven’t before, will see those benefits mostly just as pleasing possibilities. They’ll have relatively few resources to fight for them, given that they aren’t currently making money logging spruce, fishing cod, or selling tee times. But the people whose lives are currently tied to farming in places that will desertify, or cutting trees in places that are in danger of losing them to beetle kill, have a lot of skin in the game and are relatively likely to get politically involved. So I don’t mind a degree of focus on positive possibilities.


Areas at Risk of Desertification by 2050, National Post adapted from NRTEE

Also, it’s just true: there will be some positive outcomes somewhere. True facts deserve coverage regardless.

The Blind Spot

The problem I have with this report – or at least with the report as I understand it from the media – is that it leaves out some of the most important economic and ecological impacts that climate change will bring. Irregularity, cascades, thresholds, extremes. All the stuff that can’t be specifically predicted, but will happen anyway.

The climate system, and the regional ecologies that exist under it, are the way they are because of millennia of co-adaptive evolution. If the North Atlantic thermohaline circulation conflicted with the South Atlantic thermohaline circulation, one or the other would have collapsed long ago. If the conifer forests of interior BC were vulnerable to infestation by beetles that could survive interior BC winters, then the forests (and, in turn, the beetles) would have had their distribution scaled back already.

The danger is that if you start forcing some of the controlling variables — such as average temperature — those cycles start to wobble, degrade, and generally bang around like an unbalanced washing machine.

current conveyor belt
The thermohaline circulation. Adapted from Alexandre Van de Sande.

On the weather side, extreme temperatures will start showing up, both at the high end and the low end. The timing of weather events will get less predictable. The general trend will (in most places) be towards higher temperatures, but the thermometer will probably sawtooth nuttily on the way there. That counts if you’re planting crops, investing in tourist infrastructure, or building your house in a flood plain.

Then there’s the biological side. Sudden species explosions have been rarer in real life than in mathematical models because any given species has evolved in a matrix of other predator and prey relationships, buffering extreme swings and selecting for calmly persistent dynamics. When temperature thresholds shift suddenly and new species move into environments where they don’t have established ecological relationships, those constraints fall away, possibly leading to outbreaks of any number of possible species — benign, neutral or downright pestilential.

The poster child for that movement has been the colossal arrival of pine beetles in interior BC. Climate change wasn’t sufficient on its own to induce an industry-tanking biological outbreak; it also required the presence of single-age stands of trees, among other factors. But it did indeed require warmer winters for the beetles to survive, and when those ducks got in a row, problems happened. With more climate change we can anticipate more aligned ducks, more problems.

Could we possibly have predicted the pine beetle outbreak? Probably not, but it still happened, and it still had (and has) a massive economic impact. The Degrees of Change report, for all that I find good in it, doesn’t pretend to include those unpredictabilities or the economic impacts they would generate, even though they impinge directly on its stated goal of predicting ecological and economic outcomes. I’m not sure how it could. And yet those unpredictables will be all too real. When contemplating our economic and ecological future, we should be looking for ways to keep that in mind.

Massive Risk Management

Governor Schwarzenegger made an announcement on Monday. He’s withdrawing support for a planned offshore oil project in California state waters. He was very clear: this decision was made specifically because of the Gulf oil leaks.

“I think that we all go through the endless amount of studies and research and everything, and before you make a decision like that, you are convinced that this will be safe,” the governor added. “But then again, you know, you see that, you turn on television and see this enormous disaster and you say to yourself, why would we want to take that risk?”

We have a hard time planning around risks that have low probability but potentially massive impact. Most risk assessment is done intuitively, and our intuition gets fickle around long-tail events. Our gut instincts differ person-to-person, and also perhaps within ourselves. Somehow I can never be bothered to wear a helmet when I get on a bicycle, but when riding a motorcycle in states without a helmet requirement, the idea of taking mine off strikes me as absolutely insane.

Formal cost-benefit analyses can be used to mathematize planning around uncertain outcomes, and they often are. But CBA can lead to especially extreme cases of garbage-in/garbage-out, and usually does. What number do you assign to the “cost” of a species loss, for example? And how would an equation help if you didn’t know what the probability of a species loss was anyway? Laplace’s insufficient reason criterion can be used to hold together these shaky formalizations, but that criterion states that if you can’t guess the outcome, insert a 50/50 chance of it happening. Which makes intuitive sense I guess, but here we are at intuition again.
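To see just how shaky that formalization is, here’s a toy sketch (every number in it is invented for illustration, not a real estimate): with Laplace’s criterion, an unknowable probability becomes 0.5, and the entire expected-cost comparison hinges on that invented figure.

```python
# Toy cost-benefit comparison under Laplace's insufficient reason
# criterion: when the probability of a bad outcome is unknowable,
# the criterion says to plug in 0.5. All figures are made up.

def expected_costs(p_disaster, cost_disaster, cost_mitigation):
    """Expected cost of acting now vs. waiting and risking the disaster."""
    act = cost_mitigation               # pay up front, avoid the disaster
    wait = p_disaster * cost_disaster   # do nothing, eat the expected loss
    return act, wait

# With no real information, Laplace says p = 0.5 ...
act, wait = expected_costs(0.5, cost_disaster=100.0, cost_mitigation=10.0)
print(act, wait)   # acting looks cheap: 10.0 vs. 50.0

# ... but nudge that made-up probability and the conclusion flips.
act, wait = expected_costs(0.05, cost_disaster=100.0, cost_mitigation=10.0)
print(act, wait)   # now waiting looks cheap: 10.0 vs. 5.0
```

The arithmetic is trivial; the point is that the output is entirely a function of a probability nobody actually knows.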

Intuition is sensitive to recent circumstances. And so, because of the timing of the “pictures on TV”, California won’t have offshore drilling. I’m sympathetic to the governor; I’m sure he was shown credible evidence that safety standards in oil rigging have been much improved. But how much safety is enough? It depends what’s on TV at the moment.

We seem to collectively deal with a lot of these low-probability/high-impact decisions. I think they’re some of the most important choices societies make (whatever that means). For example, the question “how to deal with the threat of terrorism?”, is premised, often invisibly, on the question “how much of a threat is terrorism?”. Is the attempted Times Square bombing proof that Americans are living under threat? Or is it a reminder that American citizens are remarkably safe from home front terrorism? When the potential consequences are so important, declaring something irrelevant because it’s out-of-the-ordinary doesn’t seem right somehow. And yet, and yet.

Or how about that crazy climate change? Critics suggest that because we have uncertainty in the outcomes — which we absolutely do — we shouldn’t be pouring resources into combatting an unknown. Which isn’t so crazy, if you consider the opportunity costs: the money and time and political capital we spend keeping carbon out of the atmosphere could be going to plenty of other deserving projects. But my intuition tells me that the uncertainty associated with climate change is precisely the reason we should fear it. I worry that we’re going to learn too late the value of a predictable climate. Each specific climate-linked tragedy may be unlikely to the point of absolute unknowability, but somehow that collection of unknowable tragedies sounds like the worst thing in the world to me.

I have a hard time articulating that threat to myself or to others, but the precautionary principle speaks to it. According to wikipedia, the principle states that

“if an action or policy has a suspected risk of causing harm to the public or to the environment, in the absence of scientific consensus that the action or policy is harmful, the burden of proof that it is not harmful falls on those who advocate taking the action.”

Environmental systems are weird. They often seem to be complex in the academic sense, behaving in aggregate in ways which can be either resistant to perturbation or suddenly highly sensitive to it. Formal complex systems theory usually isn’t very good at predicting outcomes in environmental systems (although I think it’s fabulous at helping us to understand why we can’t make those predictions). Ecologies are weird and unknowable, but they are also crucial to our lives, both in big ways and small ways. We will all die if the ecosystem services we rely on are thrown out of whack, but we will all be miserable and grumpy long before those services completely collapse. That combination of complexity and cruciality makes predictions around unlikely but potentially significant environmental dangers especially perplexing.

I’m not usually a small-c conservative; I tend to value experimentation and liberal politics. But because of the particularly fraught nature of environmental choices, I’m a big believer in that precautionary principle.

Salmon vs Farmers for California Water

11 Western Democrats object to Feinstein water delivery plan — Michael Doyle, McClatchy Newspapers

“The escalating fight pits region against region, and some of California’s most influential politicians against one another. It’s already splitting fragile alliances among California water users, who in recent years have inched toward comity.”

Is global warming to blame? Who knows. The entire southwest of the US has been going through unprecedented droughts and water shortages for nearly a decade now. It could be that slightly increased prevailing temperatures have contributed a difficult duck to a dangerous row. It could be that precipitation would have dropped off in California even without an increase in worldwide temperatures. It could even be the case that global temperatures actually haven’t risen much yet. One thing is for sure: this is a template for the conflict we’re going to see even in western countries when resource scarcity and unpredictability does ramp up in our climate-irked future. These days you don’t have to look around too much to spot quite a collection of those templates.

Mayor David Miller On Behalf of Canada

That’s Toronto mayor David Miller accepting two (!) Fossil of the Day awards on behalf of Canada today at the Copenhagen talks. Photo from friend Heidi, who is attending Copenhagen promoting a program on adaptation in Africa. Not every Canadian presence is about stalling collective action. Go mayor Miller. Go Heidi.

Google Massively Automates Tropical Deforestation Detection

Landcover change analysis has been an active area of research in the remote sensing community for many years. The idea is to make computational protocols and algorithms that take a couple of digital images collected by satellites or airplanes, turn them into landcover maps, layer them on top of each other, and pick out the places where the landcover type has changed. The best protocols are the most precise, the fastest, and the ones that can chew on multiple images recorded under different conditions. One of the favourite applications of landcover change analysis has been deforestation detection. A particularly popular target for deforestation analysis is the tropical rainforests, which are being chainsawed down at rates which are almost as difficult to comprehend as it is to judge exactly how bad the effects of their removal will be on biological diversity, planetary ecosystem functioning and climate stability.
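The core of the layer-and-compare step is simple enough to sketch. Here’s a minimal, hypothetical illustration: two tiny classified landcover maps of the same area, differenced to flag forest-to-non-forest transitions. (The real work in an operational pipeline lives upstream of this — georegistration, cloud masking, and the per-pixel classification itself.)

```python
import numpy as np

# Two toy landcover maps (1 = forest, 0 = non-forest) of the same area,
# as would be produced by classifying satellite images from two dates.
before = np.array([[1, 1, 1],
                   [1, 1, 0],
                   [0, 0, 0]])
after_ = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [0, 0, 0]])

# Deforestation = pixels that were forest before and are not forest after.
deforested = (before == 1) & (after_ == 0)
rate = deforested.sum() / (before == 1).sum()
print(f"{deforested.sum()} pixels deforested ({rate:.0%} of forest)")
# prints "2 pixels deforested (40% of forest)"
```

Scale the grid up to billions of pixels across repeated acquisition dates and you have, in caricature, the computational problem Google is throwing its data centers at.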

Google has now gotten itself into the environmental remote sensing game, but in a Google-esque way: massively, ubiquitously, computationally intensively, plausibly benignly, and with probable long-term financial benefits. They are now running a program to vacuum up satellite imagery and apply landcover change detection optimized for spotting deforestation, for the time being targeted at the Amazon basin. The public doesn’t currently get access to the results, but presumably that access will be rolled out once Google et al are confident in the system. I have to hand it to Google: they are technically careful, but politically aggressive. Amazon deforestation is (or should still be) a very political topic.

The particular landcover change algorithms they are using are apparently the direct product of Greg Asner’s group at Carnegie Institution for Science and Carlos Souza at Imazon. To signal my belief in the importance of this project I’m not going to make a joke about Dr. Asner, as would normally be required by my background in the Ustin Mafia. (AsnerLAB!)

From the Google Blog:

“We decided to find out, by working with Greg and Carlos to re-implement their software online, on top of a prototype platform we’ve built that gives them easy access to terabytes of satellite imagery and thousands of computers in our data centers.”

That’s an interesting comment in its own right. Landcover/landuse change analysis algorithms presumably require a reasonably general-purpose computing environment for implementation. The fact that they could be run “on top of a prototype platform … that gives them easy access to … computers in our data centers” suggests that Google has created some kind of more-or-less general purpose abstraction layer that can invoke their unprecedented computing and data resources.

They back that comment up in the bullet points:

“Ease of use and lower costs: An online platform that offers easy access to data, scientific algorithms and computation horsepower from any web browser can dramatically lower the cost and complexity for tropical nations to monitor their forests.”

Is Google signaling their development of a commercial supercomputing cloud, a la Amazon S3? Based on the further marketing-speak in the bullets that follow that claim, I would say absolutely yes. This is a test project and a demo for that business. You heard it here first, folks.

Mongabay points out that it’s not just tropical forests that are quietly disappearing, and that Canada and some other developed countries don’t do any kind of good job of aggregating or publicly mapping their own enormous deforestation. I wonder: when will Google point its detection program at British Columbia’s endlessly expanding network of just-out-of-sight-of-the-highway clearcuts? And what facts and figures will become readily accessible when it does?



Mongabay also suggests that LIDAR might be involved in this particular process of detecting landcover change, but that wouldn’t be the case. Light Detection and Ranging is commonly used in characterizing forest canopy, but it’s still a plane-based imaging technique, and as such not appropriate for Google’s world-scale ambitions. We still don’t have a credible hyperspectral satellite, and we’re nowhere close to having a LIDAR satellite that can shoot reflecting lasers at all places on the surface of the earth. Although if we did have a satellite that shot reflecting lasers at all places on the surface of the earth, I somehow wouldn’t be surprised if Google was responsible.

Which leads me to the point in the Google-related post where I confess my nervousness around GOOG taking on yet another service — environmental change mapping — that should probably be handled by a democratically directed, publicly accountable organization rather than a publicly traded for-profit corporation. And this is the point in the post where I admit that they are taking on that function first and/or well.

Math-Checking the Carbon Pledges

President Obama today announced that he’ll be going to the Copenhagen climate talks, and that he’ll be taking an emissions cut pledge with him. That would be a 17% cut from 2005 levels by 2020. It’s great to hear a US leader setting quantitative targets.

There are a few caveats:

  • the President doesn’t get to pass laws and congress hasn’t committed, so it’s not clear how he can make a unilateral pledge
  • he’s going to the start of the talks, rather than the end, which is when all the rest of the leaders are supposed to hang out

I’m happy to look past those. Setting targets and filling in the details later beats nothing, and what the hell ever happens at leaders’ photo-ops anyway? But there’s one more:

  • that’s a 17% cut from 2005 levels

When countries announce emissions reductions, they almost always baseline them either against 1990 or against some year in the recent past. If they pick 1990, it’s because they’re serious and they want to use the same standard that’s been in play since the days when the Kyoto Treaty was being formulated. Using 1990 means you can compare it against everybody else’s reduction commitments, since they’re all using 1990 levels as well.

Except for that second group, who will use a recent year, like 2005 or 2006. Some time just long enough ago that emissions data is firmly on record, but recent enough that the proportional calculation includes all the increases in emissions that have gone on since we were supposed to get serious about reductions back in the ’90s. Obama has chosen to be in that second group.

How about Canada? We’ve pledged (also without saying how we’re going to do it) 20% cuts from 2006 levels. Second group. Short bus.

Hard to sort out what all those numbers mean: 20% vs. 17% of two different emissions levels, 1990 vs 2005 vs 2006. Luckily, Stephen Wolfram’s massive ego begot Wolfram Alpha for exactly this sort of operation.

So, Wolfram Alpha, if the U.S. cut its greenhouse gas emissions by 17% from what it emitted in 2005, what proportion of the amount emitted in 1990 would that be?

If Wolfram’s data is correct and the U.S. followed through on this current pledge, by 2020 the nation would be emitting 97% of the greenhouse gases released in 1990. That’s a 3% cut against 1990 levels, to compare with the 20 to 30% the European Union has pledged, for example. Keep in mind, by 1990 we had already realized that greenhouse gas levels were too high to maintain a stable climate. And by 2020 we will have had yet another decade of destabilization. I’m still glad he’s setting targets, we may well need to take a few baby steps before we start walking somewhere useful. But that’s not fully reassuring, yet.

What about Canada? Environment Minister Jim Prentice is tickled that Canada and the U.S. are “harmonizing” their responses, regardless of their quality. He’s pointing out that the targets are oh-so-close to each other. Great! But if we want to use the 1990 baseline, just how close our targets are would depend on how we compare with regards to relative increase in emissions since then. Let’s check.

Unfortunately, Wolfram Alpha’s greenhouse emissions data only goes up to 2005, so we’ll have to fudge the Canadian calculation a little and run it against 2005 data instead of the 2006 that the government is using in their calculations. That said, here goes:

That’s 100.3% of our 1990 emissions (assuming again that the data is correct). Very close to the U.S. commitment, yes. We’ve just very classily managed to commit to nudging our emissions commitment a teeny bit higher than the 1990 amounts that were scary back then.
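The baseline conversion behind both of these numbers is just a ratio. A quick sketch, using round illustrative emissions figures (approximations in the neighbourhood of the national inventory numbers, not the exact Wolfram Alpha data):

```python
def pledge_vs_1990(cut_fraction, baseline_emissions, emissions_1990):
    """Pledged emissions level, expressed as a fraction of 1990 emissions."""
    return (1 - cut_fraction) * baseline_emissions / emissions_1990

# Illustrative round figures in megatonnes of CO2-equivalent (approximate,
# not the exact inventory numbers).
us = pledge_vs_1990(0.17, baseline_emissions=7350, emissions_1990=6290)
ca = pledge_vs_1990(0.20, baseline_emissions=740, emissions_1990=590)
print(f"US: {us:.1%} of 1990, Canada: {ca:.1%} of 1990")
# prints "US: 97.0% of 1990, Canada: 100.3% of 1990"
```

The same three-line function makes any baseline-shifted pledge comparable: pick a common reference year and divide.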

Canada to World: Plan On Us Failing on Climate

The whole world is looking for leadership on climate change. Canada is being very clear on this subject: it ain’t us.

Climate change laws years away: Prentice — Nov 17th, CBC

Harper will only go to climate conference if other leaders do: aides — Nov 15th, Canwest News

Canada can’t cut emissions in isolation from U.S.: Prentice — Nov 13th, Edmonton Journal

What Prentice and Harper have to say in the above articles is what every politician wants to be able to get away with saying: we don’t want to be the first ones to move. We want to wait and see how things shake out before we commit ourselves. Lately Canada has been impressively vocal about our insistence on being on the wait-and-see team.

If everyone plays the game that way there will be no sufficient action taken, ever. What is needed is for a few players to decide that since there must eventually be a collective response, they might as well just act as if it was happening, and do something brave with the confidence that they will eventually be backed up. To lead, as it were. Once some countries are out front, then those that have been waiting to see what will happen can fall in behind. Presumably that leading action is going to come from relatively democratic, uncorrupt nations whose policy is meant to reflect the long-term will of the populace, and who, by the way, bear the greatest physical responsibility for stripping everyone of a predictable climate.

In other words, Canada should be leading. I guess we get points for being transparent about our complete failure to step up to that position. At least other countries can plan ahead around our pending failure.