It may be cold up there in the Arctic, but that doesn’t mean it doesn’t burn. And as the planet gets warmer, tundra fires are not only becoming more common, they may also shift a huge amount of carbon from the soil into the atmosphere, a new study reports.
Back in 2007, lightning struck the remote North Slope of Alaska, igniting the largest fire to hit the region since modern recording began in the 1950s. The fire burned for nearly three months until snowfall finally put it out in October. It left behind a charred scar of 400 square miles — big enough to see from space.
The fire didn’t get much press coverage in the lower 48 at the time, but it did catch the eye of ecologists studying the global carbon balance.
The 2007 fire, the ecologists found, sent roughly as much stored carbon up in smoke as the entire Arctic tundra absorbs in a year, according to an article published yesterday in Nature.
Michelle Mack, a biologist at the University of Florida, and Syndonia Bret-Harte, of the University of Alaska Fairbanks, reconstructed how much soil and plant material had burned away and used radiocarbon dating to determine the age of the burnt soil.
Arctic tundra covers the northernmost fringes of North America and Eurasia, where the lower layers of soil, known as permafrost, stay frozen year-round. Vegetation there is limited to the few scrubby plants that can stand the cold. Scientists watch the tundra carefully because of its potential role in climate change. Intact, the upper layers of soil and plants absorb carbon dioxide from the atmosphere and insulate the permafrost below, keeping those layers cold. But damage to the tundra may actually accelerate global warming.
Tundra fires have become more common over the past two decades as average temperatures in the Arctic have risen and sea ice has receded. Scientists have also recorded a marked increase in lightning activity on the North Slope over the same period. Warmer temperatures may allow more vegetation to grow, which in turn makes the tundra more of a fire risk when lightning strikes.
Normally, the tundra in this region of Alaska, near the Anaktuvuk River, takes up more CO2 through photosynthesis than it releases each year through natural decomposition.
(An interesting aside: the researchers could also tell that the soil and plants burned away were only the youngest layers. The oldest ash dated to the 1950s, just before the tell-tale “bomb peak.” Nuclear testing in the 1950s and 1960s caused a dramatic spike in the amount of radioactive carbon in the global atmosphere.)
Overall, the amount of carbon released (about 2.1 teragrams, or 2.1 million metric tons) is comparable to what comes from forest fires in warmer parts of the planet, according to Dr. Mack. “But what’s surprising to me,” she says, “is that forests have huge trees, and tundra has six-inch-tall, tiny little plants. All that extra carbon is coming from the soil.”
All that combustion has many consequences. In addition to the released CO2, the burn left behind a thinner layer of soil to protect the permafrost. Also, the charred, dark surface on the burn site will absorb more heat. Normally, light-colored tundra reflects heat back into space. All of this adds up to a scenario in which the Arctic tundra becomes a source of carbon dioxide, instead of a sink.
One last thing worth noting: while the fire took three decades’ worth of accumulated carbon out of the Arctic tundra, it also depleted four centuries’ worth of stored nitrogen, the nutrient that limits how many plants can grow there.
“This fire [burned an area] about a half of one percent of just this region of Alaska — that’s a very small proportion of the Arctic tundra that’s in Canada, Greenland, Russia,” says Mack. “But one small fire is enough to offset what the whole biome uptakes.”