Green Homes News

And Now, Land May Be Sinking

Green Homes - Wed, 2019-02-20 10:05
February 20, 2019

The Harvard Gazette

And Now, Land May Be Sinking

By Peter Reuell, Harvard Staff Writer

In the coming decades, cities and towns up and down the eastern seaboard will have to come to terms with the impact of rising sea level due to climate change. A new study, however, is suggesting that rising sea levels may be only part of the picture — because the land along the coast is also sinking.

That’s the key finding of Professor of Earth and Planetary Sciences Peter Huybers, Frank B. Baird Jr. Professor of Science Jerry Mitrovica, and Christopher Piecuch, an assistant scientist at the Woods Hole Oceanographic Institution, who used everything from tide gauges to GPS data to paint the most accurate picture ever of sea-level rise along the east coast of the U.S. The researchers are co-authors of the study, recently published in Nature.

“What we are seeing at a large scale, and this was a surprise to me, is a very clear pattern that you would expect if the response to the last ice age were the primary control on the differential rates of sea-level rise across the eastern U.S.,” said Huybers. In other words, between 20,000 and 95,000 years ago, the Laurentide Ice Sheet, which covered most of northern North America, levered the land upwards. “Now, thousands of years after the ice is gone,” Huybers said, “the mid-Atlantic crust is still subsiding.

“In New England, there is not too much additional sea-level rise from land motion because it’s near the hinge point,” he continued. “The bulge caused by the ice sheet was centered on the mid-Atlantic, and because it’s still settling down, the relative rise of sea level in the mid-Atlantic is about twice the global average.”

What that means, Huybers said, is that we need to prepare for greater rates of relative sea-level rise along the mid-Atlantic because of the combined effects of the natural subsidence of the land and human-caused rises in sea level.

“The fact that the mid-Atlantic is subsiding because of long-term geologic processes means that it will continue for centuries and millennia, in addition to whatever other changes in sea level occur,” Huybers said. “The mid-Atlantic is already having to cope with routine coastal flooding, and this problem is only going to get worse with time.”

Developing estimates of how much various factors contribute to sea level rise, however, is easier said than done.

“Sea level is a noisy place,” said Mitrovica. “Tides go up and down, waves crash, there is ice melting, ocean circulation changes, the warming of the ocean. … If you want to understand sea level in its totality, you need to know what all those factors are doing.”

One of the first researchers to attempt that feat, he said, was Carling Hay, a former postdoctoral fellow in Mitrovica’s lab and now an assistant professor at Boston College.

In 2014, while at Harvard, Hay published a groundbreaking study that used advanced statistical techniques to sift through dozens of data sets and factors influencing sea-level rise. She came to the surprising conclusion that during the 20th century, sea levels had risen more slowly than many had estimated.

“Unfortunately, what that means is that if sea levels weren’t rising as fast as we thought in the 20th century, they have been going up significantly faster than we thought over the last 20 years,” Mitrovica said. “That was a real demonstration of the power of statistical work in a field where it had not been very common.”

With the new study, Mitrovica said, Piecuch took that idea and ran with it. But rather than trying to estimate worldwide sea level rise over the past century, he chose to home in on one particular region over a shorter time period.

“So he can use all sorts of data sets,” Mitrovica said. “He can use GPS, which tells you how the land is moving, but he’s also got sea-level data going back several thousand years, tide gauges, and other data. He throws all that into the stew … and asks where the east coast is going and what’s contributing to that change. What Chris has done is solve this long-standing, 30-year problem.”

But the work, Mitrovica pointed out, is part of a trajectory. “The next thing that’s going to happen is we will be able to bring in satellite data and we can step back and look at this globally,” he said. “And I think for the first time we may be able to separate out the various contributors to sea-level rise.”

“There is a rather confusing montage of possibilities for why sea level could be changing,” Huybers added. “What Chris has done is to pull together disparate information that was distributed across a number of locations and different time intervals and put it together in a fully probabilistic way, allowing for better estimates of historical rates of sea-level change and how the ongoing response to the last ice age will contribute to future changes.”

Image source: AP

Categories: Green Homes

Think Different, Act More

Green Homes - Tue, 2019-02-19 14:14
February 19, 2019

The Harvard Gazette

Think Different, Act More

By Clea Simon, Harvard Correspondent

The threat of climate change is dire, but Hal Harvey sees a path forward.

In “Getting to Zero on Climate Change,” a stirring presentation recently at the Harvard University Center for the Environment, Harvey, the CEO of Energy Innovation Policy and Technology in San Francisco, stressed both the urgency of the problem and specific steps that could, he said, make the difference between accelerating toward destruction or innovating toward prosperity.

Citing “a massive problem with horrifying dimensions,” Harvey, the co-author of “Designing Climate Solutions: A Policy Guide for Low-Carbon Energy,” said that today, “extremes have become the norm.” From drought and wildfires to unprecedented floods and cold snaps, he detailed the effects we have already begun to experience. More terrifying is how close we are to climate tipping points, such as the thawing of the tundra — what used to be known as “permafrost” — which would release massive amounts of methane and carbon.

“After a certain point you unleash natural systems and there’s no going back,” he said.

But Harvey also laid out a series of steps that could counteract the problem, including policy recommendations that could be affected by concerned citizens.

Rejecting small-scale, feel-good campaigns — “We shouldn’t have a strategy about plastic straws,” he said — Harvey broke down the problem into four major sectors that contribute the most to climate change: energy and the electric grid, transportation, buildings, and industry. Again pushing for efficacy, he suggested moving for change in the top 20 countries that contribute to climate change, specifically global heavyweights such as the U.S., China, E.U., India, and Russia. He then isolated specific policy changes here that could make a difference for our country and, ultimately, for the globe.

One is the electrical grid. Harvey noted that green energy sources such as solar and wind are already becoming more cost-effective. In fact, with new technologies, such as larger wind turbines and turbines that can float and thus be placed farther from land, offshore wind power is on the verge of becoming a major industry.

However, these sources are intermittent, leaving many green-energy proponents focused on expensive and, thus far, inefficient battery technology. Instead, Harvey suggested that smarter and more flexible grids can enable municipalities to share resources, leveling out supply and demand. He said demand can also be managed, for instance by cooling skyscrapers in advance of extreme weather, thus using less energy during peak demand times.

He also supported the wider use of renewable portfolio standards that would reward those who invest in these green power sources.

Turning to transportation, Harvey applauded the electric vehicle movement but dismissed it as having too small a share of the market to make a difference. For vehicles already on the road, he suggested increasing incentives such as tax rebates and nonfinancial rewards, such as free parking. For the billions of cars that will be built in coming years, however, Harvey argued for super fuel-efficiency, pointing out that in addition to changes in engine technology, efficiency can be increased by making vehicles lighter and less wind-resistant.

For buildings, Harvey said low-emission windows, coated with an invisible metallic layer, already greatly decrease demands for heating and cooling. He called for stronger building codes, like California’s, that focus on annual percentage gains in efficiency. Such continuous and expected progress does not need to be revisited legislatively, he noted, and creates a stable environment that lets businesses plan for future construction. As a corollary, Harvey also called for stronger appliance efficiency standards, a trend that has already proved popular with consumers.

Harvey said industry can also take steps to reduce waste. New 3-D technologies are already helping, as they define the specific components of building projects, eliminating the waste of concrete and other materials. In industrial engines, variable speed settings that use smart technologies to adjust automatically save on power as well as costs.

Harvey pointed out that “most of the money [to make these changes] is there already.” While many climate change activists spend time trying to raise funds to help emerging countries, Harvey said the “world already spends about $5 trillion a year on energy and another $6 trillion for infrastructure setting up consumption.” Reallocating these resources, rather than battling for new ones, is an achievable goal, he said. To effect the change, concerned citizens need only find out who is really in charge. Public utility commissions, for example, often have more practical impact than legislative bodies and regularly hold open meetings.

“Do the triage,” Harvey said. “Understand which policies make a difference and pay attention to who makes the decisions.

“With a modest amount of work, a few tens of hours, you can become a player.”

Image by: Jon Chase/Harvard Staff Photographer


How to Think About the Costs of Climate Change

Green Homes - Fri, 2019-01-18 14:57
January 17, 2019

The New York Times

How to Think About the Costs of Climate Change

By Neil Irwin

By now, it’s clear that climate change poses environmental risks beyond anything seen in the modern age. But we’re only starting to come to grips with the potential economic effects.

Using increasingly sophisticated modeling, researchers are calculating how each tenth of a degree of global warming is likely to play out in economic terms. Their projections carry large bands of uncertainty, because of the vagaries of human behavior and the remaining questions about how quickly the planet will respond to the buildup of greenhouse gases.

A government report in November raised the prospect that a warmer planet could mean a big hit to G.D.P. in the coming decades.

And on Thursday, some of the world’s most influential economists called for a tax on carbon emissions in the United States, saying climate change demands “immediate national action.” The last four people to lead the Federal Reserve, 15 former leaders of the White House Council of Economic Advisers, and 27 Nobel laureates signed a letter endorsing a gradually rising carbon tax whose proceeds would be distributed to consumers as “carbon dividends.”

The Trump administration has long rejected prescriptions like a carbon tax. But policy debates aside, many of the central economic questions of the decades ahead are, at their core, going to be climate questions. These are some of the big ones.

How permanent will the costs be?

When we think about the economic damage from a hotter planet, it’s important to remember that not all costs are equivalent, even when the dollar values are similar. There is a big difference between costs that are high but manageable versus those that might come with catastrophic events like food shortages and mass refugee crises.

Consider three possible ways that climate change could exact an economic cost:

  • A once-fertile agricultural area experiences hotter weather and drought, causing its crop yields to decrease.
  • A road destroyed by flooding because of rising seas and more frequent hurricanes must be rebuilt.
  • An electrical utility spends hundreds of millions of dollars to build a more efficient power grid because the old one could not withstand extreme weather.

The farmland’s yield decline is a permanent loss of the economy’s productive capacity — society is that much poorer, for the indefinite future. It’s worse than what happens in a typical economic downturn. Usually when factories sit idle during a recession, there is a reasonable expectation that they will start cranking again once the economy returns to health.

The road rebuilding might be expensive, but at least that money is going to pay people and businesses to do their work. The cost for society over all is that the resources that go to rebuilding the road are not available for something else that might be more valuable. That’s a setback, but it’s not a permanent reduction in economic potential like the less fertile farmland. And in a recession, it might even be a net positive, under the same logic that fiscal stimulus can be beneficial in a downturn.

By contrast, new investment in the power grid could yield long-term benefits in energy efficiency and greater reliability.

There’s some parallel with military spending. In the 1950s and ’60s, during the Cold War, the United States spent more than 10 percent of G.D.P. on national defense (it’s now below 4 percent).

Most of that spending crowded out other forms of economic activity; many houses and cars and washing machines weren’t made because of the resources that instead went to making tanks, bombs and fighter jets. But some of that spending also created long-term benefits for society, like the innovations that led to the internet and to reliable commercial jet aircraft travel.

Certain types of efforts to reduce carbon emissions or adapt to climate impacts are likely to generate similar benefits, says Nicholas Stern, chair of the Grantham Research Institute on Climate Change and the Environment at the London School of Economics.

“You couldn’t provide sea defenses at large scale without very heavy investment, but it’s not investment of the kind that you get from the things that breed technological progress,” Mr. Stern said. “The defensive adaptations don’t carry anything like the dynamism that comes from different ways of doing things.”

There is more fertile ground in areas like transportation and infrastructure, he said. Electric cars, instead of those with internal combustion engines, would mean less air pollution in cities, for example.

How should we value the future compared with the present?

Seeking a baseline to devise environmental regulations, the Obama administration set out to calculate a “social cost of carbon,” the amount of harm each new ton of carbon emissions will cause in decades ahead.

At the core of the project were sophisticated efforts to model how a hotter earth will affect thousands of different places. That’s necessary because a low-lying region that already has many hot days a year is likely to face bigger problems, sooner, than a higher-altitude location that currently has a temperate climate.

Michael Greenstone, who is now director of the Becker Friedman Institute at the University of Chicago and of the Energy Policy Institute there, as well as a contributor to The Upshot, was part of those efforts.

“We’ve divided the world into 25,000 regions and married that with very precise geographic predictions on how the local climate will change,” Mr. Greenstone said. “Just having the raw computing power to be able to analyze this at a more disaggregated level is a big part of it.”

But even once you have an estimate of the cost of a hotter climate in future decades, some seemingly small assumptions can drastically alter the social cost of carbon today.

Finance uses something called the discount rate to compare future value with present value. What would the promise of a $1,000 payment 10 years from now be worth to you today? Certainly something less than $1,000 — but how much less would depend on what rate you use.

Likewise, the cost of carbon emissions varies greatly depending on how you value the well-being of people in future decades — many not born yet, and who may benefit from technologies and wealth we cannot imagine — versus our well-being today.

The magic of compounding means that the exact rate matters a great deal when looking at things far in the future. It’s essentially the inverse of observing that a $1,000 investment that compounds at 3 percent a year will be worth about $4,400 in 50 years, whereas one that grows 7 percent per year will be worth more than $29,000.
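The compounding figures above, and their inverse relationship to discounting, can be checked directly. A minimal sketch (the function names are illustrative, not from any particular finance library):

```python
def compound(principal, rate, years):
    """Future value of `principal` growing at `rate` per year."""
    return principal * (1 + rate) ** years

def present_value(amount, rate, years):
    """Value today of `amount` received `years` from now, discounted at `rate`."""
    return amount / (1 + rate) ** years

# The article's compounding figures: $1,000 over 50 years.
print(round(compound(1000, 0.03, 50)))   # ≈ 4384  ("about $4,400")
print(round(compound(1000, 0.07, 50)))   # ≈ 29457 ("more than $29,000")

# The inverse view that drives the social cost of carbon: a $1,000 climate
# cost 50 years from now, valued in today's dollars.
print(round(present_value(1000, 0.03, 50)))  # ≈ 228
print(round(present_value(1000, 0.07, 50)))  # ≈ 34
```

The last two lines show why the choice of rate dominates the debate: the same future harm shrinks nearly sevenfold when the discount rate moves from 3 to 7 percent.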

In the Obama administration’s analysis, using a 5 percent discount rate — which would put comparatively little weight on the well-being of future generations — would imply a social cost of $12 (in 2007 dollars) for emitting one metric ton of carbon dioxide. A metric ton is about what would be released as a car burns 113 gallons of gasoline. A 2.5 percent rate would imply a cost of $62, which adds up to hundreds of billions of dollars a year in society-wide costs at recent rates of emissions.

The Obama administration settled on a 3 percent discount rate that put the social cost of carbon at $42 per metric ton. The Trump administration has subsequently revised that estimate to between $1 and $7.

That sharp decrease was achieved in part by measuring only the future economic costs to the United States, not factoring in the rest of the world. And the Trump administration analyzed a discount rate of up to 7 percent — a rate at which even costs far into the future become trivial.

Mr. Greenstone favors substantially lower discount rates, based on evidence that financial markets also place high value on investments that protect against risk.

Understood this way, spending today to reduce carbon emissions tomorrow is like insurance against some of the most costly effects of a hotter planet — and part of the debate is over how much that insurance is really worth, given that the biggest benefits are far in the future.

How might climate change fuel inequality?

When a government report raises the possibility of a 10 percent hit to G.D.P. as a result of a warming climate, it can be easy to picture everyone’s incomes being reduced by a tenth.

In reality there is likely to be enormous variance in the economic impact, depending on where people live and what kind of jobs they have.

Low-lying, flood-prone areas are at particularly high risk of becoming unlivable — or at least uninsurable. Certain industries in certain places will be dealt a huge blow, or cease to exist; many ski slopes will turn out to be too warm for regular snow, and the map of global agriculture will shift.

Adaptation will probably be easier for the affluent than for the poor. Those who can afford to move to an area with more favorable impacts from a warmer climate presumably will.

So the economic implications of climate change include huge shifts in geography, demographics and technology, with each affecting the other.

“To look at things in terms of G.D.P. doesn’t really capture what this means to people’s lives,” said William Nordhaus, a Yale economist who pioneered the models on which modern climate economics is based and who won a Nobel for that work. “If you just look at an average of all the things we experience, some in the marketplace and some not in the marketplace, it’s insufficient. The impact is going to be highly diverse.”

Can we adapt to a warmer climate?

Despite all these risks, it’s important to remember that humanity tends to be remarkably adaptable. A century ago, most people lived without an automobile, a refrigerator, or the possibility of traveling by airplane. A couple of decades before that, almost no one had indoor plumbing.

Changes in how people live, and the technology they use, could both mitigate the impact of climate change and ensure that the costs are less about a pure economic loss and more about rewiring the way civilization works.

Most capital investments last only a decade or two to begin with; people are constantly rebuilding roads, buildings and other infrastructure. And a warmer climate could, if it plays out slowly enough, merely shift where that reinvestment happens.

But a big risk is that the change happens too quickly. Adaptation that might be manageable over a generation could be impossible — and cause mass suffering or death — if it happens over a few years.

Imagine major staple food crops being wiped out for a few consecutive years by drought or other extreme weather. Or a large coastal city wiped out in a single extreme storm.

“Whether it’s jobs, consumption patterns or residential patterns, if things are changing so fast that we can’t adapt to them, that will be very, very costly,” Mr. Nordhaus said. “We know we can adapt to slow changes. Rapid changes are the ones that would be most damaging and painful.”

It’s clear that climate change and its ripple effects are likely to be a defining challenge of the 21st-century economy. But there are wide ranges of possible results that vary based on countless assumptions. We should also recognize that the economic backdrop of society is always changing. Projecting what that will mean for ordinary people is not simply a matter of dollars.

“I’ve spent the last 20 years trying to communicate it and it’s not easy to process,” Joseph Aldy, who teaches at the Harvard Kennedy School, said of the connection between climate change and the economy. “It’s really hard to convey something that is long term and gradual until it’s not.”


A Growing Role As a Living Lab

Green Homes - Fri, 2019-01-18 14:23
January 16, 2019

The Harvard Gazette

A Growing Role As a Living Lab

By Deborah Blackwell, Arnold Arboretum Communications

Andrew Groover celebrates the complexity of trees and makes it his life’s work to unlock how they adapt to their environments. It’s knowledge that’s critical for the U.S. Forest Service research geneticist — he works in California, where concerns about climate change have grown as wildfires there have increased in frequency and intensity.

A practical problem for Groover, who is a University of California, Davis, adjunct professor of plant biology, is efficient access to the variety of trees he studies. His research requires a ready supply of species diversity, a tall order without laborious travel. But in 2012 his search for the perfect resource brought him to the Arnold Arboretum of Harvard University — a 281-acre living museum holding more than 2,100 woody plant species from around the world.

“Trees are fascinating for biology and research, but one of the greatest challenges in this research is finding trees tractable for study,” Groover said. “If you have a list of a dozen or two different species, where do you get all those? The Arnold Arboretum has all of the species we would ever want to look at, and then some.”

The Arboretum also contains one of the most extensive collections of Asian trees in the world, which Groover said is advantageous to his research. Typically a researcher has to travel to various locations throughout the world, determine whether the trees are on public or private property, obtain permission to study and transport samples, overcome language and other barriers, and potentially return to the same site later to complete research, which can be challenging.

“The Arnold Arboretum plays a crucial role in research and science and educating the public, connecting them with trees and forests. But it’s also a living laboratory and repository of hard-to-source species for research and is renowned for its collection of Asian disjuncts,” he said. “We can actually study these species pairs found in both Asia and the U.S. directly in the Arboretum. We didn’t need to go anywhere else.”

Director of the Arnold Arboretum and Arnold Professor of Organismic and Evolutionary Biology William (Ned) Friedman emphasized the extraordinary efforts that go into creating such a high-impact research destination.

“Importantly, beyond the more than 16,000 accessioned woody plants at the Arnold Arboretum, we have a staff of world-class horticulturists, propagators, IT professionals, curators, and archivists, all of whom are devoted to ensuring that the living collections are what I call a ‘working collection’ of plants,” he said. “The plants of the Arboretum may look great in flower, or at the peak of fall colors, but these plants are here primarily to be studied by scholars at Harvard and from around the world. In 2018 alone, there were 79 different research projects using the living collections and landscape of the Arnold Arboretum.”

Groover’s work with the Arboretum became a long-term collaboration. In 2014 he won a Sargent fellowship, and, working with Arboretum scientists, collected small samples of genetic material from specific Arboretum trees and propagated them in his own laboratory greenhouses. In 2015 Groover, with Friedman, organized the 35th New Phytologist Symposium held at the Arboretum. He has also given several research talks there, most recently in December on genomic approaches to understanding the development and evolution of forest trees.

“When the Weld Hill Research Building was completed [in 2011], many of us in the research community saw that as a real commitment holding great possibilities for expanding into new areas of research,” he said. “We could not only access a broad range of species all in one location, we had a physical facility for research activities.”

Groover’s work investigates genetic regulation of wood formation — the triggers of gene expression within the wood — which is driven by environment, including light, temperature, wind, water, gravity, even insects and disease. Studying diverse tree species helps him identify the genetic basis of how different species modify their growth and adapt to different environmental conditions.

“Trees, in general, are very responsive to the environment, and trees can actually make adjustments in their wood anatomy to suit the environment,” Groover said. “One thing that is really interesting about trees is that they are perennial and live for decades or even thousands of years in the same place, and they have to be able to cope with all of the variation.”

The collaboration with the Arboretum is special because its trees come with well-documented provenance.

“The trees are well-cared for, are not likely to disappear or die so you can go back again, and they are all right there next to each other,” Groover said.

While his in-depth research is on poplars (Populus spp.), the knowledge obtained may be beneficial in the study of many other tree species.

“If the genetic regulation of a trait is conserved among species, then what we learn in poplar can be transferred to the hundreds of other species we would like to be able to better manage or understand,” Groover said. “We can transfer knowledge across different species and potentially use that information in the future for things like reforestation and restoration.”

Suzanne Gerttula of the Forest Service began working in developmental plant genetics more than three decades ago and joined Groover’s laboratory in 2010. The former staff research associate in plant biology at U.C. Davis has an interest in the underlying mechanisms of trees’ responses to gravity, such as those seen in weeping varieties.

“The Arboretum is an incredible resource for both weeping and upright trees. It’s fascinating, fun, and inspiring to me to be able to get at some of the biochemical bases of how life works,” she said.

Groover’s enthusiasm for his subject spans sectors from ecological to economic. From understanding Earth cycles and climate change to helping the lumber, paper, fiber, and even biofuel industries, he hopes his research can inform solutions for forest management and conservation and identify new forms of renewable energy.

“I think it’s important we have places like the Arnold Arboretum to help provide this sort of basic information that has the potential to help in the conservation and management of forests,” he said.

Michael Dosmann, Keeper of the Living Collections at the Arboretum, said it has research potential across a wide swath of disciplines — taxonomic, horticultural, plant conservation, ecology, and developmental biology.

“Our living collection’s research potential could never be exhausted; there is a constant need for its use, growth, and development,” he said. “[The] dynamic interplay between living collections and scientific research demonstrates the vital importance collections have to science and to society.”

Scientists such as Groover enjoy access not only to the living collections, but also to other Arboretum resources, including affiliated collections containing herbarium specimens, archives, images, historical records, on-site greenhouse and laboratory space, centralized expertise, and, frequently, financial assistance in the form of grants and fellowships.

“All too often, the cost both in time and dollars of assembling collections at their own institutions is prohibitive for researchers, making places like the Arboretum a vital resource, especially for those working with limited budgets,” Dosmann said.

Evolving technology also plays a critical role, according to Dosmann, giving researchers the ability to access the Arboretum’s expansive resources, and making plant species more attainable.

“With the aid of databases and other information systems, it is now much easier to see collections in the multiple dimensions within which they exist and appreciate their unlimited research potential,” he said.

Groover said that with forests facing multiple threats, there’s never been a more important time to address forest biology and the use of technology.

“In the west especially, we need new insights into how to make forests more resilient to drought and heat, including understanding the biology underlying stress responses in different tree species,” he said. “We are learning the complexities of forest trees and hope to ultimately be able to select genotypes or species that might perform better in the future. Working with the Arboretum offers the resources for this important research.”


Is the Green New Deal For Real?

Green Homes - Fri, 2019-01-18 10:50
January 10, 2019

90.9 WBUR

Is the Green New Deal For Real?

The mission, as it turned out, was to transform the American economy and save the country, no less, over twelve years. Franklin Roosevelt called it his New Deal, starting in 1933. New-breed Democrats in Congress today are talking about a Green New Deal, starting now, deep into the crisis of a changing climate that goes way beyond the weather. FDR had a working class revolt driving him forward, and later he had a Nazi threat and a world war to focus every fiber of mind and muscle on a reinvention. Which may be what the climate is demanding. Here’s one test: at mention of an all-new renewable energy system, is your first thought Costs? Savings? Or Survival? Getting real about the Green New Deal, this week on Open Source.

Three words and one picture sum up the new scene in Washington—and the relief, for starters, from a two-year fixation on President You-Know-Who. The picture is of the so-called Sunrise Movement siege of Nancy Pelosi’s office from last November, and of the rapturous, insurgent Congressperson from the Bronx, Alexandria Ocasio-Cortez, sweeping up the moment and putting its three little words—Green New Deal—at the top of the evolving agenda in D.C. It’s as slippery a promise as universal health care, but here’s our first crack at what it could mean: a resurrection of spirit, perhaps, at the bold Rooseveltian scale, after 75 years? A reset in relations with work, among workers, which Roosevelt’s New Deal was? We’ll see. Does it mean a war for clean, renewable energy, against the embedded power of fossil fuels? Unavoidably. A “system upgrade” for the power grid and the whole economy? About time, you say! But can it be done?

Guest List

Bill McKibben
environmentalist and journalist

Naomi Oreskes
Professor of the History of Science at Harvard

Daniel Schrag
Professor of Geology and Environmental Science and Engineering at Harvard and director of the Harvard Center for the Environment

Categories: Green Homes

The Double-Edged Sword of Palm Oil

Green Homes - Thu, 2019-01-17 14:28
January 16, 2019

Stanford Earth

The Double-Edged Sword of Palm Oil

BY ROB JORDAN, STANFORD WOODS INSTITUTE FOR THE ENVIRONMENT

Nearly ubiquitous in products ranging from cookies to cosmetics, palm oil represents a bedeviling double-edged sword. Widespread cultivation of oil palm trees has been both an economic boon and an environmental disaster for tropical developing-world countries, contributing to large-scale habitat loss, among other impacts. New Stanford-led research points the way to a middle ground of sustainable development through engagement with an often overlooked segment of the supply chain.

"The oil palm sector is working to achieve zero-deforestation supply chains in response to consumer-driven and regulatory pressures, but they won’t be successful until we find effective ways to include small-scale producers in sustainability strategies,” said Elsa Ordway, lead author of a Jan. 10 Nature Communications paper that examines the role of proliferating informal oil palm mills in African deforestation. Ordway, a postdoctoral fellow at The Harvard University Center for the Environment, did the research while a graduate student in Stanford’s School of Earth, Energy & Environmental Sciences (Stanford Earth).

Using remote sensing tools, Ordway and her colleagues mapped deforestation due to oil palm expansion in Southwest Cameroon, a top producing region in Africa’s third largest palm oil producing country. 

Contrary to a widely publicized narrative of deforestation driven by industrial-scale expansion, the researchers found most oil palm expansion and associated deforestation occurred outside large, company-owned concessions, and that expansion and forest clearing by small-scale, non-industrial producers was more likely near low-yielding informal mills, scattered throughout the region. This is strong evidence that oil palm production gains in Cameroon are coming from extensification instead of intensification.

Possible solutions for reversing the extensification trend include improving crop and processing yields by using more high-yielding seed types, replanting old plantations, and upgrading and mechanizing milling technologies, among other approaches. To prevent intensification efforts from inciting further deforestation, they will need to be accompanied by complementary natural resource policies that include sustainability incentives for smallholders.

In Indonesia, where a large percentage of the world’s oil palm-related forest clearing has occurred, a similar focus on independent, smallholder producers could yield major benefits for both poverty alleviation and environmental conservation, according to a Jan. 4 Ambio study led by Rosamond Naylor, the William Wrigley Professor in the School of Earth, Energy & Environmental Sciences and a senior fellow at the Stanford Woods Institute for the Environment and the Freeman Spogli Institute for International Studies (Naylor coauthored the Cameroon study led by Ordway).

Using field surveys and government data, Naylor and her colleagues analyzed the role of small producers in economic development and environmental damage through land clearing. Their research focused on how changes in legal instruments and government policies during the past two decades, including the abandonment of revenue-sharing agreements between district and central governments and conflicting land title authority among local, regional and central authorities, have fueled rapid oil palm growth and forest clearing in Indonesia.

They found that Indonesia’s shift toward decentralized governance since the end of the Suharto dictatorship in 1998 has simultaneously encouraged economic development through the expansion of smallholder oil palm producers (by far the fastest growing subsector of the industry since decentralization began), reduced rural poverty, and driven ecologically destructive practices such as oil palm encroachment into more than 80 percent of the country’s Tesso Nilo National Park.

Among other potential solutions, Naylor and her coauthors suggest Indonesia’s Village Law of 2014, which devolves authority over economic development to the local level, be re-drafted to enforce existing environmental laws explicitly. Widespread use of external facilitators could help local leaders design sustainable development strategies and allocate village funds more efficiently, according to the research. Also, economic incentives for sustainable development, such as a program in India in which residents are paid to leave forests standing, could make a significant impact.

There is reason for hope in recent moves by Indonesia’s government, including support for initiatives that involve large oil palm companies working with smallholders to reduce fires and increase productivity; and the mapping of a national fire prevention plan that relies on financial incentives.

“In all of these efforts, smallholder producers operating within a decentralized form of governance provide both the greatest challenges and the largest opportunities for enhancing rural development while minimizing environmental degradation,” the researchers write.

Coauthors of "Decentralization and the environment: Assessing smallholder oil palm development in Indonesia” include Matthew Higgins, a research assistant at Stanford’s Center on Food Security and the Environment; Ryan Edwards of Dartmouth College, and Walter Falcon, the Helen C. Farnsworth Professor of International Agricultural Policy, Emeritus, at Stanford.

Coauthors of “Oil palm expansion at the expense of forests in Southwest Cameroon associated with proliferation of informal mills” include Raymond Nkongho, a former fellow at Stanford’s Center for Food Security and the Environment; and Eric Lambin, the George and Setsuko Ishiyama Provostial Professor in the School of Earth, Energy & Environmental Sciences and a senior fellow at the Stanford Woods Institute for the Environment.

Image: Elsa Ordway

Categories: Green Homes

Mercury Matters 2018: A Science Brief for Journalists and Policymakers

Green Homes - Tue, 2019-01-15 12:45
December 1, 2018

Center for Climate, Health, and the Global Environment

Mercury Matters 2018: A Science Brief for Journalists and Policymakers

Mercury in Context

Coal-fired power plants are the largest source of mercury in the U.S., accounting for approximately 48% of mercury emissions in 2015[1].

The Mercury and Air Toxics Standards (MATS) were finalized in 2011 and currently regulate emissions of mercury, acid gases and other hazardous air pollutants (HAPs) from U.S. electric utilities.

The MATS rule is expected to reduce mercury emissions from the power sector by 90%, improve public health, and play an integral role in meeting U.S. commitments under the international 2017 Minamata Convention on Mercury. 

The Latest from EPA

In August 2018, the U.S. Environmental Protection Agency (EPA) announced plans to revisit the Agency’s prior determination that regulating HAPs emitted from power plants under section 112 of the Clean Air Act was “appropriate and necessary”.

A proposal to reopen one or more aspects of MATS is currently under interagency review at the Office of Management and Budget and could result in lifting limits on mercury emissions from electric utilities in the U.S. 

The Issue

Recent research shows that MATS has substantially reduced mercury levels in the environment and improved public health at a much lower cost than anticipated. However, the Regulatory Impact Assessment (RIA) that the Administration is relying on in its rollback proposal does not reflect current scientific understanding of the local impacts and societal cost of mercury pollution in the U.S.[2],[3].

Many of the health effects associated with mercury exposure are not fully reflected in the RIA, and the final estimate of the mercury-related benefits from MATS only accounted for benefits to children of freshwater recreational anglers in the U.S., a small fraction of the total population affected. 

Mercury Emissions Matter to Human Health and the Environment

Mercury in the form of methylmercury is a potent neurotoxin. Important facts about the health effects of methylmercury include the following:

Children exposed to methylmercury during a mother’s pregnancy can experience persistent and lifelong IQ and motor function deficits[4].

In adults, high levels of methylmercury exposure have been associated with adverse cardiovascular effects, including increased risk of fatal heart attacks[5].

Other adverse health effects of methylmercury exposure that have been identified in the scientific literature include endocrine disruption[6], diabetes risk[7], and compromised immune function[8].

The societal costs of neurocognitive deficits associated with methylmercury exposure in the U.S. were estimated in 2017 to be approximately $4.8 billion per year[9].

No known threshold exists for methylmercury below which neurodevelopmental impacts do not occur[10],[11].

Mercury exposure in the U.S. occurs primarily through the consumption of freshwater fish and seafood (fish and shellfish). The consumption of marine fish, often harvested from U.S. coastal waters, accounts for greater than 80% of methylmercury intake by the U.S. population[12]. Dietary supplements cannot counteract methylmercury toxicity in U.S. consumers. A safe and consumable fishery is important to retaining a healthy, low-cost source of protein and other nutrients that are essential for pregnant women, young children, and the general population.

After mercury is emitted from power plants it is deposited back to Earth where it can be converted to methylmercury, a highly toxic form of mercury that magnifies up food chains, reaching concentrations in fish that are 10 to 100 million times greater than concentrations in water[13].
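The magnification described above is, at its simplest, multiplication by a very large factor. A minimal sketch, in which the water concentration and function name are hypothetical while the 10-to-100-million-fold range comes from the brief:

```python
# Illustrative sketch of food-chain biomagnification as described above.
# The water concentration and function name are hypothetical; the
# 10-to-100-million-fold magnification range comes from the brief [13].

def tissue_concentration(water_conc, magnification_factor):
    """Fish-tissue methylmercury concentration implied by a water
    concentration and a food-chain magnification factor."""
    return water_conc * magnification_factor

# A hypothetical water concentration of 0.5 (arbitrary units),
# magnified ten-million-fold up the food chain:
print(tissue_concentration(0.5, 1e7))  # 5000000.0
```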

With increasing levels of mercury in the environment due to human activities, virtually all fish from U.S. waters now have detectable levels of methylmercury. Some fish, such as swordfish, large species of tuna, and freshwater game fish, can have levels that exceed consumption guidelines.

States post fish consumption advisories for waterbodies that are known to have elevated contaminants. In 2013, consumption advisories for mercury were in effect in all 50 states, one U.S. territory, and three tribal territories, and accounted for 81% of all U.S. advisories[14]. This represents more advisories for mercury than for all other contaminants combined.

Wildlife that consume fish, such as common loons, bald eagles, otter and mink, and many marine mammals can also experience adverse effects from mercury and are unable to heed advisories[15]. The health of many songbird and bat species is threatened due to methylmercury exposure in wetland habitats. The productivity of economically valuable game fish stocks can also be compromised[16].

As Mercury Emissions in the U.S. Have Declined, Health Has Improved

In the 2011 MATS RIA, it was assumed that mercury emissions from coal-fired utilities are mainly transported long-distances away from the U.S. and that a substantial fraction of mercury in the U.S. comes from international sources. Since that time, scientific understanding of the fate of U.S. mercury emissions has advanced[17],[18]. Recent research reveals that the contribution of U.S. coal-fired power plants to local mercury contamination in the U.S. has been markedly underestimated. Accordingly, controls on mercury emissions from U.S. electric utilities have contributed to the following human health and environmental improvements.

Mercury emissions from U.S. coal-fired power plants have declined by 85% from 92,000 pounds in 2006 to 14,000 pounds in 2016[19] since states began setting standards and MATS was introduced in 2011. Eleven states had implemented mercury emissions standards for power plants prior to 2011.
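The 85% figure follows directly from the pound totals cited above; this is just the arithmetic, not new data:

```python
# Arithmetic check of the decline cited in the brief: 92,000 lb (2006)
# down to 14,000 lb (2016).
emissions_2006_lb = 92_000
emissions_2016_lb = 14_000

decline = (emissions_2006_lb - emissions_2016_lb) / emissions_2006_lb
print(f"{decline:.0%}")  # 85%
```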

Concurrent with declines in mercury emissions, mercury levels in air, water, sediments, loons, freshwater fisheries, and Atlantic Ocean fisheries[20] have decreased appreciably.

Mercury levels in the blood of women in the U.S. declined by 34% between 2001 and 2010 as mercury levels in some fish decreased, and fish consumption advisories improved[21].

The estimated number of children born in the U.S. each year with prenatal exposure to methylmercury levels that exceed the EPA reference dose has decreased by half from 200,000-400,000 to 100,000-200,000, depending on the measure used[22].

The Benefits of Reducing Mercury Are Much Larger Than Previously Estimated

The EPA estimated in the MATS RIA that the annualized mercury-related health benefits of reducing mercury emissions would be less than $10 million. Recent studies that account for more pathways of methylmercury exposure and additional health effects suggest that the monetized benefits of reducing power plant mercury emissions in the U.S. are likely in the range of several billion dollars per year[23],[24],[25]. These and other studies support the conclusion that the mercury-related benefits from MATS are orders of magnitude larger than previously estimated in the MATS RIA[26].

In addition to the mercury-related benefits, MATS has also decreased sulfur dioxide and nitrogen oxide emissions, improving air quality and public health by reducing fine particulate matter and ground-level ozone. The EPA estimated that the annualized value of these additional benefits is $24 to $80 billion, bringing the total annual benefits from MATS to tens of billions of dollars. Even with these more complete estimates, substantial benefits of reducing mercury and other air toxics remain unquantified due to data limitations[27].

On the cost side, new information suggests that the EPA’s original cost estimate for MATS of $9.6 billion is much higher than the actual cost due to declines in natural gas prices and lower than expected control equipment and renewable energy costs[28]. Yet, even with the original overestimate, the EPA projected that MATS would increase the monthly electric bill of the average American household by only $2.71 (or 0.3 cents per kilowatt-hour). This value is well within the price fluctuation consumers experienced between 2000 and 2011[29].
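The EPA projection above supplies two figures, $2.71 per month and 0.3 cents per kilowatt-hour; dividing one by the other recovers the implied average monthly household usage. This is illustrative arithmetic only:

```python
# The two EPA projections cited above: $2.71 per month at a rate
# impact of 0.3 cents per kWh. Their ratio implies an assumed average
# household electricity usage.
monthly_cost_usd = 2.71
cost_per_kwh_usd = 0.003  # 0.3 cents per kilowatt-hour

implied_usage_kwh = monthly_cost_usd / cost_per_kwh_usd
print(round(implied_usage_kwh))  # roughly 900 kWh per month
```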

The Bottom Line

The science is clear: the health impacts of U.S. mercury emissions are large and disproportionately affect children and other vulnerable populations. Mercury emission standards in the U.S. have markedly reduced mercury in the environment and improved public health. The mercury-related benefits of MATS are much larger than previously estimated, the actual costs appear to be substantially lower than projected by the EPA, and the total monetized benefits across all pollutants far outweigh the costs of the standards.

Contributors

Charles Driscoll, Department of Civil and Environmental Engineering, Syracuse University

Elsie Sunderland, Harvard Paulson School of Engineering & Applied Sciences and Harvard T.H. Chan School of Public Health, Department of Environmental Health, Exposure, Epidemiology and Risk

Kathy Fallon Lambert, Harvard T.H. Chan School of Public Health, Center for Climate, Health, and the Global Environment

Joel Blum, Department of Earth and Environmental Sciences, University of Michigan

Celia Chen, Department of Biological Sciences, Dartmouth College

David Evers, BioDiversity Research Institute

Philippe Grandjean, Harvard T.H. Chan School of Public Health, Department of Environmental Health, Environmental and  Occupational Medicine and Epidemiology

Rob Mason, Departments of Chemistry and Marine Sciences, University of Connecticut

Emily Oken, Harvard Medical School

Noelle Selin, Department of Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology

Literature Cited

[1] Streets, D.G.; Horowitz, H.M.; Lu, Z.; Levin, L.; Thackray, C.P.; Sunderland, E.M. Global and regional trends in mercury emissions and concentrations, 2010-2015. Atmospheric Environment. Accepted.

[2] Sunderland, E.M.; Driscoll, Jr., C.T.; Hammitt, J.K.; Grandjean, P.; Evans, J.S.; Blum, J.D.; Chen, C.Y.; Evers, D.C.; Jaffe, D.A.; Mason, R.P.; Goho, S.; Jacobs, W. 2016. Benefits of Regulating Hazardous Air Pollutants from Coal and Oil-Fired Utilities in the United States. Environmental Science & Technology. 50 (5), 2117-2120. DOI: 10.1021/acs.est.6b00239.

[3] Giang, A.; Mulvaney, K; Selin, N.E. 2016. Comments on “Supplemental Finding That It Is Appropriate and Necessary to Regulate Hazardous Air Pollutants from Coal- and Oil-Fired Electric Utility Steam Generating Units”.

[4] Grandjean, P. and Bellanger, M. 2017. Calculation of the disease burden associated with environmental chemical exposures: application of toxicological information in health economic estimation. Environmental Health. 16:123. DOI: 10.1186/s12940-017-0340-3.

[5] Genchi G., Sinicropi M.S., Carocci A., Lauria G., Catalano A. 2017. Mercury Exposure and Heart Diseases. Int J Environ Res Public Health. 2017;14(1):74. Published Jan 12. DOI:10.3390/ijerph14010074.

[6] Tan, S.W.; Meiller, J.C.; Mahaffey, K.R. 2009. The endocrine effects of mercury in humans and wildlife. Crit. Rev. Toxicol. 39 (3), 228−269.

[7] He, K.; Xun, P.; Liu, K.; Morris, S.; Reis, J.; Guallar, E. 2013. Mercury exposure in young adulthood and incidence of diabetes later in life: the CARDIA trace element study. Diabetes Care. 36, 1584−1589.

[8] Nyland, J. F.; Fillion, M.; Barbosa, R., Jr.; Shirley, D. L.; Chine, C.; Lemire, M.; Mergler, D.; Silbergeld, E.K. 2011. Biomarkers of methylmercury exposure and immunotoxicity among fish consumers in the Amazonian Brazil. Env. Health Persp. 119 (12), 1733− 1738.

[9] Grandjean and Bellanger 2017.

[10] Rice, G.E.; Hammitt, J.K.; and Evans, J.S. 2010. A probabilistic characterization of the health benefits of reducing methyl mercury intake in the United States. Environ Sci Technol. 44(13):5216-5224. DOI:10.1021/es903359u.

[11] Grandjean and Bellanger 2017.

[12] Sunderland, E. M.; Li, M.; Bullard, K. 2018. Decadal Changes in the Edible Supply of Seafood and Methylmercury Exposure in the United States. Environ. Health Persp. DOI: 10.1289/EHP2644.

[13] Driscoll, C.T.; Han, Y-J; Chen, C.; Evers, D.; Lambert, K.F.; Holsen, T.; Kamman, N.; and Munson, R. 2007. Mercury Contamination on Remote Forest and Aquatic Ecosystems in the Northeastern U.S.: Sources, Transformations, and Management Options. BioScience. 57(1):17-28.

[14] U.S. Environmental Protection Agency. 2011 National Listing of Fish Advisories. 2013. EPA-820-F-13-058.

[15] Chan, N.M.; Scheuhammer, A.M.; Ferran, A.; Loupelle, C.; Holloway, J.; and Weech, S. 2003. Impacts of Mercury on Freshwater Fish-eating Wildlife and Humans. Human and Ecological Risk Assessment. 9(4): 867-883.

[17] Zhang, Y.; Jacob, D.; Horowitz, H.; Chen, L.; Amos, H.; Krabbenhoft, D.; Slemr, F.; St. Louis, V.; Sunderland, E. 2016. Observed decrease in atmospheric mercury explained by global decline in anthropogenic emissions. PNAS. 113 (3) 526-531.  DOI: 10.1073/pnas.1516312113.

[18] Lepak, R.F.; Yin, R.; Krabbenhoft, D.; Ogorek, J.; DeWild, J.; Holsen, T.; and Hurley, J. 2015. Use of Stable Isotope Signatures to Determine Mercury Sources in the Great Lakes. Environmental Science & Technology Letters. 2 (12), 335-34. DOI: 10.1021/acs.estlett.5b00277.

[19] U.S. Environmental Protection Agency. 2018. https://www.epa.gov/trinationalanalysis/electric-utilities-mercury-relea....

[20] Cross, F.A.; Evans, D.W.; Barber, R.T. 2015. Decadal declines of mercury in adult bluefish (1972−2011) from the mid-Atlantic coast of the U.S.A. Environ. Sci. Technol. 49, 9064−9072.

[21] U.S. Environmental Protection Agency. 2013. Trends in Blood Mercury Concentrations and Fish Consumption Among U.S. Women of Childbearing Age NHANES 1999-2010. EPA-823-R-13-002. https://www.regulations.gov/document?D=EPA-HQ-OAR-2009-0234-20544.

[22] U.S. Environmental Protection Agency. 2013. EPA-823-R-13-002.

[23] Rice et al. 2010.

[24] Giang, A.; Selin, N. E. Benefits of mercury controls for the United States. Proc. Natl. Acad. Sci. U. S. A. 2016, 113, 286.

[25] Sunderland et al. 2016.

[26] Giang et al. 2016.

[27] Sunderland et al. 2016.

[28] Declaration of James E. Staudt, Ph.D. CFA, September 24, 2015, White Stallion Energy Center, et al., v. United States Environmental Protection Agency, Case No. 12-1100 and Summary plus cases, Exhibit 1 Declaration of James E. Staudt, Ph.D., CFA, U.S. Court of Appeals for the District of Columbia.

[29] U.S. Environmental Protection Agency. Final Consideration of Cost in the Appropriate and Necessary Finding for the Mercury and Air Toxics Standards for Power Plants. https://www.epa.gov/sites/production/files/2016-05/documents/20160414_ma....

Photo by Pixabay user 12019

Categories: Green Homes

Environmental Health Capacity Building

Green Homes - Tue, 2019-01-08 10:14
November 8, 2018

Harvard T.H. Chan School News

Environmental Health Capacity Building

By Chris Sweeney

Dust storms in Kuwait. Tourism in Tunisia. Air pollution in Uganda. Three different countries facing three different challenges. A common thread? Harvard T.H. Chan School researchers are working in each setting to understand how environmental factors are impacting the health of the people who live and work in these regions.

At a panel discussion on “Environmental Health Capacity Building In Africa And The Middle East” held on October 25, 2018 as part of Harvard Worldwide Week, attendees were given a look at these ongoing research projects.

“Developing countries in Africa and the Middle East are bearing a disproportionate health burden from climate change and environmental contamination,” said Douglas Dockery, John L. Loeb and Frances Lehman Loeb Research Professor of Environmental Epidemiology. “For this panel we brought together three investigators across Harvard who are partnering with institutions in this region to build local capacity to address these challenges.”

The event, hosted by the Department of Environmental Health and the Harvard Chan-NIEHS Center for Environmental Health, kicked off with a presentation from Petros Koutrakis, professor of environmental sciences and an expert on air pollution. Koutrakis shared an overview of his work in the Middle East, which dates back to the 1990s when he and colleagues assessed the environmental health impacts of the hundreds of oil wells that were set ablaze during the Gulf War.

More recently, Koutrakis has turned his attention to dust storms in Kuwait, a fairly common meteorological event that may have a significant impact on human health and social dynamics. Using satellite data, historical weather records, and air quality sensors, Koutrakis and colleagues are gleaning new insights on how desert vegetation and wind patterns affect the severity and frequency of dust storms.

“Dust is not something we can control, and so people have to adjust,” Koutrakis said, noting that these adjustments can impact human activity and health. For instance, on days when dust storms are severe, people may be forced to stay indoors, reducing their ability to exercise. There is also the potential that exposure to dust storms over long periods of time may be associated with chronic respiratory problems.

Koutrakis was followed by Misbath Daouda, a master’s candidate in environmental health who’s studying how the growing tourism industry in Tunisia may impact the local environment.

Daouda’s research so far has shown that in some tourism hot spots, electricity demand surges by more than 50% during the busy season and that the sector is responsible for more than one-third of water consumption in Djerba, an island oasis off the eastern coast of Tunisia. As Daouda explained, her hope is to build a framework to measure tourism growth and its impact on the environment and human health in order to assist policymakers who will have to wrestle with important choices on how to mitigate the sector’s impact in the North African country over the coming years.

Rounding out the event was a presentation from Crystal North, a pulmonologist at Massachusetts General Hospital who has been collaborating with Harvard Chan School researchers to study air pollution in Uganda.

The work involves tracking air quality and following a cohort of HIV-positive and HIV-negative patients in the East African nation. Among the challenges: air quality data from developing countries are relatively sparse, and there are very few sensors in Uganda to measure ambient air quality.

Previously, North’s research has looked at inflammation and lung function in HIV-positive patients, who tend to be at increased risk of tuberculosis. She hopes to build on that with this new research by focusing on whether air pollution and HIV are synergistic in their effects on lung function. “Hopefully in the next year or two we’ll have some initial results to share,” North said.

Categories: Green Homes

The Long Memory of the Pacific Ocean

Green Homes - Mon, 2019-01-07 12:04
January 4, 2019

SEAS Communications

The Long Memory of the Pacific Ocean

By Leah Burrows, SEAS Communications

The ocean has a long memory. When the water in today’s deep Pacific Ocean last saw sunlight, Charlemagne was the Holy Roman Emperor, the Song Dynasty ruled China and Oxford University had just held its very first class. During that time, between the 9th and 12th centuries, the earth’s climate was generally warmer before the cold of the Little Ice Age settled in around the 16th century. Now, ocean surface temperatures are back on the rise but the question is, do the deepest parts of the ocean know that?

Researchers from the Woods Hole Oceanographic Institution and Harvard University have found that the deep Pacific Ocean lags a few centuries behind in terms of temperature and is still adjusting to the advent of the Little Ice Age. Whereas most of the ocean is responding to modern warming, the deep Pacific may be cooling.

The research is published in Science.

"Climate varies across all timescales,” said Peter Huybers, Professor of Earth and Planetary Sciences in the Department of Earth and Planetary Sciences and of Environmental Science and Engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences and co-author of the paper.  “Some regional warming and cooling patterns, like the Little Ice Age and the Medieval Warm Period, are well known. Our goal was to develop a model of how the interior properties of the ocean respond to changes in surface climate.”

What that model showed was surprising.

“If the surface ocean was generally cooling for the better part of the last millennium, those parts of the ocean most isolated from modern warming may still be cooling,” said Jake Gebbie, a physical oceanographer at Woods Hole Oceanographic Institution and lead author of the study. 

The model is a simplification of the actual ocean. To test the prediction, Gebbie and Huybers compared the cooling trend found in the model to ocean temperature measurements taken by scientists aboard the HMS Challenger in the 1870s and modern observations from the World Ocean Circulation Experiment of the 1990s.

The HMS Challenger, a three-masted wooden sailing ship originally designed as a British warship, was used for the first modern scientific expedition to explore the world’s ocean and seafloor. During the expedition from 1872 to 1876, thermometers were lowered into the ocean depths and more than 5,000 temperature measurements were logged.

“We screened this historical data for outliers and considered a variety of corrections associated with pressure effects on the thermometer and stretching of the hemp rope used for lowering thermometers,” said Huybers. 

The researchers then compared the HMS Challenger data to the modern observations and found warming in most parts of the global ocean, as would be expected due to the warming planet over the 20th century, but cooling in the deep Pacific at a depth of around two kilometers.

“The close correspondence between the predictions and observed trends gave us confidence that this is a real phenomenon,” said Gebbie.

These findings imply that variations in surface climate that predate the onset of modern warming still influence how much the climate is heating up today.  Previous estimates of how much heat the Earth had absorbed during the last century assumed an ocean that started out in equilibrium at the beginning of the Industrial Revolution. But Gebbie and Huybers estimate that the deep Pacific cooling trend leads to a downward revision of heat absorbed over the 20th century by about 30 percent.
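As a back-of-the-envelope sketch of that revision: only the 30 percent downward-revision fraction comes from the study; the baseline heat-uptake value below is hypothetical, for illustration.

```python
# Sketch of the downward revision described above. The 30% fraction is
# from Gebbie and Huybers; the baseline heat-uptake value here is a
# hypothetical placeholder in arbitrary units.
baseline_heat_uptake = 100.0  # hypothetical prior estimate (arbitrary units)
revision_fraction = 0.30      # reported downward revision

revised_heat_uptake = baseline_heat_uptake * (1 - revision_fraction)
print(f"{revised_heat_uptake:.0f}")  # 70
```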

"Part of the heat needed to bring the ocean into equilibrium with an atmosphere having more greenhouse gases was apparently already present in the deep Pacific,” said Huybers. "These findings increase the impetus for understanding the causes of the Medieval Warm Period and Little Ice Age as a way for better understanding modern warming trends."

This research was funded by the James E. and Barbara V. Moltz Fellowship and National Science Foundation grants OCE-1357121 and OCE-1558939.

Categories: Green Homes

The Long Memory of the Pacific Ocean

Green Homes - Mon, 2019-01-07 12:04
January 4, 2019

SEAS Communications

The Long Memory of the Pacific Ocean0

By Leah Burrows, SEAS Communications

The ocean has a long memory. When the water in today’s deep Pacific Ocean last saw sunlight, Charlemagne was the Holy Roman Emperor, the Song Dynasty ruled China and Oxford University had just held its very first class. During that time, between the 9th and 12th centuries, the earth’s climate was generally warmer before the cold of the Little Ice Age settled in around the 16th century. Now, ocean surface temperatures are back on the rise but the question is, do the deepest parts of the ocean know that?

Researchers from the Woods Hole Oceanographic Institution and Harvard University have found that the deep Pacific Ocean lags a few centuries behind in terms of temperature and is still adjusting to the advent of the Little Ice Age. Whereas most of the ocean is responding to modern warming, the deep Pacific may be cooling.

The research is published in Science.

"Climate varies across all timescales,” said Peter Huybers, Professor of Earth and Planetary Sciences in the Department of Earth and Planetary Sciences and of Environmental Science and Engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences and co-author of the paper.  “Some regional warming and cooling patterns, like the Little Ice Age and the Medieval Warm Period, are well known. Our goal was to develop a model of how the interior properties of the ocean respond to changes in surface climate.”

What that model showed was surprising.

“If the surface ocean was generally cooling for the better part of the last millennium, those parts of the ocean most isolated from modern warming may still be cooling,” said Jake Gebbie, a physical oceanographer at Woods Hole Oceanographic Institution and lead author of the study. 

The model is a simplification of the actual ocean. To test the prediction, Gebbie and Huybers compared the cooling trend found in the model to ocean temperature measurements taken by scientists aboard the HMS Challenger in the 1870s and modern observations from the World Ocean Circulation Experiment of the 1990s.

The HMS Challenger, a three-masted wooden sailing ship originally designed as a British warship, was used for the first modern scientific expedition to explore the world’s ocean and seafloor. During the expedition from 1872 to 1876, thermometers were lowered into the ocean depths and more than 5,000 temperature measurements were logged.

“We screened this historical data for outliers and considered a variety of corrections associated with pressure effects on the thermometer and stretching of the hemp rope used for lowering thermometers,” said Huybers. 
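The flavor of that screening and correction step can be sketched as follows. The correction magnitudes here are placeholders chosen for illustration, not the values used in the study:

```python
# Hypothetical sketch of the kinds of corrections described above.
# The bias-per-kilometer and rope-stretch numbers are invented.

def correct_measurement(raw_temp_c, nominal_depth_m,
                        pressure_bias_per_km=-0.04,  # assumed degC per km
                        rope_stretch_frac=0.02):     # assumed 2% stretch
    """Apply two corrections to one historical reading:
    1. The hemp rope stretches under load, so the thermometer sat deeper
       than the paid-out length suggests; scale the nominal depth up.
    2. Pressure at depth biases the thermometer reading; remove an
       assumed depth-proportional offset."""
    true_depth_m = nominal_depth_m * (1.0 + rope_stretch_frac)
    corrected_temp = raw_temp_c - pressure_bias_per_km * (true_depth_m / 1000.0)
    return corrected_temp, true_depth_m

def screen_outliers(temps, n_sigma=3.0):
    """Discard readings more than n_sigma standard deviations from the mean."""
    mean = sum(temps) / len(temps)
    sd = (sum((t - mean) ** 2 for t in temps) / len(temps)) ** 0.5
    return [t for t in temps if abs(t - mean) <= n_sigma * sd]
```

For example, a nominal 2,000-meter reading of 2.0 °C would be reassigned to 2,040 meters and nudged warm by the assumed pressure correction; a single wildly high reading in a batch of repeats would be dropped by the screen.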

The researchers then compared the HMS Challenger data to the modern observations and found warming in most parts of the global ocean, as would be expected given the planet’s warming over the 20th century, but cooling in the deep Pacific at a depth of around two kilometers.

“The close correspondence between the predictions and observed trends gave us confidence that this is a real phenomenon,” said Gebbie.

These findings imply that variations in surface climate that predate the onset of modern warming still influence how much the climate is heating up today.  Previous estimates of how much heat the Earth had absorbed during the last century assumed an ocean that started out in equilibrium at the beginning of the Industrial Revolution. But Gebbie and Huybers estimate that the deep Pacific cooling trend leads to a downward revision of heat absorbed over the 20th century by about 30 percent.
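The direction of that revision follows from simple bookkeeping. A sketch with invented magnitudes (only the roughly 30 percent figure comes from the article; the heat units are arbitrary):

```python
# Illustrative bookkeeping only; the magnitudes are hypothetical.

upper_ocean_heat_gain = 100.0               # observed warming, arbitrary units
deep_pacific_change_equilibrium = 0.0       # old assumption: no deep trend
deep_pacific_change_cooling = -30.0         # new estimate: still cooling

# Total heat absorbed = upper-ocean gain plus the deep-Pacific change.
old_total = upper_ocean_heat_gain + deep_pacific_change_equilibrium
new_total = upper_ocean_heat_gain + deep_pacific_change_cooling

# A cooling deep Pacific means less total heat was absorbed than assumed.
revision = (old_total - new_total) / old_total
assert abs(revision - 0.30) < 1e-12   # ~30 percent downward revision
```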

“Part of the heat needed to bring the ocean into equilibrium with an atmosphere having more greenhouse gases was apparently already present in the deep Pacific,” said Huybers. “These findings increase the impetus for understanding the causes of the Medieval Warm Period and Little Ice Age as a way of better understanding modern warming trends.”

This research was funded by the James E. and Barbara V. Moltz Fellowship and National Science Foundation grants OCE-1357121 and OCE-1558939.

Changing Temperatures Boost U.S. Corn Yield — For Now

Green Homes - Wed, 2018-12-05 12:45
November 6, 2018

SEAS Communications

Changing Temperatures Boost U.S. Corn Yield — For Now

By Leah Burrows, SEAS Communications

The past 70 years have been good for corn production in the Midwestern U.S., with yields increasing fivefold since the 1940s. Much of this improvement has been credited to advances in farming technology, but researchers at Harvard University are asking whether changes in climate and local temperature may be playing a bigger role than previously thought.

In a new paper, the researchers found that a longer growing season due to warmer temperatures, combined with the natural cooling effect of large fields of plants, has contributed substantially to improved corn production in the U.S.

“Our research shows that improvements in crop yield depend, in part, on improvements in climate,” said Peter Huybers, professor of Earth and planetary sciences in the Department of Earth and Planetary Sciences (EPS) and of environmental science and engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS). “In this case, changing temperatures have had a beneficial impact on agricultural production, but there is no guarantee that benefit will last as the climate continues to change. Understanding the detailed relationships between climate and crop yield is important as we move toward feeding a growing population on a changing planet.”

The research is published in the Proceedings of the National Academy of Sciences (PNAS).

The researchers modeled the relationship between temperature and crop yield from 1981 to 2017 across the so-called Corn Belt: Illinois, Indiana, Iowa, Kansas, Kentucky, Michigan, Minnesota, Missouri, Nebraska, Ohio, South Dakota, and Wisconsin. They found that as temperatures increased due to global climate change, planting days got earlier and earlier, shifting by about three days per decade.
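A trend like “about three days per decade” is an ordinary least-squares slope fitted to yearly records. A sketch with synthetic data (the study’s actual data and methods are not reproduced here):

```python
# Estimating a planting-date trend with ordinary least squares.
# The planting dates below are synthetic, drifting -0.3 days per year.

def linear_trend(years, values):
    """Ordinary least-squares slope of values against years."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var

years = list(range(1981, 2018))
planting_day = [130.0 - 0.3 * (y - 1981) for y in years]  # day of year

slope_per_year = linear_trend(years, planting_day)
print(round(slope_per_year * 10, 1))  # days per decade → -3.0
```

On real county-level records the same slope would be estimated with noise, so the study’s figure is an average tendency across the Corn Belt rather than a uniform shift.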

“One of farmers’ biggest decisions is what they plant and when they plant it,” said Ethan Butler, a postdoctoral research associate in the Department of Forest Resources at the University of Minnesota, first author of the paper, and a former graduate student in EPS. “We are seeing that farmers are planting earlier — not only because they have hardier seeds and better planting equipment, but also because it’s getting warmer sooner.”

Early planting means the corn has more time to mature before the growing season ends.

There is also a second, more surprising trend that has benefited corn yields. Whereas the vast majority of temperatures have warmed over the last century, the hottest days during the Midwestern growing season have actually cooled.

“Increasingly productive and densely planted crops can evaporate more water from leaves and soils during hot days,” said Nathaniel Mueller, former postdoctoral research fellow at the Harvard University Center for the Environment and co-author of the paper. “Widespread increases in rates of evaporation apparently help shield maize from extreme heat, cooling the surrounding area and helping to boost yields.”

Mueller is currently an assistant professor of Earth system science at the University of California, Irvine.

The researchers estimate that more than a quarter of the increase in crop yield since 1981 can be attributed to the twin effects of a longer growing season and less exposure to high temperatures, suggesting that crop yields are more vulnerable to climate change than previously thought.

The researchers also show that the planting and harvest dates farmers currently use are significantly better adapted to the present climate than they would have been to climates in earlier decades.

“Farmers are incredibly proactive and we’re seeing them take advantage of changes in temperature to improve their yield. The question is, how well can they continue to adapt in response to future changes in climate,” said Huybers.

This research was supported in part by the Packard Foundation and the National Science Foundation.

Image: Pixabay

Getting from No Nuclear to Slow Nuclear

Green Homes - Wed, 2018-12-05 10:07
December 4, 2018

The Harvard Gazette

Getting from No Nuclear to Slow Nuclear

By Alvin Powell, Harvard Staff Writer

Harvard scientists say that low-carbon nuclear power may eventually gain greater support in the United States, suggesting that new, more economical plants could play an important role in the country’s energy production midcentury and beyond.

Beset by high construction costs and undercut by cheaper natural gas, wind, and even solar power, the nation’s nuclear fleet is struggling, with nuclear power producing about 20 percent of U.S. electricity today. Plant development is rare and economically risky, while the pace of retirements is increasing, driven by aging infrastructure and red ink.

Environmental fellow Michael Ford and climate scientist Daniel Schrag say those conditions are unlikely to change soon, but that the low-carbon power provided by nuclear plants may prove an important part of a future energy mix, one designed to fight climate change.

“Certainly right now, the existing fleet is struggling,” Ford said. “There are quite a few plants that are under market pressure, many early closings. … And if you pay any attention to attempts to deploy new nuclear in this country, efforts in South Carolina failed in 2016 and the Vogtle construction project in Georgia is something on the order of 100 percent over budget and years behind schedule.”

The central challenge, according to a paper Ford and Schrag published in October in the journal Nature Energy, is ensuring that acceptable technology — safe, cheaper, more efficient — is available to be deployed by midcentury, when market trends may again make nuclear power competitive with other sources.

At the moment, the main barrier for American nuclear projects is “the high price of new development and market economics, driven by cheap renewables and natural gas — combinations that are somewhat unique to the U.S.,” said Ford, whose doctoral work at Carnegie Mellon University focused on the state of the U.S. nuclear industry.

Ford and Schrag recommend that the U.S. government initiate a “tortoise” approach, investing in steady development of a range of advanced nuclear technologies, evaluating each so that the best option will be ready to go when need and market conditions arise.

“We believe the likely timescale for the demand for this technology is still at least a couple of decades away,” said Schrag, the Sturgis Hooper Professor of Geology, a professor of environmental science and engineering, and director of the Harvard University Center for the Environment.

In his work with the center, Ford, a retired Navy captain and nuclear engineer, has developed models of possible futures for the industry and is now creating a more detailed version of the proposal outlined in the October paper.

The industry itself has pinned its hopes on developing a generation of smaller, less-expensive reactors that could be built more quickly on site, he said. A number of companies are betting on new nuclear designs to get there — technologies that use substances other than water for cooling, such as liquid sodium or molten salts.

Ford and Schrag warn that there’s a risk in rushing ahead with a single advanced technology — what they call the “hare” approach — because it’s hard to know which of several options will work well enough to be deployable. In addition, whatever the technology, market conditions affecting nuclear power today are going to take decades to shift, they say.

“The idea of developing advanced nuclear like the Manhattan Project actually doesn’t fit the pace of the problem,” Schrag said. “What we’re trying to explain is that there are a sequence of steps in the decarbonization process [of the electricity grid], and absolutely we can accelerate them, but those steps are unlikely to change.”

Cheap and relatively clean natural gas will likely drive the closing of most of the nation’s coal-fired plants in the coming decades, Schrag said. At the same time, renewable power from wind and solar — both of which have become cheaper than nuclear in recent years — will likely take a larger share of the power grid.

Market conditions could shift in nuclear’s favor as the intermittent nature of wind and solar creates higher demand for new sources of support.

Though market conditions will initially favor natural gas in that role, gas is likely to become more expensive as demand increases, say Ford and Schrag, adding that pressure to include carbon capture technology could also boost the cost of electricity produced by natural gas plants.

“A new demand for nuclear power in the U.S. is likely to come only when natural gas is expensive or considered dirty or both,” Schrag said.

Ford and Schrag suggest that the U.S. direct research and development support to advanced nuclear designs and adopt a strategy that provides $200 million to $250 million annually to develop four or five promising technologies. Policy makers should also commit to providing additional funding — in partnership with industry — to demonstration plants that can scale up and test competing technologies, they say.

Schrag said the amount needed is relatively small in the context of overall energy spending, making it more likely to survive budget battles and swings in priorities between Republican and Democratic administrations.

Nonetheless, advanced nuclear won’t have a guaranteed spot in the nation’s energy mix, the scientists say. The industry would still have to compete with technologies such as carbon capture and large-scale battery storage.

“We’re saying nuclear needs to be an option, we’re not saying that nuclear has to win that competition,” Schrag said.

Image: Jon Chase/Harvard Staff Photographer
