Eating Our Way to a Sustainable Future

March 13, 2019

The Harvard Gazette

By Alvin Powell, Harvard Staff Writer

Can we eat our way out of some global environmental problems?

That’s a question asked by author Paul Greenberg, who has made his career writing about the problems of the seafood industry and seafood’s potential — with careful management and a shift in consumer practices — to be the foundation of a healthier diet, promote more sustainable use of the environment, and even reduce carbon emissions.

Greenberg, the author of three books on fish, seafood, and the fishing industry, said one health benefit of eating more seafood is consuming more of the omega-3 fatty acids it contains. Those fats, also marketed commercially as supplements, have long been suspected of having heart-healthy effects. Results last fall from the VITAL study, led by JoAnn Manson, the Michael and Lee Bell Professor of Women’s Health at Harvard Medical School and Brigham and Women’s Hospital, showed that taking omega-3 supplements produced large reductions in cardiovascular risk for those who had little fish in their diet.

But “eating seafood” today means something different than it did just a few generations ago, Greenberg said. Where diets once varied with the locally available catch, recent decades have seen seafood become standardized, with the industrial focus narrowing to four foods: tuna, shrimp, salmon, and several species lumped together as “whitefish.”

Those species can be very energy-intensive to harvest, and rediversifying the diet to increase the consumption of things like mussels and seaweed — which are both high in omega-3s and use much less energy to produce and process — can ease carbon emissions related to seafood harvesting.

Another strategy, Greenberg said, would be to utilize the 20 million to 30 million metric tons of smaller fish — currently ground up to use as fertilizer, for pig and chicken feed, and in aquaculture — for direct human consumption. That would increase the efficiency of a system that currently expends a lot of energy harvesting fish to feed them to something else that humans then eat.

With the planet’s population slated to continue growing, future challenges include not just shifting toward healthier and less-energy-intensive diets, but also simply providing more food. Greenberg said the aquaculture industry has the potential to meet this need and could reduce current problems of near-shore pollution by co-locating its pens at offshore wind farms.

Greenberg, whose talk shared the title of his latest book, “The Omega Principle: Seafood and the Quest for Long Life and a Healthier Planet,” spoke at Harvard’s Science Center on Tuesday afternoon. His lecture was followed by a discussion with three Harvard nutrition and seafood experts: Professor of Epidemiology and Nutrition Walter Willett, Assistant Professor of Nutrition and Planetary Health Christopher Golden, and Assistant Professor of Medicine Susan Korrick. The event was presented by the Harvard University Center for the Environment.

Their discussion, Willett said, goes to the heart of what will be one of the major challenges facing humanity over the next century: feeding a growing population in a sustainable way.

While omega-3 fatty acids are an important nutritional component of seafood, Willett said questions remain as to how much is needed for optimum health.

Studies have shown that omega-3 supplements can reduce cardiovascular risk among those whose diets contain little fish, he said, but it’s likely that the effect plateaus and adding more beyond that to the diet would be of little use. At high enough doses, he said, some nutrients become toxic. Salt, for example, is an essential nutrient, but Willett said it’s likely we’ve gone beyond the plateau of beneficial effects and typically eat too much.

“Like many nutrients, there’s not a linear relationship between intake and how healthy we are,” Willett said. “There are many things that are essential, but we don’t need more.”

Resource-poor parts of the world are facing a somewhat different scenario, Golden said. In places where locally caught fish provide important nutrients in the diet, fish populations are expected to shift and body sizes to decline due to climate change, portending difficult times. Fish provide not just calories and protein for more than a billion people, but also micronutrients that are absent from the tubers and grains that are likely to take their place in the diet.

“It’s very troubling, in my opinion, to look at these types of statistics,” Golden said.

In addition to nutrients, seafood also contains pollutants that concentrate as they move up the food chain, Korrick said. Those pollutants, such as mercury, have long been known to be a risk of marine foods, but the use of fish meal to feed land animals like pigs and chickens extends that risk to the terrestrial food chain as well.

“If there’s the political will and the political interest to really rethink and reimagine our food supply, considering contamination is a critical piece of that process,” Korrick said.

Despite the many challenges, Greenberg contended that increasing seafood consumption, heightening use of aquaculture, and shifting toward less-energy-intensive foods point the way toward a sustainable future.

“We could end up with a planet that is more balanced and perhaps a human body that’s more balanced,” he concluded.

The Blue-Green Revolution

March 13, 2019

Harvard Business School

By Alexander Gelfand; illustration by Eric Nyquist

Much hope and plenty of money are riding on the idea that battery-powered electric cars will help slow global warming by reducing tailpipe emissions. But when it comes to reducing the greenhouse gases produced by heavy transportation—namely, the trucks, planes, trains, and ships that move large volumes of goods and people long distances—humanity’s best bet might lie with overweight algae. And staving off the climate apocalypse could be just the beginning.

That’s the premise underlying a decadelong joint effort to develop algae biofuel by ExxonMobil and Synthetic Genomics Inc. (SGI), a private biotech company cofounded by genomics pioneer Craig Venter and Nobel Laureate Hamilton Smith, together with writer and life sciences investor Juan Enriquez (MBA 1986). And it may soon come to fruition: Last March, in the wake of a scientific breakthrough by SGI, the two companies committed to producing 10,000 barrels of algae biofuel a day by 2025.

According to Enriquez, who directed HBS’s Life Sciences Project prior to his current role as managing director of Excel Venture Management, algae biofuels were once the darlings of the alternative energy sector. That’s because the aquatic microorganisms use sunlight, water, and carbon dioxide to photosynthesize sugar, proteins, and fat—the latter in the form of an oil that can replace fossil fuels in applications where batteries either can’t store enough power or are simply too heavy to lug around, like commercial aviation and maritime shipping.

In addition, algae can grow in salty or brackish water under extremely harsh conditions; so unlike other biofuel feedstocks such as corn and soy, algae don’t need to compete with agricultural crops for fresh water and arable land. And the oil they produce is free from the pollutants that must be removed from fossil crude.

As a result, algae could pull fossil-fuel-generated CO2 out of the atmosphere and transform it into nearly carbon-neutral diesel or jet fuel with minimal environmental impact — a handy trick when demand for transportation energy is on the rise and the need to manage global emissions grows ever more urgent.

The prospect of a clean energy source that could serve double duty as a carbon-capture technology has proven irresistible to investors, who have sunk hundreds of millions of dollars into dozens of algae biofuel startups. Unfortunately, says HBS Senior Fellow Joseph Lassiter, whose research focuses on developing carbon-neutral energy supplies, efforts to produce algal crude cheaply and efficiently have met with nothing but failure. The reason: basic biology.

One can easily persuade algae to produce more oil by starving them of nutrients like nitrogen, prompting the single-celled organisms to bulk up on fat like bears preparing for winter. Alas, just like bears, the microscopic butterballs eventually go into hibernation. And once that happens, they stop growing, negating the gains made in oil production.

SGI solved that biological catch-22 by genetically engineering algae to get fat without going comatose. As a result, “You can take the brakes off oil formation without putting the brakes on growth,” says SGI’s CEO, Oliver Fetzer.

In a study published in Nature Biotechnology in 2017, SGI researchers analyzed the genome and metabolism of the marine alga Nannochloropsis gaditana and uncovered a group of genes responsible for regulating oil production. By tweaking one of those genes with the powerful editing tool known as CRISPR, the team ultimately doubled the amount of oil produced by the algae without significantly hindering their growth.

SGI’s breakthrough finally provides a line of sight to a scalable algae biofuel. The company is already growing algae in outdoor ponds at a test facility near California’s Salton Sea, and Fetzer envisions a day when large pools of algae will be located wherever saltwater and consistently warm temperatures are to be found. The ponds could even be parked next to heavy CO2 emitters like cement factories and power plants so that the organisms can suck up excess carbon while churning out clean, renewable biocrude.

Now that the problem of boosting oil production has been cracked, engineering algae to make petrochemicals ranging from fertilizers to plastics ought to be relatively straightforward. What’s more, the knowledge gained from the biofuels project should eventually permit researchers to turn algae into microscopic factories for the manufacture of virtually any organic compound, leading to what Enriquez describes as a full-blown algal revolution. “You can make vaccines in the stuff, you can make medicines in the stuff, you can make food in the stuff,” he says.

Capitalizing on its burgeoning algal expertise, SGI has already bred one strain that can make high-quality protein and healthful fatty acids, and it hopes to coax others into producing biological drugs such as the antibodies used to treat cancer and autoimmune diseases.

Right now, however, the biofuel breakthrough is generating the most buzz—and for good reason, given the looming possibility of catastrophic climate change and the desperate need for fossil-fuel substitutes.

Fetzer hopes to have a pilot facility up and running by 2025 that can meet ExxonMobil’s production goals while sucking CO2 from a heavy polluter. He readily admits that challenges remain—like figuring out how best to extract the algae from their ponds and expel their oil—but the goal of producing algae biofuel that can compete with traditional diesel is finally within reach.

Finding the Right "Dose" for Solar Geoengineering

March 11, 2019

SEAS Communications

By Leah Burrows, SEAS Communications; Photo by chuttersnap on Unsplash

One of the key misconceptions about solar geoengineering — putting aerosols into the atmosphere to reflect sunlight and reduce global warming — is that it could be used as a fix-all to reverse global warming trends and bring temperature back to pre-industrial levels.

It can’t. Applying huge doses of solar geoengineering to offset all warming from rising atmospheric CO2 levels could worsen the climate problem — particularly rainfall patterns — in certain regions. But could smaller doses work in tandem with emission cuts to lower the risks of a changing climate?

New research from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), in collaboration with MIT and Princeton University, finds that if solar geoengineering is used to cut global temperature increases in half, there could be worldwide benefits without exacerbating change in any large geographic area.

“Some of the problems identified in earlier studies where solar geo-engineering offset all warming are examples of the old adage that the dose makes the poison,” said David Keith, the Gordon McKay Professor of Applied Physics at SEAS and senior author of the study. “This study takes a big step towards using climate variables most relevant for human impacts and finds that no IPCC-defined region is made worse off in any of the major climate impact indicators. Big uncertainties remain, but climate models suggest that geoengineering could enable surprisingly uniform benefits.”

The research is published in Nature Climate Change.

To better understand what regions could experience worse climatic conditions if solar geoengineering were combined with emissions cuts, the researchers used a state-of-the-art high-resolution model to simulate extreme rainfall and tropical cyclones (a.k.a. hurricanes). It’s the first time such a model has been used to study the potential impact of solar geoengineering.

Researchers looked at temperature and precipitation extremes, water availability, and a measure of the intensity of tropical storms. They found that halving warming with solar geoengineering not only cools the planet everywhere but also moderates changes in water availability and extreme precipitation in many places and offsets more than 85 percent of the increase in the intensity of hurricanes.

Less than 0.5 percent of the land would see the effects of climate change exacerbated, according to the model.

“The places where solar geoengineering exacerbates climate change were those that saw the least climate change to begin with,” said Peter Irvine, Postdoctoral Research Fellow at SEAS and lead author of the study. “Previous work had assumed that solar geo-engineering would inevitably lead to winners and losers with some regions suffering greater harms; our work challenges this assumption. We find a large reduction in climate risk overall without significantly greater risks to any region.”

The researchers are quick to point out that this is a simplified experiment, which assumed doubled CO2 concentrations and represented solar geoengineering by turning down the sun. However, it is a first step towards understanding how solar geoengineering could be used in tandem with other tools to mitigate some of the worst impacts of climate change.

"For years, geoengineering has focused on compensating for greenhouse gas induced warming without worrying too much about other quantities like rainfall and storms,” said Kerry Emanuel, the Cecil & Ida Green Professor of Atmospheric Science at MIT and co-author of the study.  “This study shows that a more modest engineered reduction in global warming can lead to better outcomes for the climate as a whole."

“The analogy is not perfect but solar geoengineering is a little like a drug which treats high blood pressure,” said Irvine. “An overdose would be harmful, but a well-chosen dose could reduce your risks. Of course, it’s better to not have high blood pressure in the first place but once you have it, along with making healthier lifestyle choices, it’s worth considering treatments that could lower your risks.”

This research was co-authored by Jie He, Larry W. Horowitz, and Gabriel Vecchi.

New Faculty: Bruno Carvalho

March 6, 2019

The Harvard Gazette

By Jill Radsken, Harvard Staff Writer

Bruno Carvalho has published research on topics ranging from environmental justice and race to city planning and literature. His award-winning “Porous City: A Cultural History of Rio de Janeiro” made the case for his native city as a place of cultural history defined by porous spaces and structural inequalities. Carvalho earned his Ph.D. in Romance Languages and Literatures at the Graduate School of Arts and Sciences in 2009. He is co-editor of the book series “Lateral Exchanges,” about architecture and urbanism in a global context.

Q&A

GAZETTE: Can you talk about your research?

CARVALHO: My research is a bit wide-ranging, but it broadly focuses on cities as lived and imagined spaces, especially in Brazil. I’m beginning work on a cultural history of futures. We can think of much of modernity in terms of competing visions of what the future ought to be like. In the 1920s, for example, some vied for car-centric and highly segregated cities, others for mixed-race, multicultural utopias. In contrast, today, with the realities of climate change and labor precarity setting in, it often seems as if a dreadful future is inevitable. Urban visions have become in many ways more modest and contingent. Contemporary urbanism absorbed important lessons from the failures of top-down, authoritarian, modernist projects, and though that’s of course a good thing, the daunting scale of our challenges demands that we conceive transformations with imagination and ambition.

Reflecting on how the future was conceived in the past can help to expand the realm of possibilities. Most of my research brings together perspectives from the social sciences, design, and cultural materials. Art, literature, film, and historical knowledge can all push us to confront entrenched intuitions and stretch the limits of the thinkable. We shouldn’t assume that cities are doomed to the levels of segregation so common in the United States today, nor that this will solve itself. The study of the past can act as an antidote to a type of conformity that our dire problems sometimes produce.

A lot of people already get that a certain status quo is untenable, whether it’s fossil-fuel dependency, hyper-concentration of wealth, or the war on drugs. But as I like to remind my students, even when the scale of needed changes seems unviable, unforeseen social, political, and technological transformations happen as a norm.

I also have a body of research on the 18th century, which includes publications on topics like the emergence of anti-black racism in scientific thought, and on how the circulation of French translations of U.S. constitutional documents played a role in failed independence movements in Brazil. The Enlightenment sometimes appears to nonspecialists as this pivotal, but sort of flattened phenomenon — in some circles tied to progress and freedom, in others to Eurocentrism and exploitation. Rethinking the Enlightenment from the perspective of Brazil helps to foreground some of its tensions and contradictions, and allows us to trace the formation of modern ideas about sovereignty and individual liberty, as well as about race and white supremacy, cities as harbingers of civilization, and nature as a resource rather than something to which we belong.

GAZETTE: Last fall, you taught a seminar called “Writing and Urban Life”; this spring you’ll teach a graduate seminar called “Imagine Futures,” and you have a Gen Ed course debuting in two years. What are they about?

CARVALHO: The fall seminar was a great welcome to Harvard. It brought together a wonderful group of graduate and undergraduate students. Writing and urbanization have entangled histories, and have become central to the very ways in which we constitute subjecthood in the modern world. Both are relatively recent developments in our history as a species. We discussed some very large questions and reviewed canonical debates, but we also concentrated on a set of authors from the past century or so, mostly from Brazil, who in various ways push us to “denaturalize” a lot of what we tend to take for granted about urban life. Much of the writing we analyzed is attuned to the strangeness in familiar modes of being, as well as to the perils, promises, and potentials of this massive experiment in which urbanites are now engaged: living closely in density among strangers. That’s just not how most humans before us did it.

This spring’s seminar will look at how past ideas of what the future should look like have helped to shape cities in all sorts of material ways — say, for example, associations between order and progress with geometric patterns like the grid. We’ll also try to recover ideas that largely lost out but can resonate today, such as challenges to human exceptionalism that contemplate the place of other life forms in the worlds we’ve built. We will discuss how in the history of planning, the unplanned and even the improbable happen often, and how the future is, by definition, out of reach and, therefore, always in a way imagined. We will focus on Brazil; the country, like the Americas as a whole, is a fertile space to generate these reflections because so many futures were projected onto it throughout colonial and modern history. Brazil has been alternately conceived as Edenic and dystopian. We’ll focus on historical turning points in urbanization and culture and try to understand their specificities, but we won’t lose sight of our current predicaments. After all, our collective planetary futures are very much at stake in regions like the Amazon, which is now a contested site for different visions of what the world ought to be like.

Next academic year, when Neil Brenner, professor of urban theory at GSD, is back from sabbatical, we will co-teach a Gen Ed course called “Living in an Urban Planet.” So even though it’s already a cliché to say that more than half of the world population lives in cities, we actually tend to underestimate how much of the planet urbanization encompasses. If we think of energy systems and refuse, for example, or even the circulation of urban cultural production, where do our cities end? In this course we will discuss urban transformations at various scales, from the planetary to the sidewalk. It’s been stimulating to work with Neil on this. We share a number of interests, and tend to approach related questions in very different but complementary ways.

GAZETTE: You are leading an effort to create a secondary field in urban studies, something you were involved in at Princeton. Is it meant to be a cross-disciplinary effort, and what kinds of conversations do you hope will come from it?

CARVALHO: My dream is to build a program in urban studies like our cities at their best: places of intellectual exploration, encounters with difference, lively exchanges. Urban experiences, much like a liberal arts education, can expose us to multiple ways of being and belonging in the world. They can move us to step outside of ourselves, to inhabit multiple perspectives, to exceed our assigned roles. Because there are already so many wonderful urban-related courses, and because there is no formal urban studies curriculum outside of the professional schools, we have the opportunity to build something really special. An urban studies curriculum can bring together students and faculty with mutual interests, but whose paths might not cross otherwise. At Princeton, we built a thriving program, and saw how it had transformative potential, especially for undergrads. Urban studies can introduce students to very basic facts about the world around them that they might not otherwise learn, like the role of housing segregation in the U.S. wealth gap. It can introduce issues like inequality in resonant ways. Urban studies presents opportunities for poets and engineers to discuss different standards for value, or for anthropologists and computer scientists to rigorously debate the blind spots and uses of big data or GIS.

An institutional space around the urban could help us to break down siloes, building links across disciplinary and geographic boundaries. Neil, Eve Blau (GSD), and I are working with several colleagues on addressing some of these issues by reviving the Harvard Mellon Urban Initiative, which Eve and Julie Buckler, Samuel Hazzard Cross Professor of Slavic Languages and Literatures and of Comparative Literature, created as part of a grant funded by the Mellon Foundation.

GAZETTE: There has been so much challenging news coming out of Brazil, from the devastating National Museum fire to the recent presidential election. What are your thoughts about the cultural future of your homeland?

CARVALHO: Brazil’s cultural landscapes are full of dynamism and utopian yearnings that have worked to destabilize structures of inequality, broadening the horizons of possibility. As elsewhere, we have recently seen extremist political movements take advantage of a very understandable sense of disillusionment and frustrations with futures that never arrived. Early in Brazil’s election, when not many expected surprises, I wrote a long essay on the appeal of politicians positioning themselves as anti-establishment, promising a return to a fantasy-based past, and of groups that have turned digital tools like YouTube and WhatsApp into engines for far-right radicalization and for the spread of misinformation. I think we cannot underestimate the grave threats to the environment, to a free press, to research and education and to vulnerable populations in Brazil, including indigenous groups. But there are many people fighting for democracy too.

The least important thing is setting ourselves up to say “I told you so.” We have to continue standing up for evidence-based approaches to our problems, but that won’t be enough. We also need to nurture alternative, inclusive visions for the future. One person who did that brilliantly was Marielle Franco, a young, Afro-Brazilian native of a Rio de Janeiro favela who was elected to the city council and was assassinated last year. Sidney Chalhoub (professor of history and African and African American studies) and I are planning an event here at Harvard with feminist leaders and former colleagues to celebrate her legacy. We do not yet know for certain who was behind her murder, but we know that the last electoral cycle empowered some individuals who mocked or made light of her death.

There are also renewed threats to the Amazon in growing deforestation and attacks on indigenous people. Brian Farrell, director of the David Rockefeller Center for Latin American Studies, Monique and Philip Lehner Professor for the Study of Latin America, and professor of biology; postdoc Bruno de Medeiros; and I are collaborating on a conference called “Amazonia and Our Planetary Futures.” We are assembling specialists from government, the private sector (including biodiversity economies), scientists, and indigenous leaders. It’s all hands on deck to avert catastrophe and create better futures!

Mountains and the Rise of Landscape

February 11, 2019

Harvard Graduate School of Design

To ask when we started looking at mountains is by no means the same as asking when we started to see them. Rather, it is to question what sorts of aesthetic and moral responses, what kinds of creative and reflective impulses, our newfound regard for them prompted. It is evident enough that in a more or less recent geological time frame mountains have always just been there. It is possible that mountains, like the sea, best provide pleasure, visual and otherwise, when experienced from a (safe) physical and psychical distance. But it might also be the case that the pleasures mountains hold in store are of a learned and acquired sort.

Which is also to say that mountains, for all their unforgiving thereness, are themselves the products of unwitnessed Neptunian and Vulcanian tumults or divine judgment. For the late seventeenth-century theologian and cosmogonist Thomas Burnet, mountains were “nothing but great ruins.” A dawning appreciation of these wastelands appeared in the critical writings of John Dennis. Satirized as “Sir Tremendous Longinus” for his rehabilitation of the antique aesthetic category of the sublime, Dennis expressed the complex concept of “delightful horror.” Mountain gloom was ready to become mixed with mountain glory. More work was still to be done on the literary and philosophical front before the Romantic breakthrough, one high vantage point being the essayist Joseph Addison’s dream of finding himself in the Alps, “astonished at the discovery of such a Paradise amidst the wildness of those cold hoary landscapes.”

But a kindred innovation in seeing and feeling was called for in the formation of mountains and the rise of landscape. Mountains, among other earth forms, are both the medium and outcome of still-evolving habits of experiencing, making, and imagining. Architects and landscape architects, mutually occupied with the horizontal surface, have had a touch as searching as that of mountaineers and poets in sensing the terrain. Mountains and the Rise of Landscape is the culmination of a curatorial project and a research seminar conducted at the Graduate School of Design, the latter focusing on the question, How do you model a mountain? The installation, in the Druker Design Gallery and continuing in the Frances Loeb Library, collects diverse objects and scientific instruments, drawings, photographs, and motion pictures of built and imagined projects and presents invitingly challenging modes of seeing (and hearing!) mountains of varied definition. Allied with the work of artists, visionaries, and interpreters of natural and cultural meaning, they propose new and foregone possibilities of perception and form-making in the acts of leveling and grading, cutting and filling, shaping and contouring, mapping and modeling, of reimagining “matter out of place,” and finally of stacking the odds and mounting the possibilities.

Mountains and the Rise of Landscape offers five thematic sections:

Mountain Lines of Beauty: Mild mountaineers, John Ruskin (1819–1900) and Eugène Viollet-le-Duc (1814–1879), were among the first theorists, designers, and architects to teach us that the “line” formed by crests, peaks, and ridges presents an exemplary form of beauty. The invention of the panorama and the photomechanical reproduction of mountain views transformed a geophysical phenomenon into an object of aesthetic value and topographical knowledge. Guidebooks, geographical manuals, and maps glorified specific ranges by showing their most beautiful contours. To define a single mountain or group of mountains as a “line,” however, implies a process of abstraction. This process is both enhanced and complicated by contemporary tools such as CAD, GIS, and GPS. To draw the most important mountain ranges of our contemporary world as “mountain lines of beauty”—the phrase is evidently inspired by the eighteenth-century painter William Hogarth’s analysis of the serpentine, S-shaped “line of beauty”—should not be seen as a simplified way to represent them. Rather, the difficulty of representation itself becomes visible in the constructedness of the lines. Through them the possibility arises, again, of our being surprised by the sublimity, as well as the beauty, of the mountains.

Artificial Mountains: Artificial mountains are a worldwide phenomenon. Burial sites, such as Etruscan tumuli, were often marked by the intimidating form of the man-made mountain. Incense burners in ancient China evoked the Five Sacred Mountains. A representation of Mount Parnassus was a significant element of European gardens and a symbol of Renaissance humanism. Artificial mounds, typically composed of locally excavated material, may be seen as so many milestones in the history of landscape architecture. The industrial revolution accelerated the rise of an anthropic topography, producing landforms that we often no longer recognize as being artificial. Mountains are ubiquitous in twentieth-century and contemporary art, with a special place—between site and non-site—reserved for the explorations of Robert Smithson, who reversed, displaced, and rebuilt the form, material, and meaning of mountains.

Camouflage: Among the first who climbed high mountains in antiquity were members of the military, in search of an advantageously elevated view of their enemy’s position. From the seventeenth century onward, many mountainous regions were massively fortified, with military infrastructures placed strategically to take advantage of their secluded impregnability. The photographer Leo Fabrizio has documented traces of former military constructions hidden in the most remote areas of Alpine Switzerland. His visual archeology of camouflage techniques employed by the Swiss military exposes the unfamiliar territory of a landscape that still appears “natural” while being completely transformed from within.

Glaciers: Glaciers are in retreat throughout the world. Celebrated and studied during the eighteenth century as sublime objects—sung of by poets and depicted by landscape painters—glaciers register today as metonymies of global climate change and vanishing natural and scenic phenomena. Geneva-based composers Olga Kokcharova and Gianluca Ruggeri have explored the fascinating soundscape of the Mont Miné Glacier in the Swiss canton of Valais. Since 2000, the 4.9-mile-long glacier has lost about eighty-five feet per year. To hear the “voice” of a glacier compellingly questions the visual bias of the landscape-oriented perspective. The mysterious sounds of the white masses bear melancholy aural testimony to the progressive disappearance of a titanic natural feature.

Inhabitants of the Alpine regions have practiced transhumance for centuries, droving livestock between the valleys in winter and the high mountain pastures in summer. Many of the wooden or stone structures built by farmers to shelter their cattle and themselves have been abandoned, ruined, and in some instances transformed into chalets. Martino Pedrozzi, a Ticino-based architect, has worked for a decade in the remote valleys of southern Switzerland. His Recompositions, carried out with his students at the Mendrisio Academy of Architecture and other volunteers, consist in repairing the existing structures or in composing a new object from the abandoned material. The resulting architectural objects are designedly functionless; they are poetic metaphors and visual documents of a past that is at risk of disappearance.

And Now, Land May Be Sinking

February 20, 2019

The Harvard Gazette

By Peter Reuell, Harvard Staff Writer

In the coming decades, cities and towns up and down the eastern seaboard will have to come to terms with the impact of rising sea level due to climate change. A new study, however, is suggesting that rising sea levels may be only part of the picture — because the land along the coast is also sinking.

That’s the key finding of Professor of Earth and Planetary Sciences Peter Huybers, Frank B. Baird Jr. Professor of Science Jerry Mitrovica, and Christopher Piecuch, an assistant scientist at the Woods Hole Oceanographic Institution, who used everything from tide gauges to GPS data to paint the most accurate picture ever of sea-level rise along the east coast of the U.S. The researchers are co-authors of the study, recently published in Nature.

“What we are seeing at a large scale, and this was a surprise to me, is a very clear pattern that you would expect if the response to the last ice age were the primary control on the differential rates of sea-level rise across the eastern U.S.,” said Huybers. In other words, between 20,000 and 95,000 years ago, the Laurentide Ice Sheet, which covered most of northern North America, levered the land upwards. “Now, thousands of years after the ice is gone,” Huybers said, “the mid-Atlantic crust is still subsiding.

“In New England, there is not too much additional sea-level rise from land motion because it’s near the hinge point,” he continued. “The bulge caused by the ice sheet was centered on the mid-Atlantic, and because it’s still settling down, the relative rise of sea level in the mid-Atlantic is about twice the global average.”

What that means, Huybers said, is that we need to prepare for greater rates of relative sea-level rise along the mid-Atlantic because of the combined effects of the natural subsidence of the land and human-caused rises in sea level.

“The fact that the mid-Atlantic is subsiding because of long-term geologic processes means that it will continue for centuries and millennia, in addition to whatever other changes in sea level occur,” Huybers said. “The mid-Atlantic is already having to cope with routine coastal flooding, and this problem is only going to get worse with time.”

Developing estimates of how much various factors contribute to sea level rise, however, is easier said than done.

“Sea level is a noisy place,” said Mitrovica. “Tides go up and down, waves crash, there is ice melting, ocean circulation changes, the warming of the ocean. … If you want to understand sea level in its totality, you need to know what all those factors are doing.”

One of the first researchers to attempt that feat, he said, was Carling Hay, a former postdoctoral fellow in Mitrovica’s lab and now an assistant professor at Boston College.

In 2014, while at Harvard, Hay published a groundbreaking study that used advanced statistical techniques to sift through dozens of data sets and factors influencing sea-level rise. She came to the surprising conclusion that during the 20th century, sea levels had risen more slowly than many had estimated.

“Unfortunately, what that means is that if sea levels weren’t rising as fast as we thought in the 20th century, they have been going up significantly faster than we thought over the last 20 years,” Mitrovica said. “That was a real demonstration of the power of statistical work in a field where it had not been very common.”

With the new study, Mitrovica said, Piecuch took that idea and ran with it. But rather than trying to estimate worldwide sea level rise over the past century, he chose to home in on one particular region over a shorter time period.

“So he can use all sorts of data sets,” Mitrovica said. “He can use GPS, which tells you how the land is moving, but he’s also got sea-level data going back several thousand years, tide gauges, and other data. He throws all that into the stew … and asks where the east coast is going and what’s contributing to that change. What Chris has done is solve this long-standing, 30-year problem.”

But the work, Mitrovica pointed out, is part of a trajectory. “The next thing that’s going to happen is we will be able to bring in satellite data and we can step back and look at this globally,” he said. “And I think for the first time we may be able to separate out the various contributors to sea-level rise.”

“There is a rather confusing montage of possibilities for why sea level could be changing,” Huybers added. “What Chris has done is to pull together disparate information that was distributed across a number of locations and different time intervals and put it together in a fully probabilistic way, allowing for better estimates of historical rates of sea-level change and how the ongoing response to the last ice age will contribute to future changes.”
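
The study’s actual statistical machinery is far richer than this, but the core idea of pooling scattered measurements of different quality, weighting each by its uncertainty, can be illustrated with textbook inverse-variance averaging. The numbers below are purely hypothetical stand-ins for the kinds of records mentioned above (tide gauges, GPS, longer proxy data):

```python
import numpy as np

# Hypothetical, illustrative numbers only: three independent estimates of
# a local rate of sea-level rise (mm/yr), each with its own uncertainty
# (one standard deviation), standing in for tide-gauge, GPS, and proxy data.
estimates = np.array([3.2, 2.6, 3.0])
sigmas = np.array([0.4, 0.8, 0.3])

# Weight each record by the inverse of its variance: noisier records
# count for less, sharper records for more.
weights = 1.0 / sigmas**2
combined = np.sum(weights * estimates) / np.sum(weights)
combined_sigma = np.sqrt(1.0 / np.sum(weights))

print(f"combined rate: {combined:.2f} +/- {combined_sigma:.2f} mm/yr")
# combined rate: 3.03 +/- 0.23 mm/yr
```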

Think Different, Act More

February 19, 2019

The Harvard Gazette

By Clea Simon, Harvard Correspondent

The threat of climate change is dire, but Hal Harvey sees a path forward.

In “Getting to Zero on Climate Change,” a stirring presentation recently at the Harvard University Center for the Environment, Harvey, the CEO of Energy Innovation Policy and Technology in San Francisco, stressed both the urgency of the problem and specific steps that could, he said, make the difference between accelerating toward destruction or innovating toward prosperity.

Citing “a massive problem with horrifying dimensions,” Harvey, the co-author of “Designing Climate Solutions: A Policy Guide for Low-Carbon Energy,” said that today, “extremes have become the norm.” From drought and wildfires to unprecedented floods and cold snaps, he detailed the effects we have already begun to experience. More terrifying is how close we are to climate tipping points, such as the thawing of the tundra — what used to be known as “permafrost” — which would release massive amounts of methane and carbon.

“After a certain point you unleash natural systems and there’s no going back,” he said.

But Harvey also laid out a series of steps that could counteract the problem, including policy recommendations that could be effected by concerned citizens.

Rejecting small-scale, feel-good campaigns — “We shouldn’t have a strategy about plastic straws,” he said — Harvey broke down the problem into four major sectors that contribute the most to climate change: energy and the electric grid, transportation, buildings, and industry. Again pushing for efficacy, he suggested moving for change in the top 20 contributors to climate change, specifically global heavyweights such as the U.S., China, the E.U., India, and Russia. He then isolated specific policy changes in the U.S. that could make a difference for our country and, ultimately, for the globe.

One is the electrical grid. Harvey noted that green energy sources such as solar and wind are already becoming more cost-effective. In fact, with new technologies, such as larger wind turbines and turbines that can float and thus be placed farther from land, off-shore wind power is on the verge of becoming a major industry.

However, these sources are intermittent, leaving many green-energy proponents focused on expensive and, thus far, inefficient battery technology. Instead, Harvey suggested that smarter and more flexible grids can enable municipalities to share resources, leveling out supply and demand. He said demand can also be managed, for instance by cooling skyscrapers in advance of extreme weather, thus using less energy during peak demand times.

He also supported the wider use of renewable portfolio standards that would reward those who invest in these green power sources.

Turning to transportation, Harvey applauded but dismissed the electric vehicle movement, which he said has too small a share of the market to make a difference. For vehicles already on the road, he suggested increasing incentives such as tax rebates and nonfinancial rewards, such as free parking. For the billions of cars that will be built in coming years, however, Harvey argued for super fuel-efficiency, pointing out that in addition to changes in engine technology, efficiency can be increased by making vehicles lighter and less wind-resistant.

For buildings, Harvey said low-emission windows, coated with an invisible metallic layer, already greatly decrease demands for heating and cooling. He called for stronger building codes, like California’s, that focus on annual percentage gains in efficiency. Such continuous and expected progress does not need to be revisited legislatively, he noted, and creates a stable environment that lets businesses plan for future construction. As a corollary, Harvey also called for stronger appliance efficiency standards, a trend that has already proved popular with consumers.

Harvey said industry can also take steps to reduce waste. New 3-D technologies are already helping, as they define the specific components of building projects, eliminating the waste of concrete and other materials. In industrial engines, variable speed settings that use smart technologies to adjust automatically save on power as well as costs.

Harvey pointed out that “most of the money [to make these changes] is there already.” While many climate change activists spend time trying to raise funds to help emerging countries, Harvey said the “world already spends about $5 trillion a year on energy and another $6 trillion for infrastructure setting up consumption.” Reallocating these resources, rather than battling for new ones, is an achievable goal, he said. To effect the change, concerned citizens need only find out who is really in charge. Public utility commissions, for example, often have more practical impact than legislative bodies and regularly hold open meetings.

“Do the triage,” Harvey said. “Understand which policies make a difference and pay attention to who makes the decisions.

“With a modest amount of work, a few tens of hours, you can become a player.”

How to Think About the Costs of Climate Change

January 17, 2019

The New York Times

By Neil Irwin

By now, it’s clear that climate change poses environmental risks beyond anything seen in the modern age. But we’re only starting to come to grips with the potential economic effects.

Using increasingly sophisticated modeling, researchers are calculating how each tenth of a degree of global warming is likely to play out in economic terms. Their projections carry large bands of uncertainty, because of the vagaries of human behavior and the remaining questions about how quickly the planet will respond to the buildup of greenhouse gases.

A government report in November raised the prospect that a warmer planet could mean a big hit to G.D.P. in the coming decades.

And on Thursday, some of the world’s most influential economists called for a tax on carbon emissions in the United States, saying climate change demands “immediate national action.” The last four people to lead the Federal Reserve, 15 former leaders of the White House Council of Economic Advisers, and 27 Nobel laureates signed a letter endorsing a gradually rising carbon tax whose proceeds would be distributed to consumers as “carbon dividends.”

The Trump administration has long rejected prescriptions like a carbon tax. But policy debates aside, many of the central economic questions of the decades ahead are, at their core, going to be climate questions. These are some of the big ones.

How permanent will the costs be?

When we think about the economic damage from a hotter planet, it’s important to remember that not all costs are equivalent, even when the dollar values are similar. There is a big difference between costs that are high but manageable versus those that might come with catastrophic events like food shortages and mass refugee crises.

Consider three possible ways that climate change could exact an economic cost:

  • A once-fertile agricultural area experiences hotter weather and drought, causing its crop yields to decrease.
  • A road destroyed by flooding because of rising seas and more frequent hurricanes must be rebuilt.
  • An electrical utility spends hundreds of millions of dollars to build a more efficient power grid because the old one could not withstand extreme weather.

The farmland’s yield decline is a permanent loss of the economy’s productive capacity — society is that much poorer, for the indefinite future. It’s worse than what happens in a typical economic downturn. Usually when factories sit idle during a recession, there is a reasonable expectation that they will start cranking again once the economy returns to health.

The road rebuilding might be expensive, but at least that money is going to pay people and businesses to do their work. The cost for society over all is that the resources that go to rebuilding the road are not available for something else that might be more valuable. That’s a setback, but it’s not a permanent reduction in economic potential like the less fertile farmland. And in a recession, it might even be a net positive, under the same logic that fiscal stimulus can be beneficial in a downturn.

By contrast, new investment in the power grid could yield long-term benefits in energy efficiency and greater reliability.

There’s some parallel with military spending. In the 1950s and ’60s, during the Cold War, the United States spent more than 10 percent of G.D.P. on national defense (it’s now below 4 percent).

Most of that spending crowded out other forms of economic activity; many houses and cars and washing machines weren’t made because of the resources that instead went to making tanks, bombs and fighter jets. But some of that spending also created long-term benefits for society, like the innovations that led to the internet and to reliable commercial jet aircraft travel.

Certain types of efforts to reduce carbon emissions or adapt to climate impacts are likely to generate similar benefits, says Nicholas Stern, chair of the Grantham Research Institute on Climate Change and the Environment at the London School of Economics.

“You couldn’t provide sea defenses at large scale without very heavy investment, but it’s not investment of the kind that you get from the things that breed technological progress,” Mr. Stern said. “The defensive adaptations don’t carry anything like the dynamism that comes from different ways of doing things.”

There is more fertile ground in areas like transportation and infrastructure, he said. Electric cars, instead of those with internal combustion engines, would mean less air pollution in cities, for example.

How should we value the future compared with the present?

Seeking a baseline to devise environmental regulations, the Obama administration set out to calculate a “social cost of carbon,” the amount of harm each new ton of carbon emissions will cause in decades ahead.

At the core of the project were sophisticated efforts to model how a hotter earth will affect thousands of different places. That’s necessary because a low-lying region that already has many hot days a year is likely to face bigger problems, sooner, than a higher-altitude location that currently has a temperate climate.

Michael Greenstone, who is now director of the Becker Friedman Institute at the University of Chicago and of the Energy Policy Institute there, as well as a contributor to The Upshot, was part of those efforts.

“We’ve divided the world into 25,000 regions and married that with very precise geographic predictions on how the local climate will change,” Mr. Greenstone said. “Just having the raw computing power to be able to analyze this at a more disaggregated level is a big part of it.”

But even once you have an estimate of the cost of a hotter climate in future decades, some seemingly small assumptions can drastically alter the social cost of carbon today.

Finance uses something called the discount rate to compare future value with present value. What would the promise of a $1,000 payment 10 years from now be worth to you today? Certainly something less than $1,000 — but how much less would depend on what rate you use.

Likewise, the cost of carbon emissions varies greatly depending on how you value the well-being of people in future decades — many not born yet, and who may benefit from technologies and wealth we cannot imagine — versus our well-being today.

The magic of compounding means that the exact rate matters a great deal when looking at things far in the future. It’s essentially the inverse of observing that a $1,000 investment that compounds at 3 percent a year will be worth about $4,400 in 50 years, whereas one that grows 7 percent per year will be worth more than $29,000.
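
To make the compounding arithmetic concrete, here is a minimal sketch in Python; the functions and the 50-year horizon simply restate the example above, not any calculation from the studies discussed:

```python
def future_value(present: float, rate: float, years: int) -> float:
    """Compound a present-day sum forward at a fixed annual rate."""
    return present * (1 + rate) ** years

def present_value(future: float, rate: float, years: int) -> float:
    """Discount a future sum back to today at the same fixed rate."""
    return future / (1 + rate) ** years

# The example above: $1,000 invested for 50 years.
print(round(future_value(1000, 0.03, 50)))   # 4384  ("about $4,400")
print(round(future_value(1000, 0.07, 50)))   # 29457 ("more than $29,000")

# The inverse question a social cost of carbon must answer:
# what is $1,000 of climate damage 50 years from now worth today?
print(round(present_value(1000, 0.03, 50)))  # 228
print(round(present_value(1000, 0.07, 50)))  # 34
```

The same mechanics drive the spread in the official estimates below: most climate damages arrive decades from now, so even a small change in the discount rate shrinks or swells their present value dramatically.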

In the Obama administration’s analysis, using a 5 percent discount rate — which would put comparatively little weight on the well-being of future generations — would imply a social cost of $12 (in 2007 dollars) for emitting one metric ton of carbon dioxide. A metric ton is about what would be released as a car burns 113 gallons of gasoline. A 2.5 percent rate would imply a cost of $62, which adds up to hundreds of billions of dollars a year in society-wide costs at recent rates of emissions.

The Obama administration settled on a 3 percent discount rate that put the social cost of carbon at $42 per metric ton. The Trump administration has subsequently revised that estimate to between $1 and $7.

That sharp decrease was achieved in part by measuring only the future economic costs to the United States, not factoring in the rest of the world. And the Trump administration analyzed a discount rate of up to 7 percent — a rate at which even costs far into the future become trivial.

Mr. Greenstone favors substantially lower discount rates, based on evidence that financial markets also place high value on investments that protect against risk.

Understood this way, spending today to reduce carbon emissions tomorrow is like insurance against some of the most costly effects of a hotter planet — and part of the debate is over how much that insurance is really worth, given that the biggest benefits are far in the future.

How might climate change fuel inequality?

When a government report raises the possibility of a 10 percent hit to G.D.P. as a result of a warming climate, it can be easy to picture everyone’s incomes being reduced by a tenth.

In reality there is likely to be enormous variance in the economic impact, depending on where people live and what kind of jobs they have.

Low-lying, flood-prone areas are at particularly high risk of becoming unlivable — or at least uninsurable. Certain industries in certain places will be dealt a huge blow, or cease to exist; many ski slopes will turn out to be too warm for regular snow, and the map of global agriculture will shift.

Adaptation will probably be easier for the affluent than for the poor. Those who can afford to move to an area with more favorable impacts from a warmer climate presumably will.

So the economic implications of climate change include huge shifts in geography, demographics and technology, with each affecting the other.

“To look at things in terms of G.D.P. doesn’t really capture what this means to people’s lives,” said William Nordhaus, a Yale economist who pioneered the models on which modern climate economics is based and who won a Nobel for that work. “If you just look at an average of all the things we experience, some in the marketplace and some not in the marketplace, it’s insufficient. The impact is going to be highly diverse.”

Can we adapt to a warmer climate?

Despite all these risks, it’s important to remember that humanity tends to be remarkably adaptable. A century ago, most people lived without an automobile, a refrigerator, or the possibility of traveling by airplane. A couple of decades before that, almost no one had indoor plumbing.

Changes in how people live, and the technology they use, could both mitigate the impact of climate change and ensure that the costs are less about a pure economic loss and more about rewiring the way civilization works.

Most capital investments last only a decade or two to begin with; people are constantly rebuilding roads, buildings and other infrastructure. And a warmer climate could, if it plays out slowly enough, merely shift where that reinvestment happens.

But a big risk is that the change happens too quickly. Adaptation that might be manageable over a generation could be impossible — and cause mass suffering or death — if it happens over a few years.

Imagine major staple food crops being wiped out for a few consecutive years by drought or other extreme weather. Or a large coastal city wiped out in a single extreme storm.

“Whether it’s jobs, consumption patterns or residential patterns, if things are changing so fast that we can’t adapt to them, that will be very, very costly,” Mr. Nordhaus said. “We know we can adapt to slow changes. Rapid changes are the ones that would be most damaging and painful.”

It’s clear that climate change and its ripple effects are likely to be a defining challenge of the 21st-century economy. But there are wide ranges of possible results that vary based on countless assumptions. We should also recognize that the economic backdrop of society is always changing. Projecting what that will mean for ordinary people is not simply a matter of dollars.

“I’ve spent the last 20 years trying to communicate it and it’s not easy to process,” Joseph Aldy, who teaches at the Harvard Kennedy School, said of the connection between climate change and the economy. “It’s really hard to convey something that is long term and gradual until it’s not.”

A Growing Role As a Living Lab

January 16, 2019

The Harvard Gazette

A Growing Role As a Living Lab

By Deborah Blackwell, Arnold Arboretum Communications

Andrew Groover celebrates the complexity of trees and makes it his life’s work to unlock how they adapt to their environments. It’s knowledge that’s critical for the U.S. Forest Service research geneticist — he works in California, where concerns about climate change have grown as wildfires there have increased in frequency and intensity.

A practical problem for Groover, who is a University of California, Davis, adjunct professor of plant biology, is efficient access to the variety of trees he studies. His research requires a ready supply of species diversity, a tall order without laborious travel. But in 2012 his search for the perfect resource brought him to the Arnold Arboretum of Harvard University — a 281-acre living museum holding more than 2,100 woody plant species from around the world.

“Trees are fascinating for biology and research, but one of the greatest challenges in this research is finding trees tractable for study,” Groover said. “If you have a list of a dozen or two different species, where do you get all those? The Arnold Arboretum has all of the species we would ever want to look at, and then some.”

The Arboretum also contains one of the most extensive collections of Asian trees in the world, which Groover said is advantageous to his research. Without such a collection, a researcher typically has to travel to various locations around the world, determine whether the trees are on public or private property, obtain permission to study and transport samples, overcome language and other barriers, and potentially return to the same site later to complete the research.

“The Arnold Arboretum plays a crucial role in research and science and educating the public, connecting them with trees and forests. But it’s also a living laboratory and repository of hard-to-source species for research and is renowned for its collection of Asian disjuncts,” he said. “We can actually study these species pairs found in both Asia and the U.S. directly in the Arboretum. We didn’t need to go anywhere else.”

Director of the Arnold Arboretum and Arnold Professor of Organismic and Evolutionary Biology William (Ned) Friedman emphasized the extraordinary efforts that go into creating such a high-impact research destination.

“Importantly, beyond the more than 16,000 accessioned woody plants at the Arnold Arboretum, we have a staff of world-class horticulturists, propagators, IT professionals, curators, and archivists, all of whom are devoted to ensuring that the living collections are what I call a ‘working collection’ of plants,” he said. “The plants of the Arboretum may look great in flower, or at the peak of fall colors, but these plants are here primarily to be studied by scholars at Harvard and from around the world. In 2018 alone, there were 79 different research projects using the living collections and landscape of the Arnold Arboretum.”

Groover’s work with the Arboretum became a long-term collaboration. In 2014 he won a Sargent fellowship, and, working with Arboretum scientists, collected small samples of genetic material from specific Arboretum trees and propagated them in his own laboratory greenhouses. In 2015 Groover, with Friedman, organized the 35th New Phytologist Symposium held at the Arboretum. He has also given several research talks there, most recently in December on genomic approaches to understanding the development and evolution of forest trees.

“When the Weld Hill Research Building was completed [in 2011], many of us in the research community saw that as a real commitment holding great possibilities for expanding into new areas of research,” he said. “We could not only access a broad range of species all in one location, we had a physical facility for research activities.”

Groover’s work investigates genetic regulation of wood formation — the triggers of gene expression within the wood — which is driven by environment, including light, temperature, wind, water, gravity, even insects and disease. Studying diverse tree species helps him identify the genetic basis of how different species modify their growth and adapt to different environmental conditions.

“Trees, in general, are very responsive to the environment, and trees can actually make adjustments in their wood anatomy to suit the environment,” Groover said. “One thing that is really interesting about trees is that they are perennial and live for decades or even thousands of years in the same place, and they have to be able to cope with all of the variation.”

The collaboration with the Arboretum is special because its trees come with well-documented provenance.

“The trees are well-cared for, are not likely to disappear or die so you can go back again, and they are all right there next to each other,” Groover said.

While his in-depth research is on poplars (Populus spp.), the knowledge obtained may be beneficial in the study of many other tree species.

“If the genetic regulation of a trait is conserved among species, then what we learn in poplar can be transferred to the hundreds of other species we would like to be able to better manage or understand,” Groover said. “We can transfer knowledge across different species and potentially use that information in the future for things like reforestation and restoration.”

Suzanne Gerttula of the Forest Service began working in developmental plant genetics more than three decades ago and joined Groover’s laboratory in 2010. A former staff research associate in plant biology at UC Davis, she is interested in the underlying mechanisms of trees’ responses to gravity, such as those seen in weeping varieties.

“The Arboretum is an incredible resource for both weeping and upright trees. It’s fascinating, fun, and inspiring to me to be able to get at some of the biochemical bases of how life works,” she said.

Groover’s enthusiasm for his subject spans sectors from ecological to economic. From understanding Earth cycles and climate change to helping the lumber, paper, fiber, and even biofuel industries, he hopes his research can inform solutions for forest management and conservation and identify new forms of renewable energy.

“I think it’s important we have places like the Arnold Arboretum to help provide this sort of basic information that has the potential to help in the conservation and management of forests,” he said.

Michael Dosmann, Keeper of the Living Collections at the Arboretum, said it has research potential across a wide swath of disciplines — taxonomic, horticultural, plant conservation, ecology, and developmental biology.

“Our living collection’s research potential could never be exhausted; there is a constant need for its use, growth, and development,” he said. “[The] dynamic interplay between living collections and scientific research demonstrates the vital importance collections have to science and to society.”

Scientists such as Groover enjoy access not only to the living collections, but also to other Arboretum resources, including affiliated collections containing herbarium specimens, archives, images, historical records, on-site greenhouse and laboratory space, centralized expertise, and, frequently, financial assistance in the form of grants and fellowships.

“All too often, the cost both in time and dollars of assembling collections at their own institutions is prohibitive for researchers, making places like the Arboretum a vital resource, especially for those working with limited budgets,” Dosmann said.

Evolving technology also plays a critical role, according to Dosmann, giving researchers the ability to access the Arboretum’s expansive resources and making plant species easier to locate and obtain.

“With the aid of databases and other information systems, it is now much easier to see collections in the multiple dimensions within which they exist and appreciate their unlimited research potential,” he said.

Groover said that with forests facing multiple threats, there’s never been a more important time to address forest biology and the use of technology.

“In the west especially, we need new insights into how to make forests more resilient to drought and heat, including understanding the biology underlying stress responses in different tree species,” he said. “We are learning the complexities of forest trees and hope to ultimately be able to select genotypes or species that might perform better in the future. Working with the Arboretum offers the resources for this important research.”

Is the Green New Deal For Real?

Fri, 2019-01-18 10:50
January 10, 2019

90.9 WBUR

Is the Green New Deal For Real?

Listen to podcast

The mission, as it turned out, was to transform the American economy and save the country, no less, over twelve years. Franklin Roosevelt called it his New Deal, starting in 1933. New-breed Democrats in Congress today are talking about a Green New Deal, starting now, deep into the crisis of a changing climate that goes way beyond the weather. FDR had a working class revolt driving him forward, and later he had a Nazi threat and a world war to focus every fiber of mind and muscle on a reinvention. Which may be what the climate is demanding. Here’s one test: at mention of an all-new renewable energy system, is your first thought Costs? Savings? Or Survival? Getting real about the Green New Deal, this week on Open Source.

Three words and one picture sum up the new scene in Washington—and the relief, for starters, from a two-year fixation on President You-Know-Who. The picture is of the so-called Sunrise Movement siege of Nancy Pelosi’s office from last November, and of the rapturous, insurgent Congressperson from the Bronx, Alexandria Ocasio-Cortez, sweeping up the moment and putting its three little words—Green New Deal—at the top of the evolving agenda in D.C. It’s as slippery a promise as universal health care, but here’s our first crack at what it could mean: a resurrection of spirit, perhaps, at the bold Rooseveltian scale, after 75 years? A reset in relations with work, among workers, which Roosevelt’s New Deal was? We’ll see. Does it mean a war for clean, renewable energy, against the embedded power of fossil fuels? Unavoidably. A “system upgrade” for the power grid and the whole economy? About time, you say! But can it be done?

Guest List

Bill McKibben
environmentalist and journalist

Naomi Oreskes
Professor of the History of Science at Harvard

Daniel Schrag
Professor of Geology and Environmental Science and Engineering at Harvard and director of the Harvard Center for the Environment

The Double-Edged Sword of Palm Oil

Thu, 2019-01-17 14:28
January 16, 2019

Stanford Earth

The Double-Edged Sword of Palm Oil

By Rob Jordan, Stanford Woods Institute for the Environment

Nearly ubiquitous in products ranging from cookies to cosmetics, palm oil represents a bedeviling double-edged sword. Widespread cultivation of oil palm trees has been both an economic boon and an environmental disaster for tropical developing-world countries, contributing to large-scale habitat loss, among other impacts. New Stanford-led research points the way to a middle ground of sustainable development through engagement with an often overlooked segment of the supply chain.

"The oil palm sector is working to achieve zero-deforestation supply chains in response to consumer-driven and regulatory pressures, but they won’t be successful until we find effective ways to include small-scale producers in sustainability strategies,” said Elsa Ordway, lead author of a Jan. 10 Nature Communications paper that examines the role of proliferating informal oil palm mills in African deforestation. Ordway, a postdoctoral fellow at The Harvard University Center for the Environment, did the research while a graduate student in Stanford’s School of Earth, Energy & Environmental Sciences (Stanford Earth).

Using remote sensing tools, Ordway and her colleagues mapped deforestation due to oil palm expansion in Southwest Cameroon, a top producing region in Africa’s third largest palm oil producing country. 

Contrary to a widely publicized narrative of deforestation driven by industrial-scale expansion, the researchers found that most oil palm expansion and associated deforestation occurred outside large, company-owned concessions, and that expansion and forest clearing by small-scale, non-industrial producers was more likely near the low-yielding informal mills scattered throughout the region. This is strong evidence that recent gains in Cameroon's palm oil production have come from extensification (clearing more land) rather than intensification (raising yields on existing land).
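Below is a rough sketch of the kind of spatial comparison such a remote-sensing analysis involves; the rasters, mill coordinates, buffer radius, and clearing probabilities are all synthetic stand-ins, not the study's data or method.

```python
import numpy as np

# Toy sketch (synthetic rasters, hypothetical mill locations): compare forest
# loss within a buffer around informal mills versus elsewhere. Real analyses
# use georeferenced satellite land-cover maps, not random arrays.
rng = np.random.default_rng(1)
H = W = 200
yy, xx = np.mgrid[0:H, 0:W]

mills = [(50, 60), (120, 140)]                 # hypothetical mill pixel coordinates
near_mill = np.zeros((H, W), dtype=bool)
for my, mx in mills:
    near_mill |= (yy - my) ** 2 + (xx - mx) ** 2 <= 30 ** 2   # 30-pixel buffer

forest_t0 = rng.random((H, W)) < 0.8           # forested at the first date
p_clear = np.where(near_mill, 0.25, 0.05)      # assumed higher clearing odds near mills
forest_t1 = forest_t0 & (rng.random((H, W)) > p_clear)

loss = forest_t0 & ~forest_t1
for label, region in [("near mills", near_mill), ("elsewhere", ~near_mill)]:
    print(f"forest loss {label}: {loss[region & forest_t0].mean():.1%}")
```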

Possible solutions for reversing the extensification trend include improving crop and processing yields by using more high-yielding seed types, replanting old plantations, and upgrading and mechanizing milling technologies, among other approaches. To prevent intensification efforts from inciting further deforestation, they will need to be accompanied by complementary natural resource policies that include sustainability incentives for smallholders.

In Indonesia, where a large percentage of the world’s oil palm-related forest clearing has occurred, a similar focus on independent, smallholder producers could yield major benefits for both poverty alleviation and environmental conservation, according to a Jan. 4 Ambio study led by Rosamond Naylor, the William Wrigley Professor in the School of Earth, Energy & Environmental Sciences and a senior fellow at the Stanford Woods Institute for the Environment and the Freeman Spogli Institute for International Studies (Naylor coauthored the Cameroon study led by Ordway).

Using field surveys and government data, Naylor and her colleagues analyzed the role of small producers in economic development and environmental damage through land clearing. Their research focused on how changes in legal instruments and government policies during the past two decades, including the abandonment of revenue-sharing agreements between district and central governments and conflicting land title authority among local, regional and central authorities, have fueled rapid oil palm growth and forest clearing in Indonesia.

They found that Indonesia’s shift toward decentralized governance since the end of the Suharto dictatorship in 1998 has simultaneously encouraged economic development through the expansion of smallholder oil palm producers (by far the fastest growing subsector of the industry since decentralization began), reduced rural poverty, and driven ecologically destructive practices such as oil palm encroachment into more than 80 percent of the country’s Tesso Nilo National Park.

Among other potential solutions, Naylor and her coauthors suggest Indonesia’s Village Law of 2014, which devolves authority over economic development to the local level, be redrafted to explicitly enforce existing environmental laws. Widespread use of external facilitators could help local leaders design sustainable development strategies and allocate village funds more efficiently, according to the research. Also, economic incentives for sustainable development, such as a program in India in which residents are paid to leave forests standing, could make a significant impact.

There is reason for hope in recent moves by Indonesia’s government, including support for initiatives in which large oil palm companies work with smallholders to reduce fires and increase productivity, and the mapping of a national fire prevention plan that relies on financial incentives.

“In all of these efforts, smallholder producers operating within a decentralized form of governance provide both the greatest challenges and the largest opportunities for enhancing rural development while minimizing environmental degradation,” the researchers write.

Coauthors of "Decentralization and the environment: Assessing smallholder oil palm development in Indonesia” include Matthew Higgins, a research assistant at Stanford’s Center on Food Security and the Environment; Ryan Edwards of Dartmouth College, and Walter Falcon, the Helen C. Farnsworth Professor of International Agricultural Policy, Emeritus, at Stanford.

Coauthors of “Oil palm expansion at the expense of forests in Southwest Cameroon associated with proliferation of informal mills” include Raymond Nkongho, a former fellow at Stanford’s Center for Food Security and the Environment; and Eric Lambin, the George and Setsuko Ishiyama Provostial Professor in the School of Earth, Energy & Environmental Sciences and a senior fellow at the Stanford Woods Institute for the Environment.

Image: Elsa Ordway

Mercury Matters 2018: A Science Brief for Journalists and Policymakers

Tue, 2019-01-15 12:45
December 1, 2018

Center for Climate, Health, and the Global Environment

Mercury Matters 2018: A Science Brief for Journalists and Policymakers

Mercury in Context

Coal-fired power plants are the largest source of mercury in the U.S., accounting for approximately 48% of mercury emissions in 2015[1].

The Mercury and Air Toxics Standards (MATS) were finalized in 2011 and currently regulate emissions of mercury, acid gases and other hazardous air pollutants (HAPs) from U.S. electric utilities.

The MATS rule is expected to reduce mercury emissions from the power sector by 90%, improve public health, and play an integral role in meeting U.S. commitments under the international 2017 Minamata Convention on Mercury. 

The Latest from EPA

In August 2018, the U.S. Environmental Protection Agency (EPA) announced plans to revisit the Agency’s prior determination that regulating HAPs emitted from power plants under section 112 of the Clean Air Act was “appropriate and necessary”.

A proposal to reopen one or more aspects of MATS is currently under interagency review at the Office of Management and Budget and could result in lifting limits on mercury emissions from electric utilities in the U.S. 

The Issue

Recent research shows that MATS has substantially reduced mercury levels in the environment and improved public health at a much lower cost than anticipated. However, the Regulatory Impact Assessment (RIA) that the Administration is relying on in its rollback proposal does not reflect current scientific understanding of the local impacts and societal cost of mercury pollution in the U.S.[2],[3].

Many of the health effects associated with mercury exposure are not fully reflected in the RIA, and the final estimate of the mercury-related benefits from MATS only accounted for benefits to children of freshwater recreational anglers in the U.S., a small fraction of the total population affected. 

Mercury Emissions Matter to Human Health and the Environment

Mercury in the form of methylmercury is a potent neurotoxin. Important facts about the health effects of methylmercury include the following:

Children exposed to methylmercury during their mother’s pregnancy can experience persistent and lifelong IQ and motor function deficits[4].

In adults, high levels of methylmercury exposure have been associated with adverse cardiovascular effects, including increased risk of fatal heart attacks[5].

Other adverse health effects of methylmercury exposure that have been identified in the scientific literature include endocrine disruption[6], diabetes risk[7], and compromised immune function[8].

The societal costs of neurocognitive deficits associated with methylmercury exposure in the U.S. were estimated in 2017 to be approximately $4.8 billion per year[9].

No known threshold exists for methylmercury below which neurodevelopmental impacts do not occur[10],[11].

Mercury exposure in the U.S. occurs primarily through the consumption of seafood (fish and shellfish). The consumption of marine fish, often harvested from U.S. coastal waters, accounts for greater than 80% of methylmercury intake by the U.S. population[12]. Dietary supplements cannot counteract methylmercury toxicity in U.S. consumers. A safe and consumable fishery is important to retaining a healthy, low-cost source of protein and other nutrients that are essential for pregnant women, young children, and the general population.

After mercury is emitted from power plants, it is deposited back to Earth, where it can be converted to methylmercury, a highly toxic form of mercury that biomagnifies up food chains, reaching concentrations in fish that are 10 to 100 million times greater than concentrations in water[13].
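As a worked illustration of how such ratios arise, the toy calculation below compounds a hypothetical enrichment factor at each step of a food chain; the concentrations and per-step factors are invented, chosen only so the product lands in the 10-to-100-million-fold range cited above.

```python
# Hypothetical numbers: methylmercury enrichment compounding up a food chain.
water_ng_per_L = 0.2                  # assumed concentration in water
steps = [10, 50, 100, 30, 15]         # assumed factor at each trophic transfer

conc = water_ng_per_L
for factor in steps:
    conc *= factor

print(f"fish-to-water ratio: {conc / water_ng_per_L:,.0f}x")  # 22,500,000x
```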

With increasing levels of mercury in the environment due to human activities, virtually all fish from U.S. waters now have detectable levels of methylmercury. Some fish, such as swordfish, large species of tuna, and freshwater game fish, can have levels that exceed consumption guidelines.

States post fish consumption advisories for waterbodies that are known to have elevated contaminants. In 2013, consumption advisories for mercury were in effect in all 50 states, one U.S. territory, and three tribal territories, and accounted for 81% of all U.S. advisories[14]. This represents more advisories for mercury than for all other contaminants combined.

Wildlife that consume fish, such as common loons, bald eagles, otter and mink, and many marine mammals can also experience adverse effects from mercury and are unable to heed advisories[15]. The health of many songbird and bat species is threatened due to methylmercury exposure in wetland habitats. The productivity of economically valuable game fish stocks can also be compromised[16].

As Mercury Emissions in the U.S. Have Declined, Health Has Improved

In the 2011 MATS RIA, it was assumed that mercury emissions from coal-fired utilities are mainly transported long distances away from the U.S. and that a substantial fraction of mercury in the U.S. comes from international sources. Since that time, scientific understanding of the fate of U.S. mercury emissions has advanced[17],[18]. Recent research reveals that the contribution of U.S. coal-fired power plants to local mercury contamination in the U.S. has been markedly underestimated. Accordingly, controls on mercury emissions from U.S. electric utilities have contributed to the following human health and environmental improvements.

Mercury emissions from U.S. coal-fired power plants have declined by 85%, from 92,000 pounds in 2006 to 14,000 pounds in 2016[19], as states began setting standards and MATS was introduced in 2011. Eleven states had implemented mercury emissions standards for power plants prior to 2011.

Concurrent with declines in mercury emissions, mercury levels in air, water, sediments, loons, freshwater fisheries, and Atlantic Ocean fisheries[20] have decreased appreciably.

Mercury levels in the blood of women in the U.S. declined by 34% between 2001 and 2010 as mercury levels in some fish decreased, and fish consumption advisories improved[21].

The estimated number of children born in the U.S. each year with prenatal exposure to methylmercury levels that exceed the EPA reference dose has decreased by half from 200,000-400,000 to 100,000-200,000, depending on the measure used[22].

The Benefits of Reducing Mercury Are Much Larger Than Previously Estimated

The EPA estimated in the MATS RIA that the annualized mercury-related health benefits of reducing mercury emissions would be less than $10 million. Recent studies that account for more pathways of methylmercury exposure and additional health effects suggest that the monetized benefits of reducing power plant mercury emissions in the U.S. are likely in the range of several billion dollars per year[23],[24],[25]. These and other studies support the conclusion that the mercury-related benefits from MATS are orders of magnitude larger than previously estimated in the MATS RIA[26].

In addition to the mercury-related benefits, MATS has also decreased sulfur dioxide and nitrogen oxide emissions, improving air quality and public health by reducing fine particulate matter and ground-level ozone. The EPA estimated that the annualized value of these additional benefits is $24 billion to $80 billion, bringing the total annual benefits from MATS to tens of billions of dollars. Even with these more complete estimates, substantial benefits of reducing mercury and other air toxics remain unquantified due to data limitations[27].

On the cost side, new information suggests that the EPA’s original cost estimate for MATS of $9.6 billion is much higher than the actual cost, due to declines in natural gas prices and lower-than-expected control equipment and renewable energy costs[28]. Yet, even with the original overestimate, the EPA projected that MATS would increase the monthly electric bill of the average American household by only $2.71 (or 0.3 cents per kilowatt-hour). This value is well within the price fluctuation consumers experienced between 2000 and 2011[29].
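As a quick plausibility check on those figures, dividing the projected monthly bill increase by the per-kilowatt-hour increase implies household usage of roughly 900 kWh per month, which is in line with typical U.S. consumption. A two-line version of the arithmetic:

```python
# Back-of-envelope check of the EPA figures quoted above.
rate_increase_usd_per_kwh = 0.003      # 0.3 cents per kWh
monthly_bill_increase_usd = 2.71
print(f"implied usage: {monthly_bill_increase_usd / rate_increase_usd_per_kwh:.0f} kWh/month")  # ~903
```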

The Bottom Line

The science is clear: the health impacts of U.S. mercury emissions are large and disproportionately affect children and other vulnerable populations. Mercury emission standards in the U.S. have markedly reduced mercury in the environment and improved public health. The mercury-related benefits of MATS are much larger than previously estimated, the actual costs appear to be substantially lower than projected by the EPA, and the total monetized benefits across all pollutants far outweigh the costs of the standards.

Contributors

Charles Driscoll, Department of Civil and Environmental Engineering, Syracuse University

Elsie Sunderland, Harvard Paulson School of Engineering & Applied Sciences and Harvard T.H. Chan School of Public Health, Department of Environmental Health, Exposure, Epidemiology and Risk

Kathy Fallon Lambert, Harvard T.H. Chan School of Public Health, Center for Climate, Health, and the Global Environment

Joel Blum, Department of Earth and Environmental Sciences, University of Michigan

Celia Chen, Department of Biological Sciences, Dartmouth College

David Evers, BioDiversity Research Institute

Philippe Grandjean, Harvard T.H. Chan School of Public Health, Department of Environmental Health, Environmental and Occupational Medicine and Epidemiology

Rob Mason, Departments of Chemistry and Marine Sciences, University of Connecticut

Emily Oken, Harvard Medical School

Noelle Selin, Department of Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology

Literature Cited

[1] Streets, D.G.; Horowitz, H.M.; Lu, Z.; Levin, L.; Thackray, C.P.; Sunderland, E.M. Global and regional trends in mercury emissions and concentrations, 2010-2015. Atmospheric Environment. Accepted.

[2] Sunderland, E.M.; Driscoll, Jr., C.T.; Hammitt, J.K.; Grandjean, P.; Evans, J.S.; Blum, J.D.; Chen, C.Y.; Evers, D.C.; Jaffe, D.A.; Mason, R.P.; Goho, S.; Jacobs, W. 2016. Benefits of Regulating Hazardous Air Pollutants from Coal and Oil-Fired Utilities in the United States. Environmental Science & Technology. 50 (5), 2117-2120. DOI: 10.1021/acs.est.6b00239.

[3] Giang, A.; Mulvaney, K; Selin, N.E. 2016. Comments on “Supplemental Finding That It Is Appropriate and Necessary to Regulate Hazardous Air Pollutants from Coal- and Oil-Fired Electric Utility Steam Generating Units”.

[4] Grandjean, P. and Bellanger, M. 2017. Calculation of the disease burden associated with environmental chemical exposures: application of toxicological information in health economic estimation. Environmental Health. 16:123. DOI: 10.1186/s12940-017-0340-3.

[5] Genchi G., Sinicropi M.S., Carocci A., Lauria G., Catalano A. 2017. Mercury Exposure and Heart Diseases. Int J Environ Res Public Health. 2017;14(1):74. Published Jan 12. DOI:10.3390/ijerph14010074.

[6] Tan, S.W.; Meiller, J.C.; Mahaffey, K.R. 2009. The endocrine effects of mercury in humans and wildlife. Crit. Rev. Toxicol. 39 (3), 228−269.

[7] He, K.; Xun, P.; Liu, K.; Morris, S.; Reis, J.; Guallar, E. 2013. Mercury exposure in young adulthood and incidence of diabetes later in life: the CARDIA trace element study. Diabetes Care. 36, 1584−1589.

[8] Nyland, J. F.; Fillion, M.; Barbosa, R., Jr.; Shirley, D. L.; Chine, C.; Lemire, M.; Mergler, D.; Silbergeld, E.K. 2011. Biomarkers of methylmercury exposure and immunotoxicity among fish consumers in the Amazonian Brazil. Env. Health Persp. 119 (12), 1733− 1738.

[9] Grandjean and Bellanger 2017.

[10] Rice, G.E.; Hammitt, J.K.; and Evans, J.S. 2010. A probabilistic characterization of the health benefits of reducing methyl mercury intake in the United States. Environ Sci Technol. 44(13):5216-24. DOI:10.1021/es903359u.

[11] Grandjean and Bellanger 2017.

[12] Sunderland, E. M.; Li, M.; Bullard, K. 2018. Decadal Changes in the Edible Supply of Seafood and Methylmercury Exposure in the United States. Environ. Health Persp. DOI: 10.1289/EHP2644.

[13] Driscoll, C.T.; Han, Y-J; Chen, C.; Evers, D.; Lambert, K.F.; Holsen, T.; Kamman, N.; and Munson, R. 2007. Mercury Contamination on Remote Forest and Aquatic Ecosystems in the Northeastern U.S.: Sources, Transformations, and Management Options. BioScience. 57(1):17-28.

[14] U.S. Environmental Protection Agency. 2011 National Listing of Fish Advisories. 2013. EPA-820-F-13-058.

[15] Chan, N.M.; Scheuhammer, A.M.; Ferran, A.; Loupelle, C.; Holloway, J.; and Weech, S. 2003. Impacts of Mercury on Freshwater Fish-eating Wildlife and Humans. Human and Ecological Risk Assessment. 9(4): 867-883.

[17] Zhang, Y.; Jacob, D.; Horowitz, H.; Chen, L.; Amos, H.; Krabbenhoft, D.; Slemr, F.; St. Louis, V.; Sunderland, E. 2016. Observed decrease in atmospheric mercury explained by global decline in anthropogenic emissions. PNAS. 113 (3) 526-531.  DOI: 10.1073/pnas.1516312113.

[18] Lepak, R.F.; Yin, R.; Krabbenhoft, D.; Ogorek, J.; DeWild, J.; Holsen, T.; and Hurley, J. 2015. Use of Stable Isotope Signatures to Determine Mercury Sources in the Great Lakes. Environmental Science & Technology Letters. 2 (12), 335-341. DOI: 10.1021/acs.estlett.5b00277.

[19] U.S. Environmental Protection Agency. 2018. https://www.epa.gov/trinationalanalysis/electric-utilities-mercury-relea....

[20] Cross, F.A.; Evans, D.W.; Barber, R.T. 2015. Decadal declines of mercury in adult bluefish (1972−2011) from the mid-Atlantic coast of the U.S.A. Environ. Sci. Technol. 49, 9064−9072.

[21] U.S. Environmental Protection Agency. 2013. Trends in Blood Mercury Concentrations and Fish Consumption Among U.S. Women of Childbearing Age NHANES 1999-2010. EPA-823-R-13-002. https://www.regulations.gov/document?D=EPA-HQ-OAR-2009-0234-20544.

[22] U.S. Environmental Protection Agency. 2013. EPA-823-R-13-002.

[23] Rice et al. 2010.

[24] Giang, A.; Selin, N. E. Benefits of mercury controls for the United States. Proc. Natl. Acad. Sci. U. S. A. 2016, 113, 286.

[25] Sunderland et al. 2016.

[26] Giang et al. 2016.

[27] Sunderland et al. 2016.

[28] Declaration of James E. Staudt, Ph.D. CFA, September 24, 2015, White Stallion Energy Center, et al., v. United States Environmental Protection Agency, Case No. 12-1100 and Summary plus cases, Exhibit 1 Declaration of James E. Staudt, Ph.D., CFA, U.S. Court of Appeals for the District of Columbia.

[29] U.S. Environmental Protection Agency. Final Consideration of Cost in the Appropriate and Necessary Finding for the Mercury and Air Toxics Standards for Power Plants. https://www.epa.gov/sites/production/files/2016-05/documents/20160414_ma....

Photo by Pixabay user 12019

Environmental Health Capacity Building

Tue, 2019-01-08 10:14
November 8, 2018

Harvard T.H. Chan School News

Environmental Health Capacity Building

By Chris Sweeney

Dust storms in Kuwait. Tourism in Tunisia. Air pollution in Uganda. Three different countries facing three different challenges. A common thread? Harvard T.H. Chan School researchers are working in each setting to understand how environmental factors are impacting the health of the people who live and work in these regions.

At a panel discussion on “Environmental Health Capacity Building In Africa And The Middle East” held on October 25, 2018 as part of Harvard Worldwide Week, attendees were given a look at these ongoing research projects.

“Developing countries in Africa and the Middle East are bearing a disproportionate health burden from climate change and environmental contamination,” said Douglas Dockery, John L. Loeb and Frances Lehman Loeb Research Professor of Environmental Epidemiology. “For this panel we brought together three investigators across Harvard who are partnering with institutions in this region to build local capacity to address these challenges.”

The event, hosted by the Department of Environmental Health and the Harvard Chan-NIEHS Center for Environmental Health, kicked off with a presentation from Petros Koutrakis, professor of environmental sciences and an expert on air pollution. Koutrakis shared an overview of his work in the Middle East, which dates back to the 1990s when he and colleagues assessed the environmental health impacts of the hundreds of oil wells that were set ablaze during the Gulf War.

More recently, Koutrakis has turned his attention to dust storms in Kuwait, a fairly common meteorological event that may have a significant impact on human health and social dynamics. Using satellite data, historical weather records, and air quality sensors, Koutrakis and colleagues are gleaning new insights on how desert vegetation and wind patterns affect the severity and frequency of dust storms.

“Dust is not something we can control, and so people have to adjust,” Koutrakis said, noting that these adjustments can impact human activity and health. For instance, on days when dust storms are severe, people may be forced to stay indoors, reducing their ability to exercise. There is also the potential that exposure to dust storms over long periods of time may be associated with chronic respiratory problems.

Koutrakis was followed by Misbath Daouda, a master’s candidate in environmental health who’s studying how the growing tourism industry in Tunisia may impact the local environment.

Daouda’s research so far has shown that in some tourism hot spots, electricity demand surges by more than 50% during the busy season and that the sector is responsible for more than one-third of water consumption in Djerba, an island oasis off the eastern coast of Tunisia. As Daouda explained, her hope is to build a framework to measure tourism growth and its impact on the environment and human health in order to assist policymakers who will have to wrestle with important choices on how to mitigate the sector’s impact in the North African country over the coming years.

Rounding out the event was a presentation from Crystal North, a pulmonologist at Massachusetts General Hospital who has been collaborating with Harvard Chan School researchers to study air pollution in Uganda.

The work involves tracking air quality and following a cohort of HIV-positive and HIV-negative patients in the East African nation. Among the challenges: air quality data from developing countries are relatively sparse, and there are very few sensors in Uganda to measure ambient air quality.

Previously, North’s research looked at inflammation and lung function in HIV-positive patients, who tend to be at increased risk of tuberculosis. She hopes to build on that work by focusing on whether air pollution and HIV are synergistic in their effects on lung function. “Hopefully in the next year or two we’ll have some initial results to share,” North said.

The Long Memory of the Pacific Ocean

Mon, 2019-01-07 12:04
January 4, 2019

SEAS Communications

The Long Memory of the Pacific Ocean

By Leah Burrows, SEAS Communications

The ocean has a long memory. When the water in today’s deep Pacific Ocean last saw sunlight, Charlemagne was the Holy Roman Emperor, the Song Dynasty ruled China and Oxford University had just held its very first class. During that time, between the 9th and 12th centuries, the earth’s climate was generally warmer before the cold of the Little Ice Age settled in around the 16th century. Now, ocean surface temperatures are back on the rise but the question is, do the deepest parts of the ocean know that?

Researchers from the Woods Hole Oceanographic Institution and Harvard University have found that the deep Pacific Ocean lags a few centuries behind in terms of temperature and is still adjusting to the advent of the Little Ice Age. Whereas most of the ocean is responding to modern warming, the deep Pacific may be cooling.

The research is published in Science.

"Climate varies across all timescales,” said Peter Huybers, Professor of Earth and Planetary Sciences in the Department of Earth and Planetary Sciences and of Environmental Science and Engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences and co-author of the paper.  “Some regional warming and cooling patterns, like the Little Ice Age and the Medieval Warm Period, are well known. Our goal was to develop a model of how the interior properties of the ocean respond to changes in surface climate.”

What that model showed was surprising.

“If the surface ocean was generally cooling for the better part of the last millennium, those parts of the ocean most isolated from modern warming may still be cooling,” said Jake Gebbie, a physical oceanographer at Woods Hole Oceanographic Institution and lead author of the study. 

The model is a simplification of the actual ocean. To test the prediction, Gebbie and Huybers compared the cooling trend found in the model to ocean temperature measurements taken by scientists aboard the HMS Challenger in the 1870s and modern observations from the World Ocean Circulation Experiment of the 1990s.

The HMS Challenger, a three-masted wooden sailing ship originally designed as a British warship, was used for the first modern scientific expedition to explore the world’s ocean and seafloor. During the expedition from 1872 to 1876, thermometers were lowered into the ocean depths and more than 5,000 temperature measurements were logged.

“We screened this historical data for outliers and considered a variety of corrections associated with pressure effects on the thermometer and stretching of the hemp rope used for lowering thermometers,” said Huybers. 
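The paper's actual corrections are more involved, but a minimal sketch of the cleaning steps described (outlier screening, a pressure correction, and a rope-stretch depth adjustment) might look like the following, with every numeric value an invented placeholder rather than the study's calibration.

```python
import numpy as np

def clean_challenger(temps_c, depths_m,
                     pressure_bias_c_per_km=0.04,   # assumed warm bias growing with depth
                     rope_stretch_frac=0.02):        # assumed fractional hemp-rope stretch
    """Screen outliers, then apply placeholder instrument corrections."""
    temps = np.asarray(temps_c, dtype=float)
    depths = np.asarray(depths_m, dtype=float)

    # 1. Outlier screening: drop readings far from the median (3x the MAD).
    mad = np.median(np.abs(temps - np.median(temps)))
    keep = np.abs(temps - np.median(temps)) <= 3 * mad

    # 2. Pressure effect: remove an assumed depth-dependent warm bias.
    corrected = temps[keep] - pressure_bias_c_per_km * depths[keep] / 1000.0

    # 3. Rope stretch: recorded depths understate the true depth.
    true_depths = depths[keep] * (1.0 + rope_stretch_frac)
    return corrected, true_depths
```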

The researchers then compared the HMS Challenger data to the modern observations and found warming in most parts of the global ocean, as would be expected from the warming planet over the 20th century, but cooling in the deep Pacific at a depth of around two kilometers.

“The close correspondence between the predictions and observed trends gave us confidence that this is a real phenomenon,” said Gebbie.

These findings imply that variations in surface climate that predate the onset of modern warming still influence how much the climate is heating up today. Previous estimates of how much heat the Earth had absorbed during the last century assumed an ocean that started out in equilibrium at the beginning of the Industrial Revolution. But Gebbie and Huybers estimate that the deep Pacific cooling trend leads to a downward revision of heat absorbed over the 20th century by about 30 percent.

"Part of the heat needed to bring the ocean into equilibrium with an atmosphere having more greenhouse gases was apparently already present in the deep Pacific,” said Huybers. "These findings increase the impetus for understanding the causes of the Medieval Warm Period and Little Ice Age as a way for better understanding modern warming trends."

This research was funded by the James E. and Barbara V. Moltz Fellowship and National Science Foundation grants OCE-1357121 and OCE-1558939.

Changing Temperatures Boost U.S. Corn Yield — For Now

Wed, 2018-12-05 12:45
November 6, 2018

SEAS Communications

Changing Temperatures Boost U.S. Corn Yield — For Now

By Leah Burrows, SEAS Communications

The past 70 years have been good for corn production in the Midwestern U.S., with yields increasing fivefold since the 1940s. Much of this improvement has been credited to advances in farming technology, but researchers at Harvard University are asking if changes in climate and local temperature may be playing a bigger role than previously thought.

In a new paper, researchers found that a prolonged growing season due to warmer temperatures, combined with the natural cooling effects of large fields of plants, has contributed substantially to improved corn production in the U.S.

“Our research shows that improvements in crop yield depend, in part, on improvements in climate,” said Peter Huybers, professor of Earth and planetary sciences in the Department of Earth and Planetary Sciences (EPS) and of environmental science and engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS). “In this case, changing temperatures have had a beneficial impact on agricultural production, but there is no guarantee that benefit will last as the climate continues to change. Understanding the detailed relationships between climate and crop yield is important as we move toward feeding a growing population on a changing planet.”

The research is published in the Proceedings of the National Academy of Sciences (PNAS).

The researchers modeled the relationship between temperature and crop yield from 1981 to 2017 across the so-called Corn Belt: Illinois, Indiana, Iowa, Kansas, Kentucky, Michigan, Minnesota, Missouri, Nebraska, Ohio, South Dakota, and Wisconsin. They found that as temperatures increased due to global climate change, planting days got earlier and earlier, shifting by about three days per decade.
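The roughly three-days-per-decade figure is the kind of result a simple linear trend fit would summarize. Here is an illustrative sketch on synthetic planting dates; the data-generating numbers are invented, not the study's.

```python
import numpy as np

# Synthetic planting dates drifting ~3 days earlier per decade, plus noise.
rng = np.random.default_rng(0)
years = np.arange(1981, 2018)
planting_doy = 130 - 0.3 * (years - 1981) + rng.normal(0, 3, years.size)

slope, intercept = np.polyfit(years, planting_doy, 1)
print(f"estimated trend: {10 * slope:+.1f} days per decade")  # about -3
```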

“One of farmers’ biggest decisions is what they plant and when they plant it,” said Ethan Butler, a postdoctoral research associate in the Department of Forest Resources at the University of Minnesota, first author of the paper, and a former graduate student in EPS. “We are seeing that farmers are planting earlier — not only because they have hardier seeds and better planting equipment, but also because it’s getting warmer sooner.”

Early planting means the corn has more time to mature before the growing season ends.

There is also a second, more surprising trend that has benefited corn yields. Whereas temperatures have warmed across most of the globe over the last century, the hottest days during the Midwestern growing season have actually cooled.

“Increasingly productive and densely planted crops can evaporate more water from leaves and soils during hot days,” said Nathaniel Mueller, former postdoctoral research fellow at the Harvard University Center for the Environment and co-author of the paper. “Widespread increases in rates of evaporation apparently help shield maize from extreme heat, cooling the surrounding area and helping to boost yields.”

Mueller is currently an assistant professor of Earth system science at the University of California, Irvine.

The researchers estimate that more than a quarter of the increase in crop yield since 1981 can be attributed to the twin effects of a longer growing season and less exposure to high temperatures, suggesting that crop yields are more vulnerable to climate change than previously thought.

The researchers also show that the planting and harvest dates farmers currently use are significantly better adapted to the present climate than they would have been to climates in earlier decades.

“Farmers are incredibly proactive and we’re seeing them take advantage of changes in temperature to improve their yield. The question is, how well can they continue to adapt in response to future changes in climate,” said Huybers.

This research was supported in part by the Packard Foundation and the National Science Foundation.

Image: Pixabay

Getting from No Nuclear to Slow Nuclear

Wed, 2018-12-05 10:07
December 4, 2018

The Harvard Gazette

Getting from No Nuclear to Slow Nuclear

By Alvin Powell, Harvard Staff Writer

Harvard scientists say that low-carbon nuclear power may eventually gain greater support in the United States, suggesting that new, more economical plants could play an important role in the country’s energy production midcentury and beyond.

Beset by high construction costs and undercut by cheaper natural gas, wind, and even solar power, the nation’s nuclear fleet is struggling, with nuclear power producing about 20 percent of U.S. electricity today. Plant development is rare and economically risky, while the pace of retirements is increasing, driven by aging infrastructure and red ink.

Environmental fellow Michael Ford and climate scientist Daniel Schrag say those conditions are unlikely to change soon, but that the low-carbon power provided by nuclear plants may prove an important part of a future energy mix, one designed to fight climate change.

“Certainly right now, the existing fleet is struggling,” Ford said. “There are quite a few plants that are under market pressure, many early closings. … And if you pay any attention to attempts to deploy new nuclear in this country, efforts in South Carolina failed in 2017 and the Vogtle construction project in Georgia is something on the order of 100 percent over budget and years behind schedule.”

The central challenge, according to a paper Ford and Schrag published in October in the journal Nature Energy, is ensuring that acceptable technology — safe, cheaper, more efficient — is available to be deployed by midcentury, when market trends may again make nuclear power competitive with other sources.

At the moment, the main barrier for American nuclear projects is “the high price of new development and market economics, driven by cheap renewables and natural gas — combinations that are somewhat unique to the U.S.,” said Ford, whose doctoral work at Carnegie Mellon University focused on the state of the U.S. nuclear industry.

Ford and Schrag recommend that the U.S. government initiate a “tortoise” approach, investing in steady development of a range of advanced nuclear technologies, evaluating each so that the best option will be ready to go when need and market conditions arise.

“We believe the likely timescale for the demand for this technology is still at least a couple of decades away,” said Schrag, the Sturgis Hooper Professor of Geology, a professor of environmental science and engineering, and director of the Harvard University Center for the Environment.

In his work with the center, Ford, a retired Navy captain and nuclear engineer, has developed models of possible futures for the industry and is now creating a more detailed version of the proposal outlined in the October paper.

The industry itself has pinned its hopes on developing a generation of smaller, less-expensive reactors that could be built more quickly on site, he said. A number of companies are betting on new nuclear designs to get there — technologies that use substances other than water for cooling, such as liquid sodium or molten salts.

Ford and Schrag warn that there’s a risk in rushing ahead with a single advanced technology — what they call the “hare” approach — because it’s hard to know which of several options will work well enough to be deployable. In addition, whatever the technology, market conditions affecting nuclear power today are going to take decades to shift, they say.

“The idea of developing advanced nuclear like the Manhattan Project actually doesn’t fit the pace of the problem,” Schrag said. “What we’re trying to explain is that there is a sequence of steps in the decarbonization process [of the electricity grid], and absolutely we can accelerate them, but those steps are unlikely to change.”

Cheap and relatively clean natural gas will likely drive the closing of most of the nation’s coal-fired plants in the coming decades, Schrag said. At the same time, renewable power from wind and solar — both of which have become cheaper than nuclear in recent years — will likely take a larger share of the power grid.

Market conditions could shift in nuclear’s favor as the intermittent nature of wind and solar creates higher demand for new sources of support.

Though market conditions will initially favor natural gas in that role, gas is likely to become more expensive as demand increases, say Ford and Schrag, adding that pressure to include carbon capture technology could also boost the cost of electricity produced by natural gas plants.

“A new demand for nuclear power in the U.S. is likely to come only when natural gas is expensive or considered dirty or both,” Schrag said.

Ford and Schrag suggest that the U.S. direct research and development support to advanced nuclear designs and adopt a strategy that provides $200 million to $250 million annually to develop four or five promising technologies. Policy makers should also commit to providing additional funding — in partnership with industry — to demonstration plants that can scale up and test competing technologies, they say.

Schrag said the amount needed is relatively small in the context of overall energy spending, making it more likely to survive budget battles and swings in priorities between Republican and Democratic administrations.

Nonetheless, advanced nuclear won’t have a guaranteed spot in the nation’s energy mix, the scientists say. The industry would still have to compete with technologies such as carbon capture and large-scale battery storage.

“We’re saying nuclear needs to be an option; we’re not saying that nuclear has to win that competition,” Schrag said.

Image: Jon Chase/Harvard Staff Photographer
