The Center for the Environment is delighted to congratulate three of its longtime affiliated faculty—Joyce Chaplin, Jody Freeman, and Daniel Schrag—on their recent election to the American Academy of Arts and Sciences, one of the nation’s most prestigious honorary societies and a leading center for independent policy research.

Joyce Chaplin is the James Duncan Phillips Professor of Early American History in the Department of History, where she teaches the histories of science, climate, colonialism, and environment. She is currently working on a history of resource conservation, climate change, and settler colonialism, for which she received a 2018 Guggenheim Fellowship. Chaplin is also the faculty organizer of the HUCE Environmental History Working Group.

Jody Freeman is the Archibald Cox Professor of Law and the founding director of the Law School’s Environmental and Energy Law Program. A leading scholar of both administrative law and environmental law, Freeman served as counselor for Energy and Climate Change during the Obama Administration, where she was the architect of the president’s historic agreement with the auto industry to double fuel efficiency standards.

Daniel Schrag is the Sturgis Hooper Professor of Geology, Professor of Environmental Science and Engineering, co-director of the Program on Science, Technology and Public Policy at the Harvard Kennedy School, and director of the Center for the Environment. Schrag studies climate change over the broadest range of Earth’s history, including how climate change and the chemical evolution of the atmosphere influenced the evolution of life in the past, and what steps might be taken to prepare for impacts of climate change in the future. He served from 2009 to 2017 on President Obama’s Council of Advisors for Science and Technology (PCAST), contributing to many reports to the President on topics including energy technology and national energy policy, agricultural preparedness, climate change, and STEM education.
For more information on the 12 Harvard faculty elected to the Academy, please see the Harvard Gazette story below.
Twelve Harvard faculty are among the more than 200 individuals elected to the American Academy of Arts and Sciences, the academy announced today.
Chosen for their compelling achievements in academia, business, government, and public affairs, the Harvard inductees are Joyce E. Chaplin, Jody Freeman, Peter A. Hall, Mark D. Jordan, Barbara B. Kahn, Ronald C. Kessler, Danesh Moazed, Carol J. Oja, Subir Sachdev, Daniel P. Schrag, Tommie Shelby, and Jeremy M. Wolfe.
“One of the reasons to honor extraordinary achievement is because the pursuit of excellence is so often accompanied by disappointment and self-doubt,” said David W. Oxtoby ’72, the president of the American Academy of Arts and Sciences. “We are pleased to recognize the excellence of our new members, celebrate their compelling accomplishments, and invite them to join the academy and contribute to its work.”
The academy was founded in 1780 by John Adams, James Bowdoin, and others who believed the new republic should honor exceptionally accomplished individuals and engage them in advancing the public good. The academy’s dual mission remains essentially the same 239 years later, with honorees from increasingly diverse fields and with the work now focused on the arts, democracy, education, global affairs, and science.
“With the election of these members, the academy upholds the ideals of research and scholarship, creativity and imagination, intellectual exchange and civil discourse, and the relentless pursuit of knowledge in all its forms,” said Oxtoby.
“While the work of this class includes work never imagined in 1780 — such as cultural studies, cybersecurity, disease ecology, nanotechnology, paleoclimatology, and superconductivity — these members embody the founders’ vision of cultivating knowledge that advances, in their words, a ‘free, virtuous, and independent people,’” said Nancy C. Andrews, board chair of the American Academy.
The new class will be inducted at a ceremony in October in Cambridge and join the academy members who came before them, including Benjamin Franklin (elected 1781) and Alexander Hamilton (1791); Ralph Waldo Emerson (1864), Maria Mitchell (1848), and Charles Darwin (1874); Albert Einstein (1924), Robert Frost (1931), Margaret Mead (1948), Milton Friedman (1959), and Martin Luther King Jr. (1966); and more recently Antonin Scalia (2003), Michael Bloomberg (2007), John Lithgow ’67 (2010), Judy Woodruff (2012), Bryan Stevenson (2014), and former President Barack Obama, J.D. ’91 (2018).
For the complete list of the 239th class of new members, visit the academy’s website.
By Kendra Pierre-Louis
Image: Tim Gruber for The New York Times
DULUTH, Minn. — As the West burns, the South swelters and the East floods, some Americans are starting to reconsider where they choose to live.
For advice, a few of them are turning to Jesse Keenan, a lecturer at the Harvard University Graduate School of Design. At least once a day, Dr. Keenan, who studies urban development and climate adaptation, gets an email from someone asking where to move to be safe from climate change. The messages come from people who are thinking about moving not because they have already been hit by catastrophe, but because they see the writing on the wall.
So, what does Dr. Keenan suggest to these advance planners? Maybe climate-proof Duluth.
That’s a slogan that he created as part of an economic development and marketing package commissioned by the University of Minnesota Duluth. Some community leaders think they can spur growth by bringing in more people, and they sense an opportunity in climate change. And Duluth isn’t the only urban area that has climate migration on its radar. In a February speech, the mayor of Buffalo, Byron W. Brown, declared his city a “climate refuge.”
Standing on the ice of Lake Superior one day in mid-March, Dr. Keenan emphasized that the Duluth slogan was meant to be tongue-in-cheek. The science behind it, though, is no joke.
Nowhere in the world is immune from climate change, including Duluth. “We’re getting more precipitation in bigger amounts than we ever really observed,” said Kenneth Blumenfeld, a senior climatologist at the Minnesota Department of Natural Resources.
“But when you stand back and look around, it’s almost like, ‘But we’ve got it good.’”
Climate projections suggest that, because of geographic factors, the region around Duluth, the Great Lakes area, will be one of the few places in America where the effects of climate change may be more easily managed.
First, it’s cool to begin with. That means, as temperatures increase, it will remain mild in relative terms. By 2080, even under a relatively high emissions scenario, Duluth’s climate is expected to shift to something like that of Toledo, Ohio, with summer highs maxing out in the mid-80s Fahrenheit.
“We’re not seeing worse heat waves or longer heat waves or more of those long nights that don’t fall below 75 degrees,” Dr. Blumenfeld said. “Instead, what we’re seeing is warmer winters, fewer days during winter where we get to negative 30 Fahrenheit.”
Because the region will remain relatively cool, it will have a lower wildfire risk than the West or the Southeast. Wildfires thrive in hotter temperatures, which dry out plants and make them easier to ignite.
And, because Duluth is inland, it’s mostly protected from the effects of sea level rise.
Duluth, which sits at the western end of Lake Superior, the greatest of the Great Lakes by volume, also has fresh water. A lot of it. Superior is so voluminous that, if poured out, it would submerge North and South America under a foot of water.
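That “foot of water” figure is easy to sanity-check. A minimal sketch, assuming commonly cited approximate values (roughly 12,100 cubic kilometers of water in Lake Superior, and roughly 42.5 million square kilometers of combined land area for North and South America; both figures are assumptions, not measurements from the article):

```python
# Rough sanity check of the "foot of water" claim,
# using approximate reference values (assumed, not from the article).
superior_volume_km3 = 12_100   # approximate volume of Lake Superior
americas_area_km2 = 42.5e6     # approximate combined land area of the Americas

depth_km = superior_volume_km3 / americas_area_km2
depth_ft = depth_km * 1000 * 3.28084   # km -> m -> feet
print(f"{depth_ft:.2f} ft")
```

With these inputs the depth works out to just under one foot, consistent with the article’s claim.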
“At the end of the day, it’s really about fresh water,” Dr. Keenan said. “It’s that simple. You’ve got to have fresh water.”
You’ve got to have quite a bit, in fact. To meet our minimum needs, from drinking to cooking and cleaning, the World Health Organization says we need 13 to 26 gallons of water a day, or about 50 to 100 liters. The average American uses 80 to 100 gallons.
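The gallon and liter figures above can be cross-checked with a quick unit conversion (one U.S. liquid gallon is about 3.785 liters); this sketch also compares average American use against the WHO minimum:

```python
LITERS_PER_GALLON = 3.78541  # U.S. liquid gallon

# WHO minimum of 50-100 liters per person per day, expressed in gallons
who_min_gal = 50 / LITERS_PER_GALLON    # about 13.2
who_max_gal = 100 / LITERS_PER_GALLON   # about 26.4

# Average American use of 80-100 gallons, expressed in liters
us_min_l = 80 * LITERS_PER_GALLON       # about 303
us_max_l = 100 * LITERS_PER_GALLON      # about 379

print(f"WHO minimum: {who_min_gal:.1f}-{who_max_gal:.1f} gal/day")
print(f"Average American: {us_min_l:.0f}-{us_max_l:.0f} L/day")
```

In other words, the average American uses roughly three to seven times the WHO minimum.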
The city hasn’t formally adopted Dr. Keenan’s climate refuge plan so far, but it has the attention of the mayor, Emily Larson. “This idea that we have this national researcher who has identified Duluth as a place that has kind of a secret sauce when it comes to being a place for refuge and sustainability and resiliency, that is something you want to be a part of,” she said.
For the plan to work, people would need to actually move to Duluth. The city’s infrastructure can accommodate 150,000 people, but the current population is just 86,000. From 2010 to 2016, though, the city added only 56 people.
Presented to the public at the end of a two-day conference focused on understanding Duluth’s future in a warming world, Dr. Keenan’s research, which partly aims to predict which states are likely to see more people leaving because of climate change, suggested that present-day Texans and Floridians might make excellent future Duluthians.
“What do people from Florida really want?” said Dr. Keenan, himself a former Floridian who still keeps a residence in the state. “They want the infinite horizon of the ocean.”
This may be why one of Dr. Keenan’s sample advertisements, “Duluth, it’s not as cold as you think,” featured an image of a surfer in a wet suit.
That appeared to be tongue-in-cheek, too. When he showed the photo during his presentation, the audience laughed. Duluth does have a surf season. But the proposed ad glides past the fact that it’s in winter. Surfers head out into the lake in temperatures as low as minus 15 Fahrenheit, or minus 26 Celsius.
“We had one week in particular that it was negative 60 almost every day with wind chill,” said Kyle Skarp, an electrician, as he watched friends play board games in the back room of Blacklist Artisan Ales, a brew pub in Duluth. “You didn’t want to go outside. And not because it was uncomfortable, but because it’s unsafe.”
Mr. Skarp said he liked the idea of more people coming to Duluth. He said it would mean more jobs.
But not everyone agrees with Dr. Keenan’s plan. Because it favors those who are financially able to move, it selects for the affluent and, he acknowledged, raises questions of gentrification.
After the presentation, Karen Diver, a faculty fellow at the College of Saint Scholastica who served as special assistant to the president for Native American affairs during the Obama administration, cautioned that the city had an uneven track record when it comes to embracing diversity.
“From my perspective we haven’t even figured out how to interact in a positive way with our indigenous people,” said Ms. Diver, a member of the Fond du Lac Band of Lake Superior Chippewa who lives near Duluth on her tribe’s reservation.
Mayor Larson seemed to acknowledge that. “I think we have a tremendous amount of work to do as a community to truly be a place where migration and immigration are seen as being strength and vitality and growth,” she said.
Ultimately, if Duluth decides to invest in attracting climate migrants, whether voluntary or displaced, the city may face competition.
At least one other Great Lakes city, Buffalo, 700 miles away on the eastern tip of Lake Erie, has the same kind of winter cold, and all the geographic blessings, that Duluth has. It’s predicted to have fresh water even as the climate warms, and its summers will remain relatively cool.
“We’ve never had a 100 degree day,” said Stephen J. Vermette, a professor of geography at Buffalo State.
But Buffalo has already received what could be described as a wave of climate migrants, after Hurricane Maria devastated Puerto Rico in the autumn of 2017.
They came partly because Buffalo has an established Puerto Rican population, which meant that many prospective migrants had friends and relatives in the city.
At the same time, Buffalo had advertised itself on Puerto Rican television in search of Spanish language teachers. They came because they had connections and knew that there was a chance they could make a life there.
“About 10,000 people came here after the hurricane in Puerto Rico,” said George Besch, chairman of the board of directors at Designing to Live Sustainably, a nonprofit group working to help the Buffalo-Niagara region adapt to climate change.
According to Matthew Hauer, an assistant professor of sociology at Florida State University, people who migrate, whether by choice or not, still like to stick close to home, moving just far enough to get out of harm’s way but often remaining within the same state or region.
When people do go far away, he said, they either move for higher paying jobs, or they “tend to follow kin networks and friend networks.”
Seven research projects in the sciences, social sciences, and humanities will share about $1 million in the fifth round of grants awarded by the Climate Change Solutions Fund (CCSF), an initiative encouraging multidisciplinary research projects that seek creative solutions to climate change.
“Harvard has a responsibility to create knowledge and advance research on the pressing issue of climate change,” said President Larry Bacow. “Since its inception, the Climate Change Solutions Fund has supported groundbreaking work across the University, and this year’s cohort represents the complex, multidisciplinary work required to address profound environmental changes that affect all of us.”
“The CCSF Review Committee and I are delighted with this year’s awards,” said vice provost for research Richard McCullough, whose office administers the fund. “The combination of the varied research in which our faculty and students engage — projects in chemistry, economics, anthropology, architecture, and more — and the support shown by University leadership through CCSF ensures the kind of innovative problem-solving we require around climate change.”
In 2014, President Emerita Drew Faust announced the creation of the fund to accelerate the transition from carbon-based energy systems to renewable ones and create a greener world. To date, more than 40 CCSF projects have received more than $5 million. Funded projects span a wide range of topics, including the creation of a new electrochemical method of capturing carbon dioxide to reduce overall levels in the atmosphere, technological advances to lower the cost of solar energy, partnerships with local government agencies to address air pollution in India, modeling of the local economic impacts of extreme weather events, and efforts to target the emissions associated with food waste.
The fund’s evaluation committee targets projects representing the range of academic disciplines and research interests across Harvard’s 12 Schools. Special consideration is given to projects seeking to use the campus as a living laboratory to test ideas or produce new insights through the lens of nontraditional disciplines, including the arts and humanities. The fund is supported by the president’s office and the generosity of alumni and others.
Here are this year’s seven projects.
Heat Stress, Labor Fatalities, and Adaptation Policy
Patrick Behrer, Harvard Environmental Economics Program Pre-Doctoral Fellow, Ph.D. candidate, Harvard Kennedy School, Graduate School of Arts and Sciences
Exposure to extreme heat has substantial adverse consequences for workers, from acute health conditions to impaired cognitive function. Despite its significance to the U.S. workforce, only a few studies examine the impact of heat on workplace injuries and fatalities, and they all look only at injuries directly attributed to heat, overlooking its indirect effects and underestimating its true impact. This project will assess exposure to extreme heat and how it relates to workplace injuries and fatalities, and will seek to understand how the effects of heat exposure vary by workers’ occupation, race, and socioeconomic status. By measuring the effectiveness of an existing policy to protect workers from extreme heat, this project aims to inspire new legislation to mitigate the impact of heat on the U.S. workforce.
Behrer will be joined by R. Jisung Park in this project. Park earned his Ph.D. at Harvard in 2017 and was an inaugural CCSF awardee for his research project, “The Critical Moment: Climate Means Versus Extremes in the Economics of Climate Change.”
Pleistocene Park: Mitigating the Effects of Climate Change in the Russian Arctic
Anya Bernstein, John L. Loeb Associate Professor of the Social Sciences, Faculty of Arts and Sciences
This project will advance a climate change mitigation experiment in Pleistocene Park, a unique nature reserve in Arctic Siberia that sits on permafrost, which has a considerable impact on climate change due to the high levels of carbon trapped in the frozen soil. By combining ethnographic and archival methods, the investigator will examine the complex historical, sociopolitical, and cultural contexts that shape interactions between humans, animals, and the environment in Pleistocene Park. The project will also investigate the development of biotechnologies in climate engineering while promoting Russian-American cooperation in counteracting the effects of climate change through permafrost preservation.
Measuring the Gains From Trade in a Market for Decentralized Renewable Energy
Shefali Khanna, Harvard Environmental Economics Program Pre-Doctoral Fellow, Ph.D. Candidate in Public Policy, Harvard Kennedy School, Graduate School of Arts and Sciences
Decentralized solar energy technologies have significant potential for increasing energy access while achieving climate change mitigation goals, particularly in the developing world. Yet the intermittency of solar energy may affect not only the adoption of these technologies but also consumers’ decisions to choose less-sustainable energy sources. One way to address this potential drawback is through peer-to-peer trading of surplus electricity via community microgrids. Trading energy can improve the quality of electricity supply, repurpose excess solar electricity, and allocate electricity to its highest marginal use. This project will test the potential of this technological solution by estimating the gains from trading decentralized renewable energy in rural Bangladesh.
Metal-Organic Phase-Change Materials for Thermal Energy Storage
Jarad A. Mason, assistant professor of chemistry and chemical biology, Faculty of Arts and Sciences
More than 90 percent of energy production and consumption in the world involves the generation of heat, and more than half of home energy use goes to heating and cooling. Despite the tremendous importance of managing thermal energy efficiently, storing thermal energy for later use has received significantly less attention than storing electrical and chemical energy. In a phase-change thermal energy storage system, thermal energy can be transferred into a material and stored as the energy of a phase transition. Importantly, phase-change materials can store large amounts of thermal energy with minimal temperature changes, opening opportunities for efficient energy storage and temperature regulation without external energy input. This project will investigate how the structural and chemical features of metal-organic phase-change materials contribute to their thermodynamic properties, helping design improved solid-solid and solid-liquid phase-change materials for next-generation heat storage systems.
Catching Sunlight with Quantum Mechanics: A Materials-by-Design Approach to Designing New Photovoltaic Materials
Julia A. Mundy, assistant professor of physics, Faculty of Arts and Sciences
Solar energy is recognized as a critical component in achieving a global energy system free of fossil fuels. This project hopes to construct novel materials that can be used in even more efficient and scalable solar-energy harvesting. Ultimately, this high-risk, high-reward research could help us move closer to a more sustainable energy system by identifying new material properties that could give rise to highly efficient photovoltaic cells, to generate electricity directly from sunlight via a naturally occurring electronic process.
Daniel G. Nocera, Patterson Rockwood Professor of Energy, Faculty of Arts and Sciences
The bionic leaf system Nocera co-created with Harvard Medical School Professor Pamela Silver in 2016 uses a combination of hydrogen and specialized bacteria to produce an internal cellular fuel to power the creation of a strong, living fertilizer. The goal of this new project is to advance the bionic leaf in order to examine its effectiveness at establishing carbon and nitrogen cycles and increasing large-scale food production. Studies show that the process by which the leaf powers the fertilization cycle makes it ultimately carbon-negative. Using this biofertilization on a worldwide scale could significantly impact mitigating the effects of climate change. Furthermore, this study hopes to look at how the bionic leaf can be used in a way that is beneficial to people living in environments where large infrastructures for fuel and food production are not available.
Protecting Health by Building Design Under a Warming Urban Climate
Holly Samuelson, assistant professor of architecture, Harvard Graduate School of Design
Extreme heat exposure has proved more fatal to humans than any other type of weather event, and the frequency of extreme heat events has increased in recent years. This project will observe how existing buildings in a range of climates manage without air conditioning during a heat event, identifying areas of increased vulnerability and analyzing the survivability of residential buildings in extreme heat conditions. The research can help provide guidelines for climate adaptation and help develop plans for alleviating the impacts of climate change. Looking forward, a parallel assessment will be conducted using risk estimates from previous studies to gauge the impact on future urban regions. Cost-effective strategies developed through this research will offer scalable systems that can guide how urban policy makers, public health officials, and building energy code developers adapt to the changing climate.
Daniel Schrag is the Sturgis Hooper Professor of Geology and a professor of environmental science and engineering at Harvard University. He teaches an undergraduate course he calls “the climate energy challenge,” directs the Harvard University Center for the Environment, and co-directs the Science, Technology and Public Policy Program at the Belfer Center for Science and International Affairs.
Schrag served on former President Barack Obama’s Council of Advisors for Science and Technology (PCAST) from 2009 to 2017, and has worked on a range of issues in climate science, geochemistry, earth history, and energy technology. Examples of his research include a 2017 study on the potential impacts of solar geoengineering on extreme heat events, as well as a 2016 paper that looks at how policy decisions in the coming years will influence global climate, ecosystems, and human societies for thousands of years into the future.
Schrag recently sat down with Journalist’s Resource to offer tips on environmental and science reporting and to share his own observations and perspective on the field. In consultation with Schrag, we edited or expanded some points for clarity.
1 – Understand the science
“Environmental journalism and science journalism are not the same thing, but there’s a big overlap,” Schrag said. A reporter doesn’t necessarily need a science degree to cover many aspects of environmental journalism, such as environmental law, policy and regulation, and environmental justice. But a basic understanding of climate science would help eliminate common errors reporters make in their coverage. “The rate of those mistakes would decrease if journalists had a little more training,” he said.
Science journalists, however, need a background in science. “Science journalism in general has suffered for a very long time. Imagine if someone was covering the financial section of a newspaper and had no economics or financial training … It wouldn’t happen,” Schrag said. He said training for science journalists could consist of an undergraduate degree or training in chemistry or physics, or a science fellowship, depending on the journalist and type of reporting he or she does.
2 – Change your focus
“Climate change is here, it’s happening and going to be with us for thousands of years,” said Schrag. Journalists should be thinking more about how humans can manage climate change – not stop it. They also should focus on communicating the realities of a changing climate. “We do ultimately have to stop greenhouse gas emissions from entering the atmosphere, but that’s going to take a very long time, a century at best,” he said.
3 – Include the correct context
The goal of environmental journalists should be to communicate to the public about what’s going on in the correct context and timescales of climate change. “That includes natural phenomenon, or unnatural,” said Schrag. For example, when Hurricane Harvey dropped over 50 inches of rain near Houston, Texas, in August 2017, it was the first time a single place in the continental U.S. had experienced that amount of rain during one storm system, he said. “You could never say that this particular hurricane was caused by climate change,” Schrag said. “But you can say climate change leads to conditions that make these hurricanes worse, and makes it rain more, makes the water in the ocean warmer for the hurricane to grow faster.”
4 – Tell the human story
The impacts of climate-related events can profoundly touch, or sometimes devastate, people and communities around the globe. According to Schrag, there is a place for both scientifically-trained reporters and journalists capable of capturing the emotional impacts of climate change. “Sometimes I feel environmental journalism has grown so close to science writing, I’m afraid it’s lost that emotion.”
5 – Use research to challenge leaders
Schrag criticized President Donald Trump’s decisions to appoint, or try to appoint, government officials who oppose scientific consensus and to leave key scientific leadership positions vacant. He noted that journalists seem to have backed off environmental coverage lately. “I think we’ve all become a little numb to the Trump phenomenon … I haven’t seen a lot of great journalism lately on the environment, partly because they [journalists] might feel there is no audience.”
Journalists, he said, are a key part of holding agencies and officials accountable. “The idea that he [Trump] would nominate someone with no science background and no understanding of environmental issues and is ideologically opposed to it is typical … So, this is a tough time.”
6 – Acknowledge partisan divides
Managing the effects of climate change and environmental degradation has become a deeply partisan issue. “Your views on climate can be almost perfectly predicted from a handful of other questions that have nothing to do with climate … the way you feel about government, about a variety of other issues not related to the environmental at all are predictive about how you’re going to feel about climate change and that’s unfortunate,” said Schrag. Environmental issues often cut across beats, such as criminal justice, politics and economics. Leaning on experts and credible scientists is essential to providing the public with clear, evidence-based information in all areas of environmental reporting. “Journalists need to be fearless and be bold … there are so many attacks on our environmental regulations and our environmental sense of decency, journalists are a critical part of standing up to that and they need to be honest about what they see.”
By Alvin Powell, Harvard Staff Writer
The keys to feeding the 10 billion people expected on Earth by midcentury read like a thoughtful laundry list that’s both reassuring and daunting: new technology, more seafood, more efficient small farms, less food waste, less red meat, and — perhaps — insects.
Experts gathered at the Harvard T.H. Chan School of Public Health Friday laid out the extent of the challenge: with just 7.5 billion people today, some 800 million are underfed, 2 billion eat an unhealthy diet that puts them at higher risk for obesity, diabetes, and other metabolic diseases, and many of the rest eat diets dependent upon an inefficient and unsustainable food production system whose reform will be essential in feeding another 2.5 billion mouths.
Panelists appearing at The Forum at the Harvard T.H. Chan School of Public Health said the necessary changes, while challenging, are achievable, and one of the lowest-hanging fruits has the potential to make a large difference.
A large percentage of the food produced today is lost as waste, and a variety of approaches could make that food available for consumption, according to Gina McCarthy, professor of the practice of public health, former administrator of the U.S. Environmental Protection Agency, and head of the Chan School’s Center for Climate, Health and the Global Environment.
“We waste 40 percent of the food between the farm and the table,” McCarthy said. “And then we have to think about how we get people engaged in this. We want them to demand healthy food, but we also want them to have a rich sense of where their food comes from. I want them to be engaged in the food process, and I want them to think about how we eliminate that waste by engaging them.”
McCarthy said locating farms closer to where food is purchased will reduce the amount of food lost before it reaches store shelves. Better refrigeration, already being developed, is another strategy for cutting waste in the distribution system. Awareness among consumers is also key, she said, and simple steps can be taken to reduce food waste in cafeterias — don’t use large serving trays that encourage people to take more than they will eat — and in the home, where she counseled to first “shop your own fridge.”
While fighting food waste is a step that can help anywhere, the current food production system is complex and diverse, and will require a variety of approaches to become more efficient, panelists said.
Technology may provide one answer, according to David Bennell, manager of food, land, and water for the World Business Council for Sustainable Development, a corporate-led organization seeking sustainable business solutions.
The organization, Bennell said, has developed weather forecasting technology — being piloted in the West African nations of Ghana and the Ivory Coast — that can send weather data to the phones of small farmers in the developing world. Armed with that knowledge, farmers can better plan planting and harvesting to boost agricultural yields. In addition to information, though, such small farms will also benefit from fertilizer, according to Professor of Epidemiology and Nutrition Walter Willett.
“Many people around the world carry mobile phones,” Bennell said. “The idea is: Could we create a mechanism through the mobile phone network that would enable these farmers to make better predictions about weather.”
Improving yields on small farms in the developing world would put additional food where it is needed most. Bennell said that the people suffering the largest food insecurity today are, ironically, the world’s farmers — about 2 billion people live on the developing world’s 475 million farms, according to the U.N. Food and Agriculture Organization — and on those farms women and children are most affected. Another technology-centered strategy, Bennell said, is a plan by Microsoft to use drones and artificial intelligence to survey farmers’ fields from the air and provide data on whether they’re too wet, too dry, or showing signs of insect infestation. Getting that information early will allow farmers to address imbalances before they become problems.
Eating habits are going to have to change, according to Willett, who was part of a commission convened by the scientific journal The Lancet that examined how best to feed 10 billion people.
The commission developed a model diet with the goal of being able to provide healthy food in an environmentally sustainable manner.
The diet proposes eating more fish and plant-based foods — fruits, vegetables, nuts, and legumes — than many in the developed world eat today, Willett said. It would also mean eating far less red meat than is common on many tables — the equivalent of just one hamburger a week or a large steak monthly.
One problem with red meat, Willett said, is that the grain fed to a cow to produce a single serving of beef could instead provide about 20 servings if eaten directly. In a world struggling for enough to eat, that’s inefficient.
“[The cow] is a huge emitter of greenhouse gases, for all the time it’s living and breathing,” Willett said. “Plus, feeding grain to cattle, in particular, is hugely inefficient — roughly a 20:1 conversion of what we feed cattle to convert it to edible food to humans. [It’s] massively inefficient.”
The “new” diet, Willett said, is actually not that different from the traditional Mediterranean diet as it existed before the advent of modern agricultural practices. Meat was eaten, but sparingly, and rarely as the centerpiece of the meal.
That diet, Willett said, would not just feed a lot of people, it would also improve health globally, preventing between 20 and 25 percent of the 11 million diet-related premature deaths that occur annually.
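To put those percentages in absolute terms, here is a quick back-of-the-envelope calculation. The 11 million annual deaths and the 20-to-25-percent range come from Willett’s remarks above; the absolute totals are simply derived from them:

```python
# Back-of-the-envelope check of the figures quoted above.
# Inputs (from the article): 11 million diet-related premature
# deaths per year, of which 20-25 percent could be prevented.

annual_diet_deaths = 11_000_000
low_frac, high_frac = 0.20, 0.25

averted_low = annual_diet_deaths * low_frac    # 2,200,000
averted_high = annual_diet_deaths * high_frac  # 2,750,000

print(f"Roughly {averted_low / 1e6:.1f} to {averted_high / 1e6:.2f} "
      "million premature deaths averted per year")
```

In other words, the commission’s diet would avert on the order of 2 to 3 million premature deaths a year.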
“The good news is it is possible to feed them a sustainable and healthy diet,” Willett said, “but it will require a big change in what we’re doing.”
In response to a question from the audience, Willett said some cultures have long eaten insects and, though dietary studies including insects are rare, researchers should explore whether they deserve a larger role in the future.
Ana Sortun, chef and owner of the Cambridge, Mass., restaurant Oleana and also a panelist, said that a healthful, plant-based diet can be not just sustainable, but also tasty. Cooks, Sortun said, can avoid the “cheap tricks” common in modern processed foods — boosting flavor with fat, salt, and sugar — and still turn out flavorful meals. One key, she said, is to emphasize locally sourced foods, because the fresher the ingredients, the better the flavor.
“There isn’t enough access to really fresh food,” Sortun said. “From a chef’s standpoint, fresh equals flavor.”
Sortun told of a recent trip around the Mediterranean and her exposure to an array of new flavors in Turkey, created not through those fat-salt-sugar tricks, but through careful cooking with chosen spices.
Several panelists pointed to the U.S. Farm Bill as a potential political tool to encourage change in the U.S. industrial agriculture model, which relies heavily on pesticides and fertilizers, the runoff from which chokes waterways. The bill’s incentives, which could be altered, now encourage centralization of agriculture to the detriment of surviving small and medium-sized farms, Willett said. Changing the farm bill, however, would be difficult since the food and agriculture industry is a powerful lobby.
Despite the challenges ahead, McCarthy said that people can be educated and support change with the choices they make every day, reducing waste and building an agricultural system that doesn’t harm the environment.
“I have no doubt we can do it. The question is how to engage enough people … to ensure this is what we deliver to the world,” McCarthy said.
By Alvin Powell, Harvard Staff Writer
Can we eat our way out of some global environmental problems?
That’s a question asked by author Paul Greenberg, who has made his career writing about the problems of the seafood industry and seafood’s potential — with careful management and a shift in consumer practices — to be the foundation of a healthier diet, promote more sustainable use of the environment, and even reduce carbon emissions.
Greenberg, the author of three books on fish, seafood, and the fishing industry, said one health benefit of eating more seafood is consuming more of the omega-3 fatty acids it contains. Those fats, also marketed commercially as supplements, have long been suspected of having heart-healthy effects, and results last fall from the VITAL study, led by JoAnn Manson, the Michael and Lee Bell Professor of Women’s Health at Harvard Medical School and Brigham and Women’s Hospital, showed that taking omega-3 supplements caused large reductions in cardiovascular risk for those who had little fish in their diet.
But “eating seafood” today means something different than it did just a few generations ago, Greenberg said. Instead of diets varying depending on locally available fish, recent decades have seen seafood becoming standardized, with the industrial focus narrowing to four foods: tuna, shrimp, salmon, and several species lumped together as “whitefish.”
Those species can be very energy-intensive to harvest, and rediversifying the diet to increase the consumption of things like mussels and seaweed — which are both high in omega-3s and use much less energy to produce and process — can ease carbon emissions related to seafood harvesting.
Another strategy, Greenberg said, would be to utilize the 20 million to 30 million metric tons of smaller fish — currently ground up for use as fertilizer, in pig and chicken feed, and in aquaculture — for direct human consumption. That would increase the efficiency of a system that currently expends a lot of energy harvesting fish to feed them to something else that humans then eat.
With the planet’s population slated to continue growing, future challenges include not just shifting toward healthier and less-energy-intensive diets, but also simply providing more food. Greenberg said the aquaculture industry has the potential to meet this need and could reduce current problems of near-shore pollution by co-locating its pens at offshore wind farms.
Greenberg, whose talk mirrored the title of his latest book, “The Omega Principle: Seafood and the Quest for Long Life and a Healthier Planet,” spoke at Harvard’s Science Center on Tuesday afternoon. His lecture was followed by a discussion with three Harvard nutrition and seafood experts: Professor of Epidemiology and Nutrition Walter Willett, Assistant Professor of Nutrition and Planetary Health Christopher Golden, and Assistant Professor of Medicine Susan Korrick. The event was presented by the Harvard University Center for the Environment.
Their discussion, Willett said, goes to the heart of what will be one of the major challenges facing humanity over the next century: feeding a growing population in a sustainable way.
While omega-3 fatty acids are an important nutritional component of seafood, Willett said questions remain as to how much is needed for optimum health.
Studies have shown that omega-3 supplements can reduce cardiovascular risk among those whose diets contain little fish, he said, but it’s likely that the effect plateaus and adding more beyond that to the diet would be of little use. At high enough doses, he said, some nutrients become toxic. Salt, for example, is an essential nutrient, but Willett said it’s likely we’ve gone beyond the plateau of beneficial effects and typically eat too much.
“Like many nutrients, there’s not a linear relationship between intake and how healthy we are,” Willett said. “There are many things that are essential, but we don’t need more.”
Resource-poor parts of the world are facing a somewhat different scenario, Golden said. In places where locally caught fish provide important nutrients in the diet, fish populations are expected to shift and body sizes to decline due to climate change, portending difficult times. Fish provide not just calories and protein for more than a billion people, but also micronutrients that are absent from the tubers and grains that are likely to take their place in the diet.
“It’s very troubling, in my opinion, to look at these types of statistics,” Golden said.
In addition to nutrients, seafood also contains pollutants that concentrate as they make their way up the food chain, Korrick said. Those pollutants, such as mercury, have long been known to be a risk of marine foods, but the use of fish meal to feed land animals like pigs and chickens makes that a problem for the terrestrial food chain as well.
“If there’s the political will and the political interest to really rethink and reimagine our food supply, considering contamination is a critical piece of that process,” Korrick said.
Despite the many challenges, Greenberg contended that increasing seafood consumption, heightening use of aquaculture, and shifting toward less-energy-intensive foods point the way toward a sustainable future.
“We could end up with a planet that is more balanced and perhaps a human body that’s more balanced,” he concluded.
By Alexander Gelfand; illustration by Eric Nyquist
Much hope and plenty of money are riding on the idea that battery-powered electric cars will help slow global warming by reducing tailpipe emissions. But when it comes to reducing the greenhouse gases produced by heavy transportation—namely, the trucks, planes, trains, and ships that move large volumes of goods and people long distances—humanity’s best bet might lie with overweight algae. And staving off the climate apocalypse could be just the beginning.
That’s the premise underlying a decadelong joint effort to develop algae biofuel by ExxonMobil and Synthetic Genomics Inc. (SGI), a private biotech company cofounded by genomics pioneer Craig Venter and Nobel Laureate Hamilton Smith, together with writer and life sciences investor Juan Enriquez (MBA 1986). And it may soon come to fruition: Last March, in the wake of a scientific breakthrough by SGI, the two companies committed to producing 10,000 barrels of algae biofuel a day by 2025.
According to Enriquez, who directed HBS’s Life Sciences Project prior to his current role as managing director of Excel Venture Management, algae biofuels were once the darlings of the alternative energy sector. That’s because the aquatic microorganisms use sunlight, water, and carbon dioxide to photosynthesize sugar, proteins, and fat—the latter in the form of an oil that can replace fossil fuels in applications where batteries either can’t store enough power or are simply too heavy to lug around, like commercial aviation and maritime shipping.
In addition, algae can grow in salty or brackish water under extremely harsh conditions; so unlike other biofuel feedstocks such as corn and soy, algae don’t need to compete with agricultural crops for fresh water and arable land. And the oil they produce is free from the pollutants that must be removed from fossil crude.
As a result, algae could pull fossil-fuel generated CO2 out of the atmosphere and transform it into nearly carbon-neutral diesel or jet fuel with minimal environmental impact—a handy trick when demand for transportation energy is on the rise, and the need to manage global emissions grows ever more urgent.
The prospect of a clean energy source that could serve double duty as a carbon-capture technology has proven irresistible to investors, who have sunk hundreds of millions of dollars into dozens of algae biofuel startups. Unfortunately, says HBS Senior Fellow Joseph Lassiter, whose research focuses on developing carbon-neutral energy supplies, efforts to produce algal crude cheaply and efficiently have met with nothing but failure. The reason: basic biology.
One can easily persuade algae to produce more oil by starving them of nutrients like nitrogen, prompting the single-celled organisms to bulk up on fat like bears preparing for winter. Alas, just like bears, the microscopic butterballs eventually go into hibernation. And once that happens, they stop growing, negating the gains made in oil production.
SGI solved that biological catch-22 by genetically engineering algae to get fat without going comatose. As a result, “You can take the brakes off oil formation without putting the brakes on growth,” says SGI’s CEO, Oliver Fetzer.
In a study published in Nature Biotechnology in 2017, SGI researchers analyzed the genome and metabolism of the marine alga Nannochloropsis gaditana and uncovered a group of genes responsible for regulating oil production. By tweaking one of those genes with the powerful editing tool known as CRISPR, the team ultimately doubled the amount of oil produced by the algae without significantly hindering their growth.
SGI’s breakthrough finally provides a line of sight to a scalable algae biofuel. The company is already growing algae in outdoor ponds at a test facility near California’s Salton Sea, and Fetzer envisions a day when large pools of algae will be located wherever saltwater and consistently warm temperatures are to be found. The ponds could even be parked next to heavy CO2 emitters like cement factories and power plants so that the organisms can suck up excess carbon while churning out clean, renewable biocrude.
With the problem of boosting oil production cracked, engineering algae to make petrochemicals ranging from fertilizers to plastics ought to be relatively straightforward. What’s more, the knowledge gained from the biofuels project should eventually permit researchers to turn algae into microscopic factories for the manufacture of virtually any organic compound, leading to what Enriquez describes as a full-blown algal revolution. “You can make vaccines in the stuff, you can make medicines in the stuff, you can make food in the stuff,” he says.
Capitalizing on its burgeoning algal expertise, SGI has already bred one strain that can make high-quality protein and healthful fatty acids, and it hopes to coax others into producing biological drugs such as the antibodies used to treat cancer and autoimmune diseases.
Right now, however, the biofuel breakthrough is generating the most buzz—and for good reason, given the looming possibility of catastrophic climate change and the desperate need for fossil-fuel substitutes.
Fetzer hopes to have a pilot facility up and running by 2025 that can meet ExxonMobil’s production goals while sucking CO2 from a heavy polluter. He readily admits that challenges remain—like figuring out how best to extract the algae from their ponds and expel their oil—but the goal of producing algae biofuel that can compete with traditional diesel is finally within reach.
By Leah Burrows, SEAS Communications; Photo by chuttersnap on Unsplash
One of the key misconceptions about solar geoengineering — putting aerosols into the atmosphere to reflect sunlight and reduce global warming — is that it could be used as a fix-all to reverse global warming trends and bring temperature back to pre-industrial levels.
It can’t. Applying huge doses of solar geoengineering to offset all warming from rising atmospheric CO2 levels could worsen the climate problem — particularly rainfall patterns — in certain regions. But could smaller doses work in tandem with emission cuts to lower the risks of a changing climate?
New research from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), in collaboration with MIT and Princeton University, finds that if solar geoengineering is used to cut global temperature increases in half, there could be worldwide benefits without exacerbating change in any large geographic area.
Finding the right “dose” for solar geoengineering
“Some of the problems identified in earlier studies where solar geo-engineering offset all warming are examples of the old adage that the dose makes the poison,” said David Keith, the Gordon McKay Professor of Applied Physics at SEAS and senior author of the study. “This study takes a big step towards using climate variables most relevant for human impacts and finds that no IPCC-defined region is made worse off in any of the major climate impact indicators. Big uncertainties remain, but climate models suggest that geoengineering could enable surprisingly uniform benefits.”
The research is published in Nature Climate Change.
To better understand what regions could experience worse climatic conditions if solar geoengineering were combined with emissions cuts, the researchers used a state-of-the-art high-resolution model to simulate extreme rainfall and tropical cyclones (a.k.a. hurricanes). It’s the first time such a model has been used to study the potential impact of solar geoengineering.
Researchers looked at temperature and precipitation extremes, water availability, and a measure of the intensity of tropical storms. They found that halving warming with solar geoengineering not only cools the planet everywhere but also moderates changes in water availability and extreme precipitation in many places and offsets more than 85 percent of the increase in the intensity of hurricanes.
Less than 0.5 percent of the land would see the effects of climate change exacerbated, according to the model.
“The places where solar geoengineering exacerbates climate change were those that saw the least climate change to begin with,” said Peter Irvine, Postdoctoral Research Fellow at SEAS and lead author of the study. “Previous work had assumed that solar geo-engineering would inevitably lead to winners and losers with some regions suffering greater harms; our work challenges this assumption. We find a large reduction in climate risk overall without significantly greater risks to any region.”
The researchers are quick to point out that this is a simplified experiment, which assumed doubled CO2 concentrations and represented solar geo-engineering by turning down the sun. However, it is a first step towards understanding how solar geoengineering could be used in tandem with other tools to mitigate some of the worst impacts of climate change.
“For years, geoengineering has focused on compensating for greenhouse gas induced warming without worrying too much about other quantities like rainfall and storms,” said Kerry Emanuel, the Cecil & Ida Green Professor of Atmospheric Science at MIT and co-author of the study. “This study shows that a more modest engineered reduction in global warming can lead to better outcomes for the climate as a whole.”
“The analogy is not perfect but solar geoengineering is a little like a drug which treats high blood pressure,” said Irvine. “An overdose would be harmful, but a well-chosen dose could reduce your risks. Of course, it’s better to not have high blood pressure in the first place but once you have it, along with making healthier lifestyle choices, it’s worth considering treatments that could lower your risks.”
This research was co-authored by Jie He, Larry W. Horowitz, and Gabriel Vecchi.
By Jill Radsken, Harvard Staff Writer
Bruno Carvalho has published research on topics ranging from environmental justice and race to city planning and literature. His award-winning “Porous City: A Cultural History of Rio de Janeiro” made the case for his native city as a place of cultural history defined by porous spaces and structural inequalities. Carvalho earned his Ph.D. in Romance Languages and Literatures at the Graduate School of Arts and Sciences in 2009. He is co-editor of the book series “Lateral Exchanges,” about architecture and urbanism in a global context.
GAZETTE: Can you talk about your research?
CARVALHO: My research is a bit wide-ranging, but it broadly focuses on cities as lived and imagined spaces, especially in Brazil. I’m beginning work on a cultural history of futures. We can think of much of modernity in terms of competing visions of what the future ought to be like. In the 1920s, for example, some vied for car-centric and highly segregated cities, others for mixed-race, multicultural utopias. In contrast, today, with the realities of climate change and labor precarity setting in, it often seems as if a dreadful future is inevitable. Urban visions have become in many ways more modest and contingent. Contemporary urbanism absorbed important lessons from the failures of top-down, authoritarian, modernist projects, and though that’s of course a good thing, the daunting scale of our challenges demands that we conceive transformations with imagination and ambition.
Reflecting on how the future was conceived in the past can help to expand the realm of possibilities. Most of my research brings together perspectives from the social sciences, design, and cultural materials. Art, literature, film, and historical knowledge can all push us to confront entrenched intuitions and stretch the limits of the thinkable. We shouldn’t assume that cities are doomed to the levels of segregation so common in the United States today, nor that this will solve itself. The study of the past can act as an antidote to a type of conformity that our dire problems sometimes produce.
A lot of people already get that a certain status quo is untenable, whether it’s fossil-fuel dependency, hyper-concentration of wealth, or the war on drugs. But as I like to remind my students, even when the scale of needed changes seems unviable, unforeseen social, political, and technological transformations are the norm.
I also have a body of research on the 18th century, which includes publications on topics like the emergence of anti-black racism in scientific thought, and on how the circulation of French translations of U.S. constitutional documents played a role in failed independence movements in Brazil. The Enlightenment sometimes appears to nonspecialists as this pivotal, but sort of flattened phenomenon — in some circles tied to progress and freedom, in others to Eurocentrism and exploitation. Rethinking the Enlightenment from the perspective of Brazil helps to foreground some of its tensions and contradictions, and allows us to trace the formation of modern ideas about sovereignty and individual liberty, as well as about race and white supremacy, cities as harbingers of civilization, and nature as a resource rather than something to which we belong.
GAZETTE: Last fall, you taught a seminar called “Writing and Urban Life”; this spring you’ll teach a graduate seminar called “Imagine Futures,” and you have a Gen Ed course debuting in two years. What are they about?
CARVALHO: The fall seminar was a great welcome to Harvard. It brought together a wonderful group of graduate and undergraduate students. Writing and urbanization have entangled histories, and have become central to the very ways in which we constitute subjecthood in the modern world. Both are relatively recent developments in our history as a species. We discussed some very large questions and reviewed canonical debates, but we also concentrated on a set of authors from the past century or so, mostly from Brazil, who in various ways push us to “denaturalize” a lot of what we tend to take for granted about urban life. Much of the writing we analyzed is attuned to the strangeness in familiar modes of being, as well as to the perils, promises, and potentials of this massive experiment in which urbanites are now engaged: living closely in density among strangers. That’s just not how most humans before us did it.
This spring’s seminar will look at how past ideas of what the future should look like have helped to shape cities in all sorts of material ways — say, for example, associations between order and progress with geometric patterns like the grid. We’ll also try to recover ideas that largely lost out but can resonate today, such as challenges to human exceptionalism that contemplate the place of other life forms in the worlds we’ve built. We will discuss how in the history of planning, the unplanned and even the improbable happen often, and how the future is, by definition, out of reach and, therefore, always in a way imagined. We will focus on Brazil, the country; like the Americas as a whole, it’s a fertile space to generate these reflections because so many futures were projected on it throughout colonial and modern history. Brazil has been alternatively conceived as Edenic and dystopian. We’ll focus on historical turning points in urbanization and culture and try to understand their specificities, but we won’t lose sight of our current predicaments. After all, our collective planetary futures are very much at stake in regions like the Amazon, which is now a contested site for different visions of what the world ought to be like.
Next academic year when Neil Brenner, professor of urban theory at GSD, is back from sabbatical, we will co-teach a Gen Ed course called “Living in an Urban Planet.” So even though it’s already a cliché to say that more than half of the world population lives in cities, we actually tend to underestimate how much of the planet urbanization encompasses. If we think of energy systems and refuse, for example, or even the circulation of urban cultural production, where do our cities end? In this course we will discuss urban transformations at various scales, from the planetary to the sidewalk. It’s been stimulating to work with Neil on this. We share a number of interests, and tend to approach related questions in very different but complementary ways.
GAZETTE: You are leading an effort to create a secondary field in urban studies, something you were involved in at Princeton. Is it meant to be a cross-disciplinary effort, and what kinds of conversations do you hope will come from it?
CARVALHO: My dream is to build a program in urban studies like our cities at their best: places of intellectual exploration, encounters with difference, lively exchanges. Urban experiences, much like a liberal arts education, can expose us to multiple ways of being and belonging in the world. They can move us to step outside of ourselves, to inhabit multiple perspectives, to exceed our assigned roles. Because there are already so many wonderful urban-related courses, and because there is no formal urban studies curriculum outside of the professional schools, we have the opportunity to build something really special. An urban studies curriculum can bring together students and faculty with mutual interests, but whose paths might not cross otherwise. At Princeton, we built a thriving program, and saw how it had transformative potential, especially for undergrads. Urban studies can introduce students to very basic facts about the world around them that they might not otherwise learn, like the role of housing segregation in the U.S. wealth gap. It can introduce issues like inequality in resonant ways. Urban studies presents opportunities for poets and engineers to discuss different standards for value, or for anthropologists and computer scientists to rigorously debate the blind spots and uses of big data or GIS.
An institutional space around the urban could help us to break down siloes, building links across disciplinary and geographic boundaries. Neil, Eve Blau (GSD), and I are working with several colleagues on addressing some of these issues by reviving the Harvard Mellon Urban Initiative, which Eve and Julie Buckler, Samuel Hazzard Cross Professor of Slavic Languages and Literatures and of Comparative Literature, created as part of a grant funded by the Mellon Foundation.
GAZETTE: There has been so much challenging news coming out of Brazil, from the devastating National Museum fire to the recent presidential election. What are your thoughts about the cultural future of your homeland?
CARVALHO: Brazil’s cultural landscapes are full of dynamism and utopian yearnings that have worked to destabilize structures of inequality, broadening the horizons of possibility. As elsewhere, we have recently seen extremist political movements take advantage of a very understandable sense of disillusionment and frustrations with futures that never arrived. Early in Brazil’s election, when not many expected surprises, I wrote a long essay on the appeal of politicians positioning themselves as anti-establishment, promising a return to a fantasy-based past, and of groups that have turned digital tools like YouTube and WhatsApp into engines for far-right radicalization and for the spread of misinformation. I think we cannot underestimate the grave threats to the environment, to a free press, to research and education and to vulnerable populations in Brazil, including indigenous groups. But there are many people fighting for democracy too.
The least important thing is setting ourselves up to say “I told you so.” We have to continue standing up for evidence-based approaches to our problems, but that won’t be enough. We also need to nurture alternative, inclusive visions for the future. One person who did that brilliantly was Marielle Franco, a young, Afro-Brazilian native of a Rio de Janeiro favela who was elected to the city council and was assassinated last year. Sidney Chalhoub (professor of history and African and African American studies) and I are planning an event here at Harvard with feminist leaders and former colleagues to celebrate her legacy. We do not yet know for certain who was behind her murder, but we know that the last electoral cycle empowered some individuals who mocked or made light of her death.
There are also renewed threats to the Amazon in growing deforestation and attacks on indigenous people. Brian Farrell, director of the David Rockefeller Center for Latin American Studies, Monique and Philip Lehner Professor for the Study of Latin America, and professor of biology; postdoc Bruno de Medeiros; and I are collaborating on a conference called “Amazonia and Our Planetary Futures.” We are assembling specialists from government, the private sector (including biodiversity economies), scientists, and indigenous leaders. It’s all hands on deck to avert catastrophe and create better futures!
To ask when we started looking at mountains is by no means the same as asking when we started to see them. Rather, it is to question what sorts of aesthetic and moral responses, what kinds of creative and reflective impulses, our newfound regard for them prompted. It is evident enough that in a more or less recent geological time frame mountains have always just been there. It is possible that mountains, like the sea, best provide pleasure, visual and otherwise, when experienced from a (safe) physical and psychical distance. But it might also be the case that the pleasures mountains hold in store are of a learned and acquired sort.
Which is also to say that mountains themselves, for all their unforgiving thereness, are themselves the products of unwitnessed Neptunian and Vulcanian tumults or divine judgment. For the late seventeenth-century theologian and cosmogonist Thomas Burnet, mountains were “nothing but great ruins.” A dawning appreciation of these wastelands appeared in the critical writings of John Dennis. Satirized as “Sir Tremendous Longinus” for his rehabilitation of the antique aesthetic category of the sublime, Dennis expressed the complex concept of “delightful horror.” Mountain gloom was ready to become mixed with mountain glory. More work was still to be done on the literary and philosophical front before the Romantic breakthrough, one high vantage point being the essayist Joseph Addison’s dream of finding himself in the Alps, “astonished at the discovery of such a Paradise amidst the wildness of those cold hoary landscapes.”
But a kindred innovation in seeing and feeling was called for in the formation of mountains and the rise of landscape. Mountains, among other earth forms, are both the medium and outcome of still-evolving habits of experiencing, making, and imagining. Architects and landscape architects, mutually occupied with the horizontal surface, have had a touch as searching as that of mountaineers and poets in sensing the terrain. Mountains and the Rise of Landscape is the culmination of a curatorial project and a research seminar conducted at the Graduate School of Design, the latter focusing on the question, How do you model a mountain? The installation in the Druker Design Gallery and continuing in the Frances Loeb Library collects diverse objects and scientific instruments, drawings, photographs, and motion pictures of built and imagined projects and presents invitingly challenging modes of seeing (and hearing!) mountains of varied definition. Allied with the work of artists, visionaries, and interpreters of natural and cultural meaning, they propose new and foregone possibilities of perception and form-making in the acts of leveling and grading, cutting and filling, shaping and contouring, mapping and modeling, of reimagining “matter out of place,” and finally of stacking the odds and mounting the possibilities.
Mountains and the Rise of Landscape offers five thematic sections:
Mountain Lines of Beauty: Mild mountaineers John Ruskin (1819–1900) and Eugène Viollet-le-Duc (1814–1879) were among the first theorists, designers, and architects to teach us that the “line” formed by crests, peaks, and ridges presents an exemplary form of beauty. The invention of the panorama and the photomechanical reproduction of mountain views transformed a geophysical phenomenon into an object of aesthetic value and topographical knowledge. Guidebooks, geographical manuals, and maps glorified specific ranges by showing their most beautiful contours. To define a single mountain or group of mountains as a “line,” however, implies a process of abstraction. This process is both enhanced and complicated by contemporary tools such as CAD, GIS, and GPS. To draw the most important mountain ranges of our contemporary world as “mountain lines of beauty”—the phrase is evidently inspired by the eighteenth-century painter William Hogarth’s analysis of the serpentine, S-shaped “line of beauty”—should not be seen as a simplified way to represent them. Rather, the difficulty of representation itself becomes visible in the constructedness of the lines. Through them the possibility arises, again, of our being surprised by the sublimity, as well as the beauty, of the mountains.
Artificial mountains are a worldwide phenomenon. Burial sites, such as Etruscan tumuli, were often marked by the intimidating form of the man-made mountain. Incense burners in ancient China evoked the Five Sacred Mountains. A representation of Mount Parnassus was a significant element of European gardens and a symbol of Renaissance humanism. Artificial mounds, typically composed of locally excavated material, may be seen as so many milestones in the history of landscape architecture. The industrial revolution accelerated the rise of an anthropic topography, producing landforms that we often no longer recognize as being artificial. Mountains are ubiquitous in twentieth-century and contemporary art, with a special place—between site and non-site—reserved for the explorations of Robert Smithson, who reversed, displaced, and rebuilt the form, material, and meaning of mountains.
Camouflage: Among the first who climbed high mountains in antiquity were members of the military, in search of an advantageously elevated view of their enemy’s position. From the seventeenth century onward, many mountainous regions were massively fortified, with military infrastructures placed strategically to take advantage of their secluded impregnability. The photographer Leo Fabrizio has documented traces of former military constructions hidden in the most remote areas of Alpine Switzerland. His visual archeology of camouflage techniques employed by the Swiss military exposes the unfamiliar territory of a landscape that still appears “natural” while being completely transformed from within.
Glaciers are in retreat throughout the world. Celebrated and studied during the eighteenth century as sublime objects—sung of by poets and depicted by landscape painters—glaciers register today as metonymies of global climate change and vanishing natural and scenic phenomena. Geneva-based composers Olga Kokcharova and Gianluca Ruggeri have explored the fascinating soundscape of the Mont Miné Glacier in the Swiss canton of Valais. Since 2000, the 4.9-mile-long glacier has lost about eighty-five feet per year. To hear the “voice” of a glacier compellingly questions the visual bias of the landscape-oriented perspective. The mysterious sounds of the white masses bear melancholy aural testimony to the progressive disappearance of a titanic natural feature.
Inhabitants of the Alpine regions have practiced transhumance for centuries, droving livestock between the valleys in winter and the high mountain pastures in summer. Many of the wooden or stone structures built by farmers to shelter their cattle and themselves have been abandoned, ruined, and in some instances transformed into chalets. Martino Pedrozzi, a Ticino-based architect, has worked for a decade in the remote valleys of southern Switzerland. His Recompositions, carried out with his students at the Mendrisio Academy of Architecture and other volunteers, consist in repairing the existing structures or in composing a new object from the abandoned material. The resulting architectural objects are designedly functionless; they are poetic metaphors and visual documents of a past that is at risk of disappearance.
By Peter Reuell, Harvard Staff Writer
In the coming decades, cities and towns up and down the eastern seaboard will have to come to terms with the impact of rising sea levels due to climate change. A new study, however, suggests that rising sea levels may be only part of the picture — because the land along the coast is also sinking.
That’s the key finding of Professor of Earth and Planetary Sciences Peter Huybers, Frank B. Baird Jr. Professor of Science Jerry Mitrovica, and Christopher Piecuch, an assistant scientist at the Woods Hole Oceanographic Institution, who used everything from tide gauges to GPS data to paint the most accurate picture ever of sea-level rise along the east coast of the U.S. The researchers are co-authors of the study, recently published in Nature.
“What we are seeing at a large scale, and this was a surprise to me, is a very clear pattern that you would expect if the response to the last ice age were the primary control on the differential rates of sea-level rise across the eastern U.S.,” said Huybers. In other words, between 20,000 and 95,000 years ago, the Laurentide Ice Sheet, which covered most of northern North America, levered the land upwards. “Now, thousands of years after the ice is gone,” Huybers said, “the mid-Atlantic crust is still subsiding.
“In New England, there is not too much additional sea-level rise from land motion because it’s near the hinge point,” he continued. “The bulge caused by the ice sheet was centered on the mid-Atlantic, and because it’s still settling down, the relative rise of sea level in the mid-Atlantic is about twice the global average.”
What that means, Huybers said, is that we need to prepare for greater rates of relative sea-level rise along the mid-Atlantic because of the combined effects of the natural subsidence of the land and human-caused rises in sea level.
“The fact that the mid-Atlantic is subsiding because of long-term geologic processes means that it will continue for centuries and millennia, in addition to whatever other changes in sea level occur,” Huybers said. “The mid-Atlantic is already having to cope with routine coastal flooding, and this problem is only going to get worse with time.”
Developing estimates of how much various factors contribute to sea-level rise, however, is easier said than done.
“Sea level is a noisy place,” said Mitrovica. “Tides go up and down, waves crash, there is ice melting, ocean circulation changes, the warming of the ocean. … If you want to understand sea level in its totality, you need to know what all those factors are doing.”
One of the first researchers to attempt that feat, he said, was Carling Hay, a former postdoctoral fellow in Mitrovica’s lab and now an assistant professor at Boston College.
In 2014, while at Harvard, Hay published a groundbreaking study that used advanced statistical techniques to sift through dozens of data sets and factors influencing sea-level rise. She came to the surprising conclusion that during the 20th century, sea levels had risen more slowly than many had estimated.
“Unfortunately, what that means is that if sea levels weren’t rising as fast as we thought in the 20th century, they have been going up significantly faster than we thought over the last 20 years,” Mitrovica said. “That was a real demonstration of the power of statistical work in a field where it had not been very common.”
With the new study, Mitrovica said, Piecuch took that idea and ran with it. But rather than trying to estimate worldwide sea level rise over the past century, he chose to home in on one particular region over a shorter time period.
“So he can use all sorts of data sets,” Mitrovica said. “He can use GPS, which tells you how the land is moving, but he’s also got sea-level data going back several thousand years, tide gauges, and other data. He throws all that into the stew … and asks where the east coast is going and what’s contributing to that change. What Chris has done is solve this long-standing, 30-year problem.”
But the work, Mitrovica pointed out, is part of a trajectory. “The next thing that’s going to happen is we will be able to bring in satellite data and we can step back and look at this globally,” he said. “And I think for the first time we may be able to separate out the various contributors to sea-level rise.”
“There is a rather confusing montage of possibilities for why sea level could be changing,” Huybers added. “What Chris has done is to pull together disparate information that was distributed across a number of locations and different time intervals and put it together in a fully probabilistic way, allowing for better estimates of historical rates of sea-level change and how the ongoing response to the last ice age will contribute to future changes.”
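The idea of putting disparate, noisy observations “together in a fully probabilistic way” can be loosely illustrated with a toy calculation. The sketch below is not the study’s actual method (which is a far more elaborate statistical model); it only shows inverse-variance weighting, one standard way to merge independent noisy estimates of the same quantity. All the numbers are made up for illustration.

```python
# Toy sketch: combining independent, noisy estimates of a sea-level-rise
# rate by inverse-variance weighting. NOT the method used in the study;
# the input numbers below are hypothetical.
def combine(estimates):
    """estimates: list of (value_mm_per_yr, std_error) tuples.

    Returns the precision-weighted mean and its standard error.
    More precise measurements (smaller std_error) get more weight.
    """
    weights = [1.0 / (se ** 2) for _, se in estimates]
    total = sum(weights)
    mean = sum(w * v for (v, _), w in zip(estimates, weights)) / total
    std_error = (1.0 / total) ** 0.5
    return mean, std_error

# Hypothetical inputs: a tide-gauge rate, a GPS-derived rate, a proxy record.
obs = [(3.1, 0.5), (2.8, 0.3), (3.4, 0.9)]
mean, se = combine(obs)
print(f"combined rate: {mean:.2f} +/- {se:.2f} mm/yr")
```

Note the payoff of combining sources: the merged uncertainty is smaller than that of even the best single input, which is the statistical motivation for throwing “all that into the stew.”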
Image source: AP
By Clea Simon, Harvard Correspondent
The threat of climate change is dire, but Hal Harvey sees a path forward.
In “Getting to Zero on Climate Change,” a stirring presentation recently at the Harvard University Center for the Environment, Harvey, the CEO of Energy Innovation Policy and Technology in San Francisco, stressed both the urgency of the problem and specific steps that could, he said, make the difference between accelerating toward destruction and innovating toward prosperity.
Citing “a massive problem with horrifying dimensions,” Harvey, the co-author of “Designing Climate Solutions: A Policy Guide for Low-Carbon Energy,” said that today, “extremes have become the norm.” From drought and wildfires to unprecedented floods and cold snaps, he detailed the effects we have already begun to experience. More terrifying is how close we are to climate tipping points, such as the thawing of the tundra — what used to be known as “permafrost” — which would release massive amounts of methane and carbon.
“After a certain point you unleash natural systems and there’s no going back,” he said.
But Harvey also laid out a series of steps that could counteract the problem, including policy recommendations that could be effected by concerned citizens.
Rejecting small-scale, feel-good campaigns — “We shouldn’t have a strategy about plastic straws,” he said — Harvey broke down the problem into four major sectors that contribute the most to climate change: energy and the electric grid, transportation, buildings, and industry. Again pushing for efficacy, he suggested moving for change in the top 20 countries that contribute to climate change, specifically global heavyweights such as the U.S., China, E.U., India, and Russia. He then isolated specific policy changes here that could make a difference for our country and, ultimately, for the globe.
One is the electrical grid. Harvey noted that green energy sources such as solar and wind are already becoming more cost-effective. In fact, with new technologies, such as larger wind turbines and turbines that can float and thus be placed farther from land, off-shore wind power is on the verge of becoming a major industry.
However, these sources are intermittent, leaving many green-energy proponents focused on expensive and, thus far inefficient, battery technology. Instead, Harvey suggested that smarter and more flexible grids can enable municipalities to share resources, leveling out supply and demand. He said demand can also be managed, for instance by cooling skyscrapers in advance of extreme weather, thus using less energy during peak demand times.
He also supported the wider use of renewable portfolio standards that would reward those who invest in these green power sources.
Turning to transportation, Harvey applauded but dismissed the electric vehicle movement, which he said has too small a share of the market to make a difference. For vehicles already on the road, he suggested increasing incentives such as tax rebates and nonfinancial rewards, such as free parking. For the billions of cars that will be built in coming years, however, Harvey argued for super fuel-efficiency, pointing out that in addition to changes in engine technology, efficiency can be increased by making vehicles lighter and less wind-resistant.
For buildings, Harvey said low-emission windows, coated with an invisible metallic layer, already greatly decrease demands for heating and cooling. He called for stronger building codes, like California’s, that focus on annual percentage gains in efficiency. Such continuous and expected progress does not need to be revisited legislatively, he noted, and creates a stable environment that lets businesses plan for future construction. As a corollary, Harvey also called for stronger appliance efficiency standards, a trend that has already proved popular with consumers.
Harvey said industry can also take steps to reduce waste. New 3-D technologies are already helping, as they define the specific components of building projects, eliminating the waste of concrete and other materials. In industrial engines, variable speed settings that use smart technologies to adjust automatically save on power as well as costs.
Harvey pointed out that “most of the money [to make these changes] is there already.” While many climate change activists spend time trying to raise funds to help emerging countries, Harvey said the “world already spends about $5 trillion a year on energy and another $6 trillion for infrastructure setting up consumption.” Reallocating these resources, rather than battling for new ones, is an achievable goal, he said. To effect the change, concerned citizens need only find out who is really in charge. Public utility commissions, for example, often have more practical impact than legislative bodies and regularly hold open meetings.
“Do the triage,” Harvey said. “Understand which policies make a difference and pay attention to who makes the decisions.
“With a modest amount of work, a few tens of hours, you can become a player.”
Image by: Jon Chase/Harvard Staff Photographer
By Neil Irwin
By now, it’s clear that climate change poses environmental risks beyond anything seen in the modern age. But we’re only starting to come to grips with the potential economic effects.
Using increasingly sophisticated modeling, researchers are calculating how each tenth of a degree of global warming is likely to play out in economic terms. Their projections carry large bands of uncertainty, because of the vagaries of human behavior and the remaining questions about how quickly the planet will respond to the buildup of greenhouse gases.
A government report in November raised the prospect that a warmer planet could mean a big hit to G.D.P. in the coming decades.
And on Thursday, some of the world’s most influential economists called for a tax on carbon emissions in the United States, saying climate change demands “immediate national action.” The last four people to lead the Federal Reserve, 15 former leaders of the White House Council of Economic Advisers, and 27 Nobel laureates signed a letter endorsing a gradually rising carbon tax whose proceeds would be distributed to consumers as “carbon dividends.”
The Trump administration has long rejected prescriptions like a carbon tax. But policy debates aside, many of the central economic questions of the decades ahead are, at their core, going to be climate questions. These are some of the big ones.
How permanent will the costs be?
When we think about the economic damage from a hotter planet, it’s important to remember that not all costs are equivalent, even when the dollar values are similar. There is a big difference between costs that are high but manageable versus those that might come with catastrophic events like food shortages and mass refugee crises.
Consider three possible ways that climate change could exact an economic cost: farmland that becomes less fertile, cutting crop yields; roads destroyed by storms and floods that must be rebuilt; and a power grid that must be upgraded to withstand extreme weather.
The farmland’s yield decline is a permanent loss of the economy’s productive capacity — society is that much poorer, for the indefinite future. It’s worse than what happens in a typical economic downturn. Usually when factories sit idle during a recession, there is a reasonable expectation that they will start cranking again once the economy returns to health.
The road rebuilding might be expensive, but at least that money is going to pay people and businesses to do their work. The cost for society over all is that the resources that go to rebuilding the road are not available for something else that might be more valuable. That’s a setback, but it’s not a permanent reduction in economic potential like the less fertile farmland. And in a recession, it might even be a net positive, under the same logic that fiscal stimulus can be beneficial in a downturn.
By contrast, new investment in the power grid could yield long-term benefits in energy efficiency and greater reliability.
There’s some parallel with military spending. In the 1950s and ’60s, during the Cold War, the United States spent more than 10 percent of G.D.P. on national defense (it’s now below 4 percent).
Most of that spending crowded out other forms of economic activity; many houses and cars and washing machines weren’t made because of the resources that instead went to making tanks, bombs and fighter jets. But some of that spending also created long-term benefits for society, like the innovations that led to the internet and to reliable commercial jet aircraft travel.
Certain types of efforts to reduce carbon emissions or adapt to climate impacts are likely to generate similar benefits, says Nicholas Stern, chair of the Grantham Research Institute on Climate Change and the Environment at the London School of Economics.
“You couldn’t provide sea defenses at large scale without very heavy investment, but it’s not investment of the kind that you get from the things that breed technological progress,” Mr. Stern said. “The defensive adaptations don’t carry anything like the dynamism that comes from different ways of doing things.”
There is more fertile ground in areas like transportation and infrastructure, he said. Electric cars, instead of those with internal combustion engines, would mean less air pollution in cities, for example.
How should we value the future compared with the present?
Seeking a baseline to devise environmental regulations, the Obama administration set out to calculate a “social cost of carbon,” the amount of harm each new ton of carbon emissions will cause in decades ahead.
At the core of the project were sophisticated efforts to model how a hotter earth will affect thousands of different places. That’s necessary because a low-lying region that already has many hot days a year is likely to face bigger problems, sooner, than a higher-altitude location that currently has a temperate climate.
Michael Greenstone, who is now director of the Becker Friedman Institute at the University of Chicago and of the Energy Policy Institute there, as well as a contributor to The Upshot, was part of those efforts.
“We’ve divided the world into 25,000 regions and married that with very precise geographic predictions on how the local climate will change,” Mr. Greenstone said. “Just having the raw computing power to be able to analyze this at a more disaggregated level is a big part of it.”
But even once you have an estimate of the cost of a hotter climate in future decades, some seemingly small assumptions can drastically alter the social cost of carbon today.
Finance uses something called the discount rate to compare future value with present value. What would the promise of a $1,000 payment 10 years from now be worth to you today? Certainly something less than $1,000 — but how much less would depend on what rate you use.
Likewise, the cost of carbon emissions varies greatly depending on how you value the well-being of people in future decades — many not born yet, and who may benefit from technologies and wealth we cannot imagine — versus our well-being today.
The magic of compounding means that the exact rate matters a great deal when looking at things far in the future. It’s essentially the inverse of observing that a $1,000 investment that compounds at 3 percent a year will be worth about $4,400 in 50 years, whereas one that grows 7 percent per year will be worth more than $29,000.
In the Obama administration’s analysis, using a 5 percent discount rate — which would put comparatively little weight on the well-being of future generations — would imply a social cost of $12 (in 2007 dollars) for emitting one metric ton of carbon dioxide. A metric ton is about what would be released as a car burns 113 gallons of gasoline. A 2.5 percent rate would imply a cost of $62, which adds up to hundreds of billions of dollars a year in society-wide costs at recent rates of emissions.
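The arithmetic behind these figures is ordinary compound discounting: a future cost is divided by (1 + rate) raised to the number of years. A minimal sketch (the $100 damage figure is an illustrative placeholder, not a number from the administration’s analysis):

```python
# Illustrative compound discounting: how the choice of discount rate
# reshapes the present value of a future climate cost. The $100 figure
# is a made-up example, not an official damage estimate.
def present_value(future_cost, rate, years):
    """Discount a cost incurred `years` from now back to today's dollars."""
    return future_cost / (1 + rate) ** years

future_damage = 100.0  # hypothetical damage, in dollars, 50 years from now
for rate in (0.025, 0.03, 0.05, 0.07):
    pv = present_value(future_damage, rate, 50)
    print(f"{rate:.1%} discount rate -> ${pv:.2f} today")
```

Running this shows why the rate matters so much: at 2.5 percent the present value is several times what it is at 5 percent, and at 7 percent costs 50 years out shrink to a few cents on the dollar, mirroring the gap between the $62 and $12 carbon-cost estimates above.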
The Obama administration settled on a 3 percent discount rate that put the social cost of carbon at $42 per metric ton. The Trump administration has subsequently revised that estimate to between $1 and $7.
That sharp decrease was achieved in part by measuring only the future economic costs to the United States, not factoring in the rest of the world. And the Trump administration analyzed a discount rate of up to 7 percent — a rate at which even costs far into the future become trivial.
Mr. Greenstone favors substantially lower discount rates, based on evidence that financial markets also place high value on investments that protect against risk.
Understood this way, spending today to reduce carbon emissions tomorrow is like insurance against some of the most costly effects of a hotter planet — and part of the debate is over how much that insurance is really worth, given that the biggest benefits are far in the future.
How might climate change fuel inequality?
When a government report raises the possibility of a 10 percent hit to G.D.P. as a result of a warming climate, it can be easy to picture everyone’s incomes being reduced by a tenth.
In reality there is likely to be enormous variance in the economic impact, depending on where people live and what kind of jobs they have.
Low-lying, flood-prone areas are at particularly high risk of becoming unlivable — or at least uninsurable. Certain industries in certain places will be dealt a huge blow, or cease to exist; many ski slopes will turn out to be too warm for regular snow, and the map of global agriculture will shift.
Adaptation will probably be easier for the affluent than for the poor. Those who can afford to move to an area with more favorable impacts from a warmer climate presumably will.
So the economic implications of climate change include huge shifts in geography, demographics and technology, with each affecting the other.
“To look at things in terms of G.D.P. doesn’t really capture what this means to people’s lives,” said William Nordhaus, a Yale economist who pioneered the models on which modern climate economics is based and who won a Nobel for that work. “If you just look at an average of all the things we experience, some in the marketplace and some not in the marketplace, it’s insufficient. The impact is going to be highly diverse.”
Can we adapt to a warmer climate?
Despite all these risks, it’s important to remember that humanity tends to be remarkably adaptable. A century ago, most people lived without an automobile, a refrigerator, or the possibility of traveling by airplane. A couple of decades before that, almost no one had indoor plumbing.
Changes in how people live, and the technology they use, could both mitigate the impact of climate change and ensure that the costs are less about a pure economic loss and more about rewiring the way civilization works.
Most capital investments last only a decade or two to begin with; people are constantly rebuilding roads, buildings and other infrastructure. And a warmer climate could, if it plays out slowly enough, merely shift where that reinvestment happens.
But a big risk is that the change happens too quickly. Adaptation that might be manageable over a generation could be impossible — and cause mass suffering or death — if it happens over a few years.
Imagine major staple food crops being wiped out for a few consecutive years by drought or other extreme weather. Or a large coastal city wiped out in a single extreme storm.
“Whether it’s jobs, consumption patterns or residential patterns, if things are changing so fast that we can’t adapt to them, that will be very, very costly,” Mr. Nordhaus said. “We know we can adapt to slow changes. Rapid changes are the ones that would be most damaging and painful.”
It’s clear that climate change and its ripple effects are likely to be a defining challenge of the 21st-century economy. But there are wide ranges of possible results that vary based on countless assumptions. We should also recognize that the economic backdrop of society is always changing. Projecting what that will mean for ordinary people is not simply a matter of dollars.
“I’ve spent the last 20 years trying to communicate it and it’s not easy to process,” Joseph Aldy, who teaches at the Harvard Kennedy School, said of the connection between climate change and the economy. “It’s really hard to convey something that is long term and gradual until it’s not.”
By Deborah Blackwell, Arnold Arboretum Communications
Andrew Groover celebrates the complexity of trees and makes it his life’s work to unlock how they adapt to their environments. It’s knowledge that’s critical for the U.S. Forest Service research geneticist — he works in California, where concerns about climate change have grown as wildfires there have increased in frequency and intensity.
A practical problem for Groover, who is a University of California, Davis, adjunct professor of plant biology, is efficient access to the variety of trees he studies. His research requires a ready supply of species diversity, a tall order without laborious travel. But in 2012 his search for the perfect resource brought him to the Arnold Arboretum of Harvard University — a 281-acre living museum holding more than 2,100 woody plant species from around the world.
“Trees are fascinating for biology and research, but one of the greatest challenges in this research is finding trees tractable for study,” Groover said. “If you have a list of a dozen or two different species, where do you get all those? The Arnold Arboretum has all of the species we would ever want to look at, and then some.”
The Arboretum also contains one of the most extensive collections of Asian trees in the world, which Groover said is advantageous to his research. Typically a researcher has to travel to various locations throughout the world, determine whether the trees are on public or private property, obtain permission to study and transport samples, overcome language and other barriers, and potentially return to the same site later to complete research, which can be challenging.
“The Arnold Arboretum plays a crucial role in research and science and educating the public, connecting them with trees and forests. But it’s also a living laboratory and repository of hard-to-source species for research and is renowned for its collection of Asian disjuncts,” he said. “We can actually study these species pairs found in both Asia and the U.S. directly in the Arboretum. We didn’t need to go anywhere else.”
Director of the Arnold Arboretum and Arnold Professor of Organismic and Evolutionary Biology William (Ned) Friedman emphasized the extraordinary efforts that go into creating such a high-impact research destination.
“Importantly, beyond the more than 16,000 accessioned woody plants at the Arnold Arboretum, we have a staff of world-class horticulturists, propagators, IT professionals, curators, and archivists, all of whom are devoted to ensuring that the living collections are what I call a ‘working collection’ of plants,” he said. “The plants of the Arboretum may look great in flower, or at the peak of fall colors, but these plants are here primarily to be studied by scholars at Harvard and from around the world. In 2018 alone, there were 79 different research projects using the living collections and landscape of the Arnold Arboretum.”
Groover’s work with the Arboretum became a long-term collaboration. In 2014 he won a Sargent fellowship, and, working with Arboretum scientists, collected small samples of genetic material from specific Arboretum trees and propagated them in his own laboratory greenhouses. In 2015 Groover, with Friedman, organized the 35th New Phytologist Symposium held at the Arboretum. He has also given several research talks there, most recently in December on genomic approaches to understanding the development and evolution of forest trees.
“When the Weld Hill Research Building was completed [in 2011], many of us in the research community saw that as a real commitment holding great possibilities for expanding into new areas of research,” he said. “We could not only access a broad range of species all in one location, we had a physical facility for research activities.”
Groover’s work investigates genetic regulation of wood formation — the triggers of gene expression within the wood — which is driven by environment, including light, temperature, wind, water, gravity, even insects and disease. Studying diverse tree species helps him identify the genetic basis of how different species modify their growth and adapt to different environmental conditions.
“Trees, in general, are very responsive to the environment, and trees can actually make adjustments in their wood anatomy to suit the environment,” Groover said. “One thing that is really interesting about trees is that they are perennial and live to decades or even thousands of years in the same place, and they have to be able to cope with all of the variation.”
The collaboration with the Arboretum is special because its trees contain valuable provenance.
“The trees are well-cared for, are not likely to disappear or die so you can go back again, and they are all right there next to each other,” Groover said.
While his in-depth research is on poplars (Populus spp.), the knowledge obtained may be beneficial in the study of many other tree species.
“If the genetic regulation of a trait is conserved among species, then what we learn in poplar can be transferred to the hundreds of other species we would like to be able to better manage or understand,” Groover said. “We can transfer knowledge across different species and potentially use that information in the future for things like reforestation and restoration.”
Suzanne Gerttula of the Forest Service began working in developmental plant genetics more than three decades ago and joined Groover’s laboratory in 2010. The former staff research associate in plant biology at U.C. Davis has an interest in the underlying mechanisms of trees’ responses to gravity, such as occurs in weeping varieties.
“The Arboretum is an incredible resource for both weeping and upright trees. It’s fascinating, fun, and inspiring to me to be able to get at some of the biochemical bases of how life works,” she said.
Groover’s enthusiasm for his subject spans sectors from ecological to economic. From understanding Earth cycles and climate change to helping the lumber, paper, fiber, and even biofuel industries, he hopes his research can inform solutions for forest management and conservation and identify new forms of renewable energy.
“I think it’s important we have places like the Arnold Arboretum to help provide this sort of basic information that has the potential to help in the conservation and management of forests,” he said.
Michael Dosmann, Keeper of the Living Collections at the Arboretum, said the collection has research potential across a wide swath of disciplines — taxonomic, horticultural, plant conservation, ecology, and developmental biology.
“Our living collection’s research potential could never be exhausted; there is a constant need for its use, growth, and development,” he said. “[The] dynamic interplay between living collections and scientific research demonstrates the vital importance collections have to science and to society.”
Scientists such as Groover enjoy access not only to the living collections, but also to other Arboretum resources, including affiliated collections containing herbarium specimens, archives, images, historical records, on-site greenhouse and laboratory space, centralized expertise, and, frequently, financial assistance in the form of grants and fellowships.
“All too often, the cost both in time and dollars of assembling collections at their own institutions is prohibitive for researchers, making places like the Arboretum a vital resource, especially for those working with limited budgets,” Dosmann said.
Evolving technology also plays a critical role, according to Dosmann, giving researchers the ability to access the Arboretum’s expansive resources and making plant species more accessible for study.
“With the aid of databases and other information systems, it is now much easier to see collections in the multiple dimensions within which they exist and appreciate their unlimited research potential,” he said.
Groover said that with forests facing multiple threats, there’s never been a more important time to address forest biology and the use of technology.
“In the west especially, we need new insights into how to make forests more resilient to drought and heat, including understanding the biology underlying stress responses in different tree species,” he said. “We are learning the complexities of forest trees and hope to ultimately be able to select genotypes or species that might perform better in the future. Working with the Arboretum offers the resources for this important research.”
The mission, as it turned out, was to transform the American economy and save the country, no less, over twelve years. Franklin Roosevelt called it his New Deal, starting in 1933. New-breed Democrats in Congress today are talking about a Green New Deal, starting now, deep into the crisis of a changing climate that goes way beyond the weather. FDR had a working class revolt driving him forward, and later he had a Nazi threat and a world war to focus every fiber of mind and muscle on a reinvention. Which may be what the climate is demanding. Here’s one test: at mention of an all-new renewable energy system, is your first thought Costs? Savings? Or Survival? Getting real about the Green New Deal, this week on Open Source.
Three words and one picture sum up the new scene in Washington—and the relief, for starters, from a two-year fixation on President You-Know-Who. The picture is of the so-called Sunrise Movement siege of Nancy Pelosi’s office from last November, and of the rapturous, insurgent Congressperson from the Bronx, Alexandria Ocasio-Cortez, sweeping up the moment and putting its three little words—Green New Deal—at the top of the evolving agenda in D.C. It’s as slippery a promise as universal health care, but here’s our first crack at what it could mean: a resurrection of spirit, perhaps, at the bold Rooseveltian scale, after 75 years? A reset in relations with work, among workers, which Roosevelt’s New Deal was? We’ll see. Does it mean a war for clean, renewable energy, against the embedded power of fossil fuels? Unavoidably. A “system upgrade” for the power grid and the whole economy? About time, you say! But can it be done?
environmentalist and journalist
Professor of the History of Science at Harvard
Professor of Geology and Environmental Science and Engineering at Harvard and director of the Harvard Center for the Environment
BY ROB JORDAN, STANFORD WOODS INSTITUTE FOR THE ENVIRONMENT
Nearly ubiquitous in products ranging from cookies to cosmetics, palm oil represents a bedeviling double-edged sword. Widespread cultivation of oil palm trees has been both an economic boon and an environmental disaster for tropical developing-world countries, contributing to large-scale habitat loss, among other impacts. New Stanford-led research points the way to a middle ground of sustainable development through engagement with an often overlooked segment of the supply chain.
"The oil palm sector is working to achieve zero-deforestation supply chains in response to consumer-driven and regulatory pressures, but they won’t be successful until we find effective ways to include small-scale producers in sustainability strategies,” said Elsa Ordway, lead author of a Jan. 10 Nature Communications paper that examines the role of proliferating informal oil palm mills in African deforestation. Ordway, a postdoctoral fellow at The Harvard University Center for the Environment, did the research while a graduate student in Stanford’s School of Earth, Energy & Environmental Sciences (Stanford Earth).
Using remote sensing tools, Ordway and her colleagues mapped deforestation due to oil palm expansion in Southwest Cameroon, a top producing region in Africa’s third largest palm oil producing country.
Contrary to a widely publicized narrative of deforestation driven by industrial-scale expansion, the researchers found most oil palm expansion and associated deforestation occurred outside large, company-owned concessions, and that expansion and forest clearing by small-scale, non-industrial producers was more likely near low-yielding informal mills, scattered throughout the region. This is strong evidence that oil palm production gains in Cameroon are coming from extensification instead of intensification.
Possible solutions for reversing the extensification trend include improving crop and processing yields by using more high-yielding seed types, replanting old plantations, and upgrading and mechanizing milling technologies, among other approaches. To prevent intensification efforts from inciting further deforestation, they will need to be accompanied by complementary natural resource policies that include sustainability incentives for smallholders.
In Indonesia, where a large percentage of the world’s oil palm-related forest clearing has occurred, a similar focus on independent, smallholder producers could yield major benefits for both poverty alleviation and environmental conservation, according to a Jan. 4 Ambio study led by Rosamond Naylor, the William Wrigley Professor in the School of Earth, Energy & Environmental Sciences and a senior fellow at the Stanford Woods Institute for the Environment and the Freeman Spogli Institute for International Studies (Naylor coauthored the Cameroon study led by Ordway).
Using field surveys and government data, Naylor and her colleagues analyzed the role of small producers in economic development and environmental damage through land clearing. Their research focused on how changes in legal instruments and government policies during the past two decades, including the abandonment of revenue-sharing agreements between district and central governments and conflicting land title authority among local, regional and central authorities, have fueled rapid oil palm growth and forest clearing in Indonesia.
They found that Indonesia’s shift toward decentralized governance since the end of the Suharto dictatorship in 1998 has simultaneously encouraged economic development through the expansion of smallholder oil palm producers (by far the fastest growing subsector of the industry since decentralization began), reduced rural poverty, and driven ecologically destructive practices such as oil palm encroachment into more than 80 percent of the country’s Tesso Nilo National Park.
Among other potential solutions, Naylor and her coauthors suggest Indonesia’s Village Law of 2014, which devolves authority over economic development to the local level, be re-drafted to enforce existing environmental laws explicitly. Widespread use of external facilitators could help local leaders design sustainable development strategies and allocate village funds more efficiently, according to the research. Also, economic incentives for sustainable development, such as a program in India in which residents are paid to leave forests standing, could make a significant impact.
There is reason for hope in recent moves by Indonesia’s government, including support for initiatives that involve large oil palm companies working with smallholders to reduce fires and increase productivity; and the mapping of a national fire prevention plan that relies on financial incentives.
“In all of these efforts, smallholder producers operating within a decentralized form of governance provide both the greatest challenges and the largest opportunities for enhancing rural development while minimizing environmental degradation,” the researchers write.
Coauthors of "Decentralization and the environment: Assessing smallholder oil palm development in Indonesia” include Matthew Higgins, a research assistant at Stanford’s Center on Food Security and the Environment; Ryan Edwards of Dartmouth College, and Walter Falcon, the Helen C. Farnsworth Professor of International Agricultural Policy, Emeritus, at Stanford.
Coauthors of “Oil palm expansion at the expense of forests in Southwest Cameroon associated with proliferation of informal mills” include Raymond Nkongho, a former fellow at Stanford’s Center for Food Security and the Environment; and Eric Lambin, the George and Setsuko Ishiyama Provostial Professor in the School of Earth, Energy & Environmental Sciences and a senior fellow at the Stanford Woods Institute for the Environment.
Image: Elsa Ordway
Mercury in Context
Coal-fired power plants are the largest source of mercury in the U.S., accounting for approximately 48% of mercury emissions in 2015.
The Mercury and Air Toxics Standards (MATS) were finalized in 2011 and currently regulate emissions of mercury, acid gases and other hazardous air pollutants (HAPs) from U.S. electric utilities.
The MATS rule is expected to reduce mercury emissions from the power sector by 90%, improve public health, and play an integral role in meeting U.S. commitments under the international 2017 Minamata Convention on Mercury.
The Latest from EPA
In August 2018, the U.S. Environmental Protection Agency (EPA) announced plans to revisit the Agency’s prior determination that regulating HAPs emitted from power plants under section 112 of the Clean Air Act was “appropriate and necessary”.
A proposal to reopen one or more aspects of MATS is currently under interagency review at the Office of Management and Budget and could result in lifting limits on mercury emissions from electric utilities in the U.S.
Recent research shows that MATS has substantially reduced mercury levels in the environment and improved public health at a much lower cost than anticipated. However, the Regulatory Impact Assessment (RIA) that the Administration is relying on in its rollback proposal does not reflect current scientific understanding of the local impacts and societal cost of mercury pollution in the U.S.
Many of the health effects associated with mercury exposure are not fully reflected in the RIA, and the final estimate of the mercury-related benefits from MATS only accounted for benefits to children of freshwater recreational anglers in the U.S., a small fraction of the total population affected.
Mercury Emissions Matter to Human Health and the Environment
Mercury in the form of methylmercury is a potent neurotoxin. Important facts about the health effects of methylmercury include the following:
Children exposed to methylmercury during a mother’s pregnancy can experience persistent and lifelong IQ and motor function deficits.
In adults, high levels of methylmercury exposure have been associated with adverse cardiovascular effects, including increased risk of fatal heart attacks.
Other adverse health effects of methylmercury exposure that have been identified in the scientific literature include endocrine disruption, diabetes risk, and compromised immune function.
The societal costs of neurocognitive deficits associated with methylmercury exposure in the U.S. were estimated in 2017 to be approximately $4.8 billion per year.
No known threshold exists for methylmercury below which neurodevelopmental impacts do not occur.
Mercury exposure in the U.S. occurs primarily through the consumption of freshwater fish and seafood (fish and shellfish). The consumption of marine fish, often harvested from U.S. coastal waters, accounts for greater than 80% of methylmercury intake by the U.S. population. Dietary supplements cannot counteract methylmercury toxicity in U.S. consumers. A safe and consumable fishery is important to retaining a healthy, low-cost source of protein and other nutrients that are essential for pregnant women, young children, and the general population.
After mercury is emitted from power plants, it is deposited back to Earth, where it can be converted to methylmercury, a highly toxic form of mercury that magnifies up food chains, reaching concentrations in fish that are 10 to 100 million times greater than concentrations in water.
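To make the scale of that biomagnification concrete, the sketch below applies the 10-to-100-million-fold range quoted above to a water concentration. The water concentration used here is a hypothetical example value chosen for illustration, not a measurement from the studies this fact sheet cites.

```python
# Illustrative biomagnification calculation. The bioaccumulation factors
# (1e7 to 1e8) come from the range quoted in the text; the water
# concentration (0.05 ng/L) is a hypothetical example value.

def fish_concentration_mg_per_kg(water_ng_per_l, baf):
    """Scale a water concentration (ng/L) by a bioaccumulation factor.

    Assumes 1 L of water weighs about 1 kg, so ng/L * BAF gives ng/kg;
    dividing by 1e6 converts ng/kg to mg/kg (parts per million).
    """
    return water_ng_per_l * baf / 1e6

low = fish_concentration_mg_per_kg(0.05, 1e7)   # low end of the quoted range
high = fish_concentration_mg_per_kg(0.05, 1e8)  # high end of the quoted range

print(f"{low:.2f} to {high:.2f} mg/kg")  # prints "0.50 to 5.00 mg/kg"
```

Even a trace water concentration translates into fish-tissue levels at or above common consumption guidelines, which is why advisories focus on fish rather than on water itself.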
With increasing levels of mercury in the environment due to human activities, virtually all fish from U.S. waters now have detectable levels of methylmercury. Some fish, such as swordfish, large species of tuna, and freshwater game fish, can have levels that exceed consumption guidelines.
States post fish consumption advisories for waterbodies that are known to have elevated contaminants. In 2013, consumption advisories for mercury were in effect in all 50 states, one U.S. territory, and three tribal territories, and accounted for 81% of all U.S. advisories. This represents more advisories for mercury than for all other contaminants combined.
Wildlife that consume fish, such as common loons, bald eagles, otter and mink, and many marine mammals can also experience adverse effects from mercury and are unable to heed advisories. The health of many songbird and bat species is threatened due to methylmercury exposure in wetland habitats. The productivity of economically valuable game fish stocks can also be compromised.
As Mercury Emissions in the U.S. Have Declined, Health Has Improved
In the 2011 MATS RIA, it was assumed that mercury emissions from coal-fired utilities are mainly transported long distances away from the U.S. and that a substantial fraction of mercury in the U.S. comes from international sources. Since that time, scientific understanding of the fate of U.S. mercury emissions has advanced. Recent research reveals that the contribution of U.S. coal-fired power plants to local mercury contamination in the U.S. has been markedly underestimated. Accordingly, controls on mercury emissions from U.S. electric utilities have contributed to the following human health and environmental improvements.
Mercury emissions from U.S. coal-fired power plants have declined by 85%, from 92,000 pounds in 2006 to 14,000 pounds in 2016, since states began setting standards and MATS was introduced in 2011. Eleven states had implemented mercury emissions standards for power plants prior to 2011.
Concurrent with declines in mercury emissions, mercury levels in air, water, sediments, loons, freshwater fisheries, and Atlantic Ocean fisheries have decreased appreciably.
Mercury levels in the blood of women in the U.S. declined by 34% between 2001 and 2010 as mercury levels in some fish decreased, and fish consumption advisories improved.
The estimated number of children born in the U.S. each year with prenatal exposure to methylmercury levels that exceed the EPA reference dose has decreased by half from 200,000-400,000 to 100,000-200,000, depending on the measure used.
The Benefits of Reducing Mercury Are Much Larger Than Previously Estimated
The EPA estimated in the MATS RIA that the annualized mercury-related health benefits of reducing mercury emissions would be less than $10 million. Recent studies that account for more pathways of methylmercury exposure and additional health effects suggest that the monetized benefits of reducing power plant mercury emissions in the U.S. are likely in the range of several billion dollars per year. These and other studies support the conclusion that the mercury-related benefits from MATS are orders of magnitude larger than previously estimated in the MATS RIA.
In addition to the mercury-related benefits, MATS has also decreased sulfur dioxide and nitrogen oxide emissions, improving air quality and public health by reducing fine particulate matter and ground-level ozone. The EPA estimated that the annualized value of these additional benefits is $24 to $80 billion, bringing the total annual benefits from MATS to tens of billions of dollars. Even with these more complete estimates, substantial benefits of reducing mercury and other air toxics remain unquantified due to data limitations.
On the cost side, new information suggests that the EPA’s original cost estimate for MATS of $9.6 billion is much higher than the actual cost, due to declines in natural gas prices and lower-than-expected control equipment and renewable energy costs. Yet, even with the original overestimate, the EPA projected that MATS would increase the monthly electric bill of the average American household by only $2.71 (or 0.3 cents per kilowatt-hour). This value is well within the price fluctuation consumers experienced between 2000 and 2011.
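As a quick consistency check on the two cost figures quoted above ($2.71 per month and 0.3 cents per kilowatt-hour): together they imply a monthly household usage of roughly 900 kWh, in line with typical U.S. residential consumption. The implied-usage figure is an inference from the quoted numbers, not a value stated in the source.

```python
# Consistency check on the quoted MATS household-cost figures.
# Both inputs come directly from the text; the implied usage is derived.

rate_increase_usd_per_kwh = 0.003   # 0.3 cents per kWh, from the text
monthly_bill_increase_usd = 2.71    # dollars per month, from the text

implied_monthly_kwh = monthly_bill_increase_usd / rate_increase_usd_per_kwh
print(f"implied usage: {implied_monthly_kwh:.0f} kWh/month")  # prints "implied usage: 903 kWh/month"
```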
The Bottom Line
The science is clear: the health impacts of U.S. mercury emissions in the U.S. are large and disproportionately affect children and other vulnerable populations. Mercury emission standards in the U.S. have markedly reduced mercury in the environment and improved public health. The mercury-related benefits of MATS are much larger than previously estimated, the actual costs appear to be substantially lower than projected by the EPA, and the total monetized benefits across all pollutants far outweigh the costs of the standards.
Charles Driscoll, Department of Civil and Environmental Engineering, Syracuse University
Elsie Sunderland, Harvard Paulson School of Engineering & Applied Sciences and Harvard T.H. Chan School of Public Health, Department of Environmental Health, Exposure, Epidemiology and Risk
Kathy Fallon Lambert, Harvard T.H. Chan School of Public Health, Center for Climate, Health, and the Global Environment
Joel Blum, Department of Earth and Environmental Sciences, University of Michigan
Celia Chen, Department of Biological Sciences, Dartmouth College
David Evers, BioDiversity Research Institute
Philippe Grandjean, Harvard T.H. Chan School of Public Health, Department of Environmental Health, Environmental and Occupational Medicine and Epidemiology
Rob Mason, Departments of Chemistry and Marine Sciences, University of Connecticut
Emily Oken, Harvard Medical School
Noelle Selin, Department of Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology
 Streets, D.G.; Horowitz, H.M.; Lu, Z.; Levin, L.; Thackray, C.P.; Sunderland, E.M. Global and regional trends in mercury emissions and concentrations, 2010-2015. Atmospheric Environment. Accepted.
 Sunderland, E.M.; Driscoll, Jr., C.T.; Hammitt, J.K.; Grandjean, P.; Evans, J.S.; Blum, J.D.; Chen, C.Y.; Evers, D.C.; Jaffe, D.A.; Mason, R.P.; Goho, S.; Jacobs, W. 2016. Benefits of Regulating Hazardous Air Pollutants from Coal and Oil-Fired Utilities in the United States. Environmental Science & Technology. 50 (5), 2117-2120. DOI: 10.1021/acs.est.6b00239.
 Giang, A.; Mulvaney, K; Selin, N.E. 2016. Comments on “Supplemental Finding That It Is Appropriate and Necessary to Regulate Hazardous Air Pollutants from Coal- and Oil-Fired Electric Utility Steam Generating Units”.
 Grandjean, P. and Bellanger, M. 2017. Calculation of the disease burden associated with environmental chemical exposures: application of toxicological information in health economic estimation. Environmental Health. 16:123. DOI: 10.1186/s12940-017-0340-3.
 Genchi G., Sinicropi M.S., Carocci A., Lauria G., Catalano A. 2017. Mercury Exposure and Heart Diseases. Int J Environ Res Public Health. 2017;14(1):74. Published Jan 12. DOI:10.3390/ijerph14010074.
 Tan, S.W.; Meiller, J.C.; Mahaffey, K.R. 2009. The endocrine effects of mercury in humans and wildlife. Crit. Rev. Toxicol. 39 (3), 228−269.
 He, K.; Xun, P.; Liu, K.; Morris, S.; Reis, J.; Guallar, E. 2013. Mercury exposure in young adulthood and incidence of diabetes later in life: the CARDIA trace element study. Diabetes Care. 36, 1584−1589.
 Nyland, J. F.; Fillion, M.; Barbosa, R., Jr.; Shirley, D. L.; Chine, C.; Lemire, M.; Mergler, D.; Silbergeld, E.K. 2011. Biomarkers of methylmercury exposure and immunotoxicity among fish consumers in the Amazonian Brazil. Env. Health Persp. 119 (12), 1733− 1738.
 Grandjean and Bellanger 2017.
 Rice, G.E.; Hammitt, J.K; and Evans, J.S. 2010. A probabilistic characterization of the health benefits of reducing methyl mercury intake in the United States. Environ Sci Technol. 1;44(13):516-24. DOI:10.1021/es903359u.
 Grandjean and Bellanger 2017.
 Sunderland, E. M.; Li, M.; Bullard, K. 2018. Decadal Changes in the Edible Supply of Seafood and Methylmercury Exposure in the United States. Environ. Health Persp. DOI: 10.1289/EHP2644.
 Driscoll, C.T.; Han, Y-J; Chen, C.; Evers, D.; Lambert, K.F.; Holsen, T.; Kamman, N.; and Munson, R. 2007. Mercury Contamination on Remote Forest and Aquatic Ecosystems in the Northeastern U.S.: Sources, Transformations, and Management Options. BioScience. 57(1):17-28.
 U.S. Environmental Protection Agency. 2011 National Listing of Fish Advisories. 2013. EPA-820-F-13-058.
 Chan, N.M.; Scheuhammer, A.M.; Ferran, A.; Loupelle, C.; Holloway, J.; and Weech, S. 2003. Impacts of Mercury on Freshwater Fish-eating Wildlife and Humans. Human and Ecological Risk Assessment. 9(4): 867-883.
 Zhang, Y.; Jacob, D.; Horowitz, H.; Chen, L.; Amos, H.; Krabbenhoft, D.; Slemr, F.; St. Louis, V.; Sunderland, E. 2016. Observed decrease in atmospheric mercury explained by global decline in anthropogenic emissions. PNAS. 113 (3) 526-531. DOI: 10.1073/pnas.1516312113.
 Lepak, R.F.; Yin, R.; Krabbenhoft, D.; Ogorek, J.; DeWild, J.; Holsen, T.; and Hurley, J. 2015. Use of Stable Isotope Signatures to Determine Mercury Sources in the Great Lakes. Environmental Science & Technology Letters. 2 (12), 335-34. DOI: 10.1021/acs.estlett.5b00277.
 U.S. Environmental Protection Agency. 2018. https://www.epa.gov/trinationalanalysis/electric-utilities-mercury-relea....
 Cross, F.A.; Evans, D.W.; Barber, R.T. 2015. Decadal declines of mercury in adult bluefish (1972−2011) from the mid-Atlantic coast of the U.S.A. Environ. Sci. Technol. 49, 9064−9072.
 U.S. Environmental Protection Agency. 2013. Trends in Blood Mercury Concentrations and Fish Consumption Among U.S. Women of Childbearing Age NHANES 1999-2010. EPA-823-R-13-002. https://www.regulations.gov/document?D=EPA-HQ-OAR-2009-0234-20544.
 U.S. Environmental Protection Agency. 2013. EPA-823-R-13-002.
 Rice et al. 2010.
 Giang, A.; Selin, N. E. Benefits of mercury controls for the United States. Proc. Natl. Acad. Sci. U. S. A. 2016, 113, 286.
 Sunderland et al. 2016.
 Giang et al. 2016.
 Sunderland et al. 2016.
 Declaration of James E. Staudt, Ph.D. CFA, September 24, 2015, White Stallion Energy Center, et al., v. United States Environmental Protection Agency, Case No. 12-1100 and Summary plus cases, Exhibit 1 Declaration of James E. Staudt, Ph.D., CFA, U.S. Court of Appeals for the District of Columbia.
 U.S. Environmental Protection Agency. Final Consideration of Cost in the Appropriate and Necessary Finding for the Mercury and Air Toxics Standards for Power Plants. https://www.epa.gov/sites/production/files/2016-05/documents/20160414_ma....
Photo by Pixabay user 12019
By Chris Sweeney
Dust storms in Kuwait. Tourism in Tunisia. Air pollution in Uganda. Three different countries facing three different challenges. A common thread? Harvard T.H. Chan School researchers are working in each setting to understand how environmental factors are impacting the health of the people who live and work in these regions.
At a panel discussion on “Environmental Health Capacity Building In Africa And The Middle East” held on October 25, 2018 as part of Harvard Worldwide Week, attendees were given a look at these ongoing research projects.
“Developing countries in Africa and the Middle East are bearing a disproportionate health burden from climate change and environmental contamination,” said Douglas Dockery, John L. Loeb and Frances Lehman Loeb Research Professor of Environmental Epidemiology. “For this panel we brought together three investigators across Harvard who are partnering with institutions in this region to build local capacity to address these challenges.”
The event, hosted by the Department of Environmental Health and the Harvard Chan-NIEHS Center for Environmental Health, kicked off with a presentation from Petros Koutrakis, professor of environmental sciences and an expert on air pollution. Koutrakis shared an overview of his work in the Middle East, which dates back to the 1990s when he and colleagues assessed the environmental health impacts of the hundreds of oil wells that were set ablaze during the Gulf War.
More recently, Koutrakis has turned his attention to dust storms in Kuwait, a fairly common meteorological event that may have a significant impact on human health and social dynamics. Using satellite data, historical weather records, and air quality sensors, Koutrakis and colleagues are gleaning new insights on how desert vegetation and wind patterns affect the severity and frequency of dust storms.
“Dust is not something we can control, and so people have to adjust,” Koutrakis said, noting that these adjustments can impact human activity and health. For instance, on days when dust storms are severe, people may be forced to stay indoors, reducing their ability to exercise. There is also the potential that exposure to dust storms over long periods of time may be associated with chronic respiratory problems.
Koutrakis was followed by Misbath Daouda, a master’s candidate in environmental health who’s studying how the growing tourism industry in Tunisia may impact the local environment.
Daouda’s research so far has shown that in some tourism hot spots, electricity demand surges by more than 50% during the busy season and that the sector is responsible for more than one-third of water consumption in Djerba, an island oasis off the eastern coast of Tunisia. As Daouda explained, her hope is to build a framework to measure tourism growth and its impact on the environment and human health in order to assist policymakers who will have to wrestle with important choices on how to mitigate the sector’s impact in the North African country over the coming years.
Rounding out the event was a presentation from Crystal North, a pulmonologist at Massachusetts General Hospital who has been collaborating with Harvard Chan School researchers to study air pollution in Uganda.
The work involves tracking air quality and following a cohort of HIV-positive and HIV-negative patients in the East African nation. Among the challenges: air quality data from developing countries are relatively sparse, and there are very few sensors in Uganda to measure ambient air quality.
Previously North’s research has looked at inflammation and lung function in HIV-positive patients, who tend to be at increased risk of tuberculosis. She hopes to build on that with this new research by focusing on whether air pollution and HIV are synergistic in their effects on lung function. “Hopefully in the next year or two we’ll have some initial results to share,” North said.
By Leah Burrows, SEAS Communications
The ocean has a long memory. When the water in today’s deep Pacific Ocean last saw sunlight, Charlemagne was the Holy Roman Emperor, the Song Dynasty ruled China and Oxford University had just held its very first class. During that time, between the 9th and 12th centuries, the earth’s climate was generally warmer before the cold of the Little Ice Age settled in around the 16th century. Now, ocean surface temperatures are back on the rise but the question is, do the deepest parts of the ocean know that?
Researchers from the Woods Hole Oceanographic Institution and Harvard University have found that the deep Pacific Ocean lags a few centuries behind in terms of temperature and is still adjusting to the advent of the Little Ice Age. Whereas most of the ocean is responding to modern warming, the deep Pacific may be cooling.
The research is published in Science.
"Climate varies across all timescales,” said Peter Huybers, Professor of Earth and Planetary Sciences in the Department of Earth and Planetary Sciences and of Environmental Science and Engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences and co-author of the paper. “Some regional warming and cooling patterns, like the Little Ice Age and the Medieval Warm Period, are well known. Our goal was to develop a model of how the interior properties of the ocean respond to changes in surface climate.”
What that model showed was surprising.
“If the surface ocean was generally cooling for the better part of the last millennium, those parts of the ocean most isolated from modern warming may still be cooling,” said Jake Gebbie, a physical oceanographer at Woods Hole Oceanographic Institution and lead author of the study.
The model is a simplification of the actual ocean. To test the prediction, Gebbie and Huybers compared the cooling trend found in the model to ocean temperature measurements taken by scientists aboard the HMS Challenger in the 1870s and modern observations from the World Ocean Circulation Experiment of the 1990s.
The HMS Challenger, a three-masted wooden sailing ship originally designed as a British warship, was used for the first modern scientific expedition to explore the world’s ocean and seafloor. During the expedition from 1872 to 1876, thermometers were lowered into the ocean depths and more than 5,000 temperature measurements were logged.
“We screened this historical data for outliers and considered a variety of corrections associated with pressure effects on the thermometer and stretching of the hemp rope used for lowering thermometers,” said Huybers.
The researchers then compared the HMS Challenger data to the modern observations and found warming in most parts of the global ocean, as would be expected due to the warming planet over the 20th century, but cooling in the deep Pacific at depths of around two kilometers.
“The close correspondence between the predictions and observed trends gave us confidence that this is a real phenomenon,” said Gebbie.
These findings imply that variations in surface climate that predate the onset of modern warming still influence how much the climate is heating up today. Previous estimates of how much heat the Earth had absorbed during the last century assumed an ocean that started out in equilibrium at the beginning of the Industrial Revolution. But Gebbie and Huybers estimate that the deep Pacific cooling trend leads to a downward revision of heat absorbed over the 20th century by about 30 percent.
"Part of the heat needed to bring the ocean into equilibrium with an atmosphere having more greenhouse gases was apparently already present in the deep Pacific,” said Huybers. "These findings increase the impetus for understanding the causes of the Medieval Warm Period and Little Ice Age as a way for better understanding modern warming trends."
This research was funded by the James E. and Barbara V. Moltz Fellowship and National Science Foundation grants OCE-1357121 and OCE-1558939.