Tuesday, March 20, 2018

Pruitt Expected to Limit Science Used to Make EPA Pollution Rules

Plan would hamstring agency’s mission, environmental advocates warn.

Scott Pruitt (Credit: Pete Marovich, Getty Images)
In a closed-door meeting at the Heritage Foundation on Monday, Pruitt told a group of conservatives that he has plans for additional science reform at the agency, according to multiple attendees.  EPA hasn’t formally shared details of the plan, but it’s widely expected to resemble an effort that Republican lawmakers and conservative groups have been pushing for years.  It’s been met with staunch resistance from Democrats and many scientists.

The plan could come “sooner rather than later,” said Steve Milloy, who served on Trump’s EPA transition team and attended the meeting at the Heritage Foundation.

EPA did not respond to a request for comment.  And Milloy cautioned that he did not know the specifics of the plan and said he was not authorized to discuss the meeting.

The initiative is expected to require EPA—when issuing rules—to rely only on scientific studies where the underlying data are made public.  It’s an idea that House Science, Space and Technology Chairman Lamar Smith (R-Texas) has been championing for years.  He and others argue that EPA has been crafting regulations based on “secret science” to advance its regulatory agenda.

Smith, one of the leading opponents of mainstream climate science in Congress, has repeatedly accused federal climate scientists of engaging in a massive conspiracy to falsify climate data.  And he has repeatedly introduced bills that would require EPA to publicize data it uses when crafting regulations.

Those efforts died when President Obama was in the White House, and Smith’s newest legislative push doesn’t appear to be moving even though Republicans control both chambers of Congress.  The House passed a bill dubbed the “Honest and Open New EPA Science Treatment (HONEST) Act”—requiring that EPA rules be based on science for which underlying data is publicly available and reproducible—last March.  But the measure has gone nowhere since it was referred to the Senate Environment and Public Works Committee.

Smith has tried to push the idea elsewhere, too.  In comments on the 2019 budget proposal, the GOP majority on the Science panel led by Smith suggested that EPA’s funding should be contingent on the administrator’s “requiring that all scientific and technical information and data relied on to support a risk, exposure, or hazard assessment; criteria document; standard; limitation; regulation; regulatory impact analysis; or guidance issued by the EPA is made publicly available.”

Smith did not respond to a request for comment.

Critics on the left and in the scientific community see the effort as an attempt to hinder EPA from issuing rules.

“A lot of the data that EPA uses to protect public health and ensure that we have clean air and clean water relies on data that cannot be publicly released,” said Yogin Kothari with the Union of Concerned Scientists.

Many scientific studies rely on data that can’t be made public for reasons like patient privacy concerns or industry confidentiality.

“If EPA doesn’t have data to move forward with a public protection for a safeguard, it doesn’t have to do that at all,” said Kothari.  “It really hamstrings the ability of the EPA to do anything, to fulfill its mission.”

Publishing raw data also opens scientists up to attacks from industry, which can twist or distort data to shape a deregulatory agenda, said Betsy Southerland, a former senior EPA official in the Office of Water who worked on a staff analysis of the “HONEST Act.”

Southerland, who left EPA last summer, said the effort is deceptive and is not about transparency, but about sidelining peer-reviewed science that supports regulation of pollution.  She said there are numerous examples of groundbreaking studies that are not replicable, such as human health studies after the dropping of the atomic bomb on Hiroshima or the ecological effects of the BP PLC Gulf of Mexico oil spill.  In many of the older studies, large numbers of participants, some of whom have since died, can no longer be tracked down.

“This is just done to paralyze rulemaking,” she said.  “It’s another obstacle that would make it so hard and so difficult to go forward with rulemaking that in the end, the only thing that would happen—in the best case you would greatly delay rulemaking; in the worst case you would just prevent it.  It would be such an obstacle you couldn’t overcome it.”

Publicizing the data behind some EPA actions, which often come after years of research, could be an extensive undertaking.  For example, risk assessments for certain chemicals sometimes cite hundreds or even thousands of studies, all of which would have to be tracked down for data collection, according to the EPA analysis of the “HONEST Act.”

Requiring data transparency would cost hundreds of millions of dollars because it would require EPA staff to track down data from study authors and create an online management system to store and present those data, the analysis found.  In addition, EPA staff would have to spend time redacting personally identifiable information in the studies, and study authors would likely require payments for preparing and sending their data.

EPA career staff estimated that Smith’s legislation would add $250 million in costs annually for the first few years after it was implemented, Southerland said.  That estimate was dismissed by senior EPA officials who said those costs were inflated and that the agency would not use many studies to which the rule would apply, but they did not provide evidence, she said.  EPA’s analysis of Smith’s bill was published by the radio program “Marketplace.”

Read more at Pruitt Expected to Limit Science Used to Make EPA Pollution Rules

Lithium Seen as Lifeline for Oil Majors in Clean Energy Future

Salar de Olaroz Lithium Mine, Argentina.  (Image Credit: Planet Labs, Inc. | CC BY-SA 4.0 | Wikimedia Commons)
Lithium could be a lifeline for oil majors as the energy industry shifts toward lower-polluting alternatives to fossil fuels, said Jeff McDermott of Greentech Capital Advisors LLC.

“Their specialty is resource extraction,” McDermott, managing partner of the New York-based boutique investment bank advising energy companies and investors, said in an interview in London.  “They should buy lithium miners, get involved in the upstream of core battery technology.”

This suggestion marks out one solution to the existential question facing some of the world’s biggest energy companies: how to survive as governments clamp down on the fuels they produce.  As the curbs on carbon emissions tighten, a key issue for fossil fuel producers is how much oil and gas demand is at risk.

Lithium is a key ingredient in rechargeable batteries that are prevalent in electronics from mobile phones to electric cars.  The metal is part of the cathode, which houses the electric charge.  Demand for the mineral is projected to rise 38-fold by 2030 to 7,845 metric tons per year from 200 metric tons in 2016, according to Bloomberg New Energy Finance.
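For context, the projection cited above implies a compound annual growth rate of about 30% between 2016 and 2030; note that 7,845 / 200 is closer to 39-fold than 38-fold, so the article’s multiplier presumably reflects rounding in the underlying series.  A quick, purely illustrative sketch of the arithmetic:

```python
# Implied growth of the BNEF lithium-demand projection quoted above:
# 200 metric tons/year in 2016 to 7,845 metric tons/year in 2030.
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate over `years` periods."""
    return (end / start) ** (1 / years) - 1

growth = 7845 / 200                      # overall multiplier, ~39x
rate = cagr(200, 7845, 2030 - 2016)      # ~30% per year
print(f"{growth:.1f}-fold, ~{rate:.1%} per year")
```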

Big oil companies have the capital to deploy and expertise in developing large projects that could help the lithium industry expand.

Oil majors have been dabbling in clean energy for decades, but it doesn’t make up a significant share of any of their businesses.  This is beginning to change as the industry seeks new revenue streams and a way to stay at the center of the energy business.

Total SA bought the battery maker Saft Groupe SA for 950 million euros in 2016.  Royal Dutch Shell Plc recently made tracks into electricity, buying First Utility Ltd. in the U.K. in December.  BP Plc has taken a 43 percent stake in British solar developer Lightsource Renewable Energy Ltd. for $200 million.

McDermott also sees opportunities for oil majors in offshore wind and integrated systems for autonomous vehicles.  Shell and Statoil ASA of Norway have made recent moves into the wind industry, capitalizing on their experience in drilling for oil and gas in the sea.  Shell is a part of the consortium building the Borssele III & IV wind farms in Dutch waters.

Read original at Lithium Seen as Lifeline for Oil Majors in Clean Energy Future

Monday, March 19, 2018

Global surface temperature relative to 1880-1920, based on the GISTEMP analysis (mostly NOAA data sources, as described by Hansen, J., R. Ruedy, M. Sato, and K. Lo, 2010: Global surface temperature change. Rev. Geophys., 48, RG4004).  We suggest in an upcoming paper that the temperature in 1940-45 is exaggerated because of data inhomogeneity during WW II.  A linear fit to temperature since 1970 yields a present temperature of 1.06°C, which is perhaps our best estimate of warming since the preindustrial period.

Human Influence on Climate Change Will Fuel More Extreme Heat Waves in US

The four dominant heat wave clusters (Western, Northern Plains, Southern Plains, and Great Lakes), overlaid on a population count of US counties (gray shading, in units of population per square arc-second, or about 600 m²). The 21st-century years given below the name of each cluster represent the decade in which human-caused climate change may become a dominant factor in the occurrence of heat waves compared with natural variability. (Credit: Hosmay Lopez)
Human-caused climate change will drive more extreme summer heat waves in the western U.S., including in California and the Southwest as early as 2020, new research shows.

The new analysis of heat wave patterns across the U.S., led by scientists at the University of Miami Rosenstiel School of Marine and Atmospheric Science (UM)-based Cooperative Institute for Marine and Atmospheric Studies (CIMAS) and colleagues, also found that human-made climate change will be a dominant driver of heat wave occurrences in the Great Lakes region by 2030, and in the Northern and Southern Plains by 2050 and 2070, respectively.

Human-made climate change is the result of carbon dioxide and other emissions that human activity adds to the atmosphere.

"These are the years that the human contributions to climate change will become as important as natural variability in causing heat waves," said lead author Hosmay Lopez, a CIMAS meteorologist based at NOAA's Atlantic Oceanographic Meteorological Laboratory.  "Without human influence, half of the extreme heat waves projected to occur during this century wouldn't happen."

Read more at Human Influence on Climate Change Will Fuel More Extreme Heat Waves in US

Western Europe Financial Centers Win Big in Inaugural Global Green Finance Index

The first ever Global Green Finance Index was launched by Z/Yen and Finance Watch last week, and the financial centers of Western Europe outperformed those in other regions based on the perception of the quality and depth of their green finance offerings.

The Global Green Finance Index (GGFI) was created in an effort to “chart the progress of the world’s financial centers towards a financial system that delivers sustainable development, and values people and the planet as much as profit.”  Created by NGO Finance Watch and commercial think-tank Z/Yen, the GGFI ranks the world’s leading financial centers based on a worldwide survey of finance professionals’ views “on the quality and depth of green finance offerings across 108 international financial centres.”

“The core of the GGFI is a perception survey which observes and promotes change where it matters most — in people’s minds,” explained Professor Michael Mainelli, Executive Chairman of Z/Yen.  “The more we can get people talking about a sustainable transition, the quicker it will happen.  The high level of interest in GGFI 1 is a step in that direction.”

“The GGFI aims to contribute to the definition of green finance and identify best practices and areas for improvement,” added BenoĆ®t Lallemand, Secretary General of Finance Watch.  “We hope it will promote bold policy initiatives and high-quality financing that can cut through greenwash.  It is urgent that sustainable finance becomes mainstream in all financial centers.”

According to this inaugural edition of the GGFI (GGFI 1), the top five centers for Green Finance – Penetration are London, Luxembourg, Copenhagen, Amsterdam, and Paris, and the top five for Green Finance – Quality are London, Amsterdam, Brussels, Hamburg, and Paris.  (Penetration measures green finance relative to a center’s overall financial activity; quality measures the quality of its green finance offering relative to its market volume.)

Western Europe led the way with nine of the top ten centers in the quality index and seven of the top ten in the penetration index.  San Francisco and Washington tied for tenth place, making them the top North American centers in the quality index, while San Francisco is also the leading North American center in the penetration index.  Shanghai and Shenzhen led the Asia-Pacific region for quality and penetration respectively, while Johannesburg ranked first in the Middle East and Africa region for quality, followed closely by Cape Town, which also topped the penetration index for the region.  Mexico City and Moscow were the only financial centers to receive enough survey responses to qualify in the Latin America and Caribbean and the Eastern Europe and Central Asia regions, respectively.

Respondents also predicted that Paris, New Delhi, and Los Angeles would see their green finance offerings improve significantly over the next two to three years, and that Paris, Frankfurt, and New York would become more significant for the green financing sector.  The same respondents expected Moscow, Boston, and Chicago to see their green finance offerings decline over that period.

Looking specifically at green financing areas of interest, respondents to the survey were most focused on green bonds and renewable energy investments, and emerging areas such as sustainable infrastructure financing and energy efficiency also scored well.  There is less interest, however, in climate risk stress testing, divestment from fossil fuels, carbon markets and carbon disclosure, natural capital valuation, and green insurance.

Read more at Western Europe Financial Centers Win Big in Inaugural Global Green Finance Index

India Most Vulnerable Country to Climate Change - HSBC Report

Vehicles drive through smog in New Delhi, India, on Dec 5, 2017. (Photo Credit: Reuters)
India is the most vulnerable country to climate change, followed by Pakistan, the Philippines, and Bangladesh, a ranking by HSBC showed on Monday.

The bank assessed 67 developed, emerging and frontier markets on vulnerability to the physical impacts of climate change, sensitivity to extreme weather events, exposure to energy transition risks and ability to respond to climate change.

The 67 nations represent almost a third of the world’s nation states, 80 percent of the global population and 94 percent of global gross domestic product.

HSBC averaged the scores in each area for the countries in order to reach the overall ranking.  Some countries were highly vulnerable in some areas but less so in others.
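The averaging step described above can be sketched in a few lines; the country scores below are hypothetical placeholders, not HSBC’s actual data:

```python
# Sketch of the ranking method described above: average each country's
# scores across the four assessment areas to get an overall ranking.
# Scores are hypothetical (0-10, higher = more vulnerable).
areas = ["physical impacts", "extreme weather", "transition risk", "response capacity"]
scores = {
    "India":    [9, 8, 8, 8],
    "Pakistan": [8, 9, 6, 9],
    "Finland":  [2, 1, 3, 1],
}
overall = {c: sum(s) / len(areas) for c, s in scores.items()}
ranking = sorted(overall, key=overall.get, reverse=True)
print(ranking)  # most vulnerable first
```

A country can score high in one area (say, extreme weather) and low in another, which is why the overall rank can mask mixed profiles, as the article notes.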

Of the four nations assessed by HSBC to be most vulnerable, India has said climate change could cut agricultural incomes, particularly in unirrigated areas, which would be hit hardest by rising temperatures and declines in rainfall.

Pakistan, Bangladesh, and the Philippines are susceptible to extreme weather events, such as storms and flooding.

Pakistan was ranked by HSBC among nations least well-equipped to respond to climate risks.

South and southeast Asian countries accounted for half of the 10 most vulnerable countries.  Oman, Sri Lanka, Colombia, Mexico, Kenya, and South Africa are also in this group.

The five countries least vulnerable to climate change risk are Finland, Sweden, Norway, Estonia, and New Zealand.

Read more at India Most Vulnerable Country to Climate Change - HSBC Report

Evolution of the Tesla Supercharger Network

The Tesla Supercharger network is still one of the top reasons electric car buyers are convinced to buy a Tesla rather than another company’s electric car.  The network was a critical competitive advantage we identified years ago when surveying EV drivers and potential EV drivers, and it seems to be referenced every day in comments on CleanTechnica as a core competitive advantage for the Silicon Valley EV & clean energy giant.

We are finally seeing superfast/ultrafast charging stations rise up in non-Tesla charging networks, and hey, one day we’ll have a non-Tesla electric car on the market that can charge at 100 kW or more.  But rolling out a vast network of superfast/ultrafast charging stations takes time and a lot of money.
This is where the Supercharger network is today:

Read more at Evolution of the Tesla Supercharger Network

Canada's Pipeline Challenges Will Force More Tar Sands Oil to Move by Rail

Gogama oil train derailment (Credit: Transportation Safety Board Canada)
The Motley Fool has been advising investors on How to Profit From the Re-Emergence of Canada’s Crude-by-Rail Strategy.  But what makes transporting Canadian crude oil by rail attractive to investors?

According to the Motley Fool, the reason is “… right now, there is so much excess oil being pumped out of Canada’s oil sands that the pipelines simply don’t have the capacity to handle it all.”

The International Energy Agency recently reached the same conclusion in its Oil 2018 market report.

“Crude by rail exports are likely to enjoy a renaissance, growing from their current 150,000 bpd [barrels per day] to an implied 250,000 bpd on average in 2018 and to 390,000 bpd in 2019.  At their peak in 2019, rail exports of crude oil could be as high as 590,000 bpd — though this calculation assumes producers do not resort to crude storage in peak months,” the International Energy Agency said.

And Canada has plenty of capacity to load oil onto more trains, which means that if a producer is willing to pay the premium to move oil by rail, it can find a carrier to do it.  The infrastructure is in place to load approximately 1.2 million barrels per day.

With the cancellation of the Energy East pipeline project, which would have moved western Canada's tar sands east to Quebec and New Brunswick, the industry now is pursuing two remaining major pipeline projects:  Kinder Morgan’s Trans Mountain and the Keystone XL.  The Financial Post reported that the International Energy Agency predicts the earliest start date for either of those projects as 2021.  Both pipelines are facing fierce opposition.

Additionally, the surge in U.S. crude oil exports has been affecting the value of Canadian oil, and some are predicting the flood of U.S. crude abroad could deliver a serious blow to the tar sands industry in the long-term.

Over the next several years, however, the Canadian tar sands industry appears poised to rely heavily on rail to export its product while awaiting construction of new pipelines, much like the situation that led to the Bakken oil-by-rail boom in the U.S.

Read more at Canada's Pipeline Challenges Will Force More Tar Sands Oil to Move by Rail

IEA Report Draws Lessons from Rapid Uptake of EVs in Nordic Countries

The market share of electric cars in Nordic countries tends to be higher when incentives are larger and when the price gap between electric cars and equivalent ICE models is smaller, with the exception of Denmark. (Source: IEA)
In proportion to its population, the Nordic region—Denmark, Finland, Iceland, Norway, and Sweden—is strikingly ahead of the rest of the world in adopting electric cars.  With almost 250,000 electric cars at the end of 2017, the five countries account for roughly 8% of the total number of electric cars around the world.  Norway, Iceland, and Sweden have the highest ratios of EVs per person, globally.

Further, the number of electric vehicles (EVs) in the Nordic region is projected to reach 4 million cars by 2030—more than 15 times the number currently in circulation, according to the International Energy Agency’s Nordic EV Outlook 2018 (NEVO 2018).  The report outlines the key factors contributing to successful developments and identifies key lessons to be learned, providing insights for countries currently developing their electric mobility strategies.

The Nordic countries represent the third-largest electric-car market by sales, after China and the United States.  Norway leads the way with a 39% market share of electric car sales—the highest globally.  Sweden has more than 49,000 electric cars in circulation and accounts for 20% of the total Nordic stock.
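The stock figures quoted here are mutually consistent, which a quick check against the numbers in the surrounding paragraphs confirms:

```python
# Cross-checking the NEVO 2018 figures quoted above.
nordic_stock_2017 = 250_000    # electric cars in the region, end of 2017
projected_2030 = 4_000_000     # projected Nordic EV stock by 2030
sweden_stock = 49_000          # Swedish electric cars in circulation

print(projected_2030 / nordic_stock_2017)  # 16.0 -> "more than 15 times"
print(sweden_stock / nordic_stock_2017)    # ~0.196 -> roughly 20% of Nordic stock
```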

This remarkable growth has been driven by strong policy support and ambitious decarbonization goals, putting the region at the forefront of the transition to electric mobility.  In this context, IEA intends for Nordic EV Outlook 2018 to provide a useful benchmark and to highlight a series of best practices—and hurdles to avoid—for countries around the world.

Policy support has significantly influenced electric-car adoption across these countries, the main driver being measures that reduce the purchase price of electric vehicles.  Other measures that have proven successful in Nordic countries include cuts to circulation taxes and local incentives such as waivers or partial exemptions on road-use charges, free parking, and access to bus lanes.

In addition, policy support for the deployment of charging infrastructure has bolstered EV growth.  Though around 80% of EV charging takes place at home, wide availability of publicly accessible chargers can encourage consumers to consider the purchase of an electric car while also enabling longer-distance trips.

Read more at IEA Report Draws Lessons from Rapid Uptake of EVs in Nordic Countries

Sunday, March 18, 2018

Beyond Three Thirds, the Road to Deep Decarbonization - By Michael Liebreich, Senior Contributor, Bloomberg New Energy Finance

Wind tower by smokestack (Credit: bnef.com) Click to Enlarge.
In my BNEF Summit keynote in London last September, I talked about how far clean energy and transport had come over the last fifteen years.  Where renewable energy used to be dismissed as “alternative”, I talked about the “new orthodoxy” of what I called the Three-Third World:  by 2040 one third of global electricity will be generated from wind and solar; one third of vehicles on the road will be electric; and the world’s economy will produce one third more GDP from every unit of energy.

The fact that we are on track for the Three-Third World is quite extraordinary.   It certainly outstrips my expectations when I founded New Energy Finance in 2004.  And it is probably unstoppable:  wind, solar and battery costs will continue to fall faster than any mainstream energy forecasters expect, and there is nothing that makes me think President Donald Trump will succeed in his attempts to revive coal.

That’s the good news.  The bad news is that even though we are on track to achieve the Three-Third World by 2040, it will not be enough.

Building efficiency
First of all, we all need to start treating the energy efficiency of our buildings like it really matters.  Mainly that means insulation, air-tightness, and good thoughtful architecture and design.  It doesn’t need to add much cost to the building; in many cases nothing at all.  Ten years ago, I had never heard of the PassivHaus building standard; in ten years’ time, all new buildings could easily meet it.  In fact, there is no reason why new houses shouldn’t produce more energy than they consume, receiving utility revenues instead of incurring utility costs.  It’s just a question of applying technologies and techniques we know work.

Retrofits are harder.  The important thing is that any time a building undergoes a deep renovation, its energy performance has to be brought up to the highest standard.  It is possible – I’ve done it.  As long as you are doing deep renovation works anyway, the extra costs are not prohibitive.  Even a twenty-year payback would be equivalent to 5% risk-free after-tax – a highly attractive rate of return to most home-owners in a world of persistently low interest rates.  Mainstream mortgage providers need to stop colluding in a system that treats the cost of a new kitchen as an investment, but the cost of a low-energy retrofit as an expense.
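The 5% figure follows from the definition of simple payback: a project that repays its cost in N years yields 1/N of the capital per year, and energy savings are untaxed, hence "after-tax".  A one-line illustrative check:

```python
# A retrofit costing C that pays for itself in N years saves C/N per
# year, i.e. a simple annual yield of 1/N on the capital invested.
def simple_yield(payback_years: float) -> float:
    return 1 / payback_years

print(f"{simple_yield(20):.0%}")  # 20-year payback -> 5% per year
```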

Once all new-builds and deep renovations are done properly, we will halve our heating challenge over twenty years, allowing a lot more of the heating load to be met electrically, mainly with air-source and ground-source heat pumps.  If you think they can’t work in cold temperatures, just look at Norway, or Japan.

Then there are other new technologies.  Some of the most intriguing start-ups I come across are working on thermal batteries, using phase-change materials, salts, clever thermodynamics, or just big chunks of concrete or tanks of hot water.  Drake’s Landing Solar Community meets over 95% of its winter heating needs from solar energy collected during the summer.  It lies just 45 minutes’ drive from the 1988 Calgary Winter Olympic venues.  How cool is that – or rather how warm?

There are, however, significant benefits to continuing to power a large proportion of the world’s heating with solid, gas, or even liquid fuels.  These are easier to store in bulk than electricity, which will always need to balance to within a few days of real time, so they cope better with seasonality and resilience, and they will also be needed by industry.  The question is how to make them zero-carbon.

A significant proportion of the heating load in temperate climates –  in countries like the U.K., Northern Europe, New England, Canada, the former Soviet Union, and Northern Asia – could be met by biogas or biomass, most efficiently using combined heat-and-power, or CHP, cogeneration.  Though it is hard to add district heating in existing neighborhoods, it can be done – look at Sweden.  Some 10% more Swedish households have been connected to district heating every decade since the 1960s, to the point where over half of all homes are now connected.  And here’s a thought – since you are going to have to add more capacity to local grids to charge all those EVs, how about combining new bio-based CHP delivering local heating, with massive battery storage, to provide grid services and improve resilience for energy-intensive industries, all while reducing investment requirements in the distribution grid?
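The Swedish district-heating arithmetic adds up: roughly ten percentage points of households per decade since the 1960s accumulates to about half of all homes today.  A quick illustrative sketch:

```python
# Rough check of the Swedish district-heating figure quoted above:
# ~10 percentage points of households connected per decade since the 1960s.
start_decade, now = 1960, 2018
decades = (now - start_decade) // 10   # 5 full decades elapsed
share = 0.10 * decades                 # cumulative share of homes connected
print(f"~{share:.0%} of households")   # consistent with "over half"
```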

If there’s not enough biogas, you might consider running your CHP on natural (i.e. fossil) gas, which would still be up to 85% efficient, but not zero-carbon.  To achieve that, you would need to use CCS (carbon capture and storage), but let’s be clear, that is not happening in the absence of a carbon price.  Micro-CHP is attractive until you consider the capital cost, and even with a carbon price it’s hard to see how to capture the emissions from distributed sources.

And that brings us to hydrogen, which can be used anywhere without creating local emissions.

My skepticism about hydrogen vehicles is well known.  What real problem do they solve?  If you have electricity and you want to drive somewhere, just use a battery electric vehicle (BEV) – they will be fully competitive with internal combustion vehicles on a total-cost-of-ownership basis, with no subsidy, within five to six years in most markets, according to BNEF forecasts.  Why would you waste half of your electricity electrolyzing hydrogen, compressing and storing it, only to turn it back into electricity in a car?

If you are concerned about how long it takes to refuel, well that is a problem for the few percent of us who actually drive long distances; everyone else will charge their EVs overnight.  Most people won’t want to visit a hydrogen station every few days just to avoid a 20-minute charge on the rare occasion when they drive long-distance.  Even commercial vehicles, unless they regularly drive long distances – say, over 300 miles – will go electric.  Ships, trans-continental trains, long-distance trucking, and niches like fork-lift trucks are the only parts of the transport system where hydrogen makes any sense.

In fact, even if you have already produced your hydrogen for some other reason – such as seasonal storage – and you want to drive somewhere, it will make more sense to generate power centrally and charge an EV, rather than to put it in a hydrogen-fueled vehicle.  Doing so will be much lower-capex per megawatt, much more efficient, and you can extract value from the waste heat.  And that’s before getting into the lack of hydrogen filling stations compared to the ubiquity of the grid, the complexity of fuel cell vehicles versus the simplicity of EVs, maintenance costs, safety, and so on.
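Liebreich’s “waste half of your electricity” point can be made concrete by chaining stage efficiencies.  A minimal sketch (the stage values below are round illustrative assumptions, not figures from the essay):

```python
# Each pathway's overall efficiency is the product of its stage
# efficiencies. Values are illustrative round-number assumptions.
from math import prod

bev_stages = {"charging": 0.90, "battery round-trip": 0.90, "drivetrain": 0.90}
h2_stages = {"electrolysis": 0.70, "compress & store": 0.90,
             "fuel cell": 0.55, "drivetrain": 0.90}

bev = prod(bev_stages.values())  # about 0.73
h2 = prod(h2_stages.values())    # about 0.31
print(f"BEV ~{bev:.0%}, H2 ~{h2:.0%} of input electricity reaches the wheels")
```

Under these assumptions the hydrogen pathway delivers well under half the useful energy of the battery pathway, which is the core of the argument above.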

Nevertheless, I am bullish about hydrogen.  It is one of the most promising ways of dealing with longer-term storage, beyond the minutes, hours or days that could be met by batteries, or the limited locations in which pumped storage could work.  It can be stored as hydrogen, perhaps blended into the existing natural gas system, or after conversion into ammonia, natural gas (so-called power-to-gas, or P2G), methanol, or some higher-value synthetic liquid fuel.  It can help provide the huge pulses of reliable power needed by some energy-intensive industries like ceramics.  We need to stop fooling ourselves about hydrogen as a transport fuel, and explore its pervasive use throughout our energy, chemical, and industrial system.

Read more at Liebreich:  Beyond Three Thirds, The Road to Deep Decarbonization

Saturday, March 17, 2018

From Residential to Utility-Scale, Solar Wins in Recent State-Level Actions

Community-scale solar (Image credit: CC0 Creative Commons | Pixabay) Click to Enlarge.
22 Solar Projects for New York
Gov. Andrew Cuomo on March 9 announced that New York has authorized competitive awards under the state’s Clean Energy Standard mandate for 22 utility-scale solar projects.  The awards are part of $1.4 billion awarded for a total of 26 renewable energy projects in the state.

Solar Energy Industries Association (SEIA) President and CEO Abigail Ross Hopper in a statement commended Cuomo for what she said is a “historic commitment to solar energy.”

“These 22 solar projects will create thousands of jobs, generate billions of dollars in investment and bring clean and affordable energy to the residents of New York state,” she said.  “It is highly rewarding to see that the Empire State has made this groundbreaking investment in solar energy.”

Energy Bill Signed in Virginia
Gov. Ralph Northam on March 9 signed an omnibus energy bill for Virginia that designates 5.5 GW of solar and wind energy as “in the public interest.”  The bill also initiates a process to modernize the state’s power grid to help spur renewable energy development.

SEIA Vice President of State Affairs Sean Gallagher said in a statement that the public interest finding is a “great first step” for solar in Virginia.

“[W]e must ensure the grid modernization process that this bill initiates is data-driven, solicits the public’s input, and is not a blank check for a utility to spend consumers’ money with little accountability,” Gallagher said.

Even before taking the new law into consideration, Virginia is expected to have an installed solar capacity of about 2 GW by 2022, according to SEIA.

New Jersey Considers Clean Energy Bills
New bills filed on March 14 by New Jersey legislators have been lauded by many clean energy organizations for their potential to grow the state’s renewables development and extend benefits of clean energy to more residents.

The text of the bills was not immediately available in the state’s online legislative documents center.

According to the SEIA, the two companion bills introduced in the New Jersey House and Senate would increase the state’s Renewable Portfolio Standard target for solar and begin the process of developing next-generation solar incentives in the state.

This legislation would also help establish a community solar program in the state, giving consideration to residential customers, especially in multifamily buildings, and low-to-moderate income customers, SEIA said.

In a statement, Brandon Smithwood, policy director for the Coalition for Community Solar Access, said the bills were important for solar in New Jersey in light of the recent tariffs placed on solar cells and panels.

Read more at From Residential to Utility-Scale, Solar Wins in Recent State-Level Actions

Siting a Wind Farm in the Most Challenging Place in the US

Developer: “It’s a bit of a bellwether for what the future looks like.”

Image, right: visual simulation of the AWE as it will be seen from Gregg Lake in Antrim, NH. (Credit: AWE) Click to Enlarge.
According to Jack Kenworthy, CEO of Eolian Renewable Energy, a project developer based in New Hampshire, the best wind projects are those that have died twice, because then you know what’s wrong with them.  The project he is currently working on is known as Antrim Wind Energy (AWE), a 28.8-MW wind farm on the Tuttle Hill ridgeline in Antrim, N.H.

On a windy day in late February, Kenworthy, Henry Weitzner with Walden Green Energy, a subsidiary of German utility RWE, and landscape architect David Raphael with Landworks, took several members of the New Hampshire Site Evaluation Committee (SEC) on a site inspection tour to show them how AWE will impact the community in which it resides.

New England Wind Projects Challenging
In all of the U.S., New England is among the most difficult places to site wind projects.  Walden Green Energy’s Henry Weitzner said this one has been one of the worst.  “Walden has looked at about 15 different projects,” he said, adding, “We have looked at Texas, Minnesota, North Dakota, Utah and California, and I would say that there definitely are some issues in California but this is overwhelmingly the most difficult.”

So why even try?  Going back to 2009, Kenworthy explained, he had originally viewed the process of building a wind farm in New Hampshire as the most reasonable of all the New England states.  At that time, three wind projects had gone through the SEC process.  “The process itself was long and expensive and kind of painful for all those projects but at the end of the day they were able to be built,” he said.

Unfortunately, that wasn’t the case with his project, which was not modified or conditioned but outright denied “at the 11th hour on a subjective issue” he said.  The reason for the denial was adverse aesthetic impacts.

Rather than give up, Kenworthy altered the project, dropping one turbine altogether and modifying the height of another to lessen its visual impact.  Further, he swapped out the Iberdrola turbines for higher-rated Siemens turbines so he could deliver the same amount of power to the grid with fewer turbines.

Since a few years had passed, he was also armed with more direction regarding what benchmarks the project needed to meet.  “Noise is very clear to us — it is a 40 dBA standard.  Shadow flicker is very clear — it is an 8-hour-per-year standard.  We can meet that,” Kenworthy said.

Finding Good Sites
Kenworthy said part of his tenacity in building the AWE project is that it is the best sited wind project in the state.  Not only because of the excellent wind resource, but also because the project can be built close to existing transmission lines and close to a main highway, so there is no need to build new transmission nor is there any roadway impact.

“Look, good wind sites, nowadays in New England are extremely rare.  This is one of them.  In fact, it's not just a good wind site, it’s a great wind site,” said Kenworthy.

Read more at Siting a Wind Farm in the Most Challenging Place in the US

Climates Change Faster in a Warmer and Wetter World

While more rain normally cools a summer environment, a warmer and wetter world could face quite unfamiliar problems.

Heat and moisture together can speed up climate change. (Image Credit: Mary Hollinger, NOAA, via Wikimedia Commons) Click to Enlarge.
Climate change may still hold surprises if it simultaneously means a warmer and wetter world.  More heat and moisture together can unbalance ecosystems.

Scientists have been warning for decades of shifts towards ever greater risks of flooding in some places, more intense and sustained droughts and potentially lethal heatwaves in others.

But new research suggests an unexpected twist: temperate and subtropical zones could become both hotter and wetter during future summers.

And this could create a whole suite of unexpected problems: farmers and city dwellers who have adapted to a pattern of cool wet summers or hot dry summers could face a new range of fungal or pest infections in crops, or pathogens in crowded communities, as insects and microbes seize a new set of opportunities.

Canadian scientists report in Nature Communications that they considered what they call “departures from natural variability” that may follow as a consequence of continual rises in global average temperature, driven by ever greater combustion of fossil fuels emitting ever larger quantities of greenhouse gases into the atmosphere.

They studied historical records back to 1901, and climate projections as far as the year 2100.  And they see a problem:  creatures – people, crops, pathogens and pests – that have adapted to particular regional ecosystems could be jolted out of their comfort zone.

“Some of the disruptions of climate change stem from basic physics and are easily anticipated.  Increases in sea level, forest fires, heat waves, and droughts fall into that category.

“But there is a whole other category of unexpected disruptions that stem from upsetting the complex balance of ecosystems,” said Colin Mahony, a forester and doctoral student at the University of British Columbia, who led the research.

A global increase in outbreaks of fungal needle blight in pine plantations could be linked to wetter and warmer conditions.  Mosquito-borne pathogens could flourish in hot cities with once rare puddles of standing water.

Read more at Climates Change Faster in a Warmer and Wetter World

Youth and Colombia Forests - by James Hansen

Satellite images show a rainforest being deforested (Credit: Getty Images) Click to Enlarge.
Tropical deforestation does more than fuel global climate change, threatening all people.  It also affects life prospects of local youth.  So I am happy to see young people in Colombia stand up for their rights.  Yesterday my legal adviser Dan Galpern filed my Amicus Brief in Colombia to support 25 plaintiffs, youth between ages 7 and 26, who are filing a tutela (guardianship) action, a mechanism that the Colombian Constitution provides to protect fundamental rights of individuals to a dignified life, health, food and water.  The plaintiffs can be seen here.

Deforestation threatens fresh water supplies, as half of the rain that falls in the Colombian Amazon is recycled rain.  The impact of deforestation on ecosystems and freshwater, together with climate change, endangers public health by helping spread vector-borne diseases such as dengue, chikungunya, and Zika.

Colombia, in the precatory 2015 Paris climate accord, committed to zero-net deforestation in the Colombian Amazon, the most biodiverse region in the world, by 2020.  Instead, the nation allowed deforestation to skyrocket by 44 percent in 2016.

The legal action of the 25 youth has been filed before the Superior Tribunal of Bogota, with the support of Dejusticia.  Dejusticia is a Colombia-based research and advocacy organization dedicated to the strengthening of the rule of law and the promotion of social justice and human rights in Colombia and the Global South.

The youth are asking the government to formulate an action plan within six months to reach zero-net deforestation in the Colombian Amazon.  Further, they are asking the government for an Intergenerational Agreement in which the authorities will commit to take effective and quantifiable measures to reduce greenhouse gas emissions.

Read more at Youth and Colombia Forests

Friday, March 16, 2018

Friday 16

Global surface temperature relative to 1880-1920, based on the GISTEMP analysis (mostly NOAA data sources, as described by Hansen, J., R. Ruedy, M. Sato, and K. Lo, 2010: Global surface temperature change. Rev. Geophys., 48, RG4004).  We suggest in an upcoming paper that the temperature in 1940-45 is exaggerated because of data inhomogeneity during WW II.  A linear fit to temperature since 1970 yields a present temperature of 1.06°C, which is perhaps our best estimate of warming since the preindustrial period.

Meteorologists Have a New Strategy for Bringing Climate Change Down to Earth

Elisa Raffa (Credit: KOLR10 News) Click to Enlarge.
For years, TV meteorologists were hesitant to talk about climate change.  Climatological views — the long-term trends and patterns that influence weather — were not part of their education.  Their time on air is limited.  Some stations may discourage climate change talk.  Many meteorologists simply feel it isn’t their responsibility.  And some are concerned about how it might affect their ratings and job security.

“Audiences trust their local meteorologists,” says Mike Nelson, chief meteorologist at Denver7, an ABC affiliate in Colorado.  “Our jobs depend on that trust.  Meteorologists understand this, and some tend to stay away from controversial subjects.”

But that won’t do anymore, says Nelson.  “We are as close to a scientist as most Americans will ever get.  People invite us into their living rooms.  We have a responsibility to educate them on the facts.”

In 2010 several meteorologists joined Climate Central, George Mason and Yale universities, NASA, the National Oceanic and Atmospheric Administration, and the American Meteorological Society in a pilot project to explore how broadcast meteorologists could better communicate climate change.  Two years later, Climate Central launched Climate Matters as a full-time, national program to help meteorologists talk about climate change in and with their communities.

“We need more people connecting the dots about how climate change is already affecting people and will continue to do so in the future,” says Bernadette Woods Placky, Climate Central chief meteorologist and director of Climate Matters.  By linking local impacts to larger changes, Climate Matters aims to empower people to prepare for impacts like heatwaves, flooding, elevated food prices, and health situations.  “We are a resource to help meteorologists tell their local story,” says Woods Placky.

Today, Climate Matters supplies webinars to help meteorologists understand topics such as climate models, health impacts, and extreme precipitation events.  It provides data for individual markets, such as how viewers think about climate change.  It also offers weekly communication packages containing location-specific climate analyses and visuals as well as workshops offering a deeper dive into the science, impacts, and solutions to climate change.

Read more at Meteorologists Have a New Strategy for Bringing Climate Change Down to Earth

Dust Storms + Snowpack Raise Late-Summer Water Concerns

Darkening of snow from more dust storms in warmer Colorado Rockies 'just pushes on gas pedal for snowmelt'.

Dust on the snowpack enhancing snowmelt rates in Senator Beck Basin, San Juan Mountains, CO, May 2013. (Photo credit: Dr. Jeffrey Deems) Click to Enlarge.
“It looks apocalyptic,” says Jeff Deems, a research scientist at the University of Colorado.  With “a big orange-red sky, it really does look Martian.”

He’s describing dust storms – layers of windblown particles that are landing on mountain peaks and leaving them coated with a dark layer of sand and soot.  As anyone who has sat in a car with black upholstery on a hot summer day will attest, dark objects absorb more heat than lighter ones, so by darkening the snow, the dust makes it melt faster.

Deems explains that “If you put dust on the snowpack, which enhances the absorption of that solar radiation, then that just pushes on the gas pedal for snowmelt.”  In a recent study looking at the Rocky Mountains of Colorado, Deems and lead author Tom Painter of NASA found that the amount of dust on mountain snowpack will control how fast rivers rise in the spring regardless of air temperature.  And the more dust there is, the faster the runoff.

The particles are carried to the Rockies by winds coming from the Southwest when deserts are drying out.  The dust events are frequently on the leading edge of a storm and can be pretty dramatic – closing interstates and making it hard to see and breathe.  The dust gets deposited in the mountains and is quickly buried by fresh snow, but Deems explains that sunlight can penetrate about a foot of snow; under more than 12 inches (30 cm) of new snow, the dust simply lies buried, lurking there and waiting for the melt season to arrive.

The dust storms mostly happen in the spring, so the particulates tend to be near the surface of the snowpack, deposited in a series of layers.  Once the runoff season starts, the snow melts down until it hits a layer of dust.  The water drains away, but the dust piles up, making the surface darker and darker as each layer is revealed.  This darkening supercharges the melting, and the pattern continues until the snowpack is gone.
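The albedo effect described above can be put in rough numbers.  This back-of-the-envelope sketch uses assumed albedo and solar-flux values purely for illustration; it counts only absorbed solar energy and ignores the other terms in a real snowpack energy balance:

```python
# Rough comparison of daily melt for clean vs. dust-darkened snow.
# All numbers are illustrative assumptions, not measured values.
LATENT_HEAT_FUSION = 334_000.0  # J/kg, energy to melt ice

def daily_melt_mm(albedo, solar_flux_wm2=250.0):
    """Snow-water equivalent melted per day (mm) from absorbed solar energy alone."""
    absorbed = (1.0 - albedo) * solar_flux_wm2   # W/m^2 absorbed by the surface
    energy_per_day = absorbed * 86_400.0         # J/m^2 over 24 hours
    melted_kg_per_m2 = energy_per_day / LATENT_HEAT_FUSION
    return melted_kg_per_m2                      # 1 kg/m^2 of water = 1 mm depth

clean = daily_melt_mm(albedo=0.85)  # fresh snow reflects most sunlight
dusty = daily_melt_mm(albedo=0.55)  # dust-darkened snow absorbs far more

print(f"clean snow: {clean:.1f} mm/day, dusty snow: {dusty:.1f} mm/day")
```

Under these assumed values, dropping the albedo from 0.85 to 0.55 triples the absorbed energy, which is the "gas pedal" effect Deems describes.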

More dust, less rain – problem worsening
The problem is getting worse: there’s more dust than there used to be, because of less rain in a warming climate and more land development exposing bare soil.  Those soil surfaces are naturally armored by crusts, lichen and moss combinations that form a black armored surface.  Those crusts are virtually impervious to wind erosion.  They’re very strong, except when they’re crushed.  Beginning in the mid- to late 1800s, soil started getting disturbed by people grazing animals.

Since then, Deems says, we’ve got a wide array of disturbance agents – recreational activities, oil and gas exploration and development, suburban development, dryland farming, and more.  All of these activities disturb the soil crust and make fine-grained substrates available for wind transport.

And there are various climate change components.  As we continue in this warming and drying trajectory in the Southwest, plants that might have anchored the soil are less likely to get the water they need to germinate and grow.

But the study doesn’t suggest that air temperature can be ignored.  It contributes somewhat, but Deems says their research finds that dust is the dominant force shaping the pace of spring runoff.  Also, temperature does control whether precipitation falls as snow or rain, so ultimately it regulates how much snow there is to melt.

All of this has important implications for water managers.  The snowpack is Colorado’s biggest reservoir, holding way more water than surface storage.  If it runs off faster, it can be a challenge to store it all.  And, in order to have water available for later in the summer, it’s vital that the snowpack stick around for as long as possible.

Water managers watch spring runoff to decide when – and how much – water to allocate to users, to store or to release from dams.  Deems says that if we shorten the snowmelt period by increasing that rate, they’ll have a much narrower window of time over which to make those decisions.

So, what can be done about the dust?  Studies have shown the soil crusts can regenerate if left alone, but land use development along with drier conditions because of climate change could make that recovery tougher.

Read more at Dust Storms + Snowpack Raise Late-Summer Water Concerns

Half a Degree More Global Warming Could Flood Out 5 Million More People

The 2015 Paris climate agreement sought to stabilize global temperatures by limiting warming to well below 2.0 degrees Celsius above pre-industrial levels and to pursue limiting warming even further, to 1.5 C.

To quantify what that would mean for people living in coastal areas, a group of researchers employed a global network of tide gauges and a local sea level projection framework to explore differences in the frequency of storm surges and other extreme sea-level events across three scenarios: global temperature increases of 1.5, 2.0 and 2.5 C.

They concluded that by 2150, the seemingly small difference between an increase of 1.5 and 2.0 C would mean the permanent inundation of lands currently home to about 5 million people, including 60,000 who live on small island nations.

The study, conducted by researchers at Princeton University and colleagues at Rutgers and Tufts Universities, the independent scientific organization Climate Central, and ICF International, was published in the journal Environmental Research Letters on March 15, 2018.

"People think the Paris Agreement is going to save us from harm from climate change, but we show that even under the best-case climate policy being considered today, many places will still have to deal with rising seas and more frequent coastal floods," said DJ Rasmussen, a graduate student in Princeton's Program in Science, Technology and Environmental Policy in the Woodrow Wilson School of Public and International Affairs, and first author of the study.

The researchers found that higher temperatures will make extreme sea level events much more common.  They used long-term hourly tide gauge records and extreme value theory to estimate present and future return periods of extreme sea-level events through the 22nd century.  Under the 1.5 C scenario, the frequency of extreme sea level events is still expected to increase.  For example, by the end of the 21st century, New York City is expected to experience one Hurricane Sandy-like flood event every five years.
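The return-period concept behind these estimates can be illustrated with synthetic annual-maximum water levels.  The real study fits formal extreme value distributions to century-long tide-gauge records; this sketch uses a simple empirical counting approach and made-up numbers, but shows why a modest rise in mean sea level sharply shortens the return period of a fixed flood height:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic annual-maximum sea levels (m) at a hypothetical gauge;
# the Gumbel distribution is a conventional model for block maxima.
annual_maxima = rng.gumbel(loc=1.0, scale=0.25, size=100)

def empirical_return_period(maxima, threshold):
    """Average years between exceedances of `threshold` (Weibull plotting position)."""
    exceedances = np.sum(maxima > threshold)
    if exceedances == 0:
        return np.inf
    return (len(maxima) + 1) / exceedances

# A level exceeded roughly once in the record today...
rare_level = np.quantile(annual_maxima, 0.99)

# ...recurs far more often once mean sea level rises: a 0.5 m rise
# shifts the whole distribution upward, so the same absolute height
# is topped much more frequently.
future_maxima = annual_maxima + 0.5
rp_today = empirical_return_period(annual_maxima, rare_level)
rp_future = empirical_return_period(future_maxima, rare_level)
print(f"Return period today: {rp_today:.0f} yr; after 0.5 m rise: {rp_future:.1f} yr")
```

The same mechanism is why a Sandy-like flood height, rare today, could recur every few years in New York by century's end.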

Extreme sea levels can arise from high tides or storm surge or a combination of surge and tide (sometimes called the storm tide).  When driven by hurricanes or other large storms, extreme sea levels flood coastal areas, threatening life and property.  Rising mean sea levels are already magnifying the frequency and severity of extreme sea levels, and experts predict that by the end of the century, coastal flooding may be among the costliest impacts of climate change in some regions.

Future extreme events will be exacerbated by the rising global sea level, which in turn depends on the trajectory of global mean surface temperature.  Even if global temperatures are stabilized, sea levels are expected to continue rising for centuries, because carbon dioxide stays in the atmosphere for a long time and the ice sheets are slow to respond to warming.

Read more at Half a Degree More Global Warming Could Flood Out 5 Million More People