Monday, December 18, 2017

California's Hellish Fires:  a Visit from the Ghost of Christmas Future

California’s textbook example of weather whiplash
In Charles Dickens’s ‘A Christmas Carol,’ the Ghost of Christmas Future appears to Ebenezer Scrooge to show what will happen if he doesn’t change his greedy, selfish life. California’s record wildfires are similarly giving us a glimpse of our future hellish climate if we continue with our current behavior.
This year, California experienced its worst and most expensive wildfire season on record.  This surprised many, because although the state recently endured its worst drought in over 1,200 years, that 5-year drought ended in 2016.  California was then hit by the opposite extreme in 2017, with its wettest rainy season on record.

Though it seems counter-intuitive, the wet season contributed to the state’s wildfires.  The resulting vegetation growth created fuel for the 2017 fire season, particularly after being dried out by high temperatures.  2017 brought the hottest summer on record in California, breaking the previous record, set just last year, by a full degree Fahrenheit.  As Stephen Pyne put it, “Whether it’s exceptionally wet or exceptionally dry, you’ve got the material for a fire in California.”

California’s wildfire season normally ends in October – big wildfires are relatively rare in November and December.  But fires are raging in Southern California two weeks shy of Christmas, impossible to contain due to intense Santa Ana winds, creating hellish scenes.

This was predicted by climate scientists
A 2006 study published in Geophysical Research Letters found that global warming would push the Southern California fire season associated with Santa Ana winds into the winter months.  As a 2015 study published in Environmental Research Letters found, Santa Ana fires are especially costly because of the speed at which they spread due to the winds and their proximity to urban areas.  That study concluded that the area burned by Southern California wildfires will increase by about 70% by mid-century due to the drier, hotter, windier conditions caused by global warming.

A 2010 study published in Forest Ecology and Management found that global warming may extend the fire season year-round in California and the southwestern USA.  These December fires will become more commonplace in a hotter world.  We’re literally getting a glimpse at Christmas future, and though there are other factors at play, human-caused global warming is largely to blame.

A 2015 special report in the Bulletin of the American Meteorological Society found that “An increase in fire risk in California is attributable to human-induced climate change.”  A 2016 study in the Proceedings of the National Academy of Sciences found that human-caused global warming doubled the area burned by wildfires in the western USA over just the past 30 years.

Add to that a new study just published in Nature Communications finding a connection between Arctic sea ice and high-pressure ridges off California’s coast that can block storms from passing over the state.  These results suggest that as Arctic sea ice continues to disappear due to global warming, California may see less rainfall and thus even worse droughts, which along with higher temperatures would lead to worse wildfire seasons.

Scrooge changed - will Trump and Republicans?

Read more at California's Hellish Fires:  a Visit from the Ghost of Christmas Future

Toon of the Week: Record Wildfires / Climate Legislation Retardant

Read original at 2017 SkS Weekly Climate Change & Global Warming Digest #50

Photo of the Week: Emmanuel Macron attends the Tech for Planet event in Paris, France Monday.

Read original at 2017 SkS Weekly Climate Change & Global Warming Digest #50

Poster of the Week:  Science by Santa / Remember, bad children get coal. / Good kids get solar panels, which are much more fuel efficient!

Read original at 2017 SkS Weekly Climate Change & Global Warming Digest #50

Sunday, December 17, 2017

  Sunday, Dec 17

Global surface temperature relative to 1880-1920 based on GISTEMP analysis (mostly NOAA data sources, as described by Hansen, J., R. Ruedy, M. Sato, and K. Lo, 2010: Global surface temperature change. Rev. Geophys., 48, RG4004).  We suggest in an upcoming paper that the temperature in 1940-45 is exaggerated because of data inhomogeneity in WW II.  A linear fit to temperature since 1970 yields a present temperature of 1.06°C, which is perhaps our best estimate of warming since the preindustrial period.

Meet the Microgrid, the Technology Poised to Transform Electricity - By David Roberts and Alvin Chang

This is the path to a cleaner, more reliable, more resilient energy grid.

If we want a livable climate for future generations, we need to slow, stop, and reverse the rise in global temperatures.  To do that, we need to stop burning fossil fuels for energy.

To do that, we need to generate lots of carbon-free electricity and get as many of our energy uses as possible (including transportation and industry) hooked up to the electricity grid.  Electrify everything!

We need a greener grid.  But that’s not all.

The highly digital modern world also demands a more reliable grid, capable of providing high-quality power to facilities like hospitals or data centers, where even brief brownouts can cost money or lives.

The biggest, cheapest sources of carbon-free power — wind and solar — are variable, which means that they come and go on nature’s schedule, not ours.  They ramp up and down with the weather, so integrating them into the grid while maintaining (and improving) reliability means finding clever ways to balance out their swings.

Finally, recent blackouts in the wake of Hurricanes Irma and Maria highlight the need for a more resilient grid — one that can get back up and running quickly (at least for essential sites) after a disaster or attack.

It’s a triple challenge:  We need, all at once, a greener, more reliable, more resilient electricity grid.

But hark!  Lo!  There is a technology, or a set of technologies, that promises one day to be a triple solution — to address all three of the grid’s needs at once.

We speak of the humble microgrid.

What is a microgrid?
Technically, a grid is any combination of power sources, power users, wires to connect them, and some sort of control system to operate it all.

Microgrid just means a small, freestanding grid.  It can consist of several buildings, one small building (sometimes called a “nanogrid”), or even one person (a “picogrid”) with a backpack solar panel, an iPhone, and some headphones.

The research firm GTM counts “1,900 basic and advanced, operational and planned microgrids” in the US, with the market expected to grow quickly.  Most microgrids today are basic, one-generator affairs, but more complex microgrids are popping up all over — there’s a cool one in Brooklyn, a cool one on Alcatraz Island, and the coolest one of all in Sonoma, California.  Microgrids also play a big role in plans to rebuild Puerto Rico’s grid.

Let’s take a quick tour of microgrids and their potential.

Off-grid microgrids to extend power to the poor
Some microgrids stand on their own, apart from any larger grid, often in remote rural areas.  These off-grid microgrids are a relatively cheap and quick way to secure some access to power for people who now lack it, often more quickly than large, centralized grids can be extended.

Grid-connected microgrids can “island” from the larger grid
Most microgrids, especially in wealthier nations, are grid-connected — they are embedded inside a bigger grid, like any other utility customer.  All the examples cited above fit this bill.

What makes a microgrid a microgrid is that it can flip a switch (or switches) and “island” itself from its parent grid in the event of a blackout.  This enables it to provide those connected to it with (at least temporary) backup power.

Again, most actually existing microgrids are extremely basic — think of a hospital with a diesel generator in the basement, or a big industrial facility with a combined-heat-and-power (CHP) facility on site that can provide some heat and power during a blackout.

The next step: integrating more diversity, including distributed renewable energy
As basic as most of them are today, microgrids hold great promise for the future.  Technology is rapidly expanding the possibilities.

Smart design and software can create microgrids specifically designed to integrate distributed renewable energy, or microgrids designed to provide “six nines” (99.9999 percent) reliability, or microgrids designed for maximum resilience.  There are even “nested” microgrids within microgrids.

Need to work with the larger grid
Smarter microgrids can communicate on an ongoing basis with their parent grids, forming a beautiful friendship.

By aggregating together distributed, small-scale resources (solar panels, batteries, fuel cells, smart appliances, and HVAC systems, etc.), a microgrid can present to the larger grid as a single entity — a kind of Voltron composed of distributed energy technologies.

This makes things easier on grid operators.  They don’t necessarily relish the idea of communicating directly with millions (or billions) of discrete generators, buildings, and devices.  It’s an overwhelming amount of data to assimilate.  Microgrids can gather those smaller resources together into discrete, more manageable and predictable chunks.

Grid operators can put these chunks to good use.  A smart microgrid can provide “grid services” — storing energy when it’s cheap, providing energy when it’s expensive, serving as backup capacity, or smoothing out frequency and voltage fluctuations.
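That store-when-cheap, sell-when-expensive behavior can be sketched in a few lines of Python.  This is a minimal, hypothetical illustration: the prices, battery sizes, and the $40/MWh threshold are all invented for the example, not taken from any real microgrid controller.

```python
# Toy sketch of price-responsive microgrid dispatch (illustrative only; the
# prices, battery sizes, and threshold below are invented for this example).

def dispatch(price_per_mwh, charge_mwh, capacity_mwh, threshold=40.0):
    """Decide whether an aggregated microgrid battery should charge,
    discharge, or hold, given the current wholesale price."""
    if price_per_mwh < threshold and charge_mwh < capacity_mwh:
        return "charge"      # energy is cheap: store it
    if price_per_mwh > threshold and charge_mwh > 0:
        return "discharge"   # energy is expensive: sell it back
    return "hold"

# A hypothetical day of hourly prices ($/MWh)
prices = [30, 28, 35, 55, 80, 60, 38, 25]
charge, capacity = 2.0, 10.0
for p in prices:
    action = dispatch(p, charge, capacity)
    if action == "charge":
        charge = min(capacity, charge + 1.0)   # 1 MWh per hour, simplified
    elif action == "discharge":
        charge = max(0.0, charge - 1.0)
```

A real controller would also weigh frequency- and voltage-support commitments, forecasted solar output, and reserve obligations, but the core arbitrage decision is this simple comparison.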

Read more at Meet the Microgrid, the Technology Poised to Transform Electricity

Three Surprises on Climate Change from Economist Michael Grubb - By Lynn Parramore

Two years after the 2015 Paris Agreement, where we stand today is better than you may think 

Michael Grubb, Professor of Energy and Climate Change at University College London and a grantee of the Institute for New Economic Thinking, co-authored a recent study showing that what many saw as an overambitious goal to keep the earth’s temperature from rising more than 1.5 degrees Celsius may actually be reachable.  Climate change deniers quickly pounced, using the hopeful news as an excuse to blame researchers for updating their models and to downplay the climate crisis.  Two years after 195 countries signed the Paris Agreement on climate, Grubb explained what the researchers really found and shared with INET surprising developments on global warming, the future of nuclear energy, and why the rest of the climate community isn’t too worried about President Trump.

Lynn Parramore:  Let’s talk about the recent study you co-authored that created a media stir. You found that things might be a little better than we thought in terms of the Earth’s temperature rising.  Can you explain your conclusions and how they have been spun in the press?

Michael Grubb:  Sure.  It turns out that we had a longer period than expected where temperatures didn’t rise as fast as the trend of the previous few decades – though they have jumped in the past couple of years.  So we updated estimates that were almost a decade old.  I do want to emphasize that the difference between what we found and what was widely understood from previous research is small; it shouldn’t have been a massive deal.

Our study in no way means that we don’t have a climate crisis.  But we might be slightly better positioned to meet certain goals, like those set forth in the 2015 Paris Agreement, than we thought.

LP:  And yet Breitbart and other media outlets shouted that climate scientists “admit they were wrong about global warming.”  How do you respond to that?  How can scientists combat the misinformation?

MG:  Partly it’s a problem of scientists not communicating effectively what they do.  They run big, complicated models, and measure the past.  Scientists looked at CO2 emissions since the Industrial Revolution and made projections based on their findings:  For every billion tons of carbon we dump into the atmosphere, the temperature goes up by a certain amount.

Based on those assessments, the people who had been running the big modeling projections said, ok, if we want to prevent the global temperature from rising more than 1.5 degrees Celsius, then we can only have so much in emissions—and it looks like we’ve only got a few years left at current emission rates before we pass the limit.

Governments made a deal in Paris based on ‘avoiding dangerous interference’ with the climate system, which included this target of 1.5 degrees Celsius at the ambitious end.  A lot of people, including me, were pessimistic about achieving that goal.

The studies had actually presented estimates on temperatures rising within a range, but unfortunately, some in the scientific community succumbed to the demand for a single number.  So they chose a number in the middle of the range that the models showed.  Where we are today is actually well within the range of the models.  We’re just not right in the middle.  We have additional information about what’s happened since then and we have slightly different estimates of the way gases other than CO2 contribute to rising temperatures.

LP:  So it’s not that scientists got anything wrong.  Rather, it’s a matter of previous findings becoming oversimplified in the public discussion and of more information coming to light since then.

MG:  Right.  Unfortunately, a lot of misleading things have come out in the press, especially Breitbart, which got it all wrong.  But this is the basic challenge for science.  If you really look at what’s happened in relation to this paper, you see that science is about continually trying to improve your estimates.  The political approach being adopted, in contrast, is to say that any attempt to improve anything in your estimation is treated as, “Oh, well, it was all wrong before then!”

How is knowledge supposed to advance if you never improve on what you did before?  There’s also a huge challenge about how to effectively communicate uncertainty and complexity.

LP:  You actually once stated that the Paris goal of 1.5 degrees Celsius was “incompatible with democracy.”

MG:  I did!  That was actually my first tweet ever.

LP:  That was a doozy.  Has anything else happened since that tweet to change your mind?

MG:  Yes it has!  I was responding to the notion held by many that we could reach the goal technologically if we spent enough quickly enough.  Well, of course!  But the problem is a political one.  It’s a social science problem.  Instead of social scientists in this space, we have modelers churning out models with targeted carbon prices and so on, when in reality we can’t get even a small part of it through a political system.

So that tweet was my cri de coeur to say, look, this goal is impossibly ambitious in real countries where people vote and may well object to what we’d have to do.  So you’d better start thinking about the social scientific aspects.

Three things have changed since the tweet.  First, we now have an approach that indicates we may have about 20 years of current emissions before we blow the 1.5 degree Celsius goal – meaning, for example, that if we reduce emissions in a straight line from today for 40 years we might do it.  Second, to everyone’s surprise, Chinese emissions have stopped growing over the past two years, and global emissions stopped growing, too.  I don’t know if that’s enduring, but it looks like China has shifted, and that’s a fantastic development.  The third thing is that the cost of renewables has collapsed faster than expected.

Read more at Three Surprises on Climate Change from Economist Michael Grubb

Saturday, December 16, 2017

  Saturday, Dec 16


When Property Rights Clash with the Rising Sea

The American ethos of individualism is clashing with efforts to protect coastal communities against sea level rise, often to the homeowners’ detriment.

Beach Nourishment Boom (Credit: Program for the Study of Developed Shorelines at Western Carolina University)
This year the town of Scituate, Massachusetts, which includes Humarock, proposed building a $9.6 million artificial dune and raised road to protect the homes.

Yet some residents are prepared to block the project.  The town is asking them to sign easements that would cede property rights along the privately owned beach and allow public access.  Whatever concerns they have about protecting their homes are being overridden by fear of permanently relinquishing control of their property.

Rising sea levels driven by climate change are forcing communities like Humarock to confront a troubling future.  The global water line has risen by about 8 inches on average since 1900, and it's expected to rise about that much or more by 2050.

As public officials at all levels of government try to protect the nation's coasts from rising seas, they're confronting an American ethos that champions individualism over central planning.  The federal government has no master plan for adapting to sea level rise.  States often leave critical decisions about coastal infrastructure to local governments.  And many people would prefer to protect their own property.

Coastal towns face a sobering reality:  They've been losing land for a century, and they'll lose even more in the decades ahead.  To fight this encroachment, states, towns, and the federal government have spent billions of dollars bolstering dunes and beaches with sand pumped from the seafloor or imported from inland mines—more than $3.1 billion from 2007-2016, according to data compiled by Western Carolina University.

Beach building is one of the more effective, environmentally friendly measures against coastal erosion, according to geologists and engineers.  But some beachfront homeowners have resisted, particularly when they've been obliged to sign easements that open their property to public access.

Read more at An American Beach Story:  When Property Rights Clash with the Rising Sea

Why the Future of Batteries Is Lithium, and Why Their Impact Will Be Bigger than You Think - by Gerard Reid

The tremendous advances being made in lithium battery technology are being underestimated by many people, writes financial energy specialist Gerard Reid. Competitive EVs are just a few years away. They will be followed by radical improvements that will have huge implications for air and ship transport as well. (Credit: Energy and Carbon blog)
I constantly hear from people that batteries are just not good enough; that they are too expensive, that they cannot hold enough energy, that they take too long to charge.  In a nutshell, that they can never replace oil to power the automobile or be used alongside solar panels to power our everyday home needs.

The interesting thing is that I agree with all these criticisms.  However, I am also clear that these challenges will be solved in the coming years thanks to the usage of new chemistries and materials as well as intense competition between battery manufacturers.  I am also convinced that better batteries will lead to the electrification of not just cars but also trucks, buses, and increasingly air and sea transport.

At the same time, we will use these batteries in our homes, businesses, and grids as part of decentralized energy systems.  And all these batteries will be most likely based around the element lithium.

Better materials
Lithium is ... unique as a material in that it is very light, with the lowest reduction potential of any chemical element, which allows batteries based on lithium to have unbeatable performance.  The other advantage is that there is lots of lithium out there:  some 400 years of output, according to the US Geological Survey.

Energy Density (Credit: Alexa Capital)
The most popular type of lithium battery is the lithium-ion battery, which, because of its unmatchable combination of higher energy and power density, has become the rechargeable battery of choice for power tools, mobile phones, laptops, and increasingly electrical vehicles (EVs).  That all said, there are many different types of lithium-ion battery.

And I don’t mean just different manufacturers such as Panasonic, LG Chem, CATL, and Samsung.  There are five major types of lithium-ion battery chemistry:  LFP (lithium iron phosphate), NMC (nickel manganese cobalt), NCA (nickel cobalt aluminum), LMO (lithium manganese oxide), and LCO (lithium cobalt oxide), all of which have differing strengths and weaknesses, and all of which are used in different applications.

NMC, for instance, is generally regarded as the chemistry with the most potential for use in the EV given its high performance, safety, and low cost.  And in the short term there is significant potential to reduce the cost and improve the performance of NMC batteries.

Currently, the standard NMC lithium-ion battery is called a 333, meaning it uses 3 parts nickel, 3 parts manganese, and 3 parts cobalt.  Going forward, we will see 811 NMC batteries, which will use more nickel, which increases performance, and less cobalt, which decreases cost.  However, none of these lithium-ion chemistries are going to provide the energy and power density required to power an airplane, so the search is on for better materials.
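A quick back-of-envelope check shows how sharply the 811 recipe cuts the cobalt share.  This sketch assumes the names refer to simple 1:1:1 and 8:1:1 molar ratios of the transition metals and uses standard atomic weights; real cathode formulations vary.

```python
# Back-of-envelope comparison of cobalt content in NMC 333 (1:1:1) vs. 811
# cathodes, assuming the digits are molar ratios of the transition metals.

ATOMIC_MASS = {"Ni": 58.69, "Mn": 54.94, "Co": 58.93}  # g/mol, standard values

def cobalt_mass_fraction(ni, mn, co):
    """Cobalt's share of the transition-metal mass for a given molar ratio."""
    total = (ni * ATOMIC_MASS["Ni"]
             + mn * ATOMIC_MASS["Mn"]
             + co * ATOMIC_MASS["Co"])
    return co * ATOMIC_MASS["Co"] / total

frac_333 = cobalt_mass_fraction(1, 1, 1)  # roughly a third of the metal mass
frac_811 = cobalt_mass_fraction(8, 1, 1)  # roughly a tenth
```

Under these assumptions, moving from 333 to 811 cuts cobalt from about 34% to about 10% of the cathode's transition-metal mass, which is why the shift matters for cost.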

Read more at Why the Future of Batteries Is Lithium, and Why Their Impact Will Be Bigger than You Think

Three Things that Wouldn't Have Happened in 2016 Without Climate Change

Vehicles are submerged in floodwaters on July 9, 2016 in Wuhan, Hubei Province of China. Many parts of the Wuhan area were submerged in floodwaters during July as torrential rains affected the Yangtze River valley. A new study finds that climate change increased the risk of the rains that led to the flooding by 17 – 59%. (Image credit: VCG/VCG via Getty Images)
“Impossible” is a fraught word, but a new set of studies concludes that at least three atmospheric and oceanic phenomena from 2016 wouldn’t have occurred had we not been adding greenhouse gases to the air for more than a century.

Record heat in Asia, record-high global temperature, and a marine “heat wave” in the far North Pacific are among 27 events analyzed in Explaining Extreme Events in 2016 from a Climate Perspective.  This report, free to download in its entirety, is the sixth annual compilation of climate attribution studies published as a supplement to the Bulletin of the American Meteorological Society (BAMS).  The 2016 report was released on Wednesday.

The BAMS series has become the leading venue for papers in the climate-science subfield called detection and attribution (D&A).  The idea in such work is to identify a measurable weather/climate change (detection) and to determine what caused it (attribution).  Most D&A studies end up yielding probabilities, typically the extent to which odds of a given event have been raised by climate change.  This year, the BAMS report includes three events that go so far beyond any modern precedent that they can’t be explained without invoking human-produced greenhouse gases.
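The "raised odds" framing has a standard quantitative form in attribution work: compare an event's probability in a world with human forcing against its probability in a counterfactual natural world.  The sketch below is illustrative only; the probabilities are invented, not taken from the BAMS report.

```python
# Minimal sketch of how detection-and-attribution studies quantify changed
# odds: compare event probability in an "all-forcings" world (P1) against a
# "natural-only" world (P0). The numbers below are invented for illustration.

def risk_ratio(p_with_humans, p_natural):
    """Return (RR, FAR): the risk ratio RR = P1 / P0, and the fraction of
    attributable risk FAR = 1 - P0 / P1 (the share of the event's
    probability attributable to human influence)."""
    rr = p_with_humans / p_natural
    far = 1.0 - p_natural / p_with_humans
    return rr, far

# Hypothetical exceedance probabilities from two model ensembles
rr, far = risk_ratio(0.10, 0.02)
# An event "not possible without human influence" corresponds to the limit
# p_natural -> 0, where RR diverges and FAR approaches 1.
```

The three events highlighted in the 2016 report sit in that limiting case: the natural-world probability is effectively zero, so no finite risk ratio captures them.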

"This report marks a fundamental change," said Jeff Rosenfeld, editor-in-chief of BAMS.  “For years scientists have known humans are changing the risk of some extremes.  But finding multiple extreme events that weren’t even possible without human influence makes clear that we're experiencing new weather because we've made a new climate."

Read more at Three Things that Wouldn't Have Happened in 2016 Without Climate Change

The US Is Penny Wise and Pound Foolish on the Climate - by John Abraham

As America is battered by climate-intensified weather disasters, Republican politicians are trying to slash climate research funding.

Floodwaters from Tropical Storm Harvey surround homes in Port Arthur, Texas, on Aug. 31, 2017. (Photograph Credit:  Gerald Herbert/AP)
The United States is great in many respects.  But we certainly aren’t perfect; we’ve made some pretty silly choices.  One of the dumb choices politicians in the United States want to make is to defund climate science so we won’t be able to prepare for increased disasters in the future.  We can see how shortsighted this is when compared with the costs of disasters.

Just think about the respective magnitudes.  Estimates put the costs of the three big 2017 hurricanes (Harvey, Irma, and Maria) at approximately $200 billion.  It is somewhat challenging to estimate the actual cost because not only is there rebuilding that must occur, but there are also lingering damages from loss of power, dislocation of people, and other long-lasting factors.  Some reports estimate that the damage may end up being as high as $300 billion – a staggering amount.

It isn’t just hurricanes that cause damage.  As I write this, terrible fires are devastating parts of California, damaging property and agricultural lands.  This is on top of earlier fires elsewhere in the region, which followed closely on record droughts that had persisted in the preceding five years.

Earlier in the year the United States had other disasters that reached a billion dollars or more in damages (two floods, seven severe storms among others).  NOAA provides an excellent summary.

These disasters are not limited to the United States, of course.  Extreme weather fueled by human carbon pollution is occurring around the world.

But in the midst of this, President Trump and many Republican elected officials want to decrease our spending on climate science.  In the United States we have flagship organizations like NASA and NOAA that are our eyes and ears on the climate.  But throughout the year, Trump has worked to get NASA to sharply reduce or even stop climate research.  NASA has two main missions.  One mission is exploration – going to Mars, the moon, and sending exploration satellites that look outward.  The other part of NASA’s mission is to look inwards, at our own planet.  To do this, they use many instruments, including satellites to measure what is happening on Earth. 

Trump and his administration want to jettison the Earth research portion of NASA’s mission.  This obviously isn’t to save money; the amount we spend on Earth-focused missions is very small.  Rather, it is to halt research into the Earth’s climate.  The following chart compares the cost savings from budget cuts with the extreme weather costs just this year in the USA.

Climate scientists have won the war on the facts.  We know it is warming, we know how fast it is warming.  We know what is causing the warming.  And, we know what to do about it.  Since Trump (and sadly the Republican Party as a whole) have lost that battle, they have decided to blind us so we just won’t know what is happening. 

Read more at The US Is Penny Wise and Pound Foolish on the Climate

Bitcoin Reforms Proposed to Curb Soaring Carbon Footprint

Bitcoin mining uses increasing amounts of energy (Picture Credit: Pixabay)
Clean Coin, a research group tracking the carbon footprint of bitcoin and its rival cryptocurrency ethereum, says they emitted 9.3 million and 3.4 million tonnes of CO2 respectively in November.

If half the population embraces digital coins by 2030, they will use the equivalent of today’s entire global electricity production, the organization claims.

In a white paper to be published on Monday, Clean Coin will suggest ways to reduce bitcoin mining’s energy consumption.

“The paper analyzes the question of what makes a coin clean, how a digital currency has to be to make sense from an environmental viewpoint.  It is a complex question with no simple answers,” said Beglinger.

Mining must be streamlined, he said – especially the “proof of work” component in which a miner solves a calculation to produce a new coin and show it is valid for the system.  To save energy, the paper proposes a “proof of stake” method in which network stakeholders or owners automatically validate coin production, rendering the “proof of work” process unnecessary.
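The contrast between the two validation schemes can be made concrete with a toy sketch.  This is illustrative only: real bitcoin and ethereum protocols are vastly more involved, and the difficulty target here is trivially small so the search finishes instantly.

```python
import hashlib
import random

# Toy contrast between proof of work and proof of stake (illustrative only).

def proof_of_work(block_data, difficulty_prefix="00"):
    """Grind nonces until the block hash starts with the required prefix.
    This brute-force search is what consumes the energy."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce
        nonce += 1

def proof_of_stake(stakes, seed):
    """Pick a validator with probability proportional to stake: one cheap
    weighted draw replaces the hash grinding above."""
    rng = random.Random(seed)
    names = list(stakes)
    return rng.choices(names, weights=[stakes[n] for n in names], k=1)[0]

nonce = proof_of_work("block 42")
validator = proof_of_stake({"alice": 60, "bob": 30, "carol": 10}, seed=1)
```

The energy argument falls out directly: the first function's cost scales exponentially with the difficulty prefix, while the second's cost is constant regardless of how much value the network secures.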

Read more at Bitcoin Reforms Proposed to Curb Soaring Carbon Footprint

Thursday, December 14, 2017

  Thursday, Dec 14


How to Build a More Resilient Power Grid

After Superstorm Sandy hit New York City in October 2012, the city's famous skyline was mostly dark. (Photo Credit: Reuters/Eduardo Munoz)
Abi-Samra has more than 35 years of experience in power generation, transmission, distribution, retail, and end-use energy applications.  He is president of Electric Power & Energy Consulting and an adjunct professor with UC San Diego.  He also is the author of a new book Power Grid Resiliency for Adverse Conditions (Artech House, 2017).

The book is part technical reference guide and part history lesson.  In it, Abi-Samra describes the impacts of heat waves, ice storms, and hurricanes on grid operations through case studies from North America, Europe, and Asia.

Synchrophasors measure the instantaneous voltage, current, and frequency at specific locations on the grid, offering operators a near-real-time picture of what’s happening on the system, which lets them take action to prevent power outages.

Load shedding involves the short-term interruption of power to one or more end users to allow the grid to rebalance itself.  Many industrial-scale power users trade off the occasional loss of power for lower power prices, known as “interruptible rates.”
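The interruptible-rate mechanism amounts to a priority ordering: when demand outruns supply, drop customers who accepted that risk in exchange for cheaper power, and never drop the rest.  A minimal sketch, with customer names and megawatt figures invented for illustration:

```python
# Toy load-shedding sketch: interrupt interruptible-rate customers first when
# demand exceeds supply (names and numbers are made up for this example).

def shed_load(supply_mw, loads):
    """loads: list of (name, demand_mw, interruptible) tuples.
    Returns the names of customers shed to rebalance the grid."""
    total = sum(demand for _, demand, _ in loads)
    shed = []
    for name, demand, interruptible in loads:
        if total <= supply_mw:
            break  # grid is balanced; stop shedding
        if interruptible:
            shed.append(name)
            total -= demand
    return shed

loads = [("smelter", 40, True), ("hospital", 20, False), ("mill", 30, True)]
shed = shed_load(70, loads)
```

With 90 MW of demand against 70 MW of supply, dropping the smelter alone restores balance; the hospital, on a firm tariff, is never touched.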

The early 2000s were also marked by hurricanes that hit Florida and Louisiana particularly hard.  Widespread loss of transmission and distribution poles led to efforts to replace wooden poles with steel and concrete.  Further hardening came after Hurricane Katrina devastated substations, leading to investments to elevate them above storm surge levels.

Superstorm Sandy in 2012 exposed storm vulnerabilities in the Northeast, particularly the near-impossibility of insulating a system from damage in the face of fearsome winds and flooding.

The idea that resulted from Sandy was to “allow the system to fail, but in such a way that it could quickly recover,” Abi-Samra says.  This illustrates another lesson:  Efforts intended merely to harden infrastructure are not enough—the grid also needs to be resilient.

Hardening and resiliency are different concepts, Abi-Samra says.  Resiliency refers to characteristics of the infrastructure and operations such as strength and the ability to make a fast recovery, which help utilities minimize or altogether avoid disruptions during and after an extreme weather event.

Read more at How to Build a More Resilient Power Grid

China to Launch Nationwide Carbon Market Next Week

The emissions trading scheme will initially cover only the power sector, not heavy industry as planned, but will nonetheless become the biggest in the world.

China is launching the biggest carbon market in the world, which will require power plants to hold emissions permits (Picture Credit: Flickr/V.T. Polywoda)
China’s long-awaited nationwide emissions trading scheme (ETS) will be officially launched on 19 December, starting with the power sector only, according to a document from the National Development and Reform Commission (NDRC).

It represents a scaling back from the original plan for eight economic sectors to take part in the carbon market:  petrochemicals, chemicals, building materials, iron and steel, non-ferrous metals, paper, power, and aviation.

Nonetheless, it will instantly overtake the EU’s carbon market to become the world’s largest.  The power sector accounts for 46% of China’s carbon dioxide emissions, of which an estimated 39% will be covered by the ETS, according to data from the World Resources Institute.

Explaining the change, Chinese officials said some industrial sectors did not have strong statistical foundations, and the system would involve constant testing and continuous adjustments.

Carbon futures trading will not be available at the launch stage of the scheme, Xie Zhenhua, China’s special representative for climate change, said during the UN climate conference in Bonn last month.  It is intended to create a cost for emitting carbon, not a platform for market speculation, he said.

An official at NDRC who asked not to be named said the conservative approach reflected the importance leaders attached to the overall stability of the country’s financial markets.

Read more at China to Launch Nationwide Carbon Market Next Week:  Officials

GateHouse Media Publishes ... Anti-Wind Article Devoid of Scientific Evidence

Multiple exposures of a wind turbine (Image Credit: Stefan Lins via Flickr CC)
This week, GateHouse Media published a long-form investigative report called “In the Shadow of Wind Farms” claiming that wind energy has caused negative health effects for residents living near wind turbines — a claim that flies in the face of actual science.

GateHouse Media’s anti-wind article leans almost entirely on anecdotal evidence compiled during its six-month long project that included interviews with dozens of people who claim negative outcomes from living near wind farms.

Meanwhile, in the realm of scientific facts, the American Wind Energy Association, the main trade group representing the wind power industry, points to 25 scientific reviews that document the safety of wind farms for human health and the environment.  One health researcher told DeSmog the GateHouse article was “simply irresponsible journalism” that actually had “potential to exacerbate the experience of anxiety and related health effects.”

Although GateHouse, a syndication outfit that publishes 130 daily newspapers in 36 states, briefly acknowledges that scientific evidence for these claims is nearly non-existent, it plows ahead with a lengthy article full of anecdotes and claims unsupported by real science.

How this piece got published with such a one-sided (anti-wind) approach is a question worthy of many letters to the editor.  Time will tell whether GateHouse — which is notoriously quiet when other media outlets ask questions about its operations — will answer for or attempt to defend this attack piece.

But there is no hiding that the anti-wind movement is clearly coordinated, which even GateHouse admits its reporters were told repeatedly by sources contacted for their piece.

“Many of the people who do complain, several representatives said, are well-known among industry insiders and comprise a small but vocal group of anti-wind activists,” the article said.

“There are a good number of people who seem to pop up in different states and fight any wind project they can find,” Dave Anderson of the Energy and Policy Institute, a pro-renewable energy watchdog group, told GateHouse.

But the GateHouse reporters failed to disclose important information about several of their sources, including some with extensive ties to highly coordinated anti-wind activism.

Read more at In the Shadow of Honest Journalism:  GateHouse Media Publishes Atrocious Anti-Wind Article Devoid of Scientific Evidence

High-Resolution Climate Models Present Alarming New [Temperature] Projections for US

Climate change (Credit: © f9photos / Fotolia)
Approaching the second half of the century, the United States is likely to experience increases in the number of days with extreme heat, the frequency and duration of heat waves, and the length of the growing season.  In response, it is anticipated that societal, agricultural and ecological needs will increase the demand on already-strained natural resources like water and energy.  University of Illinois researchers have developed new, high-resolution climate models that may help policymakers mitigate these effects at a local level.

In a paper published in the journal Earth's Future, atmospheric sciences professor Donald Wuebbles, graduate student Zach Zobel, and Argonne National Laboratory scientists Jiali Wang and Rao Kotamarthi demonstrate how increased-resolution modeling can improve future climate projections.

Many climate models use a spatial resolution of hundreds of kilometers. This approach is suitable for global-scale models that run for centuries into the future, but they fail to capture small-scale land and weather features that influence local atmospheric events, the researchers said.

"Our new models work at a spatial resolution of 12 km, allowing us to examine localized changes in the climate system across the continental U.S.," Wuebbles said.  "It is the difference between being able to resolve something as small as Champaign County versus the entire state of Illinois -- it's a big improvement."

The study looked at two different future greenhouse gas output projections -- one "business as usual" scenario where fossil fuel consumption remains on its current trajectory and one that implies a significant reduction in consumption by the end of the century.  The group generated data for two decade-long projections (2045-54 and 2085-94) and compared them with historical data (1995-2004) for context.

"One of the most alarming findings in our business-as-usual projection shows that by late-century the southeastern U.S. will experience maximum summer temperatures every other day that used to occur only once every 20 days," Zobel said.

Although not as severe, other regions of the country are also expected to experience significant changes in temperature.

"The Midwest could see large unusual heat events, like the 1995 Chicago heat wave, which killed more than 800 people, become more common and perhaps even occur as many as five times per year by the end of the century," Wuebbles said.  "Heat waves increase the mortality rate within the Midwest and the Northeast because people in these densely populated regions are not accustomed to coping with that kind of heat that frequently."

The extreme temperatures and extended duration of the warmer season will likely take a significant toll on crops and the ecosystem, the researchers said.  Areas like the American West, which is already grappling with limited water resources, could see much shorter frost seasons at high elevations, leading to a smaller surge in spring meltwater than the early growing season needs.

"The high resolution of our models can capture regional climate variables caused by local landforms like mountains, valleys and bodies of water," Zobel said.  "That will allow policymakers to tailor response actions in a very localized way."

The new models concentrate on temperature and do not factor in the effect that regional precipitation patterns will have on the impact of the anticipated climate changes.  The researchers plan to extend their study to account for these additional variables.

Read more at High-Resolution Climate Models Present Alarming New Projections for US

The Warming Arctic Could Put a Serious Dent in Wind Energy Production.

Wind farm (Credit: Francois le Diascorn/Getty Images)
There’s a pretty steep temperature difference between the North Pole and the equator.  But that difference — which fuels atmospheric energy, powering storm systems and the breeze — is shrinking, and it’s altering the distribution of wind energy resources around the globe.

This could put significant strain on wind power production in the United States, Europe, and Asia, a study published Monday in the journal Nature suggests.

The study shows that warming temperatures could result in a 17 percent drop in wind power in the U.S. and a 10 percent decline in the U.K. by 2100.  On the upside, wind power in Australia, Brazil, and West Africa could actually increase.

Read more at The Warming Arctic Could Put a Serious Dent in Wind Energy Production.

Wednesday, December 13, 2017

Global surface temperature relative to 1880-1920, based on the GISTEMP analysis (mostly NOAA data sources, as described by Hansen, J., R. Ruedy, M. Sato, and K. Lo, 2010: Global surface temperature change. Rev. Geophys., 48, RG4004).  We suggest in an upcoming paper that the temperature in 1940-45 is exaggerated because of data inhomogeneity in WW II.  A linear fit to temperature since 1970 yields a present temperature of 1.06°C, which is perhaps our best estimate of warming since the preindustrial period.

A Harder Rain’s A-Gonna Fall in the US

“The floods of the future are likely to be much greater than what our current infrastructure is designed for”

Much more heavy rain is likely. (Image Credit: Nicholas A. Tonelli, via Wikimedia Commons)
For the US, harder rain is on the way:  America’s summer thunderstorms are about to get stormier.  Later this century, the notorious mesoscale convective storms of middle America will not just darken US skies:  they will dump as much as 80% more water on the farms, highways, and cities of the 48 contiguous states.

Mesoscale thunderstorm systems span around 100 kilometers:  they have been increasing in both frequency and intensity over the last 35 years, and new research suggests that, as the world warms, their frequency could triple.

“The combination of more intense rainfall and the spreading of heavy rainfall over larger areas means that we will face a higher flood risk than previously predicted,” said Andreas Prein, of the National Center for Atmospheric Research in the US, who led the study.

“If a whole catchment area gets hammered by high rain rates, that creates a much more serious situation than a thunderstorm dropping intense rain over parts of the catchment.  This implies that the flood guidelines which are used in planning and building infrastructure are probably too conservative.”

Thunderstorms already cost the US around $20bn a year in flash floods, landslides, debris flows, high winds, and hail.  Dr Prein and his colleagues report in Nature Climate Change that what they call “observed extreme daily precipitation” increased in all parts of the US from 1958 to 2012:  that is because rising temperatures mean more evaporation, and at the same time a greater atmospheric capacity for moisture.

US President Donald Trump has made it clear that he doesn’t believe in global warming and has promised to withdraw the US from the global climate pact agreed by 197 nations in Paris in 2015.

But research, much of it from US government agencies, suggests that climate change is happening anyway, and that US cities are at risk.  The latest computer simulations suggest that the number of extreme summer storms in some parts of the US could have increased fivefold by the century’s end.

Even the eastern seaboard could be hit:  intense storms over an area the size of New York City could drop 60% more rain than the heaviest now.  And this could add up to six times the annual discharge of the Hudson River.

The finding should come as no great surprise.  Climate scientists have repeatedly warned that global warming, fueled by the profligate combustion of fossil fuels that dumps ever greater levels of greenhouse gases into the atmosphere, could bring ever greater extremes of heat and rain.

Read more at A Harder Rain’s A-Gonna Fall in the US

Arctic Report Card:  Lowest Sea Ice on Record, 2nd Warmest Year

Climate scientists say the magnitude and rate of sea ice loss this century is unprecedented in 1,500 years and issue a warning on the impacts of a changing climate.

Older Arctic sea ice is being replaced by thinner, younger ice. NOAA reports that multiyear ice accounts for just 21 percent of the ice cover in 2017. (Credit: Thomas Newman/CICS-MD/NOAA)
The Arctic experienced its second-warmest year on record in 2017, behind only 2016, and not even a cooler summer and fall could help the sea ice rebound, according to the latest Arctic Report Card.

"This year's observations confirm that the Arctic shows no signs of returning to the reliably frozen state that it was in just a decade ago," said Jeremy Mathis, director of the Arctic program at National Oceanic and Atmospheric Administration (NOAA), which publishes the annual scientific assessment.

"These changes will impact all of our lives," Mathis said.  "They will mean living with more extreme weather events, paying higher food prices and dealing with the impacts of climate refugees."

The sea ice in the Arctic has been declining this century at rates not seen in at least 1,500 years, and the region continued to warm this year at about twice the global average, according to the report.  Temperatures were 1.6°C above the 1981-2010 historical average, despite the absence of an El Niño, which brings warmer air to the Arctic, and despite summer and fall temperatures more in line with historical averages.

Read more at Arctic Report Card:  Lowest Sea Ice on Record, 2nd Warmest Year