Yale Center for Environmental Law & Policy

YCELP News Feed


On the Environment

Thursday, May 05, 2011

Nifty New LCA Calculator

By testpersona

A sleek and simple new Life Cycle Assessment (LCA) calculator makes it easy for businesses of all sizes to calculate their environmental impact – from cradle to grave.

Posted in: Innovation & Environment, Environmental Performance Measurement

Green I.T.: Now an “Essential”

By Guest Author, Jose Iglesias, Vice President of Education and Enablement Services, Symantec Services Group (SSG), Symantec Corporation

In past years, green IT seemed to be more of a "wish list" item, something that companies might look into sometime in the future or when it became convenient. This is no longer the case. Companies are now actively pursuing green IT solutions for a multitude of reasons.

Ninety-seven percent of companies are at least discussing a green IT strategy. Fifty-two percent are in the discussion or trial stages, while forty-five percent have already implemented a strategy.

Additionally, 87 percent of companies said that it is somewhat/significantly important that their IT organization implement green IT initiatives. Only two percent said it was somewhat/significantly unimportant.

Companies are no longer seeking green IT merely to cut costs, either. True, reducing energy consumption (90 percent) and reducing cooling costs (87 percent) were the most important reasons companies listed for implementing green IT. However, a desire from corporate headquarters to qualify as "green" (86 percent) was nearly as important.

Finally, 81 percent of companies listed reducing energy and cooling consumption among goals included in their green policies, followed by reducing carbon emissions (74 percent) and improving the company’s reputation (67 percent).

As a result of its ongoing Green IT efforts, Symantec is achieving substantial business benefit. The Alchemy Solutions Group conducted a Total Operational and Economic Impact (T.O.E.I.)™ analysis and quantified realized and projected business value in the following areas between July 2007 and December 2011:

  • Remote Site Backup Productivity Gains: $692,743 in hardware and media cost avoidance and $443,328 in labor productivity gains through global remote site backup with Veritas NetBackup PureDisk from July 2007 through December 2011.
  • Hardware Maintenance Cost Savings: $12,358,000 in maintenance savings on retired server and storage hardware from August 2007 through December 2010.
  • Labor Productivity Gains: $1,341,130 in IT productivity gains related to server and storage reduction from January 2008 through December 2010.
  • Energy Cost Avoidance: $2,164,438 in utility cost avoidance through hardware device reduction and corresponding power consumption savings from August 2007 through December 2010.


The decommissioning of hardware from a major data center closure reduced Symantec’s overall device power utilization from approximately 500 kilowatt hours (kWh) to 168 kWh, a 67 percent reduction in energy consumption.

Further, by converting the kilowatt-hours of electricity avoided to kilograms of carbon emissions, Symantec conservatively estimates a cumulative carbon footprint savings of 15.5 million kilograms of CO2 from 2007 through 2010.
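As a rough cross-check – a sketch with assumed factors, not Symantec's published methodology (commercial electricity at about $0.07 per kWh and a grid emissions factor of roughly 0.5 kg CO2 per kWh) – the utility savings above and the carbon estimate line up:

$$\frac{\$2{,}164{,}438}{\$0.07/\text{kWh}} \approx 31\ \text{million kWh}, \qquad 31\ \text{million kWh} \times 0.5\ \tfrac{\text{kg CO}_2}{\text{kWh}} \approx 15.5\ \text{million kg CO}_2$$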

Finally, in addition to the benefits realized within the IT data center environment, Symantec also realized significant cost savings by stemming energy use at the IT endpoint. By deploying an automated power management profile that places company laptops and desktops in standby mode after four hours of inactivity, the company expects annual savings of $800,000 and more than 6 million kWh of energy per year.
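A quick division shows those two endpoint figures are mutually consistent (again a sketch, not Symantec's published math):

$$\frac{\$800{,}000}{6{,}000{,}000\ \text{kWh}} \approx \$0.13/\text{kWh}$$

i.e., an implied electricity rate in the range of typical U.S. commercial prices.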

(Source: Symantec Data Center Survey 2010, 1,052 worldwide enterprise companies)

This post, by guest author Jose Iglesias, Vice President of Education and Enablement Services, Symantec Services Group (SSG), Symantec Corporation, was originally published on the Green to Gold Business Playbook website.

Posted in: Innovation & Environment
Wednesday, May 04, 2011

Achieving Zero Waste to Landfill

By Guest Authors Steve Walker, Manager of Environmental Sustainability, Burt’s Bees, and Bill Morrissey, VP of Environmental Sustainability, The Clorox Company

Companies are increasingly finding that striving for zero waste-to-landfill (ZWL) can be a powerful, mobilizing sustainability initiative that can also deliver cost savings, provide a new revenue stream, and reinforce an efficient-operations mindset. Here is one company's story and some of its lessons learned: The Clorox Company's Burt's Bees division achieved ZWL across its administrative, manufacturing, and distribution operations in April 2010.

Zero waste-to-landfill (ZWL) is part of a larger zero waste aspiration whereby manufacturers that are exemplars in sustainability strive to eliminate waste throughout the full life cycle of their products. In 2006, the Burt’s Bees business unit set a goal to be zero waste by 2020. Achieving zero solid waste to landfill in its operations in 2010 was an important milestone in this larger journey to zero waste. Here are some of the learnings the Burt’s Bees team garnered from its recent ZWL effort:

Define zero – Surprisingly, a common ZWL standard does not exist so it is important to get very clear about how you define zero. The Zero Waste International Alliance (http://www.zerowaste.org), a non-profit focused on eliminating waste, has previously defined ZWL to be 90% or greater diversion. But companies claiming ZWL today are more typically reporting 100% absolute diversion from landfill rates. These companies, however, usually do not account for waste generated outside their facilities such as the resulting ash when sending waste to a waste-to-energy facility. The Burt’s Bees team decided on a strict definition of zero, which included this remnant ash that usually finds its way to landfills. As a result, they found a firm that turns its non-recyclable, non-compostable materials into an efficient fuel for cement processing, with that residual ash actually then incorporated into the cement itself.

By-Product Hierarchy

Map out how you are going to get to zero – The Burt’s Bees team looked broadly at all their solid waste by using the term “by-product,” which they defined as anything leaving their facility other than a person or a saleable finished good. They then created a by-product hierarchy (above) that prioritized how materials should be diverted from landfill. Giving higher value to source reduction and reuse than to composting and recycling, and treating waste-to-energy as a last resort, provided strong guidance on their path to zero. As a result, today the Burt’s Bees division sends less than 10% of its waste to the more expensive and less eco-friendly waste-to-energy destination.

Learning from the Burt’s Bees team’s experience, Clorox has stipulated that a facility aiming for ZWL must not send more than 10% of its waste to waste-to-energy facilities. Clorox believes that in a ZWL facility, the “smell of the place” should be that of a highly efficient and responsible manager of its waste, with low waste levels and robust composting and recycling infrastructure.

Kick-start your ZWL journey with a high-involvement dumpster dive – In order to make waste visible and real to all employees, the Burt’s Bees team organized “dumpster dives,” giving everyone the opportunity to get up close and personal with their trash. This exercise involves dumping your trash dumpster onto your parking lot and having your employees literally sort the resulting pile of trash. This eye-opening educational exercise showed employees that the majority of this landfill-bound trash was actually either compostable or recyclable, and it resulted in an immediate 50% reduction in trash-to-landfill volume.

Be firm with your goals but flexible in your tactics – Different facilities require different approaches. In a manufacturing operation, having conveniently placed trash and recycling gaylords where the waste is generated can facilitate higher sorting rates. On the other hand, removing individual under-the-desk trash bins from Burt’s Bees administrative offices and forcing a quick trip to central in-office by-product stations facilitated sorting by taking away a convenient way for employees to throw compostable and recyclable items into their nearby trash bins.

Educate your employees – Even eco-minded employees do not necessarily know how to accurately sort the many waste items one encounters into the various recycling, compost, and trash streams. Burt’s Bees posted bilingual signs at by-product bins showing acceptable materials along with photos. Colored bins were employed, and by-product station locations were included in the company’s workplace organization program. It also helped to have a trained group of employee volunteers serving as “trash experts” so that employees could get quick answers to their inevitable sorting questions.

Continuously monitor and measure your progress - ZWL is typically a rather long journey. It took the Burt’s Bees business three years to achieve this at its three facilities, so it is important to provide regular feedback via robust monitoring and measurement in order to see and celebrate progress. A key enabler for the Burt’s Bees team was the “Green Derby” monthly audit of by-product bins which scored the accuracy of sorting waste into composting, recycling and residual trash. A progress report became a standing agenda item at monthly all-employee manufacturing and distribution meetings, and progress was tied to employees’ short-term incentive eligibility. Industrial floor scales were also deployed at each facility to weigh by-products before they were shipped off-site which allowed Burt’s Bees to ensure landfill waste was being reduced as well as diverted.
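To illustrate the kind of arithmetic those floor scales enable, here is a minimal sketch (the stream names and weights are hypothetical, not Burt’s Bees data):

```python
# Hypothetical monthly by-product weights (kg) from a facility's floor scales.
monthly_weights_kg = {
    "compost": 5200,
    "recycling": 3100,
    "waste_to_energy": 450,
    "landfill": 0,
}

total = sum(monthly_weights_kg.values())
diverted = total - monthly_weights_kg["landfill"]

# Diversion rate: the share of by-product kept out of landfill (100% at ZWL).
print(f"Diversion rate: {diverted / total:.1%}")
# Clorox's rule of thumb: waste-to-energy should stay under 10% of the total.
print(f"Waste-to-energy share: {monthly_weights_kg['waste_to_energy'] / total:.1%}")
```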

Leverage your waste diversion partner to achieve your ZWL goal – An important factor in the Burt’s Bees team’s success was enlisting a waste management expert, who intermediates all of its diversion needs with 17 service providers. The Burt’s Bees team is also able to leverage the larger waste volumes of this partner for more favorable national contract pricing. If working directly with a recycler, remember that recyclables are commodities and it’s in the recycling firm’s interest to take the high-value, high-volume materials. Don’t hold back in pushing your recycler(s) to “take the good with the bad.” For instance, using your valuable streams (such as cardboard) to incentivize a recycling outlet to take less desirable materials (such as mixed plastic films) can create a win/win for you and your recycler. Finally, keep up to date, as non-landfill outlets for by-products are evolving rapidly; what was not possible yesterday may be tomorrow, or even today.

Choose one or a few ZWL pilot sites as a beacon for your entire organization – The Burt’s Bees business has served this function within The Clorox Company. Now, Clorox has the confidence to expand ZWL to other select manufacturing, distribution and administrative facilities. Also, without an internal exemplar like the Burt’s Bees business, it is likely that Clorox would have been satisfied with achieving its current overall goal of reducing solid waste to landfill by 20% by 2013. Today, Clorox is able to see how achieving zero waste is possible which works as an accelerant across the whole enterprise.

Institutionalize your ZWL achievement – Being a ZWL operation is now part of the Burt’s Bees division’s identity. And today, there is no choice but to divert, as there are no longer trash compactors or dumpsters on any Burt’s Bees sites.

 

This post, by Steve Walker, Manager of Environmental Sustainability, Burt’s Bees, and Bill Morrissey, VP of Environmental Sustainability, The Clorox Company, was originally published on the Green to Gold Business Playbook website.

Posted in: Innovation & Environment
Wednesday, April 20, 2011

American Electric Power v. Connecticut: The Good News

By Guest Author, Douglas Kysar

This post by Yale Law School Professor Doug Kysar was originally published on the American Constitution Society website and is reprinted here with permission.

In one of the most, er, hotly anticipated cases of its term, the Supreme Court yesterday heard arguments in the climate change nuisance suit of Connecticut v. American Electric Power. From the beginning of this litigation, pundits have questioned the plaintiffs’ decision to seek injunctive relief gradually abating the defendants’ greenhouse gas emissions. To critics, this form of relief – as opposed to, say, monetary damages – seems to highlight the complex and value-laden aspects of climate change as a policy problem, making judges more likely to dismiss the suit as lying beyond the ken of the judicial branch.

Yesterday morning’s argument confirmed the pundits’ view, as even reliably liberal justices like Ruth Bader Ginsburg greeted the plaintiffs’ claims with palpable skepticism. Justice Ginsburg’s money quote, which is being cited around the blogosphere, came when she told the plaintiffs that their prayer for relief “sounds like the kind of thing EPA does.” Justice Kagan quickly piled on: “It sounds like the paradigmatic thing that administrative agencies do rather than courts.” Justice Breyer, ever the policy wonk, wondered aloud whether “the courts [can] set a tax” because, in his words, from “what I get from reading, these [carbon taxes] might be the best way to deal with the problem.”  (Answer: Courts set implicit harm taxes every day in the form of monetary tort awards. Bonus Answer: The Clean Air Act might well be a great way to deal with the problem, as the benefits of emissions permits have been oversold and the likelihood of a carbon tax passing Congress is nil).  For her part, Justice Sotomayor was nowhere to be found since she had recused herself from the case, even though she would have been within ethical guidelines to stay involved.

With friends like these, environmentalists might be forgiven for asking themselves, who needs Scalia?  Well, actually, even the reliably conservative Justice Scalia surprised observers this morning with just how conservative he could be. Throughout the oral argument, Scalia brazenly asked the electric utilities’ lawyer for suggestions on how to use this case to prevent climate change tort suits in both federal and state courts. (Answer: There is no appropriate way because the question of state common law climate change claims has not been raised in the present suit).  

So is there any good news for environmentalists and other progressives from yesterday's argument?  Surprisingly, yes. Let’s be honest with ourselves: Ever since the Supreme Court granted review in this case, speculation has focused not on whether the plaintiffs will lose, but on how they will lose. The narrowest ground for reversal would be on displacement, i.e., a ruling that the Clean Air Act and the EPA’s halting efforts to implement that statute with respect to greenhouse gas emissions work to effectively block federal courts from using common law principles to address climate change. The two other arguments in play – that the plaintiffs lack standing to press their claims or that their claims constitute political questions beyond the power of the court – would be much more disastrous for progressive causes if they received the blessing of the Supreme Court. They would make available new all-purpose broadsides against any tort litigation in federal court, requiring every injured party to first prevail against these arguments before they could even begin to press their claims against wrongdoers.  Arming defendants with these new SCOTUS-branded clubs would further tilt an already uneven litigation battlefield against tort claimants.

The good news, then, is that the justices were most keenly focused on displacement in their questioning, rather than on standing or political question. Apart from Justice Scalia, the justices seemed uninterested in dismissing the case on Article III standing grounds or on taking the Acting Solicitor General’s prudential standing bait. The latter resolution would be particularly pernicious as it would essentially invite judges to dismiss a case whenever they felt like it. Chief Justice Roberts appropriately swatted the argument away by noting that it “cuts off our jurisdiction at our own whim, as opposed to dealing with this on the merits.” Likewise, the political question doctrine barely made an appearance during the oral argument. Even the conservative justices seemed to recognize that the political question doctrine doesn’t really belong in the context of tort law claims. Like standing, it was developed for a much different context than common law adjudication, as I have argued with Benjamin Ewing in a forthcoming article.

Both standing and political question doctrine are crude substitutes for the merits of common law claims, a fundamental point that the all-important Justice Kennedy acknowledged throughout the argument.  (My favorite example, in response to a claim that the plaintiffs’ injunction would not solve climate change: “Well, again, that just goes to the merits. You make that argument to the district court that your injunction is meaningless, equity does not require an idle act. End of case.”).

So the good news is: We may lose this one, but at least we will lose in the least bad way.

Posted in:
Wednesday, April 13, 2011

Q & A with Steve Katona, managing director of the Ocean Health Index

By Susanne Stahl

Steve Katona, managing director of the Ocean Health Index for Conservation International, recently spoke at the Yale School of Forestry and Environmental Studies about his work on the OHI, a project founded by CI, the National Geographic Society, and the New England Aquarium. The OHI, scheduled for release in early 2012, will establish a standard for measuring ocean health and help policymakers gauge the success of efforts to improve ocean governance and health.

YCELP: When you talk about ocean health, what are you looking at?

Steve Katona: The way we define it, a healthy ocean has the ecological function and structure necessary to provide things that people value, and to provide them sustainably – now and in the future.

So it is a human-oriented view of the ocean, but the things that people want are not just extractive things like fish or other marine products. People also value the ability of the ocean to sustain cultures and traditions and to maintain subsistence … for people to fish and eat that fish for their protein. They also value, of course, tourism and recreation and the livelihoods that the ocean can produce, and many other things. And, above all, they do value diversity in a broad sense – not just the species, but the interactions that produce the ecological structure and function.

Underlying everything is clean water. Buried in this are loads of natural, aesthetic, and existence values in addition to the things that we take from the ocean.

YCELP: As you’ve been working on the index, is there any particular element of ocean health that you’ve found most troubling?

Steve Katona: Let me start with the things that are most encouraging. Most encouraging are some very good trends and results, for example, for the protection of marine mammals. Because of protective legislation in many countries and internationally, the big whales have really done pretty well, and they’re coming along nicely as are other marine mammals of various kinds – seals, manatees in different places. Small cetaceans are having more difficulties, so there’s plenty of work to be done there – but it’s a success story for the big whales.

I think there are also some success stories in fisheries where management is applied and effective and enforced and reasonable. You can see changes fairly quickly over a decade or perhaps more. So there is hope that by really showing the political will and public will, things can get done.

There are some other things that are very slow moving, particularly climate-related things and ocean acidification. Those are going to move very slowly for some time to come, because there’s so much carbon dioxide in the system already, so they will impact various aspects of ocean health for a long time to come. How bad those conditions become is a matter of how much we’re able to do now and how quickly we’re able to act, but even if we did everything that we could do immediately, there still will be warming and acidification coming. So those are a little discouraging, but, on the other hand, they’ll get a heck of a lot worse if we don’t take action now.

YCELP: What are a few of the things you wish people understood about how their day-to-day activities affect ocean health?

Steve Katona: We all demand quite a bit of the ocean in terms of what we eat – seafood choices we make, for sure – but there are other things, things that we might vote for, which look like they might be good – funding for a new pier, subsidy for some new fishing boats, subsidy for fishermen’s fuel, things like that. All of these kinds of things may have beneficial local consequences, but may have very detrimental broader consequences because they increase fishing pressure at a time when fishing pressure is already too heavy. Most people don’t always see that relationship.

So trying to eliminate or at least minimize what I think are harmful subsidies – that is, harmful to fish stocks – that’s one thing to think about. Energy use is another. We usually don’t think of energy use in terms of the ocean, but we should. Every watt that we use in our house, every drop of gasoline that we use in our car implies carbon dioxide production somewhere, unless we’re getting our energy from a sustainable or renewable source, or nuclear, which has its own set of problems. All that comes home to roost in rising global temperatures, rising sea temperatures, rising sea level, and ocean acidification. So energy use is absolutely important, and people need to be aware of that and reduce their energy use both for environmental reasons and, more immediately, for personal financial reasons. Everything you save, you save in money, too.

YCELP: You’ve mentioned that you were affected by Rachel Carson’s books growing up. Have there been any ocean-related books you’ve read recently that you’d recommend?

Steve Katona: Charles Clover’s The End of the Line is a wonderful book about fisheries and the pressure on them; I highly recommend it – it’s discouraging, but it’s certainly enlightening. That’s one that’s stuck with me. Another is Joe Romm’s Hell and High Water. It’s a terrific book, and his blog, Climate Progress, is well worth viewing for anybody interested in the energy side of this equation.

On a more hopeful note, Defying Ocean's End: An Agenda For Action by Linda Glover, Sylvia Earle, and Graeme Kelleher contains some ambitious plans for restoring ocean health. I highly recommend it.

Posted in:
Friday, April 08, 2011

Balancing Biofuels and Food Prices

By Susanne Stahl

New York Times reporter Elizabeth Rosenthal wrote a piece this week outlining the complexities in the relationship between biofuel production and food prices. "The starchy cassava root has long been an important ingredient in everything from tapioca pudding and ice cream to paper and animal feed," she writes. "But last year, 98 percent of cassava chips exported from Thailand, the world’s largest cassava exporter, went to just one place and almost all for one purpose: to China to make biofuel. Driven by new demand, Thai exports of cassava chips have increased nearly fourfold since 2008, and the price of cassava has roughly doubled."

With food prices rising sharply in recent months, she continues, "many experts are calling on countries to scale back their headlong rush into green fuel development, arguing that the combination of ambitious biofuel targets and mediocre harvests of some crucial crops is contributing to high prices, hunger and political instability."

No one is suggesting abandoning biofuels, but some food experts suggest countries revise their policies so that fuel mandates can be suspended when food stocks get low or prices become too high. "It can be tricky predicting how new demand from the biofuel sector will affect the supply and price of food," Rosenthal writes. "Sometimes, as with corn or cassava, direct competition between purchasers drives up the prices of biofuel ingredients. In other instances, shortages and price inflation occur because farmers who formerly grew crops like vegetables for consumption plant different crops that can be used for fuel."

Read the full article here.

Posted in:
Thursday, April 07, 2011

The Basics of Shale Gas

By Josh Galperin, Associate Director

 

Want to understand the basics of shale gas? Then the U.S. Energy Information Administration has the primer for you. Key facts include:

  • U.S. shale gas plays could provide approximately 110 years of use in the United States at 2009 rates of consumption (a rough arithmetic check follows this list).
  • Shale gas (or natural gas extracted from shale resources) made up 14% of total U.S. natural gas supply in 2009.  The EIA estimates that this share could increase to 45% by 2035.
  • Natural gas is a predominantly domestic energy resource -- 87% of the natural gas consumed in the United States in 2009 was also produced in the United States.
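As a rough check on that first bullet – a sketch using round figures close to EIA's estimates (roughly 2,550 trillion cubic feet of technically recoverable U.S. natural gas against 2009 consumption of about 23 trillion cubic feet per year):

$$\frac{2{,}550\ \text{Tcf}}{23\ \text{Tcf/year}} \approx 110\ \text{years}$$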

There's much more, including links to other relevant EIA reports and some helpful visuals. This is an excellent starting point for further research.   

Posted in: Energy & Climate
Wednesday, April 06, 2011

Natural Gas and RGGI’s CO2 Emissions Drop: The Overlooked Story

By Josh Galperin, Associate Director

 

According to a recent estimate, CO2 emissions in the RGGI region experienced a significant decrease from 2005 to 2009 -- approximately 33%. What's been overlooked is the key role natural gas played in that drop. Here's the relevant chart:

No doubt there are several important factors driving the emissions drop -- the recession, the weather, increased energy efficiency, and increased renewables capacity, among them. But what's worth underscoring is the 31.2% coming from fuel switching. That's all from natural gas -- specifically, generally decreasing natural gas prices that were lower than petroleum prices after 2006 and much closer to coal prices by 2009. Per another excellent chart:

This trend could be very important for our climate future. It suggests natural gas might actually be the viable transition fuel it has been promoted as, though of course much more research and analysis needs to be done on that front before a firm conclusion can be drawn. Nevertheless, natural gas prices are now very much worth watching in the coming years.

Posted in: Energy & Climate

Cleantech Group Announces Second Highest Quarter Ever in Clean Tech Investment

By testpersona

The Cleantech Group released preliminary results yesterday from its 2011 first-quarter report.  The major finding: a total of $2.57 billion in clean technology venture investment across 159 companies. While the total number of deals was down, actual dollar investments increased by 52% compared to the previous quarter.  The top investment areas were:

  • SOLAR - $641 million in 26 deals
  • TRANSPORTATION - $311 million in 8 deals
  • MATERIALS - $296 million in 9 deals
  • BIOFUELS - $148 million in 13 deals

The report also found a huge increase in investment in North America, while the UK saw a sharp drop from the previous quarter.  After the US, Canada raised the most clean tech investment dollars, followed by India.

Posted in: Innovation & Environment, Energy & Climate
Tuesday, April 05, 2011

Climate Change Triage: The Northeast and Sea Level Rise

By Josh Galperin, Associate Director

A sobering look at sea level rise in the Northeast shows the hard choices it puts before us.  Do taxpayers pay to defend the coastline with expensive sea walls in what looks to be a losing battle?  Do emotionally invested homeowners on the coast retreat now, while their property may be at its optimal value?  What can we save from the sea's rise, if anything?  How will we as a society triage the many victims of this climate change harm?

I detect no real sense that policy makers have a good handle on how to resolve, or even approach, these kinds of terrible choices.  Unfortunately, going forward blindly is also a choice, and one that usually doesn't end too well.

Posted in: Environmental Attitudes & Behavior, Environmental Law & Governance, Energy & Climate
Friday, April 01, 2011

USDA Invests in Projects Looking at the Effects of Climate Change on Agriculture, Forests

By Susanne Stahl

The U.S. Department of Agriculture's National Institute of Food and Agriculture (NIFA) is investing $20 million in each of three major studies looking at the effects of climate change on agriculture and forest production.

1. Dr. Lois Wright Morton of Iowa State University will lead a research team estimating the carbon, nitrogen and water footprints of corn production in the Midwest. The team will evaluate the effects of various crop management practices when various climate models are applied. The Iowa State project, which includes researchers from 11 institutions in nine states, will integrate education and outreach components across all aspects of the project, specifically focusing on a place-based education and outreach program called “I-FARM.”  This interactive tool will help the team analyze the economic, agronomic and social acceptability of using various crop management practices to adapt to and mitigate the effects of climate change.

2. Dr. Tim Martin, of the University of Florida, will lead a team looking at climate change mitigation and adaptation as it relates to southern pines, particularly loblolly pine, which comprises 80 percent of the planted forestland in the Southeast. The team of 12 institutions will establish a regional network to monitor the effects of climate and management on forest carbon sequestration.  Research in the project will provide information that can be used to guide planting of pine in future climates, and to develop management systems that enable forests to sequester more carbon and to remain robust in the face of changing climate.

3. Dr. Sanford Eigenbrode, of the University of Idaho, will lead a team monitoring changes in soil carbon and nitrogen levels and greenhouse gas emissions related to the mitigation of and adaptation to climate change in the region’s agriculture, which produces 13 percent of the nation’s wheat supply and 80 percent of its specialty soft white wheat for export. The research team will look at the effects of current and potential alternative cropping systems on greenhouse gas emissions, carbon, nitrogen and water levels and how that, in turn, affects the local and regional farm economy.

“Climate change has already had an impact on agriculture production," said NIFA Director Roger Beachy. “These projects ensure we have the best available tools to accurately measure the effects of climate change on agriculture, develop effective methods to sustain productivity in a changing environment and pass these resources on to the farmers and industry professionals who can put the research into practice.”

For further details, see the full press release here.

Posted in: Energy & Climate

The RGGI Emissions Cap:  Is It Too Forgiving?

By Josh Galperin, Associate Director

There are many valuable lessons to be drawn from the Regional Greenhouse Gas Initiative (RGGI), the nation's only operational, and mandatory, cap-and-trade program.  One worth dwelling on is the effectiveness of RGGI's CO2 emissions cap.  Recent analysis suggests this cap is much too forgiving -- not just now, but, more importantly, also over the next two decades.

The whole point of the RGGI emissions cap is to create a market for CO2 emissions from power plants that will ultimately drive down those emissions over time in the most economically efficient way possible.  A relatively harder cap - one set below actual CO2 emissions, for example - should make RGGI's tradeable CO2 pollution allowances more scarce and thus more valuable to polluters, resulting in higher prices per allowance than a cap set above actual emissions would.  The key idea here is that RGGI's cap on CO2 emissions from its regulated entities - electric utilities basically - creates a new market that has the potential to push those utilities towards low- or no-carbon generation.  Where policy makers set the cap can therefore matter a great deal; a relatively tough one pushes harder than a relatively lenient one.  This chart, produced on behalf of RGGI, strongly suggests the RGGI cap is not hard enough now, nor will it be hard enough in the future:

The important lines to look at for our purposes are the dashed one - that's the RGGI cap as set by agreement of the RGGI members - and the solid black line - that's both historical and projected total CO2 emissions from RGGI's regulated entities.  You can see that presently, the cap is simply way too high (and to be fair, some of that is on purpose).  The factors behind the recent massive drop in actual CO2 emissions are several (more on that later).  The recession undoubtedly plays a huge part.  Nevertheless, the cap just does not appear to be exerting real pressure on utilities right now.  Maybe that's not a problem.  There's an argument that a soft cap is just fine early on, as we refine and tweak RGGI.  That argument might be even stronger in the current economic climate.  No need to clamp down on utilities in the midst of the recession.
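As a toy illustration of that dynamic – hypothetical round numbers, not actual RGGI figures:

```python
# Toy illustration: a cap set above actual emissions leaves surplus
# allowances, so the allowance price sinks toward the auction floor;
# a cap below emissions forces abatement or purchases, and the price rises.
def allowance_pressure(cap_tons, emissions_tons):
    surplus = cap_tons - emissions_tons
    if surplus > 0:
        return f"slack cap: {surplus:,.0f} surplus tons; price tends toward the floor"
    return f"binding cap: {-surplus:,.0f} tons must be abated or purchased; price rises"

# Hypothetical numbers in the spirit of the chart: cap well above emissions.
print(allowance_pressure(cap_tons=188_000_000, emissions_tons=124_000_000))
```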

So perhaps the short-term performance issues of the cap are okay to put aside for the moment.  That's not at all true for the long-term performance issues.  Here's the major problem, and one policy makers should make an urgent focus of their thinking:  According to these projections, the cap doesn't appear to really bite until maybe 2030 or later, and that's just too late in the scheme of things.  Climate science tells us we need meaningful CO2 reductions much, much sooner than that to avoid catastrophic harms.  So what's the point of an emissions cap if it doesn't drive change when we need it?  It's time to give serious thought to how best to tighten the RGGI cap to make it better correspond with the scientific reality we find ourselves in.

Posted in: Innovation & Environment, Environmental Law & Governance, Energy & Climate
Wednesday, March 30, 2011

EPA Revisits Hydraulic Fracturing and Drinking Water

By testpersona

As oil prices increase and energy security becomes a concern in the US, more is being done to explore cleaner-burning fuels such as natural gas. Natural gas has seen big increases in the number of wells and total production as shale gas extraction, in particular, intensifies. The EIA projects that 20% of the US gas supply will come from shale gas by 2020.

An EPA report in 2004 found that "there was little to no risk of fracturing fluid contaminating underground sources of drinking water during hydraulic fracturing of coalbed methane production wells." But public concern over the process by which shale gas is extracted, known as hydraulic fracturing, or "fracking," has escalated with the growing number of wells.  Each well requires the pumping of tremendous amounts of fracking fluid into the earth and, according to the EPA's 2004 report, "[t]here is very little documented research on the environmental impacts that result from the injection and migration of these fluids into subsurface formations, soils, and USDWs."  Until last year (when the EPA called for the voluntary reporting of chemicals used in fracking fluids), many of the chemicals used in fracking were unknown.  Chemicals now known to sometimes be involved in the process include diesel fuel (which contains benzene and other toxic chemicals), polycyclic aromatic hydrocarbons, methanol, formaldehyde, ethylene glycol, hydrochloric acid, and sodium hydroxide. Given this situation, the EPA has announced another study to examine the effects of hydraulic fracturing on drinking water and groundwater.  The EPA aims to issue preliminary findings in 2012 and a full report in 2014.  The draft study plan is available here.

Posted in: Environmental Performance Measurement, Environmental Law & Governance, Energy & Climate
Monday, March 28, 2011

China amends air quality measures but misses key pollutant – PM 2.5

By Guest Author, Angel Hsu

This guest post by Angel Hsu, a doctoral student at the Yale School of Forestry and Environmental Studies, was originally published here.

The Chinese Ministry of Environmental Protection (MEP) has been drafting new proposals (see a rough Google translation in English here) to amend the daily reporting of the Air Pollution Index (API), which has been used for the last 20 years to communicate air quality and the health hazards posed by air pollution on any given day.  In 2000 the MEP (then the State Environmental Protection Agency, or SEPA) began reporting a daily API for 42 cities; now, data for 113 cities are available from the China National Environmental Monitoring Center.

Although these new specifications are still in draft form, it’s interesting to look at what the MEP is considering, particularly in light of the fact that China’s annual National People’s Congress parliamentary meetings just concluded and approved the 12th Five-Year Plan (full text in Chinese here).  As I’ve written with my colleague Deborah Seligsohn of the World Resources Institute, the new Plan includes an ambitious range of energy and environmental targets, including those for air pollutants like SO2 and, for the first time, nitrogen oxides (NOx).  While the Plan includes a blueprint for major reduction goals for these criteria pollutants, the specifics as to how targets will be allocated and policies implemented remain to be developed over the coming months by the individual ministries, provinces, and cities.

These draft AQI guidelines provide some insight into how monitoring of criteria air pollutants might change as a result of the Plan.  I’ve taken a look at the second draft, which the MEP posted on its website during the last week of February, prior to the release of the 12th FYP.  Most notably, the new specifications appear to reflect an attempt by the MEP to bring the former API into closer alignment with the United States’ Air Quality Index (AQI). This effort includes:

  • Renaming the API the “Air Quality Index,” consistent with the U.S. nomenclature.
  • Adopting a color classification system identical to the U.S. AQI color scheme.
  • Describing the health effects of AQI scores in language similar to that used for the U.S. AQI.
  • Including new pollutants – carbon monoxide (CO) and ozone (O3) – that were previously absent from the API.
  • Changing the calculation methodology to reflect the U.S. AQI.

I’ll spend the rest of this post highlighting some of these proposed changes.

Consistent Communication

Figure 1. Color classifications and descriptions of the new AQI, compared to previous versions and the U.S. Sources: MEP, 2011 and Andrews, 2009.

 

Even though the API was originally based on the United States’ AQI, there are differences in the scale and corresponding health hazard categorizations, as well as in the color classification schemes.  Inconsistencies in color classifications within China stem from the fact that, unlike in the United States, where the AQI colors are standardized, local environmental protection bureaus (EPBs) in China have been allowed to set their own color schemes.  As Figure 1 depicts, the MEP is proposing a color coding scheme that is entirely consistent with the U.S. AQI, unlike the example Andrews (2009) gives of conflicting colors between Beijing and Guangzhou that could confuse travelers moving between the two cities.  Making the color classifications the same as the United States’ will also provide more transparency and clarity for those familiar with the U.S. AQI system.

Further, while the descriptions of the AQI classes (i.e. ‘Excellent,’ ‘Good,’ etc.) haven’t changed from the previous API, the descriptions of the Health Effects are similar to those provided by the U.S. EPA.

How is the new AQI calculated?

Remember that the API was determined using only three pollutants:  SO2, NO2, and PM10. The concentration of each pollutant is measured at various monitoring stations throughout a city over a 24-hour period (noon to noon). The average daily concentration of each pollutant is then converted to a normalized index, so that each pollutant receives its own API score.  The daily API then reflects only the pollutant with the highest score (see Vance Wagner’s concise explanation of how the API is calculated).

Figure 2. Concentration normalization table for pollutants in the AQI. Source: MEP, 2011. 1) If 1 and 2 are the same, then use 1/2 of 2; 2) Use the concentration limit of Class 2 (TBD); 3) if the concentration of O3 exceeds 0.800 mg/m3 then it exceeds the scale.

As shown in Figure 2, the AQI now includes three additional measures: carbon monoxide (CO), ozone (O3) – 1-hour average, and ozone (O3) – 8-hour average. The concentrations of each of these six pollutants are normalized according to the table in Figure 2.  The (1), (2), and (3) annotations mean that the MEP is still debating what the concentration levels should be. Several options under consideration are found in the “Instruction Manual” document of the draft proposals (in Chinese only).

What is notably missing, however, is a measure of PM 2.5 (air particulates with a diameter of 2.5 microns or less, known to have serious health implications such as asthma, lung cancer, and cardiovascular disease due to their ability to penetrate human lungs).  Despite the fact that many major countries report PM 2.5 concentration data, some view the omission of PM 2.5 from the new AQI as a major disappointment.

Ma Jun, Director of the Institute of Public and Environmental Affairs (IPE), the Beijing-based NGO that released the Air Quality Transparency Index I wrote about earlier this year, said in a Global Times article that leaving it out was a mistake.

He goes on to say:

“Technology for measuring PM2.5 is not a problem for China any more, as cities in developing countries like New Delhi and Mexico City have already made the index public,” said Ma. He said a reluctance to include the crucial index has to do with concerns about local economies.

“Government agencies feel the index may hurt the image of many cities that want to attract investment or that they may not be able to improve PM2.5 pollution in a short time,” Ma said.

However, contrary to Ma Jun’s assessment that it’s not a technological capacity issue, I wonder if the decision to leave PM 2.5 out of the new AQI isn’t really due to local availability of PM2.5 monitoring technology.  There is a sentence (6.1 地方各级环境保护行政主管部门可根据当地的实际情况和环境保护工作的需要,参照本 标准的要求,增加空气污染物评价项目,如细颗粒物(PM2.5 等)) on Page 4 of the draft proposal saying that local EPBs may, based on local conditions and environmental protection needs, add air pollutants – PM2.5 is mentioned specifically – to the items they evaluate.

On the other hand, one notable improvement in the AQI’s calculation is the change in methodology proposed.  While the U.S.’s AQI is based on the highest reading in a city and thus represents the “worst” air quality case a person could encounter, the Chinese API represents an average.  The draft proposals improve upon the API’s methodology, adopting a similar calculation method to that of the U.S.

First, “individual AQIs” are calculated as follows:

Figure 3. Formula for determining the Individual Air Quality Index (IAQI). Source: MEP, 2011.
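The formula image is not reproduced here; assuming the draft mirrors the U.S. EPA’s piecewise-linear interpolation between breakpoints, it reads:

$$\mathrm{IAQI}_p = \frac{\mathrm{IAQI}_{hi} - \mathrm{IAQI}_{lo}}{BP_{hi} - BP_{lo}}\left(C_p - BP_{lo}\right) + \mathrm{IAQI}_{lo}$$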

where:

IAQIp is the individual AQI for pollutant p;
Cp is the concentration of pollutant p – one of SO2, NO2, PM10, CO, or the O3 1-hour and 8-hour averages (if a city has more than one monitoring station, the average of the pollutant concentrations across stations is used [对于城市区域为多测点的日均浓度值, i.e., the daily mean across a city’s monitoring points]).

BP(hi) is the breakpoint concentration greater than or equal to Cp
BP(lo) is the breakpoint concentration less than or equal to Cp
IAQI(hi) is the index value corresponding to BP(hi) (on the 0-500 IAQI scale)
IAQI(lo) is the index value corresponding to BP(lo) (on the 0-500 IAQI scale)

The max of these IAQIs (Figure 4) is then used to determine the AQI.

Figure 4. Formula to determine the AQI. Source: MEP, 2011.
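That image is likewise not reproduced; the final step is simply

$$\mathrm{AQI} = \max\left\{\mathrm{IAQI}_1, \mathrm{IAQI}_2, \ldots, \mathrm{IAQI}_6\right\}$$

Putting the two steps together, here is a minimal Python sketch (assuming the U.S.-style interpolation above; the PM10 breakpoints are illustrative placeholders, not the draft’s final thresholds from Figure 2):

```python
# Illustrative breakpoint segments: (BP_lo, BP_hi, IAQI_lo, IAQI_hi).
# Placeholder values for 24-hour PM10 in mg/m3; the real thresholds
# are still under discussion (see Figure 2).
BREAKPOINTS = {
    "PM10": [(0.000, 0.050, 0, 50), (0.050, 0.150, 50, 100),
             (0.150, 0.350, 100, 200), (0.350, 0.420, 200, 300)],
}

def iaqi(pollutant, concentration):
    """Individual AQI for one pollutant, by linear interpolation."""
    for bp_lo, bp_hi, i_lo, i_hi in BREAKPOINTS[pollutant]:
        if bp_lo <= concentration <= bp_hi:
            return (i_hi - i_lo) / (bp_hi - bp_lo) * (concentration - bp_lo) + i_lo
    raise ValueError(f"{pollutant} concentration {concentration} is beyond the scale")

def aqi(daily_averages):
    """City AQI = max of the individual AQIs. Note: the draft averages
    concentrations across a city's stations first, whereas the U.S. AQI
    uses the single worst reading in the city."""
    return max(iaqi(p, c) for p, c in daily_averages.items())

print(aqi({"PM10": 0.100}))  # 75.0 under the placeholder breakpoints
```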

I will spend some more time going through the longer instruction guidelines for the proposals and will update this post if I find more details. In the meantime, I welcome any comments or alternative interpretations.

References:

Andrews, S.Q. 2009. “Seeing through the Smog: Understanding the Limits of Chinese Air Pollution Reporting.” China Environment Forum, Vol. 10.  http://www.wilsoncenter.org/topics/pubs/andrews_feature_ces10.pdf

Ministry of Environmental Protection. 2011. Technical Regulation on Ambient Air Quality Index Daily Report. Second Draft. Available here:  http://www.mep.gov.cn/pv_obj_cache/pv_obj_id_47B37A70B7A7F94EBAE2DC9709456678C1210400/filename/W020110301385498176520.pdf

Acknowledgements:
Thanks to Chris Haagen for providing some translation assistance.

Posted in:
Friday, March 25, 2011

As the VSL Turns…: In Value of a Statistical Life Debate at EPA, Moral Decisions Hide Behind Technicalities

By Guest Author, Douglas Kysar

This post by Yale Law School Professor Doug Kysar was originally published here on the Center for Progressive Reform's website.

A report yesterday from Inside EPA offered a fascinating overview of the agency’s struggle to update the way it assigns dollar values to the suffering and premature death that its regulations prevent. Seriously, as far as economic esoterica goes, this stuff is riveting. What’s more, your life may depend on it.

Currently, EPA values each statistical human life saved by its rules at $7.9 million. This number is derived from so-called “wage-risk premium” studies that examine large data sets on employment and occupational risk. The idea is that, if you control for education, job sector, geographic region, and other relevant factors, then you should be able to come up with a number representing the portion of an employee’s wage that compensates for higher on-the-job health or safety risks. Depending on how a worker values health and safety compared to other goods, he – and the “he” is an important distinction here, since value-of-life studies tend to look only at male-dominated, blue-collar jobs – might be willing to take a higher wage in exchange for accepting higher levels of occupational risk. In theory, then, the studies can pull out the amount at which workers themselves value risk exposure, which can then be converted into a uniform “value of a statistical life” (VSL) for policy analysis. By using the VSL number to value the health and safety benefits of regulations, EPA can avoid the messy task of government deciding on its own how much protection is worth investing in.
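To make the logic concrete, a stylized example (assumed numbers, not EPA’s actual estimation): if workers accept an extra 1-in-10,000 annual fatality risk in exchange for roughly $790 more in annual wages, the implied value is

$$\mathrm{VSL} = \frac{\Delta \text{wage}}{\Delta \text{risk}} = \frac{\$790}{1/10{,}000} = \$7.9\ \text{million}$$

which matches the figure EPA currently uses.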

According to the Inside EPA report, staff experts are recommending a new, updated methodology, but the agency’s Environmental Economics Advisory Committee (EEAC) cautioned that the new method might be “too complicated for non-specialists to understand.” This claim is a real howler as it seems to imply that the current methodology is accessible to non-specialists. It is not. Deep and controversial value judgments are embedded within the current methodology, ones that lay persons can scarcely glean. For instance, studies show that union workers receive much higher wage-risk premiums than non-union workers – a finding that suggests bargaining power has a lot to do with the market outcomes that are supposedly capturing individuals’ true “preference” for life preservation. Should EPA use the higher union VSL, rather than the lower non-union VSL that economists tend to favor? This is not a matter of expertise. It is a value judgment that should include a full range of democratic inputs, but its import instead is buried deep within the technicalities of economic regression models.

Apparently the EEAC wants to push EPA even deeper into the weeds by asking the agency to compile a unique VSL figure for each regulatory context that the agency addresses. For instance, if a mercury emissions regulation would disproportionately benefit Native Americans (who eat far more contaminated fish than the general population), then the monetary value of reducing mercury exposure would be calculated using studies that find out how much Native Americans in particular are willing to invest in health and safety. In theory, this would bring the agency closer to the economists’ ideal world in which all values are assessed by the affected individuals themselves, rather than by collective democratic processes. In practice, however, it would involve the government intimately in the perpetuation of discrimination.

The VSL is affected not only by an individual or group’s willingness to invest in health or safety, but also by the ability to do so. This is made clear by the difference between union and non-union VSL data. It is also made clear by studies showing that certain minority groups, especially African-Americans, actually receive significantly lower wage-risk premiums than would be expected based on their occupational hazard exposure. We might say this represents a weaker “preference” for staying alive among those groups, so that if EPA’s cost-benefit calculations weigh benefits to them at a lower rate than to non-minorities, then, well, that’s just giving the people what they “want.” Alternatively, we might say that the picture is messier than this, and that past injustices continue to deeply impact the social and economic opportunities available to individuals and groups today. Treating current market outcomes as somehow neutral and objective does not wash the government’s hands of this history.

The VSL debate is a gripping saga, one with more than a little fiction in it, but with all too real consequences. And it is anything but accessible to non-specialists. For an attempt to break it down in more detail, and for supporting citations, see Chapter 4 of my book, Regulating from Nowhere: Environmental Law and the Search for Objectivity.

Posted in:




