Friday, July 31, 2009

"Over a Barrel" - Part I

I finally caught up with last Friday's broadcast of an ABC News special entitled, "Over a Barrel: The Truth About Oil." The subtitle gives a strong hint at the tone of the piece, though Charles Gibson and his crew did a reasonable job of lining up some talking heads who could offer a balanced perspective, along with the more predictable exponents of suspicion and conspiracy. My former employer, Chevron, also provided access to several facilities and got some good exposure in the process. Rather than dissecting the entire program and its arguments, I thought it might be more useful to take essentially the same starting point and create my own quick summary of the basic facts about oil that informed Americans ought to know, referring to the show when appropriate. That's a tall task, since the subject is clearly too complex to cover in much detail in a single posting. As it is, I'll break it up into two segments, with today's focused on oil and a subsequent posting looking at gasoline, other products, and the impact of climate change.

1. Oil is finite, but production matters more than reserves, at least when it comes to influencing prices. Nor are reserves an especially good predictor of future production, since they reflect a static view at a given level of price and technology, both of which constantly evolve. That explains the apparent paradox that since 1859 the US has produced just shy of 200 billion barrels from reserves that never exceeded 40 billion barrels. So when you hear, as Mr. Gibson reminded us, that the US consumes 23% of the world's oil but possesses under 3% of proved reserves, you should also consider that we produced 10% of global petroleum output in 2008. And that 23% of demand doesn't look quite so disproportionate, when you recall that the US makes up roughly 24% of the world economy.

2. Contrary to widely-held perceptions, overall net US energy independence, considering all the different forms of energy we produce and consume, import and export, currently stands at 74%. That's less than it once was, but not so bad compared to some of our economic competitors around the world. While China is about 90% independent (but falling), the EU is at around 50%, and Japan is only 16% energy independent. When we talk about energy independence, though, we tend to focus on oil, because it is so important for the economy and accounts for 84% of US energy imports--a vulnerability that has been growing at an alarming rate over the last 15 years.

3. While we certainly cannot drill our way back to energy independence--a condition we have not enjoyed since the 1950s--the US still has substantial untapped oil and gas resources that are not counted in current reserves, along with many other forms of under-utilized energy that are beginning to reach a useful scale. Although I don't see us becoming truly energy independent again, or even needing to, the only potentially insurmountable obstacles to restoring a more comfortable and sustainable level of energy security are of our own making. That potential 2 million barrels per day (MBD) of additional production that T. Boone Pickens described in his interview, which awaits only unimpeded access and capital, would make a serious dent in our net petroleum imports of roughly 10 MBD, down from 12 MBD in 2007 as a result of the recession.

4. Oil still supplies vastly more energy than biofuels, wind, and solar power, and that comparison cannot change very quickly, no matter how fast these alternatives grow--and they are growing rapidly indeed. That's because of the enormous scale of our oil use and the sheer quantity of energy in each barrel. Last year the US consumed roughly 300 billion gallons of gasoline, diesel and other petroleum products. That figure includes 9.6 billion gallons of ethanol and 320 million gallons of biodiesel. After adjusting for ethanol's much lower energy content, biofuels thus met just 2% of our petroleum needs, equivalent to 400,000 barrels per day. That's not inconsequential, any more than the output of new US offshore oilfields would be. Biofuels won't close the oil import gap anytime soon, however, because the targeted 36 billion gallons of ethanol and biodiesel expected to be produced under the national Renewable Fuel Standard in 2022 works out to only about 1.5 MBD on an oil-equivalent basis. (For comparison purposes, the 29,440 MW of wind turbines currently in place in the US generate the equivalent of roughly 0.3 MBD of oil, assuming their output displaces natural gas in gas turbine power plants.)
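For readers who want to check the oil-equivalence arithmetic, here's a quick sketch in Python. The energy-content ratios (ethanol at roughly two-thirds the heating value of gasoline, biodiesel at about 93% of diesel) are my own round-number assumptions, not figures from the broadcast:

```python
# Rough oil-equivalence of US biofuel output, using the volumes cited above.
# Energy-content ratios are assumed round numbers, not official figures.

GAL_PER_BBL = 42
DAYS_PER_YEAR = 365

def mbd_equivalent(gallons_per_year, energy_ratio):
    """Convert annual gallons of biofuel to million barrels/day of oil equivalent."""
    return gallons_per_year * energy_ratio / GAL_PER_BBL / DAYS_PER_YEAR / 1e6

ethanol = mbd_equivalent(9.6e9, 0.65)    # 2008 US ethanol output
biodiesel = mbd_equivalent(320e6, 0.93)  # 2008 US biodiesel output
print(f"2008 biofuels: {ethanol + biodiesel:.2f} MBD")  # ~0.4 MBD

# Treating the full 2022 RFS target as ethanol-like for simplicity:
rfs_2022 = mbd_equivalent(36e9, 0.65)
print(f"2022 RFS target: {rfs_2022:.1f} MBD")  # ~1.5 MBD
```

Both results land close to the figures in the text, which is reassuring given how rough the inputs are.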

5. In the absence of any realistic means of becoming 100% energy independent, energy security should be the main focus of government oil policies. Happily, this outcome is not nearly as unattainable as self-sufficiency, though it can seem awfully elusive at times. The principal source of our energy security today, aside from our very large production of non-oil energy sources, derives from our diverse mix of suppliers. Crude oil imports are dominated by Canada and Mexico, which together contributed 32% last year, compared to 24% from the Persian Gulf. Meanwhile, over half of our substantial net imports of petroleum products came from Canada, the EU and the US Virgin Islands. The US Strategic Petroleum Reserve, which presently contains 724 million barrels of oil, constitutes an important emergency back-stop in case of a disruption in these supplies, though it is long overdue for a fundamental re-think.

6. The oil market is global, and prices are not set by oil companies or even mainly by traders on the New York Mercantile Exchange, though the latter play their part. The price level for oil is mostly determined by the interaction between global demand and the two key components of supply: OPEC and non-OPEC production. When non-OPEC output is growing faster than demand, prices tend to fall, while any increment of new demand or shortfalls in non-OPEC output that boosts OPEC's market share tends to raise prices. If you want to understand why oil has rebounded above $60 with the global economy still in recession, look no farther than the roughly 3 million barrels per day of oil that OPEC has managed to keep off the market, in an uncharacteristic display of cohesion and discipline.

Although I've omitted numerous other important aspects of the situation, we would have a more fruitful national dialogue on energy if our leaders and the electorate just understood these six points. Reasonable people differ as to how best to respond to these facts, as demonstrated by a long succession of US administrations that have pursued a variety of energy approaches, seeming consistent only in their lack of a coherent strategy with respect to oil, or at least in their inability to find one that could be sustained from one administration to the next. And before my readers inundate me with comments reminding me that any comprehensive discussion of oil must now incorporate climate change, I intend to cover that when I address the petroleum products side of this story, since most of oil's emissions result from consumption, not production.

Tuesday, July 28, 2009

Speculation and Physical Oil Prices

The story above the fold on the front page of this morning's Wall St. Journal suggested that the Commodity Futures Trading Commission (CFTC) is about to issue a report tying last year's oil price spike to speculation by non-industry participants in the oil futures, options and swaps markets. This would reverse the agency's previous finding that speculation had not played an important role in influencing the record-breaking prices we experienced in 2008. Although I plan to assess the report with an open mind, the dissemination of such contradictory conclusions--separated mainly by the handoff from one administration to another--hints that a jaundiced eye might be in order regarding both. More important than any politics that might be involved, however, is the deeper question of whether the futures-market speculation the CFTC has apparently uncovered actually harmed the real economy by spreading its contagion to the markets for physical oil with which consumers interact. The answer to that question has serious implications not just for the justification of stricter regulation of energy markets, but for overarching policies and trends affecting the production and consumption of real energy.

As I noted in a posting last summer, the growth of the futures exchanges over the last two decades has fundamentally changed oil trading. Most oil is now bought and sold on price formulas pegged to the futures prices, or to published market reports strongly influenced by them. What traders are agreeing to when they do a deal is not a fixed price, but a differential above or below a particular futures contract during a set period, usually aligned with the time when the shipment will be loaded or delivered. So while these differentials fluctuate due to a variety of factors, the price that refiners pay for crude oil remains directly tied to the futures price. That means that anything that drives up the futures market, whether a disruption in supply, higher demand, or speculation by a new class of commodity investors, has a direct impact on what we all pay for the products that refineries make.
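The pricing mechanism is simple enough to sketch in a couple of lines; the numbers here are purely illustrative, not any actual quote:

```python
# Formula pricing in miniature: a physical cargo prices off a futures average,
# plus or minus a negotiated differential for that grade and location.
# All figures are hypothetical.

def cargo_price(futures_avg, differential):
    """Price of a physical cargo pegged to a futures average, $/bbl."""
    return futures_avg + differential

# e.g. a grade trading at a $1.75/bbl discount to prompt NYMEX WTI:
print(cargo_price(65.00, -1.75))  # 63.25
```

The differential is what the traders actually negotiate; the futures leg moves on its own, which is exactly why anything that moves the futures flows straight through to refiners' costs.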

When I discussed this issue last summer, I was careful to note that if the prices for physical grades of oil moved in lock step with the futures price, that might not by itself absolve speculators from driving up those prices, along with the futures. Other factors could produce a similar result, even if the futures were mainly driven by speculation. However, when I now look at last year's price relationships for two of the most important crude oil streams in the country, I see evidence that goes beyond a neutral result and undermines the notion that anything other than the fundamentals of supply and demand was driving prices in the run-up to oil's peak of $145 per barrel last July.

The chart below tracks the price difference between two important grades of physical oil and the monthly-average NYMEX futures price for West Texas Intermediate, which is the focus of the CFTC's investigation and the principal grade of oil against which most US oil--and indeed much of the world's--is typically priced. I chose Alaskan North Slope crude and West Texas Sour because both are produced in substantial quantities, are representative of the medium-gravity, medium-sulfur crudes that many US refineries turn into gasoline, and cannot be delivered into the NYMEX WTI contract. While there might conceivably be some degree of speculation in these grades, anyone buying them would be required either to take physical delivery themselves or sell to a refiner or other physical buyer before the oil was delivered. If futures market speculation had been driving the prices of these grades of oil last spring and summer, we'd expect to see their discounts either remain steady or widen, indicating that they were being dragged along by frothy futures. Instead, between March and July 2008 we see these grades strengthening relative to WTI--their discounts shrinking--both sequentially and relative to their average discounts since 2004. In other words, in that period the prices of these grades of physical oil appear to have been stronger than the futures market that was thought to be driving them.
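The underlying test is easy to illustrate with a toy example. The monthly averages below are hypothetical, not the actual 2008 data behind the chart:

```python
# Sketch of the differential check described above, with illustrative
# (not actual) monthly-average prices in $/bbl: (NYMEX WTI futures, physical grade).

prices = {
    "2008-03": (105.4, 102.8),
    "2008-05": (125.4, 123.6),
    "2008-07": (133.4, 132.5),
}

discounts = {month: wti - grade for month, (wti, grade) in prices.items()}
for month, d in discounts.items():
    print(f"{month}: discount to WTI = ${d:.2f}/bbl")

# A shrinking discount while futures climb means the physical grade is
# strengthening relative to the futures, not being dragged along by them.
```

In this made-up series the discount narrows from $2.60 to under $1.00 even as futures rise, which is the pattern the actual 2008 data for ANS and WTS displayed.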

Why is that important? First, the argument for stricter regulation of the commodity markets, beyond the very sensible suggestion to increase the transparency of participants' trading positions, depends on a finding that speculators not only influenced the futures markets in which they participated directly, but also the price of the physical oil purchased by refiners and thus the prices of the petroleum products that consumers, trucking companies, school districts, airlines and others purchased, to the detriment of the economy and our trade deficit. If speculation was driving oil futures but not the price of physical oil, the necessity for clamping down on it aggressively begins to resemble a fever remedy that works by banning thermometers that read above 99 degrees.

Of greater significance, I believe, is the psychological effect on our expectation of oil prices in the future. If we convince ourselves that $145 oil and $4 gasoline were mainly the fault of big, bad speculators, and that regulating them will avert such an outcome in the future, we foster a dangerous illusion that supply and demand will somehow always result in prices more congenial to our preferences and lifestyles. That's arrant nonsense, and you don't have to be an ardent believer in Peak Oil to see how unrealistic expectations of low future oil prices can stimulate demand and stifle expensive oil projects, with their long inherent time-lags. That would eventually lead to precisely the outcome we wish to avoid: much higher oil prices.

Since the summer of 2007 I have been arguing that speculation might have been influencing oil prices around the edges, but that with or without it the narrowing gap between growing demand and straining supply was the main factor behind high prices. The sudden inversion of those forces--the sharp drop in oil demand caused by the recession and the growth of inventory and restoration of adequate spare production capacity--equally and fully explains the price collapse that followed, pummeling exposed speculators and index investors. Whatever the CFTC concludes about last year's price spike, it shouldn't distract us from the necessity of investing in expanding oil production and alternative energy sources, while working hard to improve the efficiency with which we use energy, and particularly oil. Blaming it all on Wall St. would be the quickest way to undermine the gathering momentum for improving our real energy security.

Monday, July 27, 2009

Cooler Cars in California

I was just perusing the month-old press release for the new regulation from California's Air Resources Board requiring car makers and auto repair shops to reduce the amount of infrared light that windshields and other car glazing allow into a car's interior, beginning in 2012. On the surface this seems like an eminently sensible idea and one sure to appeal to drivers in a warm, sunny state who are tired of climbing into hot cars in the summer and waiting for the A/C to catch up. Living in another warm state, I'd be tempted to order this as an option on my next car, too. The problem is that CARB isn't requiring carmakers to offer low-transmittance glass as an option; it's mandating its use, in two successively stricter stages, and consumers must absorb the higher cost, whether they want to or not. Moreover, when the touted emissions reductions are compared to those costs, this looks like a very pricey means of achieving them, compared to many other alternatives. Regulating car-window glazing as a way to reduce emissions epitomizes the pitfalls of applying the command-and-control approach from the regulation of local pollution to climate change.

When I couldn't quickly find enough information to unpack the assumptions behind the emissions numbers in the press release, I reverted to examining its implied cost/benefit by applying some very simple assumptions to the figures in the release itself. At a conservative average fuel price of $3 per gallon, CARB's estimate of annual fuel savings of $16 per car from reduced air-conditioner use implies a fuel saving of 5 1/3 gallons per year, resulting in cumulative avoided greenhouse gas emissions of under 0.6 tons of CO2 over 10 years. Applying the $70 per car estimated cost of the 2012 standard, it appears the cost of achieving these reductions comes in at around $120 per ton. That's more than double the expected cost of directly capturing and sequestering CO2 from power plants and ten times the federal government's expected price of emissions allowances under the Waxman-Markey climate bill in its early years--coincidentally beginning in 2012. Using the same logic, the equivalent emissions reduction cost of the tighter 2016 standard exceeds $300/ton of CO2.
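Here's the same back-of-the-envelope calculation spelled out. The gasoline emissions factor of roughly 8.9 kg of CO2 per gallon is my own input (a widely used approximation); with metric tons it yields a somewhat higher abatement cost than the ~$120/ton above, which presumably reflects short tons or a slightly different factor:

```python
# Implied cost per ton of CO2 avoided by CARB's 2012 glazing standard.
# Fuel savings and per-car cost come from CARB's release; the emissions
# factor (~8.9 kg CO2/gal gasoline) is an assumed standard approximation.

fuel_savings_per_year = 16.0  # $/car/year, per CARB
fuel_price = 3.0              # $/gal, conservative assumption
kg_co2_per_gal = 8.9
years = 10

gallons_saved = fuel_savings_per_year / fuel_price             # ~5.3 gal/yr
tons_avoided = gallons_saved * kg_co2_per_gal * years / 1000   # ~0.47 t over 10 yrs

cost_2012 = 70.0  # $/car, estimated cost of the 2012 standard
print(f"Avoided CO2: {tons_avoided:.2f} t over {years} years")
print(f"Abatement cost: ${cost_2012 / tons_avoided:.0f}/ton")  # ~$147/ton
```

Either way the order of magnitude is the point: well above the cost of most competing abatement options.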

I'm also skeptical about the need to apply such a standard to the entire state, ignoring the enormous geographic diversity it encompasses, unless that was simply intended to increase CARB's leverage with carmakers. I grew up on the Central California coast and didn't feel the need to purchase the A/C option until I moved to L.A. While I doubt many cars are sold in my native state without A/C today, the coastal concentration of California's population, particularly north of the Tehachapis, suggests much lower fuel savings and emissions benefits in important portions of the state. If anything, many car owners in cooler regions of the state will need to use their heaters more to compensate for less natural warming of the vehicle interior. That's not a big drain on conventional cars, but it represents a direct energy cost for EVs and plug-in hybrids.

Now, it seems clear that the new regulation would benefit some consumers directly through reduced fuel costs, though I wonder how many of them would find the agency's expected payback period of five to twelve years on the required investment especially compelling, if they weren't already sold on the comfort aspects of this feature. Since CARB isn't constrained by cost/benefit analysis of its rules, car window glazing represents just another piece of the puzzle to an agency that is already under the gun to implement the state's AB32 emissions law--even if the contribution of the new regulation toward cutting the state's 480 million tons per year of net emissions looks relatively trivial. There is no disincentive to pursuing such prescriptive solutions, however incremental or inefficient they might be. As a result, CARB doesn't need to ask the more important question of how that $250 per car (after 2015) might better be spent to reduce more emissions. Anyone preferring national regulation of emissions by the EPA to cap and trade--assuming the excesses and distortions of Waxman-Markey can be reined in by the Senate--should consider this a cautionary tale.

Thursday, July 23, 2009

Big Algae?

In spare moments during the last week I've been mulling over the implications of ExxonMobil's announcement of a very large investment in research and development on producing biofuels from algae, in collaboration with a leading biotech firm, Synthetic Genomics, Inc. While the reported figure of $600 million wouldn't buy much in the way of actual deployment, it could sure pay for a heck of a lot of R&D. The joint conference call about the announcement emphasized that the companies will be pursuing several possible technological pathways, though all appear to be focused on producing biofuel from algae continuously, rather than in a batch mode more analogous to farming. That would certainly increase the attractiveness for Exxon, which after all operates some of the world's biggest continuous production processes, in the form of its oil & gas fields, refineries, and chemical plants. The timing of this announcement is also interesting, coming just a few weeks after the US House of Representatives passed the first cap & trade bill to make it through either chamber of Congress.

The fundamental question I've been pondering is "Why?" Why algae, and why ExxonMobil? For all of algae's enormous potential to produce large quantities of useful fuel, skepticism that this could ever be done economically on a useful scale abounds. And until now, Exxon had made a virtue of avoiding investments in renewable energy, generally seeing them as delivering returns well below those of the large oil & gas projects that have earned Exxon a sterling reputation for capital discipline. The answer to both questions likely resides in a word that appears frequently in the press release, in news coverage of the announcement, and in the press conference: scale. Two aspects of scale are relevant here. First, in order to contribute meaningfully to our energy and climate problems, an alternative energy technology must be capable of being scaled up rapidly to a level comparable to today's oil, gas and coal industries. Current biofuels, solar power and wind still don't come close to matching the energy delivery of conventional sources. Exxon's website indicates potential liquid yields from algae of 2,000 gallons per acre, presumably in the form of the hydrocarbon-based "biocrude" emphasized repeatedly in the press conference. Even that relatively conservative estimate--my own back-of-the-envelope upper-bound estimate was 6,000 gal./acre--is at least ten times the current US yield of corn ethanol, after adjusting for energy content. Simplistically, if the acreage currently devoted to growing corn for ethanol were devoted to oil-excreting algae, it could replace nearly 60% of our gasoline supply from crude oil, rather than the 5% or so we get from ethanol.
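The per-acre comparison can be sketched quickly. The corn-ethanol yield and energy ratio below are my own assumed round numbers; only the 2,000 gal/acre algae figure comes from Exxon:

```python
# Per-acre energy comparison: algae "biocrude" vs. corn ethanol.
# Ethanol yield and energy ratio are assumptions; algae yield is Exxon's figure.

ethanol_gal_per_acre = 300   # assumed average corn-ethanol yield
ethanol_energy_ratio = 0.65  # ethanol's heating value relative to gasoline
algae_gal_per_acre = 2000    # ExxonMobil's published potential yield

gasoline_eq_per_acre = ethanol_gal_per_acre * ethanol_energy_ratio  # ~195 gal/acre
ratio = algae_gal_per_acre / gasoline_eq_per_acre
print(f"Algae vs. corn ethanol, per acre: {ratio:.0f}x")  # ~10x

# If ethanol currently displaces ~5% of gasoline, the same acreage in algae
# (treating biocrude roughly like gasoline) scales to about 10x that share:
print(f"Implied gasoline displacement: {0.05 * ratio:.0%}")
```

Depending on the assumed ethanol yield, this lands at roughly half to 60% of gasoline supply, consistent with the "nearly 60%" above.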

Scale is also crucial for a firm of Exxon's size. A report in today's Wall St. Journal caught my eye. Occidental Petroleum announced its discovery of a 200 million barrel onshore oilfield in the middle of one of the most mature oil provinces in the world, in the San Joaquin Valley of California. I know that territory very well from my oil trading days, and it's an exciting development. However, Exxon is so big that it must find the equivalent of 8 such fields every year, just to stay even with its production. When I listen to the way Exxon describes its algae investment, I get the distinct sense that it views this arrangement as analogous to a very large oil exploration project, one that would be material to the results of the largest oil SuperMajor--and perhaps with similar odds of success. Now, it would be meaningless and of no value to Exxon if algae could produce the equivalent of hundreds of thousands of barrels per day of oil, but at a cost of $1,200/bbl. Exxon appears to be convinced that algae can contribute at a price very close to today's hydrocarbons, and probably without subsidies, knowing the firm's distaste for them. That has implications beyond algae.
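The "8 such fields every year" arithmetic works out as follows, assuming a round-number production rate of about 4.4 million barrels of oil equivalent per day for Exxon (my assumption, implied by the math in the text):

```python
# Reserve-replacement arithmetic: how many 200-million-barrel discoveries
# a producer of Exxon's size needs each year just to stand still.
# The production rate is an assumed round number, not a reported figure.

field_size = 200e6        # barrels, the Occidental San Joaquin discovery
production_boepd = 4.4e6  # assumed ExxonMobil output, barrels of oil equiv./day

annual_production = production_boepd * 365      # ~1.6 billion bbl/yr
fields_needed = annual_production / field_size
print(f"Fields needed per year to replace production: {fields_needed:.0f}")  # ~8
```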

In the conference call, Exxon's VP of R&D indicated that the company had assessed all of the advanced biofuels technologies and concluded that algae offered the best hope for producing fuels that would compete economically, with acceptable environmental impacts. That says something very worrying about the near-term prospects for cellulosic ethanol and the other "second-generation" biofuels technologies on which companies such as BP, Shell, and many others have pinned their hopes. Indeed, the US Congress pinned the whole country's hopes on the prompt commercialization of these unproven technologies in the remarkably ambitious national Renewable Fuels Standard they enacted in late 2007. If Exxon has concluded correctly that algae--which faces many serious hurdles of its own--is the best bet, then the entire US alternative fuels strategy could be in trouble.

There is also another way to look at this announcement. Exxon has been under enormous pressure to take a big stake in renewable energy. I vividly recall a Congressional hearing last year when committee chairman Ed Markey (D-Mass.) berated and belittled the Exxon representative for doing so little in this area. More recently, an environmental group took out full page ads targeting Exxon's opposition to cap & trade. I can't find the ad on the internet, but it said something like, "Poor Exxon, all alone in opposing Waxman-Markey." That has to get old, even for Exxon.

Could the algae tie-up with Synthetic Genomics, with its impressive expenditures contingent on achieving a series of unspecified milestones, be intended mainly to get this particular monkey off their backs? I doubt it, even though all the other advanced biofuel technologies being touted by their promoters also involve a substantial element of PR, until they actually produce commercial outcomes. If Exxon merely wanted to create some "green cred", it could have taken the same money and bought a dozen bankrupt corn ethanol plants or a few medium-sized wind farms. If the Exxon/Synthetic Genomics collaboration is about making Exxon greener, then it is certainly doing it the Exxon way, investing in something that, if successful, would neatly and profitably slot into their existing business model--and by the way into the hundreds of existing refineries and hundreds of millions of internal combustion engine vehicles globally. It's probably too early to imagine Big Oil becoming Big Algae, but the possibilities have obvious appeal, apparently even for the world's most successful oil company.

Tuesday, July 21, 2009

How Much Per Gallon?

A book I recently received from a publisher makes an interesting contrast with last Friday's posting on how many cars our current oil production might eventually support. Its title of "$20 Per Gallon" demands attention, though the book proves to be less of an argument for how we might get there than for what things might be like if--the author would say when--we did. Rather than providing detailed arguments for the imminent arrival of Peak Oil, Mr. Steiner essentially accepts that premise and builds on it to offer a set of scenarios describing life in the US at gasoline prices escalating steadily in $2 increments between $4 and $20 per gallon. It makes for an entertaining and sobering set of "what ifs?" Unfortunately, despite a brief author's note dated from February of this year, the book is something of a victim of the collapse of oil prices late last year. While its premise might have been accepted eagerly and unquestioningly last summer, the world looks a bit different today. The challenges he describes appear somewhat less urgent, particularly after oil's recent surge past $70 per barrel was cut short when it turned out that all that talk of "green shoots" might have been a bit premature.

In a sense "$20 Per Gallon" seems like two books, one quite interesting and the other seriously flawed, at least as a document about our energy future. The interesting part lies in the author's exploration of what successively higher energy prices might mean for different aspects of the US economy and lifestyle. True to its subtitle, it's hardly a tale of uniform woe, unless you have the misfortune of working in one of the sectors he concludes is doomed, including anything connected to commercial air travel as we now know it. He points out the environmental, health and safety benefits that might ensue from our responses to progressively dearer petroleum-derived products. Many of these benefits sound quite appealing, though I would propose that they are neither as inevitable nor as neatly tied to oil use as Mr. Steiner suggests. The book is also filled with anecdotes accumulated from his travels researching its subject. I particularly liked his description of the airplane graveyard and his rides in various energy-efficient UPS trucks. If you come to this book already convinced that we are on the precipice of Peak Oil, I suspect you would find most of this not just entertaining, but riveting.

The book is less likely to appeal to anyone who is skeptical about the inevitability of Mr. Steiner's scenario assumptions. Start with his structural choice of using gasoline prices as a proxy for underlying oil prices, despite the fact that petroleum product markets experience supply and demand fluctuations that differ--sometimes markedly--from oil's. This choice also ignores the enormous influence of taxes and other government policies on gas prices. You don't need $300/bbl oil to reach $8 gasoline, as European drivers can attest. Last week the price of the average gallon of gas in the US fell to $2.46/gal., compared to the equivalent of $6.40/gal. in the UK and $6.77/gal. in Germany. The difference is almost entirely due to taxes. Despite this, daily life in those countries is not so far beyond the pale of recent American experience as to frighten small children. The implications of a world of high fuel prices resulting from the combination of moderate oil prices and high taxation look quite different from those arising from oil prices above last summer's peak of $145/bbl.

There's an even bigger issue lurking under the surface, and it relates to the author's conviction that in the long run oil prices can only go higher--much higher--due to Peak Oil. There's at least some truth to that, and I've posted periodically on the enormous difficulties involved in attempting to increase oil production in the face of constraints on access to resources--internationally and domestically--along with high interest rates, scarce capital, chronic project delays, and the inexorable depletion of mature oil fields. But oil prices are determined by more than supply, and while he eagerly describes all of the ways in which we would have to adjust our habits to a world of higher and higher gasoline prices, I don't get the sense that Mr. Steiner has considered the ways in which these responses would tend to retard the steady price advances he describes. We have only to look at the impact that a demand reduction of less than 4% since late 2007 has had on oil prices in the last 12 months. That responsiveness to lower demand is as inherent in a commodity with a steeply-sloped short-run supply curve as were the high prices that accompanied the steadily increasing demand we saw earlier. This behavior reflects two sides of the same coin.

The complexities of the various feedback mechanisms involved would also make some of the positive outcomes that Mr. Steiner sees more uncertain. Consider the drop in traffic fatalities that he posits as a consequence of higher gas prices. While you would generally expect people to drive less if gasoline were much more expensive, that response would probably be less pronounced in the long run than in the short run, because of the other ways in which consumers would react. $4 gasoline is painful if your current automobile gets 20 mpg. However, once you've traded it in on a 50 mpg hybrid, your cost per mile--and thus your monthly fuel bill--is lower even at $6/gal. than it was before at $3.
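The cost-per-mile arithmetic makes the point directly:

```python
# Fuel cost per mile at different gasoline prices and fuel economies,
# using the prices and mpg figures from the paragraph above.

def cost_per_mile(price_per_gal, mpg):
    """Fuel cost in dollars per mile driven."""
    return price_per_gal / mpg

print(f"$3/gal at 20 mpg: {cost_per_mile(3, 20) * 100:.0f} cents/mile")  # 15
print(f"$4/gal at 20 mpg: {cost_per_mile(4, 20) * 100:.0f} cents/mile")  # 20
print(f"$6/gal at 50 mpg: {cost_per_mile(6, 50) * 100:.0f} cents/mile")  # 12
```

The hybrid owner paying $6/gal is still 3 cents per mile better off than the 20-mpg driver was at $3, which is exactly why demand responses blunt sustained price escalation.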

In addition to these concerns, I noticed a few basic errors and misleading comparisons along the way. Compared to the above, they are nit-picks, but anyone who reads the book ought to bear them in mind. First, Mr. Steiner suggests a pretty dramatic impact from high gasoline prices on all the plastics we consume, without delving deeply enough to determine that most of the ethylene- and propylene-derivative plastics in North America--including Saran Wrap--aren't sourced from oil but from the liquids produced with natural gas. That's a crucial distinction, with vast new gas resources available and with the prices of oil and gas having diverged rather dramatically, at least for now. He also makes several numerical comparisons between the response to last year's oil price spike and the aftermath of the oil crisis of the 1970s without taking into account the 42% increase in US population since 1974.

I have to believe that Mr. Steiner would have written a somewhat different book, had he begun the project this year rather than last. I don't doubt that some of the outcomes he describes are waiting on the sidelines until the economy climbs out of its current trough, even if oil prices don't quite reach the stratospheric heights he expects. For example, it wouldn't take the oil-price equivalent of $8/gal. to trigger a radical restructuring of the airline business, after what it's been through. At the same time, though, I doubt we've seen the last oil price cycle, and the relationship between the prices of oil and alternative energy sources remains complex and dynamic. In some respects proposals such as cap & trade or a carbon tax are intended to evoke some of the same responses that Mr. Steiner imagines, but on a gradual basis and without having to pay an external supplier for the privilege of motivating us. I suggest reading "$20 Per Gallon" in that spirit, rather than as a firm prediction of the inevitable future of our oil-based world.

Friday, July 17, 2009

Going Farther on Oil

As I was perusing my UC Davis alumni magazine last night I ran across a short article mentioning a new book from a professor, Dan Sperling, who directs Davis's well-regarded Institute of Transportation Studies. I know him slightly from his participation as an invited expert in a scenario workshop many years ago, so this caught my eye. His book, which I haven't read yet, examines the impact and implications of the rapidly growing global vehicle population, which he sees reaching the two billion mark within the next 20 years. In the article he suggested that this would require an entirely new transportation energy mix, made up of hydrogen, electricity, and advanced biofuels. That certainly fit my own long-standing expectations, as well. However, it occurred to me to wonder just how far we might be able to stretch the transportation fuels we get from oil, and just how far short they would fall as the global car-park expands. To my surprise, it doesn't require very aggressive assumptions concerning improvements in fuel economy, reductions in vehicle miles traveled, and additional oil supplies to cover the needs of a significantly larger number of cars in the world.

The starting point for such an analysis is current oil supplies and the way we process them. Global oil output in 2008 reached 86.5 million barrels per day (MBD), including crude oil, natural gas liquids, and the volumetric gain that occurs when you run them through a modern refinery. Roughly 60% of that input is currently turned into gasoline, diesel and jet fuel. Improvements in refining technology should make it possible to push that fraction to perhaps 70%, at the expense of heavy fuel oil displaced from power generation and shipping. So even if global oil output plateaued at only 90 MBD, a scenario that would probably seem optimistic to the adherents of Peak Oil and pessimistic to some industry experts, it could still yield 63 MBD of liquid transportation fuels. Set aside 7 MBD of that for jet fuel and kerosene and another 26 MBD for trucking and home heating oil, and we're left with 30 MBD of gasoline and diesel for passenger cars. That's roughly 25% more than current global consumption in light-duty vehicles, including the couple of MBD of diesel fuel that power Europe's popular diesel cars.
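The arithmetic above is simple enough to lay out explicitly. A minimal sketch, using only the figures quoted in the text (Python purely for illustration):

```python
# Back-of-the-envelope check of the transport-fuel yield estimate.
# All volumes in million barrels per day (MBD), as stated in the text.

total_oil_supply = 90.0      # assumed plateau of global oil output, MBD
transport_fuel_yield = 0.70  # achievable gasoline/diesel/jet fraction after refinery upgrades

transport_fuels = total_oil_supply * transport_fuel_yield  # 63 MBD of liquid transport fuels

jet_and_kerosene = 7.0        # set aside for aviation and kerosene
trucking_and_heating = 26.0   # set aside for trucking and heating oil

light_duty_fuel = transport_fuels - jet_and_kerosene - trucking_and_heating
print(f"Gasoline/diesel available for passenger cars: {light_duty_fuel:.0f} MBD")
```

The 30 MBD result is what the text compares against current light-duty consumption of roughly 24 MBD.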

That doesn't seem to get us nearly far enough, until we consider that in the near future, cars will become much more efficient than they have been, particularly in the US, where an improvement from the current notional average of 25 mpg to the required 35.5 should eventually reduce average fuel consumption per mile by 30%. If the recent reversal in annual vehicle miles traveled persists after the recession ends, that would compound future fuel savings. When we consider that new cars in Europe currently average about 35 mpg and are required to reach approximately 43 mpg by 2015, based on a standard of 130 grams of CO2 emitted per kilometer, and that China has also introduced stricter fuel economy standards, it's not hard to imagine the average world car getting 40 mpg by 2020. That doesn't even require the majority of cars to be hybrids, let alone plug-in hybrids. If that average car drove 9,000 miles per year, it would consume 225 gallons of fuel annually. Following this back-of-the-envelope calculation to its conclusion, our 30 MBD of petroleum-based fuel for light-duty vehicles would be sufficient to cover Dr. Sperling's 2 billion cars with a little bit left over.
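Following that back-of-the-envelope calculation through numerically, again using only the figures in the text, shows how close the supply and demand sides come:

```python
# Could 30 MBD of gasoline and diesel cover a 2-billion-car global fleet
# averaging 40 mpg and 9,000 miles per year?

GALLONS_PER_BARREL = 42
DAYS_PER_YEAR = 365

light_duty_fuel_mbd = 30.0  # from the refinery-yield estimate above
annual_supply_gal = light_duty_fuel_mbd * 1e6 * GALLONS_PER_BARREL * DAYS_PER_YEAR

cars = 2.0e9            # Dr. Sperling's projected global fleet
miles_per_year = 9_000
mpg = 40.0
annual_demand_gal = cars * miles_per_year / mpg  # 225 gal/car/year * 2 billion cars

print(f"Supply: {annual_supply_gal / 1e9:.0f} billion gallons/year")
print(f"Demand: {annual_demand_gal / 1e9:.0f} billion gallons/year")
```

Supply comes to roughly 460 billion gallons a year against demand of 450 billion: sufficient "with a little bit left over," as noted above.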

I'm not for a moment suggesting that this is the likeliest scenario, or that it means we don't need any of the advanced biofuels or electric vehicle technology currently under development. As I've pointed out frequently, fleet turnover in the developed world has slowed, thanks to the recession, and we can expect a long "tail" of older vehicles to persist for some time. However, the results of this simple exercise surprised me; I had expected the final number of cars that could be supplied by oil to be much lower. So while our transportation energy mix in the next couple of decades is still likely to include a much greater variety of fuels and an increasing penetration of electricity, we should not lose sight of the potential for realistically achievable fuel economy improvements and non-efficiency conservation--driving personal cars less and relying more on mass transit and electronic trip substitution--to serve as the most important "transition fuel" in our arsenal as we reduce our reliance on oil to tackle energy security and climate change.

Wednesday, July 15, 2009

Apollo, Forty Years Later

I couldn't let the 40th anniversary of the first moon landing pass by without comment, and not just because of what that event meant to a space-obsessed 11-year-old in 1969. Aside from numerous calls for an Apollo program for energy, or the periodic allusions to energy and climate change as the equivalent of the space program for our time, Apollo's extraordinary accomplishment might still have some lessons to teach us about what it takes to achieve goals of such a magnitude--as well as the proper limits of those lessons. It's also high time to give some serious thought to the role of space exploration in our future.

Although I had originally intended to post on this subject next Monday, on the anniversary of the day that Neil Armstrong stepped onto the lunar surface, it struck me as more appropriate to commemorate the entire mission and the enormous effort that went into planning and executing it. I was pleased to find a website called "We Choose the Moon" that will retrace the events of Apollo 11 in real time, beginning exactly 40 years after the launch on July 16, 1969. NASA has put up a 360-degree interactive panorama of the lunar landing site, and the New York Times has extensive coverage. As interesting as these sites are, however, none of them can recreate the feeling of that long-ago summer, when families and neighbors gathered around their TVs--many of them new color sets bought for the occasion. That cohesion proved fleeting, and it seems almost alien today. Sadly, so does the remarkable combination of urgency and patient, meticulous planning without which the moon landing would have remained as impossible as it must have seemed a decade earlier.

When President Kennedy made his speech to Congress in 1961 setting the goal of landing on the moon within the decade, the technology to deliver that outcome did not exist. The first American manned space flight by Alan Shepard had taken place just three weeks earlier, and the first unmanned Saturn V moon rocket wouldn't be flight-tested for another six years. The financial cost of the moon landing program was so high--roughly $150 billion in today's dollars--because so much of it had to be designed and built from scratch, from the vehicles to the entire infrastructure to assemble, launch, monitor and control them.

It was also high because despite the intense pressure of needing to pull off this feat within eight years, it involved the step-by-step incremental development and demonstration of the capabilities that would ultimately be required. For example, the Gemini Program, involving 10 manned launches in 1965 and '66, was mainly intended to test techniques such as rendezvous, docking and spacewalking that were integral to executing the Apollo concept for going to the moon. Then, between the disastrous Apollo 1 launch pad fire, which forced NASA to redesign the Apollo capsule, and "The Eagle has landed", there were four other manned Apollo flights, each testing incrementally more complex elements of the lunar mission. This was the epitome of combining bold strategic planning with learning by doing; that much, at least, seems broadly relevant to our current energy situation.

The US moon landing effort of the 1960s created a vast technical and industrial pyramid. At its apex was the delivery of a cumulative total of 12 Americans to the surface of the moon and their safe return home. If Apollo 13 had not experienced its well-documented accident, and if the last four missions hadn't been cancelled and recycled into the Apollo-Soyuz demonstration of US-Soviet Detente, plus three visits to the Skylab space station, that figure might have reached 22. Yet as impressive and unprecedented as that was, and in spite of a host of valuable breakthroughs and spinoffs in electronics, medicine, and other fields, this is precisely where all of the analogies between energy and Apollo break down. Remaking our energy systems to provide the safe, secure, affordable and environmentally-sound means of energizing the entire economy--national or global, take your pick--will be nothing like making a few trips to the moon and then turning our back on it for four decades. It will require a durable bi-partisan consensus in government for at least a generation and enduring public support of a kind that NASA was ultimately unable to sustain.

At the same time, the US manned space program has reached an existential crossroads. The shuttle is on its last legs, and its planned replacement, the Orion/Ares system, won't be operational until at least 2015. The International Space Station could be "de-orbited"--allowed to burn up in the atmosphere--as early as 2016 if new funding and a renewed purpose aren't found. This is the context for a blue-ribbon panel that will advise the administration on NASA's future direction. Ambitious plans for a return to the moon and an eventual manned mission to Mars look vulnerable to budget concerns.

I would be remiss if I didn't also mention the enormous potential of space to contribute to solving our energy and environmental problems. Whether in the form of space-based solar power or potential deposits of exotic nuclear fuel on the moon, the long-term solutions to the earth's environmental challenges and resource needs must eventually capitalize on the boundless energy and materials available outside our atmosphere. I'm also mindful of the profoundly-expanded perspective that space exploration has provided us. The widely-recognized "Earthrise" photo from Apollo 8's trip around the moon in late 1968 probably did more to awaken our environmental consciousness than a thousand speeches and rallies.

With the economy sunk in a deep recession and the country grappling with the seemingly intractable issues of health care, gargantuan deficits, and a looming retirement crisis, I can't imagine a better time to recall a moment when we proved that we could accomplish almost anything, if we set our minds to it. I don't know how much the media intends to play up this anniversary. NASA certainly has big plans. Although 50th anniversaries tend to make bigger splashes, the ages of the Apollo 11 crew and the surviving scientists, engineers and others who made their journey possible preclude waiting another decade to stage a proper celebration of their achievement. I'm looking forward to explaining to my daughter just how thrilling it was to watch that first fuzzy broadcast from the moon.

Monday, July 13, 2009

The Wrong Flex-Fuel

An article in today's Wall St. Journal highlighted another of the more obscure provisions of the mammoth climate bill recently passed by the House of Representatives. The section in question relates to the "Open Fuel Standard", which would authorize the Secretary of Transportation to require auto makers in the US to build a specified proportion of "fuel choice-enabling automobiles", including flexible fuel vehicles (FFVs) that can run on fuel blends containing a high percentage of methanol, as well as the more common E85 ethanol blend. This harkens back to previous efforts to launch methanol as a consumer fuel. Fortunately, those failed to gain traction, and we should hope that continues to be the case. Methanol makes a fine racing fuel but is entirely unsuited for mass market application.

I'm perplexed why one member of Congress would be quoted as saying he wouldn't have supported the Waxman-Markey bill without its methanol provision. A simpler alcohol than ethanol, methanol is produced mainly from natural gas, rather than from biomass, and it is a common industrial chemical. Because its economics depend on low-priced sources of natural gas, much of the world's methanol is produced in the Middle East, and some plants in North America have closed. It's not clear that increased US methanol demand would be met by either domestic or non-hydrocarbon sources, so its efficacy in addressing either energy security or climate change looks questionable. That's just as well, because methanol offers an inferior way to deliver energy to vehicles, even compared to ethanol, and its toxicity makes it a poor choice for a consumer fuel.

Start with the energy side of these drawbacks. Turning natural gas into methanol consumes around 1/3 of the energy content of the gas, similar to producing hydrogen from natural gas. As with H2, there's no way to recover those losses when burning methanol in an internal combustion engine, so while direct emissions might be lower, indirect emissions negate most of that benefit. We'd be much better off just putting the natural gas directly into cars. Then there's fuel economy. Even after you modify a car to run on a 50% (M50) or 85% blend (M85) of methanol and gasoline, you can't compensate for its lower energy content without precluding operation on ordinary gasoline. While a car running on E85 typically uses 40% more fuel per mile than on gasoline, you'd need 75% more M85 to go the same distance, because methanol's energy content is 25% less than ethanol's and less than half that of petroleum gasoline. So a Ford Fusion FFV that gets a combined 21 city/highway mpg on gasoline and 15 mpg on E85 would deliver a paltry 12 mpg on M85. Even with the car's generous 17.5 gallon fuel tank, its range on M85 would be barely 200 miles.
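The fuel-economy penalties cited above follow directly from the blending math. A quick sketch, using approximate per-gallon heating values that are my own assumptions (the article gives only the ratios, but these published round numbers reproduce them):

```python
# Approximate volumetric energy contents in BTU per gallon (assumed round
# figures, consistent with the ratios cited in the text).
BTU = {"gasoline": 115_000, "ethanol": 76_000, "methanol": 57_000}

def blend_btu(alcohol: str, fraction: float) -> float:
    """Volumetric energy content of an alcohol/gasoline blend."""
    return fraction * BTU[alcohol] + (1 - fraction) * BTU["gasoline"]

e85 = blend_btu("ethanol", 0.85)
m85 = blend_btu("methanol", 0.85)

fuel_penalty_e85 = BTU["gasoline"] / e85   # ~1.40x more fuel per mile on E85
fuel_penalty_m85 = BTU["gasoline"] / m85   # ~1.75x more fuel per mile on M85

mpg_gasoline = 21.0                        # Ford Fusion FFV combined rating
mpg_m85 = mpg_gasoline / fuel_penalty_m85  # ~12 mpg on M85
range_m85 = 17.5 * mpg_m85                 # ~210 miles on a 17.5-gallon tank

print(f"M85 fuel penalty: {fuel_penalty_m85:.2f}x, mpg: {mpg_m85:.0f}, range: {range_m85:.0f} mi")
```

The 40% and 75% penalties, the 12 mpg figure, and the roughly 200-mile range all drop out of these two blend calculations.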

As if these practical considerations weren't a sufficient disqualification, methanol's handling risks ought to put it out of the running for our future fuel mix. The basic problem is that, unlike gasoline or ethanol, methanol is a neurotoxin. Ingesting even a small quantity can lead to blindness or death, as described in the Material Safety Data Sheet from Methanex, one of the world's largest methanol producers. Its vapors aren't much safer, and it can even be absorbed through the skin. These properties create serious concerns for both bulk handling and at the point of sale. Gasoline is hardly as safe as water, but at least if you spill some on your hand, you don't need to be hospitalized. While methanol can be handled safely by trained personnel in industrial facilities and storage terminals, that doesn't extend to the gas station forecourt, where it would pose a hazard to both customers and employees.

Consumers have rejected methanol fuel before, and I am pretty confident they'll do so again, but possibly not before the government imposes another expensive mandate on an automobile industry that surely doesn't need such distractions. The inclusion of this half-baked idea in the House climate bill is a further indictment of its managers' approach of garnering votes one special interest at a time. The Senate has an opportunity to avoid this trap by stripping out all these extraneous provisions and sending a bill to the eventual House/Senate conference committee that focuses squarely on reducing emissions without making concessions to every member's pet idea.

Friday, July 10, 2009

Biodiesel from Sugar Cane

I was intrigued by a story in yesterday's MIT Technology Review concerning a company that is applying biotechnology to convert Brazilian sugar cane to diesel, instead of ethanol. Amyris apparently intends to buy existing mills and convert them to produce hydrocarbons instead of alcohol. It has started up a demonstration-scale facility for this process near São Paulo. With so many other firms pursuing next-generation biofuels from cellulose or algae, tinkering with the most efficient current means of producing ethanol might seem an odd thing to do, but that efficiency is precisely the reason for choosing this pathway. Amyris sees an opportunity to produce a much better transportation fuel than ethanol at a cost low enough to compete with petroleum products, even if oil prices don't return to the levels we saw last year.

Energy efficiency and high energy returns on energy invested are essential to producing competitive biofuels in a way that avoids the trap the US corn ethanol industry fell into in 2008. Ethanol producers didn't benefit nearly as much from last year's high oil prices as they--and their investors--expected, because the rising cost of the large energy inputs required to make corn ethanol rose in tandem with the price of the fuels it was supposed to displace. This is an example of what some analysts call the Law of Receding Horizons. After factoring in the cost of natural gas-based nitrogen fertilizer, diesel-powered cultivation and harvesting, and gas-fueled distillation, the relatively small energy surplus created wasn't worth enough to make the operation profitable, even at the highest oil price in history.

Amyris's concept breaks out of this trap in several ways. First, by starting with sugar cane in the tropics, it avoids the large energy inputs associated with crop fertilizer. The article points out two other key benefits: Brazilian sugar/ethanol mills are net energy producers, not consumers, by virtue of capitalizing on the energy content of the waste left over from the grinding and fermentation process. In addition, while the ethanol produced by traditional fermentation is water soluble, requiring a lot of energy to separate the two, the molecules produced by the company's tailored microbes are not; the diesel precursors separate from water at little additional energy penalty.

The advantages of this approach continue after production, because of the properties of the fuel. Although it is possible to build engines that capitalize on ethanol's high octane and other properties to deliver fuel economy that nearly matches gasoline, the vast majority of the ethanol produced today will be burned either as a 10% blend in conventional cars or as a higher mix in flexible-fuel vehicles that must still be able to operate reliably on gasoline. That precludes the modifications that would compensate for ethanol's 33% lower energy content, compared to petroleum gasoline. Producing biodiesel instead of ethanol puts the fuel into engines that can take full advantage of its environmental properties, while yielding a roughly 30% fuel efficiency gain versus gasoline--and thus roughly twice the fuel economy of ethanol. Amyris claims that its biodiesel would be fully compatible with petroleum diesel, creating a significant advantage over biodiesel produced from soybeans, canola (rapeseed), and other vegetable oils. These so-called FAME biodiesels can normally only be used in blends of less than 5-10% in petro-diesel, to protect the sensitive fuel injection mechanisms of modern diesel engines.

There's no free lunch, of course. Part of diesel's advantage comes from its higher energy content, compared to either gasoline or ethanol, and the energy in the quantity of cane that would produce 100 gallons of ethanol could only yield around 60 gallons of diesel. However, when you burn these fuels in real cars--such as the VW Jetta that is available in both gasoline and diesel versions--the ethanol would take you around 1,650 miles, while the smaller quantity of diesel would be good for nearly 2,000 miles. That 20% improvement results from the higher efficiency of compression ignition engines over spark ignition.
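The energy-balance comparison above can be checked in a few lines. The heating values below are assumed round figures (the article gives only the gallon and mileage totals), and the per-fuel mpg numbers are simply the ones implied by those totals:

```python
# Approximate volumetric energy contents in BTU per gallon (assumed values).
ETHANOL_BTU = 76_000
DIESEL_BTU = 129_000

# Energy in 100 gallons of cane ethanol, re-expressed as diesel gallons.
cane_energy = 100 * ETHANOL_BTU
diesel_gal = cane_energy / DIESEL_BTU   # ~59 gallons, matching the "around 60" in the text

# Mileage figures implied by the Jetta comparison in the text.
mpg_ethanol = 16.5   # 1,650 miles on 100 gallons
mpg_diesel = 33.0    # ~2,000 miles on ~60 gallons

print(f"Diesel yield: {diesel_gal:.0f} gal")
print(f"Ethanol range: {100 * mpg_ethanol:.0f} mi, diesel range: {diesel_gal * mpg_diesel:.0f} mi")
```

Despite starting with the same cane energy, the diesel pathway delivers roughly 20% more miles, which is the compression-ignition efficiency advantage at work.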

This idea looks clever for another reason. Brazil has become a large exporter of ethanol, but the world's biggest ethanol market is protected by an import tariff designed mainly to recover the $0.45/gal. US ethanol blenders' credit. Meanwhile, the EU, which uses little ethanol, but where half of all new cars run on diesel, has just imposed an anti-dumping tariff on biodiesel imported from the US. That creates an opening for Brazilian biodiesel produced from this process to compete into a market that can't get enough diesel fuel. All that remains is for Amyris to demonstrate that the additional capital and operating costs associated with converting ethanol mills to produce diesel are small enough to preserve the big advantage they start with by choosing the world's most efficient biofuel source.

Wednesday, July 08, 2009

Speculation Witch Hunt?

This morning's financial press was riveted by the prospect of the Commodity Futures Trading Commission (CFTC) imposing tough new regulations on energy markets. Speculation has been widely blamed for the run-up in oil prices since early spring--as well as for last year's roller-coaster up to $145 per barrel and then down to $34--though in a subtle but important shift the focus seems to be turning to volatility, which is a very different thing from absolute price levels. I don't need to add my voice to the many already warning that limits on speculative positions could hamper the proper functioning of the market by drying up liquidity and depriving "legitimate" participants of the access to hedging they need. Instead, I believe the CFTC and its supporters in Congress and the administration are barking up the wrong tree altogether, based on a fundamental misunderstanding of the markets.

Let's begin by stipulating that speculation probably has a finite but impossible-to-quantify impact on oil prices. I pointed out this likelihood in mid-2007, when oil prices were at roughly their current level and before they began their wild ride. I've also described the difficulties involved in discerning precisely which trades are speculative and which aren't, based on my own experience trading oil commodities, futures and derivatives over a 10-year span earlier in my career. However, the current determination to clamp down on speculation appears to be based on two hypotheses that are not only unprovable in the real world, but probably entirely false: First, that in the absence of speculation, oil prices would not have spiked to nearly the degree they did last year and would be much lower today than they are, and second, that a market without speculators--or indeed without any futures trading at all--would be inherently less volatile than one in which those factors are present.

The latter proposition is easier to refute, because we've seen ample volatility in markets for which no futures contracts or easily-traded derivatives exist. I experienced this first-hand in the west coast spot gasoline market in the 1980s, a market consisting entirely of the trading representatives of local refiners and a small number of trading companies, some with storage tanks but many with no fixed assets other than a phone and a desk. Every time a refinery experienced a major operational upset, the market would spike by as much as a dime a gallon--a significant fraction of the value of a commodity that was trading well under a buck at the time. I recall one instance when the coking unit of my employer's L.A. refinery had a major fire and was out of commission for several months. Local supplies weren't adequate to cover the shortfall, and the gap had to be filled through imports. I started buying gasoline cargoes at around $0.60/gal., and by the time I had lined up all the supply we needed the price had hit $1.00/gal. before it fell back to more normal levels. That's volatility, and it is a feature of markets in tight balance between supply and demand, whether or not speculators play a role.

The question of where oil prices would have ended up last year absent speculation seems much more complex, until you consider that between 2002 and 2007 global oil demand had been growing steadily at an average rate of more than 1.5 million barrels per day (MBD) per year, outpacing the growth of global supply, and crucially of non-OPEC supply. The latter was essentially flat from 2004-7, when the price of oil roughly tripled from the low-$30s to the low $90s. In effect, the demand curve was marching to the right against a supply curve with a sharply steepening slope, as spare capacity was used up and the long inherent time lags for new oil projects constrained the amount of new production that could be brought on quickly. That path was then quickly reversed in mid-2008, once it became clear just how rapidly demand was falling, both in direct response to the high price of petroleum products--the full manifestation of which in many markets was delayed by government price controls--and by contraction of the global economy due to what we now know was the onset of a recession on a scale not seen in decades. Between February and September of last year, demand in the developed world fell by an astonishing 3.5 MBD. We'll never know whether prices would have fallen sooner if speculation hadn't maintained its momentum during the first half of 2008, but it's borderline delusional to imagine we wouldn't have spiked above $100/bbl without it.

The Wall St. Journal's "Heard on the Street" column on this topic begins with the sage observation that blame is a commodity in infinite supply. To that I would add that we rarely like to apportion that blame on ourselves, though in this case the government would do well to consider how its own actions exacerbated last year's oil price spike and the run-up in prices we've seen this year. The oil markets are mainly driven by supply and demand, and with OPEC maintaining remarkable discipline and cohesion in the face of last year's demand collapse, the supply component that has the most influence in holding down oil prices is non-OPEC production. What has our government done to promote that production? Have we seen our elected officials traveling the world and using their influence and the still-considerable diplomatic and economic leverage of the US to urge producing countries to increase access for foreign firms and investors to new oil exploration and production opportunities, on attractive terms, as the Chinese government has? Have they fast-tracked development in the most promising regions of our own country that were off-limits for drilling, including the eastern Gulf of Mexico, where reserves have already been discovered?

Such actions didn't even occur under an administration that was widely viewed as being in the pocket of the oil industry, and they certainly aren't happening now, for reasons I could devote many more paragraphs to dissecting. "Drill, baby, drill" has given way to tax, baby, tax--and I'm not referring to the climate bill, here, but to earlier talk of a windfall profits tax to fund tax relief for the middle class, which has morphed into an effort to close perceived tax loopholes such as the intangible drilling allowance for producers and the manufacturing tax deduction for refiners. None of this is going to add a barrel of real oil to our supply, and it seems likely to eliminate more than a few, while we pin our hopes on corn ethanol that still only supplies 2% of our total liquid fuels demand, after adjusting for its lower energy content. How much of the market's volatility ultimately derives from our own deeply conflicted attitudes towards oil?

Oil prices have fallen by $10/bbl. or around 14% since June 29. This coincides with a general recognition that the economy hasn't yet turned the corner to a real recovery; we've also seen the S&P 500 drop by about 7% since mid-June. Now, you might suggest that this proves that speculators had driven up prices unrealistically, but it makes at least as much sense to suggest that the producers and consumers of physical oil and its products have altered their buying and inventory decisions in light of new information about the likely state of the economy for the rest of the year. No one can win that argument, but we can all lose if regulators impose tough new controls on energy markets based on a misunderstanding of what has occurred. To that end, while I am deeply skeptical of the idea of anyone at the CFTC passing judgment on what is and what is not a proper hedge, I wholeheartedly support Chairman Gensler's call for greater transparency of market reporting, and for a healthy dialog between the industry and its regulators aimed at reining in those practices most likely to add speculative froth to the market without contributing meaningfully to the liquidity required by all participants. Let's get the additional insights that transparency will bring us, before we decide to blunt the tools that actually provide one of the few means by which firms can mitigate the effect of underlying physical market volatility on their activities.

Monday, July 06, 2009

The Forgotten Renewable

An editorial in the New York Times last week highlighted a topic I've been meaning to comment on for some time, the gradual demise of our oldest and still largest source of renewable energy: hydroelectric dams. Along with lauding plans to remove several West Coast dams in order to protect fish populations, the Times urged the dismantling of the four large power dams on the lower Snake River in Washington state. The disconnect between that position and the paper's long-standing advocacy of stronger measures to address climate change is remarkable, considering the elimination of 3,000 MW of zero-emission power generation that would accompany the loss of these dams. But if the unpopularity of existing hydropower dams in environmental circles explains the exclusion of this vital energy resource from the definition of "qualified renewables" included in the proposed national Renewable Electricity Standard (RES) of the Waxman-Markey climate bill, that hardly excuses a policy so counter-productive for our efforts to reduce greenhouse gas emissions.

Consider the four dams in question. I can't speak to concerns about declining salmon populations or other habitat issues, though I note that the dams in question are all "run of river" facilities, without large reservoirs. What is clear, however, is that if the four facilities typically operate at the national average hydropower utilization rate of around 36%, their annual power generation would come to about 10 million megawatt-hours (MWh) of electricity, equivalent to the output of 4,000 MW of wind capacity, or roughly 20% of the entire US wind power output in 2008. After a banner year for wind turbine installations in 2008, the US might not add much more new wind capacity than that this year, and wind remains the largest-scale technology among our preferred renewable power options. In fact, since 1999 US hydropower output has declined by an amount greater than the entire current contribution of wind power. That means the emissions benefits of a decade of dramatic growth in wind and solar power have been negated by the loss of hydroelectric generation--a loss that the authors of Waxman-Markey have chosen to ignore by counting in their RES only "incremental hydropower", which they define as

"(A) energy produced from increased efficiency achieved, or additions of capacity made, on or after January 1, 1988, at a hydroelectric facility that was placed in service before that date and does not include additional energy generated as a result of operational changes not directly associated with efficiency improvements or capacity additions; or
(B) energy produced from generating capacity added to a dam on or after January 1, 1988, provided that the Commission certifies that--
(i) the dam was placed in service before the date of the enactment of this section and was operated for flood control, navigation, or water supply purposes and was not producing hydroelectric power prior to the addition of such capacity;
(ii) the hydroelectric project installed on the dam is licensed (or is exempt from licensing) by the Commission and is in compliance with the terms and conditions of the license or exemption, and with other applicable legal requirements for the protection of environmental quality, including applicable fish passage requirements; and
(iii) the hydroelectric project installed on the dam is operated so that the water surface elevation at any given location and time that would have occurred in the absence of the hydroelectric project is maintained, subject to any license or exemption requirements that require changes in water surface elevation for the purpose of improving the environmental quality of the affected waterway."

In other words, a utility could count increases in hydropower toward its RES compliance only if they came from certain carefully specified improvements, while the sole penalty for lost hydropower capacity and output would be an increase in the base amount on which the RES is calculated. So at the full 20% RES level for 2020 and beyond, a MWh of new wind, solar or other "qualified renewable" generation would count five times as much toward compliance as a MWh of hydropower lost.
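For readers who want to check the arithmetic, here is a quick back-of-the-envelope sketch in Python. The 36% hydro capacity factor and the 20% RES level come from the discussion above; the ~3,200 MW combined dam capacity and the ~29% wind capacity factor are my own illustrative assumptions, not figures from the bill or the dams' operators:

```python
HOURS_PER_YEAR = 8760

# Hydro side: roughly 3,200 MW of combined capacity (an illustrative figure)
# running at the ~36% national-average capacity factor cited above.
hydro_capacity_mw = 3_200
hydro_cf = 0.36
hydro_mwh = hydro_capacity_mw * hydro_cf * HOURS_PER_YEAR  # ~10.1 million MWh/yr

# Wind side: capacity needed to generate the same annual energy at an
# assumed ~29% capacity factor, typical for US wind farms.
wind_cf = 0.29
wind_capacity_mw = hydro_mwh / (wind_cf * HOURS_PER_YEAR)  # ~4,000 MW

# RES accounting at the full 20% standard: losing 1 MWh of hydro raises the
# base by 1 MWh, adding only 0.2 MWh to the renewable obligation, while
# 1 MWh of new qualified renewables counts fully toward compliance.
res_fraction = 0.20
obligation_added_per_lost_hydro_mwh = res_fraction * 1.0
credit_per_new_renewable_mwh = 1.0
ratio = credit_per_new_renewable_mwh / obligation_added_per_lost_hydro_mwh

print(f"Hydro output: {hydro_mwh / 1e6:.1f} million MWh/yr")
print(f"Equivalent wind capacity: {wind_capacity_mw:,.0f} MW")
print(f"New renewables count {ratio:.0f}x a lost MWh of hydro")
```

Different capacity assumptions shift the wind-equivalent figure somewhat, but the five-to-one asymmetry in the RES accounting follows directly from the 20% standard itself.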

This mismatch speaks to our conflicted attitudes toward climate change and the broader issues of sustainability. I'm sure that those advocating the removal of these dams would argue that we shouldn't make such decisions on the basis of any single criterion, even one as important as greenhouse gas emissions. Yet that view is at odds with the underlying philosophy of a climate bill that aims to do more than level the playing field--that is, more than simply putting a price on greenhouse gas emissions to capture an environmental externality the energy market otherwise ignores. In addition to its skewed version of cap & trade, Waxman-Markey would stack the deck for a chosen group of renewable energy technologies, in the process excluding the one that produces more zero-emission MWhs than all the rest put together. When the Senate takes up this legislation, it should abandon this narrow focus on specific technologies in favor of one that creates a positive bias for all our low-emission sources, including hydropower and nuclear energy. For a government so determined to demonstrate our seriousness about tackling our emissions in advance of December's Copenhagen climate conference, that would speak far more loudly than another thousand-plus pages of convoluted new regulations.

Wednesday, July 01, 2009

An Energy Bill for the Other 92%

Now that the Waxman-Markey Bill, the American Clean Energy and Security Act of 2009--all 1,428 pages of it--has been narrowly passed by the House of Representatives, its fate rests in the hands of the US Senate, a body that has spurned a long series of cap & trade bills. The Senate's rules will require a 60-vote supermajority just to bring such a bill to a vote, and that doesn't look easy, despite the belated resolution of the Minnesota race. The situation is further complicated by the existence of the Senate's own recently-drafted energy legislation, the American Clean Energy Leadership Act of 2009 (ACELA), from the Senate Energy and Natural Resources Committee chaired by Senator Bingaman (D-NM). Although it lacks a counterpart to Waxman-Markey's cap & trade provisions, ACELA seems in many respects the better bill, promoting both renewable energy and the sources that supply 92.5% of our current energy needs and are likely to dominate our energy diet for many years: fossil fuels and nuclear power. This broader scope will be crucial if our goals extend beyond reducing emissions to include shoring up energy security and fostering net job creation, not just "green jobs."

The full text of the Senate energy bill isn't yet available, nor has it been assigned an "S-number" by which it can be tracked. In reviewing the summary of ACELA on the committee website, I was struck by the marked contrast between its approach and the House bill's. ACELA is the product of a deliberately bipartisan process, and the results of the horse-trading that went into it seem more cohesive and less jarring than the non-cap-and-trade portions of Waxman-Markey. ACELA's renewable electricity standard--which really ought to be a low-emission electricity standard--would start at 3% of electricity sales and ramp up to 15% by 2021. Importantly, the bill emphasizes energy efficiency, particularly for buildings, which would account for most of the greenhouse gas emissions reductions it would promote.

It also includes several provisions that echo themes I've advocated in a number of previous blog postings, such as updating the strategy for the Strategic Petroleum Reserve and opening up more of the Gulf of Mexico for offshore drilling. That would provide prompt access to identified hydrocarbon resources such as Destin Dome and take in a healthy portion of the currently-understood resource potential of those areas that had been kept off-limits by the expired offshore drilling moratoria. In addition, the bill would expand our knowledge of our offshore energy resources, conventional and renewable, through a detailed inventory including seismic exploration. If we're going to have a meaningful national debate concerning the expansion of access for oil & gas drilling, a better understanding of what's actually there is a critical prerequisite. If that seems contrary to the goal of reducing our emissions, consider that the main CO2 cuts from the hydrocarbon sector will result from reduced consumption, which would come at the expense of our enormous oil imports, not from suppressing the domestic production and access to Canadian production that underpin our energy security.

As for energy markets, unlike the heavy-handed regulations buried in the miscellaneous provisions of Waxman-Markey, ACELA would increase the transparency of oil & gas trading by expanding the Energy Information Administration's data and analytical coverage and bolstering industry reporting requirements. And in another provision, the bill would commission a long-overdue assessment of the critical connections between energy and water that I mentioned in last Tuesday's posting.

The bill also emphasizes job creation, both explicitly and implicitly. Its provisions for renewable energy, efficiency, and electricity transmission and grid improvement would promote the same kinds of green jobs claimed by the supporters of Waxman-Markey. At the same time, its oil & gas provisions would stimulate jobs of the kind highlighted by a new labor-industry partnership between the American Petroleum Institute and 15 labor unions. The US oil & gas industry employs 1.8 million people directly and another 4 million or so indirectly. Both figures could grow further with expanded access to US resources, and these jobs typically pay well over the national average.

The gaps and conflicts between the House and Senate bills look too big to overcome through reconciliation, which would in any case require the Senate first to pass either its own energy bill or a version of Waxman-Markey. I spent some time on the phone yesterday with contacts on Senate staffs to try to understand the likely process. Several paths appear possible, with the simplest involving the use of Rule 14 to bring Waxman-Markey directly to the Senate floor. The controversy around the bill and its narrow margin of victory in the House suggest a low likelihood of success for this route. Another avenue would involve moving the House bill into the Environment and Public Works Committee chaired by Senator Boxer (D-CA) and modifying it extensively. That would create the opportunity to include or substitute the measures in the ACELA bill for those in Waxman-Markey. However, that might still not avoid the fate of last year's Boxer-Warner-Lieberman cap & trade bill, which fell significantly short on the cloture vote required to bring it to a full vote of the Senate. The composition of the Senate has changed significantly since last June, yet it remains to be seen whether supporters of cap & trade have gained enough votes to carry the day.

My strong preference would be for the Senate to graft a clean version of cap & trade onto Senator Bingaman's energy bill, jettisoning the distortions that Waxman-Markey acquired in the process of lining up enough House votes to ensure passage. Some of those distortions neatly cleaved the natural business coalition against the bill by lavishing so many free emissions allowances on the utility sector, but in the process severely undermined the bill's potential for achieving prompt and significant emissions reductions. They effectively gave a temporary Get Out of Jail Free card to the sector of the economy that is responsible for the single largest share of our emissions, yet possesses the best options for substituting cleaner natural gas for its highest-emitting energy sources. The legislative fusion I'm suggesting could put a price and a cap on CO2 emissions, while ensuring adequate supplies of nuclear power and North American fossil fuels to manage the long-term transition to a lower-emitting economy.