Wednesday, October 31, 2007

Blaming OPEC

Lately, it seems that everyone has an opinion on why oil prices have reached the threshold of $100/barrel, including several of the candidates in last night's Democratic Presidential Debate. This morning's Wall Street Journal offers some theories from an interesting source, two ministers of an organization that features prominently in many analysts' explanations of the current market tightness: OPEC. Apparently, OPEC's members are vexed at being blamed for high oil prices, which they attribute to three factors: the weak dollar, market speculation, and refining bottlenecks. Considering the source, it's worth spending a few minutes examining these arguments to see whether they have merit.

Let's start with the easiest to dismiss, the refining sector. This argument inverts the usual logic: a temporary shortage of refining capacity normally exerts downward pressure on the prices of specific grades of crude oil, as volumes are backed out of the system and crude inventories grow. But you don't even need to reason that far. Just look at refining margins, which measure the difference between the revenue refiners receive for the products they sell and the cost of their raw material. These margins are dramatically lower than they were in the spring and early summer, when crude oil was trading in the low-to-mid $60s. This tells a clear story: refining capacity--in the US, at least--is not tight, and this factor has not contributed to oil's $15 sprint in the last four months.
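To see why margins settle the question, a toy calculation helps. The 3-2-1 crack spread is a common rule of thumb for refining margins--three barrels of crude yielding roughly two of gasoline and one of distillate. The prices below are purely illustrative, not actual market quotes:

```python
def crack_spread_321(crude, gasoline, heating_oil):
    """Approximate refining margin per barrel of crude via the 3-2-1 rule:
    3 bbl crude -> 2 bbl gasoline + 1 bbl distillate. Prices in $/bbl."""
    return (2 * gasoline + 1 * heating_oil - 3 * crude) / 3

# Illustrative spring prices: crude in the $60s, healthy product prices
spring = crack_spread_321(crude=63, gasoline=92, heating_oil=80)
# Illustrative autumn prices: crude near $90, products up far less
autumn = crack_spread_321(crude=90, gasoline=98, heating_oil=95)

print(f"spring margin: ${spring:.2f}/bbl")  # $25.00/bbl
print(f"autumn margin: ${autumn:.2f}/bbl")  # $7.00/bbl
```

With crude up sharply and product prices lagging, the computed margin collapses--exactly the pattern that rules out a refining bottleneck as the driver of crude's rise.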

Next consider speculation. I don't think this can be dismissed quite so easily, at least as a general contributor to oil prices over the last four years. As I've noted before, the growth in demand for oil futures and options within the broader financial portfolios of hedge funds and other investors must put upward pressure on futures prices, and thus on the enormous volumes of physical oil that are sold at prices pegged to the futures. Still, there's some evidence that the speculative flow has reversed in the last few months, as the ripples from the sub-prime crisis have dried up credit and forced some firms to liquidate positions. In other words, it's much easier to see how speculation contributed to oil's rise from $40 to $70 than from $70 to $90.

That brings us to the dollar, and as the recipients of roughly $2.8 billion per day at current prices, OPEC's members ought to know as much about that subject as anyone. A chart of oil prices for the last two years, expressed in both dollars and Euros, reveals remarkable differences, but it doesn't quite tell the story the OPEC ministers might wish.

Between June 1 and October 18, the dollar price of oil on the New York Mercantile Exchange increased by 37%, while over the same period the price in Euros rose by almost 30%. Clearly, there is a lot more at work here than the weakness of the dollar, at least over the last four months.
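For readers who want to check that decomposition, the two percentages imply how much of the move currency effects can explain. This sketch uses only the figures quoted above; the implied exchange-rate move is a derived estimate, not a market quote:

```python
usd_rise = 0.37   # NYMEX dollar-price change, June 1 - Oct 18 (from the text)
eur_rise = 0.30   # the same move expressed in euros (from the text)

# If oil rose 37% in dollars but only 30% in euros, the dollar must have
# depreciated against the euro by the ratio of the two moves:
usd_depreciation = 1 - (1 + eur_rise) / (1 + usd_rise)
print(f"implied dollar depreciation vs. euro: {usd_depreciation:.1%}")

# The remaining 30% euro-denominated rise is left unexplained by currency.
```

A roughly 5% currency move against a 37% price move leaves most of the increase to be explained by something other than the dollar.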

So while each of these factors has contributed to the escalation of oil prices since 2003, they don't explain why we suddenly find ourselves staring at the prospect of $100/barrel by Christmas. Instead, I think we have to examine the long-term fundamental trends of the industry, which are currently colliding. Demand continues to grow, driven by strong global economic growth, including many places where consumers are insulated from the true market price of petroleum products. Non-OPEC production can't keep up with this growth, because of the combination of production decline rates, drilling bans, and the lagged effects of the near collapse of the industry in the late 1990s. And that's where OPEC comes into the story.

As the world's main oil exporters and the holders of 69% of global proved oil reserves at the latest count, OPEC's members have not planned properly for the growth of their long-term market. This isn't a question of the dwindling increments of existing capacity they are holding back by mutual agreement, but of their failure to reinvest in their core business and expand capacity ahead of demand. There are many reasons for this, including domestic financial pressures in many member countries, the impact of sanctions on Iran, and the war in Iraq. They have generally kept out the international companies that possess the motivation, capital and capability to develop OPEC's unexploited reserves, while failing to do the job themselves--with a few notable exceptions, such as the current expansion program in Saudi Arabia.

Whether this underinvestment in new capacity is due to resource nationalism, the inefficiency of state oil companies, or a deliberate effort to keep the market tight, OPEC must bear at least half the responsibility for $90+ oil. The other half, as one of the candidates implied last night, lies with consumers in the US, China and elsewhere, who have steadily increased demand with little thought of where the next gallon would come from. Unless we do something to alter that trend, OPEC will continue to win and we will continue to lose from this ongoing collision.

Tuesday, October 30, 2007

Planting Green Seeds

Finding ways to help China become more energy efficient and reduce its pollution and greenhouse gas emissions is one of the top energy and environmental priorities in the world. It represents the ultimate win-win opportunity. Success on both fronts would reduce the impact of China's growth on global energy markets and on atmospheric CO2, while improving China's economy and its air quality. GM has just announced a significant step in this direction, with the establishment of a new R&D center near Shanghai, as part of a $250 million investment in the country. The potential Chinese market for hybrid and electric cars and alternative fuels is enormous, though it is not clear whether this will improve GM's profitability in China, rather than merely solidifying its position there.

Shifting China's growing personal transportation sector onto a more sustainable basis as rapidly as possible would benefit everyone. If the technologies involved in the new GM R&D center could be introduced quickly enough, China could end up with one of the world's most efficient vehicle fleets, capitalizing on conditions that form the counterpoint of our situation. Instead of being handicapped by the inertia of an installed base of 250 million vehicles, China is at a sufficiently early stage of growth that its first 100 million cars could still be dominated by hybrids and other advanced technologies. In the process, experience curve effects would flow back to developed countries, reducing the manufacturing costs and improving the reliability of all such cars, everywhere. But as fast as China's car output is growing, this window will begin to close within a few years.

The resulting energy savings from such a shift, relative to the status quo alternative, could amount to several billion barrels of oil over the next 20 years, or a daily volume comparable to the entire US ethanol program in 2007. It would also reduce China's CO2 emissions by at least 1%, or much more if the outcome of this effort were something novel, such as a cheap, bare-bones electric car designed to appeal to the entry-level buyer who currently rides a bicycle.
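As a sanity check on that savings estimate, here's a back-of-envelope version. Every input except the 42 gallons per barrel is an assumption of mine for illustration, not a figure from the article:

```python
# Rough check: fuel savings if China's first 100 million cars were
# hybrid-dominated rather than conventional (all inputs assumed).
fleet_size = 100_000_000      # first 100 million Chinese cars
miles_per_year = 8_000        # assumed annual mileage per car
mpg_conventional = 30         # assumed conventional-fleet fuel economy
mpg_hybrid = 45               # assumed hybrid-dominated fleet economy
gal_per_bbl = 42

gallons_saved = fleet_size * miles_per_year * (1/mpg_conventional - 1/mpg_hybrid)
bbl_per_day = gallons_saved / gal_per_bbl / 365
bbl_20_years = bbl_per_day * 365 * 20
print(f"{bbl_per_day:,.0f} bbl/day, {bbl_20_years/1e9:.1f} billion bbl over 20 years")
```

Roughly half a million barrels per day and about four billion barrels over two decades--magnitudes consistent with the comparison to the US ethanol program.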

Pundits such as Tom Friedman of the New York Times have been pointing to the challenge of greening China as a great opportunity for which US industry is well equipped and from which it could benefit enormously. While I see the same opportunity, I'm less convinced of the prospect of translating those benefits to the bottom lines of US companies. "Cleantech" is certainly the new darling of venture capital and other investors--as noted in the Wall Street Journal's special section on Monday--but if it follows a path similar to "high tech", it could turn out that the profits accrue more to the implementation than to the owners of the technology itself: to the equivalent of cellphone service providers, rather than handset makers. Why would cleantech not end up every bit as competitive globally as PCs and other devices have become?

Local factors will come into play, as well. In China you are only as good as your partner. GM has fared well in this regard so far, but there is no guarantee that Shanghai Automotive Industry Corp. or Liuzhou Wuling Motors Ltd. won't emerge in a decade as major global competitors, armed with GM's latest technology. Depending on GM's fortunes elsewhere, it's not inconceivable that GM's car business could ultimately go the way of IBM's PC business, which was bought out by its partner, Lenovo.

Improving China's automotive technology will have all sorts of consequences, some self-evidently positive, others less so; some foreseeable, others either unintended or entirely unknown. Despite all that, GM looks wise to steal a march on its competitors in this regard, because the alternative is not just leaving things as they are. The world can't accommodate a half-billion Chinese driving conventional SUVs, and GM's global competitors can't be more than a quarter-step behind them in pouring their best resources into the world's fastest-growing large market for cars.

Monday, October 29, 2007

Renewable Reality Check

Last week's issue of Business Week included a fascinating article documenting the frustrations of a manager charged with implementing green energy and sustainability strategies within his company. It also includes a scathing critique of the efficacy of Renewable Energy Credits (RECs), a form of emissions offsets. Anyone thinking that "going green" will be simple and quickly profitable, even after the low-hanging fruit of highly-attractive projects is exhausted, ought to read this article. But it would be wrong to conclude that emissions offsets do nothing to benefit the environment merely because they allow the companies purchasing them to continue to emit and pollute. In fact, it is precisely because of the way they shift the burden of those reductions that RECs help to maximize our response to climate change and local pollution, while minimizing its cost.

RECs are not quite the same thing as the greenhouse gas emissions credits officially traded in countries that have ratified the Kyoto Treaty, or on a voluntary basis here. RECs are specific to electricity produced from new renewable sources--wind, solar, small hydro, etc.--and as with many other derivative instruments, they represent the separation and repackaging of an attribute of a project's operations, risks, or cash flows: in this case the "renewable attribute." The key to this practice is that once a REC has been separated from the megawatt-hour (MWh) of renewable electricity that gave rise to it and then sold, the underlying power can no longer be sold as "green." In other words, while its REC can be traded or re-sold, the renewable attribute of a given MWh can't be counted more than once.

What seems to worry Mr. Schendler, the ski-resort sustainability manager in the article, is that the current price of a REC is too low to encourage the construction of more renewable power projects. As a result, he may be paying for something that would have been built anyway. This issue, known as "additionality," was a major concern in the design of the Kyoto emissions trading system, in which a project can't generate emissions credits unless it wouldn't have been built otherwise. For example, in the case of Kyoto's Clean Development Mechanism (CDM), a developed country facing high costs of reducing its emissions can invest in a project in a developing country that avoids a like quantity of emissions much more cheaply. Such projects generate emissions credits that can be traded. No CDM, no project; no project, no credits.

US RECs are generated in a very different environment, and not just because we didn't ratify Kyoto. First, the federal government and states provide various cash and tax incentives for the production of renewable power. In addition, 23 state governments have established Renewable Portfolio Standards (RPSs) that require utilities to buy a set fraction of their power from renewable sources--with a federal RPS currently percolating through the energy bill negotiations in Congress. Throw in tradable RECs, and it becomes hard to say why a particular green energy project got built, other than that its net financial returns looked attractive to the developer.

When you consider the rapid build-out of renewable energy capacity in the last couple of years, it shouldn't surprise anyone that the supply of such power might exceed its demand in the market, with the result that RECs end up being pretty cheap. It's still a zero-sum game, however. Because Aspen Skiing Co. paid for RECs to render its power consumption green, some other business or cluster of consumers somewhere else in the system can't make the same claim about theirs. True additionality will return with a vengeance, once federal climate change regulations, or the national Renewable Portfolio Standard now under consideration in the Congress, create a tidal wave of demand for offsets. In the meantime, uniqueness ought to be an adequate standard for RECs within US electricity markets, providing an attractive means for businesses to manage emissions they can't afford to reduce directly.

Friday, October 26, 2007

Breaking The Record

Six weeks ago I crunched some numbers and arrived at an inflation-adjusted all-time high oil price of roughly $91/barrel. I also explained why that was lower than the $101 figure that has appeared in numerous publications. With the market currently trading above yesterday's record closing price in nominal dollars of $90.46, we are within a gnat's eyelash of setting a new real-dollar record for oil prices. An article in today's Wall Street Journal provides a good analysis of the factors that brought us here from $80 only a few weeks ago, while one on Canadian royalties offers useful hints about where things might go from here. Although I remain skeptical that we are encountering geologically-driven Peak Oil, the current market is beginning to exhibit the practical effects of a peak, in which production can no longer keep up with demand. If that's so, then it's impossible to say just how high prices might go before destroying enough demand to right the balance.
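For those curious how a figure like $91 arises, the inflation adjustment is simple arithmetic. The inputs below are rounded, illustrative values--the choice of daily peak versus monthly average, and of deflator, is precisely what separates the competing $91 and $101 claims:

```python
def real_price(nominal_then, cpi_then, cpi_now):
    """Express a historical nominal price in today's dollars via the CPI."""
    return nominal_then * cpi_now / cpi_then

# Illustrative inputs (rounded assumptions, not official statistics):
peak_1980 = 36.0    # assumed nominal monthly-average peak, $/bbl
cpi_1980 = 82.4     # rough CPI level, early 1980
cpi_2007 = 208.5    # rough CPI level, late 2007

print(f"${real_price(peak_1980, cpi_1980, cpi_2007):.0f}/bbl in 2007 dollars")
```

Swap in a higher daily-peak price for 1980 and the same formula yields a number near $101, which is why both figures circulate.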

In a peak, producers lose the ability to bring on enough new output to meet growing demand. I don't think it matters much to consumers of oil whether that occurs because there aren't enough large new oilfields to tap, or because resource nationalism and geopolitics have made those fields inaccessible or unaffordable. The announced increase in royalties for the Alberta oil sands provides an example of this in a country that has attracted billions in foreign capital, because it looked safer than places like Venezuela or Russia, which have lots of oil but don't honor their contracts. Boosting the oil sands royalty rate to as much as 60% of revenue (combined federal and provincial take) might not bring new development to a halt, but in combination with global project-cost inflation and local costs denominated in a Canadian dollar that has risen dramatically against the US dollars in which the oil is sold, it makes the whole proposition a lot less profitable. Thus, at the stroke of a pen, hundreds of thousands of barrels per day of future production have probably just vanished.

That highlights an important difference between the current oil price excursion and past spikes that ended in a flood of new production. Production in the OECD countries is barely replacing the natural decline of mature reservoirs. The non-OECD output that has enabled the expansion of demand in the last several years has come mainly from projects that were planned under more attractive fiscal terms, in host countries that have since cooled towards international oil investment, at least from the western oil companies. Producing countries may be entitled to a fair share of the rent on their own resources, but they are also quite capable of killing the golden goose. Meanwhile, spare OPEC production capacity that might have been adequate a decade ago is now too small to dampen the volatility of an 85-86 million barrel per day market.

What all this means is that the safety valve, if there is one, must be found on the demand side. Any adjustments are unlikely to be evenly distributed, however. The continued growth of the large emerging economies is tied to oil, and many of their consumers are artificially insulated from its true price by government intervention. The price that European consumers pay is already dominated by taxes, so changes in oil prices result in smaller percentage increases in fuel prices there than in the US, which has so far been sheltered by weaker refining margins, at least for gasoline, though heating oil is now starting to spike. Nor are measures such as the 35 mpg CAFE standard I discussed Wednesday going to have any impact in the short run.

On balance, then, the mechanisms by which high oil prices might tend to self-correct look weaker than the factors driving them higher. It worries me that this seems to be as true at $90 as it was at $50, a couple of years ago. For some time now, my trading instincts have been telling me that we ought to be approaching a big correction in prices. However, it's getting harder to construct scenarios that would produce that outcome, and the ones that do pivot on economic events that would be even less pleasant than high oil prices have been.

Thursday, October 25, 2007

Recharging the Electric Car

An article in yesterday's Wall St. Journal prompted some thoughts about the relative merits of entirely-electric cars (EVs), compared with hybrids employing varying degrees of electric boost. It's interesting that Honda and GM, both of which have marketed all-electric cars in the past, should arrive at such different conclusions on the subject. Honda now apparently finds pure EVs superior to hybrids and plug-in hybrids (PHEVs), while GM, which invested close to $1 billion in today's dollars in its EV-1 in the 1990s, is not convinced. Although some of the factors that led to the failure of the EV-1 might no longer apply, others look daunting, except perhaps to environmental regulators, who would naturally find zero-local-pollution EVs preferable to hybrids.

When you consider the experience of GM's EV-1, a few things stand out--after setting aside the unfounded allegations that it was killed by a conspiracy. As the testimonials from its former lessees attest, the EV-1 was a terrific car, as long as your driving needs didn't exceed about 75 miles and you never needed to transport more than one additional passenger. The latter was a design issue, constrained by the size of the original lead-acid battery pack and the body shape GM selected. But while the range was largely a function of the available battery technology, it also reflected the challenge of quickly recharging a partially-drained battery, even using the small number of high-voltage recharging facilities that GM and its partners installed around Southern California. How many of us would really have wanted a car that couldn't go more than 100 miles without stopping for a charge that might take an hour?

Batteries have improved a lot since the first EV-1 left the factory. If the lithium batteries around which GM is planning its Chevrolet Volt PHEV are able to deliver 40 miles of gasoline-free driving, surely a larger array of the same batteries could take an all-electric version of the same car 200 miles or more. That's the basis of the Tesla, which claims a range of 245 miles for its sleek electric roadster, running on thousands of laptop batteries.
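The battery scaling behind that reasoning is straightforward to sketch. The pack figures here are assumptions of mine for illustration, not GM or Tesla specifications:

```python
# Scale an assumed PHEV battery pack up to an all-electric range target.
phev_range_miles = 40
phev_pack_kwh = 16          # assumed pack size for 40 miles of EV driving
kwh_per_mile = phev_pack_kwh / phev_range_miles

ev_range_target = 200
ev_pack_kwh = kwh_per_mile * ev_range_target
print(f"{ev_pack_kwh:.0f} kWh pack for {ev_range_target} miles")
```

Under these assumptions a 200-mile EV needs a pack five times the size of the PHEV's, which is why battery cost per mile of range dominates the comparison.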

Paradoxically, that range is both good news and bad news. Because it easily exceeds what most of us require for our daily commutes or errands, it effectively severs the chicken-and-egg infrastructure dependency that I believe really killed the EV-1: you can't sell EVs without recharging facilities but can't justify the recharging facilities without lots of EVs already on the road. A 200+ mile range eliminates the need for most recharging away from home, workplace or other predictable sources of electrical outlets and reduces the inconvenience associated with the lengthy intervals required for low-voltage recharging. In the process, though, it also eliminates most of the incentive to build a fast-recharging infrastructure to meet the needs of longer-distance travel.

It's arguable that this isn't a real impediment. When I first started looking at these issues in the mid-1990s, I saw compelling data that suggested that American motoring habits were evolving towards each household owning a mini-fleet of specialized vehicles: the commute or train car, the kid-hauler, the weekend sports car, etc. That meant that a car like the EV-1, which couldn't fill all of these roles but excelled at one, had great potential. Unfortunately, that view turned out to be wrong, or at least premature. This is still a key question today. Would millions of consumers be happy to own a car that they couldn't sensibly drive from Boston to Washington, DC, let alone from Washington to Minneapolis?

If the answer is yes, then a pure electric car looks pretty good, and the added complication and expense associated with a plug-in hybrid might not be justified, provided the cost per watt-hour of batteries keeps falling. Personally, though, I think the flexibility of a PHEV able to run on gasoline, E-85 or electricity will appeal to more folks than a simpler battery car. I also doubt that the target market for the six-figure Tesla will tell us much about that trade-off. But isn't it nice that technology is finally providing multiple choices for our future transportation needs? EVs and the various hybrids may compete for market share, but they could also coexist nicely, all furthering the gradual electrification of automobiles.

Wednesday, October 24, 2007

CAFE and Energy Independence

As the readership of this blog has grown, I find myself receiving more inquiries from small PR firms seeking my aid in promoting some issue, product, or cause. Last week, I was contacted by an agency working with the Pew Center on Global Climate Change on their campaign in support of the 35 mile per gallon Corporate Average Fuel Economy (CAFE) standard under consideration in the Congress. After an exchange of emails with the PR contact, I learned that the message for conservatives is intended to emphasize CAFE's impact on energy independence. Instead of harping on the practical impossibility of that goal, I'd rather look at how a tougher CAFE standard could align with other measures and arrest the steady deterioration of our energy independence that has taken place over the last two decades.

I haven't changed my view that CAFE is a relatively weak policy by itself--a target and a tracking mechanism without the teeth to drive the change it will measure. The original CAFE standards probably get too much credit for the improvement in fuel economy that followed the oil price spikes of the 1970s. And while they kept fuel economy on the radar screens of the auto companies after oil prices collapsed in the mid-1980s, their loophole for light trucks--designed to protect small businesses--subsequently midwifed the explosive growth of SUVs. So if we regard CAFE as an important and consequential metric, but not the prime driver of automotive fuel efficiency, it forces us to think about the larger environment in which it plays out. CAFE won't be out there by itself.

Today that environment includes two primary drivers of change. The first is energy security, manifested by the high and volatile prices we pay for fuel. The quantity of energy we import affects those prices directly, through our competition with other consuming countries for the limited exports from oil producing countries, and indirectly, by expanding our trade deficit and weakening the dollar. But while "energy independence" has been part of the campaign rhetoric of most of the presidential candidates, I've yet to hear one quantify it. The chart below highlights the decline in US self-sufficiency, in terms of the share of our total primary energy consumption that we produce ourselves. Remarkably, at 71% our current level of energy independence is much lower than when independence first became an aspirational target in the mid-1970s. Nor has renewable energy made a noticeable dent in the problem, so far.
[Chart: US energy independence, %--share of total primary energy consumption produced domestically]

Although the decline in US oil production contributed to this deterioration, we actually produce more primary energy today than we did in 1984, when we were 90% energy independent. The cumulative effect of demand growth has been the big story here, and CAFE speaks directly to that trend.
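The self-sufficiency measure in the chart is just a ratio. The quadrillion-Btu inputs below are rough magnitudes I've chosen to match the percentages in the text, not official statistics:

```python
def self_sufficiency(domestic_production, total_consumption):
    """Share of primary energy consumption met by domestic production."""
    return domestic_production / total_consumption

# Illustrative quadrillion-Btu figures (rough magnitudes, assumed):
print(f"1984: {self_sufficiency(66, 73):.0%}")   # production 66, consumption 73
print(f"2007: {self_sufficiency(71, 100):.0%}")  # production 71, consumption 100
```

Note that production rises between the two rows even as the ratio falls: demand growth, not a production collapse, drives the deterioration.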

The other major driver of change is global warming. Whether the current fires, drought, Arctic melting, or any other phenomenon is directly attributable to man-made climate change, it is noteworthy that the question is now asked in every case. I will cover the Lieberman-Warner climate change bill, which was introduced last week, in more detail later. Even if it turns out not to be the last word on a cap & trade system, it is becoming hard to imagine a future in which the Congress does not pass meaningful legislation limiting US greenhouse gas emissions. The details are important, but the fundamental fact of assigning a monetary cost to CO2 emissions represents a step-change in the economics of energy and transportation.

Taken together, that means that a 35 mpg CAFE standard would be implemented against the backdrop of high oil prices with no end in sight (eerily reminding me of 1982) and escalating costs for the CO2 that is an inevitable byproduct of burning gasoline, at the rate of 20 pounds for every gallon. Achieving 35 mpg will doubtless reduce those emissions and break the trend of our growing energy dependence. However, it must be the consequences of those high oil prices and of monetizing CO2 that ultimately convince consumers to buy the cars that will make 35 mpg attainable. And if that mechanism isn't as clear-cut as the high carbon tax that many economists would prefer, it at least has the virtue of being possible in the current political environment.
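The 20-pound figure translates into per-car emissions as follows; the annual mileage and 25 mpg baseline are assumptions of mine for illustration:

```python
# CO2 arithmetic behind the 20 lb/gallon figure and the 35 mpg target.
co2_lb_per_gal = 20        # approx. CO2 from burning one gallon of gasoline
miles_per_year = 12_000    # assumed annual mileage

def annual_co2_tons(mpg):
    """Annual CO2 per car in short tons at a given fuel economy."""
    return miles_per_year / mpg * co2_lb_per_gal / 2000

before, after = annual_co2_tons(25), annual_co2_tons(35)
print(f"{before:.1f} -> {after:.1f} tons/yr, saving {before - after:.1f}")
```

Multiply a per-car saving on that order across a fleet of well over 200 million vehicles, and the stakes of the CAFE debate come into focus.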

Tuesday, October 23, 2007

The Hydrogen Bet

Preparing my slides for use in the upcoming webinar on future transportation fuels (see below) forced me to confront the difficulties of presenting a positive case for hydrogen without glossing over the numerous caveats that go along with it. As I Googled for data, I ran across an interesting debate that reflects the same conundrum. Although I don't feel compelled to choose a side in this particular debate, which has been blogged by Treehugger, it got me thinking about the introduction of fuel cell vehicles (FCVs) and whether they would be able to ramp up gradually, the way that hybrids have. My conclusion is that in 2015, when Dr. Romm and Mr. Blencoe must settle their bet, there's likely to be no ambiguity about who has won.

When you consider the barriers standing in the way of mass-market hydrogen cars, they really all need to be solved at once--or if not at once, then at least in a package that gets enough of these cars on the road quickly enough for sales to climb rapidly to the 500,000-to-1-million level that manufacturers such as GM and Ballard indicate would be necessary to bring the cost of the fuel cell stack assembly and power electronics into the same range as an internal combustion engine powertrain. The Blencoe/Romm bet hinges on whether FCVs can reach 1% of new car sales by 2015. Allowing for a bit of growth between now and then, that works out to roughly 200,000 units per year. That level of sales couldn't be achieved without sufficient infrastructure for hydrogen production and distribution, whether in the form of magnesium hydride or simply compressed H2--infrastructure that is entirely new and incompatible with our current fuels distribution systems.
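The bet's threshold works out as shown below; the 2015 sales figure is my rough extrapolation from current levels, not a forecast from either party:

```python
# 1% of US new-car sales by 2015, per the Blencoe/Romm bet.
us_light_vehicle_sales_2015 = 20_000_000   # assumed annual units (extrapolated)
fcv_share = 0.01                            # the bet's threshold

fcv_units = us_light_vehicle_sales_2015 * fcv_share
print(f"{fcv_units:,.0f} fuel cell vehicles per year")
```

Start from today's roughly 16 million annual sales instead and the threshold is a bit lower, but the order of magnitude--hundreds of thousands of cars per year--doesn't change.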

The best scenario for overcoming this critical interdependence involves a focused regional rollout, and Southern California comes to mind as a likely venue. But with about 13 million cars in the entire L.A. Basin, and total car sales in California running at 2 million per year, FCVs would have to attain a 20% local market share within five years of launch. (It has taken hybrids 10 years to capture roughly 5% of this key regional market.) The necessary infrastructure would have to roll out even faster, to prevent a refueling bottleneck from developing and to assure consumers that they won't be buying orphan cars. If you've ever tried to get anything permitted in California, let alone built, you begin to grasp the magnitude of that challenge. And I haven't even mentioned the problem of producing enough hydrogen to supply a half million or so FCVs--whether it would come from natural gas, as most of the H2 used in the fertilizer and chemical industries does, or from electrolysis in a state that experienced serious electricity shortfalls only a few years ago.

My readers know I'm generally cautious about projections that depend on many things going just right, especially when fleet turnover and infrastructure are involved. But let's grant, for the sake of argument, that all the necessary preconditions could be met in Southern California between now and 2010, perhaps as a result of new legislation. Even then, things get tricky. Hydrogen cars will have to compete in a landscape in which many new car models will be available as hybrids, high-efficiency European-style diesels, or in some cases plug-in hybrids (PHEVs). Dr. Romm is not alone in regarding the PHEV as a potential fuel cell killer, because it offers similar efficiency while requiring minimal new infrastructure.

The appropriate analogy here isn't to the 1980s' battle between VHS and Betamax, or even the current competition between Blu-ray and HD-DVD. In the case of FCVs vs. PHEVs, it's as if you didn't just need a new DVD player to play a Blu-ray disc, but also a different TV and an entirely re-wired home, while HD-DVD required only one new box. In order for Blu-ray to win under those circumstances, I think it would have to do a lot more than just display incredibly sharp images on your TV. Even 3-D might not be enough. (Note: I don't own a high-def DVD player and have no dog in that fight--this is just an analogy.) For the fuel cell car to win, it must be demonstrably so superior to anything else, including the PHEV, that consumers will yearn for it as though it were the latest iPod/iPhone with wheels, and energy companies must abandon caution--and their economic models--and race to build the infrastructure in advance of the demand, just to be in on the ground floor of the Next Big Thing.

Please understand that my heart is with Mr. Blencoe, here. An FCV running on clean hydrogen is a much more elegant concept than the PHEV, which for all its clever engineering seems a bit kludgy by comparison. But while I've never been as pessimistic about hydrogen's prospects as Dr. Romm, I'm afraid he has the surer side of this bet. 2015 leaves little time to complete all the development, prototyping, certification, permitting and retooling necessary to sell 200,000 fuel cell cars per year.

On November 1, 2007 I'll be participating in a webcast discussion on "Fuels for Now and the Future" with Scott Sklar of The Stella Group, Ltd., hosted by Cleantech Collective. For more information and to register for the webcast, please follow this link.

Monday, October 22, 2007

Regulating Speculation

The front page of the Sunday Washington Post featured an article on the perils of speculation on under-regulated energy futures exchanges. The Post cites the cost to consumers from speculators driving up the price of these commodities and makes a case for expanding both the powers and budget of the Commodity Futures Trading Commission (CFTC), the federal body established to regulate such transactions. However, the article also describes how further regulation might drive this trade off the regulated exchanges and deeper into the unregulated and much less transparent over-the-counter markets (OTCs). While all of this is interesting, it reflects the typical shortfalls of coverage that treats futures markets as black boxes. The reality is more complex and less nefarious--and the likely solution much simpler--than the Post suggests.

Futures markets offer important benefits for all participants, particularly for those seeking to manage the price risks of the physical oil positions intrinsic to their operations. That includes oil, gas and electricity producers, refiners and large consumers. The fixed-price heating oil contracts that have become so popular with consumers would not exist without thriving futures markets. Purely financial participants play an essential role in these markets, providing liquidity and taking offsetting positions that the physical players might eschew on any given day. Pegging the price of physical transactions to the settlement prices of these exchanges became popular starting in the late 1980s, because their liquidity and transparency were impossible to match for all but a few high-volume physical trades. The other main benefit for exchange participants is the virtual elimination of counter-party risk, the risk that when the time comes to collect the oil or money owed at the settlement of the transaction, the other party won't be able to make good.

The OTCs serve a different, but complementary role, facilitating transactions that are too specialized or thinly-traded for the established futures exchanges to take on. These can be very lucrative deals for market makers, because transparency is low and transaction costs are often very high. The frontier between the exchanges and OTCs is dynamic, with the former periodically offering new products that encroach on the turf of the latter--sour crude, fuel oil, etc. Anything that made the futures markets less useful or more expensive for their participants would drive trade toward the OTCs, and that's the chief risk of over-regulating these exchanges.

Several months ago I wrote a lengthy posting concluding that it was plausible that speculation had contributed to the dramatic increase in oil prices over the last four years. I took a lot of flak for that suggestion, which I still find eminently defensible on fundamental economic grounds. Notwithstanding expanding global demand for oil and the sharply increased marginal cost of bringing new supplies to market, along with a wide array of "above-ground" risks, financial speculation in oil by non-industry players represents a growing source of demand in the virtual markets that set the price for much of the physical trade in oil and petroleum products. But commodity markets, either regulated formal exchanges or unregulated OTCs, are not responsible for this phenomenon. They are merely the conduit for the impulses of an increasingly securitized financial economy that bets on every aspect of life, down to the weather. The pitfalls of some of these complex bets often don't become apparent until after they go bad, as we are witnessing on a large scale today.

Now factor in the legitimate national interest of the US government and the public it serves to minimize the impact of financial speculation on the final cost of petroleum products and the other forms of energy upon which the real economy depends. There is no corresponding national interest in allowing speculators to profit from these commodities, beyond a practical interest in allowing enough liquidity to ensure that these markets work smoothly for all participants, especially those with physical exposure to hedge. If speculation in oil commodities drove their price up by $10/barrel, that would cost businesses and consumers $70 billion/year and increase the US trade deficit by about $40 billion. On that basis, I believe there is a solid argument for new regulation. The trick is ensuring that the treatment doesn't kill the patient.
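For readers who want to check those figures, here is a quick back-of-envelope sketch. The consumption and import volumes are my rough approximations of 2007 US levels, not precise statistics:

```python
# Rough check of the speculation-cost figures cited above.
# Volumes below are approximate 2007 US values (my assumptions):
US_CONSUMPTION_BPD = 20.0e6   # US petroleum consumption, barrels/day
NET_IMPORTS_BPD = 12.0e6      # US net petroleum imports, barrels/day
PREMIUM = 10.0                # hypothetical speculative premium, $/barrel

annual_cost = US_CONSUMPTION_BPD * 365 * PREMIUM   # cost to businesses/consumers
deficit_impact = NET_IMPORTS_BPD * 365 * PREMIUM   # added to the trade deficit

print(f"Annual cost to the economy: ${annual_cost / 1e9:.0f} billion")
print(f"Trade deficit impact:      ${deficit_impact / 1e9:.0f} billion")
# Roughly $73 billion and $44 billion--in line with the $70B and $40B cited.
```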

That's where I think the Post and various members of Congress pursuing tighter scrutiny of commodity exchanges are on the wrong track. The issue here is not the exchanges, or even the OTCs, but rather the investors using them. The challenge for an external party trying to piece together all of the market positions taken across various futures exchanges and OTC markets by a party such as Amaranth, the new poster-child for commodities excesses, looks truly daunting, even for auditors examining them after the fact. The most sensible alternative to such a regulatory nightmare would be to require investors themselves to disclose aggregate positions exceeding some threshold to federal regulators, just as the SEC requires the disclosure of any stake over 5% in the equity of a traded company.

While journalists and the public may regard such speculators as undisciplined cowboys, every one of them has internal controls that require daily or real-time "mark-to-market" trading reports, tallying the entity's current exposure to each commodity. Reporting that exposure to the CFTC whenever it exceeded 10 million barrels of oil or the equivalent in gas or other energy commodities could be accomplished with minimal new bureaucracy or accounting burden, and without distorting the relationship among the physical, futures and OTC markets. That would provide the over-the-shoulder scrutiny that the Post and others advocate, but at a much lower cost to the economy.

Friday, October 19, 2007

Lighter Than Air

If you ever want to stump someone at a party, particularly one decorated with balloons, ask them where we get helium. And if someone says, "The Sun," they'd only be half-right. Although it's not widely advertised, most of our current supply of the gas is derived as a byproduct of natural gas production, especially from the helium-rich gas fields of the US mid-continent. Last Friday's broadcast of Science Friday on NPR included a short segment on the growing global shortage of helium, and some of its consequences. This isn't only important to balloon sellers and blimp operators. And while the current shortfall is related to temporary production problems in the US, Middle East and North Africa, it remains to be seen whether new international supplies can offset the natural depletion of our current sources.

Although most of the helium on earth is found in the atmosphere, it is present there in such low concentrations (roughly 5 parts per million) that it is uneconomical to extract it from the inert gases produced at air separation plants. Natural gas from certain fields contains much higher concentrations, ranging up to 8%, or roughly 15,000 times higher than the atmosphere. The Federal Helium Reserve near Amarillo, TX, which dates back to the 1920s, is the linchpin of global helium distribution. It gathers the helium-rich streams from gas fields and their associated processing facilities in the region and stores the crude helium for later refining and sale, at volumes averaging 2 billion cubic feet per year. At one time, this complex was the largest helium source in the world, although commercial facilities have since surpassed its output, if not its storage capacity.

So what does this have to do with energy, other than as a trivia question? Helium itself has relatively few energy applications, beyond its use in superconducting magnets and energy storage, and as a heat transfer medium in advanced nuclear reactors. There's even an exotic wind turbine design that relies on helium's buoyancy, though it may never get off the drawing board.

The bigger issue here is the linkage between the helium and natural gas businesses, which leaves global helium production at the mercy of natural gas market trends. Many of the helium-rich US gas fields are mature, and so US production will eventually decline. Even before then, however, the ongoing sale of the Federal Helium Reserve stockpile could make the global market for helium as volatile as the oil market. As the NPR segment discussed, LNG production outside the US will become an increasingly important source of helium, until the point at which global natural gas production reaches a peak, bringing about Peak Helium, as well. That doesn't sound as ominous as Peak Oil, unless you're in an industry that depends on helium in your manufacturing or operations.

Thursday, October 18, 2007

Buying High

An article on oil prices in yesterday's Washington Post reminded me that the federal government has resumed purchasing crude oil to inject into the Strategic Petroleum Reserve (SPR). Whether this involves paying cash for crude or swapping it for barrels the government receives as royalty-in-kind, the net result is less crude available for delivery to refiners, at a time when commercial oil inventories are shrinking globally and prices are in record territory. The 70,000-100,000 barrels per day going into the SPR hardly seems sufficient to drive up prices by the $15/barrel we've seen since August, but that doesn't mean the psychology of the market is unaffected. It's also a terrible deal for taxpayers.

My long-time readers know that I regard the present SPR as an outmoded relic of the energy crisis of the 1970s. It has served a useful purpose during crises, but it has also discouraged industry from holding larger inventories closer to where actual demand occurs. All of this could be rectified, and I've made that case before. If we are stuck with the current SPR for the foreseeable future, though, the question becomes how it ought to be managed. I haven't changed my view that it shouldn't be used by the government to dampen oil prices in non-crisis periods. The releases in 2000 to moderate heating oil prices were well-intended but shouldn't have occurred. However, that's not the same thing as saying it makes sense to pay record high prices to fill the SPR, since there's no guarantee that the oil would be sold for a higher price later.

Look at how volatile oil prices have been this year. They've ranged from a momentary high yesterday of $89/barrel to a low of $50.48 on January 18. While plenty of traders expect oil to go even higher, I would not want to bet my home or my 401(k) against seeing prices dip below $60 some time next year, even if the long-dated oil futures are currently around $75 all the way out to 2015.

Given this kind of volatility, what would be a reasonable course for a government that ultimately intends to acquire an additional 205 million barrels of oil to reach the stated 1 billion barrel target? Well, if the government only bought oil when it was below $60/barrel, it might still be able to reach its target, but at substantially lower cost. If such a policy had been in place this year, the roughly 3 million barrels purchased since August would have had to be deferred, at a savings of up to $60 million. Such a mechanism would also function as a sort of soft floor price, without the pitfalls of a tariff-based floor, while still providing some of the reassurance that investors in renewable energy are seeking.
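The deferred-purchase savings are easy to sketch out. The average price actually paid is my assumption, since the recent purchases were spread across the August-to-October run-up:

```python
# Illustrative estimate of the savings from a $60/bbl purchase ceiling.
BARRELS = 3.0e6      # barrels bought for the SPR since August (from the post)
AVG_PAID = 80.0      # assumed average price actually paid, $/bbl (hypothetical)
CEILING = 60.0       # proposed purchase ceiling, $/bbl

savings = BARRELS * (AVG_PAID - CEILING)
print(f"Potential savings from deferring purchases: ${savings / 1e6:.0f} million")
# Consistent with the "up to $60 million" figure above, under these assumptions.
```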

As an old oil trader, I know that the actual details of such an arrangement would be more complicated than that simple description above. You can't gear up to put oil into the SPR on a moment's notice, to capture some blip in the market. But I would also bet that the DOE could hire a suitable US-based trading company to manage the purchasing process along these lines and deliver the oil within a suitable window, for a fee that would look minuscule compared to the effective premium we're paying right now to top up the SPR during a price spike.

Wednesday, October 17, 2007

Spheres of Influence

It would be tempting to view yesterday's meeting between Presidents Putin of Russia and Ahmadinejad of Iran as the beginning of a new "axis of oil"--two petro-authoritarians forging solidarity against the West. Mr. Putin's remarks in support of Iran's nuclear program were hardly helpful to the cause of inducing Iran to become more transparent about its efforts. But while high oil prices have enabled both of their countries to pursue much more ambitious agendas than would otherwise have been possible, I see a different interpretation of Mr. Putin's comments and handshake with Mr. Ahmadinejad. Russian interests in the region go back a long way, and the reestablishment of a "sphere of interest" seems entirely compatible with Putin's view of Russia's rightful position in the world order, and his own role within Russia.

The fact that the encounter between the two leaders took place against the backdrop of a regional summit of the countries bordering the Caspian is significant. The purpose of the gathering was to assert that the only countries that ought to determine the disposition of the Caspian's resources and its infrastructure (think pipelines) are those bordering it. That boils down to Iran plus the states of the former Soviet Union, and explicitly excludes the US and EU. If this were merely a question of sovereignty, no one could argue with it. However, this sort of exclusionary approach pre-dates the "near abroad" view of the USSR and hearkens back to the "spheres of influence" theory prevalent in late tsarist Russia, which sparred with the British Empire across this geography for generations in the "Great Game" of the 19th and early 20th centuries. Perhaps President Putin believes he's engaged in a Great Game II with the US.

It's not surprising that Mr. Putin would pursue such a policy, since his own ambitions seem to mirror those of his pre- and post-Revolutionary predecessors. As described in a recent article in the Economist, the two-term limit on the Russian Presidency probably won't constrain Putin's ability to retain power, via a clever switch to an elevated office of the Prime Minister. If he can pull this off, with the help of a 70% approval rating, Vladimir Putin could easily rule Russia for decades, since he just turned 55. And like the tsars, he can afford to think long-term about expanding Russian influence and reducing that of the US across Central Asia and the Middle East. In that respect, at least, he finds common ground with Iran.

While President Ahmadinejad didn't secure a promise from Russia to fuel the nearly-complete nuclear reactor at Bushehr, Putin's vague warning about any military action against Iran will embolden Iran in its pursuit of a nuclear enrichment capability that I continue to believe only makes sense in the context of a nuclear weapons program, whether current or aspirational. That development wouldn't be in Russia's interests any more than it would be in ours. Let's hope that Mr. Putin sees all this clearly, rather than regarding Iran as simply a convenient foil against the US, or worse, as a potential client state willing to do his bidding. That would make it much harder to employ a strategy of deterrence against a future nuclear-armed Iran, while paradoxically increasing the incentive for us to act against that eventuality now, before Russo-Iranian friendship turns into an alliance.

Tuesday, October 16, 2007

$100 or Bust?

While I was traveling yesterday, oil prices set another record. From $86/barrel, the idea of breaking $100 before the end of the year doesn't seem nearly as implausible as it did a few months ago. Happily, I'm not in the business of predicting oil prices--we're already beyond where I'd have expected prices to top out. However, I am in the business of considering the implications of future scenarios, and at this point I think it's more interesting to consider what $100 oil might mean, rather than guessing when and whether we'll see it.

First, $100/bbl would finally vault us past the previous inflation-adjusted peak price from the early 1980s, at least for the light, sweet crude priced by the NYMEX WTI and ICE Brent contracts. Even though the US economy is much less sensitive to oil price increases than it was in the 1970s-80s, since it has become much more efficient, there must be some point at which higher oil prices--even if they're partly driven by a weaker US dollar--start to slow the economy, either by stimulating producer-price inflation or because they are the equivalent of a tax on consumption. $100 might not constitute a magic level at which those effects would suddenly kick in, but it would represent a doubling of nominal prices in three years and a quadrupling in five years. That's approaching "oil shock" territory, even though the process has been a lot more gradual than the shocks of the '70s.

What it would mean for gasoline prices depends on refining margins, which have weakened significantly from their stratospheric heights earlier this year. If refining margins in the fourth quarter of 2007 remain comparable to those in 4Q06, then we could expect a US average pump price of around $3.20/gallon for unleaded regular, versus $2.77 today. (If margins spiked again, it might go as high as $3.60/gal.) That ought to be enough to elevate fuel economy in car-buyers' priorities and put a big dent in the recent resurgence in the sales of large SUVs, while giving hybrids and crossovers a bigger boost. However, I doubt it would be enough to change fuel consumption dramatically, because so many elements of our driving patterns are tied to our lifestyle choices.
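For the curious, here is how that pump-price estimate can be decomposed. The margin and tax components below are illustrative assumptions, not measured figures:

```python
# Hedged decomposition of a pump price at $100/bbl crude.
CRUDE_PER_BBL = 100.0        # hypothetical crude price, $/bbl
GAL_PER_BBL = 42             # gallons per barrel
refining_margin = 0.20       # assumed 4Q06-like refining margin, $/gal
dist_and_retail = 0.15       # assumed distribution and retail margin, $/gal
taxes = 0.47                 # approximate average US gasoline tax, $/gal

pump = CRUDE_PER_BBL / GAL_PER_BBL + refining_margin + dist_and_retail + taxes
print(f"Estimated pump price at $100 crude: ${pump:.2f}/gal")
# Lands near the $3.20/gal estimate; a margin spike would push it higher.
```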

Lurking behind these implications is the larger question of the psychological impact of reaching the $100 mark. We've come so far in the last few years, I'm not sure another $15/bbl would make much difference. Consumers complain about high gas prices, and demand is growing more slowly than it was previously, but the sea change that promoters of alternative energy and higher efficiency have been hoping for remains elusive. Has the gradual climb from $25 to $40, $50, $65, $70, and then the recent sprint past $80/bbl numbed us and obliterated our memory of just how unthinkable all this seemed not very long ago? Could $100 shatter our complacency, or would it be just another turn of the dial on the boiling frog? I can see it going either way, with very different results in each case: two complex scenarios, instead of the simple one we started with. One sets up radical energy change, while the other preserves the current evolution, gradually adding new sources to our traditional ones. The latter sounds much less glamorous, but it would also be a lot less disruptive and a lot more predictable.

Monday, October 15, 2007

The Other Half of the Nobel

I'd be derelict in my duty if I let the announcement of this year's Nobel Peace Prize pass by without comment. However, I'd like to focus on the half of the award that did not go to former Vice President Gore. If you read the Nobel citation, you'll see that the Intergovernmental Panel on Climate Change (IPCC) actually received top billing, even though the media have largely ignored that or treated it as incidental. While it's understandable that we should focus more on the individual who personifies the cause of climate change in this country, and perhaps the world, I think the award to the IPCC might actually be more significant. By awarding the Nobel Peace Prize to the IPCC, the Norwegian Nobel Committee is putting its moral authority behind what is generally referred to as the "scientific consensus on climate change."

Not to diminish Mr. Gore's efforts, but without the work of the IPCC, his presentation on climate change, captured on film in "An Inconvenient Truth," would have rested on mere conjecture. It took thousands of scientists and decades of research, peer review and public debate to arrive at the present picture of the interaction of all the factors affecting the global climate system, including the man-made greenhouse gas emissions that are nudging the Earth's climate towards a warmer state, with consequences both foreseeable and unknowable. Nor is this a static picture. Evidence is still being gathered, computer models improved, and data periodically reviewed and corrected. That's how science works.

But if the Nobel for the IPCC is a vote for this cumulative body of science and the scientists who produced it, I think it's important to understand that it is a very different kind of endorsement than the Nobel Prizes for the various sciences, which are awarded by the Nobel Committee of the Royal Swedish Academy of Sciences. The Peace Prize, on the other hand, is awarded by a committee selected by Norway's parliament, the Storting. This year's committee includes a university president, a theologian, and a consultant. All served in Norwegian politics or government at some point. In that light, the award to the IPCC should be viewed as a recognition of the geopolitical and world-historical importance of global warming, rather than any kind of peer review of the science behind it. My purpose in drawing this distinction is not to validate skepticism, but to suggest that this Nobel signals an evolution and perhaps even a turning point on the issue.

Now, I realize that some of my readers remain skeptical about the extent and risks of climate change, and of the very notion of a "scientific consensus." But I think we're now at a point with regard to our understanding of climate change and its risks that the main theater of activity will be political and diplomatic, rather than scientific. The scientists have communicated their conclusions, including a detailed series of reports released this year on the scientific basis, potential impacts, and mitigation strategies. It is now up to governments at all levels to decide what to do about it, based not only on the science, but on all of the other issues for which they are responsible: the prosperity, security, and well-being of their citizens, and of the world as a whole. The choices aren't as simple as they apparently seem to some, but neither can they be ignored. And that's what the committee says plainly in the concluding paragraph of the citation:

"By awarding the Nobel Peace Prize for 2007 to the IPCC and Al Gore, the Norwegian Nobel Committee is seeking to contribute to a sharper focus on the processes and decisions that appear to be necessary to protect the world’s future climate, and thereby to reduce the threat to the security of mankind. Action is necessary now, before climate change moves beyond man’s control."

Friday, October 12, 2007

Power from Space

Earlier this week a new alliance was announced to promote the exploitation of solar power from space. The Space Solar Alliance for Future Energy (SSAFE), formed by space advocacy and research organizations, is basing its initial efforts on a just-released study by the National Space Security Office. Collecting solar power in space, where it is available 24/7 and is not attenuated by atmosphere or clouds, remains one of my favorite long-term energy solutions, on a par with nuclear fusion. It has a major advantage over fusion, however. While both offer inexhaustible sources of energy, space solar power (SSP) requires no scientific breakthroughs. Despite that, its engineering and cost challenges make it unlikely that SSP could contribute significantly to terrestrial energy much before 2020.

If you're not familiar with the concepts for capturing solar power from orbit and beaming it to earth--ideas that have been refined significantly since they were first conceived in the 1960s and '70s--I encourage you to download SSAFE's feasibility study. It provides a good overview of the technology and many of the financial, logistical, diplomatic and other factors that must be addressed. Having been personally involved in the late-1990s NASA studies referenced in the document, I must say that I don't find the update to be quite as novel as the report suggests, when compared with the concepts and constraints that we examined 10 years ago. But as it points out, the energy and security context for considering this option has changed tremendously, and that may be enough to alter the ultimate conclusions about whether to proceed with such a large-scale space endeavor.

When I served on the economic evaluation team reviewing the 1990s NASA effort, two obstacles to the commerciality of SSP loomed large. In that period, ground-based power costs were low and falling--and most of the experts and economists we consulted expected that trend to continue. The high cost of power from space appeared difficult to justify outside of a few niche applications. Compounding that challenge, the construction of a fleet of power satellites capable of producing hundreds or thousands of megawatts of electricity each would require a launch capability far beyond that of the quartet of Space Shuttles--which ultimately proved barely adequate for the current Space Station--or of a second-generation shuttle. But SSP was the only application large enough to justify building such a capability, short of a major effort to colonize or industrialize earth-orbital space. Chicken and egg.

As NASA foresaw in the late '90s, advances in technology have made it possible to contemplate an SSP design that requires fewer, smaller payloads and relies almost entirely on robotic assembly. That will reduce the number of launches and virtually eliminate the need for a parallel ramp-up in manned space activities, thus improving the economics. It's not clear from my perusal of the report whether this new configuration could be placed in orbit by existing commercial launch capacity. Even if it could, the process would be lengthy and very expensive.

And although I'm intrigued by some of the novel applications for power from space described in the report, including chemical synthesis of carbon-neutral hydrocarbons, most encounter another hurdle we identified a decade ago. While there are certainly lots of cool things that you could do with power from space, there are very few applications that actually require it. I routinely receive comments from readers of this blog who are equally keen on hydrocarbon synthesis based on nuclear power, which would almost certainly be more economical than the SSP-driven version.

Finally, although our perspective on national security and its energy security dimensions has expanded since 9/11, the potential linkage of SSP to military applications raises the prospect of international opposition to a project that could probably only proceed as an international initiative. I appreciate the benefits of having an "anchor customer" that puts a high premium on the ability to deliver power to any point on the planet. The advantages for the military would also be significant, given the expense and risk it incurs delivering energy to the "battlespace." However, the basic architecture of SSP will inevitably raise concerns about its inherent military potential, whether that potential is real or merely perceived. This is an issue that would have to be navigated very cautiously, particularly since other nations with anti-satellite capabilities might regard an SSP beaming power to a war zone as a legitimate military target.

I will follow SSAFE's progress with great interest. Space solar power has enormous potential to provide useful increments of zero-emissions energy as either a reliable baseload or as a sequentially-shifting peak supply across the globe. SSP will also benefit from the steady improvements in photovoltaic technology that are making ground-based solar more competitive. It's not a short-term solution, however, and its development still faces many technical, political and permitting challenges. SSP is a valuable future option, and we'd be wise to invest to increase the value of that option and make it easier to exercise, should we ultimately choose to do so.

Thursday, October 11, 2007

Plug-In Hybrids, CAFE and LNG

One of my readers pointed out that I missed an opportunity in Tuesday's posting to consider the impact of plug-in hybrid cars (PHEVs) on an increase in the Corporate Average Fuel Economy Standard (CAFE). It's a great question, and it leads in two different but related directions. First, there's the basic problem of how to tally the energy consumption of a PHEV or any other multi-fuel vehicle, compared to a car that uses only gasoline. But looking at the impact of PHEVs also requires at least a superficial discussion of their electric power needs, from both an energy and capacity perspective. This is another one of those topics that really demands something longer than a 600-word posting, but then that's true of many aspects of our highly-interconnected energy systems.

It's rare to see a reference to PHEVs that doesn't include claims for gas mileage in excess of 100 miles per gallon. Of course, that's only the gasoline consumption, and it's premised on the average distribution curve for car travel in the US, in which a large fraction of all trips would not exceed the 30 or 40 mile all-electric driving range of planned PHEVs, before their gasoline engines kick in. That might be a reasonable way to look at it, if all we cared about were oil. But it makes little sense to ignore the fuel consumption and emissions that occur externally to the vehicle, since those also contribute to our overall energy demand, criteria pollution, and greenhouse gas output. For the sake of brevity, I'll confine this posting to looking at the energy side of this, because the emissions picture is even more complex.

Although my reader asked about the specific impact in New York City, it was easier to find consistent power statistics for the entire state. In 2005 New York's 39,122 MW of summer capacity from all sources generated 147 billion kW-hrs, net, for an average utilization of about 43%. So there's plenty of spare capacity to recharge electric vehicles, if it's managed properly so as not to interfere with peak demand from other sectors. Assuming that the state's nuclear, hydropower and coal-fired generators are usually dispatched first, the incremental demand from PHEVs would be met by natural gas turbines, which make up 42% of total NY capacity. Now put a million PHEVs on the road in New York, driving the national average of 12,000 miles per year. If half those miles were on battery power, at a typical consumption of 3 miles/kW-hr, they would use about 2 billion kW-hrs of electricity per year. That would require 250 MW of generating capacity, well within the state's overall spare capacity. It would also increase New York's annual natural gas consumption by 14 billion cubic feet, or about 1.3%.
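Readers who want to trace that arithmetic can follow this sketch. The gas-turbine heat rate is my assumption of a typical gas-fired plant, not a figure specific to New York's fleet:

```python
# Reproducing the posting's PHEV electricity and natural gas arithmetic.
CARS = 1.0e6                      # PHEVs on the road in New York (scenario)
MILES_PER_YEAR = 12_000           # national average annual mileage
ELECTRIC_FRACTION = 0.5           # share of miles driven on battery power
MILES_PER_KWH = 3.0               # typical PHEV electric efficiency
HEAT_RATE_BTU_PER_KWH = 7_000     # assumed gas-fired generation heat rate
BTU_PER_CF_GAS = 1_000            # typical energy content of natural gas

kwh = CARS * MILES_PER_YEAR * ELECTRIC_FRACTION / MILES_PER_KWH
avg_mw = kwh / 8760 / 1000        # average load, if charging spread over the year
gas_bcf = kwh * HEAT_RATE_BTU_PER_KWH / BTU_PER_CF_GAS / 1e9

print(f"Annual electricity: {kwh / 1e9:.1f} billion kWh")
print(f"Average load: {avg_mw:.0f} MW")   # ~228 MW; the 250 MW cited allows margin
print(f"Natural gas burned: {gas_bcf:.0f} Bcf/yr")
```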

What does that mean for the true fuel economy of a PHEV, as distinct from its gasoline fuel economy? Well, assuming the same 50/50 split of battery and engine usage, a PHEV traveling 100 miles would burn 1 gallon of gasoline (if its non-PHEV characteristics were similar to a Toyota Prius) and a quantity of natural gas with an energy content equivalent to about 0.9 gallons of gasoline. That means our "100 mpg" PHEV really gets around 53 mpg overall, counting both its direct gasoline and indirect natural gas consumption.
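The same assumptions yield the effective fuel economy figure, give or take a rounding difference. The energy-content and heat-rate values are standard approximations I've assumed:

```python
# "True mpg" of a PHEV over 100 miles, half battery and half engine.
GASOLINE_BTU_PER_GAL = 125_000    # assumed energy content of gasoline
HEAT_RATE_BTU_PER_KWH = 7_000     # assumed gas-fired generation heat rate
MILES_PER_KWH = 3.0               # typical PHEV electric efficiency
ENGINE_MPG = 50.0                 # Prius-like engine-mode fuel economy

gasoline_gal = 50 / ENGINE_MPG                   # 50 miles on the engine
gas_gal_equiv = (50 / MILES_PER_KWH) * HEAT_RATE_BTU_PER_KWH / GASOLINE_BTU_PER_GAL
true_mpg = 100 / (gasoline_gal + gas_gal_equiv)

print(f"Gas-equivalent gallons: {gas_gal_equiv:.2f}")
print(f"Effective fuel economy: {true_mpg:.0f} mpg")
# About 52 mpg here, close to the ~53 mpg figure above.
```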

The obvious conclusion from the above comparison is that the benefits of plug-in hybrid vehicles are highly dependent on the energy source for the electricity they consume, and that will vary significantly on a regional and local basis. Natural gas is close to a worst-case comparison, at least on cost, but it's also a very realistic comparison for the first production PHEVs we expect to see within a few years. Where they will draw on gas-fired power generation--which could be most places, initially--their energy benefits look modest, at best. (Emissions are a different story, for another day.) By the time New York state reaches its first million PHEVs, however, it should have much more renewable power available, in line with the state's Renewable Portfolio Standard. PHEVs look like an excellent outlet for the off-peak power from wind turbines--assuming enough of them are actually built.

In the meantime, if we are looking to PHEVs as a relatively painless way to achieve a new 35 mpg CAFE standard, then we need to think very carefully about the reasons we want a CAFE standard in the first place. Unless PHEVs are rolled out in parallel with large quantities of renewable power generation, we could eventually end up with 100 million of them consuming 1.4 trillion cubic feet of natural gas per year, compared to total current US gas demand of 22 trillion, with 100% of the shortfall translating into additional LNG imports. I'm not sure that would create quite the energy security benefits that many PHEV boosters expect to see.

Wednesday, October 10, 2007

The Fuel of the Future

Reports on the science behind our energy concerns have become a regular feature on NPR's "Science Friday" program, and downloading its podcasts is now an established part of my weekly routine. Its first segment last Friday focused on the pros and cons of ethanol and biodiesel and their various production techniques. These issues are important aspects of the larger questions concerning the fuel(s) of the future, which might derive from chemistry, synthetic biology, or electricity. Long experience and numerous applications bias us toward liquid fuels, and the current emphasis on biofuels reflects that, though this is not a foregone conclusion, particularly for supporters of plug-in hybrids and electric cars.

The "Science Friday" segment drew on recent articles in Wired and National Geographic, and included Professor Dan Kammen, the Director of U.C. Berkeley's Renewable and Appropriate Energy Laboratory, whom I've mentioned previously. The discussion provided many interesting perspectives on biofuels, including why it is so difficult to break down the cellulose and lignin in plant material, and how current approaches to cellulosic ethanol differ from those begun in the 1960s and '70s, and are thus more promising. They also considered the utility of ethanol as a fuel, at least during a transition to a more optimized energy carrier for transportation. Yields, energy density, and compatibility with infrastructure and end-use devices are all important considerations, along with whether government or the market will make the ultimate choice.

After listening to the program, I became curious about the comparison between transportation energy and food agriculture. Multiplying a daily diet of 2,500 Calories by 300 million of us equates to roughly 1 quadrillion BTUs per year--the energy contained in about 500,000 barrels per day of oil. Americans stand at the apex of a vast agricultural pyramid, and its net result supplies our bodies with only about 1/20th of the fuel that our cars use. Now, there are all sorts of inefficiencies on both sides of that comparison, such as those relating to our consumption of animal protein and the use of inefficient engines and other devices. But it still says something about the upper boundary of biofuels' potential, compared to the higher energy-density sources on which we currently rely.
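Those conversions are easy to verify, using the standard heat contents of a dietary Calorie (one kilocalorie, about 3.97 BTU) and a barrel of oil (about 5.8 million BTU):

```python
# Food-vs-fuel energy comparison from the text, in standard units.
CAL_PER_DAY = 2500            # dietary Calories (kcal) per person per day
POPULATION = 300e6
BTU_PER_KCAL = 3.968
BTU_PER_BARREL_OIL = 5.8e6    # barrel-of-oil equivalent

annual_btu = CAL_PER_DAY * BTU_PER_KCAL * POPULATION * 365
print(f"Food energy: {annual_btu / 1e15:.1f} quadrillion BTU/year")  # ~1.1

oil_bpd = annual_btu / 365 / BTU_PER_BARREL_OIL
print(f"Oil equivalent: {oil_bpd:,.0f} barrels per day")  # ~500,000
```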

And since liquid fuels for transportation still retain a big edge over gases and electricity, we can't ignore the current incumbent. Could the fuel of the future actually be gasoline? Take a look at last week's Economist, before you write me off as a shill for the oil industry. The article covers a variety of biofuel alternatives, starting with ethanol and working its way up the molecular chains, ultimately suggesting that something very similar to gasoline could be produced by advanced biological synthesis, and that this would provide significant advantages over current biofuels, without many of the drawbacks of petroleum products or liquids from coal. Based on my own analysis of ethanol's limitations, I agree, if the synthesis could be done economically--the $64,000 question for any of these alternatives.

Nor should we write off hydrogen entirely, as many energy pundits have. While the cost and infrastructure obstacles it faces are truly Herculean, in the long run all of those could be overcome, if the incentive were large enough. Hydrogen's fate, then, depends on an end-use application that is so superior to the internal combustion engine in performance and cost that it justifies all that extra expense and effort, including throwing away large quantities of primary energy from natural gas, nuclear power or renewables, to produce it. Today's fuel cells don't meet those criteria, but tomorrow's just might. The potential is certainly there. If it can be unlocked, then hydrogen's role as a common-denominator energy carrier made from a wide variety of primary sources--including biological synthesis--could become an advantage, rather than a drawback.

When it comes to predicting the fuel of the future, to replace petroleum and reduce our global carbon footprint, we still lack good answers. The best we can do today is to keep trying out alternatives, along the lines of the "clinical trials" suggested by Dr. Craig Venter in the Economist article, progressively refining our questions along the way. In the meantime, improving our energy efficiency would greatly reduce the height of the mountain any new fuel must climb.

On November 1, 2007 I'll be participating in a webcast discussion on "Fuels for Now and the Future" with Scott Sklar of The Stella Group, Ltd., hosted by Cleantech Collective. For more information and to register for the webcast, please follow this link.

Tuesday, October 09, 2007

Lake Wobegon CAFE

In the course of writing yesterday's posting, it occurred to me that the efforts to increase US Corporate Average Fuel Economy Standards (CAFE) face a basic problem. Notwithstanding our concerns about energy security and climate change, some simple mathematical truths have been absent from much of the public discussion on the subject of CAFE, and they have something in common with the "Lake Wobegon Effect," in which all the kids in Garrison Keillor's fictional town are above average. Raising the average fuel economy of the country's light vehicle fleet by 10 miles per gallon doesn't sound very daunting, until you start looking at what the mathematics of averages implies for the type and number of cars consumers must actually buy.

It all starts with how you go about raising the average of any set of numbers to a higher target level. There aren't many options:
  1. Add additional members that are above the new target.

  2. Remove members that fall below the current average.

  3. Various combinations of 1 & 2.

If this seems painfully obvious to you, please bear with me, because applying these simple ideas to the world of automobiles produces some awkward realities. Consider SUVs, which have been a major factor in increasing US petroleum consumption over the last two decades. Since 2004, however, sales of large SUVs have fallen. At the same time, sales of "crossover vehicles"--smaller, mostly car-based vehicles that combine features of station wagons, vans, and SUVs--are up significantly. That suggests that consumers have taken the gas price signal to heart and are downsizing their old SUVs. But will that improve overall US fuel economy, let alone move us toward a new 35-mpg standard?

Suppose that John Customer has decided to trade in his family's 2001 Ford Explorer for a 2007 Ford Edge, a fairly typical crossover. The Explorer had an EPA rating of 15 mpg; the Edge gets 20 mpg overall, using the old EPA methodology. John might assume he's doing his small part to improve the US fleet mpg by 5 mpg, while saving some money at the gas pump, but there are two problems with his logic. Since his new car's mileage falls below the current average CAFE, it will actually drag it down further--or negate the efficiency gains from a higher-mpg car sold to someone else. Looking beyond the new car fleet, which is all that CAFE measures, unless John takes his Explorer to the scrap yard on the way to the dealership to pick up his new Edge, someone else will be driving his old 15 mpg SUV for years to come. On balance, then, John's switch to a crossover might look like a modest improvement, but it doesn't even contribute to maintaining the current fuel economy average, let alone approaching a stricter target.

Nor is this problem confined to SUVs. In 2004 I bought an Acura sedan, which provides the 5-star crash safety I was seeking and handily exceeds 30 mpg on the highway. But this car came with an EPA sticker estimate of 23 mpg, and because most of my driving is around town, I have only averaged 21 mpg, despite having a light touch on the accelerator and being very sparing of the brakes. This kind of performance is typical for other popular 6-cylinder sedans, such as the Chevrolet Impala or Malibu, or Toyota Camry.

What does this suggest about the kind of cars we're all going to have to start buying, if the US adopts a 35 mpg CAFE target, on the way to moving the entire US car fleet up to that level eventually? It will require a much bigger change than just switching from SUVs to crossovers. By the millions we'll need hybrid crossovers, diesel crossovers, and maybe even diesel hybrid crossovers. Unless the half of the market buying small and large SUVs does a lot better than its current 22 mpg, the other half must achieve Prius-like efficiency of at least 48 mpg to meet a 35 mpg overall target--probably requiring some of the passenger car fleet to exceed 60 mpg. And unless we want to wait until 2030 for the performance of the entire US light vehicle fleet to reach these levels, we may also need an accelerated-retirement program for the least efficient cars on the road, based again on the simple math of averages. The bottom line is that we must all shortly start buying cars that are truly more efficient than the average, or we will have to look elsewhere for our energy and emissions savings.
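The averages math in that paragraph can be checked directly. One caveat worth noting: the 48 mpg figure uses a simple arithmetic average of the two halves of the market, while actual CAFE compliance is computed as a sales-weighted harmonic mean (averaging gallons per mile, then inverting), which makes the requirement on the non-SUV half even steeper:

```python
# What the non-SUV half of the fleet must average for a 35 mpg overall
# target, given SUVs stuck at 22 mpg, under two averaging methods.
target, suv_mpg = 35.0, 22.0

# Simple arithmetic average of two equal halves, as in the text.
arithmetic_other = 2 * target - suv_mpg
print(f"Arithmetic average: other half needs {arithmetic_other:.0f} mpg")  # 48

# Harmonic mean: average fuel consumed per mile, then invert (CAFE's method).
harmonic_other = 1 / (2 / target - 1 / suv_mpg)
print(f"Harmonic average: other half needs {harmonic_other:.0f} mpg")  # 86
```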

Monday, October 08, 2007

The Dingell Plan

Anyone looking for hints about how the Congress will address the discrepancies between the Senate and House versions of energy legislation and tackle the equally complex problem of our response to climate change should read the Wall Street Journal's extensive interview with Representative John Dingell (D-Mich.). As Chairman of the Energy and Commerce Committee of the House of Representatives, he will have enormous input into the process, and as the Journal points out, he also reflects the middle ground between the status quo and those who see the need for radical changes in the way we produce and consume energy. If Mr. Dingell's primary goal is ensuring the continued prosperity of American industry through the transition ahead, that could be our best guarantee for an approach to climate change that is both effective and cost-efficient.

Mr. Dingell seems to prefer making the costs of reducing greenhouse gas emissions explicit to the public, rather than burying them within wholesale prices via a national cap & trade mechanism. Cents per gallon of fuel would be a lot more noticeable than dollars per ton of emissions. There are good arguments on both sides--I generally come down in favor of cap & trade over a direct carbon tax--but the kind of blended approach hinted at in the interview could have a lot of advantages over either alternative alone. He also wants to stretch out the stricter fuel economy regulations passed by the Senate (35 mpg by 2020) to give US automakers time to find the mix of vehicles and technologies that will work for US consumers. Even if the US auto industry has already wasted a lot of time and false starts in that direction, exporting the remainder of it to South Korea and China would only reduce the US government's leverage on the problem.

It's significant that Mr. Dingell appears somewhat at odds with his party's leadership on many of these issues. The pending shifts in our national politics are setting up a configuration that could resemble that of the mid-1960s or late 1970s, when many of the most important debates occurred across the fault lines within the Democratic Party, rather than between the two parties. Mr. Dingell is a veteran of those periods, and his long experience and present vantage point should enable him to work across these coalitions and put his stamp on legislation enacting tougher fuel economy standards and a stronger US response to climate change.

Friday, October 05, 2007

Moral Dilemmas

The recent government crackdown in Burma (a.k.a. Myanmar) serves as another reminder of the perils of doing business in nations run by odious regimes. Oil & gas production in Burma is small by global standards, consisting of a few major gas deposits, and is chiefly of importance to its neighbor Thailand. The main offshore natural gas fields, Yadana and Yetagun, are operated by Total and Malaysia's Petronas, respectively. Chevron Corp., as the only US oil company with a stake in Burma--a 28% share of Yadana--is being harshly criticized for its association with the regime. While genuine and serious moral issues are involved, they are hardly cut and dried. Before passing judgment on these companies, we ought to consider some serious questions about the effect of their operations in such countries, and the consequences of their departure.

The SLORC junta in Burma is one of the worst governments in the world, and its calculated arrogance in brutalizing the country's universally-revered Buddhist monks is breathtaking. With its abuses of human rights, minorities, and democracy, the situation in Burma calls to mind South Africa during the Apartheid era. At that time, my former employer Texaco faced frequent calls for it either to apply more pressure on the government or cease its affiliate operations there. Aside from the obvious financial consequences, there were other impediments to leaving, including the company's responsibility to its local employees and the perceived benefits of remaining engaged. These were ultimately embodied in the internationally-recognized Sullivan Principles, later expanded into the Global Sullivan Principles. While some saw these principles as mere cover for the corporations involved, they were invaluable in helping companies define for their hosts the only acceptable conditions under which they might remain.

In Burma as elsewhere several serious considerations must be weighed, including:
  1. Does the presence of the company in the country convey legitimacy to an otherwise illegitimate government and its practices?

  2. Do the company's operations conform to international law and accepted standards for engagement in such countries, such as the Global Sullivan Principles?

  3. If the company exited the country, which would be harmed more, the regime or the population?

  4. If the company exited, would its departure materially impede the functioning of the regime, or would it be replaced by other investors that are less committed to upholding international standards?

The oil companies involved in Burma arrive at answers to the first three questions that justify remaining, while their critics assert otherwise. It's the last question that seems most relevant here, because the answer can be arrived at objectively by reviewing recent history. Chevron got its stake in Yadana when it acquired Unocal Corp. in 2005. That acquisition was hotly contested by China's CNOOC, which ultimately withdrew over the risk of US government intervention in the transaction. Given China's aggressive pursuit of global oil and gas assets and the country's other investments in Burma, it's very likely that CNOOC would be the top bidder for Yadana, were Chevron or Total to divest. That displacement could not possibly be beneficial for the Burmese people.

So what should Chevron and Total do? Burma isn't South Africa, where the majors' operations focused on refining and marketing, employing tens of thousands of locals and creating opportunities for basic education and management training. A gas field and pipeline employs many fewer people, and the companies' biggest impact is probably through the social investments they have made. Are those substantial enough to justify remaining? In many ways, divesting would be the easiest answer, especially for a non-operator like Chevron. But as a shareholder, I'm not at all sure that's the best course for either Burma or Chevron. Playing a positive role in developing the world's vital resources inevitably takes US companies to countries that don't meet our standards. Whatever shortcomings these companies may have, they are still a lot more transparent and accountable than many of the firms that would inevitably take their place. It may not represent the moral high ground, but staying and doing the good you can is at least a practical answer for a highly imperfect world.

Thursday, October 04, 2007

Electric Infrastructure

Coming from a part of the energy industry more focused on molecules than electrons, I don't claim to understand all the intricacies of the electricity transmission business. I heard from more than one attendee at last week's Herold Pacesetters Energy Conference that transmission is a critical and under-appreciated element of solving our energy problems. That rings true. Among other things, expanding the existing transmission network is essential for delivering the large quantities of wind energy needed to meet the growing web of state renewable portfolio standards (RPS). But under what circumstances should new long-distance lines be built, and when should that take place over local opposition?

A few years ago, it was trendy to suggest that distributed power would soon render long-distance, high-voltage power transmission obsolete, with the coming "smart grid" relying more on small generation close to the load, and less on big central power plants far away. I would have been in that crowd, back when I was convinced that home fuel cells were about to become mass-market. But while rooftop solar is making inroads in that direction, and could someday cover a big chunk of peak residential and business demand, distributed power today is a niche application that only substitutes for the grid on the smallest scale.

My local utility, Dominion Power, is seeking permission to build a new line to bring power into Northern Virginia from Pennsylvania and West Virginia, traversing the rapidly-growing counties of DC's outer ring of suburbs and some historical sites. (I'm learning that you can't go far in Virginia without encountering a historical site.) Yesterday's Washington Post reported that the Federal Energy Regulatory Commission (FERC) had declared the Mid-Atlantic region, including the national capital region of DC, MD and VA, as a National Interest Electric Transmission Corridor. That could lead to Dominion being given powers of eminent domain to obtain the necessary rights of way. Dominion has indicated it will not seek federal assistance and would continue to work through the state's process. Ironically, development is driving the need for more power in the region, but the impact of new power lines on the development potential of the land they would cross is an underlying cause of much of the opposition--along with concerns about the unconfirmed health impacts of high-voltage power lines.

In California, which implemented an aggressive RPS, accessing much of the state's wind resource for power generation will require new transmission lines. Until recently, the big question was who should pay for this. The state Public Utilities Commission ruled in June that it was in the public's interest as ratepayers to provide the up-front funding for power lines to connect these renewable sources to demand. Otherwise, the first wind developer in a new area would have to pay for the entire cost of the line, harming the economics of the entire project and potentially derailing it.

Installing new high-voltage lines to enable renewable energy supplies is quite different from doing so to ensure a reliable supply of cheap electricity, although I suspect that Dominion's management and most of its customers would agree that the latter phrase describes the company's primary mission pretty well. Customers whose property values are affected or who have a special interest in preserving the history and scenic beauty of the region would doubtless disagree, and if the line were going through my neighborhood, I'd be less than thrilled, myself. My regular readers are probably anticipating my usual anti-NIMBY stance, here, and they'd be right, if it were clear that Dominion had been doing all it could to avoid the necessity for this project.

Californians already pay high prices for power, with residential pricing steeply tiered and top rates hitting $0.30/kWh. They have significant incentives to conserve and invest in efficiency. Meanwhile, I'm paying under $0.10/kWh here, all-in, and the paybacks on installing more efficient appliances and lighting are pretty lengthy. Moreover, Virginia's electricity supply is about average in CO2, coming from a mix of coal and nuclear power. The new power line would bring in power from Pennsylvania, which has a similar emissions profile, and West Virginia, which is mostly coal-fired. So there's no argument for new power lines here to promote renewable energy; quite the contrary, I would say.
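To illustrate how the rate gap stretches out those paybacks, here's a hypothetical comparison. The $100 efficiency premium and 200 kWh per year of savings are made-up round numbers, not data; only the two electricity rates come from the text:

```python
# Simple payback on an efficiency upgrade at two electricity rates.
# The premium and savings figures are illustrative assumptions.
extra_cost = 100.0            # price premium of the efficient appliance, $
kwh_saved_per_year = 200.0

for label, rate in [("Virginia", 0.10), ("California top tier", 0.30)]:
    years = extra_cost / (kwh_saved_per_year * rate)
    print(f"{label} (${rate:.2f}/kWh): payback in {years:.1f} years")
```

At the tripled rate, the same upgrade pays back three times as fast, which is the whole incentive argument in a nutshell.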

Unfortunately, the projected capacity crunch occurs in 2011, which is barely enough time to get the power lines built, let alone implement a demand-side management plan on the scale necessary to avert the need. If Dominion can't build this line, it would soon be in violation of the North American Electric Reliability Council (NERC) standards that were developed after the big blackout of 2003, and that could put the power supply for parts of the federal government at risk. More broadly, most of our long-distance electric infrastructure dates back to the same era in which most of our interstate highways were built, and is similarly stretched. As with other infrastructure, we tend not to appreciate it until it fails. In light of all that, I believe the best course of action would be to let Dominion proceed--hopefully without the exercise of eminent domain--but conditioned on a serious review of their rate structure and demand-side management efforts, to make sure they aren't in the same situation again in only a few years.

Wednesday, October 03, 2007

Waiting for Sputnik

Tomorrow's fiftieth anniversary of the launch of Sputnik seems like an odd sort of event for Americans to commemorate, unless the lesson is that we can start out behind and still win. I've seen many Sputnik articles and commentaries in the past week, and one of the better ones, in Monday's Washington Post, included some useful reminders of just how wrong our immediate post-Sputnik predictions about the future turned out to be. We may not have colonies on the Moon and Mars, but, as the author put it, "Sputnik plus the Internet equals Google Maps." That sort of unpredictability of ultimate outcomes might apply equally well to our current perceptions about energy and climate change, for which a wake-up call as dramatic as Sputnik's might yet lie somewhere ahead.

Aside from ushering in the Space Age, Sputnik was one of those remarkable "caught napping" moments in US history. Although I have no recollection of the immediate aftermath, since I was an infant at the time, the shock wave from Sputnik certainly shaped my childhood and influenced my educational choices. Anyone looking for a comparable current influence from energy or the environment would come up short. Even though both of these concerns routinely make headlines, the effect of these issues on our daily lives remains modest. Our children aren't yet growing up in a nation urgently reorganizing itself to meet the challenges these issues represent, as it did around the parallel arms and space races with the Soviet Union. Many still hope they won't have to, and that oil prices will fall and climate change prove illusory.

In thinking about what a Sputnik for energy or climate might look like, it's helpful to consider why neither meets that standard, today. It has taken oil prices roughly four years to triple their former average level, and the increase owes as much to higher demand spurred by economic growth as to supply constraints. That's very different from the oil shocks of the 1970s. As to climate change, while our awareness is growing steadily, the feedback loop of cause-and-effect, response-and-reaction is longer than our attention spans. The fact that there is still argument about our contribution to the problem and what to do about it is strong evidence that we haven't reached the galvanizing moment at which all doubt is cast aside.

So what would qualify? Certainly not just a politician or media figure telling us that one or the other problem is urgent. We hear that every day about a host of issues. It would have to be something external and dramatic. On energy, the possibilities are easy to imagine: We awaken one morning to discover that Osama bin Laden has taken over a major Middle East producer, or Hugo Chavez has decided to export 100% of Venezuela's oil to China. Or the Saudis announce that their production has peaked, and oil futures soar to $200. On the climate side things are trickier. Most of the big signals only make sense as part of a larger pattern, after the fact. It's impossible to discern whether individual hurricanes, droughts, or heat waves are signposts of global warming or just random weather events. Melting icecaps may be as close as we come, but they still lack the gut-wrenching immediacy of a Sputnik, Pearl Harbor, or 9/11.

As unwelcome as that kind of surprise might be, it almost seems necessary to overcome the complacency of gradual change--the proverbial "boiling frog". Absent a Sputnik moment for energy and climate change, how do we convince the public of the need for action, for the equivalent of a Manhattan Project or Apollo Program to address these problems, if that's what it takes? Perhaps, for a change, we will have to trust in the good judgment of our fellow citizens, if we provide them all the facts--including those that do not align with our view of impending catastrophe--and lay out all of the choices and their likely consequences. That by itself might be as novel and empowering as another Sputnik.