Saturday, March 31, 2007

Central Limit Theorem and The Oil Shock Model

Frequently I will see a reference to the peak oil Hubbert curve describing it as a Gaussian or normal distribution. The reference to a Bell-shaped curve usually signifies some connection to a law of large numbers or central limit theorem argument -- or less frequently to some type of rate law. I have never bought into the classical hand-waving justification for a Hubbert curve following a normal distribution, as in my opinion it violates causality. However, I can throw a bone as to how one can follow a path to the central limit theorem just by using the oil shock model.

First consider this statement describing the central limit theorem:
The density of the sum of two or more independent variables is the convolution of their densities (if these densities exist). Thus the central limit theorem can be interpreted as a statement about the properties of density functions under convolution: the convolution of a number of density functions tends to the normal density as the number of density functions increases without bound, under the conditions stated above.
Let us parse the preceding paragraph from the Wiki definition. The first sentence basically reiterates the premise of the oil shock model. We made the assumption that the temporal dynamics of oil production rely solely on a set of random variables representing delays in extraction occurring after the discovery point. The densities (i.e. the probability density functions) of these delays follow a declining exponential in which the standard deviation equals the mean for each random variable. The second sentence indicates that the density of the sum (i.e. the sum of the variable delays) leading to a peak comes from the repeated convolution of the individual variable densities. The convenient[1] Wiki definition shows how even an arbitrarily shaped density function, when convolved against itself several times, leads to a curve that has a "normal" shape:

As a refresher, let me give examples of the random variables that the central limit theorem refers to, in specific terms relative to the oil shock model.
TimeDelay = T1 + ... + Tn, the sum of n random variables
So for n=4, we may find for a specific well that
  1. The discovery lays fallow for 3 years (T1=3) while negotiations take place for ownership, rights, permits, etc.
  2. Next, construction of the oil rigs and infrastructure takes 8 years (T2=8)
  3. After completion, it takes 5 years (T3=5) for the reservoir to reach maturation (toss in reserve growth considerations)
  4. Once pumping at full rate, the reservoir drains with a time constant of 10 years (T4=10).
In a deterministic setting, the sum of these values equals 3+8+5+10=26 years; in other words, 26 years until the reservoir output drops to 1/e of its original value.

But in a stochastic world, the individual delays turn into density functions that we characterize completely by a mean value plus the maximum entropy assumption of a standard deviation equal to that mean, i.e. a decaying exponential.

So if we pair up the four stages as two sets of convolutions, we can generate intermediate density profiles.
exp(-at) * exp(-bt) = (exp(-bt) - exp(-at))/(a-b) where * is the convolution operator
The first convolution pairs the fallow stage with the construction stage. The peak of this curve occurs away from Time=zero even though the individual exponentials have peaks at zero. This demonstrates the initial impact of the central limit theorem.

The second convolution pair calculates the maturation+extraction shift. Note that the shift away from Time=0 illustrates the impact of a continuous maturation probability density function -- extraction will not hit a peak until the area matures to an expected value.

Next, the two pairs get convolved together to show the total shift from the initial discovery delta, with a peak at around 18 years. Note that the profile continues to sharpen and become more symmetric. (We can also generate the density profile via Laplace transforms, i.e. characteristic functions in central limit theorem parlance).
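As a quick numerical illustration of the preceding argument, here is a minimal Python sketch (not part of the original derivation) that convolves the four exponential stage densities, using the hypothetical stage means from the list above, and locates the peak of the resulting profile:

    import numpy as np

    dt = 0.05                       # time step in years
    t = np.arange(0, 150, dt)       # time axis, years after discovery
    means = [3.0, 8.0, 5.0, 10.0]   # fallow, construction, maturation, extraction (years)

    # maximum entropy (exponential) density for each stage delay
    densities = [np.exp(-t / m) / m for m in means]

    # repeated convolution of the stage densities (the dt factor keeps the result normalized)
    profile = densities[0]
    for d in densities[1:]:
        profile = np.convolve(profile, d)[:len(t)] * dt

    print("area under profile ~", round(profile.sum() * dt, 3))
    print("profile peaks ~", round(float(t[np.argmax(profile)]), 1), "years after discovery")
    print("mean of the summed delays =", sum(means), "years")

The peak lands well short of the 26-year mean, right around the 18-year value quoted above, and the convolved profile already looks far more symmetric than any of the individual exponentials.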

But remember that I generated this final production profile for a single discovery delta. Throw in a range of discoveries, perhaps following the quadratic or cubic discovery model I presented recently, and you will see the classical Bell-shaped curve emerge without requiring too many convolutions.

The outcome of this derivation suggests that we can use central limit theorem arguments to "prove" the existence of a roughly Bell-shaped curve without having to precisely match a Gaussian/normal profile. And again, the causal nature of the discovery/production process prevents us from achieving an exact match in the first place.



[1] The Wiki definitions for math concepts look surprisingly useful, likely due to the fact that right-wing home-schoolers have yet to find a political agenda embedded in the symbol-heavy text.

Thursday, March 29, 2007

Cubic Discovery Model follow-up

As a followup to the cubic oil discovery model post, I figured that suggesting typical numbers would help us make sense of the scale and dynamics of the discovery profile's rise and fall -- and provide a sanity check.

First, let's recall that the earth has a surface area of approximately 500 million square kilometers. Then we make the crude assumption that we can probe for oil up to 10 Km below the surface with little trouble. This gives a total volume of 5x10^18 cubic meters. Under the order of magnitude estimate that it took approximately 100 years to explore all of this volume, we can deduce the initial and final volume/year sampled.
Integral (Volume/year) = Total Volume
So assuming a linear dimensional increase per year (caused by an ever increasing manpower supply as well as technical improvements), we likely started out in the year 1858 sampling a volume 20 meters deep and 100 Km on a side. And then we ended in 1958 sampling a volume per year the equivalent of 2 Km deep and 10,000 Km on a side. (The volumes remain more important than the relative dimensions.)
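As a quick consistency check (my own back-of-the-envelope sketch, assuming the sampled volume grows as the cube of time and that the 100-year integral equals the total explorable volume):

    total_volume = 5e18      # explorable crust volume in cubic meters (500e6 km^2 x 10 km deep)
    years = 100              # roughly 1858 to 1958

    # cubic growth V(t) = k*t^3; integral from 0 to 100 years = k * 100**4 / 4 = total_volume
    k = 4 * total_volume / years**4

    v_first = k * 1**3       # volume sampled in the first year
    v_last = k * years**3    # volume sampled in the hundredth year
    print("first-year sampled volume ~ %.0e m^3" % v_first)   # ~2e11: 20 m deep x (100 km)^2
    print("final-year sampled volume ~ %.0e m^3" % v_last)    # ~2e17: 2 km deep x (10,000 km)^2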

To thoroughly sanity check and perhaps validate these numbers, data for the actual volume explored per year would certainly help to prove or disprove the basis for the model. Barring that, the power-law increase in sampling volume seems the only reasonable explanation for why the biggest set of discoveries occurred well into the 20th century. Since the probability of discovery has to scale proportionately to the sampled volume, cubic growth remains the simplest way to defer the major discoveries until later. So the fact that the discovery model can predict the rise and fall accurately, while spanning the range of plausible sampling volumes, gives us added confidence in the general validity of the model.

Bonus round: If the earth's explorable crust contains 5x10^18 cubic meters, and the estimated total oil comes to 3 trillion barrels (0.159 cubic meters per barrel), what fractional volume does the oil displace? Answer: roughly 1x10^-7. So, on average, for every barrel discovered, you have to probe 1.66 million cubic meters.
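The bonus-round arithmetic, spelled out as a few lines of Python (using the standard 0.159 cubic meters per barrel):

    barrels = 3e12                # estimated ultimate total oil, barrels
    m3_per_barrel = 0.159         # one barrel of oil is about 0.159 cubic meters
    crust_volume = 5e18           # explorable crust volume, cubic meters
    print("fractional volume displaced by oil: %.1e" % (barrels * m3_per_barrel / crust_volume))
    print("cubic meters probed per barrel found: %.2e" % (crust_volume / barrels))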

Wednesday, March 28, 2007

Common Ground

Wedge issues divide people leading to a bifurcation in alliances or allegiances. Common ground brings people together. It didn't occur to me until fairly recently that common interests could have brought three big Peak Oil proponents together. First, Geology Professor Craig Bond Hatfield:
My second meeting with Dr. Hubbert came thirteen years later, in April 1969, at the annual meeting of the American Association of Petroleum Geologists in Dallas, Texas. We met on an elevator in the convention hotel, and I told him that I had heard his 1956 talk and asked him if he still thought that U.S. oil production was about to stop growing and start declining. He said it would start to decline within the year. We talked for maybe an hour sitting in the hotel lobby – mostly about fishing. He was an avid fisherman (fresh water lakes – not deep sea).
As a second bit of synchronicity, the publisher/editor of the most popular fresh-water fishing magazine of the 1970's and beyond, George Pazik, wrote extensively on Hubbert and tried to educate and warn his readers, shaming major media outlets of the time with his common sense insight:
Taking off on Hewett's earlier work, Hubbert used integral calculus to produce a curve that mathematically would illustrate the birth, middle years, and finally the death of an exhaustible resource, Fig. 1. Never mind the integral calculus part, this simple curve can be read by any grade school child.
Pazik himself interviewed Hubbert about Peak Oil, and no doubt shared fish tales, during the 1970's. That kind of shared encounter propagated down to a fish-crazy teenager, who became aware of an altogether different subject and years later studied the integral calculus to death. All for the love of tossing a lure into the water.

Tuesday, March 27, 2007

Bifurcation

David Roberts says this on the HuffPo:
I would have been skeptical about this even a year ago, but it looks like global warming may well turn out to be a "wedge issue" that fractures the Republican party. Them's the breaks, though. Reality, as they say, has the last laugh.
I thought all along that this would snowball into a wedge issue (and here).

And you know that the Furitanical Minionists have completely bifurcated when they start lining up behind George Monbiot (!) to try to diminish the influence of the hated Al Gore.
George Monbiot is right when it comes to his criticism of biofuels:
Which looks like the equivalent of switching to Noam Chomsky when you think that Jimmy Carter has developed some misconceptions about the Middle East conflict. It makes my brain hurt thinking about the wedge driven into their skulls.

Sunday, March 25, 2007

The Cubic Growth Discovery Model

A few months ago, I posited an oil discovery growth model which I consider follows reality better than any currently available model (which basically consists of none, since everyone bar a few contrarians seems to consider the Logistic model the end-game, and no one apparently finds the discovery phase interesting from a modelling perspective). I used a model that features a time dependence which increases quadratically. This has worked in the past for human-effected discovery phenomena such as Wiki word growth. After thinking about a solid first-principles physical basis for such a growth regime, it finally dawned on me to re-think the geometry of the problem domain. Also, even though the quadratic solution showed a nice simplicity, featuring only an amplitude and a time-scale factor, I thought the quality of the fit could have used some more work. (Note that the profile could narrow a bit for USA discoveries below.)

The insight came to me as I thought how quadratic growth could occur in various geometric contexts. Of course, quadratic growth occurs in two-dimensions in the case of a circle which has a radius that increases linearly with time:
Area(Time) ~ (a*Time)^2
Now, this makes sense from a discovery perspective in one key respect. If you consider that technology and manpower increase at least linearly with time (Moore's law notwithstanding), then the size of the discovery cross-section should at least track a radial increase. So the size of the prospected regions would follow a quadratic (n=2 power law). Each year a progressively larger geographical area gets sampled for oil until it hits a real physical constraint, i.e. the finite cumulative area available for sampling. This gives rise to the differential equation I presented last time:
d^2 Discovery(t)/dt^2 = c - a^3 * Integral(Discovery(t))
Note that I make the assumption that a sampled region generates a discovery proportional to the size of the region, or Discovery ~ SampledArea. So that makes sense from a first-principles perspective and I didn't have to drag in other productivity factors to get the squared-law to work.
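To make the quadratic-growth-with-feedback idea concrete, here is a minimal Euler-integration sketch (my own illustration; the constants c and a are arbitrary and only set the scales):

    # d^2 Discovery/dt^2 = c - a^3 * Integral(Discovery): quadratic growth damped by
    # the cumulative volume already sampled. Integrate until the rate returns to zero.
    c, a, dt = 1.0, 0.1, 0.01
    D, Ddot, cumulative = 0.0, 0.0, 0.0   # discovery rate, its derivative, cumulative discovery
    t, history = 0.0, []

    while t < 1.0 or D > 0.0:
        history.append((t, D))
        Dddot = c - a**3 * cumulative     # the negative feedback term
        Ddot += Dddot * dt
        D += Ddot * dt
        cumulative += D * dt
        t += dt

    t_peak, D_peak = max(history, key=lambda p: p[1])
    print("discovery rate peaks at t ~ %.1f and falls back to zero at t ~ %.1f" % (t_peak, t))

The rate rises as t^2 at first, then the accumulated-volume feedback bends it over and eventually drives it back to zero, at which point the linear model no longer applies.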

But then I realized that sampling actually works from a volumetric perspective as well:
Volume(Time) ~ k*Time^3
In this case, not only does the radius of a sampled area increase with time, but so does the depth. Again this makes sense, as oil prospecting has likely shown a monotonic increase in sampled depths over the years. I tried to describe this phenomenon in the following illustration.

Figure 1: Cubic model showing the two regimes, A and B


If you look at the part of the figure labelled Regime A, a time sequence of progressively increasing sampling volumes probes the "easy access" volume of size Vd. Every year, the newly probed volume gets used up and subtracts from a declining pool of remaining volume. In computer science terms, this looks an awful lot like a bin-packing problem. As the remaining bins become progressively more sparse and small, a negative feedback gets applied to the linear rate term. This effect leads to the following linear differential equation and surprisingly simple solution.

The differential equation transitions from a 3rd order solution in the case of the quadratic model to a 4th order solution in the case of this cubic model. The second term on the RHS provides the negative feedback in terms of the cumulative discovery volume (D(t)) probed.

The following figure shows the qualitative difference between the quadratic and cubic growth models:

The cubic growth model shows the expected clear narrowing of the distribution around the peak, which also seems to fit the discovery data better.
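The post shows the cubic-regime equation only as an image; by analogy with the quadratic case it presumably reads d^3 Discovery/dt^3 = c - a^4 * Integral(Discovery(t)), which differentiates to the 4th-order equation mentioned above. Under that assumption, and with the same arbitrary constants as before, a small sketch comparing the relative widths of the two regimes:

    def simulate(order, c=1.0, a=0.1, dt=0.01):
        """Euler-integrate d^order D/dt^order = c - a^(order+1) * Integral(D).
        order=2 gives the quadratic model, order=3 the cubic model."""
        derivs = [0.0] * order              # D and its first (order-1) derivatives
        cumulative, t = 0.0, 0.0
        times, rates = [], []
        while t < 1.0 or derivs[0] > 0.0:   # stop when the rate first returns to zero
            times.append(t)
            rates.append(derivs[0])
            top = c - a**(order + 1) * cumulative          # highest derivative, from the feedback
            derivs = [d + dn * dt for d, dn in zip(derivs, derivs[1:] + [top])]
            cumulative += derivs[0] * dt
            t += dt
        return times, rates

    for order, name in [(2, "quadratic"), (3, "cubic")]:
        times, rates = simulate(order)
        peak = max(rates)
        t_peak = times[rates.index(peak)]
        above_half = [tm for tm, r in zip(times, rates) if r >= peak / 2]
        fwhm = above_half[-1] - above_half[0]              # full width at half maximum
        print("%9s model: peak at t ~ %5.1f, FWHM/peak-time ~ %.2f" % (name, t_peak, fwhm / t_peak))

The FWHM-to-peak-time ratio gives a crude measure of the narrowing around the peak that the figure illustrates.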

We should worry about the region to the right of the figure, where the discovery model goes to zero while the real data indicates substantial finite discoveries. But this has a fairly obvious interpretation when we refer back to Figure 1. Consider that the cubic discovery model needs a constraint related to a finite pool of easily discoverable oil -- i.e. Regime A in the figure. Yet we don't really know the exact extent of this finite volume, and especially what lies beyond the easily accessible volume. In particular we don't know how deep the volume goes. To model this, we add an extra regime, Regime B, which lies underneath the "easy access" volume (or in more inhospitable offshore or polar regions).

Unfortunately, the remaining "difficult access" region gets progressively harder to probe, likely in proportion to 1/Depth of the unknown area (i.e. we have to work harder for the region at time T6 than for the shallower region at time T5). We also can no longer make increasingly large volume projections as in the cubic Regime A, since we have pretty much exhausted the entire planet in terms of areal coverage. So at best, any technological increases that track linearly with time get offset by a factor proportional to the inverse of the depth, leading to a discovery rate proportional to the amount remaining. This leads to the 1st-order differential equation:
Discovery(t) = c0 - a0*Integral(Discovery(t))
Differentiating once gives dDiscovery/dt = -a0*Discovery(t), so the solution follows a simple decaying exponential -- in other words, the only part of the Logistic model that makes sense.

I didn't spend too much time fitting the declining exponential to real world data as we have limited data here and clearly it won't make much difference to the overall future production of quality crude.

If this model indeed effectively describes discovery, why should it work? My thinking goes like this: we collectively have one of the most highly sampled data sources in human history. Every region of the globe has gotten probed, either randomly or systematically, such that few unexplored regions remain on Earth. The cubic discovery model basically puts a mathematical basis under a blind man's dart game. I contend once again that we do not have to understand much about the geology of oil deposits. This model would probably work just as effectively if we happened to start looking for expensive buried coins on a hypothetical previously unexplored beach, where we could start by digging by hand, then using a metal detector, and finally using earth-moving equipment and a sifter (you get the idea). This essentially explains why all the unique elements of petroleum geology get washed out; in the end, if we deal with a sparsely populated sampled system, this kind of math should work out fine.



I combined the cubic growth discovery model with the oil shock depletion model via a convolution and came up with this model fit for the global oil production curve. I used 12.5 years for each of the shock model lags, which you can see in the progressively deeper-shaded curves from discovery to production. (Caveat: I only used Regime A of the cubic discovery model, so the discovery decay dies out rapidly around the year 2000.)
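The convolution step itself is easy to sketch. Here is a minimal version using a placeholder rise-and-fall discovery pulse (a stand-in for the cubic-model discovery profile used in the actual fit; its shape and dates are invented for illustration) and the four 12.5-year exponential lags:

    import numpy as np

    dt = 0.25
    t = np.arange(0, 300, dt)              # years after the 1858 origin

    # placeholder discovery profile: a simple rise-and-fall pulse, normalized
    # to unit cumulative discovery (stand-in for the cubic-model output)
    discovery = np.where(t < 160, t * (160 - t), 0.0)
    discovery /= discovery.sum() * dt

    # four shock-model stages (fallow, construction, maturation, extraction),
    # each an exponential density with a 12.5-year mean
    production = discovery
    for mean in [12.5, 12.5, 12.5, 12.5]:
        stage = np.exp(-t / mean) / mean
        production = np.convolve(production, stage)[:len(t)] * dt

    print("discovery peaks  %5.1f years after the origin" % t[np.argmax(discovery)])
    print("production peaks %5.1f years after the origin" % t[np.argmax(production)])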

Is this the only first-principles model for global oil discovery and production?


Note that at the time origin in the year 1858, I have no choice but to place the original discovery stimulus for Titusville, PA. If this deterministic stimulus did not exist, the oil production profile would not match the model, which naively would suggest a starting value of 0 barrels/year. This adds an extra variable to the model, but the residual error only affects the early years, and nothing much past 1900.

Saturday, March 24, 2007

Deforestation

Monkeygrinder found an interesting reference to deforestation occurring in Brazil to make room for sugar cane. This complements the interesting find that Khebab made of Borneo deforestation to make room for oil palm.

I tried to find a "good" example of jungle clearing in Brazil via Google Earth. This Google landmark fit the description:
If you see north west and south east of this placemark you will find a HUGE destruction of the amazon forest spreading just like a virus. Unfortunately this is us in action, the same civilization capable of creating such wonderfull things as google earth, is able to close the loop and show us some of our worst atributes.

An area of Amazon jungle larger than the U.S. state of New Jersey has been destroyed this year and work on a new highway is mainly to blame, environmental group Friends of the Earth and the government said on Wednesday.

The preliminary figures, based on satellite images, alarmed environmentalists because they suggest that Amazon destruction has surpassed its second-highest level reached in 2002-2003.

The data is based on a satellite system which has been monitoring Amazon deforestation on a test basis. The government's yearly figures, released in March, are based on data from a different satellite system.

The images indicated that from 8,920 square miles to 9,420 square miles (23,100 sq km to 24,400 sq km), or an area bigger than New Jersey, was cut down this year, said Joao Paulo Capobianco, the government's secretary of biodiversity and forests.

If confirmed, the total figure for this year's deforestation will be above the 2002-2003 level of 9,170 square miles (23,750 sq km), said Roberto Smeraldi, head of Friends of the Earth in Brazil.

The figure was especially worrying because it showed that for the first time in history Amazon deforestation rose despite a slowdown in agriculture during the year, he said.

A record level was set in the mid-1990s in a year marked by an exceptional incidence of fires.

Small farmers have been major culprits in the trend as they hack away at Amazon jungle to expand their fields.

The data showed a big jump in deforestation along a road running through the heart of the Amazon that the government has said it wants to pave.

'The big reason for this (destruction) is the BR-163 road,' Smeraldi said. 'The government knew about this; it was warned. What is surprising is that they are not even talking about their anti-deforestation plans.'

In the region of the road, deforestation soared by more than five times, Smeraldi said. Settlers have moved in even before the government started paving it.


Environmentalists have warned that roads, dams and pipeline projects through the Amazon -- home to up to 30 percent of the planet's animal and plant species -- represent the biggest threat to the forest because they open up access to large-scale development and settlement.

This snapshot spans an area 200 miles on a side.

Sunday, March 18, 2007

A Stinkbomb

"800 years of bubonic plague caused global cooling. Can you imagine the amount of methane that the dying bodies released?"
Paraphrased from an unidentified, apparently serious elderly woman caller to Drudge's radio show, who also referred to Al Gore as a "snake-oil salesman".

This occurred after reporter Mike Allen from politico.com had told Drudge that his audience consisted of the most intelligent listeners in radio.

Not true, Mike: Drudge's show attracts listeners who appreciate unintended comedy. Much like The Great Global Warming Swindle documentary does.

Friday, March 16, 2007

Walk the walk

David Roberts at HuffPo explains why it pays to use some humor when explaining or debating technical issues. Gavin Schmidt from RealClimate realized that he could not keep up rhetorically with the show-biz Crichton and Stott. Evidently these two may talk the talk, but they don't walk the walk:
One minor detail that might be interesting is that the organisers put on luxury SUVs for the participants to get to the restaurant - 5 blocks away. None of our side used them (preferring to walk), but all of the other side did.
Pertaining to the deniers, "If you get in the mud with a pig, both you and the pig will get dirty, but the pig will enjoy it"

Piggly Wiggly
"The original Nasa data was very wiggly-lined and we wanted the simplest line we could find," Mr Durkin said.

Wednesday, March 14, 2007

Thinking doesn't make it so

Nora Ephron skewered power-of-positive-thinking experts and cornucopians in one fell swoop in a recent HuffPost:
So I had lots of fun saying that I was going to do something about the oil crisis and talking about ethanol. Nobody really understands ethanol. Nobody really understands it takes more energy to make ethanol than it actually saves. But who cares? It sounds good and therefore it's good.
So what could it hurt to add my own bit of positive thinking to the mix? A commenter at TOD named "memmel" claimed climate change models had approximately the same level of complexity as oil depletion models. No way! I contend that scientifically modelling peak oil actually shows orders of magnitude less complexity than predicting global warming. Consider this self-help motto:
  • Oil Depletion: An exercise in extraction of fluids from a container.
  • Climate Change: An exercise in non-linear fluid dynamics of N-dimensionality.
Which one sounds more difficult to make sense of?

I know it has nothing to do with coming up with new forms of renewable energy, but for this small corner of the simulation universe we can hold out hope to make sense out of nonsense and extract signals from the noise. As Robert Rapier indirectly points out, why use empirical formulas when we have a fighting chance to use some real theory?

Thanks monkeygrinder for the pep-talk.

Monday, March 12, 2007

The Great Rock & Role Swindle

For yucks, you can't beat this "expose" on global warming:
The Great Global Warming Swindle

Johnny Rotten in another few years, as played by Harry Shearer and rechristened Lord of Blaby, implicates dear Lady Thatcher for past deeds in promoting the global warming "industry". Stand down, Margaret.


At times the documentary looked like a cross between a Monty Python skit and Shearer's latest "For Your Consideration".

I never realized that cloud formation -- low cloud formation, or something -- derives from galactic cosmic radiation. Who knew?

Saturday, March 10, 2007

LOLN

The Oil Shock model at its core consists of a time-stochastic phasing from discovery of oil regions to their maturation. For example, the Shaybah field in Saudi Arabia, though discovered in 1968, only came on line in 1998 and hasn't matured yet. Located in the "Empty Quarter", a particularly desolate and imposing place, this field provides a typical example of how the latency of each of the stages adds up to explain the shift from the discovery curve to the production curve. Although we can't say exactly how long the field remained fallow, or how long it took to construct the rigs, or how long the maturation process took, or even estimate the extraction rate, a global model would require a spread of these values representing the uncertainty/variability of these numbers from location to location and economy to economy. A good conservative estimator of the phases would lead one to guess at a mean with a standard deviation equal to the mean -- this becomes a decaying exponential distribution of latencies. The convolution of this set of exponentials generates the shifted and spread production curve originating from the tighter discovery profile.

The Law of Large Numbers

The Oil Shock model draws on a law of large numbers argument: given that enough wells exist, I assert the model reflects reality.
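A small Monte Carlo sketch of that argument (my own illustration, reusing the hypothetical stage means from the central limit theorem post above): draw the stage delays for each simulated field from exponential distributions and watch the aggregate histogram settle down as the number of fields grows.

    import random

    random.seed(0)
    MEANS = [3.0, 8.0, 5.0, 10.0]   # hypothetical fallow/construction/maturation/extraction means
    BIN, HORIZON = 2.0, 120.0       # histogram bin width and cutoff, in years

    def aggregate_density(n_fields):
        """Histogram (as a density) of the total discovery-to-depletion delay over n_fields fields."""
        bins = [0] * int(HORIZON / BIN)
        for _ in range(n_fields):
            total = sum(random.expovariate(1.0 / m) for m in MEANS)
            if total < HORIZON:
                bins[int(total / BIN)] += 1
        return [b / (n_fields * BIN) for b in bins]

    for n in (100, 10000, 1000000):
        density = aggregate_density(n)
        peak_bin = density.index(max(density))
        print("%8d fields: density peaks in the %.0f-%.0f year bin"
              % (n, peak_bin * BIN, (peak_bin + 1) * BIN))

With a hundred fields the peak bin bounces around from run to run; with enough fields the histogram settles onto the smooth convolved profile that peaks around 18 years -- the law of large numbers doing its job.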

For an example of someone trying to look at every deterministic bump, check out Stuart Staniford's analysis of recent Saudi production. This looks like tricky stuff; I might touch it with a 10-foot pole but wouldn't defend any trend it produced. I might have better success predicting the next location of a water strider. In other words, a sure trend could turn into the equivalent of Brownian motion.

Sunday, March 4, 2007

Sludge

Thanks to the Matt Sludge Report, I see a writer (Jad Mouawad) from the NY Times thinks that we will beat Peak Oil by relying on sludge. Note the bolded portion of the article: Oil Innovations Pump New Life Into Old Wells
BAKERSFIELD, Calif. — The Kern River oil field, discovered in 1899, was revived when Chevron engineers here started injecting high-pressured steam to pump out more oil. The field, whose production had slumped to 10,000 barrels a day in the 1960s, now has a daily output of 85,000 barrels.

In Indonesia, Chevron has applied the same technology to the giant Duri oil field, discovered in 1941, boosting production there to more than 200,000 barrels a day, up from 65,000 barrels in the mid-1980s.

And in Texas, Exxon Mobil expects to double the amount of oil it extracts from its Means field, which dates back to the 1930s. Exxon, like Chevron, will use three-dimensional imaging of the underground field and the injection of a gas — in this case, carbon dioxide — to flush out the oil.

Within the last decade, technology advances have made it possible to unlock more oil from old fields, and, at the same time, higher oil prices have made it economical for companies to go after reserves that are harder to reach. With plenty of oil still left in familiar locations, forecasts that the world’s reserves are drying out have given way to predictions that more oil can be found than ever before.

In a wide-ranging study published in 2000, the U.S. Geological Survey estimated that ultimately recoverable resources of conventional oil totaled about 3.3 trillion barrels, of which a third has already been produced. More recently, Cambridge Energy Research Associates, an energy consultant, estimated that the total base of recoverable oil was 4.8 trillion barrels. That higher estimate — which Cambridge Energy says is likely to grow — reflects how new technology can tap into more resources.

“It’s the fifth time to my count that we’ve gone through a period when it seemed the end of oil was near and people were talking about the exhaustion of resources,” said Daniel Yergin, the chairman of Cambridge Energy and author of a Pulitzer Prize-winning history of oil, who cited similar concerns in the 1880s, after both world wars and in the 1970s. “Back then we were going to fly off the oil mountain. Instead we had a boom and oil went to $10 instead of $100.”

There is still a minority view, held largely by a small band of retired petroleum geologists and some members of Congress, that oil production has peaked, but the theory has been fading. Equally contentious for the oil companies is the growing voice of environmentalists, who do not think that pumping and consuming an ever-increasing amount of fossil fuel is in any way desirable.

Increased projections for how much oil is extractable may become a political topic on many different fronts and in unpredictable ways. By reassuring the public that supplies will meet future demands, oil companies may also find legislators more reluctant to consider opening Alaska and other areas to new exploration.

On a global level, the Organization of the Petroleum Exporting Countries, which has coalesced around a price of $50 a barrel for oil, will likely see its clout reinforced in coming years. The 12-country cartel, which added Angola as its newest member this year, is poised to control more than 50 percent of the oil market in coming years, up from 35 percent today, as Western oil production declines.

Oil companies say they can provide enough supplies — which might eventually lead to lower oil and gasoline prices — but that they see few alternatives to fossil fuels. Inevitably, this means that global carbon emissions used in the transportation sector will continue to increase, and so will their contribution to global warming.

The oil industry is well known for seeking out new sources of fossil fuel in far-flung places, from the icy plains of Siberia to the deep waters off West Africa. But now the quest for new discoveries is taking place alongside a much less exotic search that is crucial to the world’s energy supplies. Oil companies are returning to old or mature fields partly because there are few virgin places left to explore, and, of those, few are open to investors.

At Bakersfield, for example, Chevron is using steam-flooding technology and computerized three-dimensional models to boost the output of the field’s heavy oil reserves. Even after a century of production, engineers say there is plenty of oil left to be pumped from Kern River.

“We’re still finding new opportunities here,” said Steve Garrett, a geophysicist with Chevron. “It’s not over until you abandon the last well, and even then it’s not over.”

Some forecasters, studying data on how much oil is used each year and how much is still believed to be in the ground, have argued that at some point by 2010, global oil production will peak — if it has not already — and begin to fall. That drop would usher in an uncertain era of shortages, price spikes and economic decline.

“I am very, very seriously worried about the future we are facing,” said Kjell Aleklett, the president of the Association for the Study of Peak Oil and Gas. “It is clear that oil is in limited supplies.”

Many oil executives say that these so-called peak-oil theorists fail to take into account the way that sophisticated technology, combined with higher prices that make searches for new oil more affordable, are opening up opportunities to develop supplies. As the industry improves its ability to draw new life from old wells and expands its forays into ever-deeper corners of the globe, it is providing a strong rebuttal in the long-running debate over when the world might run out of oil.

Typically, oil companies can only produce one barrel for every three they find. Two usually are left behind, either because they are too hard to pump out or because it would be too expensive to do so. Going after these neglected resources, energy experts say, represents a tremendous opportunity.

“Ironically, most of the oil we will discover is from oil we’ve already found,” said Lawrence Goldstein, an energy analyst at the Energy Policy Research Foundation, an industry-funded group. “What has been missing is the technology and the threshold price that will lead to a revolution in lifting that oil.”

Nansen G. Saleri, the head of reservoir management at the state-owned Saudi Aramco, said that new seismic tools giving geologists a better view of oil fields, real-time imaging software and the ability to drill horizontal wells could boost global reserves.

Mr. Saleri said that Saudi Arabia's total reserves were almost three times higher than the kingdom's officially published figure of 260 billion barrels, or about a quarter of the world's proven total.

He estimated the kingdom’s resources at 716 billion barrels, including oil that has already been produced as well as more uncertain reserves. And thanks to more sophisticated technology, Mr. Saleri said he “wouldn’t be surprised” if ultimate reserves in Saudi Arabia eventually reached 1 trillion barrels.

Even if the Saudi estimates are impossible to verify, they underline the fact that oil companies are constantly looking for new ways to unlock more oil from the ground.

At the Kern River field just outside of Bakersfield, millions of gallons of steam are injected into the field to melt the oil, which has the unusually dense consistency of very thick molasses. The steamed liquid is then drained through underground reservoirs and pumped out by about 8,500 production wells scattered around the field, which covers 20 square miles.

Initially, engineers expected to recover only 10 percent of the field’s oil. Now, thanks to decades of trial and error, Chevron believes it will be able to recover up to 80 percent of the oil from the field, more than twice the industry’s average recovery rate, which is typically around 35 percent. Each well produces about 10 barrels a day at a cost of $16 each. That compares with production costs of only $1 or $2 a barrel in the Persian Gulf, home to the world’s lowest-cost producers.

Chevron hopes to use the knowledge it has obtained from this vast open-air, and underground, laboratory and apply it to similar heavy oil fields around the world. It is also planning a large pilot program to test the technology in an area between Saudi Arabia and Kuwait, for example.

Oil companies have been perfecting so-called secondary and tertiary recovery methods — injecting all sorts of exotic gases and liquids into oil fields, including water and soap, natural gas, carbon dioxide and even hydrogen sulfide, a foul-smelling and poisonous gas.

Since the dawn of the Petroleum Age more than a century ago, the world has consumed more than 1 trillion barrels of oil. Most of that was of the light, liquid kind that was easy to find, easy to pump and easy to refine. But as these light sources are depleted, a growing share of the world’s oil reserves are made out of heavier oil.

Analysts estimate there are about 1 trillion barrels of heavy oil, tar sands, and shale-oil deposits in places like Canada, Venezuela and the United States that can be turned into liquid fuel by enhanced recovery methods like steam-flooding.

“This is an industry that moves in cycles, and right now, enormous amounts of innovation, technology and investments are being unleashed,” said Mr. Yergin, the author and energy consultant.

After years of underinvestment, oil companies are now in a global race to increase supplies to catch the growth of consumption. The world consumed about 31 billion barrels of oil last year. Because of population and economic growth, especially in Asian and developing countries, oil demand is forecast to rise 40 percent by 2030 to 43 billion barrels, according to the Energy Information Administration.

Back in California, the Kern River field itself seems little changed from what it must have looked like 100 years ago. The same dusty hills are now littered with a forest of wells, with gleaming pipes running along dusty roads. Seismic technology and satellites are now used to monitor operations while sensors inside the wells record slight changes in temperature or pressure. Each year, the company drills some 850 new wells there.

Amazingly, there are very few workers in the field. Engineers in air-conditioned control rooms can get an accurate picture of the field’s underground reservoir and pinpoint with accuracy the areas they want to explore. None of that technology was available just a decade ago.

“Yes, there are finite resources in the ground, but you never get to that point,” Jeff Hatlen, an engineer with Chevron, said on a recent tour of the field.

In 1978, when he started his career here, operators believed the field would be abandoned within 15 years. “That’s why peak oil is a moving target,” Mr. Hatlen said. “Oil is always a function of price and technology.”
Too bad we have completely debunked the reserve growth cornucopians, and that recent CERA publicity has led this ignorant journalist to rely on cherry-picked examples. The Kern River field remains a huge anomaly that even Attanasi & Root kept out of their reserve growth prognostications.

Derivation of Logistic Growth versus Groping in the dark

Since many peak oil analysts like to use Logistic growth to model peak, for Hubbert Linearization, etc., I thought I would give the standard derivation of the equation a run-through. Of course, the logistic formulation comes from studies of population dynamics, where the rates of birth and death follow strictly from the size of the population itself. This makes sense from the point of view of a multiplying population, but not necessarily for inanimate pools of oil. In any case, the derivation starts with two assumptions, the birth and death rates:
B = B0 - B1*P
D = D0 + D1*P
We base the entire premise on the negative sign of the second term in the birth rate -- in the event of limited resources such as food, the birth rate can only decrease with the size of the population (and the death rate correspondingly increases).

The next step involves writing the equation for population dynamics as a function of time.
dP/dt = (B-D)*P
This provides the underpinnings for exponential growth, however critically modulated by the individual birth and death functions. So if we expand the population growth rate, we get:
dP/dt = (B0-B1*P-D0-D1*P)*P = (B0-D0)*P - (B1+D1)*P^2
which matches the classic Logistic equation formulation:
dP/dt = rP*(1-P/Pinfinity)
where Pinfinity becomes the carrying capacity of the environment. So the leap of faith needed to apply this to oil depletion comes about from analogizing population to a carefully chosen resource variable. The one that history has decided to select, cumulatively extracted oil, leads to the classical bell-shaped curve for the instantaneous extraction rate, i.e. the derivative dP/dt. (Note that we can throw out the death term because it doesn't really mean anything.)
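For reference, a short numerical sketch (my own, with arbitrary constants) of what this equation produces: the cumulative P traces the S-shaped logistic while the rate dP/dt traces the bell-shaped Hubbert curve, peaking when P reaches half the carrying capacity.

    # dP/dt = r*P*(1 - P/P_inf), integrated with a simple Euler step
    r, P_inf = 0.08, 2000.0     # arbitrary growth rate (1/yr) and carrying capacity
    P, dt = 1.0, 0.1            # note: a nonzero seed is required -- the curve never leaves P = 0
    samples = []
    for step in range(3000):
        rate = r * P * (1 - P / P_inf)
        samples.append((step * dt, rate, P))
        P += rate * dt

    t_peak, rate_peak, P_at_peak = max(samples, key=lambda s: s[1])
    print("rate peaks at %.1f per year, at t ~ %.0f years, when cumulative P ~ %.0f (half of %.0f)"
          % (rate_peak, t_peak, P_at_peak, P_inf))

The need for a nonzero seed value mirrors the interpretation problem discussed below: the logistic form only grows if something already exists to grow from.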

I have always had issues with both the upward part of the logistic curve derivative and the decline part. Trying to rationalize why instantaneous production would initially rise proportionally to cumulative production only makes sense if oil itself drove the exponential growth. But we know that oil does not mate with itself as biological entities would, so the growth really has to do with human population increase (or oil corporation growth) causing the exponential rise. That remains a big presumption of the model. The decline has a significant interpretation hurdle as well. Exactly why the rate of growth picks up that peculiar non-linear modifier as we approach and bypass peak doesn't make a lot of sense; the human population hasn't stabilized as of yet (even though oil company growth certainly has, technically declining significantly through mergers and acquisitions). We really have to face that a lot of apples-and-oranges assumptions flow into this interpretation.

In the end, using the Logistic curve only makes sense as a cheap heuristic, something that we can get a convenient analytical solution from. It fits into the basic class of solutions similar to the "drunk looking for his car-keys under the lamp-post" problem. Somebody asks the drunk why he chose to look under the lamp-post. "Of course, that's where the light is". I have fundamental problems with this philosophy and have made it a challenge to myself to seek something better; if that means groping around in the dark, what the heck.