Peak Oil Blindness

May 8, 2007

“We have only two modes—complacency and panic.”

—James R. Schlesinger, the first energy secretary, in 1977, on the country’s approach to energy

Peak Oil Blindness at the Macro-Social Level

In recent years the still minute portion of the population actively discussing the implications of the Peak Oil phenomenon has diversified. The conversation is no longer limited to retired scientists and petroleum engineers. Increasingly, articles, books, and reports are coming from informed citizens, leaders in the business sector, and, yes, even a government office or two.

On February 28, 2007, the United States Government Accountability Office (GAO) published a Report to Congressional Requesters entitled Crude Oil: Uncertainty about Future Oil Supply Makes It Important to Develop a Strategy for Addressing a Peak and Decline in Oil Production [PDF]. Although this is the first work of its kind from such a government entity, and it was welcomed by those intent on increasing public awareness of the Peak Oil phenomenon, the GAO leaves plenty of room for ambivalence. The first line of the report states: “Most studies estimate that oil production will peak sometime between now and 2040.”1

One might be tempted to question the competence of an organization that, with access to some of the best data in the country, can narrow the estimated time of arrival of an event promising unprecedented adverse consequences to a window no tighter than thirty-three years. But this is not an issue of competence. One way to interpret this presumed myopia is to treat it as the product of a system operating in a highly complex environment. At the macro-social level, Luhmann’s General System Theory provides insight into why we might expect offices such as the GAO and other governmental bodies to be restrained in their communication to the public about the Peak Oil phenomenon.

Luhmann’s sociological approach distinguishes between a system and its environment on the basis of complexity. For example, American society can be seen as a system that deals with a highly complex environment, one that includes countless changing parameters and many other diverse societies. This complexity must be represented in a much simpler form within the system if the system is to maintain a distinct identity. To reduce complexity, our society is forced to select a limited amount of information from a chaotic, extremely complex environment. This relates to Luhmann’s idea of system closure, in which society is an example of an autopoietic system, meaning there is no direct connection between the system and its environment. Instead, a system deals with its own representations of the environment.

Autopoiesis also means that the system produces the basic elements that make it up; the system and its sustaining elements emerge together. American society has a distinctive identity that we constantly reproduce in our communication and that depends on what we consider meaningful. One could argue that inexpensive fossil fuels are an integral part of that identity, and therefore any possibilities in the environment outside the system that do not support or reproduce that identity are unlikely to be selected. In addition, the existence of the Peak Oil phenomenon, and the possibility that it is a near-term event, represents reduced contingency because it limits the array of possible future actions in a fossil-fuel-dependent society. Meaning appears only against the backdrop of contingency, so anything that reduces contingency for a particular system will carry reduced meaning for it.

So it follows from Luhmann’s theory that American society, as an autopoietic system, is by its very energy-intensive design inclined toward blindness to certain stimuli in its environment. There is little doubt that the availability of cheap fossil fuels has biased many of the decisions collectively made over the last one hundred years. Our social system has evolved through this weighted process of trial and error to arrive at the particular solution we call modern society, but that does not imply that the “best” solution was chosen. It means only that all vital parts of the social system have adjusted to the current structure. This is a temporary end state, one that prevails only as long as the necessary sustaining elements exist at the required levels. Unfortunately, those elements are limited.

Luhmann offers one perspective on the macro-sociological issues surrounding the phenomenon of Peak Oil. There are assuredly others that would provide equally valid insight, but System Theory provides a salient point: in order to survive, a system must be able to deal with environmental variations. At some point the environment must be allowed to disturb the system’s inner representations. Without such disturbances, the system would be destroyed by environmental forces that would overwhelm it. One example of this is the stock market crash of 1929: stock prices had lost all relation to real value, and so the system reached a state of crisis.2

It is well known that any large-scale organization, viewed as a system, adjusts slowly to alterations in its environment. This suggests that there may be more promise in correcting our vision at the micro level. We now look at possible causes of Peak Oil blindness at the other end of the social spectrum: the individual.

Peak Oil Blindness at the Micro-Social Level

Risk perception involves the judgments people make about the severity and likelihood of events. Unfortunately, there is evidence that people’s perceptions of risk are subject to large and systematic biases.3 Understanding how we view risk helps us determine the source of these biases and the degree of disparity between the actual risk posed by an event and our perception of it. In studying the psychological mechanisms by which people evaluate the likelihood of events, Kahneman and Tversky proposed that, when faced with the difficult task of judging probability or frequency, people employ a limited number of heuristics that reduce these judgments to simpler ones. When using such heuristics, the individual may estimate probability by assessing availability, or associative distance.

A person is said to employ the Availability heuristic whenever he or she estimates frequency or probability by the ease with which instances or associations could be brought to mind. For example, one may assess the risk of heart attack among middle-aged people by recalling such occurrences among one’s acquaintances. Similarly, one may evaluate the probability that a given business venture will fail by imagining various difficulties it could encounter.4 However, availability is affected by factors other than frequency and probability. Thus, reliance on availability leads to predictable cognitive biases.

One such bias is due to the retrievability of instances, and salience is a factor in this biasing. An event is judged more likely if something associated with it stands out, or is salient. For example, the impact of seeing a house burning on the subjective probability of such accidents is probably greater than the impact of reading about a fire in the local newspaper.5

The idea of salience as a factor in how we perceive risk implies a compound effect on risk perception when in the framework of the larger social system. According to System Theory, if there did exist salient “burning houses” to serve as instances with regard to the Peak Oil phenomenon, these events would likely be simplified at the boundary between the system (society) and its environment. The effect may then be a double reduction in perceived risk, first at the macro-social level and then at the micro-social level. For instance, suppose there were situations at the international level – a complex environment for American society – showing the effects of the dwindling margin between global oil supply and demand. And suppose these situations were of sufficient salience that they could easily be brought to mind in the future; these instances might include regional conflict over diminishing petroleum resources, increasingly heated rhetoric among nation-states concerning energy, etc. At the environmental boundary, any information that did not act to sustain the system would likely be “reduced in complexity” and therefore reduced in salience for members of American society. This international news now has decreased retrievability for the individual. When using the Availability heuristic in the future concerning events associated with the Peak Oil phenomenon, this individual might judge that there is less risk associated with the phenomenon than is actually the case.

Another factor that affects biasing in our judgment is imaginability. Sometimes one has to assess the probability of an uncertain event whose instances are not stored in memory but can be generated according to a given rule. In such situations, one typically generates several instances and evaluates probability by the ease with which the relevant instances can be constructed.

Imaginability plays an important role in the evaluation of probabilities in real-life situations. The risk involved in an adventurous expedition, for example, is evaluated by imagining contingencies with which the expedition is not equipped to cope. The risk involved in an undertaking may be grossly underestimated if some possible dangers are either difficult to conceive of, or simply do not come to mind.6

In many ways the Peak Oil phenomenon is difficult to imagine. This unprecedented event promises systemic social change. The only references we have to draw from are the oil shocks of the 1970s, but it could be argued that even those events are questionable as instances of the same class as the Peak Oil phenomenon. The earlier energy crises were politically driven and were resolved through increased production from alternative petroleum reserves. The decline post-Peak will be geologically constrained and indefinite in length. There will be no alternative reserves to tap this time.

A particularly pernicious aspect of heuristics is that people are typically very confident about judgments based on them. The psychological basis for this unwarranted certainty seems to be people’s insensitivity to the tenuousness of the assumptions on which their judgments are based (in this case, the validity of the Availability heuristic). Such overconfidence is dangerous. It indicates that we often do not realize how little we know and how much additional information we need about the various problems and risks we face.7

Through empirical research,8 Slovic et al. determined that laypeople’s risk perceptions and attitudes concerning a potential hazard are measurable against the idea of “dread risk.” Dread risk is characterized at its high end by perceived lack of control, catastrophic potential, high risk to future generations, and risks that are not easily reduced. These are all potential components of a global energy crisis left unmitigated. High levels of perceived risk conflict with the individual’s desire for certainty and are thus a cause of anxiety. One way to reduce the anxiety generated by confronting uncertainty is to deny that uncertainty. The denial resulting from this anxiety-reducing search for certainty thus represents an additional source of overconfidence.9

Denial is even more problematic than the reduction of perceived risk caused by the use of heuristics. Once the individual views the potentialities associated with the Peak Oil phenomenon as uncontrollable or catastrophic, the tendency toward denial only increases the likelihood of those potentialities. By denying its existence we guarantee the absence of Peak Oil from our discourse. If Peak Oil is absent from our discourse, individuals remain uninformed about it, collective effort does not materialize to address it, resources are not allocated to mitigate it, and perceived lack of control moves that much closer to actual lack of control. We then have the beginnings of a cycle in which failing to mitigate a problem in the present reduces our agency over that problem in the future.

References:

  1. Jim Wells et al., Crude Oil: Uncertainty about Future Oil Supply Makes It Important to Develop a Strategy for Addressing a Peak and Decline in Oil Production, U.S. Government Accountability Office, Report GAO-07-283, February 2007 (www.gao.gov/new.items/d07283.pdf)
  2. George Ritzer, Modern Sociological Theory, 6th ed., McGraw-Hill, 2004
  3. Sarah Lichtenstein, Paul Slovic, Baruch Fischhoff, Mark Layman, and Barbara Combs, “Judged Frequency of Lethal Events,” Journal of Experimental Psychology: Human Learning and Memory, Vol. 4, pp. 551-578, 1978
  4. Amos Tversky and Daniel Kahneman, “Judgment under Uncertainty: Heuristics and Biases,” Science, New Series, Vol. 185, No. 4157, September 27, 1974, pp. 1124-1131
  5. Ibid.
  6. Ibid.
  7. Paul Slovic, Baruch Fischhoff, and Sarah Lichtenstein, “Rating the Risks,” Environment, Vol. 21, No. 3, pp. 14-20, 36-39, 1979
  8. Paul Slovic, “Perception of Risk,” Science, Vol. 236, pp. 280-285, 1987
  9. Paul Slovic et al., “Rating the Risks”

Tags: Building Community, Consumption & Demand, Culture & Behavior