It's tough to convince people that there is a need for a radical change in their attitudes and behavior in relation to energy issues. Cognitive mechanisms that produce beneficial results under most circumstances can lead to an irrational resistance (if not outright immunity) to any kind of argument aimed at our own beliefs about peaking global oil production, renewable energy resources, and the continued viability of nuclear power. In addition, our brains are essentially lazy - we like to take cognitive “shortcuts” and are often reluctant to engage in meaningful thought about seemingly abstract issues, especially when we are distracted by much more immediate concerns in our own personal lives.
A number of psychological theories and models deal with these phenomena, probably the best known being the theory of cognitive dissonance (Festinger 1957). Put simply, any new bit of information we absorb can stand in one of three possible relations to what we already accept as true: it can agree with our beliefs (consonance); it can be unrelated to our beliefs (irrelevance); or it can contradict our beliefs (dissonance). For example, let's say you believe that economic growth is a good thing and a natural state of affairs because so far, that's what everybody has been telling you. Based on that belief, you have built a whole network of related beliefs and expectations about how your life is going to develop.
Now suppose a friend comes along and tells you about a documentary he saw on TV, dealing with the possibility that new oil fields will be discovered by increased offshore drilling. Since this is completely unrelated to any other belief you hold, you have no reason to doubt the veracity of your friend's story; most likely, you will assume that what he told you is true and possibly even integrate it into your own store of encyclopedic knowledge. Similarly, if you were told that poverty levels are decreasing in China because of an unprecedented growth in exports, you would probably accept that as being true because it fits the assumptions you already have about how the world works.
But now imagine what you were told was that not only is economic growth not a natural state of affairs, it has actually turned into a threat to global well-being and will end soon because of severe energy resource constraints. Accepting this would not only mean revising one single belief - it would entail rebuilding large parts of your belief system from scratch, including such essential things as the utility of the professional career you have chosen and the security of your financial assets (or even your future food supply). You cannot accept both your existing beliefs and the new bit of information as true since they are fundamentally contradictory; thus, you are in a state of cognitive dissonance. This is unpleasant at best, and most people will try to resolve this and return to peaceful consonance as quickly as possible.
The problem is that this attempt to resolve cognitive dissonance is not normally based on a rational assessment of the facts. Faced with a choice between rejecting a single proposition or revising everything you thought you knew, you are likely to choose the former just because it is so much more convenient. This does not even have to be a conscious choice: by making use of what is known as selective perception, your brain may just decide to make things easier for you by keeping dissonant bits of information below your threshold of awareness. And if it's too late for that, you can still resort to rationalization: for example, you could try to justify your continued belief in unlimited economic growth by coming up with something like “Of course our energy resources are finite, but we'll just substitute whatever stuff we run out of with something else. That has worked fine until now, so it'll probably work in the future”. By doing this, you create a situation in which you can accept both the new (formerly conflicting) bit of information and your old beliefs as true. Of course, integrating old and new knowledge is not inherently bad - the problem is that people will tend to accept quite unlikely assumptions (like infinite substitutability) in order to be able to refrain from turning their lives upside down.
Things are further complicated by what psychologists call “ego involvement” (Sherif &amp; Hovland 1961). Before looking at this in more detail, let's establish some technical terms: we will call a person's own position on a particular issue an “anchor”. Sticking to the example given above, that could be something like “Resources are infinitely substitutable without loss of utility”. Around each anchor there is a continuum of “latitudes” describing deviating positions; like the anchors themselves, these latitudes vary considerably from person to person. The “latitude of acceptance” covers positions that differ from the anchor but are viewed as acceptable without further argument - say, the position that resources are not actually infinitely substitutable, but that there are so many possible substitutions that for all practical purposes, this is indistinguishable from being infinite.
The “latitude of rejection” describes the exact opposite: this refers to positions that are rejected outright, without any serious review - say the assumption that substitutability is actually a very rare phenomenon and most resources cannot be readily replaced by something equally useful. Finally, the “latitude of non-commitment” describes positions that could either be accepted or rejected, pending successful or unsuccessful persuasion - in our example, this might be something like “Even if most resources were infinitely substitutable, we should not waste them unnecessarily”.
Now, depending on your ego involvement, your latitude landscape will be different. If you are highly ego involved regarding an anchor, you view the issue as very important and your stance on it contributes significantly to your self-identity. As a result, you have a very narrow latitude of acceptance, an almost non-existent latitude of non-commitment, and a very large latitude of rejection. It will thus be quite hard to actually convince you to change your beliefs - in all likelihood, that would require starting small, first aiming at widening the latitude of non-commitment rather than directly addressing any big issues. Two further psychological mechanisms are worth noting here: the “contrast effect” occurs whenever a proposition falls within someone's latitude of rejection - it will be perceived as considerably more distant from that person's own position than it actually is. The “assimilation effect” is the exact opposite: a proposition that falls within someone's latitude of acceptance will be perceived as significantly closer to that person's own position than it actually is. Also note that in cases of very high ego involvement, persuading someone to change his mind may be virtually impossible, no matter how rationally convincing your arguments are.
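The latitude model described above can be made concrete with a small toy sketch. Note that every number in it - the 0-to-1 position scale, the latitude widths, the distortion factors - is an illustrative assumption of mine, not a value from Sherif and Hovland's theory; the point is only to show how ego involvement narrows the latitudes and how assimilation and contrast distort perceived distance.

```python
# Toy sketch of the latitude model (all numeric values are
# illustrative assumptions, not part of the original theory).
# Positions on an issue are points on a 0..1 continuum.

def latitudes(ego_involvement):
    """Return (accept, noncommit) half-widths around an anchor.

    Higher ego involvement narrows both bands; everything beyond
    the non-commitment band falls into the latitude of rejection.
    """
    accept = 0.20 * (1.0 - ego_involvement)           # latitude of acceptance
    noncommit = accept + 0.30 * (1.0 - ego_involvement)
    return accept, noncommit

def judge(anchor, position, ego_involvement):
    """Classify a position relative to an anchor, and model
    assimilation/contrast as a distortion of perceived distance."""
    accept, noncommit = latitudes(ego_involvement)
    distance = abs(position - anchor)
    if distance <= accept:
        # assimilation effect: accepted positions seem closer than they are
        return "accept", distance * 0.5
    if distance <= noncommit:
        return "non-commitment", distance
    # contrast effect: rejected positions seem farther than they are
    return "reject", distance * 1.5

# A mildly involved person is at least open to a moderate deviation...
print(judge(0.5, 0.7, ego_involvement=0.2))
# ...while a highly involved person rejects the very same position.
print(judge(0.5, 0.7, ego_involvement=0.9))
```

Running the last two lines shows the same position landing in the latitude of non-commitment for low involvement but in the latitude of rejection (with an inflated perceived distance) for high involvement.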
Cognitive dissonance, latitudes, and ego involvement are concepts that can be used to describe overlapping aspects of how people deal with incoming information that conflicts with what they already believe. However, the likelihood of someone accepting a proposition as valid depends on more than his current set of beliefs about the world. According to a prominent psychological theory known as the Elaboration Likelihood Model (ELM; Petty &amp; Cacioppo 1986), there are two basic cognitive processing routes for dealing with incoming information: a central one and a peripheral one. Which route is chosen in a given context depends mainly on motivation and on the cognitive resources available at the time; these two factors determine the eponymous “elaboration likelihood”.
A person who is highly motivated to learn about a particular topic while not being distracted, tired, or otherwise suffering from temporarily impaired mental abilities is likely to pursue the central processing route. What this means is that new bits of information will be thoroughly judged according to the quality of the arguments put forward to support them. If a proposition is accepted as valid, that can lead to significant and permanent changes in attitudes, beliefs, and behavior. Processing via the central route is, of course, not free from the potentially detrimental psychological mechanisms we have already discussed - think of rationalization in particular. However, it is essentially the most objective way of assessing information that we humans have at our disposal. A good example of central-route processing would be someone actively researching an issue on the internet or attentively listening to somebody giving a talk.
If, on the other hand, a person is not motivated (i.e. does not think what he is being told is important), or only has limited cognitive resources available (e.g. because the TV is running in the background), or both, information processing will take the peripheral route. Consequently, arguments will not be evaluated on the basis of their inherent merits, but according to superficial attributes like the perceived authority of the person delivering the message. Other cognitive biases and shortcuts also play a large role here (think of selective perception and assimilation/contrast effects). As you might imagine, peripheral-route processing induces only small and temporary changes in attitudes and behavior, if any. A good example would be someone watching the evening news after an exhausting day at work, mostly looking for entertainment, not information.
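The ELM's route selection can likewise be sketched in a few lines. Again, the 0-to-1 scales, the threshold, and the idea of multiplying the two factors are my own simplifying assumptions for illustration; multiplication captures the point made above that a deficit in either motivation or cognitive resources is enough to push processing onto the peripheral route.

```python
# Toy sketch of ELM route selection (scales and threshold are
# illustrative assumptions, not part of Petty & Cacioppo's model).

def processing_route(motivation, cognitive_resources, threshold=0.5):
    """Pick a processing route from motivation and available resources.

    Both inputs are on a 0..1 scale; elaboration likelihood is modeled
    as their product, so a shortfall in either factor alone suffices
    to produce peripheral processing.
    """
    elaboration_likelihood = motivation * cognitive_resources
    if elaboration_likelihood >= threshold:
        # arguments judged on merit; attitude change tends to be durable
        return "central"
    # superficial cues dominate; attitude change is small and temporary
    return "peripheral"

# Actively researching a topic: motivated and undistracted
print(processing_route(motivation=0.9, cognitive_resources=0.8))  # central
# Half-watching the evening news after an exhausting day
print(processing_route(motivation=0.3, cognitive_resources=0.4))  # peripheral
```

Note that even maximal motivation cannot compensate for severely depleted cognitive resources in this sketch, matching the "or both" condition in the paragraph above.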
So why is it tough to convince people that there is a need for radical change in their attitudes and behavior? As we have seen, the reasons are manifold. First of all, it's tough even to get people to listen to what you have to say. If they are not paying attention, it doesn't matter how well thought out your arguments are; you just won't get them across. You need to raise interest if you want to create lasting effects, and you need to consider the context in which what you say will be received. Secondly, people will avoid revising large parts of their belief system. Given a choice, they will probably ignore a single inconvenient fact such as the gap between worldwide oil consumption and the known supply, or attempt to explain it away. Being persistent can help here, and offering alternative ways of thinking about the issue is also a better idea than just saying something like “Well, what you think is certainly wrong.” Thirdly, the stronger people feel about an issue, right or wrong, the less willing they will be to consider deviating from their position. With issues as fraught with controversy as global oil production and consumption, or nuclear energy, it may not be possible to surmount people's ego involvement. You will have to start by promoting small facts and put in considerable, patient effort to exert influence upon others' belief systems.