Economic History in 10 Minutes
For more than 90 percent of our species’ history, we humans lived by hunting and gathering in what anthropologists call gift economies. People had no money, and there was neither barter nor trade among members of any given group. Trade did exist, but only between members of different communities.
It’s not hard to see why sharing was the norm within each band of hunter-gatherers, and why trade was restricted to relations with strangers. Groups were small, usually comprising between 15 and 50 persons, and everyone knew and depended upon everyone else. Trust was essential to individual survival, and competition would have undermined trust. Trade is an inherently competitive activity: each trader tries to get the best deal possible, even at the expense of other traders. For hunter-gatherers, cooperation—not competition—was the route to success, and so innate competitive drives (especially among males) were moderated through ritual and custom, while a thoroughly entangled condition of mutual indebtedness helped maintain a generally cooperative attitude on everyone’s part.
Today we still enjoy vestiges of the gift economy, notably in the family. We don’t keep close tabs on how much we are spending on our three-year-old child in an effort to make sure that accounts are settled at some later date; instead, we provide food, shelter, education and more as free gifts, out of love. Yes, parents enjoy psychological rewards, but (at least in the case of mentally healthy parents) there is no conscious process of bargaining, in which we tell the child, "I will give you food and shelter if you repay me with goods and services of equivalent or greater value."
For humans in simple societies, the community was essentially like a family. Freeloading was occasionally a problem, and when it became a drag on the rest of the community it was punished by subtle or not-so-subtle social signals—ultimately, ostracism. But otherwise no one kept score of who owed whom what; to do so would have been considered very bad manners.
We know this from the accounts of 20th-century anthropologists who visited surviving hunter-gatherer societies. Often they reported on the amazing generosity of people who seemed eager to share everything they owned despite having almost no material possessions and being officially listed by aid agencies as among the poorest people on the planet. Anthropologists routinely felt embarrassed by this generosity and, in one instance after another, responded to the gift of some prized food or a painstakingly hand-made basket by immediately offering a manufactured knife or ornament in return. The anthropologists assumed the natives would be happy to receive the trinkets, but the recipients instead appeared insulted. What had happened? The natives’ initial gifts were a way of saying, "You are part of the family; welcome!" But the immediate offering of a gift in return smacked of trade—something done only with strangers. The anthropologists were understood to have said, "No, thanks. I do not wish to be considered part of your family; I want to remain a stranger to you." It was the ultimate faux pas!
Here is all of economic history compressed into one sentence: As societies have grown more complex, larger, more far-flung and diverse, the tribe-based gift economy has shrunk in importance, while the trade economy has grown to dominate nearly every aspect of people’s lives, and has expanded in scope to encompass the entire planet.
With more and more of our daily human interactions based on exchange rather than gifting, we have developed polite ways of being around each other on a daily basis while maintaining an exchange-mediated social distance. This is particularly the case in large cities, where anonymity is fostered also by the sheer numbers of people one sees from day to day. In the best instances, we still take care of one another—through government programs and private charities. We still enjoy some of the benefits of the old gift economy in our families and churches. But increasingly, the market rules our lives. Our apparent destination in this relentless trajectory toward expansion of trade is a world in which everything is for sale, and all human activities are measured by and for their monetary value.
Humanity has benefited in many obvious ways from this economic evolution: the gift economy really only worked when we lived in small bands and had almost no possessions to speak of. So letting go of the gift economy was a trade-off for progress—houses, cities, cars, iPods, and all the rest. Still, saying goodbye to community-as-family was painful, and there have been various attempts throughout history to revive it. Communism was one such attempt, and we know how that worked out. Trying to institutionalize a gift economy at the scale of the nation state introduces all kinds of problems, including those of how to reward initiative and punish laziness in ways that everyone finds acceptable, and how to deter corruption among those whose job it is to collect, count, and reapportion the wealth.
But, back to our tour of economic history. Along the road from the gift economy to the trade economy there were several important landmarks. Of these, the invention of money was arguably the most important. Money is essentially a tool to facilitate trade. People invented it because they needed a medium of exchange to make trading easier, simpler, and more flexible. Once money came into use, the exchange process was freed to grow and to insert itself into aspects of life where it had never been permitted previously. Money simultaneously began to serve other functions as well—principally, as a measure and store of value.
Today we take money for granted. But until fairly recent times it was an oddity, something only merchants used on a daily basis. Some complex societies, including ancient Egypt, managed to do almost completely without it; even in the U.S., until the mid-20th century, many rural families used money only for occasional trips into town to buy nails, boots, glass, or other items they couldn’t grow or make for themselves on the farm. In his marvelous book The Structures of Everyday Life: Civilization & Capitalism 15th-18th Century, historian Fernand Braudel wrote of the gradual insinuation of the money economy into the lives of medieval peasants: "What did it actually bring? Sharp variations in prices of essential foodstuffs; incomprehensible relationships in which man no longer recognized either himself, his customs or his ancient values. His work became a commodity, himself a ‘thing.’"
While early forms of money consisted of anything from sheep to shells, coins made of gold and silver gradually emerged as the most practical, universally accepted means of exchange, measure of value, and store of value.
Money’s ease of storage enabled industrious individuals to accumulate substantial amounts of wealth. But this concentrated wealth also presented a target for thieves. Thievery was especially a problem for traders: while the portability of money enabled them to travel for long distances to purchase rare fabrics and spices, highwaymen often lurked along the way, ready to snatch a purse at knife-point. These problems led to the invention of banking—a practice in which metal-smiths who routinely dealt with large amounts of gold and silver (and who were accustomed to keeping it in secure, well-guarded vaults) agreed to store other people’s coins, offering storage receipts in return. Storage receipts could then be traded as money, thus making trade easier and safer.
Eventually, goldsmith-bankers realized that they could issue paper receipts for more gold than they had in their vaults, without anyone being the wiser. They did this by making loans of the receipts, for which they charged a fee amounting to a percentage of the loan.
Initially the Church regarded the practice of profiting from loans as a sin—known as "usury"—but the bankers found a loophole in religious doctrine: it was permitted to charge for reimbursement of expenses incurred in making the loan; this was termed "interest." Gradually bankers widened the definition of "interest" to include what had formerly been called "usury."
The practice of loaning out receipts for gold that didn’t really exist worked fine, unless many receipt-holders wanted to redeem paper notes for gold or silver all at once. Fortunately for the bankers, this happened so rarely that eventually the writing of receipts for more money than was on deposit became a perfectly respectable practice known as fractional reserve banking.
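The arithmetic behind fractional reserve banking can be sketched in a few lines. This is an illustrative model only — the deposit amount and reserve ratio below are hypothetical numbers, not from the text: each deposit is partly held in reserve and partly lent out, the loan is redeposited elsewhere, and the cycle repeats, so the total of circulating claims converges on the original deposit divided by the reserve ratio.

```python
# Illustrative sketch of fractional reserve banking (figures hypothetical):
# with reserve ratio r, repeated lending of the un-reserved portion of each
# deposit expands total claims toward deposit / r.

def total_claims(deposit, reserve_ratio, rounds=100):
    """Sum the claims created as each loan is redeposited and re-lent."""
    total, tranche = 0.0, deposit
    for _ in range(rounds):
        total += tranche
        tranche *= (1 - reserve_ratio)  # bank keeps the reserve, lends the rest
    return total

print(total_claims(100, 0.10))  # approaches 100 / 0.10 = 1000
```

With a 10 percent reserve, a single 100-unit deposit can ultimately support nearly 1,000 units of claims — which is why a rush of simultaneous redemptions is fatal.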
It turned out that having increasing amounts of money in circulation was a benefit to traders and industrialists during the historical period when all of this was happening—a time when unprecedented amounts of new wealth were being created, first through colonialism and slavery, but then through the harnessing of the enormous energies of fossil fuels.
The last impediment to money’s ability to act as a lubricant for transactions was its remaining tie to precious metals. As long as paper notes were redeemable for gold or silver, the amounts of those metals sitting in vaults put at least a theoretical restraint on the process of money creation. Paper currencies not backed by metal had sprung up from time to time previously; once the U.S. severed the dollar’s last tie to gold in 1971, they became the near-universal norm.
Along with more abstract forms of currency, the past century has also seen the appearance and growth of ever-more sophisticated investment instruments. Stocks, bonds, options, futures, long- and short-selling, derivatives, credit default swaps, and more now enable investors to make (or lose) money on the movement of prices of real or imaginary properties and commodities, and to insure their bets, and even their bets on other investors’ bets.
Probably the most infamous investment scheme of all time was created by Charles Ponzi, an Italian immigrant to the U.S. who, in 1919, began promising investors he could double their money within 90 days. Ponzi told clients the profits would come from buying discounted postal reply coupons in other countries and redeeming them at face value in the United States—a technically legal practice that could yield up to a 400 percent profit on each coupon redeemed due to differences in currency values. What he didn’t tell them was that each coupon had to be redeemed individually, so the red tape involved would entail prohibitive costs if large numbers of the coupons (which were only worth a few pennies) were bought and redeemed. In reality, Ponzi was merely paying early investors returns from the principal amounts put down by later investors. It was a way of shifting wealth from the many to the few, with Ponzi skimming off a lavish income as the money passed through his hands. At the height of the scheme, Ponzi was raking in $250,000 a day, millions in today’s dollars. Thousands of people lost their life savings, in some cases having mortgaged or sold their houses in order to invest.
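The unsustainability of such a scheme is simple arithmetic. Here is a minimal sketch, with hypothetical figures (Ponzi’s actual terms and cash flows varied): if every cohort of investors must be paid double out of the next cohort’s deposits, the inflow of new money required doubles each period — a geometric progression that no pool of new recruits can sustain for long.

```python
# Hypothetical illustration of Ponzi-scheme arithmetic: payouts to each
# cohort come solely from the next cohort's deposits, so the inflow
# needed to keep the scheme afloat grows geometrically.

def required_inflows(initial, promised_multiple, periods):
    """New money needed in each successive period to pay the prior cohort."""
    inflows = [float(initial)]
    for _ in range(periods):
        inflows.append(inflows[-1] * promised_multiple)
    return inflows

print(required_inflows(1_000, 2, 6))
# the needed inflow doubles every period; when recruitment falters,
# the latest cohorts lose everything
```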
A few critics (primarily advocates of gold-backed currency) have called fractional reserve banking a kind of Ponzi scheme, and there is some truth to the claim. As long as the real economy of goods and services within a nation is growing, an expanding money supply seems justifiable, arguably necessary. However, a resource-consuming economy cannot continue to grow forever on a finite planet. Units of currency—which exist today mostly in the form of electronic bookkeeping entries—are essentially claims on labor and resources; and, as those claims multiply (with the growth of the money supply), and as resources deplete, eventually the remaining resources will be insufficient to satisfy all of the existing monetary claims. And so those claims will lose value, perhaps dramatically and suddenly. When this happens, paper and electronic currency systems based on money creation through fractional reserve banking will produce results somewhat similar to those of a Ponzi scheme: i.e., a few may profit, at least temporarily, but the vast majority will lose much or all of what they have.
Is this the end of the story? As society dramatically simplifies itself in the wake of fossil fuel depletion, will we revert to some form of gift economy? Or will we catch and steady ourselves on some intermediate rung on the ladder of economic development?
Only time will tell. Perhaps a general knowledge of our economic history can help us assess the options ahead and plan for a managed "money descent," just as some far-seeing Transition communities are planning for "energy descent."
(Published in the National Post of Canada, March 19)
The "Peak Oil" concept — that the world’s petroleum-production rate will soon reach its maximum and commence an inevitable decline, with negative economic consequences — has been around in scientifically articulated form at least since 1998, long enough to see it confirmed in significant ways.
The rate of discovery of new oilfields has been falling since 1964. The biggest find in recent years is Tupi, in Brazilian waters, which is claimed to hold five to eight billion barrels of oil; but that’s only enough to slake the world’s thirst for 60 to 90 days. Most producing nations are past their domestic peaks and are experiencing slowing output, despite every effort to maintain flow rates.
Skeptics point out that total world oil reserves continue to grow. But this may not be a reliable indication of where we stand: Often, in nations that have seen a peak and subsequent decline in production, domestic reserves continued to rise right up to, or even past, the date of peak production. Why? Oil companies replace reserves of high-quality, cheaply-produced oil with reserves of low-quality, slow-, or expensive-to-produce oil or tar sands.
Rates of output decline in older, giant oilfields have proven to be more trustworthy indicators of long-term trends. (For instance, they’ve enabled successful peaking forecasts for the United States, the North Sea and other regions). For the world, the average decline rate from existing fields has been calculated by the International Energy Agency at 4.5% per year. The world needs to develop the equivalent of a Saudi Arabia’s worth of oil production capacity every four years to offset such declines. This is quite a burden for the industry, which must now look for oil in ultra-deep water, in polar regions, or in politically fractured nations, since all the easy-to-find, easy-to-extract oil already has been located and much of it pumped.
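The "Saudi Arabia every four years" figure can be checked on the back of an envelope. The baseline numbers below are my own assumptions, not the article’s: roughly 70 million barrels per day of output from existing fields, and Saudi production capacity on the order of 10 Mb/d.

```python
# Back-of-envelope check (baseline figures are assumptions, not from the
# article): apply the IEA's 4.5% annual decline rate to existing-field
# output and see how much capacity is lost over four years.

EXISTING_FIELD_OUTPUT = 70.0   # million barrels/day, assumed baseline
DECLINE_RATE = 0.045           # IEA figure cited in the article
YEARS = 4

remaining = EXISTING_FIELD_OUTPUT * (1 - DECLINE_RATE) ** YEARS
lost = EXISTING_FIELD_OUTPUT - remaining
print(f"capacity lost over {YEARS} years: {lost:.1f} Mb/d")
# under these assumptions, roughly 12 Mb/d -- on the scale of
# Saudi Arabia's entire output
```

Under these assumed numbers, compounding the decline over four years wipes out capacity on the order of one Saudi Arabia, consistent with the article’s claim.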
So far, the record year for world crude production was 2005, and the record month was July 2008. Tellingly, the leveling-off of extraction rates between 2005 and 2008 occurred in the context of rising oil prices; indeed, in July 2008, the price spiked 50% higher than the previous inflation-adjusted record, set in the 1970s. Yet as both oil demand and prices rose, production barely budged in response.
While many commentators believe the jury is still out on Peak Oil, the list of petroleum analysts who say world oil production has already peaked, or will do so in the next five years, lengthens almost daily, and includes CEOs and other well-placed leaders within the oil industry.
The argument that oil production could theoretically continue to grow past 2015 is mainly put forward by organizations such as Cambridge Energy Research Associates and Saudi Aramco, which explain away evidence of dwindling discoveries, depleting oilfields and stagnating total production by claiming that it is demand for oil that has peaked, not supply — a claim that hinges on the observation that oil prices are high enough to discourage potential buyers. But high prices for a commodity usually signify scarcity, so the "peak demand" argument doesn’t hold water.
Peak Oil has significant implications for our economy. In response to the 2008 price spike, the global airline industry nose-dived and auto companies suffered. Worldwide shipping slowed drastically and hasn’t recovered. Demand for oil plummeted in late 2008, and so did the price — temporarily. But today’s price is again high, almost high enough to threaten the economic recovery.
What should we do about Peak Oil? Start with what the U.K. Industry Task Force on Peak Oil (which included Sir Richard Branson of Virgin Airlines) has done: Acknowledge the reality of supply limits. Then study the vulnerabilities of Canada’s transport and food systems to high and volatile oil prices, and start making those systems more resilient and less oil-dependent.
But do it fast. Adaptation will take decades, and we are starting very late.
According to an article in Le Monde on March 25, the U.S. Department of Energy "admits that ‘a chance exists that we may experience a decline’ of world liquid fuels production between 2011 and 2015 ‘if the investment is not there.’" This bombshell emerged in "an exclusive interview with Glen Sweetnam, main official expert on the oil market in the Obama administration."
The Le Monde article goes on: "The DoE dismisses the ‘peak oil’ theory, which assumes that world crude oil production should irreversibly decrease in a nearby future, in want of sufficient fresh oil reserves yet to be exploited. The Obama administration supports the alternative hypothesis of an ‘undulating plateau.’ Lauren Mayne, responsible for liquid fuel prospects at the DoE, explains: ‘Once maximum world oil production is reached, that level will be approximately maintained for several years thereafter, creating an undulating plateau. After this plateau period, production will experience a decline.’"
In other words, the DoE’s position amounts to this: we don’t believe that world oil production will soon reach a maximum and begin to decline (the "peak oil theory"); instead, we believe that world oil production will reach a maximum, stay there for a few years, and then decline. And that decline could commence as soon as next year.
Two comments: First, what’s the difference? Is this just a way to announce Peak Oil without acknowledging it? The idea of the "undulating plateau" has been part of the Peak Oil discussion for years (see my book Powerdown), and world oil production has in fact been at a plateau since late 2004. Second, how is it that readers in France now know more about U.S. Department of Energy oil supply forecasts than Americans do? There has been no equivalent article in the mainstream press in North America.
It’s time for the DoE to answer some tough questions. Too bad U.S. media outlets are evidently too timid, busy, or uninformed to bother themselves with the trivial business of alerting the American people to an impending calamity that is entirely foreseeable and that a few people in government are evidently willing to speak about (at least in code), if only someone asks.