Closing out this series, it strikes me that your view of the future boils down to one central question: energy or innovation?
Most people are unaware of how recent our ever-growing, technology-based society is. At the beginning of this series, we featured some economic history showing that the lifestyles we take for granted began, on a grand historical scale, only yesterday – 1870 seems to be a good starting point for longer life expectancy, decreased infant mortality, the population explosion, height increases, abundant goods, and plentiful jobs. I was unaware myself of just how recent this all was. Yet we assume that it will not only continue into the future, but accelerate. It has become "the new normal" for the modern world.
But what caused it? Some say it was the scientific revolution. Others say it was institutional reforms that allowed capitalism to flourish and prevented elites from harvesting any and all gains from innovation for themselves. Some say it was Enlightenment social reforms banishing religious domination over the lives and thinking of common people and allowing them to advance. Some say it was the resources from the plundering of the New World.
One view of the future sees change and progress as exponential and accelerating. Its proponents see scientific discovery driving limitless innovation, as each discovery builds upon previous ones. Things formerly unimaginable have become commonplace, and reality has consistently defied predictions of what was possible, as well as past predictions of doom. They envision a world of mastery over materials through nanotechnology, limitless energy from fusion, robotic servants tending to our every need, plentiful goods thanks to 3D printing, intelligence embedded into everything via computer chips and artificial intelligence, and the elimination of sickness and aging via biotechnology.
The other sees humans as rapacious locusts temporarily burning through a one-time windfall of fossil fuels. As that windfall is used up, scientific progress and innovation will slowly grind to a halt. Humans will regress to simpler and more basic forms of social and political organization, as they will no longer have the energy needed to sustain the complexity of the modern globalized economy. We will end up as squatters among the ruins of past civilizations, marveling at what we were able to accomplish using a bounty of fossil fuels that can only be spent once. Unable to sustain such a huge population, the world will see numbers fall either slowly or rapidly, and come to resemble the pre-fossil-fuel era of low growth and Malthusian limits. There will be a recrudescence of things like famine, disease, economic depressions, resource wars, social breakdown and political strife.
Now the fundamental difference between these two views rests on where you think human progress (yes, I’m aware of the pitfalls of the term, but it will have to suffice) comes from.
If you believe that the primary engine driving progress and change is innovation, you see innovation continuing apace and accelerating. By contrast, if you believe that the driving engine behind change and progress is energy (and, more generally, resources), you are likely to see a marked decline in the human condition as we pass the peak of various fossil fuels and are forced to turn to lower net energy sources, alongside looming scarcity in key materials like fresh water, arable land, topsoil, phosphorus, copper, uranium, and rare earth metals. One side takes a materialist view. The other sees resources as an incidental factor, with human inventiveness as "the ultimate resource," inexhaustible, as economist Julian Simon put it.
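The phrase "lower net energy sources" does a lot of work in that paragraph, so a minimal sketch of the arithmetic may help (in Python; the EROEI figures below are illustrative assumptions of the kind cited in the net-energy literature, not measurements):

```python
# Net energy: the fraction of gross energy production left for society
# after subtracting the energy spent finding and producing it:
#   net = gross * (1 - 1/EROEI)
# where EROEI is energy returned on energy invested. The EROEI figures
# below are illustrative assumptions, not measured values.
sources = [
    ("early conventional oil", 50),
    ("recent conventional oil", 15),
    ("tight oil / tar sands", 5),
    ("marginal future source", 2),
]
for label, eroei in sources:
    net_fraction = 1 - 1 / eroei
    print(f"{label:>24}: EROEI {eroei:2d}:1 -> {net_fraction:.0%} of gross is net")
```

Note how little difference the slide from 50:1 to 15:1 makes, and how steep the drop becomes below 5:1 – the so-called net energy cliff.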
One common argument is that our knowledge had been "compounding" all along, and only recently reached the critical mass that allowed us to make a quantum leap:
Suppose I give you a magic coin worth 1 cent, which multiplies its value 100-fold every year.
At the end of 1 year, you would have a negligible amount: $1.
At the end of 2 years, you would have a very small sum: $100.
At the end of 3 years, you would have barely enough: $10,000.
At the end of 4 years, you start seeing a modest $1 million heap.
At the end of 5 years, you would have a good $100 million.
Now, at the end of the fifth year, you come to me and say, "I have kept the coin with me for 5 years, but 99 percent of the money it made came in last year. What was the coin doing before that?"
So, to answer in one word: compounding.
Why Has 99 Percent of the Technological Progress by Modern Humans Come in the Last 10,000 Years? (Slate)
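The coin's arithmetic is easy to verify. A minimal sketch in Python, using only the numbers quoted above:

```python
# The magic coin: one cent that multiplies its value 100-fold each year.
value = 0.01          # starting value, in dollars
growth_factor = 100   # multiplier applied once per year

history = [value]
for year in range(1, 6):
    value *= growth_factor
    history.append(value)
    print(f"End of year {year}: ${value:,.0f}")

# How much of the total gain arrived in the final year alone?
last_year_share = (history[-1] - history[-2]) / (history[-1] - history[0])
print(f"Share of all gains from the last year: {last_year_share:.1%}")
```

Nothing about the coin changed in year five; steady compounding alone concentrates 99 percent of the gains there.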
This is the "second half of the chessboard" idea: change is exponential, and therefore tends to move more and more rapidly over time. The idea has been argued most forcefully by Ray Kurzweil, who projects the trend out into the future:
When people think of a future period, they intuitively assume that the current rate of progress will continue for future periods. However, careful consideration of the pace of technology shows that the rate of progress is not constant, but it is human nature to adapt to the changing pace, so the intuitive view is that the pace will continue at the current rate. Even for those of us who have been around long enough to experience how the pace increases over time, our unexamined intuition nonetheless provides the impression that progress changes at the rate that we have experienced recently. From the mathematician’s perspective, a primary reason for this is that an exponential curve approximates a straight line when viewed for a brief duration. So even though the rate of progress in the very recent past (e.g., this past year) is far greater than it was ten years ago (let alone a hundred or a thousand years ago), our memories are nonetheless dominated by our very recent experience. It is typical, therefore, that even sophisticated commentators, when considering the future, extrapolate the current pace of change over the next 10 years or 100 years to determine their expectations. This is why I call this way of looking at the future the “intuitive linear” view.
But a serious assessment of the history of technology shows that technological change is exponential. In exponential growth, we find that a key measurement such as computational power is multiplied by a constant factor for each unit of time (e.g., doubling every year) rather than just being added to incrementally. Exponential growth is a feature of any evolutionary process, of which technology is a primary example. One can examine the data in different ways, on different time scales, and for a wide variety of technologies ranging from electronic to biological, and the acceleration of progress and growth applies. Indeed, we find not just simple exponential growth, but “double” exponential growth, meaning that the rate of exponential growth is itself growing exponentially. These observations do not rely merely on an assumption of the continuation of Moore’s law (i.e., the exponential shrinking of transistor sizes on an integrated circuit), but are based on a rich model of diverse technological processes. What they clearly show is that technology, particularly the pace of technological change, advances (at least) exponentially, not linearly, and has been doing so since the advent of technology, indeed since the advent of evolution on Earth.
The Law of Accelerating Returns (Kurzweil Hub)
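Kurzweil's "intuitive linear view" can be made concrete with a few lines of code. The sketch below extends an exponential curve by its current slope, as a linear forecaster implicitly does; the 10% annual growth rate is an arbitrary assumption for illustration:

```python
import math

# Compare a linear extrapolation (today's value plus today's slope) with
# the true value of an exponential curve. The growth rate is an
# illustrative assumption, not a measured quantity.
rate = 0.10            # 10% per year, assumed for illustration

def curve(t):
    return math.exp(rate * t)

t_now = 50
slope_now = rate * curve(t_now)   # instantaneous slope of the curve today

for horizon in (10, 25, 50):
    linear = curve(t_now) + slope_now * horizon
    actual = curve(t_now + horizon)
    print(f"{horizon:2d} years out: linear {linear:12,.0f}  "
          f"actual {actual:12,.0f}  shortfall {actual / linear:5.1f}x")
```

Over a year or two the two forecasts are nearly indistinguishable, which is Kurzweil's point about an exponential approximating a straight line over brief durations; over decades the linear guess falls behind by an ever-growing factor.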
Less extreme cornucopian views have been argued most recently by bailed-out banker Matt Ridley ('The Rational Optimist'), Silicon Valley investor Peter Diamandis ('Abundance: The Future Is Better Than You Think'), former Microsoft executive Ramez Naam ('The Infinite Resource'), financial journalist Daniel Ben-Ami ('Ferraris for All'), Bjorn Lomborg ('The Skeptical Environmentalist'), Stewart Brand, Steven Pinker, and, as we featured last time, Joel Mokyr. The belief is that human ingenuity solves all problems, and that we have left the Malthusian world behind forever. Human society is progressing; we've merely hit a temporary bump in the road and need to iron out the difficulties.
The innovation case rests on two arguments. The first is that large-scale, collaborative, empirical, experimental, peer-reviewed science has become the lifeblood of modern growth-based economies – something never institutionalized before now. As the mathematician and philosopher Alfred North Whitehead put it, “the greatest invention of the nineteenth century is the method of invention.”
The second is the long history of substitution: “The stone age didn’t end for lack of stones,” as the popular platitude goes. There are many just-so stories trotted out – energy-dense coal replaced wood that had become scarce through deforestation; kerosene replaced oil from the declining populations of overhunted sperm whales; gasoline-powered automobiles made sure that cities could continue to grow without drowning in horse manure; plastic replaced the shellac from Southeast Asian beetles that was used to insulate electronics; nitrogen fixed from the atmosphere replaced dwindling sources of South American guano; and so on.
But note that all of the above depend on fossil fuels. This brings up the counter-argument that the only thing that allowed us to escape the Malthusian trap was the tapping of a massive store of solar energy that the earth had accumulated over geologic timescales, just waiting for humans to discover how to use it.
The case for energy as the primary driver is well summarized in this essay by Nate Hagens:
The chemical potential energy available from the burning of things (e.g. wood) is rather astounding when compared with the energy which we supply our bodies in the form of food, and the fossil fuels of coal, oil, and natural gas burn even hotter while also being much easier to store and transport. We quickly learned that using some of this heat to perform work would transform what we could accomplish in massive ways. One barrel of oil, priced at just over $100, boasts 5,700,000 BTUs, or a work potential of 1,700 kWh. At an average of 0.60 kWh per work day, to generate this amount of 'labor' an average human would have to work 2,833 days, or 11 working years. At the average hourly US wage rate, this means almost $500,000 of labor can be substituted by the latent energy in one barrel of oil that costs us $100. Unbeknownst to most stock and bond researchers on Wall Street, this is the real ‘Trade’.
The vast majority of our industrial processes and activities are the result of this ‘Trade’. We applied large amounts of extremely cheap fossil carbon to tasks humans used to do manually, and we invented many, many more. Each time it was an extremely inefficient trade from the perspective of energy (much more energy used), but even more extremely profitable from the perspective of human society. For instance, depending on the boundaries, driving a car on a paved road uses 50-100 times the energy of a human walking, but gets us to where we are going 10 times faster. The ‘Trade’ is largely responsible for some combination of: higher wages, higher profits, lower priced goods and more people. The average American today consumes ~60 barrels of oil equivalent of fossil carbon annually, a 'subsidy' from ancient plants and geologic processes amounting to ~600 years of their own human labor, before conversion. Even with 7 billion people, each human kWh is supported by over 90 kWh of fossil labor, and in OECD nations about 4-5 times this much.
Technology acts as an enabler, both by inventing new and creative ways to convert primary energy into (useful?) activities and goods for human consumption and, occasionally, by making us use or extract primary energy in more efficient ways. Even services that appear independent of energy are not so: using computers, iPhones, etc., in aggregate comprises about 10% of our energy use once the servers and the like are included. Technology can create GDP without adding to energy use by using energy more efficiently, but:
a) much of the large theoretical movement towards energy efficiency has already occurred, and b) energy saved is often used elsewhere in the system to build consumption demand, requiring more and more primary energy (the Jevons paradox, or rebound effect).
Despite the power in the Trade, its benefits can be readily reversed. Firstly, if we add obscene amounts of energy, even cheap energy, the wage increases/benefits start to decline. But more importantly, as has been happening in the past decade or so, as energy prices increase, the benefits of the “Trade” start to wane. As the price of energy doubles or triples, the benefits of this 'Trade' quickly recede. This is especially true for the extremely energy-intensive processes, like aluminum smelting and cement manufacture; fully 30% of US industry falls into this category. This reduction in 'salary' can only partially be offset by efficiency measures or lean manufacturing moves, because the whole 'Trade' was predicated on large amounts of very cheap energy. Basically, the benefits to human societies from the mammoth bank account we found underground are almost indistinguishable from magic. Yet we have managed, over time, to conflate the Magic with the Wizard.
Twenty (Important) Concepts I Wasn't Taught in Business School (The Oil Drum)
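Hagens's barrel-of-oil numbers check out, give or take rounding. A sketch in Python; the 250 work days per year and $22/hour wage are my assumptions chosen to reproduce his totals, not figures from the essay:

```python
# Reproducing the 'Trade' arithmetic from the excerpt above.
barrel_kwh = 1700          # work potential of one barrel of oil (quoted)
human_kwh_per_day = 0.60   # useful human work output per day (quoted)
work_days_per_year = 250   # assumption: working days in a year
hourly_wage = 22.0         # assumption: average US wage, 8-hour days

days = barrel_kwh / human_kwh_per_day
years = days / work_days_per_year
wage_bill = days * 8 * hourly_wage

print(f"One barrel ~= {days:,.0f} work days ~= {years:.1f} working years")
print(f"Equivalent wage bill: ${wage_bill:,.0f}, for a ~$100 barrel")
```

That roughly 5,000-to-1 ratio between the wage bill and the price of the barrel is the whole argument in one number.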
This view has also been forcefully argued by Chris Martenson:
The really big picture goes like this: Humans discovered about 400 million years worth of stored sunlight in the form of coal, oil, and natural gas, and have developed technologies that will essentially see all of that treasure burned up in just 300 to 400 years.
On the faulty assumption that fossil fuels will always be a resource we could draw upon, we fashioned economic, monetary, and other assorted belief systems based on permanent abundance, plus a species population on track to number around 9 billion souls by 2050.
There are two numbers to keep firmly in mind. The first is 22, and the other is 10. In the past 22 years, half of all of the oil ever burned has been burned. Such is the nature of exponentially increasing demand. And the oil burned in the last 22 years was the easy and cheap stuff discovered 30 to 40 years ago. Which brings us to the number 10.
In every calorie of food that comes to your table are hidden 10 calories of fossil fuels, making modern agriculture and food delivery the first food system in history that consumes more energy than it delivers. Someday fossil fuels will be all gone. That day may be far off in the future, but preparing for that day could (and, one could argue, should) easily require every bit of time we have.
What galls me at this stage is that all of the pronouncements of additional oil being squeezed, fractured, and otherwise expensively coaxed out of the ground are being delivered with the message that there's so much available, there's nothing to worry about (at least, not yet). The message seems to be that we can just leave those challenges for future people, whom we expect to be at least as clever as us, so they'll surely manage just fine.
Instead, on any reasonably long timeline, the age of fossil fuels will be intense and historically quite short. The real question is not Will it run out? but Where would we like to be, and what should the future look like, when it finally runs out? The former question suggests that "maintain the status quo" is the correct response, while the latter suggests that we had better be investing this once-in-a-species bequest very judiciously and wisely.
Energy is vital to our economy and our easy, modern lives. Without energy, there would be no economy. The more expensive our energy is, the more of our economy is dedicated to getting energy instead of other pursuits and activities. Among the various forms of energy, petroleum is the king of transportation fuels and is indispensable to our global economy and way of life.
To what do we owe the recent explosion in technology and living standards? To me the answer is simple: energy. Because a very large proportion of our society was no longer tied up with the time-consuming tasks of growing their own food or building and heating their own shelter, they were free to do other very clever things, like devote their lives to advancing technology. When energy starts to get out of reach, either economically or geologically, people revert to more basic things, like trying to stay warm.
Like every other organism bestowed with abundant food – in this case, fossil fuels that we have converted into food, mobility, shelter, warmth, and a vast array of consumer goods – we first embarked on a remarkable path of exponential population growth. Along with these assorted freedoms from securing the basics of living, we also fashioned monetary and economic systems that are fully dependent on perpetual exponential growth for their vitality and well-being. These, too, owe their very sustenance to energy.
It bears repeating: Not just energy is important here, but net energy. It's the energy left over after we find and produce energy that is available for society to do all of its complicated and clever things.
The really, really big picture: There isn't going to be enough net energy for the economic growth we want (Peak Prosperity)
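Martenson's "number 22" is itself just compound growth in disguise. A sketch of the arithmetic, assuming steady exponential growth in consumption (real consumption is lumpier than this, and the 7x reserve figure is purely hypothetical):

```python
import math

# If half of all oil ever burned was burned in the last 22 years, then
# cumulative consumption doubles every 22 years, implying a steady
# growth rate of ln(2)/22 per year.
doubling_time = 22.0
growth_rate = math.log(2) / doubling_time
print(f"Implied annual growth rate: {growth_rate:.2%}")

# Under such growth, every 22-year window burns as much as all prior
# history combined, so even huge reserves buy little time. Hypothetical:
# suppose remaining reserves equal 7x everything burned to date.
remaining_multiple = 7                      # 7x past use -> 3 more doublings
doublings_left = math.log2(remaining_multiple + 1)
print(f"Years until exhaustion: ~{doublings_left * doubling_time:.0f}")
```

This is the sense in which "there's plenty left" and "historically quite short" are compatible claims.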
Such views have also been argued by Richard Heinberg, Jeremy Grantham, John Michael Greer, and others. More extreme versions have been argued by James Howard Kunstler, Michael Ruppert and Guy McPherson (who believes climate change will bring about near-term human extinction). Note that people who do not have access to this abundant source of energy are still living in the Malthusian world. As A Farewell to Alms points out:
English workers of 1800 could purchase much more of most goods than their Malawian counterparts...If a Malawian had tried to purchase the consumption of an English worker in 1800 he would have been able to afford only 40 percent as much. Thus living standards in England were possibly 2.5 times greater than those of current-day Malawi. Yet the meager wage in Malawi is still above the subsistence level for that economy in healthy modern conditions, since the Malawian population continues to grow rapidly...Hundreds of millions of Africans now live on less than 40 percent of the income of preindustrial England. (p. 44)
But need we be limited by fossil fuels? One author doesn't think so. Here's Ramez Naam:
So we’re at a crucial point in human history – a race between destruction and creation. On the one side, we have the pace at which we’re consuming finite resources and warming and polluting the planet – a trend with disastrous consequences should it continue unchecked. On the other side, we have our vigorous progress in innovating to tap more efficiently and cleanly into a truly enormous supply of fundamental natural resources the planet provides.
Are we on track to win this race?
That’s not at all clear. Consider, for a moment, climate and energy. Multiple groups have proposed plans by which the world could be powered almost entirely by renewable energy by 2050 or, in the most ambitious plans, by 2030.
Yet even as those plans are articulated, worldwide CO2 emissions are rising, not falling. In 2012, the planet as a whole emitted a record-breaking 35.6 billion tons of CO2 into the atmosphere. And the concentration of greenhouse gases in the atmosphere is surging along with our annual emissions. In 2012, atmospheric CO2 concentrations rose by the largest amount in 15 years, to a new level of 395 ppm, most of the way to the 450 ppm that climate scientists have articulated as the threshold for dangerous warming.
The fundamental driver here is economics. Consumers, businesses, and industry want energy. They need energy. That’s true everywhere in the world. And they will buy whatever sort of energy is cheapest. Indeed, if a new source of energy is sufficiently cheaper than the old, consumers will switch their energy consumption from the old to the new.
If we want to win the race against climate change, one thing matters more than all others: make renewable energy (including storage) cheap. Dirt cheap. And do it fast.
How do we do that? Fundamentally, we need to increase the pace of innovation. And there are two clear strategies to do so.
The first is to invest more in clean energy R&D. In 2012, the US suffered $100 billion in damage from the climate-linked disasters of Hurricane Sandy and the still-ongoing drought. Yet we spent only $5 billion on clean energy R&D, an amount that’s roughly half of what we spent in the 1980s. It’s also a small fraction of the $30 billion the US spends each year on medical research and the $80 billion the US spends each year on defense R&D. Yet in a very real sense, clean energy R&D is an investment in both future health and in national security. Bill Gates proposed last year that this amount should be roughly tripled to $16 billion. That’s a fine start.
The second is to be more inclusive in our cost accounting. The market is a brilliant algorithm that does a masterful job of allocating resources and driving incentives – so long as costs are fully transparent to it. But sometimes, a cost is completely missing from the books – missing in such a way that the market can’t see it.
Fossil fuels have substantial side effects that those who burn them aren’t charged for. The damage done to the environment – and thus, to others – is a cost that society pays, which isn’t passed on to the polluter. That cost is high. Peer-reviewed research suggests that every ton of CO2 emitted inflicts somewhere between $55 and $250 of damage on the environment and others.
Because that cost isn’t passed on as part of the price of fossil fuel use, the market misbehaves. The overall cost of coal, natural gas, and oil is higher than the price paid at the pump or on the power bill. But the part that’s missing is being inflicted on others, spread out over billions of people on the planet, and smeared out over years to come.
By driving the cost of renewable energy down, a carbon price has a global effect – those cheaper renewable energy sources become more attractive to consumers around the world, whether their own country has a carbon price or not.
I’ve focused here primarily on climate, because it’s the threat that touches all others. But similar approaches apply to food, to water, and to fish in the ocean. In all of those cases, there’s room for substantially higher federal R&D – to invest in crops that have higher yields, particularly for the developing world; to develop new low-cost ways to cut water usage in farming; and to put more sensible prices and restrictions on the over-fishing of deep ocean fish, and thus accelerate the shift to sustainable fish farming.
Ultimately, there are two paths forward for us, the easy way and the hard way.
In the easy way, we acknowledge the evidence that we are causing real harm to our planet, leaving it worse off for future generations, and flirting with the possibility of sudden and dramatic consequences. We retain our optimism that we can both address these problems and be far richer in the future than we are today. We take our wildly successful economic system and we fix it so that it recognizes the value of our shared resources and encourages their protection, restoration, and careful, efficient, sustainable use. We invest in action to reduce the risk of even worse future disasters caused by our unwise past. Nothing is certain in life. But on that path, the most likely outcome is that we’ll solve the problems that plague us and grow progressively richer even as we reduce and eventually reverse our negative impact on the planet.
On this path, there’s no sign that economic growth needs to end. There’s no sign that we’re anywhere near the wealth limit of this planet. We have sufficient energy, sufficient water, and the capacity to grow sufficient food to provide 9 or 10 billion people with a level of affluence far beyond what even the richest in the world enjoy today.
The Limits of the Earth, Part 2: Expanding the Limits (Scientific American)
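To make the quoted damage estimates concrete, here is a back-of-envelope conversion to the household scale. The $55-$250 per ton range comes from the excerpt; the roughly 1 kg of CO2 per kWh for coal-fired electricity is my assumption, in line with commonly published emission factors:

```python
# Hidden cost per kilowatt-hour implied by the social cost of carbon.
kg_co2_per_kwh = 1.0   # assumption: typical for coal-fired electricity

for dollars_per_ton in (55, 250):
    hidden = kg_co2_per_kwh / 1000 * dollars_per_ton   # $ per kWh
    print(f"At ${dollars_per_ton}/ton CO2: ~${hidden:.3f}/kWh of unpriced damage")
```

Set against US retail electricity prices of roughly $0.12/kWh at the time, the upper estimate means the unpriced damage can exceed the price actually paid, which is the sense in which the market "misbehaves."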
A third view looks not so much at energy; it holds that epoch-changing inventions can only be invented once, and that once they have been, growth rates stagnate. Its proponents see the nineteenth and twentieth centuries, when certain countries escaped the Malthusian trap, as a temporary period of extraordinary growth, and argue that we have now reached a plateau in living standards and innovation. They predict a "time out" from the manic economic growth we saw previously, and an economic slowdown that we are ill-prepared to deal with. This has been argued by Robert Gordon, Tyler Cowen and Jan Vijg. As New York Magazine put it in its introduction to the following article, "What if everything we’ve come to think of as American is predicated on a freak coincidence of economic history? And what if that coincidence has run its course?":
Picture this, arranged along a time line.
For all of measurable human history up until the year 1750, nothing happened that mattered. This isn’t to say history was stagnant, or that life was only grim and blank, but the well-being of average people did not perceptibly improve. All of the wars, literature, love affairs, and religious schisms, the schemes for empire-making and ocean-crossing and simple profit and freedom, the entire human theater of ambition and deceit and redemption took place on a scale too small to register, too minor to much improve the lot of ordinary human beings. In England before the middle of the eighteenth century, where industrialization first began, the pace of progress was so slow that it took 350 years for a family to double its standard of living. In Sweden, during a similar 200-year period, there was essentially no improvement at all. By the middle of the eighteenth century, the state of technology and the luxury and quality of life afforded the average individual were little better than they had been two millennia earlier, in ancient Rome.
Then two things happened that did matter, and they were so grand that they dwarfed everything that had come before and encompassed most everything that has come since: the first industrial revolution, beginning in 1750 or so in the north of England, and the second industrial revolution, beginning around 1870 and created mostly in this country. That the second industrial revolution happened just as the first had begun to dissipate was an incredible stroke of good luck. It meant that during the whole modern era from 1750 onward – which contains, not coincidentally, the full life span of the United States – human well-being accelerated at a rate that could barely have been contemplated before. Instead of permanent stagnation, growth became so rapid and so seemingly automatic that by the fifties and sixties the average American would roughly double his or her parents’ standard of living. In the space of a single generation, for most everybody, life was getting twice as good.
At some point in the late sixties or early seventies, this great acceleration began to taper off. The shift was modest at first, and it was concealed in the hectic up-and-down of yearly data. But if you examine the growth data since the early seventies, and if you are mathematically astute enough to fit a curve to it, you can see a clear trend: The rate at which life is improving here, on the frontier of human well-being, has slowed.
If you are like most economists – until a couple of years ago, it was virtually all economists – you are not greatly troubled by this story, which is, with some variation, the consensus long-arc view of economic history. The machinery of innovation, after all, is now more organized and sophisticated than it has ever been, human intelligence is more efficiently marshaled by spreading education and expanding global connectedness, and the examples of the Internet, and perhaps artificial intelligence, suggest that progress continues to be rapid.
But if you are prone to a more radical sense of what is possible, you might begin to follow a different line of thought. If nothing like the first and second industrial revolutions had ever happened before, what is to say that anything similar will happen again? Then, perhaps, the global economic slump that we have endured since 2008 might not merely be the consequence of the burst housing bubble, or financial entanglement and overreach, or the coming generational trauma of the retiring baby boomers, but instead a glimpse at a far broader change, the slow expiration of a historically singular event. Perhaps our fitful post-crisis recovery is no aberration. This line of thinking would make you an acolyte of a 72-year-old economist at Northwestern named Robert Gordon, and you would probably share his view that it would be crazy to expect something on the scale of the second industrial revolution to ever take place again.
“Some things,” Gordon says, and he says it often enough that it has become both a battle cry and a mantra, “can happen only once.”
The Blip (New York Magazine)
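The growth rates implied by the two eras in that excerpt are worth computing. A sketch; the 30-year generation length is my assumption for "doubling his or her parents' standard of living":

```python
import math

# Annual growth rates implied by the doubling times quoted above:
# pre-industrial England (350 years per doubling of living standards)
# versus the postwar US (a doubling every ~30-year generation).
for label, doubling_years in [("pre-1750 England", 350),
                              ("postwar United States", 30)]:
    rate = math.log(2) / doubling_years
    print(f"{label}: {rate:.2%} per year")
```

An order of magnitude separates the two regimes, which is why Gordon's question of whether the second was a singular event matters so much.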
Some see this plateau as essentially permanent; others see it as temporary, with growth heating up again once we have absorbed our most recent inventions and integrated them into society. They do not believe that we will collapse back into the Malthusian world of pre-1870, but neither do they see exponential growth continuing forever, as the cornucopians do. I call these stagnationists. Note that no stagnationist, to my knowledge, has taken dwindling net energy into account.
Cornucopians, doomers, stagnationists. Whom to believe? Where does the truth lie? The various camps are talking past each other, but I think the fundamental difference between their views is whether the primary driving factor behind change over time is innovation or energy. Whichever it turns out to be will largely determine which of these competing visions is the better predictor of humanity’s future trajectory, so this is the question we must settle if we are ever to resolve the argument.
Well, we’re not going to solve that question here on this blog, obviously, but next time I'll put forward a few concluding thoughts on the subject.