Where will computing go in the coming years? I thought I should find out, so I watched this roundtable and other talks and interviews on the subject (warning: it's pretty dry stuff). I came away underwhelmed. Maybe it's that, technological utopians of the Ray Kurzweil variety aside, the visions of the future presented by technologists seem at once unappealing (or rather, neither interesting nor uninteresting) and infeasible, ignorant of global challenges (Kurzweil's visions usually fall only into the latter category, while also being of questionable value).
On innovation and limits.
As I was watching, I wondered: am I just anti-innovation? I mean that seriously, despite the fact that computing research is what I do. After all, after one of my talks on the energy use of the Internet, I was accused by an audience member of advocating a "Luddite point of view" for suggesting that replacing our electronic gadgets less frequently was a good way to save energy (more on that response another time). I think I do often question the value of technological innovation for its own sake. Maybe I shouldn't? I struggled to figure out what these guys were seeing that I wasn't. I'm not sure I've figured it out.
Eventually I came back to the one key issue that's missing from their roundtable conversation---and that of most conversations among engineers in the computing world---limits, both ecological and material. It's the thing we don't ever talk about, and yet limits are increasingly the key factor in every context in industrial society. What I'd like to do in this post and future posts on this subject is examine how the limits we're up against might reshape computing and technology over the next couple of decades.
Where we're headed.
I should say at the outset that I have no nostalgia for some pre-industrial, pre-technological utopia---life in, say, the 1800s was not only harder, but oftentimes more cruel and evil in ways that it isn't today. It's not something I wish for. (Authors like Kunstler, who coined the clever phrase "The Long Emergency," seem to relish the thought of a return to such a time, perhaps because it would necessarily rid us of the excesses of our own age.) I have a hard time envisioning a broad return to pre-industrial or even pre-computing times in the next couple of decades, barring some extreme low-probability black swan. Instead, I think Sharon Astyk might be closer to the truth: it will happen one person and one family at a time:
Peak oil and climate change will hit most of us where it hurts - in our jobs, our pocketbooks, in the homes where we won't be able to make the rent or mortgage payment, in our health because we'll no longer be able to afford routine care, in our choices - instead of "vacation fund or 401K," we'll be wondering "shoes or groceries." Add in that we can expect the price of electricity to rise - carbon sequestration is expensive, nuclear power is expensive initially and dealing with its wastes is very expensive, investment in renewables is not cheap either - we can expect the price of our electricity to rise steadily.
So whether or not we ever have rolling blackouts again or grid failure, lots of us will be having our power turned off. And since electricity for the most part runs luxury items (although we are not accustomed to thinking of them as luxuries) like refrigeration and lights, if it comes down to hard choices like "food or electric," "lights or medicine" we should all recognize that electricity is not essential to (most) human life, and prepare to function well and comfortably without it.
Now I'm not so sure that people will really, at least at first, make the choices she describes---I'm reminded of an anecdote I read last week (and of similar situations I've seen personally) about a person standing in their yard, screaming into their smartphone at the electric company for turning off the power after three months of non-payment. There was an interesting wish in the roundtable---a hope for a virtual world more appealing than reality, so we don't have to deal with its dreariness---and it's that same wish I see in the prioritization of smartphones and computing over more basic elements of living. I suppose that's one way out of the puzzle, but in re-reading Mander right now (more on this too another time), I suspect there are some social problems with such an approach. Personally, the best summary is that I'm for all the technology we can afford (ecologically and financially)---no more, no less.
Constraints for computing.
Setting aside what we might wish for, what might the future of computing actually look like? Let's start by looking only 20 years out. Since forecasting is difficult, I'd like to categorize potential changes as plausible, possible, or likely. We can begin with the constraints we might face:
The end of Moore's Law is possible within the next decade and likely within the next two as a combined result of technological limits, financial/economic strains, and a lack of adequate advance planning (all of which you might say are the reasons we're facing the limits to growth more broadly). With it comes the likely end or slackening of the trend that's given us more computing power each year for the same amount of money. As a result, there'll be less need to buy new devices (as is already largely true, I think, in the case of laptops and desktops for standard home or business use). The industry that depends upon new purchases will have to adapt; there's already some evidence of this as companies are trying to break into new markets such as tablet PCs and smartphones instead of building more laptops and desktops. I know those who have faith in the free market believe this sort of adaptation can go on forever, but I'm skeptical that people need or will want more and more devices with largely overlapping functionality, especially if the underlying power per dollar isn't increasing measurably. (Though Mander's arguments come to mind again: advertising is a powerful force.)
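To see why the slackening of that trend matters so much for purchasing habits, here is a toy sketch of compute-per-dollar under different doubling periods. All figures are hypothetical, chosen only to show the shape of the curve, not to predict actual hardware prices.

```python
# Toy illustration of the end of Moore's-law-style improvement.
# All numbers here are hypothetical assumptions, not measured data.

def compute_per_dollar(years, doubling_period):
    """Relative computing power per dollar after `years`,
    assuming it doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Historical-style trend: doubling roughly every 2 years.
print(compute_per_dollar(20, 2))   # 1024.0 -- a ~1000x gain in two decades

# If limits stretch the doubling period to 10 years,
# the same two decades yield only a 4x improvement.
print(compute_per_dollar(20, 10))  # 4.0
```

Under the first assumption, a five-year-old machine is hopelessly obsolete; under the second, there's little reason to replace it, which is exactly the adaptation pressure on the industry described above.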
In the roundtable video I linked above, the panelists spoke briefly about the end of Moore's Law, but eventually came back to "because it is good, and has held true in the past, it will hold true in the future." I'm not sure that's a particularly compelling argument, but it's one I hear all the time. Oddly, they eventually acknowledge that Moore's Law might end as a result of complexity limits---building new chips these days is a very capital-intensive process, as the chips are extraordinarily complex---but then they move on to other topics. (Just as later the moderator lets slip "the physical world keeps restraining us no matter how we try to escape it," though the sentiment isn't followed up on.)
It seems that there are a few sectors of the computing industry that will be affected, and in different ways. There are services---the Googles of the computing world---that provide fundamental services basically for free. In a world where individuals have little disposable income, and one with rising energy prices, is it likely that Google can remain free to its users, supported only by advertising? In other words, at what price per kWh can Google no longer afford to offer free searches? Some day I'd like to dig through their SEC filings and try to estimate that.
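Lacking those SEC filings, here is a deliberately crude back-of-envelope version of that question. Both inputs are assumptions on my part---roughly a cent of ad revenue per search, and the ~0.0003 kWh per search that Google has publicly cited---so treat the result as an illustration of the method, not a finding.

```python
# Back-of-envelope: at what electricity price does ad revenue
# no longer cover the direct energy cost of a "free" search?
# Inputs are rough assumptions, not figures from any filing.

def breakeven_price_per_kwh(revenue_per_search, kwh_per_search):
    """Electricity price ($/kWh) at which energy cost equals ad revenue."""
    return revenue_per_search / kwh_per_search

# Assumed: ~$0.01 ad revenue per search, ~0.0003 kWh per search.
price = breakeven_price_per_kwh(0.01, 0.0003)
print(f"${price:.2f}/kWh")  # $33.33/kWh
```

Even this toy model is instructive: the per-search energy bill alone would need electricity prices far above today's before search stopped paying for itself, which suggests the real pressure would come from everything the model ignores---embodied energy of the hardware, cooling, construction, staff, and shrinking ad budgets in a contracting economy.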
There are network providers---telecoms---that form the backbone of the Internet. Just as studies indicate that the aviation industry is likely to cut back on service to rural airports, it seems likely that telecoms will cut back on maintenance of network service (both mobile and wired) in remote and rural areas in favor of densely populated regions. It's possible this will result in steep rate hikes for those outside of major cities and a drop-off in the use of computing in such areas.
It is plausible that in a couple of decades, on the far side of Hubbert's peak, after three or four cycles of recession and partial recovery, the Internet will barely be working. I can't rule that out, but it seems more likely that it will simply become more costly to access and therefore significantly fewer people will use it regularly, or flat-rate plans will disappear and so people will cut out their more extravagant uses of bandwidth, CPU, and time online. It's in this way I can see the Internet of 2032 being a lot more like the Internet of 1992 that I remember than the Internet of 2012: a lot less flashy, a lot more basic services like email, newsgroups / forums, and simple hypertext-style browsing.
Finally, there's a tension that has become more apparent in recent years: that between the dematerialization that computing enables---the ability to telecommute and teleconference, the ability to replace many physical objects, from books to calculators, with one general-purpose computing device---and the resources it consumes. While most computing devices use rare minerals only in small quantities, they rely upon complex manufacturing processes and as a result have relatively high embodied energy. At the same time, they can decrease the amount of long-distance transportation required of people, information, and maybe even goods and services. It's a question of complexity limits vs. energy savings, and it's unclear that there's a clear winner, especially when there are uses that straddle the line. Does online shopping decrease energy use because there's no physical storefront or distributed inventory? Does it increase complexity because of its dependency not only upon the Internet but upon just-in-time supply chains? Or does it increase energy use through piecemeal shipping of small items and a commoditization of products that enables the continued offshoring of manufacturing?
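The online-shopping question resists a clean answer partly because the verdict flips with the assumptions, which a toy comparison makes plain. Every number below is an illustrative placeholder, not a measured value.

```python
# Toy comparison: per-item transport energy for delivery vs. a store trip.
# All numbers are illustrative placeholders, not measured values.

def delivery_energy_mj(route_km, items_on_route, mj_per_km):
    """Energy allocated to one item on a shared delivery route."""
    return route_km * mj_per_km / items_on_route

def store_trip_energy_mj(round_trip_km, items_bought, mj_per_km):
    """Energy allocated to one item from a dedicated car trip."""
    return round_trip_km * mj_per_km / items_bought

# A full van spreading an 80 km route over 120 packages amortizes well:
print(delivery_energy_mj(80, 120, 10))   # ~6.7 MJ per item
# A short car trip that brings home five items:
print(store_trip_energy_mj(10, 5, 8))    # 16.0 MJ per item
# But a near-empty van delivering 10 rush packages flips the ranking:
print(delivery_energy_mj(80, 10, 10))    # 80.0 MJ per item
```

The point isn't any particular number but that the answer hinges on load factors and shipment patterns---which is exactly why there's no clear winner between dematerialization and the complexity it depends on.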
This just scratches the surface. In a future post I hope to consider some of the more recent trends---cloud computing, "smart" appliances, social networks, TV and broadcast moving online, etc.---and how those might change or simply disappear.