We take economic growth for granted: it's one of the defining characteristics of our economy that it grows, year after year. Any period when the economy shrinks is called a recession, and treated as a kind of temporary economic emergency.
But economics professor Robert J. Gordon argues that we shouldn't assume economic growth will continue forever — there was virtually no economic growth before 1750, when the first Industrial Revolution began, and it's entirely possible there will be none after 2050. In a "deliberately provocative" paper, Gordon argues that there were three separate Industrial Revolutions, and only one of them drove a massive growth spurt:
- IR #1 (steam, railroads) from 1750 to 1830;
- IR #2 (electricity, internal combustion engine, running water, indoor toilets, communications, entertainment, chemicals, petroleum) from 1870 to 1900; and
- IR #3 (computers, the web, mobile phones) from 1960 to present.
... IR #2 was more important than the others and was largely responsible for 80 years of relatively rapid productivity growth between 1890 and 1972.
Once the spin-off inventions from IR #2 (airplanes, air conditioning, interstate highways) had run their course, productivity growth during 1972-96 was much slower than before. In contrast, IR #3 created only a short-lived growth revival between 1996 and 2004. Many of the original and spin-off inventions of IR #2 could happen only once – urbanisation, transportation speed, the freedom of women from the drudgery of carrying tons of water per year, and the role of central heating and air conditioning in achieving a year-round constant temperature.
But Paul Krugman responds that the growth from IR #3 isn't really over, and may not even have begun in earnest — because we could soon see further massive increases in productivity as computers take over tasks previously performed only by humans. Says Krugman:
Not that much progress has been made in producing machines that think the way [humans] do. But it turns out that there are other ways of producing very smart machines. In particular, Big Data - the use of huge databases of things like spoken conversations - apparently makes it possible for machines to perform tasks that even a few years ago were really only possible for people. Speech recognition is still imperfect, but vastly better than it was and improving rapidly, not because we've managed to emulate human understanding but because we've found data-intensive ways of interpreting speech in a very non-human way.
And this means that in a sense we are moving toward something like my intelligent-robots world; many, many tasks are becoming machine-friendly. This in turn means that Gordon is probably wrong about diminishing returns to technology.
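Krugman's contrast between emulating human understanding and "data-intensive ways of interpreting speech" can be illustrated with a toy sketch (hypothetical, not from either author): a bigram model that picks the more plausible of two candidate transcriptions purely from corpus statistics, with no grammatical or semantic understanding at all.

```python
from collections import Counter

# A tiny stand-in for the "huge databases" Krugman mentions.
corpus = (
    "it is hard to recognize speech "
    "machines recognize speech with data "
    "we recognize speech every day"
).split()

# Count word pairs and single words seen in the corpus.
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def score(phrase: str) -> float:
    """Product of conditional bigram frequencies; 0 if any pair is unseen."""
    words = phrase.split()
    p = 1.0
    for a, b in zip(words, words[1:]):
        p *= bigrams[(a, b)] / unigrams[a] if unigrams[a] else 0.0
    return p

# Two acoustically similar candidate transcriptions; the model prefers
# whichever sequence the data has seen more often, nothing deeper.
candidates = ["recognize speech", "wreck a nice beach"]
best = max(candidates, key=score)
```

Real speech recognizers are vastly more sophisticated, but the principle is the one Krugman describes: the machine interprets speech by matching against data, in a very non-human way.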