It is one thing to argue that technological progress has diminished or slowed, quite another to argue that such stagnation is permanent. Recently, I ran across a 2005 analysis by physicist Jonathan Huebner, then working at the Pentagon’s Naval Air Warfare Center. (I have since learned the paper is frequently cited by techno-pessimists.) In it, Huebner presents two big observations. The first claim: “The rate of innovation peaked in the year 1873 and is now rapidly declining.” (He arrives at this claim by comparing major innovations and scientific advances to world population.)
And the second: “This decline is most likely due to an economic limit of technology or a limit of the human brain that we are approaching. We are now approximately 85% of the way to this limit, and the pace of technological development will diminish with each passing year.” Or as Huebner told New Scientist back in 2005: “My feeling is we’ve discovered most of the major branches on the tree of technology.”
As to the first point, one could argue the subsequent 15 years have left it in a vastly stronger position. As it turns out, 2005 was the final year of the productivity surge (2004 if you look at total factor productivity rather than labor productivity) that began in 1996, thanks to what is often called the Third Industrial Revolution. We’re talking advances in information and communications technology — PCs, the internet, the web, search engines. Since the mid-2000s, however, there has been a marked productivity slowdown, as well as diminished expectations going forward from many forecasters. Maybe universal internet access, AI, advanced robotics, and genetic editing will prove to be massive game-changers, but not yet.
But “not yet” isn’t never. And that second claim immediately reminded me of two things. First, that apocryphal quote attributed to the Commissioner of the US Patent Office in 1899: “Everything that can be invented has been invented.” More important, I think, is economist Paul Romer’s work on the economics of ideas. Romer: “Every generation has perceived the limits to growth that finite resources and undesirable side effects would pose if no new recipes or ideas were discovered. And every generation has underestimated the potential for finding new recipes and ideas. We consistently fail to grasp how many ideas remain to be discovered.” The way we can combine and recombine objects and elements is limited only by our imagination.
And that imagination might yet be supercharged by AI. As Warwick University economist Nicholas Crafts explained to me in a new Political Economy podcast:
There’s a related idea that all of the low-hanging fruit has already been picked, so it’s harder to find good ideas than before — you need more money and manpower in order to innovate. Could AI help us go up that ladder and get more fruit? Or could it help us discover there’s a lot more low-hanging fruit than we thought?
There’s an awful lot of knowledge out there. A lot of technological progress comes from hybrid inventions — putting together two things and making some new third thing. Searching that stock of knowledge is a very fruitful way of making advances. I think the argument might be that there are more trees in the orchard than there used to be, but there are so many trees in the orchard that you and I can’t scan them effectively. We need some help. The help comes, I think, potentially with things like AI. That would be the big potential change, the big potential advantage. That really goes to the way, I think, quite a lot of economic historians think about the idea of an industrial revolution: It’s that the nature of invention changes. And we’ve seen that several times in the past.