The role of computing education in the productivity paradox
This recent article in Slate addresses an old problem in economics: Why hasn’t the computer led to a dramatically new economy? Why hasn’t it led to a boost in productivity? A new book, The Great Stagnation, suggests that the American economy hasn’t faltered; rather, the American boom of previous decades was due to “low-hanging fruit,” all of which is now gone. What I’m more interested in is what The Great Stagnation and another recent book, The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, suggest about the role of technology in the future economy.
In general, it’s not a pretty picture. The idea is that computers first replaced physical labor and are now taking on more cognitive labor. For example, in the future, you won’t need as many legal clerks, because a law-aware version of Web search will do the job so much better. These economists argue that our ability to create new jobs won’t grow as quickly as technology’s ability to take over jobs, and that relatively few people will be employed creating the job-stripping technology: it doesn’t take many programmers to serve the needs of the IT industry. Thus, they predict a future world with 30% unemployment.
I wonder what role computing education might play in the productivity paradox and in these visions of the future of technology and the economy. I’m not arguing that bad computing education is causing the productivity paradox; rather, I wonder what role better computing education could play in improving productivity with computing, in ways that computers can’t take over. Alan Kay has argued many times that the real computer revolution hasn’t happened yet. We use very little of the computer’s potential in our daily lives. Certainly, part of the problem is the lack of good, usable software that taps into the real power of the computer (e.g., other than what-if games in Excel, what everyday-usable software has people building models and simulations?).

Another way to look at the problem is that maybe we haven’t taught people how to use the computer well. In our computer science courses, we focus so much on how to build scalable, robust software that meets others’ needs, but we spend relatively little time on how to write small, throw-away programs that meet our own needs. Maybe those little bits of program would lead to a major productivity boost, especially if the languages were better suited to those needs (e.g., no public static void main(String[] args) boilerplate). I’ll bet that learning to write those little, useful bits of code would lead to transferable learning that could yield a major boost in productivity. Employees who know how to use more of the computer’s potential, without waiting for the next release of Microsoft Office, may be employees who keep their jobs.
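To make that concrete, here is a minimal sketch (in Python) of the kind of small, throw-away program I have in mind. The file name and column names are hypothetical, purely for illustration: a few lines that answer a one-off question about your own data, with no classes or boilerplate required.

    import csv
    from collections import defaultdict

    # Throw-away script: total this month's expenses by category
    # from a spreadsheet export. ("expenses.csv" and its
    # "category"/"amount" columns are made-up examples.)
    totals = defaultdict(float)
    with open("expenses.csv", newline="") as f:
        for row in csv.DictReader(f):
            totals[row["category"]] += float(row["amount"])

    # Print categories from biggest spend to smallest.
    for category, amount in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{category:20} ${amount:,.2f}")

Nothing here would survive a software engineering code review, and it doesn’t need to: it exists to answer one question, once, and then be thrown away.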
Could it be that economists have found no productivity boon from the printing press because there was too little literacy when the press was first created? The press created a reason to become literate, and that literacy led to a productivity boost. Similarly, the computer may create a reason to become computationally literate (maybe even more mathematically literate), and those new literacies could lead to a major productivity boom, but maybe not for another 100 years, as education and society change. Here’s the relevant passage from the Slate article:
Consider the case of Gutenberg’s printing press. Though the technology radically transformed how people recorded and transmitted news and information, economists have failed to find evidence it sped up per-capita income or GDP growth in the 15th and 16th centuries.
At one point, some economists thought that an Internet-driven golden age might have finally arrived in the late 1990s. Between 1995 and 1999, productivity growth rates actually exceeded those during the boom from 1913 to 1972—perhaps meaning the Web and computing had finally brought about a “New Economy.” But that high-growth period faded quickly. And some studies found the gains during those years were not as impressive or widespread as initially thought. Robert Gordon, a professor of economics at Northwestern, for instance, has found that computers and the Internet mostly helped boost productivity in durable goods manufacturing—that is, the production of things like computers and semiconductors. “Our central theme is that computers and the Internet do not measure up to the Great Inventions of the late nineteenth and early twentieth century, and in this do not merit the label of Industrial Revolution,” he wrote.
Gordon’s work leads to another theory, one espoused by Cowen himself. Perhaps the Internet is just not as revolutionary as we think it is. Sure, people might derive endless pleasure from it—its tendency to improve people’s quality of life is undeniable. And sure, it might have revolutionized how we find, buy, and sell goods and services. But that still does not necessarily mean it is as transformative of an economy as, say, railroads were.
That is in part because the Internet and computers tend to push costs toward zero, and have the capacity to reduce the need for labor.