The role of computing education in the productivity paradox

April 4, 2011 at 10:48 am 7 comments

This recent article in Slate addresses an old problem in economics: why hasn’t the computer led to a dramatically new economy? Why hasn’t it led to a boost in productivity? A new book, The Great Stagnation, suggests that the American economy hasn’t faltered; rather, the American boom of previous years was due to “low-hanging fruit,” and all of that fruit is now gone. What I’m more interested in is what The Great Stagnation and another recent book, The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, suggest about the role of technology in the future economy.

In general, it’s not a pretty picture. Their idea is that computers first replaced physical labor and are now taking on more cognitive labor. For example, in the future you won’t need as many legal clerks, because a law-aware version of Web search will do the job so much better. These economists argue that our ability to create new jobs won’t grow as quickly as technology’s ability to take over jobs, and that the job-stripping technology will be built by relatively few people, since it doesn’t take many programmers to serve the needs of the IT industry. Thus, they predict a future world with 30% unemployment.

I wonder what role computing education might play in the productivity paradox and in these future visions of technology and the economy. I’m not arguing that bad computing education is causing the productivity paradox; rather, I wonder what role better computing education could play in improving productivity with computing, in ways that computers can’t take over. Alan Kay has argued many times that the real computer revolution hasn’t happened yet. We use very little of the computer’s potential in our daily lives. Certainly, part of the problem is the lack of enough good, usable software that taps into the real power of the computer (e.g., other than what-if games in Excel, what everyday-usable software has people building models and simulations?). Another way to look at the problem is that maybe we haven’t taught people how to use the computer well. In our computer science courses, we focus so much on how to build scalable, robust software that meets others’ needs, but we spend relatively little time on how to write small, throw-away programs that meet our own needs. Maybe those little bits of program would lead to a major productivity boost, especially if the languages were better suited to those needs (e.g., not public static void main(String[] args)). I’ll bet that learning to write those little, useful bits of code would lead to transferable learning that could provide a major boost in productivity. Employees who know how to use more of the computer’s potential, without waiting for the next release of Microsoft Office, may be employees who keep their jobs.
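As a sketch of the kind of little, use-once program I mean (a hypothetical example; the file name and column names are invented for illustration), here are a few lines of Python that answer one person’s one question, with no public static void main in sight:

    # throwaway.py -- a five-minute, use-once script: total one category
    # of spending from a spreadsheet exported as CSV.
    import csv

    total = 0.0
    with open("expenses.csv") as f:           # hypothetical spreadsheet export
        for row in csv.DictReader(f):         # assumes "category" and "amount" columns
            if row["category"] == "travel":
                total += float(row["amount"])

    print("Travel spending: $%.2f" % total)

Nobody would ship that, and nobody needs to: it takes minutes to write, answers the question at hand, and can be thrown away.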

Could it be that economists found no productivity boon from the printing press because there was too little literacy when the press was first created? The press created a reason to become literate, and that literacy led to a productivity boost. Similarly, the computer may create a reason to become computationally literate (maybe even more mathematically literate), and those new literacies could lead to a major productivity boom, though maybe not for another 100 years, as education and society change.

Consider the case of Gutenberg’s printing press. Though the technology radically transformed how people recorded and transmitted news and information, economists have failed to find evidence it sped up per-capita income or GDP growth in the 15th and 16th centuries.

At one point, some economists thought that an Internet-driven golden age might have finally arrived in the late 1990s. Between 1995 and 1999, productivity growth rates actually exceeded those during the boom from 1913 to 1972—perhaps meaning the Web and computing had finally brought about a “New Economy.” But that high-growth period faded quickly. And some studies found the gains during those years were not as impressive or widespread as initially thought. Robert Gordon, a professor of economics at Northwestern, for instance, has found that computers and the Internet mostly helped boost productivity in durable goods manufacturing—that is, the production of things like computers and semiconductors. “Our central theme is that computers and the Internet do not measure up to the Great Inventions of the late nineteenth and early twentieth century, and in this do not merit the label of Industrial Revolution,” he wrote.

Gordon’s work leads to another theory, one espoused by Cowen himself. Perhaps the Internet is just not as revolutionary as we think it is. Sure, people might derive endless pleasure from it—its tendency to improve people’s quality of life is undeniable. And sure, it might have revolutionized how we find, buy, and sell goods and services. But that still does not necessarily mean it is as transformative of an economy as, say, railroads were.

That is in part because the Internet and computers tend to push costs toward zero, and have the capacity to reduce the need for labor.

via The productivity paradox: Why hasn’t the Internet helped the American economy grow more? – By Annie Lowrey – Slate Magazine.


7 Comments

  • 1. Ian Bogost  |  April 4, 2011 at 10:55 am

    Mark, isn’t it likely that computers (like many other things) have simply been incorporated into existing practices, in the service of different workplace values, not “productivity” but more than anything “sameness”? I wrote a little about a related phenomenon this week. I don’t have much more time today to comment on it, though.

    Reply
    • 2. Mark Guzdial  |  April 5, 2011 at 3:32 pm

      I agree, Ian. Seymour Papert explained in The Children’s Machine how Logo became “schoolified”: shunted into its own subject and cordoned off in the computer lab, which prevented Logo from having an impact on the rest of the school. I appreciated your piece about news channels and newsgames. I’ve given up Facebook and Twitter for Lent, and I’ve been reflecting on how much quieter my day seems now. I did like having a sense of what was going on in my friends’ lives. However, it’s pleasant not to have that constant hum and churn going by.

      Reply
  • 3. natinja  |  April 4, 2011 at 5:59 pm

    In other words, the Internet did fail to deliver economic growth to an economy anchored to a myth of growth rooted in the scarcity of book culture, even as it has turned everybody into bargain hunters. Economists declare the situation unfair and strip the Internet of its revolutionary attributes, yet they are unable to see what the Internet really is: an engine for turning hardware into software. The steam engine still wins the day, so they say, forgetting that it actually sparked the software revolution when it started to produce electricity.

    Didn’t they get a clue when 19th-century concepts such as the labour market and GDP failed to produce evidence when applied to any period of history other than the one they were conceived for?

    Reply
  • 4. Tony Garnock-Jones  |  April 5, 2011 at 7:39 pm

    Mark, I’d love to hear your thoughts on how to teach “how to write small, throw-away programs that meet our needs”. People don’t even seem to realise it’s a possibility.

    Reply
    • 5. Mark Guzdial  |  April 5, 2011 at 8:25 pm

      I have some ideas, but the experts I know in this space are (a) Greg Wilson, with his Software Carpentry class for scientists and engineers, and (b) Brian Dorn, who studied how graphic designers program and teach themselves computer science in order to do it.

      Reply
    • 6. gasstationwithoutpumps  |  April 5, 2011 at 8:36 pm

      Bioinformatics classes often split between teaching rapid prototyping and teaching memory-efficient programming; bioinformatics is one of the application fields where both are important. Python is an excellent language for teaching use-once (or use-a-few-times) coding, but it is absolutely abysmal for memory-efficient programming. Some current genome assemblers need 500 GB of RAM for carefully crafted C-based graph structures, so Python’s memory overhead is unacceptable for those tasks.
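      To make that overhead concrete, here is a rough CPython illustration (sizes are approximate and vary by interpreter version and platform):

          import sys, array

          nums = list(range(1000000))        # a million small ints as Python objects
          print(sys.getsizeof(nums))         # the list's pointer array alone: ~8 MB
          print(sys.getsizeof(nums[0]))      # plus roughly 24-28 bytes per int object
          packed = array.array('q', nums)    # packed 64-bit ints, C-style
          print(sys.getsizeof(packed))       # ~8 MB total, no per-element objects

      A C array would store those million values in 8 MB flat; the Python list costs several times that once the per-object overhead is counted.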

      Reply
  • 7. Bonnie MacKellar  |  April 7, 2011 at 2:54 pm

    Most efficient software engineers out in industry are very adept at writing little scripts that help them get their jobs done; you don’t survive if you can’t do that. I actually think that most CS grads are pretty good at that kind of thing, because it makes them feel clever. They are far worse at teamwork, building large-scale systems, good design, communication, and writing, that kind of thing.

    Reply
