We Need an Economic Study on Lost Productivity from Poor Computing Education
How much does it cost the American economy that most American workers are not computer literate? How much would be saved if all students were taught computer science?
These questions occurred to me when I was trying to explain why we need ubiquitous computing education. I am not an economist, so I do not know how to measure the costs of lost productivity. I imagine that the methods would be similar to those used to study the Productivity Paradox.
We do have evidence that there are costs associated with people not understanding computing:
- Erika Poole documented participants failing at simple tasks (like editing Wikipedia pages) because they didn’t understand basic computing ideas like IP addresses. Her participants gave up on tasks and rebooted their computers because they were afraid that someone would record their IP address. How much time is lost because users take actions out of ignorance of basic computing concepts?
We typically argue for “Computing for All” as part of a jobs argument. That’s what Code.org is arguing when they talk about the huge gap between the number of people majoring in computing and the vast number of jobs that need people who know computing. It’s part of the Computing in the Core argument, too. It’s a good argument, and a strong case, but it misses a bigger issue. Everyday people need computing knowledge, even if they are not professional software developers. What is the cost of not having that knowledge?
Consider this a call to economics researchers: how do we measure the productivity lost to computing illiteracy?