Friday, 22 March 2013

We Need an Economic Study on Lost Productivity from Poor Computing Education

How much does it cost the American economy that most American workers are not computer literate? How much would be saved if all students were taught computer science? These questions occurred to me when trying to explain why we need ubiquitous computing education. I am not an economist, so I do not know how to measure the costs of lost productivity. I imagine that the methods would be similar to those used in measuring the Productivity Paradox.

We do have evidence that there are costs associated with people not understanding computing:
  • We know from Scaffidi, Shaw, and Myers that there are a great many end-user programmers in the United States. Brian Dorn’s research on graphic designers identified examples of lost productivity caused by gaps in self-taught programming knowledge. Brian’s participants did useless Google searches like “javascript <variablename>” because they didn’t know which variable and function names were meaningful and which were arbitrary (see the sketch after this list). Brian saw one participant spend half an hour studying a Web resource on Java before Brian pointed out that the participant was programming in JavaScript, a different language. I bet that many end-users flail like this; what’s the cost of that exploration time?
  • Erika Poole documented participants failing at simple tasks (like editing Wikipedia pages) because they didn’t understand basic computing ideas like IP addresses. Her participants gave up on tasks and rebooted their computers because they were afraid that someone would record their IP address. How much time is lost because users act out of ignorance of basic computing concepts?
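
To make the search problem concrete, here is a small, hypothetical JavaScript fragment of my own (not from Brian’s study). The names document.getElementById, addEventListener, style, and backgroundColor come from the standard browser API, so searching for them pays off; names like swatchList and updateSwatch are the original author’s inventions, so a search like “javascript swatchList” returns nothing useful:

    // Hypothetical snippet of the kind Brian's graphic designers worked with.
    // Standard, searchable names: document.getElementById, addEventListener,
    // style, backgroundColor.
    // Arbitrary, unsearchable names: swatchList, updateSwatch, chosenColor.
    var swatchList = document.getElementById("swatches");  // assumes a page element with id "swatches"
    function updateSwatch(chosenColor) {
      swatchList.style.backgroundColor = chosenColor;      // standard DOM property access
    }
    swatchList.addEventListener("click", function () {     // standard DOM event registration
      updateSwatch("teal");
    });

A reader who can’t tell the two kinds of names apart has no way of knowing which search terms will pay off.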
We typically argue for “Computing for All” as part of a jobs argument. That’s what Code.org argues when it points to the huge gap between the number of students majoring in computing and the vast number of jobs that need people who know computing. It’s part of the Computing in the Core argument, too. It’s a good argument and a strong case, but it misses a bigger issue: everyday people need computing knowledge even if they are not professional software developers. What is the cost of not having that knowledge?

Now, I expect Mike Byrne (and other readers who push back in interesting ways on my “Computing for Everyone” shtick) to point out that people also need to know about probability and statistics (for example), and that there may be a greater cost for not understanding those topics. I agree, but I am even harder pressed to imagine how to measure that cost. One uses knowledge of probability and statistics all the time (e.g., when deciding whether to bring your umbrella to work, or whether you can go another 10K miles on your current tires). How do you identify (a) all the times you need that knowledge and (b) all the times you make a bad prediction because you don’t have the right knowledge? There is also the question of whether having the knowledge would change your decision-making, or whether you would still be predictably irrational. Can I teach you probability and statistics in such a way that they influence your everyday decision-making? Will you transfer that knowledge? I’m pretty sure that once you know what an IP address is and that Java is not the same as JavaScript, you won’t forget those definitions; you don’t need far-transfer for them to be useful. While it is a bit of a “drunk under the streetlight” argument, I can characterize the behaviors where computing knowledge would be useful and the costs of not having it, as in Brian’s and Erika’s work. I am trying to address problems that I have some idea of how to address.
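
To show the shape of the everyday reasoning I mean, here is a toy expected-cost calculation for the umbrella decision, with numbers I have simply made up:

    // Toy expected-cost model for the umbrella decision (all numbers invented).
    var pRain = 0.3;        // forecast probability of rain
    var costCarry = 1;      // nuisance of hauling an umbrella all day
    var costSoaked = 10;    // misery of being caught in the rain without one
    var expectedCostTake = costCarry;            // you pay the nuisance regardless of weather
    var expectedCostLeave = pRain * costSoaked;  // 0.3 * 10 = 3
    console.log(expectedCostLeave > expectedCostTake ? "take it" : "leave it");  // prints "take it"

The point is not the arithmetic but the framing: someone without the probability concepts never sets the decision up this way, and measuring the cost of all those unframed decisions is what I don’t know how to do.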
