Forbes weighs in on Computational Thinking: I’m one of *those* critics!

March 20, 2016 at 7:27 am

Based on the Forbes article (quoted below), I can now be referred to as Reviewer #2 (see post explaining that academic meme).  I am one of *those* critics.

I’m not apologizing — I still don’t see evidence that we can teach computational thinking the way it’s been described (as I discussed here).  For example, is it true that “Computational thinking can also help in understanding and explaining how things work”?  By learning about computational things, students will learn how to understand non-computational things?  Maybe, but I don’t see much research trying to achieve that or to measure whether it’s happening.  I do believe that you can use computational things to learn about non-computational things, through modeling and simulation.  But that’s different from saying that “computational thinking” will get you there.

The defense offered in Forbes (“Despite almost a decade of efforts”) is a weak one.  There are lots of things that humans have believed for a lot longer than a decade that are still wrong.  Lamarckian theories of evolution?  Spontaneous generation? Flat Earth?  Length of time of a belief is not a measure of its truth.

Young students in grades K-6 should learn the basic ideas in computing and how to solve problems computationally.  Computational thinking can also help in understanding and explaining how things work. Computational thinking can be taught as a complement to science and to principles of engineering design. It can also be taught to support students’ creative expression and artistic talents.  Despite almost a decade of efforts to define computational thinking, there are still critics that suggest we don’t know what computational thinking means or how to measure it. The previously mentioned work in standards setting and assessment is helping to more clearly define computational thinking and how it can be incorporated in the classroom.

Source: Thawing from a Long Winter in Computer Science Education – Forbes



3 Comments

  • 1. Raul Miller  |  March 20, 2016 at 11:38 am

    Hmm… I am not sure if, by the time I finish composing this, I will be reviewer #2 or some other reviewer. But maybe there is only one way to find out…

    Anyways, I think I agree with you: computational thinking can be good for understanding how computers work. This can be useful for debugging, design, testing, and implementation. Taken in isolation, though, these wind up being antisocial features.

    So the trick is that you need to engage in something useful. Just as you cannot debug a program if you do not run it, you cannot apply computers usefully unless you use them for something. That sounds a bit too tautological, so let’s give a few examples:

    * In the context of the internet, computers get used for communication. Many (but not all, not by a long shot) internet users are not interested in computers at all.

    * Computers have a wide range of applications in the context of construction. They can represent plans (design drawings), track materials, and (if you understand the math well enough) help make logistics more efficient.

    Computational thinking can also help in understanding the weaknesses of computers, which matters both for dealing with computer security issues and for highlighting problems that are best not handled by computers at all, but by people instead. If you understand how limited computers are, you can take made-up stories about computers with the grain of salt they deserve.

    But there’s also a downside to computational thinking which has to do with how it can go off the rails.

    Over and over, I’ve seen people try to misapply elementary classroom exercises. Since arithmetic can be defined recursively (for example, consider the Peano postulates), let’s build this really elaborate system that does addition poorly using recursion – that kind of thing. And while this kind of approach does have some real usefulness as a training exercise, it also needs a good sense of humor, because it’s exactly backwards from what’s normally needed.
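To make that kind of exercise concrete, here is a minimal Python sketch of Peano-style addition (the function names are hypothetical, my own illustration rather than anything from the comment). It is correct, but deliberately far less practical than the built-in `+` — exactly the “elaborate system which does addition poorly using recursion” described above.

```python
def succ(n):
    """Successor: n + 1."""
    return n + 1

def pred(n):
    """Predecessor: n - 1 (assumes n > 0)."""
    return n - 1

def peano_add(a, b):
    """Peano-style addition over non-negative integers:
    a + 0 = a, and a + succ(b) = succ(a + b)."""
    if b == 0:
        return a
    return succ(peano_add(a, pred(b)))

print(peano_add(3, 4))  # prints 7, via four recursive calls instead of one machine add
```

As a training exercise it shows how arithmetic can be built from almost nothing; as working code it is backwards — slower, and limited by the recursion depth — which is the point the comment is making.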

    Anyways, if I had to put my finger on what’s worrying here, I’d go with how words can be parroted back by people as if the phrasing itself mattered more than the underlying idea.

    And this relates back to a crucial limitation of computers, which has to do with definitions. In natural languages, words are symbols that remind us of some collection of experiences (in the case of grammatical words, those experiences have mostly to do with understanding each other, and that circularity tends to produce many definitions for each grammatical word, which can be difficult for a person to grasp). In computer languages, “words” have but a single definition (though, as people try to make computers “work” for them, we start to see elaborate workarounds – massive conditionals, polymorphism, etc. – that load all sorts of possibilities into a single method or name, most of which never get used). Anyways, there’s an inherent tension here, and while it has some narrative potential, in practice many otherwise intelligent people (and a significant part of the economic scene) waste their efforts because of this failure mode.

    Short form: untested ideas are often just fantasies.

    So, getting back to the Forbes article: there’s computational fantasy, and there’s useful computation. And computational thinking *taken in isolation* is going to be more fantasy than useful. And while some fantasy can be a good thing, maybe even essential, too much can become horrible.

  • […] students for 2016, and they include computational thinking — with a better definition than the more traditional ones.  It’s not about changing how students think.  It’s about giving students the tools […]

  • […] computer science, and computational thinking has just come out from Digital Promise.  I have been critical of some definitions of computational thinking (as I described in my book). I like the way Digital Promise defined them, and particularly how they […]

