A new definition of Computational Thinking: It’s the Friction that we want to Minimize unless it’s Generative

April 29, 2019 at 7:00 am

David Benedetto wrote a blog post about computational thinking for CSTA that gave me new insight into Computational Thinking (thanks to Shuchi Grover whose tweets drew me to it):


David says:

I think this definition of CT is as good a starting point as any:

Computational Thinking is the thought processes involved in formulating problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information-processing agent (Cuny, Snyder, Wing, 2010).

He evolves this position until, like Shuchi, he comes up with two definitions of CT:

What are the implications of this? I think there are two clear options for how we define CT:

(A) Restrict what we mean by CT. This is perfectly reasonable and probably necessary for most practical purposes. However, this has the inevitable consequence of fragmenting our understanding of CT. There will be different CTs in different disciplines / fields. We will do this, but we should try to understand the restrictions that we are imposing, and the consequences of imposing them.

(B) Break our concept of CT wide open. I think the scientific community (at least, those who are studying the construct of CT and how it plays out in real cultural contexts) should do this, so that we can explore how CT is understood and practiced in a variety of contexts and for a wide range of purposes.

As a researcher, I’m more in favor of the former — let’s define Computational Thinking precisely.  David’s concern is really about the social context around CT. People want to call lots of things Computational Thinking. Can we come up with a definition for CT that bridges these? That represents the discipline-specific uses of CT, and is well enough defined that we can actually measure something about it?

There are many other “thinkings” that lay claim to providing students with critical skills. Admiral Grace Hopper would likely support “mathematical thinking” more than “computational thinking,” as this interesting essay from Yale points out. Skills like “decomposition” or “abstraction” are included in many definitions of computational thinking (e.g., this blog post), and it’s true that you need those in computing. But those skills first belonged to mathematics, engineering, and science, and I’d argue that the teachers in those subjects might be in a better position to teach them and to measure them. Computation can play an important role in learning decomposition and abstraction, but those skills don’t belong uniquely to computation, or to a class on computational thinking. So, what is unique about computation?

The tension between HCI and Computational Thinking

On the computer science side of my life, my research community is human-computer interaction. I’ve published in CHI, DIS, CSCW, VL/HCC, and UIST. The Cuny, Snyder, and Wing definition is hard for me to reconcile with being an HCI researcher. The point of HCI research is to minimize the amount that a user has to learn in order to “formulate problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information-processing agent.” HCI is trying to make it easier for the user to think with a computer about whatever they want to think about. Computational Thinking is about what you need to learn in order to think with a computer.

Over the last few weeks in this blog, I’ve been exploring the notion of task-specific programming languages. I was amazed at how much social studies teachers could do with Vega-Lite in a participatory design session we ran in early March. Sarah Chasins’s work with Helena and Rousillon is absolutely stunning for how much people could achieve with no training. Hariharan Subramonyam sent me this fascinating essay on end-user programming and about how to minimize the effort it takes end users to start programming: https://www.inkandswitch.com/end-user-programming.html. As I talked about in my SIGCSE 2019 keynote, Bootstrap:Algebra and most uses of Scratch actually rely on a small number of computational ideas. There is expressive and learning power in even a small amount of computation.
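To give a sense of why Vega-Lite was so approachable for the teachers, here is a minimal spec of the kind a non-programmer can read and modify. (This is a generic illustration; the data file and field names here are hypothetical, not the ones from our session.)

```json
{
  "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
  "data": {"url": "state-population.csv"},
  "mark": "bar",
  "encoding": {
    "x": {"field": "state", "type": "nominal"},
    "y": {"field": "population", "type": "quantitative"}
  }
}
```

Changing "bar" to "point", or swapping which field goes on which axis, is a one-token edit. There are no loops, variables, or control flow to learn before you can do something useful.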

Michael Mateas wrote an essay back in 2009 that has been influential in my thinking. I blogged about it here: “There will always be friction.” Michael looked at the Alan Perlis talk of 1961 (that I talk and write about often), and particularly, at the exchange with Peter Elias. Elias argued that students shouldn’t have to learn to program — the computer should learn to understand us. Both Perlis and Mateas disagree. The computer can never understand us completely. We have to smooth the communication because the computer cannot. There will always be a challenge to human-computer interaction. There will always be friction, and it’s the human’s job to manage that friction.

A New Definition for Computational Thinking

So, here’s my new take on Computational Thinking: It’s the friction. Let’s take the original Cuny, Snyder, and Wing definition — computational thinking is about framing problems so that computers can solve them. The work around task-specific programming languages is showing us that we can make the amount that the user has to learn in order to use programming for their problem very small.

To meet Alan Kay’s point about generativity, there are some things in computing that we want to teach because they give us new leverage on thinking. We want to teach things that are useful, but not those that are necessary just because we have bad user interfaces.

A minimal definition of Computational Thinking: The stuff that we have to learn in order to communicate our tasks to a computer. It should be small, and the part that we learn should be generative, useful for new problems and new thinking. Everything else should be eliminated by good user interfaces.

You don’t have to master abstraction and decomposition just to use a programming language to help you learn. Our social studies teachers modified Vega-Lite programs, made mistakes (every single one of them) and recovered from them, and tried new things that they figured out on their own — all in 10-20 minutes. They already have problem solving skills. They didn’t need any “computational problem solving skills.” They certainly didn’t learn any special computational abilities to abstract and decompose in 10 minutes. They already know enough to use programming to learn. If we can eliminate the need for a particular skill in order to use computing to learn something else, we should.

This meshes with David Weintrop and Uri Wilensky’s definition — it’s the computational practices of actual scientists and engineers who use computing. Their definition is particularly strong because it’s empirically grounded. They asked computational scientists and engineers what they actually do. Weintrop and Wilensky’s participants want to do their work, not programming for its own sake. So they use a minimal subset of computing that buys them something for their thinking and in their tasks.

I like this definition because it’s aspirational.  Today, there’s a lot of stuff that you have to learn to use a computer to solve problems. Philip Guo gave a talk here at Michigan recently (similar to one he gave at U-W) and described how data scientists have to become system administrators to manage all the various packages and databases to do their job.  That’s a problem. That’s not computational thinking. That’s stuff to get rid of. How small can we make computational thinking?




14 Comments

  • 1. Raul Miller  |  April 29, 2019 at 11:27 am

    One issue I have been trying to reason about is the pervasive nature of malware and the underlying blindness we bring to bear that encourages this situation.

    Your writeup here reminded me of this.

    On the one hand, malware involves deep technical insights into the flaws of computational systems that result from physical limitations, hastily met deadlines, and so on. These are extremely difficult for most people to reason about, it would seem.

    On the other hand, a consequence of malware is unnecessary and unwanted use of resources – time, bandwidth, etc. Systems tend towards being unusable.

    On the HCI side, this seems to call for summary statistics – counters, graphs, etc. And in network contexts (most all computation, nowadays, is networked) these things should reflect network activity and should be available from “upstream” devices as well as the machines we use (designed so that we could compare things and see when the counts themselves are being tampered with, or when the counts are showing signs of malware infections).

    Useful analogies for this kind of system design, it seems to me, include: double-entry bookkeeping, checksums, the physiology of pain, and early warning systems.

    Of course, it’s easy to be apathetic about these issues – we’ve mostly ignored this stuff for decades, why is it all of a sudden an issue now? (But it’s an issue because we’ve been mostly hoping it would go away if we could properly address some arbitrary details of our systems, but neglect never fixed much of anything.)

    Anyways… not really what you were getting at, but not entirely divorced from it, either…

  • 2. alanone1  |  April 29, 2019 at 9:38 pm

    I think a simpler and more direct guiding question for pedagogy and curriculum is “When should it be easy and when should it be hard?”

    Gratuitous difficulties don’t contribute to growth, but ideas that require real structuring and restructuring one’s brain/mind are going to require a fair amount of work.

    If our goals for education include making big differences between the ears of the learner — not just augmenting with an external tool — then the “important difficulties” must not be avoided.

    As the great educator Tim Gallwey pointed out “Hard doesn’t have to be equal to pain. Just how you go through the work to deal with ‘hard’ can greatly condition how much ‘pain’ is felt.”

    Dealing with the latter is a large part of the job of the real educator, curriculum designer, pedagogue, and user interface designer.

    A big mistake is to try to remove pain by removing important areas that are hard. This not only quashes growth, but also teaches something that is very often “anti-growth”.

    This mistake is made all the time by those who are mainly seeking early success with learners that in the end doesn’t lead to above threshold learning to produce qualitative changes in thinking, doing, and understanding. I.e. the “Guitar Hero” form of anti-education.

    • 3. Mark Guzdial  |  April 30, 2019 at 7:20 am

      I agree, Alan — I’m reflecting on these issues in Friday’s blog post.

  • 4. tdmorley  |  April 30, 2019 at 5:27 am

    Perhaps you are coming around here to a more Mathematical point of view … the specifics of syntax and usage (commas, semicolons, the names of built-in features, etc.) are not important, and difficulties involving these should be minimized. There is a deeper (more mathematical) level of computational thinking that can be hinted at by examples that minimize the surface difficulties.

  • 5. Kathi Fisler  |  April 30, 2019 at 6:29 am

    I tried posting this comment yesterday, but it doesn’t seem to be going through for some reason. Rewriting the intro so that WP doesn’t reject it by saying that I already posted this …

    While the part about managing and reducing friction makes sense, a definition of CT based on professional practice does not. Students are often trying to learn things that professionals have already learned and moved beyond (and hence don’t appear in explicit regular practice).

    In Bootstrap:Physics, our goal is to teach computational modeling. In the context of studying Physics concepts, students make (experimental) observations, then write a function-based model (in code) to capture the dynamics of that concept. They can then test, simulate, explore, and predict with their computational model, contrasting the results with those from other physical models.
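A rough sketch of the kind of function-based model Kathi describes, written here in Python rather than the Pyret-based language Bootstrap:Physics actually uses; the falling-ball scenario and its numbers are invented for illustration:

```python
# Function-based model: height of a dropped ball over time (no air resistance).
# Students derive such a model from their observations, then use it to
# simulate and predict, comparing against measured data.

def height(t, y0=10.0, g=9.8):
    """Height in meters of a ball dropped from y0 meters, t seconds after release."""
    return y0 - 0.5 * g * t**2

# Predict heights at a few times, the way a student might before comparing
# with experimental measurements.
for t in (0.0, 0.5, 1.0):
    print(t, height(t))
```

The computational content the student learns here is the model itself, which is the point: that learning would be lost if we only handed students a finished simulator.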

    A practicing engineer/scientist often wants just the simulation with the ability to predict outcomes. That suggests only teaching students how to use simulation software to match needed practice. But in our case, the task of the student is to learn the computational underpinnings of the concept. This task is fundamentally different from what the professional needs (the professional hopefully learned this while they were a student).

    My concern with your definition is that it overlooks the computational explanations/descriptions of some phenomena. Working through computation here teaches something that the HCI perspective doesn’t strictly “need”, but that has value to someone in the non-CS discipline. A definition that rules out computational modeling throws out too much.

    (Your argument still applies to the tools taught to develop and use computational models, but that’s a separate point.)

    • 6. Mark Guzdial  |  April 30, 2019 at 7:20 am

      Thanks, Kathi. I agree with your point, but it didn’t come through here. I plan to rewrite this story for a couple of other venues, so I’ll make sure that I make it clearer in the next iterations.

      If we are designing for professionals and their tasks, then it makes sense to design a task-specific programming language for professional practice. When designing for learners, they are engaged in an activity in which they should learn — the learning is part of the “task.” That’s the explicit idea of Learner Centered Design (Soloway, Hay, Quintana, me). We also recognize that professionals have a set of terms that are comfortable for them, but learning those is an explicit part of the students’ learning task.

  • 7. Briana Morrison  |  May 4, 2019 at 2:03 pm

    So I like the idea of framing CT as the “friction”, but I’m not sold on your new definition because it seems to exclude the person who just wants to program something “because”….they want to create something cool, help them with a personal project, etc. Not to mention it limits it to scientists and engineers.

    I prefer more of a “distributed cognition” (a la Hutchins) model: when can the computer help me solve my problem (and when will it be more of a pain than a help) and how can I offload the problem to a machine?

    If I have to add 10 numbers I’ll likely do it by hand. But if I have 100 numbers and need additional statistics, I’ll turn to an application. If I have 1000 numbers that are already in a file, I’ll write a program to do it. It’s all about knowing what the right tool is and when to use it. I want the world to get to a point where people google for a “what tool will solve this problem” and a collection of answers come up – each a potential separate computing tool / environment / language / whatever and a series of questions (how much data, when do you need the answer, are you familiar with R, etc.) guide you to the right tool for the job. That’s computational thinking!
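The escalation in the comment above ends with “write a program to do it,” which can be sketched in a few lines of Python (the filename and one-number-per-line format are assumptions for illustration):

```python
# Sum numbers stored one per line in a text file: the scale at which
# writing a tiny program beats doing the arithmetic by hand.

def total_from_file(path):
    with open(path) as f:
        return sum(float(line) for line in f if line.strip())

# e.g. total_from_file("measurements.txt")
```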

    • 8. Mark Guzdial  |  May 4, 2019 at 2:22 pm

      Almost all definitions of Computational Thinking, and all that stem from the original Wing definition, are about problem-solving. The Cuny, Snyder, and Wing definition that I’m working from, that CT is the knowledge needed to solve a problem on a computer, is in that same lineage. Just because you want to create something is a GREAT reason to program — but that’s not Computational Thinking.

      Throwing everything that has to do with computing into the same “computational thinking” bin makes for an undefinable and untestable mess.

      • 9. Briana Morrison  |  May 4, 2019 at 2:32 pm

        I’m not trying to throw all things computers into CT – just using computers as a tool to solve problems (even if the “problem” is to create art…).

        I don’t see this as any different than expecting the engineer to know the right formulas / components / tools to solve the problem (which may, of course, be computational). Or medical doctors knowing which tool to use in the operating room (which also may be computational). We already have loads of computational tools that help people solve problems – what we don’t have is a good way for people to learn about them and then use them easily (what prior knowledge do I come with and how can I make the most of that). That’s what I love about your social studies participatory design stuff – the teachers want to know something, what’s the best way to find it out.

        I’m starting to explore Google Studio, Tableau, etc. and other data visualization tools with my teachers (with real world data sets) in terms of what can we find out. Is this CT? I would argue that it is – we’re using computational tools and power to extract knowledge from raw data. Are we problem solving? Maybe….

        • 10. Mark Guzdial  |  May 4, 2019 at 2:36 pm

          I think we are in agreement. Knowing what tool to use is part of problem-solving. What I’m doing in this definition is to put the onus on us, as tool developers: what the domain expert needs to know about computing in order to use the tool should be as small as possible. We are not doing a favor to anyone if we say, “Here’s a catalog of tools that match your problem, and most of these require hours of your life just to learn how to use them.” Instead, we should say, “Here are the n tools that meet your needs, and each is usable within 10 minutes.” I think we can do that, and in those 10 minutes — you’re learning CT.


