The meaning of computer science: What our students are thinking about

July 26, 2009 at 1:44 pm

If you haven’t seen these blog posts by “The Wicked Teacher of the West” yet, I recommend them.  The author is a middle school computer science teacher who I think is terrific: smart, caring about teaching, and trying hard to learn something new.  The challenges she had in her workshop are probably like those of our best students when they are struggling in our computing classes.
http://geek-knitter.blogspot.com/2009/07/lost-in-syntax-part-1-or-omg-im-going.html
http://geek-knitter.blogspot.com/2009/07/lost-in-syntax-part-2-or-omg-im-going.html

I also know a bit about the workshop that she was in, and I also hold it in high regard.  It’s got great ideas in it, and there are definitely things in that workshop that Wicked Teacher might use in her classes.  Here’s the rub as I see it: those two things weren’t connected for her.  As she put it, “I felt like I didn’t ‘get it’ but other than saying that I felt like I had no context, I couldn’t articulate what I meant.”

I’m currently working on the list of “Big Ideas of Computer Science” for the new APCS “Computer Science: Principles” Commission (and connecting these to Peter Denning’s Great Principles of Computing), and for the last few days, I’ve been working on the lecture slides for the Second Edition of our Python Media Computation book.  The combination of these blog posts and these activities has me wondering, “How do we explain the big picture to students, especially when we don’t agree on the big picture?”

Whenever I hear someone say matter-of-factly, “Computer science is really just engineering,” I know that they haven’t really thought about what computer science is, at least not in the terms that students are looking for.  I think the workshop leaders had a reason for asking the Wicked Teacher to do all that she did.  But not only did they not tell her that reason, I don’t think it was the same one that she was looking for.

The workshop leaders saw the meaning of the code as its correct execution.  What the Wicked Teacher was looking for was a reason to care.  The state education officer who critiqued me for not thinking hard enough about matching standards to computing principles was right: if I want teachers to care about computing, I need to show them why it’s important, in terms that they consider important.

This connects to my Python slides and to the Great Principles. In one example in the book, we show how to decrease the red in a picture with decreaseRed:

def decreaseRed(picture):
  for pixel in getPixels(picture):
    value = getRed(pixel)
    setRed(pixel, value * 0.5)  # halve each pixel's red value

Then we parameterize that function further:

def changeRed(picture, factor):
  for pixel in getPixels(picture):
    value = getRed(pixel)
    setRed(pixel, value * factor)  # scale each pixel's red by factor

Why do that? Why should we add the additional parameter to the function? I completely believe that this is an important part of introductory computing that we should teach. But what story do we give students to make this step meaningful?

In our Python book, we give the traditional, engineering-based explanation: By adding the additional parameter, we make the code more reusable so we can later build even more complex things more easily.  But what if someone doesn’t care about building more complex things, or building anything later?  What if they care about other things that are just as valuable?
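To be fair, the reuse story is easy to show in code. In a minimal sketch (these helper functions are mine, not the book’s), other red effects become one-liners once changeRed exists:

def decreaseRed(picture):
  changeRed(picture, 0.5)  # halving the red is now one special case

def increaseRed(picture):
  changeRed(picture, 1.5)  # so is boosting it

def clearRed(picture):
  changeRed(picture, 0)    # and so is removing the red entirely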

I realized that I really could tell a story about “bindings” here, about associating names with values.  Peter Denning’s Great Principles touch on some of these.  By delaying the association of a value for factor, I maintain flexibility and expressiveness.  That’s a path that leads me to thinking about a wonderful set of abstractions that are powerful and unique to computer science: scopes and namespaces, functions as first class data objects, lambda (as the value for a function binding), creating new kinds of control structures, aggregating data and procedures together to create objects, and the power of “messages” where the function to be invoked for a given “name” is decided later, by the receiver.  That might be engineering, but it’s closer to mathematics to me.
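Here is a hypothetical sketch of where that delayed binding leads (mine, not the book’s; makeChanger, halveRed, and doubleBlue are invented names, and only the Media Computation functions getPixels, getRed/setRed, and getBlue/setBlue are assumed). Because functions and the factor are all just values that can be bound to names, a changeRed-like effect can be assembled at the last moment:

def makeChanger(getter, setter, factor):
  def change(picture):
    for pixel in getPixels(picture):
      setter(pixel, getter(pixel) * factor)
  return change  # a function is an ordinary value that we can return

halveRed = makeChanger(getRed, setRed, 0.5)      # does decreaseRed's job
doubleBlue = makeChanger(getBlue, setBlue, 2.0)  # a new effect for free

None of the bindings for getter, setter, or factor are fixed until makeChanger is called; that delay is exactly what gives the abstraction its power.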

What many of my most serious students really care about is exploring the effects of this new function.  What visual effects can you get by manipulating red?  When do you want to?  What are the power and limitations of a red/green/blue color model?  What can I say easily, and what is more complex?  They care about the power of representations, about choosing a particular model, and about the empirical data resulting from experiments with these programs.  Now computer science looks like science.  That’s yet another, equally valid meaning for adding the factor parameter.
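One experiment in that spirit, as a hypothetical sketch (assuming Media Computation’s pickAFile, makePicture, and duplicatePicture): vary factor and measure the average red that actually results. Because red values top out at 255, a factor of 2.0 does not simply double the average, and discovering that is genuine empirical work:

original = makePicture(pickAFile())  # load a picture to experiment on

def averageRed(picture):
  total = 0
  pixels = getPixels(picture)
  for pixel in pixels:
    total = total + getRed(pixel)
  return 1.0 * total / len(pixels)  # average red level, as a float

for factor in [0.0, 0.5, 1.0, 1.5, 2.0]:
  copy = duplicatePicture(original)  # keep the original untouched
  changeRed(copy, factor)
  print factor, averageRed(copy)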

The Wicked Teacher was looking for a meaning in what she was learning, and I suspect that it’s a different meaning than what the workshop leaders were offering.  There are several efforts, like Peter Denning’s and those of the APCS Commission, to define what “Computing” means. The real challenge that we face (in these efforts, and as teachers) is to offer a variety of meanings. We want to encourage deep thought and engagement in the power of computing.



8 Comments

  • 1. purplespatula  |  July 26, 2009 at 2:24 pm

    I see tension between what you discuss in this post and some of your other posts on who should be high school computer science teachers, how to train them and how to get more of them. You present a couple of convincing and satisfying reasons for adding the ‘factor’ parameter. I’m sure you’d be able to translate those ideas into ‘kid speak’ pretty easily and leave your students intellectually satisfied and trusting in your knowledge and skills.

    Now imagine a teacher who’s had a few programming classes but doesn’t quite understand the subtleties of namespaces or scope or lambda, and who gives some kind of mangled explanation, using one of those terms, of why we care. It most likely won’t fly with students and could even be worse than no explanation.

    I think conveying the big ideas you speak of as justification for what we do in CS classes is very important. I’m very interested to see what training will be made available so that non-computer scientists can satisfyingly address deeper student inquiry.

  • 2. Mark Guzdial  |  July 26, 2009 at 2:37 pm

    It’s a great point, Purple! For all of STEM (+C for Computing) learning, one of our biggest challenges is teaching the teachers enough about the content. When we have trouble teaching the computing practitioners, training the trainers feels like a double challenge — we need to teach both the content and how to teach the content.

  • 3. Ian Bogost  |  July 26, 2009 at 3:25 pm

    There’s something important in this post that is left implicit: computing is bigger than “computer science,” and computer scientists can no longer be trusted to decide or police what counts as computing, either in terms of practice or application.

    I think this represents an enormous identity crisis for the discipline, which wants to have its cake and eat it too (“Computing applies to everything” + “We get to decide what real computing looks like”).

    I’d go so far as to suggest that all the so-called problems in CS enrollments and successes are symptoms of this overall trend. At institutions like ours we often argue that “impact” is the prime measure of success. But real impact, the apotheosis of impact, is the commonplace. Real intellectual success means making one’s ideas seem obvious, uninteresting even. This requires humility, a virtue that is rare among the driven, prosperous elite of any academic field.

    • 4. Mark J. Nelson  |  July 30, 2009 at 3:45 am

      I think this is the crux of the problem: computing, like math, has grown to the point where it pervades everything. But the development of the field of math is not a promising one for computer scientists: mathematicians have relatively little control over what “real math” looks like. Oh, sure, they still own pure math, like computer scientists will undoubtedly forever own big-O analysis. But most math gets done by non-mathematicians, and increasingly whole branches of math, especially applied math, are independently developed in other fields (especially computer science and physics).

      On the other hand, this isn’t really a CS-specific problem. One might argue that philosophy pervades everything, and some philosophers would indeed argue that they really ought to own some areas that other fields research. When computer scientists write about causality, aren’t they doing philosophy in some sense? And in some cases, the philosophers’ criticism would even be justified: there is some pretty shoddy philosophy that gets done in computer science (AI being my area, we have our share of amateur philosophy of mind).

      So I think there are sort of two different questions: 1) in theory, is computing a coherent field that computer scientists could plausibly stake a claim to; versus 2) in practice, is computing maintainable as a coherent field, given that everyone is going to be doing it?

  • 5. Hank Greenberg  |  July 26, 2009 at 6:27 pm

    Parameters?

    You don’t need parameters until you need them. Be agile. So you use 0.5 because that’s the problem you’re solving. You don’t introduce/refactor the constant into a parameter until you have a different problem that needs another value. Then when you have such a problem, you replace the 0.5 with 0.25 and notice the methods are the same except for that. You factor out the common code by introducing the parameter and you have one method that solves two problems.

    But you don’t introduce the parameter because it makes code more re-usable. You have no clue whether you’ll need the re-usability, and the cost of an extra parameter doesn’t necessarily warrant the added complexity.

    Be agile, read about it, refactor, and don’t make things more general when they don’t have to be just because they can be.
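    In code, that progression might look like this (a sketch, assuming the Media Computation functions from the post, plus makePicture and pickAFile to load a picture):

    picture = makePicture(pickAFile())  # step 0: get a picture to work on

    def decreaseRed(picture):  # step 1: solve only the problem at hand
      for pixel in getPixels(picture):
        setRed(pixel, getRed(pixel) * 0.5)

    # Step 2: a second problem needs 0.25; the two methods would differ
    # only in the constant, so factor the constant out into a parameter.
    def changeRed(picture, factor):
      for pixel in getPixels(picture):
        setRed(pixel, getRed(pixel) * factor)

    changeRed(picture, 0.5)   # the old problem
    changeRed(picture, 0.25)  # the new one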

    • 6. Mark Guzdial  |  July 27, 2009 at 8:00 am

      Hank, I agree about not needing parameters until you need them. I use a similar approach of letting the problems drive the learning topics. I’m interested in your reliance on Agile Methods for your terminology (e.g., do you talk to your students about “refactoring” early? Do they know what that means?) and approach. Do you use this in your class? What kind of response do you get? My concern is that it’s engineering-centric. Do you mostly have engineering-oriented students in your classes?

  • 7. Mark Miller  |  July 26, 2009 at 10:59 pm

    I liked this post a lot. Please keep going with pursuing meaning in CS, and the science. You’re on to something powerful here.

    I agree with you that mathematics is involved in what you talked about, and I think a good understanding of it is critical to understanding the significance of the example you used. As I read through your post, what came to mind were the concepts of multiplication and amplification. One could contrast this with “pegging a circuit” at a fixed signal level. By introducing the abstraction, you give yourself a “control” that you can vary. I imagined that using this example I could set up an oscillating function: I could start with 0.5, decreasing the red, then invert it and increase the red, setting up a sense of undulation. To further cement the concept, I was thinking of an analog metaphor, an amplifier. If you had a stereo playing music, you could talk about reducing the volume by half, then reducing it by half again, then again, and again. Then doubling it, again and again. If nothing else, if it were set up slightly differently, it would be a lesson in the multiplication of fractions and what that can model.
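    Sketched in code (hypothetically; this assumes the changeRed from the post plus JES’s repaint, and note that halving then doubling is lossy because red values get rounded and clamped, so the picture drifts over many cycles):

    def undulateRed(picture, cycles):
      for i in range(cycles):
        changeRed(picture, 0.5)  # turn the red "amplifier" down
        repaint(picture)         # show the dimmed frame
        changeRed(picture, 2.0)  # turn it back up
        repaint(picture)         # show the (approximately) restored frame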

    @Ian:

    You have a very valid point here. I agree that “outsiders” need to be brought in to the field. The problem is that computer scientists need to be very clear about what Mark talks about–the meaning and science in CS, and be open to the other ways in which these meanings can be expressed, or else the outsiders they introduce to computing may come to understand limited concepts of what computing’s potential is. I think computer scientists do have something to teach as a basis for others to understand this field, but yes, I think that outsiders have ideas to teach us as well.

  • 8. Alan Kay  |  July 27, 2009 at 9:27 am

    This difficulty was present in the early 60s when I started working in computing (as a programmer in the Air Force), and it was one of the main goals of the ACM to take the field from ad hoc tinkering and uncertain engineering to something more like a real scientific, mathematical and engineering discipline.

    I’ve talked previously about day to day programming being about “problem solving within coping”, and this is the way it was then also — to the extent that (for most of us in the AF) what was in the CACM and other journals was so different as to be incomprehensible and seem irrelevant.

    So the surprise was monumental when I started grad school a few years later, in an ARPA research project, to find that really important ideas about computing had appeared that were absolutely needed but were quite invisible in the workaday world.

    I think this is still the case, only more so today. The current deadly embrace between businesses and universities makes it very difficult for “needed but invisible” stuff to be surfaced in a way that the students will be what their future jobs *need* as opposed to what their future jobs say they *want*.

    And, because we have a design field (instead of a physical science where the universe has a lot to say about “what’s actually going on”), we have to rely on opinions about styles. And this problem is large because the forces that pull down badly designed bridges are light enough that really badly designed monster gossamer horrors can be kept alive in computers with enormous amounts of life support and intensive care.

    Throw in that these messes are mostly invisible (the programmers can look at about 50 lines of code at a time out of hundreds of millions!!!), and you’ve got today.

    One of the best experiences I had in grad school was with the irascible Bob Barton (the genius designer of the Burroughs B5000 computer, one of the top gems of the last 60 years). In his advanced systems design course he handed out a reading list and said, “There are a few things known about systems design, they are written down here, and I expect you to read all and understand all.” Then he said, “But my job is to firmly disabuse you of any fondly held notions you might have brought into this classroom.”

    And this is what he did. He simply took our beliefs about computing one by one and demolished them over the course of the semester. At the end, those who survived were free to actually think about computing from scratch as something that was made from the simplest stuff, and had the most general reach and possibilities.

    This allowed design styles that were not like the ones we were fluent in to be looked at (and also criticized). Under this general criticism, some of them started to look a lot more powerful than some of the others……

    This is not the only way to teach, but there’s a lot to be said for having a lot of criticism in any field that has scientific pretensions, and even more criticism for any field whose basis is synthetic design from materials so general that they don’t indicate good ways to go.

    This is also the simplest way to assess teachers in such a field: are they good critics, or do they lack what is needed to be good critics? What is the general answer here for computing, even in universities?

    (Those who like to track epistemological shifts in human thinking will recognize that one of the biggest happened in the 17th century when *knowing* moved from “remembering” to “careful skepticism”. Schools have never liked this for many reasons. They like to teach “solid knowledge” and assess by seeing if the students “have got it”. Note that if the students are not taught how to be carefully skeptical — and they generally aren’t — they will wind up being “uncarefully sceptical to uncarefully dogmatic”, which is the general and disastrous condition today.)

    Barton’s ploys worked largely because he was a genius who had done something marvelous (he had great moral authority because of this), and partly because he was the most charming (though cutting) of human beings (kind of a leftist William F. Buckley). As Dave Evans said of him “We don’t care if they are prima donnas as long as they can sing!”. Too many who try to teach have not done enough to gain perspective or moral authority, nor are many of them charming enough to demolish their students’ beliefs and not clear their classroom.

    But I still like the Barton approach here, because it gets around many of the problems of both design fads and legacy traditions. This *outlook* allows actual problems to be looked at in more ways, and helps get rid of misplaced neo-darwinism that falsely concludes that the world we have today is a good one (darwinian processes don’t optimize, they find fits, and the fits don’t have to be particularly good unless the environment is set up for this).

    Just to take one example from today, it amazes me that software people don’t take the Internet (not the web) much more seriously. What other artifact in computing has ever scaled so gracefully from so simple and powerful a kernel? Here we have an example of a multi-billion-object pure object system (the hardware computers are the objects) using pure messaging to interact, in such a way that no object can crash any other object (there has to be bad SW *inside* a computer for that to happen). This system never has to be stopped to fix or improve it (it can and has replaced all its atoms and bits several times without needing to stop).

    And so forth. And yet most software people think that calling fragile procedural code “object-oriented” is somehow going to bestow magic! If they could only learn to criticize their own artifacts …. (but that’s where some healthy Bartonism needs to enter the picture again).

    So my first plank in a new computer science platform would be “teach the kids how to criticize each and every aspect of what has been done and what is going on right now”.

    Cheers,

    Alan

