A new kind of program visualization tool: Making the student trace

September 29, 2010 at 4:05 pm

I’m very excited about this new tool, UUhistle. It supports exactly the kind of student activity I was thinking would be great as the practice component of exploring a bunch of programs in a worked examples curriculum.

Visualizing a program’s execution can aid understanding, but research suggests that visualizations are more effective when learners are actively engaged in manipulating or creating them. To this end, UUhistle supports a novel kind of highly interactive visualization-based activity, the visual program simulation exercise (or a VPS exercise for short).

In a VPS exercise, the student has to ‘do the computer’s job’: read given code and execute its statements in the appropriate order, allocating and using memory to keep track of program state. UUhistle provides the graphical elements that the student directly manipulates to indicate what happens during execution, and where, and when. Any aspect of execution that UUhistle can display can also serve as part of a VPS exercise: the student can create variables and objects in memory, evaluate expressions, assign values, manipulate the call stack, pass parameters and so forth. For instance, to assign a value from one variable to another, the student drags the corresponding graphical element with the mouse from the source variable into the target variable.

via UUhistle.org.
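
To make the quoted description concrete, here is the kind of tiny Python program a student might simulate step by step in a VPS exercise. (This is a hypothetical example of mine, not one from UUhistle's exercise set; the comments describe the manipulations the student would perform.)

    x = 2        # create variable x in memory and place the value 2 in it
    y = x + 3    # fetch x's value, evaluate the sum, and drag the result, 5, into y
    x = y        # drag y's current value, 5, into x (a copy, not a link)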


16 Comments

  • 1. Alan Kay  |  September 29, 2010 at 4:10 pm

    But of course, this is what David Canfield Smith’s PYGMALION thesis (Stanford, 1975) did (and it created the program via the demonstration, to boot). It was the first of this genre, and it is celebrated in Allen Cypher’s book “Watch What I Do”.

    Is this another example of people “just making something” without being aware of past great art?

    Cheers,

    Alan

    • 2. Mark Guzdial  |  September 29, 2010 at 4:20 pm

      I don’t think it’s the same thing, Alan. PYGMALION was a visual programming language. UUhistle visualizes traditional textual Python programs and, what’s novel, gives students practice tracing a program. PYGMALION offered us a new way to think about programming. But if we are going to teach traditional textual programming, UUhistle offers a new way of addressing the problem.

      Allison Elliott Tew’s dissertation really convinced me that the majority of students are not understanding the very basics of assignment, conditionals, and looping. We’re seeing a bunch of evidence (both Allison’s work and the “Camel has two humps” paper) that students read “a = b” as “there is a relationship between a and b such that future changes to b will also change a” rather than “copy the value of b into a.”
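
      A minimal Python sketch of that misconception (my example, not one from either study):

          b = 3
          a = b      # copies the current value of b into a; no ongoing link is created
          b = 7      # a later change to b...
          print(a)   # ...leaves a untouched: this prints 3, not 7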

      One of my hypotheses is that any decent programmer can trace a piece of code, but a student sees too few examples of tracing code in class to internalize it. Based on the visualization literature, I suspect that students have to trace code themselves to internalize how the fundamental notional machine works. But how do we give students that practice? UUhistle is a new take on how to do that. I’d really love to try this on the iPad with a gestural interface, but there’s no Java for iOS, unfortunately.
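
      As a sketch of the kind of tracing practice I mean (a made-up exercise, not one of UUhistle’s), a student would fill in a trace table like the one in the comments below while stepping through the loop:

          total = 0
          for i in range(1, 4):
              total = total + i
          print(total)           # prints 6
          # trace:   i | total
          #          - |   0     (before the loop)
          #          1 |   1
          #          2 |   3
          #          3 |   6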

      Cheers!
      Mark

      • 3. Alan Kay  |  September 29, 2010 at 4:45 pm

        Not to keep up with the prior art, but Adele’s first husband, Alex Cannara, did this with machine code at Stanford in the early ’70s, and it did help.

        An interesting question is: if this is good enough to work, shouldn’t the students just program in it?

        And why try to prop up a very, very bad idea in Python (using “=” to mean assignment)?

        Is this one of those “well, people are using Python so we have to teach it” arguments?

        Going back in the past, substitute “Java” or “C++” or “Pascal” for “Python” to get a long string of bad languages to try to teach beginners with…

        Cheers,

        Alan

        • 4. Mark Guzdial  |  September 29, 2010 at 5:01 pm

          I don’t know that it’s good enough to work. It’s good enough to try.

          The original work showing that students don’t get assignment was in Pascal with “:=”, and the “Camel” work was with Java. I don’t think that the language matters at this level. We don’t know how to teach the notional machine, for any language, at a level that we can make work at scale. That’s why I find tools like UUhistle interesting.

          Cheers,
          Mark

          • 5. Juha Sorva  |  September 30, 2010 at 10:55 am

            Safe to say that no one knows precisely how well student-driven execution in UUhistle is going to work. But it seems a promising solution, if dealing with the notional machine of a Python-like language is the problem.

            Some students are clearly helped by UUhistle way more than others are, and there are challenges for the teacher: how to help students ‘get’ what the visualization is and how to make the whole simulation activity relevant to their goal of learning programming (or rather, the ultimate goal of doing something meaningful by means of programming).

            What should be quite interesting to explore is the relationship between this new learning activity and students’ location in the two-humps graph. Are we mostly helping the strugglers, or the ones who were going to pass with flying colors anyway?

  • 6. Katrin Becker  |  September 29, 2010 at 4:46 pm

    That looks a lot like the visualization tool one of my 4th-year project students built in 2002. It had a full graphical interface (with colors). We were teaching Pascal as the first language at the time, and this tool worked on Pascal programs. It showed variables, the run-time stack, the whole nine yards.

    There was very little interest in it at the time.

    In 1981 I developed a set of parsing exercises done by hand on paper (again using Pascal). That’s another area where students have trouble: compilers and interpreters are magic to most of them, but there are many classes of problems that are easy to recognize and fix if you know what the compiler’s doing, and devilishly hard if you don’t.

    Around the same time, we also developed a live-action re-enactment of the execution of a small assembler program. Different students got to play the logic unit, ALU, program counter, memory, etc. Once you get them past feeling silly for pretending to be different parts of the computer, it works pretty well – and they remember it.

    • 7. Mark Guzdial  |  September 29, 2010 at 4:57 pm

      I don’t think that the visualization is novel. There are lots of tools for showing what goes on under the hood. I don’t think that students learn it by just watching, though. It’s the student doing the tracing that is novel and interesting here. Yes, playing computer is a great way to do this. But how many programs can you do like that in class? Enough to reach automaticity, setting the students up for abstraction? That’s where a tool comes in.

      Cheers,
      Mark

  • 8. Alan Kay  |  September 30, 2010 at 12:14 pm

    Middle Out?

    Here’s one that I’ve wondered about many times since, but I started so long ago that I lack perspective.

    In the early sixties virtually everyone (and this is probably literally everyone) learned to program in some machine code (even though FORTRAN was around and COBOL almost was, and ALGOL was being worked on).

    This was the “middle” because there was no detailed explanation of “the logical design of a digital computer” (as it was called then). But that minor mystery aside, the machines hardly had any features, and the first course in programming was a week of 8-hour days (in the Air Force, for enlisted folk). During that time you learned how to write simple data processing programs that would read cards (from the card reader or from tape), do something with them, and print out the results on the line printer.

    Most (but not all) of these machines had index registers, and my memory is that loops were first taught “bareback”, by storage manipulation, with indexing via index registers introduced afterwards as a cleaner way to do a lot of things.

    All subsequent learning was easy, at least from the POV that everything was grounded in what the hardware was actually doing.

    When you learned a new computer you looked at what it did with what it had before looking at the languages it had or what the macroassembler could do.

    All higher level languages were easy to learn, even LISP, though you had to look at LISP in LISP to really get its significance as something special.

    A wonderful program was Val Schorre’s Meta II* (ca. 1964) for the 1401 (8K bytes!), which introduced top-down grammars as programs; his short paper contained Meta II in itself, plus two complete examples, one for a hefty subset of Algol.

    That same year Peter Deutsch (age 15) did an interactive version of LISP on the PDP-1 in about 2000 instructions of machine code, and wrote a classic paper explaining how he did it.

    The two papers just mentioned, plus knowledge of simple machine architecture, opened the world of bootstrapping your way from one execution system to another; there was absolutely no mystery about what was going on.

    Now, aside from “relevance” and the fact that the students think they know what they should learn, what do you think about starting the first course in programming this way? You could get out of machine-oriented languages by the end of the first semester…

    Only half serious, but this keeps popping up as I read about myriads of students who can’t understand or visualize assignment statements and other computer-oriented semantics.

    And, back to the subject at hand, a good machine-code debugger (like RAID at the Stanford AI Project in the ’60s) showed the registers and the state as the program was single-stepped; again, absolutely no mysteries.

    Cheers,

    Alan

    *http://delivery.acm.org/10.1145/810000/808896/pd1-3-schorre.pdf?key1=808896&key2=1182685821&coll=GUIDE&dl=GUIDE&CFID=103894329&CFTOKEN=91048536

    • 9. Alan Kay  |  September 30, 2010 at 12:29 pm

      P.S. Here’s the URL for Peter’s LISP implementation — one of the most beautiful and compactly useful machine code programs ever written.

      The paper which precedes it is also very well written.

      PDF: DEC.pdp_1.1964.102650371.pdf

    • 10. gasstationwithoutpumps  |  September 30, 2010 at 1:49 pm

      Actually, the top-down vs. bottom-up approaches to programming are still hotly debated. Our Computer Engineering department starts students with machine architecture, assembly language, and C, much as Alan describes (though with newer hardware). Our Computer Science department starts with Java (reluctantly having added some lower-level courses, so that students with no previous programming experience can take a CS course; these lower-level courses are in Java, C, or Python). The various tracks merge into the Java-based CS track, though I believe that the core data structures and analysis of algorithms course then switches to C++. I don’t think that the CS students are ever required to take any courses outside the Algol-derived structured programming family except for one assembly-language course. (Oh, the BS students also have to take a course in comparative programming languages. I wish I could tell you what is in that course, but the instructor hides the web pages behind a password-protected interface, a practice I find to be very bad for maintaining any sort of cohesion in the curriculum.)

      • 11. Alan Kay  |  September 30, 2010 at 2:12 pm

        A benefit back then was that the architectures were pretty non-standard, so to be a programmer you had to be willing to learn many different machines with different instruction sets.

        This paid off later on when higher level languages started to happen — most people learned at least 20 HLLs without getting imprinted on the first one they learned.

        But there was a lot of hysteresis in going from LLLs to HLLs; lots of low-level programmers just didn’t want to make the move. Given where we are today, we can see this with C programmers (and, since I think Java is now an LLL, with Java programmers).

        However, Dartmouth BASIC (the first one, implemented on a GE computer) did have the “Lorenz duckling effect”: people who learned it first tended not to want to learn other features in other languages.

        In all cases, this could be an artifact of rather different types of people doing the learning.

        Cheers,

        Alan

      • 12. Mark Guzdial  |  September 30, 2010 at 2:36 pm

        Agreed that it’s still a hotly debated topic. Yale Patt’s hardware-first approach attempts to go from the transistors, up through logic gates, to registers, machine language, and finally C. It isn’t clear how well it works, or how much the instructor needs to know to implement it well/correctly. One challenge is maintaining student focus when they think they came to learn about computing, and you start out talking about switches, and they don’t see the connection between the two. Today, our experience of computing (e.g., the Wii and cell phones) seems impossible to derive from the low-level components, to a student who doesn’t understand either end.

        I think there’s a confounding variable about the early days of HLLs, Alan. There weren’t many of you, you were all smart, and you were all good in mathematics. My bet is that the top 10% of our intro CS classes looked like the early computer pioneers in terms of number, demographics, intelligence, and prior knowledge (though maybe not point of view), and I’ll bet that they could learn the way you describe. What to do with the other 90%? What can be done that would work for most of the students? I think we have adequate evidence today that students aren’t learning the notional machines.

        Cheers,
        Mark

        • 13. Alan Kay  |  September 30, 2010 at 3:52 pm

          Hi Mark,

          I don’t know how to answer your question. If the profession were medicine, it would be easy: don’t let the 90% get anywhere near human bodies!

          In the Air Force at Air Training Command, none of us had college degrees, but we had to pass a screening test (devised by IBM) before we could get into the program and receive training. I think this test must have been pretty good, because everyone did well in the 40-hour week of first-level training.

          From reading and talking to college professors, it seems that an astounding amount of remediation takes up time and energy in the first years of college (the colleges love this because they get more head count and income this way, but it is contrary to what college is supposed to be about).

          On the other hand, it seems as though high school should not be the last chance for getting prepped.

          Probably the best thing to do is to embrace reality by finding ways to separate remediation out and not let it affect the standards of the mainstream courses (I’m talking about more than just courses for prospective majors).

          If the 90% really want to learn computing, then some of them will probably have to spend an extra year or two doing the prep that is needed to handle a mainstream course.

          This is certainly the way it works in sports and music, and there is no stigma in it at all. You just can’t really participate until you work up some chops, and there’s nothing wrong with spending some extra time at it.

          Best wishes,

          Alan

          • 14. Mark Guzdial  |  September 30, 2010 at 3:57 pm

            Hi Alan,

            That “remediation” is where I see my challenge. What would you do in that remediation? How would you teach them the notional machine? This is particularly important when we think about preparing high school CS teachers, who are unlikely to have the chops of a CS major. While it’s a big problem that students interested in education tend to be from the bottom half (sometimes bottom quartile) of the whole student body, that’s the reality of it. Now, how do we teach them about computing so that they can teach the students?

            Cheers,
            Mark

  • 15. Alan Kay  |  September 30, 2010 at 4:55 pm

    Hi Mark,

    I don’t know enough about these students to come up with a plan. But we can be justified in guessing that they are not uniform in their backgrounds, attitudes, motivations, etc.

    It took us many years to learn enough about 5th graders to come up with a scheme and environment that would be very successful with more than 90% of the kids. I have a strong feeling that “mediation” in 5th grade is likely to be more effective than “remediation” in 13th and 14th grades.

    But we should ask questions such as “if this were basketball (or some other well established sport or musical instrument), and these students had never seen it before and were learning it for the very first time, and they were in the bottom 50% of ‘athleticism’, then how many hours will they need to put in per week and year to get fluent at basketball, and gain the depth needed to teach it in high school?”

    One can get passably fluent in lots of things, talent or no talent, in a few thousand to 5000 hours (1000 hours is more than 2 hours a day for a year). Those golden hours are hard to come by in college and out in the vocational world — in part because there is enormous competition for them in a learner’s world.

    Of course it takes more than just hours, but for most such learners, the lack of good hours of “practice with reality” will be pretty difficult to surmount. (This is said by a former professional musician of less than spectacular talents who had to practice every day regardless of the amount of playing done in gigs.)

    But I’ll think about your question — and its interesting and difficult side-conditions — and try to come up with a few ideas in the next several days.

    Cheers,

    Alan

    • 16. Alan Kay  |  October 1, 2010 at 10:58 am

      More thoughts on “teaching the 90%”.

      I think of this as “teaching the 80%” because the top and bottom 10%s have their own special properties.

      We can get many clues by looking at the studies of what it means to be “functionally innumerate”. Innumeracy is not primarily about being unable to recognize numbers or to add them together; it is much more about the differences between “participant” and “spectator”. For example, “participants”, when confronted with numbers in some context, automatically do things with them, e.g., normalize them in some way to “see what they mean” rather than “see what they are”.

      Thornton and others at Tufts years ago devised a pretest that would predict final grades in first-year physics (a pretty interesting test and result).

      The IBM test, made for the Air Force, predicted success but not failure (so we don’t know how many who failed this test still could have learned to program).

      I would guess that the ability to deal with “chains of propagating causations” in any mechanical or symbolic endeavor would predict part of how easily programming is learned. The other part might have to do with “abstractions as both metaphors and mechanisms”. There are tests for the “chains”, and probably for the abstractions. It seems as though educators would have (should have) searched for and performed these tests many years ago.

      I think that learning and dealing with the motivations of 5th graders is a lot easier than with college students. With the latter, for many reasons, I’ve found it is a good idea to get them to write several “brain barfs” (in the form of emails) each week on questions I give. They are reasonably motivated to do this because I have the class choose subjects and questions for me to write on, and I write two brain barfs a week also.

      One reason to do this is to get them to start articulating ideas in stronger frameworks than the oral ones they unfortunately are usually still mired in. After a few weeks of this I start giving them suggestions about improving how they write, etc. Another reason is that this is a great way to get a sense of the style of each student, and this leads to modifications of how to go about making the course work.

      I haven’t tried teaching programming for the 80%, but a number of the courses I’ve taught recently at UCLA are essentially about “education of the future” and the class demographic is widespread. Because many of the examples are past experience with children, I get the class to learn Etoys and do some of the examples the 5th graders do, and then get them to come up with projects that have educational content that could be better learned if the children could make models in Etoys.

      We see the 10%/80%/10% split in the results of these students. The good news for Mark is that they all easily learn to do things in Etoys, and, like the children, have no trouble learning the Etoys version of assignment, variables, etc. (This seems to be because many of these concepts are deeply situated in examples that are concretely embedded in the subjective worlds of the students.)

      However, the big differences I’ve seen lie in the ability to invent and design. Many of these college students (and the elementary school teachers we’ve had experience with) have a lot of difficulty in two main related areas (maybe it’s one area), imagining:
      – the connections/chaining of operations to accomplish something; and
      – the projective design elements (this could be considered part of the first, or as a separate difficulty).

      If I were designing a curriculum, I’d first try to find out more about who the students generally are and what they can generally do.

      If they fit the picture above, then I would try to work out a bunch of experiences (including writing ones) to help them gain stronger skills in the two areas of difficulty above.

      This shouldn’t be a big deal. When I was teaching guitar 50 years ago, some of my students were “rhythmically challenged”; the remedy is that they have to build something in their minds to handle what for many is automatic. This is not unlike creating a “stop-sign watcher” when you are learning to drive a car: genetics didn’t supply it, and it’s needed. I had some insight into the rhythm problems because I had them as a teenager, and since I wanted to play with the big boys, I had to do a lot of extra slogging to build a workable rhythm agent.

      I think the 80% can be guided to build the foundations and skills they need to do reasonable programming (or reasonable writing, or reasonable musical playing, etc.)

      The deeper questions are whether we can come up with what they need to do, and whether they can muster the will and motivation to do the extra work.

      Cheers,

      Alan

