Program your Apple II! Why not program today?

February 20, 2010 at 4:29 pm 20 comments

Ian Bogost has been reading old Apple Computer ads (from back when they were Apple Computer), and noting how users were encouraged to actually program their Apple II computers.

Once you’ve unlocked the power of the desk-top computer, you’ll be using Apple in ways you never dreamed of. You don’t want to be limited by the availability of pre-programmed cartridges. You’ll want a computer, like Apple, that you can also program yourself. … The more you and your students learn about computers, the more your imagination will demand. So you’ll want a computer that can grow with you as your skills and experience grow.

via Ian Bogost – Pascal Spoken Here.

Where did that attitude go?  Is it because there are so many “pre-programmed cartridges” (there’s an App for that!) that there is little reason to program?  Or is it that programming is too hard? Or that knowing about the internals of a computer is now considered too esoteric or unimportant?

Ian and I have been exchanging email about this, and he reports that he is getting a lot of comments along the lines of: “I don’t have to know how my car works.  Why should I have to know how my computer works?”

Ramesh Jain once gave me a good argument in response.  Marshall McLuhan said that all tools that humans made were extensions of human faculties.  A bicycle extends our legs, to let us move faster and further before tiring.  A car extends both our arms and our legs, because now we can carry more while we go faster and further.

A computer extends our mind.  While it’s certainly possible to use one’s mind without attempting to know oneself, it’s more powerful to be reflective, to consider how one’s mind works, and to identify one’s strengths and failings in thought.  We can use a computer without understanding it, but not as well, and not as effectively.  You can use it for typing and counting and talking without reflection, but if you want to use a computer for thinking, you are better off knowing something about how it works and about how to go beyond those “pre-packaged cartridges.”  When it comes to thinking, there’s no app for that.



20 Comments Add your own

  • 1. Alfred Thompson  |  February 20, 2010 at 8:25 pm

    I posted something that is closely related to (and inspired by) this post on my blog (http://blogs.msdn.com/alfredth/archive/2010/02/20/not-being-able-to-program-a-computer-is-like.aspx)
    The analogy question is one part of it and what I blogged about. But the other part is that it was just so much easier to program in the early PC days. I think that Scratch/Alice and some others are easy, and perhaps easier, but they are somewhat domain specific, where BASIC and Pascal in those days were more general purpose. The closest thing to those early days that lets beginners create more general-purpose software is probably Small Basic, and after that Visual Basic. Unfortunately, any version of BASIC these days gets a bad rap, and beginners are scared away from them. Of course I admit to being a fan of BASIC from way back, so I think these are reasonable tools. But the technical 1337 (cool kids) are all pushing PHP, Ruby, Perl, and Java, which are not quite as approachable for most beginners.

  • 2. Owen Astrachan  |  February 20, 2010 at 9:28 pm

    I don’t think this is about ease of use, in terms of BASIC being easy and Ruby being hard. What could you do with BASIC that beginners were doing? Print your name a thousand times, or write a text-based adventure game. Or peek-and-poke your way to clumsy graphics that were still pretty cool. Folks were saving code on cassette tapes, and then on 5.25″ floppies. The bar was low, so it was fun stepping over it. Now there’s WoW, iPhone apps, the Internet, YouTube, …. The bar is incredibly high. Why would I want to program a text-based adventure game or print my name 1000 times?

    When I was 25 I walked by a Radio Shack and typed in something like

    10 print “I Love Radio Shack … and you”
    20 goto 10

    then I ran it and stepped back. People stopped to watch the screen print over and over and over… What can you do today in 20 seconds that people will watch? It’s not about the simplicity; it’s about expectations. You could be advanced in 1983 after programming for 100 hours. Now it takes 10,000 hours.

    • 3. Mark Guzdial  |  February 20, 2010 at 9:56 pm

      I can’t match that two-liner for minimal effort and for (at the time) coolness. But we *can* get to something interesting in a pretty small amount of code:

      def negate(picture):
        for pixel in getPixels(picture):
          newcolor = makeColor(255 - getRed(pixel), 255 - getGreen(pixel), 255 - getBlue(pixel))
          setColor(pixel, newcolor)

      Certainly, this is a “trick,” but the “trick” is (a) providing scaffolding so that something interesting can be done quickly and easily and (b) choosing a domain that has a fairly low bar of interestingness. Few students know how an image is negated, so it’s a visual surprise when it happens, and intriguing that it’s so easy to make happen. (We can do similar easy and surprising results in audio.) I’m sure that we can invent similar “tricks.”
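      [Editor's note: the audio analogue mentioned above can be sketched in plain Python. The Media Computation functions (getSamples, setSampleValue, etc.) are not shown here; instead, a sound is represented as a hypothetical plain list of sample values, which is an assumption for illustration, not the actual JES API.]

      ```python
      def negate_sound(samples):
          """Flip each sample around zero -- the audio analogue of negating
          an image's colors. It sounds identical, which is its own surprise."""
          return [-s for s in samples]

      def reverse_sound(samples):
          """Play the sound backwards: a tiny change in code, a dramatic change in sound."""
          return list(reversed(samples))

      # A toy "sound": a short ramp of sample values.
      samples = [0, 100, 200, 300]
      print(negate_sound(samples))   # [0, -100, -200, -300]
      print(reverse_sound(samples))  # [300, 200, 100, 0]
      ```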

    • 4. Ian Bogost  |  February 20, 2010 at 11:14 pm

      I talk about this in my post. We’ll never recover a situation like the one that existed in the early 80s, but we might be able to reproduce it.

      This is why I like teaching on older systems. I can write an interesting minimalist 57 byte Atari game in 5 minutes, and it’s still really an Atari game, not just an exercise.

    • 5. Ian Bogost  |  February 20, 2010 at 11:32 pm

      Incidentally, my favorite one-liner BASIC program is as follows, but this one requires a C64 (PETSCII and all).

      10 PRINT CHR$(109+RND(0)*2);:GOTO 10
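      [Editor's note: for readers without a C64 handy, the same maze effect can be approximated in a few lines of modern Python, using / and \ as stand-ins for the PETSCII diagonal characters that CHR$ produces on the C64. This is a sketch of the effect, not a faithful PETSCII rendering.]

      ```python
      import random

      def maze_line(width=40):
          """Build one row of the maze by picking a random diagonal per cell,
          mimicking CHR$(109 + RND(0)*2) on the C64."""
          return "".join(random.choice("/\\") for _ in range(width))

      # Print a small maze instead of the C64's infinite GOTO loop.
      for _ in range(10):
          print(maze_line())
      ```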

    • 6. Alfred Thompson  |  February 20, 2010 at 11:36 pm

      Actually I was trying to say what you can do in context. What was impressive 20 years ago is not so today. That is why I didn’t say generic BASIC, but Small Basic and Visual Basic. You can create programs that look like and are “real Windows” programs in them as easily as you could create white-letters-on-black-background BASIC code 20 years ago. Python does some of the same things, with some scaffolding around them, as well.
      We need more scaffolding so that students can do today what is as impressive for them as your loop was a few years ago. It’s best when you can hide a lot of it from students until they are ready for it, though.

      • 7. Ian Bogost  |  February 20, 2010 at 11:47 pm

        Alfred, I’m not that familiar with Small Basic yet, mostly because I spend most of my time in MacOS and UNIX, but that’s interesting to know about. I think the ease with which one can create platform-native applications is important indeed.

        But there’s something else of course, and that’s the path to sophistication. Python and Processing are very popular right now and for good reason, but they don’t provide a path to computational prowess in the same way that BASIC->Pascal/asm did in times past.

  • 8. Alfred Thompson  |  February 20, 2010 at 11:56 pm

    Providing a path to more computational prowess is a limitation of many good and valuable introductory tools. It is something that several of us at Microsoft have been discussing for several years now.
    The Alice team is trying to address that in Alice 3.0 with a path to real Java. The latest version of Small Basic has a “graduate” option that creates Visual Basic projects for the Visual Studio IDE. It will be interesting to see how these methods work in the classroom. They sound good in theory but we’ll have to see over time.

    BTW, it probably wouldn’t be hard to make Small Basic create C# or even C++ projects. Java would be a bit more work, since it doesn’t share the same IL and libraries as VB does.

  • 9. Jeff Graham  |  February 21, 2010 at 9:08 am

    I’ve said this before on this blog, but if it takes 2 minutes with Google to find a program that will do what you want, why program your own? In order for the computer to be useful to the masses, I think this is the way it has to be. If all we had now were Apple II’s, the market would be limited to those who found that computer interesting and/or useful, and I think that would be a small batch of people. I am pretty sure that Lotus 1-2-3 sold way more computers than any version of BASIC has.

    I think Mark has the right idea about spreading programming throughout the disciplines. I don’t think there is any way to make a one size fits all CS1 course. I still suspect there is an aptitude issue, but it makes sense to me to try everything we can think of first, before we give up.

    • 10. Ian Bogost  |  February 21, 2010 at 11:56 am

      But Jeff, even if you are adept, it *doesn’t* take 2 minutes with Google to find a program that will do what you want. There are so many results, it’s often hard even to find the code within them. Only those who are already very adept succeed at such an act.

      One of the implications of Apple’s early approach to computing is precisely that there was no need for a CS1 course. It was equivalent to owning a personal computer.

      • 11. Jeff Graham  |  February 24, 2010 at 11:42 am

        I’ve found most of our students here to be pretty adept at finding things with Google, especially things that they were supposed to do themselves. You’re right, though, it might take them longer than it takes me. It might even take them longer than just doing the work.

  • 12. Mark Guzdial  |  February 21, 2010 at 12:34 pm

    Aren’t we comparing apples-and-oranges here? The 1-3 liners that Owen and Ian are mentioning are not the same as applications, even at the level of Bank Street Writer, AppleWorks, and PrintShop. They’re just a cute bon mot — it’s a quick and interesting thing. Can’t we still do that today?

    • 13. Ian Bogost  |  February 21, 2010 at 2:05 pm

      Sure we can. But in times past, when you turned on your computer, it practically urged you to write such a little program.

  • 14. Alan Kay  |  February 22, 2010 at 10:12 am

    Squeak Smalltalk includes Etoys as part of its general user interface.

    A fun one liner “that people will look at” is

    foo turn 5

    (the “” is implicit in the Etoys scripts), where foo can be *any object* on the screen, including things that people think are “built-in” such as any window, or panes within the windows, scripts themselves, etc.

    I do this all the time in talks to universities and it never fails to draw shocked gasps from the students.

    I also create this program in front of their eyes (a favorite starter for 5th graders).

    car forward 10
    car turn steeringWheel’s heading

    in which the car and the steering wheel are drawn as pictures, and the “steeringWheel’s heading” is a tile that is dragged out of the “viewer” for the “steeringWheel” and dropped after “turn” to instantly feed values into the process and teach the children what a variable is for after just one exposure.

    This also elicits shock from the college students, because they have been bamboozled into thinking that “real computing” is doing everything the hard way just as in the early 60s with simulated punched cards and separate compile and load and test runs.

    We can see that what happened to the Apple II is not complexity, but that much of the idea of personal computing got invaded and conquered by bad old ideas from both industry and university — an old fashioned “land grab”.

    Cheers,

    Alan

  • 15. Alan Kay  |  February 22, 2010 at 10:20 am

    Yikes!

    Just to underline my points about bad old ideas taking over: I’m typing into a non-WYSIWYG UI which will not let me see what the result is going to be (and it won’t even let me go back to fix what it did to my text) — which was to have its own meanings for the “less than” and “greater than” brackets — so it disappeared what I wrote.

    I will try to use a tilde to enclose what I actually wrote.

    First example:

    “foo turn by 5
    ~over and over~

    (the ~over and over~ is implicit in Etoys scripts)”

    Second example:

    “car forward 10
    car turn by steeringWheel’s heading
    ~over and over~”

    I have a deep feeling of disgust at the low level computing has sunk to each and every time I use the web for something.

    And a deeper feeling of disgust at both the programmers who are willing to put out such crap (long after better UIs were invented) and at the people who use this crap without complaining vociferously!

  • 16. Fred Martin  |  February 22, 2010 at 10:23 am

    one word: robots!

    here’s a domain where it’s meaningful to write one-liners and other short programs — getting stuff to happen in the real world is great fun.

    fred

    • 17. Jeff Graham  |  February 24, 2010 at 12:04 pm

      I’m afraid that unless your whole curriculum is robots, they’ll drop out of the program. How can you transition smoothly from Scratch or robots to the regular CS curriculum? Maybe we shouldn’t?

  • 18. Aaron Lanterman  |  February 24, 2010 at 12:22 am

    With the old machines, you had to dig into the internals to get them to do anything. Not every affordance was available with a standard BASIC command – you needed to PEEK, POKE, and CALL and get into the memory map. Nowadays the machines *fight* you if you try to get into the internals – perhaps for good reason, since it poses stability and security risks.

    Even many commercial programs begged you to explore them. You could list a game like Temple of Apshai or Akalabeth and see what made it tick. (At one point I compiled Temple of Apshai using a BASIC compiler, and it became much less tedious!)

    Perhaps BASIC was “harmful” – but GOTO and GOSUB were good preparation for moving to the JMP and JSR of assembly code.

  • […] other two candidates say anything about the responsibility of a College of Computing for improving the state of computing education across the society.  Of course, I agree that we do have a responsibility here, to figure out what people should know […]

  • 20. links for 2010-03-08 « Blarney Fellow  |  March 8, 2010 at 9:32 pm

    […] Program your Apple II! Why not program today? « Computing Education Blog (tags: philosophy programming thinking writing) […]

