Supporting Creativity but maybe not Creation

April 25, 2011 at 8:17 am

The blog post quoted below critiques Apple for producing wonderfully creative technology while doing little to support software creation, and asks what that means for the future of computing as Apple becomes the model everyone copies.  Apple’s tools are often used by professionals in the creative professions, but too often those professionals aren’t also involved in creating new technology, even just for themselves, and Apple isn’t really helping them make that move.  We saw a form of this in Brian Dorn’s dissertation work, where graphic artists had wonderful tools for creating digital media but fended for themselves in learning to create software.

The concern voiced in this blog is that so-goes-Apple, so-goes-the-industry.  It does seem to be a problem in our industry (is it true for all industries?) that, when one company pulls ahead into a virtual monopoly, everyone else adopts the approaches and strengths of the front-runner.  How many “next Microsofts” or “next Googles” or “next Facebooks” have you heard about?  The strengths and weaknesses of that company’s approach become the model that everyone copies.

Apple’s abysmally, disastrously worst ideas will be mindlessly copied along with their best.  To some extent this is already happening.  And if current trends continue, there will come a time when nothing resembling a programmable personal computer will be within the financial (or perhaps even legal!) reach of ordinary people.

The user-programmer dichotomy will be permanently cemented in place – even now, most computer owners don’t think of the expensive space heater on their desks as something programmable.  But in the future it won’t even occur to a curious child that the behavior of his, let’s say, schoolpad can be altered in ways unforeseen by its makers – the essence of the creative act we call programming.  We will be stuck with computers – machines which, within certain limits, are capable of literally anything – which have been deliberately – artfully! – crippled into being far less meaningfully-modifiable than our cars and houses.

via Loper OS » On the Still-Undefeated Tyranny of Apple.



8 Comments

  • 1. Cameron Fadjo  |  April 25, 2011 at 8:46 am

    Thanks for sharing this post. Is this criticism not of the company being emulated, but of the nature of enterprising thought in business? Blaming the current trendsetter for not setting trends that others perceive to be of critical importance deflects attention away from the individual and toward the ones who set the pace. Apple does not, however, set the tone (as, I’m sure, a dearth of smaller development houses would argue). Does corporate responsibility extend to creating a position that others would emulate for the sake of emulation? Or do other corporations have the responsibility to set their own pace and tone according to their own goals?

    Unfortunately, the point of this blog post gets muddied by the emphasis on Apple and not on the individual. Is it not the point of this cited blog post that the user needs to also become a creator?

    Individual reflection is expensive (in terms of resources expended), so it is much easier to emulate than to innovate. The goal is to advocate for becoming a user who also creates. Rushkoff makes this point, albeit from a stark vantage point, in his book ‘Program or Be Programmed.’ Others (myself included) are pursuing similar tracks to bring innovation to the areas where learning to create often, but not always, takes place.

    Who is partially responsible for encouraging creation: businesses and industries (e.g., Apple, Inc.), schools (K-12 or higher education), or both?

  • 2. Alan Kay  |  April 25, 2011 at 8:59 am

    Yep …

    As Mark pointed out years ago, it’s the casual perceptions from general use that form the mental cages (he caused gasps and a sense of disorientation by using Squeak’s lower-level built-in graphics to simply draw a line across the display, over the “windows” etc., which the students had thought were somehow built-in givens).

    On the other hand, something along the same lines as, and nicer than, HyperCard can now be set up to run in the browser (treating JavaScript and the bad graphics environment there as just an ugly low-level machine).

    If Google ever carries Native Client all the way to where it should be (they are getting closer, hopefully not asymptotically), then this will allow safe access to lower, high-speed levels, so that competitive programming environments can be downloaded automatically.

    We have been experimenting to see how much of the lower levels of Etoys (such as the massively parallel particle system) and STEPS (such as the from-scratch, high-quality Nile/Gezira graphics) can be successfully run in NaCl.

    Note that Apple can still be a problem here: for example, they could restrict the iPad to Safari, and not allow Safari to have NaCl-like access.

    Cheers,

    Alan

  • 3. Alfred Thompson  |  April 25, 2011 at 9:24 am

    I’m going to avoid talking about Apple because of my biases, both obvious and not so obvious. But I would like to say that I do think that companies in the technology industry have a vested self-interest in helping to encourage and prepare the next generation of creators. At Microsoft this is something we talk about all the time. Our DreamSpark program (http://www.dreamspark.com), which provides free development software, is only one small part of that. We also provide a lot of online learning tools, from example code to videos to course curriculum for teachers and professors. To be fair, I see Google doing a lot to promote CS education and the creation of technology as well.

    • 4. Mark Miller  |  May 8, 2011 at 12:19 am

      “I would like to say that I do think that companies in the technology industry do have a vested self-interest in helping to encourage and prepare the next generation of creators.”

      This is something I’ve wondered about for a long time. What happened to the idea of including a development environment with the computer, letting buyers know it exists, and inviting them to learn about it? In the 1970s and ’80s it was a given that computers would come with a programming language (usually Basic). Computers came with introductory manuals that would teach the reader about the language with examples and exercises. On the low-end machines, the Basic language doubled as the default command language for the machine, used to carry out basic user operations. By the mid-1990s this idea (having a programming language come with the machine, and having it double as a command language for the computer) went out the window. I’ve long had my suspicions about why that happened. One reason was obvious: GUIs won out over command-line interfaces. Another is that in the 1970s, as I understood things, “computer literacy” was widely defined as being able to program a computer, but this was running into problems, as a lot of people tried programming (at least with Basic) but didn’t really take to it. I figured QuickBasic was removed because Microsoft thought people wouldn’t want access to it anymore. It would have no useful function in a GUI, and besides, making its existence obvious would probably intimidate their intended customer base.

      The reason computers came with languages in the early days was probably that a hobbyist culture preceded their existence, and that culture was their primary market.

      As I recall, Apple discontinued HyperCard when Steve Jobs took over as CEO in 1997, but when they released OS X, they started including programming languages. Something I discovered is that Apple had been including an implementation of Ruby as standard, and I wondered if maybe that was the reason it was becoming more popular (though the Rails framework certainly helped with that). I have no idea if they’re still doing this, though I know that they have continued to include Xcode as an optional install with every Mac sold. They don’t make it really obvious that it’s available, though.

      Looking back on the era of the 1990s, it seemed to me that the computer industry was knocking out the lower rungs of the ladder, making it more difficult for novices to start learning about programming. Why would they do this if they understood that they had an interest in encouraging and preparing the next generation of creators, as you say?

      • 5. Alfred Thompson  |  May 8, 2011 at 11:22 am

        Well, first off, I’m not so sure that companies really understood (or understand completely today) that they have a vested interest in the next generation of developers. At Microsoft we are starting to come around. A now-retired, very senior exec told me that we (Microsoft) had been focusing so hard on making things better for professional developers that we had made things difficult for beginners. Since that conversation several things have happened. One is the growth of the promotion of the free Visual Studio Express editions (http://www.microsoft.com/express/), which is a start. Another has been the release of Small Basic (http://smallbasic.com/), which is a simple IDE with a simple version of BASIC. This tool also offers an easy transition to Visual Basic. There is curriculum (also free) available. Kodu (http://research.microsoft.com/en-us/projects/kodu/) is available for free for educational use with very young children.
        Most copies of Windows since the release of the .NET Framework also include command-line compilers for C#, Visual Basic, and C++ as standard. This seems not to be widely known, though, and with the preference for GUI development tools, the Express editions, which have to be downloaded separately, get most of the attention. I don’t know why the Express editions are not bundled with Windows, but I can imagine some concerns on several levels, none of them having to do with discouraging new developers.
        Lastly, I would bring up the DreamSpark program (https://www.dreamspark.com/default.aspx), which provides a wealth of professional development tools, including server software, free to students. This program includes high school students as well as university students.

        • 6. Mark Miller  |  May 8, 2011 at 5:50 pm

          One answer that comes to my mind now as to why dev. environments are not included, or if they are included, are not made more obvious, is that the computing platform has become more complex (because of the way industry has organized it), and the landscape keeps changing. There’s no strong academic support for a) finding unifying, expanded “themes” in the architecture (and I mean this in the fullest sense: display, input, processing/manipulation, persistence, and transmission of information) that can be rendered through a new language, and b) developing pedagogical or “easy to use” languages that can be used by “children of all ages” to manipulate the system. However, there are languages of these types, as have been discussed on this blog, which present programming as a value in and of itself, separate from the system, or the system is only accessible through an FFI.

          One exception I just remembered is that on the Mac there is a facility called “Automator,” which is basically a graphical scripting environment, and it’s designed to interact with elements of the overall system. It’s made rather obvious to new users, but so far I haven’t found it to be that useful. Maybe it’s me, but I don’t think it was designed well, and it’s not something that I think would invite people into programming.

          The 60s and 70s saw an intellectual focus on pedagogical, or “easy to use” languages, which became very popular, both in and out of schools, in the 1980s. It just so turned out that the machine architecture that the languages from the 60s and 70s assumed mapped well onto the microcomputers of the 80s era. Also, these languages more or less rendered some aspects of the machine architecture in their features.

          In the early 80s there was a strong academic backing for programming. Schools at a few different levels perceived some value in it, and pursued it with vigor. As I noted in another discussion on this blog, John Maxwell, in his Ph.D. thesis, “Tracing the Dynabook,” attributed the falloff in educational support for programming to a misunderstanding of its educational benefits. When the fallacy was made plain, rather than re-examine their assumptions, and look at the computer as a “new thing,” they decided to discard programming as useless to education. It seems to have been a “one shot” deal.

          When GUIs were widely implemented on microcomputers, the “language included” feature didn’t come along for the ride, and industry didn’t see the opportunity to bring along that same sense of a close tie between a language and the system, even though there were a couple of fleshed-out examples out there, if anybody wanted to learn from them.

  • 7. Alan Kay  |  April 25, 2011 at 10:33 am

    Another slant on this is that “simple but not too simple” is critical.

    At PARC, after a few hundred successes with children, Adele and I encountered two who were quite sure that “computers could not do X”. This was so unusual that we looked into the backgrounds of these kids and found that they both had parents at HP, and had both learned a little BASIC, and this formed their notion of “computer”.

    This has always worried me about the widespread distribution of Etoys and Scratch, both of which were designed with very restricted contexts in mind, and both have some barriers that should not be there for the general purpose of providing ways to make things, explore what computing is all about, and to help build strong intuitions about “what you should be able to do with computing”.

    HyperCard had other limitations along these lines. For example, it did not have a strong metalevel or lower level, and required “X-Commands” to be made in some other language (like C or Pascal). Many of today’s systems have problems along these lines; just to pick one: Python.

    Cheers,

    Alan

  • […] world of letters better.  But how does it make it better?  For me, I’m still attracted to the innovation argument: we use code as a medium to say, share, and test ideas that we can’t in other media.  That […]

