Facts that conflict with identity can lead to rejection: Teaching outside the mainstream

March 31, 2014 at 1:13 am 32 comments

Thought-provoking piece on NPR.  Take parents who believe that the MMR vaccine causes autism.  Show them the evidence that that’s not true.  They might tell you that they believe you — but they become even less likely to vaccinate future children.  What?!?

The explanation (quoted below) is that these parents found a sense of identity in their role as vaccine-deniers.  They rejected the evidence at a deeply personal level, even if they cognitively seemed to buy it.

I wonder if this explains a phenomenon I’ve seen several times in CS education: teaching with a non-traditional but pedagogically useful tool leads to rejection because it’s not the authentic, accepted tool.  I used to see it as an issue of students being legitimate peripheral participants in a community of practice. Identity conflict offers a different explanation for why students (especially the most experienced) reject Scheme in CS1, or IDEs other than Eclipse, or even why CS teachers react badly when asked not to use the UNIX command line.  It’s a rejection of their identity.

An example: I used to teach object-oriented programming and user interface software using Squeak.  I had empirical evidence that it really worked well for student learning.  But students hated it — especially the students who knew something about OOP and UI software.  “Why aren’t we using a real language?  Real OOP practitioners use Java or C++!”  I could point to Alan Kay’s quote, “I invented the term Object-Oriented, and I can tell you I did not have C++ in mind.”  That didn’t squelch their anger and outrage.  I’ve always attributed their reaction to the perceived inauthenticity of Squeak — it’s not what the majority of programmers used.  But I now wonder if it’s about a rejection of an identity.  Students might be thinking, “I already know more about OOP than this bozo of a teacher! This is who I am! And I know that you use Java or C++!”  Even showing them evidence that Squeak was more OOP, or that it could do anything they could do in Java or C++ (and some things that they couldn’t do in Java or C++) didn’t matter.  I was telling them facts, and they were arguing about identity.

What Nyhan seems to be finding is that when you’re confronted by information that you don’t like, at a certain level you accept that the information might be true, but it damages your sense of self-esteem. It damages something about your identity. And so what you do is you fight back against the new information. You try and marshal other kinds of information that would counter the new information coming in. In the political realm, Nyhan is exploring the possibility that if you boost people’s self-esteem before you give them this disconfirming information, it might help them take in the new information because they don’t feel as threatened as they might have been otherwise.

via When It Comes To Vaccines, Science Can Run Into A Brick Wall : NPR.


Entry filed under: Uncategorized.


32 Comments

  • 1. Bart  |  March 31, 2014 at 6:55 am

    This is so profound and insightful … great thinking !! Thanks for sharing this.

    Reply
  • 2. Janet  |  March 31, 2014 at 8:30 am

    Speaking for myself, if I had been in that class and been told to use Squeak, I’d have thrown a conniption too. But not because I consider myself an “OOP expert” or it’s part of my identity. It would be because I’d be facing the wholly unnecessary learning curve of a new language, which will then be totally worthless to me later. Instead, I can use what I already know about C++ or Java, which everyone else is using (i.e. I have lots of help from my friends/my textbooks/the internet, and I can put “coding experience in C++” on my future resume without embarrassment) and I can even expand my skills with C++ or Java to really get good at OOP in it.

    In other words, I wouldn’t have taken the OOP class simply for the joy of learning OOP. I would have taken it because it would help me improve my C++ or Java coding skills by getting good with a better design strategy. The people who are taking this to get a prerequisite out of the way and don’t expect to use it again might be more mellow… but to me, you’d seem to be 1) pulling a bait-and-switch, and 2) putting heavy “cognitive load” on me (learning a new language) which did no good whatsoever for me.

If I had gotten fairly good with Eclipse, and then somebody told me I couldn’t use Eclipse because “I know better than you, that’s why,” I’d have been up in his face and then some. And if he then compared my rational thought to vaccine denial – hoo boy.

    The last time I took classes, I had three back-to-back classes MWF: one used MATLAB, one used a horrible blend of C and C++ (team taught, one professor used C and the other C++), and one Python. One other class notionally had no coding, but the final project turned out to have a 10,000-line Java program at the heart of it (which, of course, was buggy and temperamental). The language choices were optimized for the professors’ comfort, not for any benefit for the student. (PS, the only IDE which would have worked for me would have been Eclipse. Something else might be “better” for you, but it’s not “better” for me to have three different IDEs, plus MATLAB, and trying to toggle between them every night and weekend for 14 weeks.)

As for empirical tests: empirically speaking, nobody but CS professors uses Squeak. Hence, if you don’t want to become a CS professor, there’s no need to ever touch it. When something with actual utility to the wider world gets invented, it spreads fast and furious. For example: Django started as two guys at a small Kansas newspaper who put out some Python code for building websites in late 2003; it went open source in late 2005, had a foundation by 2008, and is now basically everywhere. For that matter, Python itself is a good example of wildfire spread. Since Squeak has not done so, empirically speaking, it’s a dead end. And it’s totally rational for your students to hate spending time there when there are perfectly useful alternatives immediately to hand.

    Reply
    • 3. Steve Tate  |  March 31, 2014 at 9:11 am

This comment, I think, illustrates something we’re seeing more and more of these days: the desire by many to make college more about vocational skills than about learning to think. Consider the concern about what someone can put “on their future resume,” and the idea in the last paragraph that if no one uses a tool in the “real world” then there’s no point in learning it. The fact that many people, from politicians to students, seem to think that a university education is about vocational skills or learning certain tools is, I think, a poor statement on what universities should be about.

      Instead of asking yourself “is Squeak used in the real world?”, the real question should be “does using Squeak lead me to think about things in a different way?” Education is about doing lots of things that you will never do again in the “real world” – do most people dissect frogs in their daily lives? Do most people read Shakespeare in their daily lives? Do most people use even a tiny part of the math they learned? In all of these cases, later use isn’t the point – developing the ability to think in various contexts is. I’m an old guy, so I learned Pascal when I was in college. But even then Pascal wasn’t used in industry (except for Mac programmers, who have always been a little weird…. :-) ) – it was still a great educational experience.

      Some amounts of this attitude are positive. If you can gain skills with real-world tools while you are having an educationally meaningful experience, then it’s a win-win. But if the pedagogical value of a language/tool/system is strong enough, then that’s what should be used in education, regardless of whether it is used outside academia at all.

      On the other hand, sometimes the pedagogical value isn’t there, and it’s just faculty taking the path of least resistance because it’s what THEY know. I have less sympathy for that, with the caveat that course design (or re-design) is a huge effort, and faculty are pretty overworked these days. The number of hours in a day is certainly a “real world concern” that affects a lot of education.

      Reply
      • 4. Michael S. Kirkpatrick  |  March 31, 2014 at 11:36 am

        While I generally agree that this is spot on (the focus of an OOP course should be on object-oriented concepts…not particular languages that fit the paradigm), Janet does hint at a point that you have missed: the importance of scaffolding. In her case, the important evidence to consider is whether or not Squeak is superior for learning OOP concepts, given the prior knowledge of Java and/or C/C++. That’s a different question than whether or not Squeak is superior given an absence of prior knowledge. Teaching OOP or CS1 isn’t my area, so I’m not aware of the evidence here.

        Reply
        • 5. Michael S. Kirkpatrick  |  March 31, 2014 at 11:43 am

          I stand corrected. I have now looked at the cited paper (should have done that first!), which states that students had prior knowledge of both Java and C (one semester of each). So the data here do support the claim that Squeak is superior, even given prior knowledge.

          Reply
          • 6. Steve Tate  |  April 1, 2014 at 9:35 am

            You do make a good point about scaffolding though, and Janet referred to cognitive load. There is certainly a question of whether the investment in time to get into a new language could better be spent on fundamental concepts, even in the presence of a less “pure” language. Personally, I think that jostling out of the comfort zone regularly (new languages, etc.) makes you look at things in a different way, and that is in itself a valuable learning experience. The danger comes when you have five different faculty in five different courses all doing this for the same student. It would be great if this could be coordinated across classes in a more intentional way, but that’s probably just not possible in a practical sense.

            Reply
    • 7. shriramkrishnamurthi  |  April 8, 2014 at 12:08 am

      > As for empirical tests: empirically speaking, nobody but CS professors uses [...]

This same sort of argument used to be made about a whole bunch of languages that are now not only used, they’re outright filthy lucrative. Fill in OCaml, Haskell, various Lisp descendants, etc., in the blank.

      And people who are hardly “academics”, from Paul Graham to Eric Raymond to Joel Spolsky, point to the importance of being exposed to exactly such languages—and those are in essays written _before_ these languages exploded onto the marketplace.

      The problem with this form of empiricism is that the past is not a very good predictor of the future (it’s not even a perfect predictor of the past).

      Reply
  • 8. Michael S. Kirkpatrick  |  March 31, 2014 at 11:49 am

    I hear similar objections in the area of computer organization. There are a variety of toy architectures (Pep/8, MARIE, Y86), some of which are explicit subsets of real architectures like x86 or ARM. The comments I get back always include someone complaining about learning an assembly language they’ll never use. (Of course, they’ll probably never use x86 either…) I’ve never thought about that complaint as an identity issue, but I can see how it could be. They view themselves as apprentice software developers that need to learn specific job skills. Fascinating to think about the implications of this perspective…

    Reply
    • 9. Elizabeth Patitsas  |  March 31, 2014 at 7:09 pm

I personally never really “got” assembly programming until I took a microprocessors course and had to program directly on an actual microprocessor. Programming in Y86 on an emulator may have been /easier/ but it never felt ‘real’ to me.

      In this case it’s not just an identity thing, it’s also a matter of concreteness (and relevance?). The microprocessor was concrete. Running something on an emulator just felt contrived and abstract to me.

      Reply
      • 10. Michael S. Kirkpatrick  |  April 2, 2014 at 10:08 am

        The pedagogical question here, though, is whether or not first learning on the emulator provided the foundation for you to program on bare metal. Would you have “gotten” assembly programming as easily without the Y86 background first?

Also, more importantly, your story is an anecdote, and data are more valuable for curricular decisions. That is, while you might have been fine skipping Y86, is that true for a majority of students? My suspicion is no. In my experience, students grasp concepts significantly better with graphical representations, which these toy architectures provide.

        Reply
  • 11. Keith Decker  |  March 31, 2014 at 3:56 pm

    I have found (over 15 years of teaching some course with a “weird” language) that the most important things are to (1) be very clear (on day one) why you made this decision, and what the benefits to the student are but also (2) keep reminding them of this, as these benefits appear, for the whole course.

    That being said, I hear you that maybe facts aren’t enough. Maybe something along the lines of “for those of you who have a lot of programming experience, you can become even better/faster/more efficient by learning these concepts and applying them to your own practice so you can go from good to great”:-).

It is a little harder when the “odd” language appears later in the curriculum (as in your Squeak case). I actually used to spend almost the entire first class surveying the class about what languages they knew, and got paid to write in, and why they thought there were so many and what was good and bad about each, before moving on to why we were using what we were using. Maybe this helped with the identity issue: “I will validate on this blackboard that your personal experience was worthwhile—notice that there are many different experiences in this room, and they are all worthwhile—let’s have another, shared experience together in this class; you will be able to apply it to your experience in the following ways…”. Repeat variations as the course covers specific material.

Now the “odd” language for me is in the very first semester, so I really don’t have this issue very much any more. I let them know that the second semester is industrial Java, and that we do transition to Java at the end, showing the Java version of each concept we cover (only about 7 concepts, but enough to write a distributed, networked, multi-player graphical game, so motivating). We are also stressing a specific, gradable data and function design process, independent of language; this is Program by Design/HTDP2e.

    The good Java students really do seem to get the idea that there is more in CS than just learning a programming language (I suppose I should remember to affirm how smart they are to have picked it up on their own/in HS, in that case), at least for the very first course ever. If someone is bored we assign a more challenging project (seldom happens after week 4 :-). Best practice would tell me to separate these advanced students into a separate lecture, but we just don’t have the staffing…

    Reply
  • 12. Bonnie  |  April 1, 2014 at 3:03 pm

    For me, one of the big issues is that eventually, we have to teach an “employable” language, and at least for the local job market, students had better come out REALLY proficient in that language, because the tech interviews are brutal. So if we teach them first in some pedagogically pure language, we will then have to waste precious course real estate re-teaching the “employable” language. I realize that at Georgia Tech, the students probably take to second and third languages like ducks to water, but I don’t have that luxury. I have found the hard way that our students will require a full EXTRA semester course to learn the “employable” language to minimal levels, which means they don’t take some other course, like databases or operating systems.

    Reply
    • 13. Keith Decker  |  April 1, 2014 at 9:05 pm

But surely a requirement of a 4-year CS major is the ability to adapt quickly to any “employable” language? Presumably databases is taught in SQL, and OS in C. Here the only “employable” language for ChemEng is Matlab, the only employable language for EE is C, for Physics Fortran, for CS it used to be C++ 10 years ago, but now is Java, except when students know it is Javascript for the web, or Objective-C for iOS, or… And then the Neuro/Psych majors require Labview to be “employable”…

      This leads to either 10 different intro courses (problems with staffing and problems when students discover they ENJOY programming and want to switch to CS) or a small number that use a simpler language or subset to teach the core concepts (and then later courses can/should reinforce these concepts in context of “employable”). For CS majors it seems [citation needed :-)] like the *third* language is the point when they get the “aha” moment that these core concepts re-appear over and over— I agree that moving from language 1 to 2 is nothing but pain for the majority.

      I think if we don’t force them to make these linguistic moves—with our support and encouragement and reinforcement that really they have seen this before—that they really aren’t going to do very well when they get into the job market anyway, with only one language, even if they know it really really well. [and what happens, say, when they add features, e.g. lambda in Java 8?! I'm not worried about our students :-)]

This goes back to the example of teaching odd languages in mid-curriculum: I find that with 30-40 students in a room (even as Sophomores and Juniors), there is no single “employable” language. That was not my experience (my pre-PhD jobs at Harris, IBM, and GE were all in different languages, all different from my academic undergrad and grad languages!!), and it’s not my graduating undergrads’ experience either. They need to be adaptable.

      Reply
      • 14. Michael S. Kirkpatrick  |  April 2, 2014 at 10:46 am

There is a persistent (I almost said inherent, but it’s not) conflict between the notions of teaching languages or teaching concepts. As academics, we tend to favor the latter. If you master the concepts of encapsulation, inheritance, etc., then you can learn the syntax of C++, Java, C#, Objective-C, or whatever other language seems to be hot at the moment. However, if you focus on learning the (horrific!) particulars of C++, it may become harder for students to make the conceptual leap to another OO language. That’s not a problem for students who get hired as C++ programmers, but it is a huge problem for others.

        Thus, the solution (from an academic perspective) is to adopt the language that best helps students learn the underlying concepts. Maybe that’s Scheme for CS1, Python for CS2/data structures, Squeak for OOP, Haskell for PL, etc. At the same time, though, we would be remiss if we failed to teach C/C++ and Java, which collectively account for 40% of the usage market (according to TIOBE). Hopefully there are natural courses where these fit (such as C for OS and computer organization, Java for design patterns or software engineering), even if there are other languages better for that material. So, yeah, we should focus on the concepts, but we cannot ignore the societal and job context, either.
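The paradigm-level concepts in question can be made concrete with a short sketch. This is a minimal, illustrative Java example (hypothetical class names, not from any curriculum mentioned above) of the ideas that transfer across OO languages: encapsulation, inheritance, and polymorphic dispatch.

```java
public class Shapes {
    // Encapsulation: state is private, exposed only through methods.
    static abstract class Shape {
        private final String name;
        Shape(String name) { this.name = name; }
        String name() { return name; }
        // Polymorphism: each subclass answers area() its own way.
        abstract double area();
    }

    // Inheritance: Circle and Square reuse Shape's name handling.
    static class Circle extends Shape {
        private final double r;
        Circle(double r) { super("circle"); this.r = r; }
        @Override double area() { return Math.PI * r * r; }
    }

    static class Square extends Shape {
        private final double side;
        Square(double side) { super("square"); this.side = side; }
        @Override double area() { return side * side; }
    }

    public static void main(String[] args) {
        // Dynamic dispatch picks Circle.area() at runtime.
        Shape s = new Circle(1.0);
        System.out.printf("%s: %.2f%n", s.name(), s.area());
    }
}
```

The same three ideas appear, with different syntax, in C++, C#, Smalltalk/Squeak, and Objective-C, which is the point of teaching the concept rather than the notation.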

As for Bonnie’s point about the technical interviews, I’d be curious to find out more about what makes them “brutal.” Are they focusing on subtle implications of algorithm choice, data structure selection, etc.? Or are they drilling students on syntactic minutiae or API memorization (e.g., what’s the difference between a DigestInputStream and a ProgressMonitorInputStream in Java)? If it’s the former, then perhaps more emphasis on languages chosen for learning would actually be beneficial. If it’s the latter, you have to ask how much you want to serve as a training program for that company.

        Reply
        • 15. Keith Decker  |  April 2, 2014 at 12:04 pm

          “At the same time, though, we would be remiss if we failed to teach C/C++ and Java”

Agreed. *Most* classes for majors use these (e.g. Java for OOP and two required software engineering classes, C/C++ in adv. algorithms, architecture, OS). Other languages appear in electives (e.g. SQL in DB, Javascript in Advanced Web Development, a logic language in AI or logic, etc.) I don’t feel that these students have wasted, or need an “extra,” semester after starting in one of the two intro courses using simpler languages. [Also, the PBD/HTDP2e curriculum is prototyped at Northeastern, which is a co-op university, and their industry partners were much happier with the students who started in BSL/ISL/ASL and then moved to Java than they were with students who were all Java all the time]

          Reply
        • 16. Bonnie  |  April 4, 2014 at 12:37 pm

          Tech interviews tend to focus a lot on the particulars of the desired programming language, with some nods to algorithms and data structures thrown in. It is common to have an initial screen that focuses on the programming language, and then an in-person interview that goes into more depth. Employers are very focused on the programming language, that is for sure!

          Reply
          • 17. shriramkrishnamurthi  |  April 8, 2014 at 12:01 am

Maybe some employers. Other, very lucrative ones are very willing to let students write their code in anything reasonable at an interview. And still others are very focused on the language, but it’s not ANY of the languages mentioned above. Therefore, I’d prefer that we carefully qualify these kinds of comments. The world of computing looks very different today than it did two decades ago, and it also looks very, very different depending on the industry, the company, and so on doing the interviewing.

            Reply
    • 18. shriramkrishnamurthi  |  April 8, 2014 at 12:04 am

      The major languages in industry change every five to seven years. If these students can’t adapt between languages, what hope have they twenty years out?

      Therefore, I don’t think teaching them concepts from multiple perspectives is at all a “waste” of “precious course real estate”. A student graduating today without at least some understanding of the world beyond Java is, I think, a student who is tomorrow’s moral equivalent of yesterday’s COBOL programmer.

      Yes, it takes an “EXTRA” semester. That’s just as true at Brown as it is at Georgia Tech or elsewhere. But we don’t think of that time as “wasted”. We consider it a feature.

      Reply
      • 19. sheila miguez (@codersquid)  |  April 8, 2014 at 10:40 am

        It likely isn’t an extra semester. Your student might only be able to take 12 hours a semester due to a job, and obtaining all the credits to qualify for a degree will stretch for a few extra semesters.

        Reply
  • 20. sheila miguez (@codersquid)  |  April 8, 2014 at 10:16 am

    How many of you who are knocking the pragmatics of vocational aspects are from middle to upper class strata of society? I did not come from that background, and spent a lot of extra time in my college career working part time to pay for college along with incurring debt. An extra semester would mean incurring more debt.

So, while I did enjoy being able to take classes where I was able to use languages like ML, Smalltalk, Prolog, and Lisp, I have empathy for people who could not take those classes. I incurred an additional cost to take these things; not everyone will make the same choice I did.

I’d like to ask those of you who find it easy to dismiss the needs of your students to consider rather what you can do to change things. Consider learning how tech interviews for college hires work in your city, not just for special dream companies. Your students aren’t going to have the luxury to get hired there. They are going to have the luxury to interact with local small companies who aren’t as wise. Heck, maybe they will be interviewed by people who read blogs by non-academics like Jeff Atwood, who admonishes interviewers to require interviewees to write syntactically correct Java. See “1) Coding. The candidate has to write some simple code, with correct syntax, in C, C++, or Java.” from http://blog.codinghorror.com/getting-the-interview-phone-screen-right/

Educate yourself on what it is like at a company that will be on-boarding college hires. Maybe you will be able to get the pedagogy to match up with some pragmatic aspects.

As for teaching Java, I have encountered students who are learning to program by being taught Java, and the coursework is stuck on 1.4 or some other older version. Java has moved on, and newer versions may make it easier to teach important concepts in it. Or not.

Please do consider pragmatism along with what would be an ideal world. Please try to convince non-academics to be better at accepting students who haven’t learned the languages they are using at their companies.

    I was lucky, I got snarky and told someone I could program in any Turing Complete language and was given a chance to join a company that used Java despite my complete lack of experience in it.

    Reply
    • 21. Keith Decker  |  April 8, 2014 at 12:48 pm

      (1) I was the first person in my family to go to college; I paid with loans and work-study (including grading/TAing Pascal, but later RA stuff). I worked over the summers as well; my first summer at the gas company stabbing pipe and clearing brush, but the next two interning at Harris and IBM. In no case did I ever use any language taught in school (Harris was assembler and my first Fortran[!]; IBM was 8086 assembler and PL/C). I didn’t use any college languages when I got to GE either, although I did teach some C while in the training program (GE at the time took most entry-level people in as part of group training programs).

      (2) The article you cite (Jeff Atwood’s Coding Horror blog) says:

      “Candidates who only know one particular language or programming environment, and protest complete ignorance of everything else, are a giant red warning flag.”

The coding he asks for (based on Steve Yegge’s article) is not “what’s the difference between a DigestInputStream and a ProgressMonitorInputStream in Java,” but stuff on the FizzBuzz-or-easier end (and of course we do FizzBuzz in the first course; that’s kind of a required skill demo for CS0/1, no? Loops and Conditionals! :-))
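For readers who haven’t met it, here is a minimal sketch of the FizzBuzz screening exercise in Java (helper name and return type are my own choice, not from Atwood’s or Yegge’s articles); it really is just loops and conditionals:

```java
import java.util.ArrayList;
import java.util.List;

public class FizzBuzz {
    // For 1..n: multiples of both 3 and 5 become "FizzBuzz",
    // multiples of 3 become "Fizz", multiples of 5 become "Buzz",
    // everything else stays a number.
    static List<String> fizzBuzz(int n) {
        List<String> out = new ArrayList<>();
        for (int i = 1; i <= n; i++) {
            if (i % 15 == 0)      out.add("FizzBuzz");
            else if (i % 3 == 0)  out.add("Fizz");
            else if (i % 5 == 0)  out.add("Buzz");
            else                  out.add(Integer.toString(i));
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(fizzBuzz(15));
    }
}
```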

      Yegge’s original article that Atwood is summarizing states:

      “Candidates who have programmed mostly in a single language (e.g. C/C++), platform (e.g. AIX) or framework (e.g. J2EE) usually have major, gaping holes in their skills lineup. These candidates will fail their interviews here because our interviews cover a broad range of skill areas.”

      (3) I AGREE that we should be in touch with our big employers. My experience, when I was undergrad program director, was that the hiring people I met with were way more concerned with how to attract our students to even apply, as opposed to what level of Java we were teaching; not all of them used Java. (We did have the State wanting a Cobol course…)

      (4) I’m still not convinced that there is an “extra” semester involved here. I don’t think we can just remove CS0/1 and drop them into CS2 so that they graduate faster. I think working students trying to save money are more annoyed at the math we require (who needs calculus?), or those pesky University-level liberal arts courses. No employer cares about Modern European History from 1950 to the Present…?

      Reply
      • 22. shriramkrishnamurthi  |  April 8, 2014 at 1:04 pm

        Keith, here are the five articles I point students to (some overlap above):

        http://www.paulgraham.com/avg.html

        Beating the Averages, Paul Graham

        http://norvig.com/21-days.html

        Teach Yourself Programming in Ten Years, Peter Norvig

        http://catb.org/~esr/faqs/hacker-howto.html

        How to Become a Hacker, Eric Raymond

        http://www.joelonsoftware.com/articles/ThePerilsofJavaSchools.html

The Perils of JavaSchools, Joel Spolsky

        https://sites.google.com/site/steveyegge2/scheming-is-believing

        Scheming is Believing, Steve Yegge

        which all even mention specific languages including some under discussion here.

        Reply
        • 23. Keith Decker  |  April 9, 2014 at 3:25 pm

          Yes, love these!

I actually use Beating the Averages in class (the intro HTDP2e/PBD class). Also (in class) Graham’s Revenge of the Nerds, and Spolsky’s “Can your language do this?”. (Sometimes also Yegge’s “Kingdom of Nouns,” after we do Java :-) :-)) We do these toward the end of class, when they have whatever perspective they can have after 13 weeks. (I have a short quiz to make sure they skimmed them, and then we discuss for 20 minutes or so.)

I kind of made my own versions, formatted nicely and with some other pictures added. Earlier in the course we read/discuss more history of CS, or ways that CS gets used everywhere (clothing e.g. Jhane Barnes, journalism e.g. Holovaty or Silver), or how CS will destroy us all (e-voting, programmed trading, Unabomber manifesto :-) :-)

          My favorite assigned reading is probably the one about the IT guy almost killed by the hospital’s IT system (well, socio-technical system as a whole); I have them write very short essays on how the course changed their reactions to that, and I always get many really nice responses that tie into the big focus on Data Design in HTDP2e/PBD. (again, I have a copy I’ve edited for length, but a copy of the original is here http://securehealth.freshdefense.net/content/data-model-killme.pdf)

          That’s a good idea to put these other ones at least up on a page for them to take a look at if they are interested.

          Reply
      • 24. sheila miguez (@codersquid)  |  April 8, 2014 at 1:54 pm

        (1) I am happy that we can share perspective.

        (2) Yes. Though I will say that I’d be a lot more forgiving of people who were not well prepared by their family and friends to attend a big school.

        (3) Agree with you.

        (4) Less certain on that one. I think one semester may be adequate. More semesters would be better, but not pragmatic.

It’s easy to see my cognitive dissonance due to identity. It’s easy for me to identify with lower-class college students. Is the system going to break down for under-represented people? I feel lucky that I escaped with the education I did. Hence my defensive reaction.

        Reply
    • 25. shriramkrishnamurthi  |  April 8, 2014 at 1:03 pm

      > How many of you who are knocking the pragmatics of vocational aspects are from middle to upper class strata of society?

      I’m not; do you think someone is? I’m acutely conscious of jobs, and of the curricular design constraints that go into a student getting one. Our paper “The Structure and Interpretation of the Computer Science Curriculum” (http://cs.brown.edu/~sk/Publications/Papers/Published/fffk-htdp-vs-sicp-journal/) talks about the interaction between jobs and programming language diversity in some detail (look for the word “intern” and read on).

      If anything, I’m happy to work in a department that is proud of its placements, and indeed I look for ways to expand those (currently, abundant) opportunities.

      So, I would respectfully suggest you’ve created a strange model of some writers here that doesn’t match our reality.

      Reply
      • 26. sheila miguez (@codersquid)  |  April 8, 2014 at 7:42 pm

        I reacted with a mental model based on reading “This comment I think illustrates something we’re seeing more and more of these days: the desire by many to make college more about vocational skills than about learning to think.” and also from the fact that many (all?) of you have jobs in academia rather than a blue collar job such as facilities maintenance or some such.

        Thank you for the paper. I started reading through it, and the section on principles versus pragmatics, with its special focus on the internship and the last year, seems sound.

        Reply
        • 27. shriramkrishnamurthi  |  April 8, 2014 at 11:26 pm

          It’s entirely reasonable to hold us accountable. Also, it’s difficult at some point to tell what someone is replying to, the way WordPress is set up. Thanks for the clarification.

          Reply
        • 28. Michael S. Kirkpatrick  |  April 9, 2014 at 9:34 am

          I would like to warn you not to jump to conclusions and use our academic credentials as a way to discredit our points. Yes, I am an academic. I also worked for several years at IBM. Like Keith above, I am (almost) the first in my family to go to college (I have a couple of cousins who went…most didn’t finish…but no one in my immediate family). I come from a family of firefighters, most of my male relatives worked in factories, and most of my female relatives did clerical/secretary (a title they insist on) work.

          So I understand traditional blue collar work ethics and values. That is my heritage. However, as an educator, I know that emphasizing vocational skills within the context of a CS curriculum is unnecessarily limiting to our students. The goal of a CS education is to provide them with the appropriate skill set–that is, the mental models and concepts–that create the foundation for those vocational skills. Without that conceptual understanding, teaching them all of the intricacies of Java 1.7 will limit what they can do in the future. Specifically, without the experience of self-exploration of language features, they would rely on someone else–and there may actually be no one available–to teach them how to use lambdas when they are introduced.

          Reply
          • 29. shriramkrishnamurthi  |  April 9, 2014 at 10:37 am

            “The soft bigotry of low expectations.”

            Reply
            • 30. sheila miguez (@codersquid)  |  April 9, 2014 at 2:52 pm

              Okay, I am corrected.

              Though please do not think I meant you should teach the intricacies of a specific language.

              Reply
            • 31. sheila miguez (@codersquid)  |  April 9, 2014 at 2:56 pm

              Once again I screwed up with wordpress replies. apologies.

              That is a good quote.

              Reply
  • 32. alfredtwo  |  April 8, 2014 at 10:48 am

    When I got out of college I knew several of the popular languages (FORTRAN and COBOL) as well as some proprietary ones (mostly BASIC PLUS). It was the proprietary one that got me my first job, but during my first year someone handed me a program specification and a language manual for a language I’d never heard of and said to get busy. I’m pretty sure that if I’d only learned one or two languages I would have been in hot water. As it was, I had a broad base in languages, and picking up the new one (which actually seemed to borrow some from COBOL and some from FORTRAN) was pretty easy. I don’t see how someone can afford NOT to learn a bunch of languages in college.

    Reply
