Cramming the first semester until the students burst
August 6, 2009 at 7:46 pm
For several days now, I’ve been thinking about this comment made to my blog post last weekend:
“But a computer scientist or software engineer needs to know much, much more. A suitable introduction for a CS major is very different than for someone who just wants to learn some stuff about computing.”
I believe that’s common among faculty–the conviction that they have to cram so much into the first semester that students burst. I just don’t understand why. Is every semester so full of material in a four-year Computer Science program that the first semester has to be just as chock full? Or do we believe that the first semester has to be especially over-full? Enrollments are plummeting, so we don’t have the luxury of using CS1 as a weed-out course, to force out all students who don’t deserve to be computer scientists–a claim I’ve heard too often. (BTW, I believe in the UCLA HERI data — the uptick in enrollment this year is a statistical aberration. There’s no change in frosh attitudes about CS, and enrollments will continue to decline.) With failure rates nearing 50% for CS1 at many institutions, there is every reason in the world to push off some of the complexity and content to later semesters, when students have more experience to deal with it, rather than cramming so much into the first semester.
I build several discussion periods into my Media Computation workshops. Today’s discussion was on whether Media Computation has much to offer the participants’ schools’ intro courses. It was no different from other such discussions, but maybe because I’ve now done this three times in two weeks, it made more of an impact on me.
Faculty in the room said that they saw a lot of use for Media Computation Python in CS0, the first course for non-majors, but not in CS1. Those who would want to use some form of Media Computation in CS1 planned to use “part of it,” or “some of the classes,” and especially, “in addition to our traditional text, like Lewis and Loftus.” I’m glad that they’re willing to use any of the material at all, but I was curious. I (as facilitator, trying hard not to inflict my opinions) asked “why?” (Below are paraphrased answers, of course — I didn’t jot down exact quotes, and not all comments are from just today.)
- “I’m not convinced that we’d get to all of the concepts that we want students to learn in CS1, like design.”
- “We want students to learn the standard Java libraries, and Media Computation doesn’t touch on all of them.”
- “It’s important to learn good coding style and all the right programming habits.”
If we really believe that computer science is not about programming, why do we make whether students pass introductory computing depend on whether they learn libraries and coding style?
Then one of my participants asked the question that I wanted to ask, but felt that I couldn’t as facilitator. “Why do all of you think that Python is good for CS0, but not for CS1? Why do you have to use Java in CS1?” The answer about Python was consistent — Python was easier and made it easier to focus on the “concepts” rather than on “coding.” The answers about Java were also quite honest, though distressing: “There’s such a critical mass behind Java” and “Everyone else does it that way” and “We want students to be able to transfer their credit.” No one offered a pedagogical answer. No one could say why Java helped students learn anything in particular. Instead, the answers were that the herd had decided, and all the CS departments represented were going to follow the pack.
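(For readers who haven’t seen Media Computation: a typical first exercise manipulates the pixels of a picture with a handful of functions. Here is a minimal sketch in JES-style Python. The function names, like makePicture and getPixels, follow the JES environment that accompanies Media Computation as best I can reconstruct them here, so treat the exact details as illustrative rather than definitive.)

    # A minimal Media Computation-style exercise, meant to run inside the
    # JES environment, which predefines makePicture, getPixels, getRed,
    # setRed, pickAFile, and show. (Names reconstructed from memory;
    # treat them as illustrative.)
    def decreaseRed(picture):
        # Halve the red channel of every pixel in the picture.
        for pixel in getPixels(picture):
            setRed(pixel, getRed(pixel) / 2)

    picture = makePicture(pickAFile())   # the student picks an image file
    decreaseRed(picture)
    show(picture)                        # display the modified picture

The whole program is a loop over pixels; there is no class ceremony or library surface area standing between the student and the concept, which is exactly the “concepts rather than coding” argument the participants were making for Python.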
I’ve just completed my second textbook in Java, and I’ve ended up with a much more positive attitude about Java than I had years ago. So it’s not Java per se that bothers me about these answers. It’s the lack of a rationale for overloading CS1, the reliance on herd mentality, and the assumption that CS1 has to be an awful experience. Maybe it was bad for us, but is that a good reason to make it bad for our students? When enrollments are growing and our major is in great demand, there is room to be sloppy in our reasoning about what we’re doing. With failure rates rising and student interest falling, we need to question what we’re doing. It makes no sense to put so many demands on CS1 students, no matter what their major.
Entry filed under: Uncategorized. Tags: CS1, Media Computation, undergraduate enrollment.
1.
Jim Huggins | August 6, 2009 at 9:10 pm
Part of the answer, to me, to the question of “why?” is what I’ve discovered to be a common trap for educators. It’s easy to think that because I said something in class, students not only heard it, but understood it and incorporated it into their way of thinking. So, consequently, the key to giving students a grand view of the discipline of computing is just to say more stuff in the introductory course … since, of course, they’re absorbing all of it.
Of course, when you put it that way, it’s preposterous. But many of us in the business of computer science education actually know very little about education, and so it’s easy to succumb to that sort of myth.
2.
M. J. Fromberger | August 6, 2009 at 11:58 pm
As a (former) instructor in computer science, I have experienced the front-loading of early CS courses you are describing. Beyond simply “following the herd,” it seems to me that another chief cause of front-loading is that the (more senior) instructors in later courses want to be able to assume their students have a great deal more detail in their heads and experience at their fingertips than is reasonable at that point in their education. Being senior, their voices carry a great deal of influence.
I served as a very junior lecturer in our department, so many (most) of the courses I taught were in the introductory sequence. The first two courses in our program were quite intensive, a fact that was amplified by our ten-week terms, but which was fundamentally driven by the demands from more senior faculty to make the students “better prepared” for the second- and third-year courses.
I think what Jim Huggins wrote is spot-on: It takes time and practice for fundamental skills and concepts to really sink in. Stuffing the syllabus of introductory courses is like trying to grab more sand — it doesn’t work, and it only makes a mess.
3.
Erik Engbrecht | August 13, 2009 at 1:11 pm
The problem is that a crammed semester for one student is tedium for another, and the pace of the first classes ultimately determines how advanced a student’s undergraduate education can be. Boring a potentially great student is a greater crime than failing a mediocre one who could have made it through.
Stretching out the curriculum sacrifices the top for the middle. I think universities have an obligation to challenge all their students, not just the average ones. Those senior professors really want to identify who’s going to make a great grad student.
I think what’s really needed instead of a “weeding out” is a greater separation of majors so that students can be steered towards what’s appropriate for them based on their skills and educational goals.
4.
Mark Guzdial | August 13, 2009 at 8:02 pm
Hi Erik! Your sentence really gave me a lot of insight: “Boring a potentially great student is a greater crime than failing a mediocre one who could have made it through.” I feel *exactly* the opposite of that! It’s not hard to educate the great students — they’ll learn no matter what I do, and they tend to challenge themselves with their own projects and interests. My challenge and focus as an educator is to make the average and low-end students succeed. I’m at a state school — my job is to produce the IT professionals who will further the state’s economy, not just a handful of brilliant professionals. I don’t typically look to my undergrads for my future grad students. Computing Education Research is too small a field, so the odds are against one of my undergrads deciding to focus on it.
5.
Erik Engbrecht | August 14, 2009 at 7:47 am
I figured we felt differently on that point, but what I want to know is: Why is it a matter of either/or?
6.
Leigh Ann | August 14, 2009 at 6:07 pm
But this is where differentiated instruction comes in. You can easily give assignments for which the last 10-20 points are open ended and up for grabs in an “impress me” kind of way. That still gives opportunity for the weaker students to succeed and offers great challenges for the strong students. I used to use this all the time when I was teaching HS and dealing with an inverse bell curve of student ability in classes.
7.
Erik Engbrecht | August 14, 2009 at 8:18 pm
Leigh,
There are a couple problems with that approach:
1. Very bright students are often also very lazy, so without a push they won’t go above and beyond.
2. If students get similar grades in the same classes, it leaves those of us in industry scratching our heads as to the graduates’ capabilities
On a slight tangent, I don’t think being better or worse at Computer Science necessarily makes an individual a better or worse computing professional. The majority of work in computing, and even in software development, doesn’t require a huge amount of CS, and indeed requires skills largely ignored in most CS curricula.
I’m focusing on the core CS part because it’s already hard to find good (and willing) computer scientists, and I’m concerned that a CS curriculum that eases up on certain requirements will make the situation worse. But that could be turned around, as the individual with lower aptitude for or interest in core CS would likely be better served learning valuable skills like requirements analysis and technical communication than by bludgeoning themselves with computational theory or the internals of operating systems. Pushing everyone down the same track just leads to muddied waters.
8.
Alan Kay | August 14, 2009 at 11:16 pm
Hi Erik,
Let’s leave aside labels for a second (I’m quite at odds with what is called “Computer Science” in most secondary and post-secondary education today). And it would take us too far afield for me to try to define “Computer Science” as I think it should be done.
But, let’s ask whether the “computing professional” you mention is really prepared for what they *need* to be able to do (as opposed to what people think they should be able to do).
I spend some time consulting for very large companies, and my conclusion from decades of off-and-on contact is that the IT professionals are not up to meeting their companies’ needs. This is not an IQ problem. It’s not a quantity-of-knowledge problem (they know a lot). But it is a problem of quality of knowledge and weakness of outlook.
Many of these companies are superbly organized wrt IT – but unfortunately much of this organization is needed because of what is lacking in outlook and quality of knowledge. This quickly became a vicious cycle, and it really ramped up when universities started listening to companies’ desires for various kinds of foot soldiers, not realizing that the companies had normalized to an overall bad way of doing things.
In the days of the clipper ships, the technology was low and they required 150 superbly trained humans to run the show. Today there is an interesting split. We have high-tech vehicles such as nuclear submarines, which have retained a large crew. And on the other hand we have 747s, which can be flown by one person, and could be flown by a novice with a little more work (and could not be flown by 150 trained people).
Main point here: only graduates with a better outlook and more quality knowledge are going to be able to stem and then turn the tide of software disasters; otherwise they will just fight the experts who do know what should be done.
Two of the criteria for a real science are “strongest outlooks” and “highest quality knowledge”. This is not what educational institutions generally deliver, regardless of the labels they attach to their product for cachet (this is similar to the hijacking of the term “object-oriented” after its success at Xerox PARC, when it was painted onto any old system that was trying to market itself as au courant).
Best wishes,
Alan
9.
Erik Engbrecht | August 15, 2009 at 4:11 pm
Alan,
I don’t think a university, no matter how good its professors and curriculum, can impart a better outlook upon its students. All it can do is teach the best knowledge it knows, with the right balance, and foster a culture that encourages a student to develop his own outlook rather than simply mimic (or mock) his professors. One cannot have a strong outlook without independence of thought, a willingness to unlearn beliefs when presented with powerful ideas, and the judgement to know when such unlearning is appropriate.
Graduates must be both willing and able to challenge the experts. In the short term the inability to do so will prevent them from discerning who the real experts even are. An expert’s ideas should withstand (or, rather, should have already withstood, given that expert status should be earned) intensive intellectual scrutiny, and an expert should be eager to refine or even discard his ideas in the presence of strong counter-evidence or powerful insights. Of course most of the time the graduate will lose this challenge, but in most cases the challenge will earn him a deeper understanding of the ideas and hopefully a measure of wisdom. Which brings us to the long term, during which some of these graduates must become the next generation of experts with a new crop of powerful ideas.
Graduates must also understand the subtle differences between challenging ideas through intense scrutiny and fighting them through such means as organizational politics and hijacking.
This process is critical to producing the highest quality knowledge, and participation in it coupled with the resulting knowledge is the only way to develop the strongest outlooks. This process is the essence of a real science.
Enabling this process requires individuals to have cognitive abilities that are not inherent in our nature. But that’s ok. Many of our common cognitive abilities are not inherent in our nature. We learn them at a young age, when our minds are most malleable, and thus as adults they feel natural.
I think strong parallels can be drawn between learning computer science and reaching fluency in a second (or third or fourth) natural language. It’s trivial to do as a young child, but as an adult it can only be done through immersion.
Very few children are educated in computational thinking, and for many of the few who are, the education is ad hoc. Consequently most new CS students require immersion to force their minds to develop computational thinking capabilities, rather than just limping along through memorization and weak analogies to other skills.
In many ways, “immersion” is just a nicer, and possibly more useful, way of saying “cramming.”
10.
Mark Miller | August 7, 2009 at 2:28 am
I don’t know what to say. Reading your post I felt like I was in topsy-turvy land. When I took CS in the late 1980s and early ’90s there was a heavy emphasis on skills and concepts, and not so much on knowledge. There was no science to it. That was the downside. At least then they didn’t care so much what language was used to teach a course. They expected students to learn languages as they went. It seemed like they wanted to put the emphasis on languages that were popular in industry, but they were not beholden to that. In fact there was an aversion to kowtowing to whatever languages industry wanted them to teach. They recognized that industry had different priorities from educators, and that in industry language popularity was like “the flavor of the month”. It’s fleeting.
Two aspects I recognize from my time in CS are the emphasis on good coding style and certain methods of programming, and this idea that to be a real coder you had to deal with tough stuff. I can understand “tough” in the sense of learning concepts that are important but can be difficult to grasp. I don’t understand “tough” for the sake of making more work for yourself. The latter is something that’s just in the computing culture generally. It’s not just in universities. I was struck by a comment on one of your previous posts saying the commenter was offended that you were using Python with African-American students: “Why not Java? You think that’s too hard for them?” That kind of thinking is endemic in our field. It has been for decades. I had the thought just now that such people should be challenged to give K&R C a try (if such a compiler even exists anymore), just to give them some sense of humility. The Java language is easy compared to that. If I wanted to be merciful I’d suggest they learn assembly first.
The idea that someone’s a master programmer (and that that’s the only goal) because they can do something difficult is, I think, flawed. It depends on the sophistication of the model, but since there’s little discernment about that, anything goes. It’s like someone claiming they’re an expert engineer because they built a car mock-up out of Meccano sets. Yes, that’s difficult. It’s certainly an accomplishment as a novelty, but a serious engineer would take that as “a good start”. It shows they can plan, have a sense of design and the ability to deal with complexity, and have perseverance. But they’re using an amateur’s system (in the modern sense). They still have a lot to learn. Real engineers don’t try to build real cars out of a system of small girders, nuts, and bolts, because they know it can’t stand up to the stresses that are required of real cars. And besides, it’s not economical.
It used to be that if you got into what’s seen as hard stuff, like assembly programming (it actually isn’t as tough as most people think once you learn some basic ideas), you would learn something about the computer you were working with. It wasn’t just a “boot camp” exercise, and perhaps that’s how this thing about toughness came about.
It’s out of place to apply that to Java, because you’re not really learning about the underlying computing processes by working with it, unless of course you focus on computing principles in it, which I doubt the educators you describe are doing. It’s just seen as “tough” because OOP as it’s presented in Java is probably difficult for beginners to grasp, as I think you’ve written before, not to mention that I’m sure there are some things students will just have to ignore and take on faith just to get started. The APIs are large and numerous as well, and I’m sure these educators were concerned about teaching some of them too. This is toughness without a purpose. It’s “tough” the way that some teachers like to make tests “tough” by asking trick questions, rather than getting students to think about the subject at hand.
I think what you’re describing is a CS culture that no longer has the pretense of teaching CS. It’s more like a professional trade school whose goal is educating software engineers, of course with no sense of a science backing up the engineering.
12.
Alan Kay | August 7, 2009 at 3:05 pm
Hi Mark,
Yep, among other things, Miller’s good old 7±2 still obtains in the very short term, and there are most definitely versions of this that obtain over months.
Given that this problem is also found in the teaching of other sciences (including ones that are more like “real sciences”, such as Modern Biology), I wonder how well the cramming problem is dealt with there. A few years ago I looked at about a dozen 8th and 9th grade Biology texts to see how they were done, and “cramming without science” was almost the most apparent property of these texts.
On the other hand, it is hard to imagine a first course in Biology in college that did not discuss (a) the molecular basis of life, and this basis really does require an understanding of how and why chemistry works; and these are required for any reasonable treatment of (b) why evolution might be plausible.
It’s also easy to imagine the myriad of important stuff that could be safely left out of the first year, including whole subfields of Biology and zillions of critical details, including most details of chemistry.
Now, if this analogy is forced on computing, what do we get?
What is the equivalent of molecular basis of life, how and why chemistry works, and why evolution should be plausible — that cannot be omitted from a first course?
Cheers,
Alan
13.
Aaron D'Amico | August 7, 2009 at 5:22 pm
As a current CS student, what made the first year so overwhelming to me was that every concept I learned was a new concept. I had no prior exposure to the material, as I had in other subjects (math, history, etc.) during my time in high school. Now, I am 28, with a military background in intelligence analysis, so I am definitely an outlier.
What I find to be a larger issue (and one you hint at in your post) is the use of the beginning courses as weed-out courses. I am not saying it should be easy. I am not saying it should come without hard work. It just strikes me as counterintuitive to make exams difficult enough to produce an average grade in the 60s. I have seen it go beyond just complex questions on exams. One instructor admitted to me that some material is introduced to distract from the more likely test questions. This summer quarter alone I have seen CS attendance drop by about half.
Respectfully,
Aaron D’Amico
14.
Alfred Thompson | August 11, 2009 at 10:44 am
I see two huge possibilities. One is that we really do need students to know too much from their first course to fit it in one semester. The second possibility is that we just can’t agree on what the key first semester’s worth of knowledge is, so CS1 teachers wind up trying to please everyone. To make it worse, we can’t decide on the order in which things should be taught. Objects early or objects late? Loops first or conditionals first? Does it even matter? Recursion before loops, after loops, or not until the second course? The list goes on. There is far too little research going on, so people tend to teach it the way they learned it. Often that doesn’t work for a lot of people.
15.
Alan Kay | August 11, 2009 at 11:09 am
Hi Alfred,
I’d say all of the above. One semester isn’t enough and there is enormous confusion about what should be in a decent first course.
I’ve been quite interested that no one has tried to answer the questions about analogies to Modern Biology raised in my earlier comment. (Mark did provide some meta criteria, but not content criteria.)
Cheers,
Alan
16.
Mark Guzdial | August 11, 2009 at 11:21 am
Guilty as charged, Alan. I have been thinking about the challenge you raised (hence the meta-comment), but don’t have any kind of good answer yet. I’ve been reading similar issues being raised on the Self discussion list, with respect to the tension between having only a few rules or principles (but using them in sophisticated ways) versus having a more complicated system that provides more clues about how things can be used by a beginner. I agree with both you and Alfred that there isn’t resolution on what should be in the first semester.
17.
Alan Kay | August 11, 2009 at 11:42 am
Hi Mark,
Well, I’d be curious to hear how you and other readers would make an analogy to e.g. Modern Biology. I have a few ideas, but in this case waiting for other opinions is a good tactic I think.
An important part of this process is to separate content issues from curriculum issues. But I think we really do have to include the actual learning latencies required to gain enough fluency for real thinking by most students.
(So I personally don’t care whether this is one semester or eight. But my opinion is that a first course is really a two-year course for those getting into this for the first time…)
Cheers,
Alan
18. Tune Up Your PC » Post Topic » Beware Boring The Smart Kids | August 14, 2009 at 5:05 am
[…] is an interesting conversation going on in the comments of Mark Guzdial’s blog that I wanted to engage in a small part of that conversation in more depth than what fits in a […]
19.
Lisp and Smalltalk are dead: It’s C all the way down. « Computing Education Blog | August 14, 2009 at 12:10 pm
[…] being the least C-like of the popular languages in computing education today, is mostly seen as a language for the NON-computing major. It’s like faculty are saying, “Oh sure, those simpler and more conceptual ways of […]
20.
John Maloney | September 7, 2009 at 12:55 pm
Hi, Mark.
I’m glad you are speaking up for non-intimidation, inclusion, and programming languages that are both conceptually elegant and easy to use. The battle between what Dick Gabriel calls the “MIT/Stanford” philosophy vs. the “New Jersey” philosophy appears to have gone to “New Jersey” for large parts of the software industry, but that doesn’t mean educational institutions should follow.
In fact, I believe that there is growing frustration with the C legacy in industry. Increasingly, nimble companies are switching to dynamic languages such as Python and Lua. And the iPhone is programmed in Objective C, which is more like Smalltalk than C. So I think industry is slowly moving away from C/C++, just as it moved from assembly language to higher-level languages in the early 1970s. If that’s true, we’d do both students AND industry a great disservice to teach only C++ or Java, languages that will disappear in five to ten years. Better to focus on concepts and core ideas rather than a single language. And it would be best to expose students to many programming languages to keep them flexible.
To me, the joy of programming stems from the ability to create, to realize ideas I see in my mind’s eye and then to refine and polish those ideas. I don’t enjoy learning complex languages, tools, libraries, and APIs; I’d rather save my mental energy for thinking about the design of my application. My “CS1” course was half in Algol 60 and half in LISP. But it was the LISP half that showed me that a computer program could be elegant and intellectually beautiful — an insight that made me switch my major to CS.
21.
Alan Kay | September 7, 2009 at 2:51 pm
Hi John,
I couldn’t agree more with all of your main points!
It seems that the big pull is for “automobile mechanics” rather than “automobile designers” and the all-important “automobile design learner mechanics”. This sucks everything into a whirlpool of minuscule incrementality.
Maybe one of the biggest problems is just that the departments that call themselves “Computer Science” and claim to be teaching it are much too aligned with the existing de facto standards, most of which range from mundane down to bad.
I’d say this is one of the big differences between Physics and Engineering — the former has a good sense of the important differences and synergies between the two, and most definitely has a pretty good idea of what it is about.
But I don’t think that a good argument for not-C is that some companies are starting to experiment with scripting languages like Python or Lua. The companies are essentially having difficulties with everything, and in no small part because they employ mostly “automobile mechanics” who work with vendor-supplied mechanisms, instead of designers and makers who can come up with more suitable mechanisms and make them.
I have a feeling that “official CS” has too much at stake and too much baggage to reform itself. I’ve been advocating that Computing do what Biology does to escape conformity and dogma, and that is to start new departments not necessarily ideologically tied to the existing ones.
For example, I could imagine a department of “Systems Science” that took a more relational and architectural view of complexity (where algorithms, etc. are useful tools, but are not the center of the subject).
And, this relational and architectural view would extend to how “meta” is thought about in computing.
For the current topic of discussion, I think this is one of the most important foundations for what it might mean to start learning computing as a scientific and mathematical discipline, with applications to engineering.
Put simply, too much of computing is ruled by what is relatively easy to express in much too inflexible tools. Both the minds of the tool users and the nature of the tools themselves need to be made more flexible. It is a supreme irony that we have the one man-made machine that can re-express itself completely, and almost no one is actually re-expressing, but just simply using whatever level of expression is exposed to them. This seems egregiously wrong.
Cheers,
Alan