The Day the Purpose of College Changed: What was the impact on CS Education?

March 27, 2015 at 7:50 am 13 comments

The article linked below argues that then-Governor Ronald Reagan changed the perception of higher education in the United States when he said, on February 28, 1967, that the purpose of higher education was jobs, not “intellectual curiosity.”  The author presents evidence that this date marks a turning point in how Americans thought about higher education.

Most of CS education came after that date, and the focus in CS education has always been jobs and meeting industry needs.  Could CS education have been different if it had started before that date?  Might we have had a CS education that was more like a liberal education?  This is an issue for me since I teach mostly liberal arts students, and I believe that computing education is important for giving people powerful new tools for expression and thought.  I wonder if the focus on tech jobs is why it has been hard to establish computing requirements in universities (as I argued in this Blog@CACM post). If the purpose of computing education in post-Reagan higher education is about jobs rather than about enhancing people’s lives, and most higher-education students aren’t going to become programmers, then it doesn’t make sense to teach everyone programming.

The Chronicle of Higher Education ran a similar piece on research (see post here).  Research today is about “grand challenges,” not about Reagan’s “intellectual curiosity.”  It’s structured, and it’s focused.  The Chronicle piece argues that some of these structured and focused efforts at the Gates Foundation were more successful at basic research than they were at achieving the project goals.

“If a university is not a place where intellectual curiosity is to be encouraged, and subsidized,” the editors wrote, “then it is nothing.”

The Times was giving voice to the ideal of liberal education, in which college is a vehicle for intellectual development, for cultivating a flexible mind, and, no matter the focus of study, for fostering a broad set of knowledge and skills whose value is not always immediately apparent.

Reagan was staking out a competing vision. Learning for learning’s sake might be nice, but the rest of us shouldn’t have to pay for it. A higher education should prepare students for jobs.

via The Day the Purpose of College Changed – Faculty – The Chronicle of Higher Education.



13 Comments

  • 1. Bonnie  |  March 27, 2015 at 8:23 am

    American higher education has always been an uneasy amalgamation of liberal arts and job-focused education. Remember the wording of the Morrill Act that established so many of our universities: “without excluding other scientific and classical studies and including military tactics, to teach such branches of learning as are related to agriculture and the mechanic arts, in such manner as the legislatures of the States may respectively prescribe, in order to promote the liberal and practical education of the industrial classes in the several pursuits and professions in life”. That phrase “the liberal and practical education of the industrial classes” is key, and I think pretty much defines our higher education system.

  • 2. dennisfrailey  |  March 27, 2015 at 9:10 am

    I’m one of the few who started my CS education in the early 1960’s – before Reagan’s statement – at a time when CS was, indeed, viewed as a “curiosity” rather than a “legitimate academic discipline”. I began by learning about Turing machines from a math professor who was seen as a sort of outcast by the math department faculty. There was nothing at all like today’s CS0 or CS1. I was expected to pick up programming in conventional languages (high level and assembly) on my own (Dan McCracken’s Fortran book and the computer’s assembly language reference manual were all I needed).

    I also took a number of related math courses – along with several other people getting degrees in math (mostly BSs but a few BAs – a less common degree today). CS courses were rare and there was no degree in the field. But what there was helped pique my curiosity because it was presented as a sort of mathematical curiosity.

    Later, when I went to graduate school at Purdue and took more actual CS courses, there was still an emphasis on topics that were interesting. Getting a job was not the focus (in fact, many advised me not to major in CS as a graduate student because it was a fad and I’d end up with a worthless degree, unable to get a job).

    Years later as a CS professor I’ve often lamented the fact that we don’t teach CS as an intellectually interesting topic that might appeal to non-technical students. To me, there are fascinating concepts in computer science (computability, completeness, the Blum speedup theorem, and of course Turing machines) that could readily be taught to intellectually curious students (when one can find them) without resorting to dense mathematics or courses in programming.

    (By the way, not only did my CS degree get me a job – without ever having had a formal course in programming in any language – but it got me a lifelong career. And along the way I hired a lot of former math professors who couldn’t find jobs.)

  • 3. alanone1  |  March 27, 2015 at 9:28 am

    Alexis de Tocqueville 101

  • 4. Graham Lee  |  March 27, 2015 at 10:00 am

    I decided to have a look at ACM’s Curriculum ’68 to see how quick that change was. This appears to be the relevant paragraph:

    > The demand for substantially increased numbers of persons to work in all areas of computing has been noted in a report of the National Academy of Sciences- National Research Council [6] (commonly known as the “Rosser Report”) and in a report of the President’s Science Advisory Committee [7] (often called the “Pierce Report”). Although programs based on the recommendations of the Curriculum Committee can contribute substantially to satisfying this demand, such programs will not cover the full breadth of the need for personnel. For example, these recommendations are not directed to the training of computer operators, coders, and other service personnel. Training for such positions, as well as for many programming positions, can probably be supplied best by applied technology programs, vocational institutes, or junior colleges. It is also likely that the majority of applications programmers in such areas as business data processing, scientific research, and engineering analysis will continue to be specialists educated in the related subject matter areas, although such students can undoubtedly profit by taking a number of computer science courses.

    But most interesting is that the committee’s 1965 preliminary report (An undergraduate program in computer science—preliminary recommendations) doesn’t talk about jobs, focusing on computer science as “a distinct field of study” and the needs to be met “in computer-related education”, including “work on applications programming” which is explicitly out of the scope of Curriculum 68.

    Anyway, it appears that there _was_ an explicit refocus from the needs of education to the needs of the workforce in computing between 1965 and 1968.

    • 5. Bonnie  |  March 27, 2015 at 11:28 am

      I graduated with a CS major in the midst of Reagan’s first term, so I doubt his ideas had much impact on the structure of my CS program. Most of the students were majoring in CS for the jobs. We all joked about it. Most importantly, the structure of the program was almost identical to today’s programs. We took CS1 to learn Pascal, and CS2 to learn the same data structures the students learn today. We took computer architecture, and theory of programming languages, and software engineering, and artificial intelligence, and databases. We took more math than students get today – a full course in mathematical logic on top of another course in automata theory and another in analysis of algorithms. I think that was because the CS major was housed in a math department.

      I have had to wade through ABET requirements and ACM 2013 recently, and what amazes me most is how little has changed. There are new topics – security was not a concern back in 1983 – but the bulk of the curriculum is IDENTICAL to what I remember in my program. So I would say, no, Reagan’s idea that universities should focus on jobs had little impact on CS programs post-’80s.

      • 6. alanone1  |  March 27, 2015 at 11:48 am

        Hi Bonnie

        I agree with your guess that Reagan had little to do with this. I got to watch this from a somewhat limited viewpoint (from within ARPA-IPTO and then Parc) with much less awareness of the general scene.

        That being said, I think that the two largest forces were — independently and partially in combination — (a) the rapid rise of IBM from the early 60s after they decided to dominate computing, and the competitive reaction of other vendors, and (b) the ACM itself, which decided to put considerable effort into defining curricula in the late sixties into the early 80s (this is where Pascal came from). My impression from the outside is that the ACM actually had more influence academically than IBM did.

        This led to what some of us from that era called “curriculum wars”, which could be oversimplified (but not to the point of great error) as “the wars of the early binders vs the late binders” or “the Algolists vs the Lispists”. This came to a peak in the late 70s, where, from my point of view anyway, the good side lost; CS in universities, especially in first classes, became the province of “the 50s”, and has pretty much remained that way to the present time.

        The vendors were somewhat to blame because they persisted in putting out old style architectures that were not efficient for late-bound programming. The architectures that were good (such as the B5000 in 1961, and the Parc Alto in the 70s) were suppressed or not noticed despite considerable effort to inform and teach.

        Sic transit gloria mundi

        • 7. gasstationwithoutpumps  |  March 27, 2015 at 1:07 pm

          The “vendors” have largely designed architectures that were pushed forward by academics. (The ARM architecture owes a lot to the RISC concepts coming out of Berkeley and Stanford, for example). Various attempts to push other architectures (like the Alto) fizzled, and it is not clear that the reasons were that vendors weren’t interested (though the dysfunction of Xerox in not capitalizing on their research arm is legendary).

          • 8. alanone1  |  March 27, 2015 at 1:32 pm

            This is a complicated history, but certainly among the earliest “RISC” architectures was the Control Data 6600, which I spent quite a bit of time programming in the mid-60s. The term was invented by Patterson in the late 70s, but not the architectural ideas. (We were quite involved in this because the Berkeley work used the Mead-Conway process methodology that was developed jointly by Parc and CalTech.)

            And having been in the middle of trying to convince Intel, Motorola and AMD to understand the Alto architecture — and our efforts didn’t succeed — the reasons *are* quite clear. In this case, Xerox’s problems with trying to make the Star were quite orthogonal.

            With all due respect, both of these examples are also examples of the difference between guessing (the favorite web version of history), and actual presence at the events in question.

      • 9. gasstationwithoutpumps  |  March 27, 2015 at 1:03 pm

        I got into CS rather late (switching from math after my MS in 1976), but I too have noticed that the current curricula for computer science have not changed much from when I was a student. The initial teaching language has changed a little (from Pascal or C to Java or C++), and I think we are about to see another shift (to Python), but these changes are quite minor. The overall structure of the curriculum is not much different—big chunks could probably be taught from 30-year-old books without change.

        This is not entirely bad, as it is common in mature fields—intro physics, chemistry, and math have changed even less. The problem is that CS still wants to pretend that it is a young, rapidly changing field, which no longer seems to be true, except in niche areas.

  • 10. Kathi Fisler  |  March 28, 2015 at 6:31 am

    The recent proposal that Sweet Briar shift to being a STEM-focused institution seems a case in point for a discussion on the shifting role and/or perception of education.

    As one who teaches at a STEM-focused university, this entire thread has me reflecting on the ways in which we try to promote the “liberal education” perspectives within our curricula. I’m not saying that STEM-focused necessarily equals jobs-focused, but I expect that association is fairly strong in practice. My institution does it largely through a requirement that every student finish a significant project in the interaction of society and technology. Each project spans 3 courses (in time and credit), is typically done in teams, and is often done in another country (doubling as our study abroad program).

    Many students who go off-campus for this feel deeply changed by their project experience. It goes beyond the basic bit about being abroad. Somehow, being away seems to make them pay more attention to the societal aspect of their projects. Most still go on to take typical domestic STEM jobs, but this approach gets through to many of them that there is interesting “stuff” beyond STEM, in ways that courses don’t do as effectively.

    So perhaps the question is what curricular architectures could we use to inject more of a liberal education perspective into a jobs-focused mentality (realizing this question is fraught with landmines around different kinds of institutions, etc).

  • 11. alanone1  |  March 28, 2015 at 8:29 am

    Another perspective on the larger process of change in universities is to be found in books about the baby boom and its effects on higher education, such as “Impostors in the Temple”, written by a Stanford professor (and originally appearing in the early 90s).

    Part of the argument is that colleges were not prepared for the baby boom: some of the results were a large expansion in new profs and journals, with a considerable lowering of standards and goals, and this pressure extended into fiscal problems that brought more business types into university administration.

    • 12. dennisfrailey  |  March 28, 2015 at 1:42 pm

      One could argue the other side as well, and I’ll take a stab at it. (Forgive the somewhat exaggerated perspective – I hope to start a discussion!)

      The academy had become so stultified and hidebound over the years that it isn’t surprising there were efforts to return it to its origins. The “baby boom” provided the opportunity because changes were needed to accommodate the growth in enrollments. The original purpose of college was to prepare people in very practical ways to live in the world. The “liberal arts” were originally instituted as a means to prepare people for the practical side of life. People needed to know how to read and write, how to do basic mathematics, and so forth. Here’s a quote from Wikipedia that sums it up: “The liberal arts (Latin: artes liberales) are those subjects or skills that in classical antiquity were considered essential for a free person (Latin: liberalis, ‘worthy of a free person’) to know in order to take an active part in civic life, something that (for Ancient Greece) included participating in public debate, defending oneself in court, serving on juries, and most importantly, military service. Grammar, logic, and rhetoric were the core liberal arts, while arithmetic, geometry, the theory of music, and astronomy also played a (somewhat lesser) part in education.”

      How far we have strayed from this! How many college graduates know good grammar, can argue in a logical manner, can write well, can do basic arithmetic and geometry, understand much about music, or anything at all about astronomy? In today’s world, how many college graduates know the basics of financial management (a key skill for success in modern society)? The practical and pragmatic “active part in civic life” purposes have often been supplanted by an emphasis on intellectual inquiry into increasingly more abstruse and obscure topics that occasionally prove valuable to humanity, but more often serve to provide employment for individuals poorly suited to success in normal human society. Tenure rules often preclude the use of faculty members who know a lot about the practical side of life (I’ve witnessed a number of fabulous teachers who could not get tenure and were thus dismissed from university faculties, after having spent long, successful careers in industry or government). After layers and layers of the “publish or perish” mentality, colleges have often become bastions of mediocre scholarship and arcane criteria for success (number of papers in recognized journals that hardly anybody ever reads). “Academic Freedom” was introduced to protect academic inquiry and teaching from the whims of totalitarian political regimes, but it has become an excuse for failure to apply reasonable standards of performance to those who serve on university faculty.

      All that being said, as a student I loved the opportunity to delve into topics that I had little knowledge of. It was fun for me to take extra courses so I could learn about topics well outside the bounds of my chosen major (mathematics). College was, indeed, a bountiful resource for intellectual inquiry by those who wanted to take advantage of that – but that doesn’t mean we should overlook the more practical purposes of a college education.

      A good CS program should strike an appropriate balance – providing important skills for success in life (much more than programming skill, by the way) while providing the opportunity for engaging in intellectual curiosity. But how many students take good advantage of that opportunity? I’ve had many students who won’t take a truly interesting course because it “might hurt their GPA”. And years of serving as an accreditation program evaluator have shown me that although quite a few students take courses well outside of their major, all too often those courses seem to be of the “gut” variety, designed to help them avoid hard work and improve their class rank.

      To sum up: college should prepare one for life. That includes many practical skills as well as a deep understanding of how much more there is to learn.

  • 13. Taylor Williams  |  April 6, 2015 at 1:56 am

    A timely question… I wrote a blog post about this around the same time in response to an op-ed by Fareed Zakaria last week. Among other issues, he seems to believe that CS and other STEM fields are purely “skills-based” rather than what he calls “broad-based”. I’m with you in believing that CS education can and should be more than job training. Check it out:

    http://www.mrwilliamsstem.com/posts/258796-stem-and-the-liberal-education
