Archive for January 30, 2011
Closing down computer science at Minnesota State University
Max Hailperin passed on this story to the SIGCSE-Members list. He added: “About 40 students will graduate from the program in May. But that will leave about 40 who haven’t. They hope to get those students through within two years. But even if they do, the students may be forced to take upper-level computer science classes from faculty who may not have taught them before.” It’s interesting that Aviation was also going to be cancelled, but the local business community worked to save that program. No one did the same for CS.
It’s been a bit blue in Minnesota State University’s computer science department.
But it’s not hard to understand why.
“Everyone in the department has either been fired, retired or has resigned,” said Dean Kelley, one of those faculty members. “Two took retirement — one effective last year, one this year — one who was on a leave of absence and has resigned. As for the remaining three, the word they used was ‘retrenched.’”
Computer science as a functioning program at MSU will cease to exist at the end of this semester. So will astronomy (although they’ll still have a minor and will still offer low-level astronomy courses). And the word “journalism” will disappear entirely from the mass communications program as it transforms itself into a program of mass media.
Other programs have been retired as well. All of it, of course, was done in hopes of mitigating the damage that will be dealt to higher education across the state when the $6 billion budget shortfall is dealt with. For MSU, that means trimming roughly $10 million.
The decline effect and the scientific method: The New Yorker
Education has never been much for replication studies, but given what this article says about psychology, I’d bet that we would have trouble replicating some of our earlier education findings. I don’t see this article as condemning the scientific method so much as condemning our ability to find, define, and control all the independent variables. The world changes, and people change. Any finding that relies on a steady-state world or human being is going to be hard to replicate over time.
Before the effectiveness of a drug can be confirmed, it must be tested and tested again. Different scientists in different labs need to repeat the protocols and publish their results. The test of replicability, as it’s known, is the foundation of modern research. Replicability is how the community enforces itself. It’s a safeguard for the creep of subjectivity. Most of the time, scientists know what results they want, and that can influence the results they get. The premise of replicability is that the scientific community can correct for these flaws.
But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable. This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology. In the field of medicine, the phenomenon seems extremely widespread, affecting not only antipsychotics but also therapies ranging from cardiac stents to Vitamin E and antidepressants: Davis has a forthcoming analysis demonstrating that the efficacy of antidepressants has gone down as much as threefold in recent decades.
via The decline effect and the scientific method: The New Yorker.