Posts tagged ‘image of computing’

Wolfram on the importance of computing for understanding the world

Spending too much time in airports lately, I’ve been catching up on some of my TED video watching — the ones that everyone says I have to watch, but I didn’t have time for until now. One that I watched recently was Stephen Wolfram’s talk on A New Kind of Science and Wolfram|Alpha. I realized that he’s really making a computing education argument. He is explicitly saying that computing is necessary for understanding the natural world, and that all scientists need to learn about computation in order to make the next round of discoveries about how our universe works.


June 6, 2011 at 10:39 am 3 comments

Intercultural Computer Science Education

Thanks to Sarita Yardi for these. Talk about CS Unplugged!

April 20, 2011 at 11:52 am 1 comment

New Draft K-12 Model CS Curriculum Available for Comment

The announcement below was posted by Dr. Chris Stephenson, Executive Director of the Computer Science Teachers Association (CSTA), on the SIGCSE-Members list.  This is really important — the whole Running on Empty report came from a comparison of state curricula to the current model curriculum.

I am glad that the draft is available for comment and encourage everyone to review it.  I’ve read through it once, and don’t quite understand it.  Why is it part of computational thinking that all high school students know how to convert between decimal, binary, octal, and hexadecimal (pages 23 and 60)?  Is it really necessary for all students to learn how to program mobile devices and write client- and server-side scripts (page 23)?  I like the bullet about representation and trade-offs in digital information, but I would have liked some specifics on what students will learn, like the kinds of errors that occur.  The current draft seems tied to current technology rather than to big ideas or principles. (Are most K-12 standards like this?  The AAAS standards aren’t, but maybe they are the anomaly.)
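For concreteness, the conversion requirement amounts to something like the following in modern Python (my own minimal sketch; the draft itself doesn’t prescribe a language):

    # One value, four notations -- the conversion skill the draft asks
    # of all high school students.
    n = 172                    # decimal
    print(bin(n))              # binary:      0b10101100
    print(oct(n))              # octal:       0o254
    print(hex(n))              # hexadecimal: 0xac
    # int() parses a string in any given base, so the round trip is:
    print(int("10101100", 2), int("254", 8), int("ac", 16))  # 172 172 172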

I’m planning to re-read it, because I might not have got the big picture.  I strongly encourage all of you to read and comment on it.

Since it was first released in 2003, the ACM/CSTA Model Curriculum for K-12 Computer Science has served as the national standards for pre-college computer science education. This year, CSTA formed a committee of specialists (co-chaired by Allen Tucker and Deborah Seehorn) from all educational levels to review and revise these standards.

Based on the following definition of computer science:

Computer science is the study of computers and algorithmic processes, including their principles, their hardware and software designs, their applications, and their impact on society, and it includes the following elements:

• programming,

• hardware design,

• networks,

• graphics, 

• databases and information retrieval,

• computer security,

• software design,

• programming languages,

• logic,

• programming paradigms,

• translation between levels of abstraction,

• artificial intelligence,

• the limits of computation (what computers can’t do),

• applications in information technology and information systems, and

• social issues (Internet security, privacy, intellectual property, etc.).

The K-12 Computer Science Standards provide learning outcomes for students in grades K through 12. These learning outcomes are divided into three levels:

• Level 1 (grades K–6): Computer Science and Me

• Level 2 (grades 6–9): Computer Science and Community

• Level 3 (grades 9–12): Applying Concepts and Creating Real-World Solutions

  • Level 3A (grades 9 or 10): Computer Science in the Modern World

  • Level 3B (grades 10 or 11): Computer Science Principles

  • Level 3C (grades 11 or 12): Topics in Computer Science

The learning outcomes within each level are organized into the following strands:

• Computational Thinking

• Collaboration

• Computing Practice

• Computers and Communications Devices

• Community, Global, and Ethical Impacts

CSTA invites you to review and submit comments on the review draft of the new CSTA K-12 Computer Science Learning Standards: Revised 2011. A copy of the document is available for download at:

http://csta.acm.org/includes/Other/CS_Standards.html

This site also provides access to an online form that will be used to collect all reader comments and suggestions. The review process will be open until June 15, 2011.

Allen Tucker

Deborah Seehorn

Chairs, CSTA Standards Task Force

April 15, 2011 at 8:21 am 7 comments

Commodification of Academic Research

I suspect that this is a bigger issue in computer science (and computing, broadly) than in other parts of academia, since our work is so easily commoditized.  It’s certainly the case that in my School, creating companies is highly valued and faculty are often encouraged to be entrepreneurs (e.g., see the article our new Dean sent to the whole faculty Saturday.)

Q: Academic research has always cost money to produce, and led to products that made money for others. How is the “commodification” of research different today than in past periods?

A: Commodification means that all kinds of activities and their results are predominantly interpreted and assessed on the basis of economic criteria. In this sense, recent academic research is far more commodified than it was in the past. In general terms, one can say that the relation between “money” and specific academic activity has become much more direct. Consider the following examples: first, the amount of external funding acquired is often used as a measure of individual academic quality; second, specific assessments by individual scientists have a direct impact on departmental budgets; for instance, if I now pass this doctoral dissertation, my department receives a substantial sum of money; if not, it ends up with a budget deficit; third, the growing practice of patenting the results of academic research is explicitly aimed at acquiring commercial monopolies. Related to these financial issues are important and substantial changes of academic culture. Universities are increasingly being run as big corporations. They have a top-down command structure and an academic culture in which individual university scientists are forced to behave like mini-capitalists in order to survive, guided by an entrepreneurial ethos aimed at maximizing the capitalization of their knowledge.

via News: ‘Commodification of Academic Research’ – Inside Higher Ed.

October 25, 2010 at 11:05 am 1 comment

The Utilitarian CS Ed Imperative and HyperCard on the Web

I just discovered TileStack, which is HyperCard on the Web.  Very cool, but the first comment on the introductory stack is something I heard a good bit these last few weeks at my workshops:

Python, for instance, is very easy to pick up.  You might make the argument that it’s much easier to learn Speak [the HyperCard-like language in TileStack], but even if it takes twice as long to learn Python to do the equivalent of making a Stack with Speak, you can at least apply what you learned in many other places other than tilestack.com.  Just seems pointless for people to waste their time learning something that only applies to a single website when they could learn something that they could use for many other applications.

via TileStack – Intro To TileStack – Start Here!.

Based on my experience, most computer science teachers (much more at the undergraduate faculty level than at the high school level!) believe that the only things worth learning in computer science are those that can be used to make applications.

  • As soon as I started teaching about JES and Jython, a set of faculty in every workshop I taught this summer (five workshops, all pretty much full!) asked me, “But how do I build applications?” or “How can I run this outside of JES?”  I explained that this was all possible, but that we don’t teach in the first semester how to build standalone applications.  Several faculty insisted that I show them how to run Jython with our media libraries separate from JES, and were frankly not interested in listening to anything more I had to say unless they could be convinced that what I was showing them could lead to building standalone applications.
  • Several faculty asked me, “But this isn’t Python 3.0, is it?  When will you be covering Python 3.0?”  That one particularly got my goat.  I started responding, “I’m barely covering Python 1.0 in here!  I’m trying to teach computer science with the minimum language features, much less whatever special features are in the latest version of a language!”  That response seemed to carry some weight.

I was really surprised about that.  I hear people regularly decrying the fact that computer science in most states is classified under vocational education.  But it’s certainly the case that many university faculty buy into that model!  I regularly was told by faculty at these workshops that computer science is only worth learning if it leads to job skills and application-building capabilities.  CS education is purely utilitarian, in this model.

Why do we teach people the difference between mitosis and meiosis, or about evolution, or that planets orbit the sun?  None of those are job skills, and they certainly won’t lead to building marketable products.  Isn’t knowing about computer science and one’s virtual world at least as important as understanding this level of detail about the natural world?  I’m going to bet that, if someone were to do a survey, most university faculty don’t really believe in computational thinking, that knowing about computing at some beyond-applications level is important for everyone.

Grumble, grumble…

July 12, 2010 at 11:25 am 4 comments

Talks and Trips: Learning Computing Concepts vs. Skills?

I’m writing from Chicago where I’m attending the International Conference of the Learning Sciences 2010. It’s pretty exciting for me to be back here. I helped co-chair the 1998 ICLS in Atlanta, but I haven’t been at this conference since 2002, when my focus shifted from general educational technology to specifically computing education. The theme this week is “Learning in the Disciplines.” I’m here at the invitation of Tom Moher to be part of a panel on Friday morning on computing education, with Yasmin Kafai, Ulrich Hoppe, and Sally Fincher. The questions for the panel are:

  • What specific type of knowledge is characteristic of computer science? Is there a specific epistemology?
  • Are there unique challenges or characteristics of learning in and teaching about computer science?
  • What does learning about computing look like for different audiences: young children, high school, undergraduate, and beyond (e.g., professional scientists, or professionals from non-computing disciplines)? In the case of “non-computing professionals,” what do they learn, and how do they learn it (e.g., what information ecologies do they draw upon, and how do they find useful information)?
  • How do we support (broadly) learning about computer science?

In a couple of weeks, I’m giving the keynote talk at EAAI-10: The First Symposium on Educational Advances in Artificial Intelligence. I’m no AI person, but this conference has a strong computing education focus. I’m planning to use it as an opportunity to identify challenges in computing education where I think AI researchers have a particularly strong lever for making things better. Not much travel for that one — I get to stay in Atlanta for a whole week!

In getting ready for my talk Friday, I’ve been trying to use themes from the learning sciences to think about learning computing. For example, physics educators (BTW, Carl Wieman is here for the opening keynote tonight) have identified which physics concepts are particularly hard to understand. The challenge of learning those concepts is due in part to misconceptions that students have developed from years of trying to understand the physical world in their daily lives. I’ve realized that I don’t know of computing education research that’s looked at what’s hard about learning concepts in computing, rather than skills. We have lots of studies that have explored how students do (not?) learn how to program, such as Mike McCracken’s, Ray Lister’s, and Allison Tew’s studies. But what about how well students learn concepts like these (the first of which is made concrete in the sketch after this list):

  • “All information in a computer is made up of bytes, so any single byte could be anything from the red channel of a pixel in a picture, to an instruction to the processor.” Or
  • “All Internet traffic is made up of packets. So while it may seem like you have a continuous closed connection to your grandmother via Skype, you really don’t.”
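To make the first of those concrete, here is a minimal Python sketch (my illustration, not drawn from any of the studies mentioned): a byte is just a bit pattern until we choose an interpretation for it.

    # One byte, several interpretations: the same bit pattern can be an
    # integer, a pixel's color channel, or a processor instruction.
    b = 0x90                   # one byte: 144 in decimal
    print(b, bin(b))           # 144 0b10010000
    pixel = (b, 0, 0)          # as the red channel of an RGB pixel
    print(pixel)               # (144, 0, 0)
    print(bytes([b]))          # as raw data: b'\x90'
    # Interpreted as an x86 machine instruction, the very same byte 0x90
    # is NOP ("do nothing") -- nothing in the byte says which reading is right.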

Does anybody have any pointers to studies that have explored students learning conceptual (not skill-based) knowledge about computing?

I know that there is an argument that says, “Computing is different from Physics because students have probably never seen low-level computer science before entering our classes, so they have few relevant preconceptions.” I believed that until I saw Mike Hewner’s data from his study of high school students in our Georgia Computes! mentoring program this last year. These are high school students who are being trained to be mentors in our workshops for younger students (e.g., middle school kids, Girl Scouts). They’re getting to see a lot of cool tools and learning a bunch about computer science. Mike found that they had persistent misconceptions about what computer science is, such as “Someone who is really great at Photoshop is a great computer scientist.” While that’s not a misconception about bytes or packets, it is a misconception that influences what they think is relevant. The concept about bytes might seem relevant if students think that CS is all about great graphics design, but the packet concept interferes with their perception of Skype and doesn’t help with Photoshop — students might ignore or dismiss it, just as physics students say to themselves, “Yeah, in class and on exams, gravity pulls the projectile down, but I know that it’s really about air pressing down on the projectile.” So students’ misconceptions about what’s important about computing might be influencing what they pay attention to, even if they still know nothing about computer science.

June 29, 2010 at 3:33 pm 3 comments

Computing at odds with getting Faster

I just finished reading James Gleick’s book Faster: The Acceleration of Just About Everything.  It’s a ten-year-old book now, but the story is still valid today.  I didn’t enjoy it as much as his books Chaos or Genius. However, the points of Faster are particularly relevant for computing education.

One of Gleick’s anecdotes was on how AT&T sold Touch Tone dialing in 1964 as saving an average of ten seconds per seven-digit number dialed.  Now, we have speed dialing.

In the post-Touch Tone generation, you probably have speed-dial buttons on your telephone.  Investing a half-hour in learning to program them is like advancing a hundred dollars to buy a year’s supply of light at a penny discount…To save time, you must invest time.

Do some students and end-user programmers invest time in learning to program to “advance a hundred dollars to buy a year’s supply of light at a penny discount”?  Are they looking to program in order to save time, to do things faster and more efficiently?  Do they give up on learning to program when they realize that it doesn’t work that way?

The problem is that I don’t think that ever really happens for the individual writing code for him or herself.  It’s hard to program.  The time cost of programming amortizes over users.  The development cost of Microsoft Office, amortized over millions of users, results in a profit for Microsoft.  A few hours of a programmer’s time on some feature of Excel enables many hours of use of that feature by many users.  But for any individual writing code for him or herself?  It takes a lot more than 30 minutes of writing software to get the same usefulness as 30 minutes of programming speed-dial buttons.
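A back-of-the-envelope version of that arithmetic, with invented numbers, just to make the amortization point visible:

    # Break-even arithmetic for "invest time to save time" (numbers invented).
    dev_hours = 30.0                  # hours spent writing a personal script
    saved_per_use = 10.0 / 3600       # ten seconds saved per use, in hours
    print(dev_hours / saved_per_use)  # 10800.0 uses just to break even
    # Amortized over a million users, though, the same 30 hours cost each
    # user only about a tenth of a second of developer time:
    print(dev_hours * 3600 / 1e6)     # 0.108 seconds per user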

So why program?  In the Media Computation Python CS1 class, we tell students that they should program in order to create a replicable process (if you need something to be done the same way, maybe by others, many times), to create a process that many people can use (like when commercial software is created), or to communicate a process (like when trying to explain a theory of how something dynamic happens, like DNA transcription or evolution).  Paul Graham tells us that hackers write software to create beauty.  But few people successfully program in order to save time for themselves — you’d have to do something many times to make the benefits of use outweigh the cost of development.

Maybe it shouldn’t be that way.  Maybe software development should be easier.  I wonder if you could make it easier, and still keep all the fun, all the communicative power of programming languages, all the “Passion, Beauty, Joy, and Awe”?

The overall story of Faster may be relevant for understanding the decline in interest in computer science.  Gleick claims that “boredom” is actually a modern word and concept.  “To bore meant, at first, something another person could do to you, specifically by speaking too long, too rudely, and too irrelevantly.”  Today, we are bored by simple silence — by not enough challenges, not enough multi-tasking, by too many choices.  We have so many options for entertainment that we choose many at once, so that we drive, while listening to the radio, and talking on the cell phone (not texting or doing email, of course).  Gleick (naturally, as an author) bemoans the death of the book, because readers are too easily bored to pay attention to a whole book, and always have the options of magazines or blogs or just 140 character “tweets.”  Why would anyone make a career choice like “computer science” when there are so many other choices that are less boring, take less concentrated focus, take less time?

Gleick provides an afterword for the electronic version of the book (I read it on my Kindle), where he speaks to some of these concerns:

I believed when I began Faster, and believe now more than ever, that we are reckless in closing our eyes to the acceleration of our world. We think we know this stuff, and we fail to see connections. We struggle to perceive the process of change even as we ourselves are changing.


June 2, 2010 at 11:14 pm 8 comments

Albion College eliminates Computer Science

Budget cuts and low enrollment have led to this:

In similar letters from Paul Tobias (Chairman, Albion College Board of Trustees) sent to the Albion faculty and the Albion family, the Board of Trustees reported that they have eliminated computer science as a major at Albion College and that Albion College may continue to offer a computer science minor. In the process, an untenured Assistant Professor has been notified his position will be discontinued after the 2010-2011 academic year. The letter to students also indicated “Students who are currently enrolled in the affected programs will receive personalized advising to enable them to accomplish their academic goals and fulfill their graduation requirements for their major in a timely manner.”

via Albion College Math/CS – News.

In other news coverage, they detail the cuts overall:

Majors in computer science and physical education and minors in dance, journalism and physical education will not be part of the college’s curriculum moving forward — a reduction strategy that will eliminate about 12 courses, said Dr. Donna Randall, the college’s president.

via MLive news: Albion College officials defend decisions.

That comparison point really hit home.  Newspapers are dying, so journalism is less valued and on the chopping block.  Okay, I get that.  Physical education is the least rigorous field of education to prepare teachers for, so if you have to chop one, that’s the least valued.  And computer science is in that group.

To me, this is a sign of the dire straits of computer science and university budgets these days.  More than that, it’s a sign that computing literacy among the general public is at an all-time low.  The uproar about these decisions is that they were made by a governing board, against the wishes of the faculty.  Does this governing board really see computer science as so useless, so lacking in value?  The board made this decision based on “how do we best prepare our students for meaningful … work in the 21st century?”  What do they think computer science is?

May 20, 2010 at 7:23 am 6 comments

Is Media Computation “bait and switch”?

The question that Jennifer Kay raised in her AAAI Spring Symposium paper is about robotics, but her question on the SIGCSE Members list is more general: “Do we have any empirical evidence that cool stuff genuinely does attract more students?”  Bruce Barton changed the question slightly in his message on the list:

Are we doing a disservice to our students by teaching them robotics, animation, game development, etc. when most of the industry is performing fairly mundane computer programming tasks?  I understand that we are trying to increase enrollment and also retention.  But are we perpetrating a bait and switch scam on our students?  Back when I first started out (late 60’s), data processing was where it was at and we enjoyed what we were doing.  Has the video generation had their attention span so decreased that they can only learn if we make the learning experience play-time?  I have heard the reports about video gaming drawing in the students and that video gaming is the new big thing in the industry.  But each year we put out many thousands of graduates who want to become game developers and there are certainly not that many jobs available in that specialty.  Where do the graduates who don’t make it into game development go?  Should we be the voice of reality for them?  Would we really lose that many students if we approached the subject in a less fanciful way?

There is evidence that more engaging approaches in the first semester do lead to improved retention in later classes, even when those later classes are more traditional.  Charlie McDowell found that with pair programming. Beth Simon’s ITiCSE 2010 paper shows Media Computation CS1 students succeeding more in a (traditional) CS2 than students from a traditional CS1.

Why does this happen?  Why is it that students stick with computer science, after an engaging start, even if those latter courses are no different than they have ever been?

  • One theory is that we simply have to get students engaged, and then they see the value of computing in a broader sense. Once they see computing in the form of a concrete and engaging application area, then maybe they see the value of computer science in its general form.
  • Alternatively, maybe the first course sets up the carrot, and students are willing to bear with the rest in order to achieve that carrot.  Students in our Computational Media degree program want to go off to Electronic Arts or Pixar, and they are willing to go through courses that they find less engaging, and even (in their opinion) less valuable, in order to achieve their degree in order to improve their access to the careers they want.  Maybe the first course (in robotics, in media computation, with pair programming) shows them the best that they might find in computer science, and that makes it all worthwhile.

The implication in these statements is that the rest of the curriculum is boring and unengaging, and that most jobs in computing are similar.  Is it true that most computing jobs are boring and unengaging?  That’s counter to what we’ve been telling students the last few years.  Does the curriculum have to be boring and unengaging?  Maybe some students want the pure computing.  In Lana Yarosh’s paper on our Media Computation Data Structures course, we found that about 10% of the students didn’t want the engaging media context — they wanted pure data structures.  In the paper by Allison Tew and others on the use of a Nintendo Game Boy context for a computer organization course, they found that students were much more excited about the “boring” topic of computer organization with the engaging context — and they still learned the computer organization pretty well.

Do we really believe that computer science is inherently boring and unengaging?  Why is that?  Why would we believe that about ourselves and our field?

May 2, 2010 at 5:14 am 5 comments

Alan Kay on Hoping That “Simple” is not “Too Simple”

Alan wanted to make this longer comment, but couldn’t figure out where it fit naturally, so he kindly forwarded it to me to provide here:

Mark in his blog has provided a cornucopia of useful topics and questions about teaching computing to a wide demographic. It’s all very complex and (to me at least) difficult to think about. My simple-minded approach for dealing with this looks at “humans making/doing things” as having three main aspects:

1. Bricks, mortar, and bricklaying
2. Architectures
3. Models of the above

And we can think of the “model” category as being composed of the same three categories.
1. Bricks, mortar, and bricklaying of models
2. Architectures for models
3. (Meta) Models of the above

If we stop here we have a perhaps overly simplistic outline of the kinds of things to be learned in computing (and many other activities as well).

Questions I would ask about these include:

  • How many ideas are there here, and especially, how many ideas at a time can learners handle?
  • How much real practice of each of these is required for real understanding and operational usage?
  • Where can we look for useful parallels that will help us think about our own relatively undeveloped area?
    • Music?
    • Sports?
    • Science?
    • Engineering?

To take the last first, we would (or I would) be very surprised to be able to prepare someone as a professional in 4 years of college if they started from scratch in any of the possible parallels listed above. To go to the really simplistic idea of “hours put in”, there just aren’t enough actual hours available per year (3 practice hours a day is about 1000 hours a year) and professional fluency in any of the above will require more than 4000 hours of practice from most learners. And it’s not just a question of hours. There are longitudinal requirements (time for certain ideas and skills to “sink in”) which probably represent real latencies in both the “notional” and physiological parts of learners’ minds.

A large number of those going into any of the four areas started learning, training, and practicing in childhood. And for those who try to start as a first year college student ….

a. This “problem” is “solved” for music partly by the existence of “pop music” much of which does not require deep fluency in music for participation. (And it is certainly not hard to see real parallels and the existence of “pop computing” in our culture.) Classical and jazz music simply require a lot more time and work.

b. The problem is solved for professional sports by excluding the not skilled enough (and even quite a few of those with skills, and who did start in childhood). The last census listed about 65,000 professional athletes in all US sports. This is a small job market.

c. The problem is solved for the hard sciences (and medicine) most often with extensive postgraduate learning, training and practicing (and by high thresholds at the end). Should we ask where those who, for one reason or another didn’t make the cut, wind up?

d. I don’t know what the engineering demographics are (but would like to). Engineering has always had a strong ad hoc nature (which is what allowed it to be invented and practiced long before mathematics and science were fully invented). Architecture is harder than bricklaying, so one could imagine many with engineering UG degrees winding up in technical companies in what would be essentially apprentice processes.

I’m guessing that this is where similar computer students with undergraduate degrees might wind up — essentially doing bricklaying in some corporate notion of architecture.

Both of these last two seem to me to be dead ends — but it would be good to have more than personal and anecdotal evidence. My own observations would generalize to “they don’t learn much that is good” in their undergraduate experience, and “they learn even less that is good when on the job”.

I think universities have a moral obligation to try to deal with the “they don’t learn much that is good” part of this problem. And doing this well enough could cause large useful and important changes in industry over the next decade or two.

If I were going to get started on this, I would try to put forth a very clear outline of the six aspects of computing I listed above, show how they work together — and try to sketch out what it actually takes to learn them for most college students.

In my thinking about this I keep on coming back — not to the problems of “coverage” over 4 years — but what seems to me to be the larger problem of getting in enough real practicing of the various kinds needed to actually ground the ideas into thoughtful and operational tools.

Best wishes,

Alan

April 23, 2010 at 12:51 pm 24 comments

The Millennials are like the adults, only more so

I’ve been thinking about the Pew study of Millennials since it came out in February.  Are Millennials really different in some significant way from previous generations?  From the perspective of computing education, I see the same cognitive issues today as in years past.  The problems with loops that Lister’s ITiCSE working group study found look pretty similar to the problems that Elliot Soloway and Jim Spohrer identified among Yale undergraduates working on the Rainfall problem in the early 1980s.  I look at my 1995 SIGCSE paper on the challenges that students face in learning object-oriented programming, and I see those exact same problems among the seniors in my Capstone Design class this semester.

The most detailed study to date of the 18- to 29-year-old Millennial generation finds this group probably will be the most educated in American history. But the 50 million Millennials also have the highest share who are unemployed or out of the workforce in almost four decades, according to the study, released today by the Pew Research Center.

via Study: Millennial generation more educated, less employed: USA Today.

There is one place where I see a problem with Millennials, one not unique to them but even stronger with them than among the adults.  My students and I have been working on papers for ICER 2010 over the last couple of weeks.  A common theme that we’re seeing in several different studies is a perception among our participants that Computer Science is about advanced use of applications.  If you really know how to use Photoshop, then that’s Computer Science.  It’s a hard misconception to deal with because an expert on Photoshop probably has picked up a lot of what we would recognize as Computer Science knowledge — about digital representation of data, about processing, about efficiency.  It’s not that the perception is wrong; it’s just missing an important perspective.

What’s striking about this misperception is that it shows up in several studies, from high school students to adults.  The Millennials might have it a bit stronger, a bit more persistently than the adults, because they have used computer applications for so long.  The Millennials hear us talk about real computer science, and they give us the “Yeah, yeah — I’ll tell that back to you on the test, but I know what really matters.”  They listen to us, but don’t think it’s all that important.  If they don’t think it’s important, they make little effort to really learn it. We find that this perception is strong among the adults, too.  The adults care about employment.  If you finally understand the difference between arrays and linked lists, you have made an important intellectual step, but you haven’t generated a new line on your resume.  If you take a class on “Advanced Photoshop,” you do have a new claim that can lead to a new job.  The adults in our studies, too, see advanced application use as being “Computer Science,” and as far more valuable than a degree in Computer Science. The adults don’t give us the “Yeah, yeah” bit — they just ignore “Computer Science” entirely.

Both Millennials and adults are practical.  What gives me the most benefit for the least cost?  Learning computer science is hard, and its value is indeterminate, especially to someone who doesn’t understand the IT industry.  Learning to use applications better is an obvious job skill.  The fact that the advanced levels of the latter overlap with some levels of the former makes it even harder for us educators to make our case.

April 22, 2010 at 8:41 am 5 comments

Can computing curricula be neutral?

Erik asked a great question in a comment to the “White Boys are Boring” post (a post which was clearly accompanied by a healthy serving of hyperbole, as Kurt pointed out):

Has anyone looked at the comparative efficacies of race/gender neutral programs to increase participation versus ones targeted at specific races or at women?

I do know that curricula designed to address the needs of women and members of underrepresented minorities work better at attracting those students than traditional ones — that’s one of the directions that the NSF BPC program has been exploring.  That’s not answering Erik’s question, though. The traditional computing curriculum is not neutral.

Media Computation was not designed explicitly to attract women and minority students.  We designed Media Computation to attract Liberal Arts, Architecture, and Management majors, and we used sources like Margolis and Fisher’s Unlocking the Clubhouse to inform our decisions.  The result is that no published study has found a difference in success rates due to gender or ethnicity, and the published studies show that women are more likely to succeed with Media Computation than with whatever was the traditional curriculum.  That doesn’t mean that Media Computation is neutral — some students dislike it.  The distinction doesn’t seem to be due to gender or ethnicity.

When designing computing curricula, most teachers aim to make assignments and examples motivating and interesting, and in so doing, we speak to some members of our audience and not others.  When we use video games or robots in examples, for example, we tend to get the boys more engaged than the girls.  I’ve found that it’s hard to be culturally neutral in my own assignments.  One year, I used an example in an object-oriented design course about the parts of a car (lots of opportunity for aggregation and part-of relationships there, as in the sketch below), only to find that my students from the developing world didn’t have much experience with cars and didn’t know anything about the parts of an engine.  Our introductory courses used to build assignments around board games like Yahtzee and Risk, which were really engaging for students who knew those games, and drudgery for those who didn’t.  (Implementing pages of rules for a game you’ve never played is dull.)  There were cultural biases in the choices of games, e.g., favoring the kinds of games that, in the US, middle-class kids in suburbia played.
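For readers who don’t know the jargon, here is the flavor of that car-parts exercise (a minimal Python illustration of aggregation and part-of relationships, not the actual assignment):

    # Part-of relationships: a Car has an Engine, an Engine has Pistons.
    class Piston:
        pass

    class Engine:
        def __init__(self, cylinders):
            # an Engine is composed of its Pistons (part-of)
            self.pistons = [Piston() for _ in range(cylinders)]

    class Car:
        def __init__(self):
            # a Car aggregates an Engine, which aggregates its parts
            self.engine = Engine(cylinders=4)

    car = Car()
    print(len(car.engine.pistons))    # 4

And the code is easy only if you already know what engines, cylinders, and pistons are, which is exactly the cultural-bias point.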

The question to which I don’t know the answer is whether it’s possible to build a “neutral” curriculum.  The academic answer seems to be “no,” but it’s still an issue being explored.  Some of what I’ve found from some digging:

Simply put, teaching math in a neutral manner is not possible. No math teaching — no teaching of any kind, for that matter — is actually “neutral,” although some teachers may be unaware of this. As historian Howard Zinn once wrote: “In a world where justice is maldistributed, there is no such thing as a neutral or representative recapitulation of the facts.”

Bottom line is that I don’t think anyone can answer Erik’s question.  Maybe the academics are wrong and it’s possible to build neutral curricula — there certainly are attempts today.  However, if we don’t know whether we can build one, then we definitely don’t have any to compare.

March 22, 2010 at 7:58 am 4 comments

US Dept of Ed says CS is part of STEM

A note follows from Susan Rodger to ACM SIGCSE members, from her position on the ACM Education Policy Committee.  This is great news!  Cameron Wilson showed us this at the ACM Education Council meeting last weekend — the quoted statement showed up in the Federal Register, so it’s citable:

As a member of the ACM Education Policy Committee I wanted to make SIGCSE members aware of two important items.

1) First, the Department of Education has recognized computer science as a
science part of STEM. This is important for applying for funds related to
STEM.

“Consistent with the Race to the Top Fund program, the Department interprets
the core academic subject of science under section 9101(11) to include
STEM education (science, technology, engineering and mathematics) which
encompasses a wide-range of disciplines, including computer science.”

2) The Department of Education has two funds to apply for:

a) Invest in Innovation Fund (I3)
You can apply for these funds. A letter of intent is due April 1.

b) Race to the Top
Only states can apply for these funds, but you can contact your
state department of education and point out to them that computer
science is an eligible discipline and ask how computer science
education fits into your state’s plan.

For more details, please see this memorandum from ACM:

http://www.cs.duke.edu/csed/acmpolicy/RTTT_i3_Funding_Memo_v2.pdf

Susan

===========================================================================
Susan Rodger, Professor of the Practice
Dept. of Computer Science, Box 90129
LSRC Room D237
Duke University, Durham, NC 27708-0129

March 21, 2010 at 7:57 pm 3 comments

Creation of the School of Computational Science and Engineering

Announcement from Georgia Tech today — related to an earlier blog post.

Dear Faculty, Staff & Students,

It’s my pleasure to announce the formal creation of the School of Computational Science & Engineering within the College of Computing at Georgia Tech. The new School will operate under the direction of Chair Richard Fujimoto and in close cooperation with the colleges of Engineering and Science here at Tech.

In addition to focusing on its core research areas—high performance computing, modeling and simulation, and massive data analysis—the School of CSE’s mission will include producing a new type of computational scholar. Indeed, by creating this School, we once again take a leadership role in defining the field of computing itself. As a university, we are stating clearly that CSE is an academic discipline in its own right, with a distinct body of knowledge that lies at the confluence of computing, math, science and engineering. Many of our School of CSE faculty will have joint appointments around campus, and they will continue to pursue the kind of interdisciplinary work that has come to define this School, this College and Georgia Tech.

Finally, let us all express our appreciation to former John P. Imlay Dean Rich DeMillo, who first conceived of CSE as a separate unit of the College. Rich’s foresight has (again) allowed us to stake an important intellectual claim before our peers, and the College will reap the benefits of his prescience for years to come.

Congratulations to all of the faculty, staff and administrators in CSE on this achievement. Great work!
Best regards,
Jim Foley

Interim Dean & Professor

Stephen Fleming Chair of Telecommunications

March 8, 2010 at 10:01 pm 3 comments

The Future of Computing is People, People, and People

The College of Computing has interviewed three Dean candidates over the last two weeks.  All three gave us lots to think about, good advice, and plenty of blog-fodder — but we’re not supposed to name them, so the blog-potential is smaller than it might be.  Still, this last one made a comment that I found so striking that I want to talk about it anonymously.

“Do you want to know the top three areas of Computer Science?
Algorithms. Algorithms. Algorithms.”

Historically, that’s an accurate view.  Certainly, viewing the world in terms of its algorithms has enabled computing to change the way many disciplines think about their work.  However, is that view the one that will push computing forward?  Is that where the next great advances in computing will come from?

I suggest that the future of computing is people, people, and people.

  • People as co-processors. Luis von Ahn’s home page says that he focuses on “human computation.” What is it that humans can do, that is hard to capture (captcha?) in a computer’s algorithms, that we can then use in concert with computation?  The DARPA Network Challenge is a fascinating example of using people as probes, and technology as the networking and processing glue between them.  What makes this so powerful is that we can’t understand this as algorithms, but we can use algorithms to leverage human computation.
  • People as many, many users.  One of our other Dean candidates emphasized the importance of multi-core processing in the future of computing.  I think he missed a different massively-parallel phenomenon, one which is even more fundamentally changing our society. People is different from persons, and social media is more than just individual users being addressed in old-style HCI terms.  What emerges when we connect up millions of people through rapid telecommunications networks?  Certainly, new things — I’m amazed at the number of press reports I read these days that reference gathering information through Twitter, blogs, and Facebook postings.

    There are a lot of research issues to explore here. One that I’ve been thinking about lately:  Based on “Nudge,” I predict that a broad range of opinions may initially appear when a new topic arises in a rapid-response social medium like Twitter or Facebook, but the majority of respondents will quickly converge on a small range of opinion. In other words, within a social group, there is no “long tail” effect — friends & followers quickly conform to a few dominant positions, and they do it more quickly than in non-Internet media.  Whether or not I’m right, characterizing the behavior of these new forms of media is important, so that we can understand how they’re influencing us.

  • Finally, people need to learn about computing.  Our first Dean candidate spent a significant amount of time talking about computing education.  A particular claim was made that I found interesting.  Higher education costs are soaring. They might be capped or limited in some way, or society may expect more from higher education in the United States — like expecting Universities to play a larger role in improving the dismal state of K-12 education, especially in computing.  I didn’t hear either of the other two candidates say anything about the responsibility of a College of Computing for improving the state of computing education across the society.  Of course, I agree that we do have a responsibility here, to figure out what people should know about computing, to help people learn about computing, and to figure out how to improve computing learning, for both the major and the non-major.

Our past was about algorithms.  Our future is about people.

March 5, 2010 at 11:16 am 3 comments
