Posts filed under ‘Uncategorized’

A biased attempt at measuring failure rates in introductory programming

Do students fail intro CS at higher rates than in comparable classes (e.g., intro Physics, or Calculus, or History)? We’ve been trying to answer that question for years. I studied it here at Georgia Tech (see my Media Computation retrospective paper at last year’s ICER). Jens Bennedsen and Michael Caspersen answered it with a big international survey (see paper here). They recognized the limitations of their study: it surveyed the SIGCSE members’ list and similar mailing lists (i.e., teachers biased toward being informed about the latest in computing education), and it got few responses.

This past year’s ITiCSE Best Paper awardee tried to measure failure rates again (see link below) by studying published accounts of pass rates. While the authors got a larger sample size this way, their study is even more limited than the Bennedsen and Caspersen study:

  1. Nobody publishes a paper saying, “Hey, we’ve had lousy retention rates for 10 years running!” Analyzing publications means that you’re biasing your sample toward teachers and researchers who are trying to improve those retention rates, and they’re probably publishing positive results. You’re not really getting the large numbers of classes whose results aren’t published and whose teachers aren’t on the SIGCSE members’ list.
  2. I recognized many of the papers in the meta-analysis; I was a co-author on several of them. The same class retention data appeared in several of those papers. There was no funny business going on: we reported retention data from our baseline classes, then tried a variety of interventions, e.g., with Media Computation and with Robotics. The same baseline then appears in more than one paper. The authors say that they made sure they didn’t double-count any classes that appeared in two papers, but I can’t see how they could possibly tell (see the sketch after this list).
  3. Finally, the authors do not explicitly cite the papers used in their meta-analysis. Instead, the sources are included on a separate page (see here). SIGCSE shouldn’t publish papers that do this; meta-analyses should be given enough pages to list all their sources, or they shouldn’t be published. First, putting the sources on a separate page makes it much harder to check the work, to see what data got used in the analysis. Second, the paper is referencing work that won’t appear in any reverse citation indices or in the authors’ H-index calculations. I know some of the authors of those papers who are up for promotion or tenure decisions this coming year. Those authors are having impact through this secondary publication, but they are receiving no credit for it.
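
To make the double-counting concern concrete: de-duplicating class data across papers requires a key that uniquely identifies each course offering, and published pass-rate accounts rarely report enough detail to build one. Here’s a minimal sketch in Python (the field names are hypothetical, and this is not the authors’ actual method):

    # Hypothetical illustration: collapse duplicate reports of the same
    # course offering. This only works if every paper reports enough
    # identifying detail (institution, course, term) to build the key,
    # which published pass-rate data often doesn't.
    def dedupe_pass_rates(records):
        seen = {}
        for r in records:
            key = (r["institution"], r["course"], r["term"])
            seen[key] = r["pass_rate"]  # a duplicate overwrites rather than adds
        return seen

    reports = [
        {"institution": "A", "course": "CS1", "term": "F05", "pass_rate": 0.62},
        # Same class, reported again in a second paper:
        {"institution": "A", "course": "CS1", "term": "F05", "pass_rate": 0.62},
    ]
    print(dedupe_pass_rates(reports))  # one entry, not two

Without those identifying fields in the published accounts, there is no key to de-duplicate on.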

This paper is exploring an important question, and does make a contribution.  But it’s a much more limited study than what has come before.

Whilst working on an upcoming meta-analysis that synthesized fifty years of research on predictors of programming performance, we made an interesting discovery. Despite several studies citing a motivation for research as the high failure rates of introductory programming courses, to date, the majority of available evidence on this phenomenon is at best anecdotal in nature, and only a single study by Bennedsen and Caspersen has attempted to determine a worldwide pass rate of introductory programming courses.

In this paper, we answer the call for further substantial evidence on the CS1 failure rate phenomenon, by performing a systematic review of introductory programming literature, and a statistical analysis on pass rate data extracted from relevant articles. Pass rates describing the outcomes of 161 CS1 courses that ran in 15 different countries, across 51 institutions, were extracted and analysed. An almost identical mean worldwide pass rate of 67.7% was found. Moderator analysis revealed significant, but perhaps not substantial, differences in pass rates based upon grade level, country, and class size. However, pass rates were found not to have significantly differed over time, or based upon the programming language taught in the course. This paper serves as a motivation for researchers of introductory programming education, and provides much needed quantitative evidence on the potential difficulties and failure rates of this course.

via Failure rates in introductory programming revisited.

September 30, 2014 at 8:41 am

Why one school district decided giving laptops to students is a terrible idea

A really fascinating piece about all the problems that Hoboken had with their one-laptop-per-child program. The quote below describes the problems with breakage and pornography. The article goes on to describe problems with too little memory, bad educational software, wireless network overload, anti-virus expense, and teacher professional learning costs. I firmly believe in the vision of one-laptop-per-student. I also firmly believe that it’s crazy-expensive and hard to make work right, especially in large school districts.

We had “half a dozen kids in a day, on a regular basis, bringing laptops down, going ‘my books fell on top of it, somebody sat on it, I dropped it,’ ” said Crocamo. Screens cracked. Batteries died. Keys popped off. Viruses attacked. Crocamo found that teenagers with laptops are still… teenagers. “We bought laptops that had reinforced hard-shell cases so that we could try to offset some of the damage these kids were going to do,” said Crocamo. “I was pretty impressed with some of the damage they did anyway. Some of the laptops would come back to us completely destroyed.”

Crocamo’s time was also eaten up with theft. Despite the anti-theft tracking software he installed, some laptops were never found. Crocamo had to file police reports and even testify in court.

Hoboken school officials were also worried they couldn’t control which websites students would visit. Crocamo installed software called Net Nanny to block pornography, gaming sites and Facebook. He disabled the built-in web cameras. He even installed software to block students from undoing these controls. But Crocamo says students found forums on the Internet that showed them how to access everything. “There is no more determined hacker, so to speak, than a 12-year-old who has a computer,” said Crocamo.

via Why a New Jersey school district decided giving laptops to students is a terrible idea | The Hechinger Report.

September 28, 2014 at 7:58 am

Interview on the CS Education Zoo: Livecoding, HyperCard, Parsons Problems, and student misconceptions

Steven Wolfman and Will Byrd host a podcast series called the “CS Education Zoo.” I was just interviewed on it yesterday. Will couldn’t make it, but Dutch Meyer filled in. We covered a lot of ground, from livecoding music, to HyperCard, to the importance of having even low-quality CS education in schools, to Parsons Problems, to ebooks and MOOCs, to how to address student misconceptions in class. I had great fun and appreciate their invitation to join in!

September 26, 2014 at 8:45 am

Pushback in California on Computing in Schools

I’ve been thrilled to see the legislative progress in California around CS education issues. The governor has now signed Senate Bill 1200, which starts the process of CS counting toward UC/CSU admissions. Dan Lewis’s article in The Mercury News (linked below) tempered that enthusiasm. I wasn’t aware that UC was pushing back, nor that the number of CS classes and teachers in California is dropping. Lots more work to do there.

The Legislature just passed two bills to address these issues. Senate Bill 1200 allows but does not require the University of California to count computer science toward the math requirements for admission. However, there’s been a lot of push back from UC on this, so for now, all we really have is an expression of intent from the Legislature. Thankfully, AB 1764 allows high schools to count computer science toward graduation requirements. Of course, that may not mean much for students applying to UC.

For these reasons, computer science isn’t a priority for students. Nor is it a priority for schools when determining course offerings based on limited budgets: While California high school enrollment has risen 15 percent since 2000, the number of classes on computer science or programming fell 34 percent, and the number of teachers assigned to those courses fell 51 percent.

via Computer science: It’s where the jobs are, but schools don’t teach it – San Jose Mercury News.

A new policy brief on the state of CS education in California was just released by the California STEM Learning Network (see here). California actually lags behind the rest of the US on some important indicators, like the number of CS degrees conferred. That’s pretty scary for Silicon Valley.

September 24, 2014 at 8:53 am

Challenges of using Big Data to inform education

The story below is interesting, but not too surprising.  Researchers are having trouble using MOOC data to inform our understanding of student behavior and learning.  Lots of data doesn’t necessarily mean lots of insight.

I watched Charlie Rose interview the Freakonomics guys (view here), Dubner and Levitt, and found Levitt’s comments about “big data” intriguing.  He’s concerned that we don’t really have the methods for analyzing such large pools of data, and there’s a real chance that Big Data could lead us to Big Mistakes, because we might act in response to our “findings,” when we don’t really have good methods for arriving at (and testing) those “findings.”

Coursera isn’t the only MOOC provider to leave researchers longing for better data collection procedures. When Harvard University and the Massachusetts Institute of Technology last week released student data collected by edX, some higher education consultants remarked that the data provided “no insight into learner patterns of behavior over time.”

“It’s not as simple as them providing better data,” Whitmer said. “They should have some skin in it, because this is their job. They should be helping us with this.”

via After grappling with data, MOOC Research Initiative participants release results @insidehighered.

An FTC commissioner (see article) just pointed out the potential for big data to lead to discriminatory practices. How much more is education at risk?

During a conference held yesterday in Washington, DC, called “Big Data: A Tool for Inclusion or Exclusion?” FTC Commissioner Julie Brill declared that regulatory agencies should shift their critical lens to what she described as the “unregulated world of data brokers.” According to Brill, there is a “clear potential” for the profiles of low-income and racialized consumers built with personal data “to harm low-income and other vulnerable consumers.”

September 22, 2014 at 7:55 am

JES 5 Now Released: New Jython, Faster, Updated Watcher, with Jython Music

Matthew Frazier is an undergraduate at North Carolina State University who contacted me this last Spring.  He was going to be in Atlanta for a couple of months and was looking for a research opportunity.  Barbara and I had just been talking about how JES needed to be updated.  I checked his references and hired him without ever meeting him.  Wow, did that work great!
JES 5 (the Jython Environment for Students) has just been released at https://github.com/gatech-csl/jes/releases/tag/5.010. Matthew did a great job updating our workhorse for Media Computation Python (which was originally developed in Summer 2002 and is still used daily). JES includes a full implementation of Jython, plus support for media manipulation — libraries, help functions, and explorers for looking at individual pixels and samples. Here’s a one-screen overview of JES:
[Screenshot introJES: JES window to the left with program area and command area (REPL), Watcher button for the debugger, two image explorers, and one sound explorer. Help is under the JES Functions menu and the Help menu.]
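
To give a flavor of what a MediaComp program looks like in JES, here’s a minimal sketch using the standard JES media functions (pickAFile, makePicture, getPixels, getRed, setRed, explore). It’s an illustrative example, not code from the book:

    # Ask the user for an image, cut its red channel in half,
    # then open the JES pixel explorer on the result.
    def halveRed():
      file = pickAFile()           # file-picker dialog
      picture = makePicture(file)  # load the image as a JES Picture
      for pixel in getPixels(picture):
        setRed(pixel, getRed(pixel) / 2)
      explore(picture)             # inspect individual pixels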
Some of the things he did for JES 5 include:
  • First, we’re on GitHub! Come join us in stomping out bugs and making JES even better!
  • Upgrading the Jython interpreter to version 2.5, making new language features available and speeding up many user programs. I have been working on the 4th edition of the Python MediaComp book this summer and have introduced the time library so that users can actually time their algorithms (one of those CS Principles ideas), so I had ready-made programs to run in both JES 4.3 and JES 5.0. The speed doubled. (See the timing sketch after this list.)
  • Adding code to JES and the installers to support double-clicking .py files to open them in JES, on all supported platforms.
  • Bundling JMusic and the Jython Music libraries, allowing JES to be used with the text “Making Music with Computers” by Bill Manaris and Andrew Brown. This is super exciting to me. All of their examples (like these) work as-is in JES 5 — plus you can do sampled sound manipulations using the MediaComp libraries. The combination makes for a powerful and fun platform for exploring computation and sound. My thanks to Bill, who worked with us to get everything running in JES.
  • Adding a plugin system that allows developers to easily bundle libraries for use with JES.
  • Fixing the Watcher, so that user programs can be executed at arbitrary speeds.  This has been broken for a long time, and it’s great to have it back.  When you’re looking for a bug in a program that loops over tens of thousands of pixels or sound samples, the last thing you want is a breakpoint.
  • Adding new color schemes for the Command Window, which let users see the difference between return values and print output. This was a suggestion from my colleague Bill Leahy. When students first learn return, they can’t see how it does something different from printing. Now we can use color to make the result of each more distinctive (see the small example after this list). Thanks to Richard Ladner at ACCESS Computing, who helped us identify color palettes for colorblind students, so we can offer this distinction in multiple color sets.

[Screenshot returnedPrinted: the Command Window showing a returned value and printed output in different colors.]

  • Fixing numerous bugs, especially threading issues. When we first wrote JES, threading just wasn’t a big deal. Today it is, and Matthew stomped on lots of threading problems in JES 5. We got lots of suggestions and bug reports from Susan Schwartz, Brian Dorn, and others, for which we’re grateful.
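
Two of those items are easier to see in code. On the timing point: with the time library available in Jython 2.5, a user program can bracket a pixel loop with clock reads. This is a minimal sketch in JES’s Python, assuming a picture is already loaded; it’s not one of the book’s programs:

    import time

    def timeRedReduction(picture):
      # Record the clock, run a loop over every pixel, and report
      # how long the loop took: the kind of measurement the
      # time library makes possible.
      start = time.time()
      for pixel in getPixels(picture):
        setRed(pixel, getRed(pixel) / 2)
      print "That took", time.time() - start, "seconds"

And on the return-versus-print distinction that the new Command Window colors highlight, here’s a tiny (hypothetical) example of the misconception:

    def returnsFive():
      return 5    # hands a value back to the caller; displays nothing itself

    def printsFive():
      print 5     # writes 5 to the Command Window; the function returns None

    # In the JES command area:
    #   >>> x = returnsFive()   # x is now 5, and nothing appears
    #   >>> y = printsFive()    # "5" appears, but y ends up as None

Both functions seem to “do the same thing” when called bare at the prompt, which is exactly why coloring returned values differently from printed output helps.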

Thanks to Matthew for pulling this all together!  Matthew’s effort was supported by NSF REU funding.

September 18, 2014 at 8:49 am
