Posts tagged ‘NCWIT’

Women 1.5 Times More Likely to Leave STEM Pipeline after Calculus Compared to Men: Lack of Mathematical Confidence a Potential Culprit

When you read this paper, consider Nathan Ensmenger’s assertion that (a) mathematics has been shown to predict success in CS classes but not in computing careers, and (b) increasing mathematics requirements in undergraduate CS may have been a factor in the decline in female participation in computing.

Our analyses show that, while controlling for academic preparedness, career intentions, and instruction, the odds of a woman being dissuaded from continuing in calculus is 1.5 times greater than that for a man. Furthermore, women report they do not understand the course material well enough to continue significantly more often than men. When comparing women and men with above-average mathematical abilities and preparedness, we find women start and end the term with significantly lower mathematical confidence than men. This suggests a lack of mathematical confidence, rather than a lack of mathematical ability, may be responsible for the high departure rate of women. While it would be ideal to increase interest and participation of women in STEM at all stages of their careers, our findings indicate that if women persisted in STEM at the same rate as men starting in Calculus I, the number of women entering the STEM workforce would increase by 75%.

Source: PLOS ONE: Women 1.5 Times More Likely to Leave STEM Pipeline after Calculus Compared to Men: Lack of Mathematical Confidence a Potential Culprit
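One caution in reading that headline number: “odds 1.5 times greater” is a claim about odds, not probabilities, and the two diverge as rates grow. Here is a small Python sketch unpacking the arithmetic; the 20% leave rate for men is a hypothetical I chose only for illustration (the paper reports the odds ratio, not these raw rates).

```python
# Toy illustration of an odds ratio of 1.5.
# The 20% leave rate for men is hypothetical, chosen for illustration;
# the paper reports the odds ratio itself, not these raw rates.

def odds(p):
    """Convert a probability into odds."""
    return p / (1 - p)

p_leave_men = 0.20                  # hypothetical: 20% of men leave
odds_men = odds(p_leave_men)        # 0.20 / 0.80 = 0.25

odds_women = 1.5 * odds_men         # the paper's odds ratio of 1.5
p_leave_women = odds_women / (1 + odds_women)   # odds back to probability

print(f"Men:   odds {odds_men:.2f}, leave rate {p_leave_men:.1%}")
print(f"Women: odds {odds_women:.2f}, leave rate {p_leave_women:.1%}")
# prints a 27.3% leave rate for women: higher than 20%, but not 1.5x it
```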

August 24, 2016 at 7:06 am 8 comments

C.P. Snow keeps getting more right: Why everyone needs to learn about algorithms #CS4All

When I give talks about teaching computer science to everyone, I often start with Alan Perlis and C.P. Snow in 1961. They made the first two public arguments for teaching computer science to everyone in higher education.  Alan Perlis’s talk was the more upbeat of the two, describing all the great things we can think about and do with computers.  He offered the carrot.  C.P. Snow offered the stick.

C.P. Snow foresaw that algorithms were going to run our world, and people would be creating those algorithms without oversight by the people whose lives would be controlled by them. Those who don’t understand algorithms don’t know how to challenge them, to ask about them, to fight back against them. Quoting from Martin Greenberger’s edited volume, Computers and the World of the Future (MIT Press, 1962), we hear from Snow:

Decisions which are going to affect a great deal of our lives, indeed whether we live at all, will have to be taken or actually are being taken by extremely small numbers of people, who are nominally scientists. The execution of these decisions has to be entrusted to people who do not quite understand what the depth of the argument is. That is one of the consequences of the lapse or gulf in communication between scientists and non-scientists.  There it is. A handful of people, having no relation to the will of society, having no communication with the rest of society, will be taking decisions in secret which are going to affect our lives in the deepest sense.

I was reminded of Snow’s quote when I read the article linked below in the NYTimes.  Increasingly, AI algorithms are controlling our lives, and they are programmed by data.  If those data come overwhelmingly from white men, the algorithms are going to treat everyone else as outliers. And it’s all “decisions in secret.”

This is fundamentally a data problem. Algorithms learn by being fed certain images, often chosen by engineers, and the system builds a model of the world based on those images. If a system is trained on photos of people who are overwhelmingly white, it will have a harder time recognizing nonwhite faces.

A very serious example was revealed in an investigation published last month by ProPublica. It found that widely used software that assessed the risk of recidivism in criminals was twice as likely to mistakenly flag black defendants as being at a higher risk of committing future crimes. It was also twice as likely to incorrectly flag white defendants as low risk.

The reason those predictions are so skewed is still unknown, because the company responsible for these algorithms keeps its formulas secret — it’s proprietary information. Judges do rely on machine-driven risk assessments in different ways — some may even discount them entirely — but there is little they can do to understand the logic behind them.

Source: Artificial Intelligence’s White Guy Problem – The New York Times
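ProPublica’s method is, at its core, a simple disaggregated audit: compute the error rates separately for each group instead of pooling everyone together. Here is a minimal sketch of that logic; the eight records are invented for illustration (the real analysis used thousands of COMPAS cases), but the false-positive/false-negative bookkeeping is the same idea.

```python
# Minimal fairness-audit sketch: disaggregate error rates by group.
# These records are invented for illustration only.
records = [
    # (group, predicted_high_risk, actually_reoffended)
    ("black", True,  False), ("black", True,  True),
    ("black", True,  False), ("black", False, True),
    ("white", False, True),  ("white", True,  True),
    ("white", False, False), ("white", False, True),
]

for group in ("black", "white"):
    rows = [r for r in records if r[0] == group]
    non_reoffenders = [r for r in rows if not r[2]]
    reoffenders = [r for r in rows if r[2]]
    # False positive rate: flagged high risk, but did not reoffend
    fpr = sum(r[1] for r in non_reoffenders) / len(non_reoffenders)
    # False negative rate: flagged low risk, but did reoffend
    fnr = sum(not r[1] for r in reoffenders) / len(reoffenders)
    print(f"{group}: false positive rate {fpr:.0%}, "
          f"false negative rate {fnr:.0%}")
```

Nothing in that sketch is mathematically deep. What was missing in practice was access to the inputs and the formula, which is exactly Snow’s “decisions in secret.”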

One of our superstar alumnae, Joy Buolamwini, wrote about a similar set of experiences. She’s an African-American woman who works with computer vision, and the standard face-recognition libraries don’t recognize her. She lays the responsibility for fixing these problems on the backs of “those who have the power to code systems.”  C.P. Snow would go further: he’d say that it’s all our responsibility, as part of a democratic process.  Knowing about algorithms and demanding transparency when they affect people’s lives is one of the responsibilities of citizens in the modern world.

The faces that are chosen for the training set impact what the code recognizes as a face. A lack of diversity in the training set leads to an inability to easily characterize faces that do not fit the normal face derived from the training set.

So what? As a result, when I work on projects like the Aspire Mirror, I am reminded that the training sets were not tuned for faces like mine. To test out the code I created for the Aspire Mirror and subsequent projects, I wore a white mask so that my face could be detected in a variety of lighting conditions.

The mirror experience brings back memories from 2009. While I was working on my robotics project as an undergraduate, I “borrowed” my roommate’s face so that I could test the code I was writing. I assumed someone would fix the problem, so I completed my research assignment and moved on.

Several years later in 2011, I was in Hong Kong taking a tour of a start-up. I was introduced to a social robot. The robot worked well with everyone on the tour except for me. My face could not be recognized. I asked the creators which libraries they used and soon discovered that they used the code libraries I had used as an undergraduate. I assumed someone would fix the problem, so I completed the tour and moved on.

Seven years since my first encounter with this problem, I realize that I cannot simply move on as the problems with inclusion persist. While I cannot fix coded bias in every system by myself, I can raise awareness, create pathways for more diverse training sets, and challenge us to examine the Coded Gaze — the embedded views that are propagated by those who have the power to code systems.

Source: InCoding — In The Beginning — Medium
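Buolamwini doesn’t name the specific library in this excerpt, so as an assumption on my part, here is what calling one of the standard pre-trained detectors looks like, with OpenCV’s bundled Haar cascade standing in (and a hypothetical portrait.jpg as input). The point is that the detector’s behavior is fixed by whatever faces were in its training set, and the application programmer calling it inherits that bias.

```python
# Sketch of calling a pre-trained face detector (OpenCV's bundled Haar
# cascade, used here as a stand-in for the libraries described in the
# post). The cascade's weights were learned from a fixed training set,
# so faces unlike those in that set may simply not be detected.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("portrait.jpg")   # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s)")
# A count of 0 here can mean "no face present" or "a face the training
# set under-represented"; the caller cannot tell the difference.
```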

August 22, 2016 at 7:31 am 9 comments

Through the Screen of a Female Coder: A First Person Perspective on Diversity in STEM – CRA

I’m a fan of these first-person female perspectives on what it was like to be a CS student. (Recall the Stanford one I posted recently.) I met Satoe at Snowbird last month.

When I approach female friends with the question “Why don’t you try computer science or computer engineering?” I often hear responses such as “I’m not good at math,” or “Do I look like a gamer boy to you?” The low participation of women in technical fields like computing can be seen as a vicious cycle: women feel as though they do not “belong” in technical fields to the degree that men do, leading women to avoid or shy away from those fields (Cheryan et al., 2009; Good et al., 2012; Lewis et al., 2016), potentially perpetuating women’s underrepresentation in technical fields. According to a report by Jane Stout, director of CRA’s Center for Evaluating the Research Pipeline and Tracy Camp, a CRA-W board member, “at all levels of the academic computing pipeline, men outnumber women by at least 3:1,” (Stout & Camp, 2014) indicating issues with mentorship and role models. In order to better examine this issue, I categorize the issue into two parts: barriers put up by the women themselves and external pressures. External pressures explain the male-oriented culture and stereotyping. Women are disadvantaged by gender biases in the workplace as seen through the application process and promotion consideration. They also feel like they don’t belong in a world of ‘gamer nerds.’

Source: Through the Screen of a Female Coder: A First Person Perspective on Diversity in STEM  – CRA

August 3, 2016 at 7:57 am Leave a comment

College of Computing Using Google Funding to Close CS Diversity Gap: Barb Ericson’s Project Rise Up 4 CS

Project Rise Up 4 CS and Sisters Rise Up 4 CS are really great ideas (see previous blog posts on the work presented at SIGCSE and at RESPECT), though I’m obviously biased in my opinion.  I’m grateful that Google continues to support Barb’s project, and the College did a nice write-up about her efforts.

In fact, according to ongoing data analysis by Barbara Ericson, director of computing outreach for the Institute for Computing Education (ICE) for the Georgia Tech College of Computing, “The disparity here is so great that in 2015 10 U.S. states had fewer than 10 girls take the Advanced Placement (AP) Computer Science (CS) A course exam while 23 states had fewer than 10 black students take the exam.”

In an interview with the New York Times late last year, Ericson said working to solve the tech industry’s gender and racial diversity gap is important “because we don’t have enough people studying computer science in the United States to fill the projected number of jobs in the field.”

To address this problem and prepare more high school students for computer science careers, the College of Computing established RISE Up 4 CS in 2012.

Leveraging Google RISE Award funding, the RISE Up 4 CS program offers twice-a-week webinars and monthly in-person sessions at Georgia Tech to prepare underrepresented students to succeed in the AP CS A class and exam. For the webinars, students use a free interactive e-book developed by Ericson to learn about searching and sorting data, and the fundamentals of Java.

Source: College of Computing Using Google Funding to Close CS Diversity Gap | Georgia Tech – College of Computing

July 22, 2016 at 7:52 am Leave a comment

Why Professors Resist Inclusive Teaching by Annie Murphy Paul: Especially important in CS

Annie Murphy Paul is talking about inclusive teaching here, but she could just as well be talking about active learning.  The stages are similar (recall the responses to my proposal to build active learning methods into hiring, promotion, and tenure packages). These stages are particularly critical for computing, where we have so little diversity and where CS teachers are typically poor at teaching for diverse audiences.

Stages of Inclusive Teaching Acceptance

Denial: “I treat all my students the same.  I don’t see race/ethnicity/gender/sexual orientation/nationality/disability. They are just people.”

Anger: “This is all just social science nonsense! Why won’t everyone just get over this PC stuff? When I went to grad school, we never worried about diversity.”

Bargaining: “If I make one change in my syllabus, will you leave me alone?”

Depression: “Maybe I’m not cut out to teach undergraduates. They’re so different now. Maybe I just don’t understand.”

Overwhelmed: “There is so much I didn’t know about teaching, learning, and diversity. How can I possibly accommodate every kind of student?”

Acceptance: “I realize that who my students are and who I am influences how we interact with STEM. I can make changes that will help students learn better and make them want to be part of our community.”

Source: Why Professors Resist Inclusive Teaching « Annie Murphy Paul

 

July 6, 2016 at 7:27 am Leave a comment

Mattel’s Game Developer Barbie is fantastic, says Casey Fiesler

Casey Fiesler and Miranda Parker did a wonderful remix of the original Computer Engineer Barbie (see the Guardian article about that).  It’s great to see that Mattel did a better job the next time around, and Casey loves it.  I love the point she makes below, which echoes a concern I’ve voiced about open-source software.

This is particularly important because as much as we don’t want to suggest that girls can’t code, we also don’t want to suggest that coding is the only path to working with computers or games. Sometimes other parts of computing—like design or human-computer interaction—are delegitimized, considered less rigorous or less important. Or maybe they’re delegitimized in part because they happen to be the parts of computing where there are more women present (in other words, more inclusive), which is even worse.

Source: Mattel’s Game Developer Barbie is fantastic.

June 29, 2016 at 7:47 am 1 comment

“I had so many advantages, and I barely made it”: Stanford alumna and Pinterest engineer on Silicon Valley sexism

I’m a believer in empirical evidence, and I worry about getting a representative sample.  Sometimes, the right sample size for the question is one. CS is now the biggest major among women at Stanford (see article here).  Do the issues that Jane Margolis and Allan Fisher described in Unlocking the Clubhouse still exist there?

As the article linked below describes, women don’t always feel welcome in CS at Stanford. It’s hard to address the issues of classroom culture described.  Having separate classes for groups of students with different backgrounds and interests (as Harvey Mudd does) might help.

I know of even worse experiences at other CS departments.  The Stanford CS teachers actively encourage women; there are still CS teachers elsewhere who discourage women in their classes. It’s hard to get administrators to focus on broadening participation in computing in the face of overwhelming enrollment.  It’s even harder to push better teaching from the top down. “Teachers have academic freedom” is a common response to requests to change teaching (see my efforts to incentivize active learning): we allow teachers to teach any way they want. It isn’t clear that still makes sense when there are empirically better and worse ways to teach. That’s like letting modern doctors practice bloodletting or skip washing their hands (see the NPR piece making that argument).

At Stanford, I took two introductory computer science classes. I soon became convinced that I was much too far behind my male classmates to ever catch up. I was surrounded by men who’d breezily skipped prerequisite courses. As freshmen, they’d signed up for classes that I was intimidated to take even as a sophomore. They casually mentioned software engineering internships they had completed back in high school, and declared they were unfazed by any of the challenges professors might throw our way. I remember the first “weeder” computer science course I took, meant to discourage the unworthy from pursuing the major. My classmates bragged about finishing assignments in three hours. Listening to them chat, I felt mortified: the same work had taken me 15 hours of anguish at the keyboard to complete. They are quantifiably five times better than I am, I told myself.

Source: “I had so many advantages, and I barely made it”: Pinterest engineer on Silicon Valley sexism — Quartz

May 6, 2016 at 7:45 am 4 comments
