Posts tagged ‘image of computing’
The former Dean of the College of Computing at Georgia Tech, Rich DeMillo, established three schools within the College. I’m not really sure how he came to decide these three groupings. I am finding them useful for understanding the tensions in defining computer science today (perhaps the “malaise” in Beki’s blog post).
- The School of Computer Science (SCS) is focused on the traditional definition of computer science. It looks inward, into improving the computer and how it functions. Systems, networking, programming languages, theory, and compilers go here. Software engineering goes here, though flavors of it could go elsewhere.
- The School of Interactive Computing (IC) looks at the boundary between the computer and everything else. It includes human-computer interaction, learning sciences & technologies, computational journalism, and computing education research (where humans are the “everything else”), and also includes robotics, computational photography, and vision (where “everything else” is literally the world). Intelligent systems and graphics go here, for using humans as the model for intelligence and form, but versions of each could go elsewhere.
- The Division (soon to be School) of Computational Science and Engineering (CSE) focuses on the application of computing for advancing science and engineering. This was the most innovative of the three. Rich once told me that he wanted this School to provide an academic home for an important field that wasn’t finding one elsewhere. Computer science departments often don’t tenure computational science researchers because their work may not necessarily invent new computer science, and science departments don’t tenure scientists just for being code monkeys. This area is too important to leave adrift.
I admit that I’m a man with a hammer. I see these three groupings at the various colleges and universities I visit, as the three competing images for what is computer science. SCS faculty have history on their side — their view of computing is roughly what was defined in the computing curricula reports from 1968 onward. (I do wonder if those early curricular reports may have defined CS for education too soon, before we really knew what would be important as computing evolved.) IC faculty have modern day relevance on their side — much of the exciting new computing work that gets picked up in the press comes from this group. Here in the College of Computing, these sides tussle over the shared ownership of our MS and PhD degrees in computer science. (We don’t argue so much about the BS in CS because Threads provides enough flexibility to cover a wide range of definitions.) Do graduate students have to take a course in Systems? In Theory? Aren’t these “core” to what is Computer Science? And what is “Computer Science” anyway? Does (or should) the School of “Computer Science” have a particularly strong say in the matter?
In the latest issue of Communications of the ACM, Dennis Groth and Jeffrey Mackie-Mason argue that we need “informatics” degrees as something separate from a computer science degree. When they list informatics academic units, they include my IC School. They define informatics as “a discipline that solves problems through the application of computing or computation, in the context of the domain of the problem.” That’s close enough to my “computing at the boundary with everything else.” They are arguing that we can make greater advances in informatics by splitting those degrees off from computer science.
As we tussle over the name and identity of “Computer Science,” I increasingly value Dennis and Jeffrey’s point. I can see that IC and CS may be different bodies of knowledge. Computer science students ought to know about RISC processors and assembly language. Students in IC must understand and be able to use empirical methods, especially social science methods like interviews and surveys (e.g., how to put them together well, how to avoid bias in creating them and evaluating the results). These methods are necessary to listen to someone else, figure out their problem, and then later, figure out how the technology solves (or at least, impacts) the problem. When I look at IC-related professionals and researchers, I see few that use knowledge of RISC processors and assembly language in their work. The empirical, social science methods don’t fit well into CS. I was on the committee that wrote the ACM/IEEE Computing Curriculum 2008 Update, and in particular, I was in charge of the HCI section. We had to gut much of what SIGCHI felt was absolutely critical for students to know about working with humans (and which I agreed with) because we simply couldn’t cram more into a CS student’s four years. IC and CS have a significant overlap, but there is a lot in each that is not in the intersection.
We tussle over these degrees and names because, in part, we fear creating a new name. We worry that students won’t be interested in a degree from computing that’s not named “computer science.” IC co-owns our BS in Computational Media (about 300 students, ~25% female, placing students at places like Electronic Arts and Pixar) and a PhD in Human-Centered Computing (one of the few PhD programs in a computing school that is over 50% female). Students are willing to take a gamble, and we’ll draw on a different demographic of students.
I’ve not said much here about CSE yet, but that’s because it’s not big enough to tussle yet. Recently, I got to interview students and teachers in interdisciplinary computational science classes. These classes don’t really work for CS (or IC) students. The computer science being used is too simple for them (so they’re bored while the science students come up to speed), but the science is way harder than they can just jump into. For CS students to succeed in CSE classes, they need to take a bunch of science classes to understand how real scientists are using scientific computing. We run into the same problem as squeezing the important parts of HCI into CS — we run out of room. As CSE grows in numbers and importance, we will eventually find that it doesn’t fit into IC or CS, either. By separating the fields, we encourage greater research advances through tighter focus, and we create better, clearer opportunities for student learning by removing the unnecessary and spending more time on the necessary.
Our high school daughter came home last night with her Sophomore year course election form. She’s considering taking Computing in the Modern World, Georgia’s version of the similar course in the ACM Model K-12 Curriculum. She might take Beginning Programming at some point, but she’ll need the prerequisite…which is listed on the form as the course in “Computer Applications.” Which is WRONG! Barb, who helped write the state standards, knows that Beginning Programming should require Computing in the Modern World. Barb’s comment was, “Who do I have to fight now!?!“
The notion that computer science is just beefed up applications (CS == Apps++) is prevalent in high schools. Lijun Ni, my student studying high school CS teachers, finds it all the time. “Oh, I’m a computer science teacher! It’s important for students to go beyond computer applications into real computer science!” That’s a true sentiment, but there doesn’t have to be a connection between applications and CS. You can be a great computer scientist and not be able to figure out Word or Excel. Last time I spoke to Jane Margolis, she was facing it in her new pre-AP course in the LA Unified School District. She said that the teachers complain, “How can they learn computer science when they have weak keyboarding skills? We’ll have to do two weeks of keyboarding first.”
Is the problem that computing has been too successful? That you can do so much with applications that they’re considered the fundamentals, the base of all of computing? Or is it that teachers do not understand computer science as a real, academic, rigorous subject? Or is it that high school leaders don’t understand computer science at all? I suspect that we have a chicken-and-egg problem. How do we get real computer science valued in schools when the people making the decisions don’t understand computer science?
With all the excitement over “apps” on the new iPad, maybe now is the time to push: CS != Apps++
We had a visitor at Georgia Tech today, alum Mike Terry, who has been studying the usability practices of open source development teams, like for Gimp, Inkscape, and Firefox. The short answer is, “There are no usability practices,” but that’s a little too pat. It’s a little bit more complicated than that, and actually even more concerning from an education perspective.
The folklore is that open source developers start because they have “an itch to scratch,” something that they want developed. Mike thinks that that’s true, but that scratching that itch doesn’t actually take long. Social factors keep open source developers going — they care about their developer community and working with them.
Mike finds that few projects really care about usability. The argument, “If you made your usability better, you’d increase your user base,” is not enticing to most open source developers. Open source developers have no layers (like salespeople or tech support) between themselves and the public users. Thus, they get inundated with (sometimes ill-informed and even downright stupid) bug reports and feature requests. The last thing open source developers want is more of those.
Since open source developers soon stop being users of their own software, and they don’t want to talk to lots of users, how do they deal with usability? Mike says that the top developers develop close relationships with a few power users, and the developers design to meet those users’ needs. So there is some attention to usability — in terms of what high-end, power users want.
So what happens when a User Experience person wanders into the open source fold? Mike has interviewed some of these folks (often female), and finds that they hate the experience. One said, “I’d never have done it if I wasn’t being paid to do it.” I guess there’s not much of an open source usability developer community. The open source developer community is not welcoming to these “others” with different backgrounds, different goals, and most of all, not a hard-core software development background.
Mike believes that the majority of our software will be open-licensed. I expressed concerns about that future in terms of education.
- How do people get started in developing software in an all open-source world? Mike suggested that open source is a great way for high school students to get started with software development. I pointed out how unfriendly open-source development communities have been to newcomers, especially females, and how open-source development mailing lists have been described as “worse than locker rooms.” Mike agreed with those characterizations, then said, “But once you get past that…” Well, yeah — that’s the point. Margolis and Fisher showed us years ago that those kinds of subtle barriers say, “This is a boys-only club — you don’t belong!” and those can prevent women and underrepresented minorities from even trying to enter the community.
- I worry about the economics of open-source and what signals it sends to people considering the field. Mike assured me that companies like RedHat are making money and hiring programmers — but there are many more unpaid programmers working on RedHat than paid programmers. If the world goes mostly open source, how do we convince students that there are jobs available developing software? Many kids (and parents) already believe that software jobs are all being outsourced. How do we convince them that there are good jobs, and they don’t have to work for years for free before they get those paying jobs?
- Finally, I really worry about the lack of thought-diversity in open source communities, where people who care about usability are driven away. While we educators are trying to convince students that not all of computing is about programming, the open source community is telling newcomers that programming is all that matters. If the whole software industry goes open source, we’re going to have a hard time selling the image of a broad field of computing.
I found Mike’s work fascinating, and well grounded in data. I just find the world he describes a little disconcerting. I hope that the open source community considers the education issues of its next generation of developers.
Why are professors so liberal? Why are computer science majors mostly male and white or Asian? One possible answer is the same for both — that’s what we’ve been raised to expect. The same mechanism may explain why nursing is predominantly female.
A pair of sociologists think they may have an answer: typecasting. Conjure up the classic image of a humanities or social sciences professor, the fields where the imbalance is greatest: tweed jacket, pipe, nerdy, longwinded, secular — and liberal. Even though that may be an outdated stereotype, it influences younger people’s ideas about what they want to be when they grow up.
My colleague Nancy Nersessian has been studying how engineering scientists think, and the short form answer is, “With stuff.” They use distributed cognition through the things in their lab in order to think through problems.
Nersessian began by posing the question, “How do engineering scientists think?” The resulting journal article in Topics in Cognitive Science quotes Daniel Dennett: “Just as you cannot do very much carpentry with your bare hands, there’s not much thinking you can do with your bare mind.”
Famously, Edsger Dijkstra is quoted as having said “Computer science is no more about computers than astronomy is about telescopes.” Nancy’s results suggest that, while Dijkstra may be right that computer science is not about computers, a computer scientist can’t think without a computer.
Thomas Sowell’s column appears in the Atlanta Journal-Constitution on Tuesdays, and his column this week appeared under the headline World worse off because of role intellectuals play. Sowell’s argument is that intellectuals overall did more harm than good in the 20th century. Examples of intellectuals who caused great harm in Sowell’s opinion include Hitler and Marx. He distinguishes the Wright brothers in his article, because they created something.
All these people produced a tangible product or service and they were judged by whether those products and services worked. But intellectuals are people whose end products are intangible ideas, and they are usually judged by whether those ideas sound good to other intellectuals or resonate with the public.
So are computer scientists “intellectuals” by Sowell’s definition? We create products and services, but our products and services are merely intangible ideas. You can’t touch a bit, nor a Web page, nor a window and scroll bar. How do you judge the quality of our products and services? Is there a way of judging software by more than “sound good to other intellectuals or resonate with the public”?
I’m not sure that Sowell’s argument stands up to much criticism, e.g., is the difference between intellectuals and those whose ideas are worth something just that the latter have tangible products and services? If I’m better at marketing, so my ideas turn into a product, then I’m no longer “just” an intellectual? Still, the philosophical question of what we are, we who build things out of just thought and some serious typing, is interesting.
The January 2010 issue of Communications of the ACM has an interesting piece by Bjarne Stroustrup, What should we teach new software developers? Why?. He suggests that “the first degree qualifying to practice as a computer scientist should be a master’s,” and that “professors who teach students who want to become software professionals will have to make time and their institutions must find ways to reward them for programming.”
I’m pleased that he talks about “software developers” in his title, as opposed to making more global statements about “computer scientists.” He does make the general claim that “The ultimate goal of CS is to help produce better systems.” Is that really the goal? If a scientist writes a 50 line script that enables her to create a visualization or analyze some data in order to gain some new insight, and then throws that script away, isn’t that also a goal of CS? Maybe Stroustrup would also see that 50 line script as a “system,” or maybe he would disagree that there is any “CS” involved in the scientist’s 50 line script. Hamming famously said, “The purpose of computing is insight, not numbers.” I wonder if “systems” are more about “insights,” “numbers,” or something else entirely.
My wife’s grandfather died last Wednesday. He was 96, and though he had been fading for years, he had lived an amazing life. The memorial service, with many of his 20+ grandchildren and 40+ great-grandchildren attending, felt less like mourning and more like a celebration of his life. Many wonderful stories were shared.
Barb’s grandfather had worked for Thumb Electric (look at a map of Michigan, and the rural area called the “Thumb” sticks out like a sore one). Barb’s grandfather was part of a crew that literally brought electricity to this part of Michigan. One of his jobs was to get farmers to “sign up” for electricity. If he could get a whole street to sign up, then the wires would be brought in.
I remember him telling me stories about how hard it was to convince farmers to buy into electricity. What did they need electricity for? Their farms worked, just as they had for years. Some farmers might have heard about some new-fangled device (say, a milking machine) that they wanted. But most farmers were happy with what they had. He was selling a dream of what things might be like if they had electricity.
It seems hard to believe, in hindsight, that farmers might not want electricity. Until it was ubiquitous, there weren’t that many devices that needed it. Until the devices came along, its usefulness was unproven. Buying into electricity was paying a cost with uncertain benefit. Grandpa Hund was selling a dream.
Selling real computing education to teachers at the high school level or as part of a general education requirement in college feels like a similar challenge. Andy diSessa has been talking for years about what it would be like if there was ubiquitous real computer literacy. Seymour Papert had a vision for “mathland,” realized through computation, where students would learn math naturally, with the involvement of a community of elders (like in a Samba school).
The first challenge to overcome, like with electricity, is to show that there’s something more to be gained. “My students make their own Excel spreadsheets. Some of my kids make Flash animations.” How do we convince that teacher that real computation is so much more powerful? How do we show that knowing how to program, not at a professional level but with real understanding, allows students to explore ideas in ways that no application will ever support?
The second challenge, again like electricity, is to show that the price is affordable. “Programming is too hard. My students can’t stand all that syntax. Programming is a menial task that is being off-shored.” This challenge is made greater by computer scientists who encourage the view that real computer science is reserved for the wizards. Only those who know the mystic arts of diagrams (like UML) and archetypes of spells (like design patterns) should be allowed to speak the magic words (as arcane as we can make them, like “public static void main.”)
Really knowing computing can be as powerful as electricity. Selling it can be as hard. Since Grandpa Hund’s funeral was lit up with electric lights and warmed with electric heating, he showed that there is a way to sell it.
Really nice piece in the New York Times (quoting Jan Cuny of NSF) describing the 10K teachers project, the new AP CS effort, but most of all, arguing the growing importance of jobs that blend computing and other disciplines (“X” as in “Computing + X”).
Hybrid careers like Dr. Halamka’s that combine computing with other fields will increasingly be the new American jobs of the future, labor experts say. In other words, the nation’s economy is going to need more cool nerds. But not enough young people are embracing computing — often because they are leery of being branded nerds.
That study about how environment can influence stereotypes has really attracted attention! It’s showing up all over the Internet. I was just interviewed for a piece at Discovery News this week (quoted below) where I argued that the stereotypes might be addressed with real computer science classes. More of the interview appears in the reporter’s blog post, where I argue that actual environments are probably not influencing those stereotypes so much — how many high school kids have ever been to the office or lab of a real computer scientist? It’s the perception painted by media, in lieu of actual experience.
Georgia Institute of Technology professor Mark Guzdial agrees with Cheryan’s findings, but stresses that computer-science stereotypes can be overcome easily.
“We are finding that the stereotypes are prevalent, but not entrenched as you might think,” Guzdial told Discovery News. “In general, there is very little computer science in middle and high schools today. We find that a little bit of real information and experience can influence those stereotypes dramatically.”
My colleague, Beki Grinter, just posted an intriguing blog entry titled Reflections on ICT4D which tells us a lot about the priorities in computer science. “ICT4D” is Information and Communications Technologies for (4) the Developing world. Beki’s blog talks about the growth of this interdisciplinary field including classes and degree programs (such as the “Computer4Good” classes at Georgia Tech that I’ve already whined about).
What Beki does in her post is use the emergence of ICT4D to make observations about the assumptions and the priorities of our field. Some of the ones that particularly struck me:
- Few people who do computer science live in the conditions of the developing world. Thus, we can’t be expected to understand their problems (or rather, some of us may expect that we can, but we honestly can’t). Problem exploration and definition is thus an important part of ICT4D, as it is for HCI — but that may be part of what keeps HCI researchers from being seen as rigorous as others in academia. Admitting that there are alternative perspectives, views, and experiences of the world, and developing methods for understanding those, should be a contribution, not a detriment.
- The solutions that we develop in computer science rely on an infrastructure that is invisible to our discipline: a power infrastructure, an educational infrastructure that ensures (for one) that our users can (mostly) read, and a retail infrastructure for distributing our products. ICT4D is an academic discipline that starts by removing those assumptions.
The most interesting insight that I got from Beki’s article is how we focus on the solutions to the problems in computer science, where ICT4D is about the problems. As a computing educator, I hear repeatedly from teachers, “Computer science is problem-solving on computers!” Yet, as Beki points out, we organize our discipline and our findings on characteristics of the solution, not the problem.
- Why are the programming language, HCI, and computing education people in different conferences and journals, if they’re all about the same problem of human-computer understanding and communication?
- The “biggest question in computer science” (as Jeannette Wing called it), “Does P=NP?” is in some sense a question “We have these solutions to these problems that are ‘NP,’ and we’re wondering if there is a ‘P’ solution to those problems.” We haven’t classified the problem, we’ve classified the solution, and we’re wondering if our solutions are in some way general descriptions or characteristics of the problem.
- The recent discussion on my last blog post talks about how we in computer science tend to dismiss (as mathematics does, says one commenter) our tools and an understanding of how the tools influence our process. We tend to care about the solutions, not the process or tools we used to reach them, and that’s a shame. Our solutions have dramatic impacts on society, and it is important to understand how we got there.
Beki avoids critiquing computer science (unlike me, I’m afraid). Instead, she uses the developing study of ICT4D as a lens or mirror to provide us insights onto what we do, what we assume, and what we prioritize. I encourage you to take a peek and see what you see in Beki’s mirror.
If you haven’t seen these blog posts yet by “The Wicked Teacher of the West,” I recommend them. The author is a middle school computer science teacher whom I think is terrific: smart, caring about teaching, and trying hard to learn something new. The challenges she had in her workshop are probably like those of our best students when they are struggling in our computing classes.
I also know a bit about the workshop that she was in, and I also hold it in high regard. It’s got great ideas in it, and there are definitely things in that workshop that Wicked Teacher might use in her classes. Here’s the rub as I see it: those two things weren’t connected for her. As she put it, “I felt like I didn’t ‘get it’ but other than saying that I felt like I had no context, I couldn’t articulate what I meant.”
Today, I’m currently working on the lists of “Big Ideas of Computer Science” for the new APCS “Computer Science: Principles” Commission (and connecting these to Peter Denning’s Great Principles of Computing), and for the last few days, I’ve been working on the lecture slides for the Second Edition of our Python Media Computation book. The combination of these blogs and these activities have me wondering, “How do we explain the big picture to students, especially when we don’t agree on the big picture?”
Whenever I hear someone saying matter-of-factly, “Computer Science is really just engineering,” I know that they haven’t really thought about what computer science is, at least not in the terms that students are looking for. I think the workshop leaders had a reason for asking the Wicked Teacher to do all that she did. But not only did they not tell her what it was, I don’t think their reason was the one she was looking for.
The workshop leader saw the meaning of the code as being the correct execution. What Wicked Teacher was looking for was why should she care. The state education officer who critiqued me for not thinking hard enough about the match of standards to computing principles was right. If I want teachers to care about computing, I need to show them why it’s important in terms that they consider important.
This connects to my Python slides and Great Principles. In one example, in the book, we show how to decreaseRed in a picture:
def decreaseRed(picture):
  for pixel in getPixels(picture):
    value = getRed(pixel)
    setRed(pixel, value * 0.5)
Then we parameterize that function further:
def changeRed(picture, factor):
  for pixel in getPixels(picture):
    value = getRed(pixel)
    setRed(pixel, value * factor)
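The getPixels, getRed, and setRed functions above come from the JES Media Computation library, so the book’s code won’t run in plain Python. Here is a minimal, self-contained sketch of the same parameterization idea, with a “picture” simulated as a list of [r, g, b] lists — the names and pixel representation here are my own assumptions for illustration, not the book’s:

```python
# A "picture" simulated as a list of [r, g, b] pixel lists, standing in
# for the JES picture object (an assumption for this sketch; the book's
# getPixels/getRed/setRed are not available in plain Python).

def change_red(picture, factor):
    # Scale the red channel of every pixel by factor,
    # mirroring the book's changeRed(picture, factor).
    for pixel in picture:
        pixel[0] = int(pixel[0] * factor)

picture = [[200, 10, 10], [100, 50, 50]]
change_red(picture, 0.5)  # halve the red channel
# picture is now [[100, 10, 10], [50, 50, 50]]
```

With factor as a parameter, decreaseRed is just change_red(picture, 0.5), and doubling red is change_red(picture, 2) — the same loop serves every scaling.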
Why do that? Why should we add the additional parameter to the function? I completely believe that this is an important part of introductory computing that we should teach. But what’s the story that we give students to give this meaning?
In our Python book, we give the traditional, engineering-based explanation: By adding the additional parameter, we make the code more reusable so we can later build even more complex things more easily. But what if someone doesn’t care about building more complex things, or building anything later? What if they care about other things that are just as valuable?
I realized that I really could tell a story about “bindings” here, about associating names with values. Peter Denning’s Great Principles touch on some of these. By delaying the association of a value for factor, I maintain flexibility and expressiveness. That’s a path that leads me to thinking about a wonderful set of abstractions that are powerful and unique to computer science: scopes and namespaces, functions as first class data objects, lambda (as the value for a function binding), creating new kinds of control structures, aggregating data and procedures together to create objects, and the power of “messages” where the function to be invoked for a given “name” is decided later, by the receiver. That might be engineering, but it’s closer to mathematics to me.
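To make that “bindings” story concrete, here is a small sketch (my own illustration, not from the book) of delaying the binding of factor: a function that returns a new function closing over the factor it was given.

```python
def make_scaler(factor):
    # factor is bound when make_scaler is called; the inner
    # function closes over that binding and uses it later.
    def scale(value):
        return value * factor
    return scale

# Functions as first-class data: each call produces a new function
# carrying its own binding for factor.
halve = make_scaler(0.5)
double = make_scaler(2)
# halve(80) evaluates to 40.0; double(80) evaluates to 160
```

The same idea underlies lambda, scopes and namespaces, and the “decide the behavior later” flavor of message sending — the value bound to a name is chosen at a different time than the code that uses it was written.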
What many of my most serious students really care about is exploring the effects of this new function. What visual effects can you get by manipulating red? When do you want to? What is the power and limitation of a red/green/blue color model? What can I say easily, and what is more complex? They care about the power of representations, and choosing a particular model, and about empirical data resulting from experiments with these programs. Now, computer science looks like science. That’s yet another, equally valid meaning for adding the factor parameter.
The Wicked Teacher was looking for a meaning in what she was learning, and I suspect that it’s a different meaning than what the workshop leaders were offering. There are several efforts, like Peter Denning’s and those of the APCS Commission, to define what “Computing” means. The real challenge that we face (in these efforts, and as teachers) is to offer a variety of meanings. We want to encourage deep thought and engagement in the power of computing.
U. Washington has a terrific Computer Science & Engineering Colloquium series available on iTunes. I just finished a talk by Alan Borning on his UrbanSim project, which is (as he describes it) “Professional SimCity” which supports people making urban planning decisions. He talks about the challenge of trust in that setting. These are high-stakes decisions with complicated issues, trading off pollution with equitable land use with water needs with land values, and so on. The project has moved from Java to Python, so that the models can be more easily inspectable and changed — he said modelers just didn’t like Java and hated declaring types in their models. He said that their tools used to have a button to re-run the unit tests on the software, so that users could be assured that the basic software underlying the models was working as expected. Nobody ever used it, so they removed the button. People trusted the software, even though changes/mistakes there could have had much bigger impact on the end result than the models themselves.
Mike Hewner’s comment on my last blog post and Alan Borning’s talk have had me thinking about when students trust us, and when they don’t. It’s not obvious to me where that line falls. I’ve two stories to share about trusting (or not) the computer science teacher.
Story 1: We trust you. Bill Leahy teaches CS2261 Media Device Architectures here at Georgia Tech. This is our class on computer organization for students in our Computational Media degree program, which students find much more motivating than our traditional computer organization class. Instead of working on a simulator of a pretend processor, CS2261 has students program a Nintendo Gameboy. Same low-level focus, and even more complicated (since it’s a real computer), but more motivating, so it’s a real win. Students tell us that they like programming for the Gameboy. Yet, they never do.
Bill shows them on the first day of class how he can write code for the Gameboy, compile it, download it to flash memory, and then boot it on the Gameboy. Lo-and-behold, his game runs on the Gameboy. To the best of Bill’s knowledge, no student has ever repeated that process! None of them ask him how to get the flash card set-up, and nobody ever shows him their program running on their own Gameboy. Amazing as it seems, it seems to be enough for the students to know that their code could run on the Gameboy. During the course, they only use an emulator. So, in a real sense, it’s exactly the same as the other course! The authenticity of the course (that it really is about programming a Gameboy) is entirely based on trust and that one example on the first day.
Story 2: We don’t trust you. Soon after we started teaching Media Computation here, we started having requests for a second Media Computation course. No, students didn’t want to take the next CS class — they explicitly wanted to do more in “Media Computation.” So we built one, which focuses on how the wildebeests were animated stampeding in Disney’s The Lion King. That was Disney’s first experience with computer-modeled characters in a crowd (herd) simulation. Explaining that scene requires us to cover all the basic data structures as well as continuous and discrete simulations. The course has been well received with a 90% success rate — pretty good for a data structures course mostly serving non-CS majors.
Faculty in Industrial and Systems Engineering (ISyE) asked us if their students could take this course. They wanted their ISyE students to learn some Java, and a focus on simulations is right in line with how ISyE professionals use computing. However, nobody told their students that.
At Georgia Tech, ISyE typically has the smallest Freshman class and the largest Senior class of the Engineering programs. Students don't know what it is coming in, and then they discover ISyE while here. When they first discover it, they don't necessarily know what it's all about. In particular, they don't understand what the relationship between ISyE and Computing is.
I have gobs of surveys with student comments like, “Boy, will I be glad when this class is over with, because then I’ll never use a computer again!” and “Why do I have to take this stuff? Industrial engineers don’t ever program and hardly ever even use computers!” Now, we do tell them otherwise. They simply don’t believe us. Gregory Abowd, when he taught this course, got an ISyE faculty member to come in and tell the students, “Honest! I do use simulations! My research group does program! You will, too!” Do we have to do that every semester? Maybe — the students certainly don’t trust us on this point.
An article circulated widely on the Internet yesterday and today suggesting that IT careers aren't cool enough for Canadian students. Mike Hewner thought it was weird that the article compared apples and oranges.
Nearly 77 per cent of students believe ICT jobs offer average or above average pay; 74 per cent believe ICT jobs offers average or above average job security; and 37 per cent believe ICT jobs are above-average in terms of creativity. However, 34 per cent believe ICT jobs are difficult and complex; 31 per cent believe ICT jobs are not fun; and 25 per cent believe ICT jobs are not cool.
Why tell us the positives on creativity and the negatives on coolness? Why not give us the positives and negatives on each? So Mike looked up the original report data, and sent me this summary:
Here’s what students said…both “above average” (4s and 5s on their scale) and “below average” (1s and 2s on their scale):
| Attribute | "Above average" (4s and 5s) | "Below average" (1s and 2s) |
| --- | --- | --- |
| Creative / not creative | 37% | 20% |
| Cool / not cool | 23% | 25% |
| Interesting / not interesting | 30% | 27% |
| Fun / not fun | 22% | 31% |
| Very easy / very difficult | 16% | 34% |
| Big impact / no impact on the world | 39% | 14% |
| High paying / low paying | 44% | 9% |
| Very secure / not secure | 35% | 11% |
So yes, 25% of students thought that computing was “not cool,” but 23% said that it was “cool.” Is 2% a meaningful difference? The article seems a little one-sided.
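For what it's worth, a 2% gap is unlikely to be statistically meaningful at plausible survey sizes. Here's a rough back-of-envelope check in Python using a two-proportion z-test; note that the sample size below is a made-up assumption (the report's actual N isn't quoted here), and treating the two proportions as independent samples is itself a simplification, since the same students answered the same scale.

```python
import math

def two_proportion_z(p1, p2, n1, n2):
    """z statistic for comparing two proportions, using a pooled standard error."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical sample size of 500 per group; the report's real N may differ.
z = two_proportion_z(0.25, 0.23, 500, 500)
print(round(z, 2))  # ≈ 0.74, well below the ~1.96 needed for p < .05
```

Even at 500 respondents the statistic comes out well under the conventional significance threshold, which supports reading the cool/not-cool split as essentially a wash.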
What is still striking to me about these data is that these students generally see computing as paying well, creative, and difficult, and still not interesting as a career choice. My bet is that US students would rank the results similarly. Does this mean that they think that there are easier jobs that are creative and pay well? Or that there are other jobs that pay even better with similar characteristics? Or is it that job prospects really don't play much of a role in what 9th and 10th graders are thinking about when considering their future majors?