Posts tagged ‘computing education research’
I’ve seen Michael Lee present two papers on Gidget at ICER, and they were both fascinating. Gidget is now moving out of the laboratory, and I’m eager to see what happens when lots of people get a chance to play with it. Andy Ko has a blog post about Gidget that explains some of the goals.
Hello Gidget Supporter!
We are happy to announce that Gidget has launched today! You, your friends, and your family members can now help Gidget debug faulty code to solve puzzles at helpgidget.org.
Gidget is a game designed to teach computer programming concepts through debugging puzzles. Gidget the robot was damaged on its way to clean up a chemical spill and save the animals, so it is the players’ job to fix Gidget’s problematic code to complete all the missions. As the levels become more challenging, players can combine newly introduced concepts with previously used commands to solve the puzzles and progress through the game.
Gidget is the dissertation work of Michael J. Lee who is a PhD candidate at the University of Washington’s Information School. Prior to its public release, over 800 online participants played through various versions of the game, and over 60 teenagers played through the game and created their own levels during four summer camps in 2013 and 2014. Our research has shown that novice programmers of all ages become very engaged with the activity, and that they are able to create their own levels (i.e., create their own programs from scratch) successfully after playing through the game.
Please share widely and refer to the press release for more information. We hope you have fun playing the game, and appreciate your interest and support for Gidget.
Michael J. Lee and the rest of the Gidget Team
Michael J. Lee
PhD Candidate, Information School
University of Washington
Seattle, WA 98195-2840
Sarah Esper (one of the leads on CodeSpells) was part of the 2013 ICER Doctoral Consortium, and was just in the ICER CRR with me. She’s designing CodeSpells based on computing education research. It’s worth checking out!
Become the most powerful wizard the world has ever seen by crafting magical spells in code.

When we were young, wizards like Gandalf and Dumbledore struck a chord in our minds. We spent hours pretending to be wizards and casting epic imaginary spells.

Now, we want to bring that kind of creative freedom to video games. Instead of giving the player pre-packaged spells, CodeSpells allows you to craft your own magical spells. It's the ultimate spellcrafting sandbox.

What makes it all possible is code. The game provides a coding interface where you can specify exactly what your spells will do. This interface is intuitive enough for individuals young and old who have never coded before. But skilled coders will also enjoy using their coding skills in new and creative ways! Even children can use this interface to make mountains out of the terrain, create an impenetrable force field around themselves, or even make a golem creature out of the surrounding rocks. The sky is the limit!
I’ve known Valerie Barr for years and believe that she was honest with the agents. I don’t believe that she lied about her involvement with a domestic terrorist organization that had “ties” (whatever that means) to two political activist organizations she belonged to.
I’m most shocked about the process. Valerie was dismissed on the basis of a report by a possibly biased agent — there are no transcripts or notes from the interview. The OPM is prosecutor, judge, and jury — there is no defense. Doesn’t sound like due process to me. It’s a loss to our community that a well-regarded researcher is forced out of NSF.
It's a greater loss in that it will make it less likely that another "typical liberal college professor" (a quote from the article below) might offer to serve.
After again being asked if she had been a member of any organization that espoused violence, Barr was grilled for 4.5 hours about her knowledge of all three organizations and several individuals with ties to them, including the persons who tried to rob the Brink’s truck. Four people were found guilty of murder in that attack and sentenced to lengthy prison terms, including Kathy Boudin, who was released in 2003 and is now an adjunct assistant professor of social work at Columbia University. “I found out about the Brink’s robbery by hearing it on the news, and just like everybody else I was shocked,” she recalls.
But OPM apparently thought otherwise, again citing her “deliberate misrepresentation” in its report. Relying heavily on that investigation, NSF handed Barr a letter on 25 July saying that it planned to terminate her IPA at the end of the first year because the OPM review had found her to be unfit for the job…Barr was given a chance to appeal NSF’s decision, and on 11 August she submitted a letter stating that OPM’s summary report of its investigation “contains many errors or mischaracterizations of my statements.” As is standard practice, agencies receive only a summary of the OPM investigation, not a full report, and lawyers familiar with the process say that an agent’s interview notes are typically destroyed after the report is written.
The Snowbird conference is the every-other-year meeting of deans and department chairs in computing, to talk about how to support computing research and education. There was a panel this last summer on the state of CS education in K-12.
This panel discusses the role that U.S. research departments must play in sustaining CS in K-12. The panelists will address issues of educational reform, while highlighting the role that academia has played in other disciplines; illustrate the breadth of existing efforts from the perspective of a university-led project; and consider how departments could contribute to building the needed research base for CS education.

Chair: Jan Cuny, NSF. Speakers: Jeanne Century, CEMSE, University of Chicago; Dan Garcia, University of California at Berkeley; Susanne Hambrusch, Purdue University.
The slides are available here. I particularly liked Susanne Hambrusch’s slides on the role of computing education research in the University. The slide below (copied from her deck) addresses a particularly critical point — computing education research has to be seen as a real research area, not just what some education-focused faculty do.
This tension between computing education research being research versus supporting the education mission of the University comes up often for me. I was recently asked, “How does your work with high school teachers improve the education of CS undergraduates at our school?” I replied, “It probably doesn’t. This is my research. I’ll bet that researchers in your medical school study cancers that your undergraduates don’t have.” Susanne is pointing out that we have to get past this confusion. Yes, Universities teach. But Universities also study and explore questions of interest. If those questions of interest involve education, it should not be immediately confounded with the teaching that Universities do.
I’m intrigued by this project and would really love to see some analysis. Do students who use Scratch recognize Sniff as being a text form of Scratch? If it doesn’t work well, is the problem in the syntax and semantics of Sniff, and maybe we could do better? Do students transfer their knowledge of Scratch into Sniff?
So if Scratch is so great, why do we need Sniff? The problem is that at some point you need to move beyond Scratch. It could be that you want to tackle a different kind of problem that Scratch can't handle well. Perhaps you've realised that graphical programming is a nice idea, and a great way to start, but in practice it's clumsy. Clicking and dragging blocks is a tedious and slow way to build large programs. It could be you need something that feels "more grown up" – the cat sprite/logo is cute, and even older children will find it fun for a while, but Scratch is designed to look and feel like a toy even though it's actually very powerful. For whatever reason, at some point you start to look for something "better".
My report on ICER 2014 is at Blog@CACM here. I also participated in the post-ICER Critical Research Review or Work-in-Progress Workshop (both titles have appeared at different times). Colleen Lewis organized it, based on the “functions” peer review that Education graduate students do at Berkeley. It was great, far better than I might have guessed.
I wanted to participate, in order to support and be part of this new kind of activity at ICER. I was expecting maybe a dozen people in a room, where one at a time a person would present for 15-20 minutes and then get feedback for a few minutes. Y’know — a “workshop.” Boy, was I wrong.
Instead, Colleen broke us up into two groups of five. (The small size was critical.) All of us presented some brief paper (a couple of pages preferred) that everyone read beforehand. Colleen gave each of us a writeup on the desired culture and tone for the event. "Don't be mean" and "Don't be defensive" and "Be nice" were some of the common themes in those directions. At the CRR, each group of five went off to a different room/space.
Over the course of five hours (two the first day, three the next), each participant had her or his turn to share their work. Sometimes we saw data (a video, or a bit of interview transcript) that the group was meant to help interpret. Sometimes we saw a student problem or a design problem, and we brainstormed theoretical perspectives that could help us gain leverage on understanding the student's issues or on improving the design.
It wasn't a presentation, and it wasn't an audience. It was (to use Colleen's phrase) "borrowing four smart people's brains to work on your problem for an hour." I got a lot out of the feedback on my problem (related to the Constructionism for Adults post from awhile back). It was enormous fun digging into the others' problems. Ben Shapiro of Tufts, Craig Miller from DePaul, Sarah Esper of UCSD, and Kate Sanders from Rhode Island College were my teammates — it really felt more like a team working together toward joint success than like a presentation.
At the end, we evaluated the activity to figure out what worked and what didn’t. It really worked to have an easel for a note-taker (not the presenter/leader) to use to track all the discussion. The notes helped the group figure out where they were at, and were a wonderful artifact for the presenter afterward.
Overall, it was a huge success. I expect that we’ll see many future ICER (and other CER venue) papers coming out of the work we shared in Glasgow. I encourage others to participate in the CRR in future years.
As with the post I made last week, we've been working on a bunch of experiment setups during the summer, and are now looking for participants. This one is open to most readers of this blog.
We have found that there is a lot of literature on how to design text to be readable on the screen, but for interactive ebooks with embedded elements like coding areas, visualizations, and Parson's problems, we know less about usability. Steven Moore is an undergraduate researcher working with us, and he's put together a collection of three different ebooks and a survey on preferences for each. We'd love for participants to try out his ebook samples and complete the survey.
We are a research group at Georgia Tech developing new approaches to teaching computer science at a distance. In collaboration with researchers at Luther College, we have created a new kind of electronic book for learning Python. The book is entirely web-based and cross-platform, with special features, including programming within the book, program visualizations, videos, multiple-choice questions, and Parson’s problems (a special kind of programming problem).
We are currently seeking individuals with 6 months or more experience with programming in a textual language. If you are willing to volunteer, you will need to complete a survey regarding the design and usability of three different interactive computer science e-books and specific components within those e-books. Links to the e-books will be provided within the survey and the whole study can be completed via most web browsers. The survey should take roughly forty-five minutes to complete. We would like you to complete it by September 30th, 2014.
The risks involved are no greater than those involved in daily activities. You will receive a $15.00 gift card for completing the survey. Study records will be kept confidential and your participation in this study is greatly valued.
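For readers unfamiliar with the Parson's problems mentioned above: they are puzzles where the lines of a working program are presented scrambled, and the learner must reassemble them in the correct order. Here is a minimal sketch of the idea in Python (illustrative only; the puzzle and function names are hypothetical, not taken from the actual ebooks, where reordering happens through a drag-and-drop interface):

```python
import random

# A hypothetical Parson's problem: the learner receives these
# lines shuffled and must put them back into working order.
SOLUTION = [
    "def average(numbers):",
    "    total = 0",
    "    for n in numbers:",
    "        total += n",
    "    return total / len(numbers)",
]

def make_puzzle(lines, seed=42):
    """Return a shuffled copy of the solution's lines."""
    shuffled = list(lines)
    random.Random(seed).shuffle(shuffled)
    return shuffled

def check_answer(attempt, solution):
    """An answer is correct when it restores the original order."""
    return list(attempt) == list(solution)

puzzle = make_puzzle(SOLUTION)
print(check_answer(SOLUTION, SOLUTION))                 # True: correct ordering
print(check_answer(list(reversed(SOLUTION)), SOLUTION)) # False: lines out of order
```

Because the learner arranges given lines rather than typing code, Parson's problems isolate the skill of ordering program logic from the mechanics of syntax, which is part of why they are interesting elements to study in an interactive ebook.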