Implementing Design Studio Pedagogy with an Augmented Reality CS Classroom
Architecture and art are often taught in a design studio setting: students work in a large, open space where everyone can see what everyone else is doing all the time, for collaboration, for inspiration, and for camaraderie. Colleen Kehoe wrote her dissertation on the advantages of these pedagogies for learning and how they might be used in CS classes. Colleen helped establish the use of design gallery walks (where students' work is displayed for the whole class to review and comment on) in some of our HCI classes. The challenge of using design studio pedagogies in most CS classes is that our work lives only on the screen, where the only ones who can see it are those sitting right in front of it.
This semester, we built a design studio classroom using augmented reality technology, and taught a recitation section of a Media Computation course using it.
The room was created by Blair MacIntyre with students Ashwin Kacchara and Ryan Jones. They used technology from Microsoft Research called RoomAlive, which uses Kinects to scan the room and develop a model to drive the projectors. Blair and his students defined a set of virtual displays for each student’s work. When students were in the room, they programmed in Pythy from Steve Edwards, a browser-based Python IDE that supports the Media Computation library. Ryan modified Pythy so that the last picture generated from student work was saved to a database, then he and Ashwin used RoomAlive to display those images around the room. The effect was that the wall was covered with the latest of students’ work for all to see.
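The core of the pipeline is simple: the IDE saves each student's most recent picture to a database, and the display side reads one image per student to put on the wall. Here's a minimal sketch of that idea in Python, assuming a SQLite store; the names (`save_latest_picture`, `latest_pictures`) are illustrative, not Pythy's actual API.

```python
# Hypothetical sketch of the "save the last picture per student" hook
# described above. Only the most recent image per student is kept, so
# the wall always shows each student's latest work.
import sqlite3


def open_store(path=":memory:"):
    """Open (or create) the image store."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS latest_pictures ("
        "student_id TEXT PRIMARY KEY, png BLOB)"
    )
    return conn


def save_latest_picture(conn, student_id, png_bytes):
    # INSERT OR REPLACE overwrites any earlier picture for this student.
    conn.execute(
        "INSERT OR REPLACE INTO latest_pictures (student_id, png) "
        "VALUES (?, ?)",
        (student_id, png_bytes),
    )
    conn.commit()


def fetch_all_latest(conn):
    # The display process would poll this to refresh the virtual
    # displays around the room: one current image per student.
    return dict(conn.execute("SELECT student_id, png FROM latest_pictures"))
```

In the actual system, Pythy ran in the browser and RoomAlive rendered the fetched images onto the projector-defined virtual displays; this sketch only captures the "latest image per student" storage behavior.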
Betsy DiSalvo is an expert on design pedagogies. She guided the design of the room and coached me (as the teacher in the room) in figuring out how to use it. Amber Solomon is a first-year PhD student working with me who evaluated the project. Betsy has been working with Amber on the evaluation, since I'm conflicted as the teacher of the class. Amber has done an amazing job: she observed literally hours of the design studio recitation section and a comparison recitation section, then interviewed almost all of the students in the design studio classroom. They've already written one article, for the IEEE Virtual Reality 2016 Workshop on K-12 Embodied Learning through Virtual & Augmented Reality (KELVAR), which is available through the workshop website.
I had a great time teaching in the class. I was able to move around the room, pointing to student work as examples of things I wanted to highlight. I knew the room was really working the first time a student produced a humorous picture (turning Donald Trump into a Shrek-like green). Students started laughing and grabbing one another to get their attention. Then another student pulled out his phone to Snapchat the image. How often do CS students use Snapchat to share other students' CS work?
I’m writing this now because Amber is now finishing her interviews, and we’re already getting some surprising results. I don’t want to give away too much, because I hope she’s going to publish another fascinating paper on her results.
We were worried about the effect of the technology on the students. Would it frighten students off? Would it be too unusual? Amber says that students didn’t find it unusual or novel.
The biggest surprise for me so far: it helped students get help. In any CS class, you can provide help, but it's hard to get students to take it. There is a whole literature on help-seeking behavior. For a student to seek help, the student first has to admit that he needs help, and that can trigger imposter syndrome. Students told Amber that they were willing to ask for help because their work (and everyone else's) was visible, so everyone knew who needed help. One student told Amber, "I liked it alot. It projects like the last image someone produced. You could see who had already, like, fully understood the topic and, like, who had completed the task and then you could ask them for help if you needed too, or people who are struggling you could help them."
We’re grateful for support for this project from Microsoft Research and from a GVU/IPaT Engagement Seed Grant.