What’s the role of the body in learning computing?

The notion of “embodied cognition” and the role of the body in learning has been popping up for me in various places recently, which has me wondering about the role of sensing and moving in the physical world for learning computing.

  • At the Journal of the Learning Sciences editorial board meeting, there was some discussion about papers on “embodied cognition” (where I joked to the person sitting next to me, “Do we do papers on disembodied cognition?” but got caught by Cindy Hmelo, co-Editor-in-Chief, who, like a teacher, made me repeat it in front of the whole class.)
  • As Aleata mentioned, Ulrich Hoppe was strongly against the current interest in tangible programming at ICLS.  He criticized both the LilyPad and our Media Computation work as focusing on the wrong things: he sees them as pandering to “engagement and motivation,” and he feels it’s more important to get students to think critically about programming languages different from the ones we use now.  He believes that we should shift our focus to declarative programming languages, which have a stronger mathematical base, and he provided a detailed example in Prolog (a small illustrative sketch in that spirit appears after this list).
  • In the recent issue of CACM, there’s an interesting (but too short) interview with Chuck Thacker (new Turing Award winner) where he talks about his interest in tablet computers.  He suggests that the only way of getting information faster into a computer than typing is “to use a different set of muscles…writing or drawing.”
  • I’m currently reading some papers from the Spatial Intelligence and Learning Center where they talk about how we evolved learning with our hands.  The interesting question they’re exploring is how use of our hands might support learning of higher-level cognitive functioning.  Does the mere act of writing notes about complex topics help us learn those topics?  How does use of our hands in manipulatives enhance learning?
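
Hoppe’s actual Prolog example isn’t reproduced here, but a minimal sketch of the declarative style he is arguing for might look something like the following (written in Haskell rather than Prolog, with made-up family facts; none of this comes from his talk).  The point is that the program states what an ancestor is, as a logical definition, rather than prescribing a sequence of steps for computing one.

    -- Illustrative sketch only, not Hoppe's example: relationships are stated
    -- as facts plus a recursive definition, and the program reads like a
    -- logical specification rather than a sequence of commands.

    -- Hypothetical (parent, child) facts.
    parents :: [(String, String)]
    parents = [("alice", "bob"), ("bob", "carol"), ("carol", "dave")]

    -- x is an ancestor of y if x is a parent of y, or x is a parent of
    -- someone who is an ancestor of y.
    ancestor :: String -> String -> Bool
    ancestor x y =
      (x, y) `elem` parents
        || any (\z -> ancestor z y) [c | (p, c) <- parents, p == x]

    main :: IO ()
    main = print (ancestor "alice" "dave")  -- prints True

The appeal, from that standpoint, is that such a definition can be reasoned about logically, by unfolding the rules, rather than by tracing state changes.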

Which, of course, has me wondering about the role of these manipulatives in learning about computing.  Does the use of the robot and the textiles in the LilyPad work trigger a deep evolutionary mechanism that might enhance learning of the much more abstract computer science?  I don’t know, but I’m intrigued and am digging further.

One such connection, which has been a focus of some of my research, is applying embodied cognition research and theory to explain various anomalies in educational research and to inform new techniques for instruction and educational technology, as described in a recent post about an upcoming AERA symposium on embodied cognition and education that I am organizing.

For example, researchers have found that attending to students’ gestures, or using gestures while explaining concepts or procedures (for example, in a math class), helps student understanding.  They have also found that having students interact with and physically manipulate models (such as acting out a story or physically manipulating a simulation) helps reading comprehension and physics understanding.

via The Connection between Embodied Cognition and Learning: 3 Examples from Physics Education « EdTechDev.

July 9, 2010 at 6:51 am 7 comments

