Posts tagged ‘ICER’

Teachers are not the same as students, and the role of tracing: ICER 2017 Preview

The International Computing Education Research conference starts today at the University of Washington in Tacoma. You can find the conference schedule here, and all the proceedings in the ACM Digital Library here. In past years, all the papers have been free for the first couple weeks after the conference, so grab them while they are outside the paywall.

Yesterday was the Doctoral Consortium, which had a significant Georgia Tech presence. My colleague Betsy DiSalvo was one of the discussants, and two of my PhD students were participants.

We have two research papers being presented at ICER this year. Miranda Parker and Kantwon Rogers will be presenting Students and Teachers Use An Online AP CS Principles EBook Differently: Teacher Behavior Consistent with Expert Learners (see paper here), which is by Miranda C. Parker, Kantwon Rogers, Barbara J. Ericson, and me. Miranda and Kantwon studied the ebooks that we've been creating for AP CSP teachers and students (see links here). They're asking a big question: "Can we develop one set of materials for both high school teachers and students, or do they need different kinds of materials?" First, they showed that teachers and students behaved in statistically significantly different ways (e.g., different numbers of interactions with different types of activities). Then, they tried to explain why those differences exist.

They develop a model of teachers as expert learners (e.g., they have more prior knowledge so they can create more linkages, they know how to learn, and they are better at monitoring their learning) and of high school students as more novice learners. They dig into the log-file data to find evidence consistent with that explanation. For example, students repeatedly try to solve Parsons problems long past the point where they are likely to get them right or learn from them, while teachers move along when they get stuck. Students are also more likely than teachers to run code and then run it again with no edits in between. At the end of the paper, they offer design suggestions, based on this model, for how we might develop learning materials designed explicitly for teachers vs. students.
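To make that kind of log-file evidence concrete, here is a minimal sketch (mine, not the authors' analysis code) of how one might count "re-runs with no edits in between" from interaction logs and compare teachers with students. The event format and field names are hypothetical.

    from collections import defaultdict

    # Hypothetical log events, ordered by time; field names are made up for illustration.
    events = [
        {"user": "s1", "role": "student", "action": "run",  "code": "print(total)"},
        {"user": "s1", "role": "student", "action": "run",  "code": "print(total)"},   # re-run, no edit
        {"user": "t1", "role": "teacher", "action": "run",  "code": "print(total)"},
        {"user": "t1", "role": "teacher", "action": "edit", "code": "print(total + 1)"},
        {"user": "t1", "role": "teacher", "action": "run",  "code": "print(total + 1)"},
    ]

    def rerun_counts(events):
        """Count, per role, how often a user runs code identical to their previous
        run with no edit event in between."""
        last_run_code = {}                  # user -> code text of their most recent run
        edited_since_run = defaultdict(bool)
        counts = defaultdict(int)           # role -> number of no-edit re-runs
        for e in events:
            user = e["user"]
            if e["action"] == "edit":
                edited_since_run[user] = True
            elif e["action"] == "run":
                if (user in last_run_code
                        and not edited_since_run[user]
                        and e["code"] == last_run_code[user]):
                    counts[e["role"]] += 1
                last_run_code[user] = e["code"]
                edited_since_run[user] = False
        return dict(counts)

    print(rerun_counts(events))  # {'student': 1} for the sample events above

A per-role comparison of counts like these (normalized by each user's number of runs) is the flavor of evidence described above, though the actual ebook logs and analyses are richer than this sketch.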

Katie Cunningham will be presenting Using Tracing and Sketching to Solve Programming Problems: Replicating and Extending an Analysis of What Students Draw (see paper here), which is by Kathryn Cunningham, Sarah Blanchard, Barbara Ericson, and me. The big question here is: "Of what use is pen-and-paper sketching/tracing for CS students?" Several years ago, the Leeds Working Group (at ITiCSE 2004) did a multi-national study of how students solved complicated problems with iteration, and they collected the students' scrap paper. (You can find a copy of the paper here.) They found, not surprisingly, that students who traced code were far more likely to get the problems right. Barb was running an experiment for her study of Parsons problems and gave students scrap paper, which Katie and Sarah then analyzed.

First, they replicate the Leeds Working Group study: those who trace do better on problems where they have to predict the behavior of the code. That alone is a good result. But then Katie and Sarah go further and find that it's not always true. If a problem is fairly easy, those who trace are actually more likely to get it wrong, so the correlation goes the other way. And those who start to trace but then give up are even more likely to get it wrong than those who never traced at all.

They also start to ask a tantalizing question: Where did these tracing methods come from? A method is only useful if it gets used — what leads to use? Katie interviewed the two teachers of the class (each taught about half of the 100+ students in the study). Both teachers did tracing in class. Teacher A's method gets used by some students. Teacher B's method gets used by no students! Instead, some students use the method taught by the head Teaching Assistant. Why do some students pick up a tracing method, and why do they adopt the one that they do? Because it's easier to remember? Because it's more likely to lead to a right answer? Because they trust the person who taught it? More to explore on that one.

August 18, 2017 at 7:00 am

Call for Nominations to Chair ICER 2019

SIGCSE is changing how they organize ICER.  Posted with Judy Sheard’s permission:

The ACM/SIGCSE International Computing Education Research conference (icer.acm.org) is the premier conference in the world focused on computer science education research, now in its 13th year. The leadership structure has recently been reorganized so that the role of overseeing the selection of the program (the Program Chair) and the role of overseeing the running of the conference at a particular venue (the Site Chair) are held by different individuals.

We are currently seeking nominations for a Site Chair and a Program Chair for ICER 2019, to be held in North America.

Both chair appointments are for two years, called the "junior" and "senior" years, respectively. Site Chairs host the conference at their home institution during their senior year. Only one appointment for each role is made each year, so that in any given year there are a junior and a senior Site co-chair and a junior and a senior Program co-chair. A nominating committee, made up of the current year's Program and Site Chairs and the SIGCSE Board's ICER liaison, nominates the Site Chair and Program Chair who will begin serving two years from the current year. The SIGCSE Board makes the appointments to both roles.

For both positions, the country of the appointee's home institution will rotate geographically by year, as has been the tradition for ICER conference chairs:

  • Year 1: North America
  • Year 2: Europe
  • Year 3: North America
  • Year 4: Australasia

The criteria for appointees:

  • Program co-chair:
    1. Prior attendance at ICER
    2. Prior publication at ICER
    3. Past service on the ICER Program Committee
    4. Research excellence in Computing Education
    5. Collaborative and organizational skills sufficient to work on the Conference Committee and to share oversight of the program selection process.
  • Site chair:
    1. Prior attendance at ICER
    2. Collaborative and organizational skills sufficient to work on the Conference Committee and to oversee all of the local arrangements.
    3. Demonstrated interest in the computing education research community.

To nominate an individual, please include the individual’s CV and a cover letter explaining how the individual meets the criteria for the role. Self-nominations are welcomed. Please send nominations for the Site chair to the 2017 Site Chair, Donald Chinn (dchinn@uw.edu), and nominations for the Program chair to the 2017 Program Chair, Josh Tenenberg (jtenenbg@uw.edu). We also encourage informal expressions of interest to the individuals just mentioned.

March 13, 2017 at 7:00 am

Learning Curves, Given vs Generated Subgoal Labels, Replicating a US study in India, and Frames vs Text: More ICER 2016 Trip Reports

My Blog@CACM post for this month is a trip report on ICER 2016. I recommend Andy Ko’s excellent ICER 2016 trip report for another take on the conference. You can also see the Twitter live feed with hashtag #ICER2016.

I write in the Blog@CACM post about three papers (and reference two others), but I could easily write reports on a dozen more. The findings were that interesting and that well done. I’m going to give four more mini-summaries here, where the results are more confusing or surprising than those I included in the CACM Blog post.

This year was the first time we had a neck-and-neck race for the attendee-selected award, the "John Henry" award. The runner-up was Learning Curve Analysis for Programming: Which Concepts do Students Struggle With? by Kelly Rivers, Erik Harpstead, and Ken Koedinger. Tutoring systems can be used to track errors on knowledge concepts over multiple practice problems. Tutoring system developers can show these lovely decreasing error curves as students get more practice, which clearly demonstrate learning. Kelly wanted to see if she could do the same with open editing of code, outside of a tutoring system. She tried to use AST graphs as a proxy for programming "concepts" and to measure errors in the use of the various constructs. It didn't work, as Kelly explains in her paper. It was a nice example of an interesting and promising idea that didn't pan out, explained carefully enough to inform the next try.
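As a rough illustration of what treating AST elements as "concepts" might look like (my sketch, not the method in the paper), one could tally which node types each student submission exercises; pairing those tallies with whether each attempt was correct would then give an error curve per "concept" over time.

    import ast
    import textwrap
    from collections import Counter

    def concept_counts(source):
        """Tally AST node types in a program; each node type
        (For, While, If, Call, ...) stands in for a 'concept'."""
        tree = ast.parse(source)
        return Counter(type(node).__name__ for node in ast.walk(tree))

    student_attempt = textwrap.dedent("""
        total = 0
        for x in [1, 2, 3]:
            if x > 1:
                total = total + x
        print(total)
    """)

    print(concept_counts(student_attempt))
    # e.g., one For, one If, one Call (the print), plus counts for names and literals

The hard part, which this sketch skips entirely, is deciding which constructs an error in a free-form program should be attributed to.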

I mentioned in this blog previously that Briana Morrison and Lauren Margulieux had a replication study (see paper here), written with Adrienne Decker and using participants from Adrienne's institution. I hadn't read the paper when I wrote that first blog post, and I was amazed by their results. Recall that they had the unexpected result that changing contexts for subgoal labeling worked better (i.e., led to better performance) for students than keeping students in the same context. The weird contextual-transfer problems that they'd seen previously went away in the second (follow-on) CS class (see the snapshot from their slides below). The weird result was replicated in the first class at this new institution, so we know it's not just one strange student population, and now we know that it's a novice problem. That's fascinating, but it still doesn't really explain why. Even more interesting, once the context-transfer issues went away, students did better when they were given subgoal labels than when they generated them. That's not what happens in other fields. Why is CS different? It's such an interesting trail that they're exploring!

[Snapshot from Morrison, Margulieux, and Decker's slides]

Mike Hewner and Shitanshu Mishra replicated Mike's dissertation study about how students choose CS as a major, but in Indian institutions rather than US institutions: When Everyone Knows CS is the Best Major: Decisions about CS in an Indian context. The results that came out of the Grounded Theory analysis were quite different! Mike had found that US students use enjoyment as a proxy for ability: "If I like CS, I must be good at it, so I'll major in that." But Indian students already thought CS was the best major, and the social pressures were completely different. So Indian students chose CS if they had no other plans; CS was the default choice.

One of the more surprising results was from Thomas W. Price, Neil C.C. Brown, Dragan Lipovac, Tiffany Barnes, and Michael Kölling, Evaluation of a Frame-based Programming Editor. In a short laboratory study (not ideal, but an acceptable starting place), they asked a group of middle school students to program in Java or in Stride, the new frame-based language and editing environment from the BlueJ/Greenfoot team. They found no statistically significant differences between the two languages in terms of number of objectives completed, student frustration/satisfaction, or amount of time spent on the tasks. Yes, the Java students got more syntax errors, but that didn't seem to have a significant impact on performance or satisfaction. I found that totally unexpected. This is a result that cries out for more exploration and explanation.

There’s a lot more I could say, from Colleen Lewis’s terrific ideas to reduce the impact of CS stereotypes to a promising new method of expert heuristic evaluation of cognitive load.  I recommend reviewing the papers while they’re still free to download.

September 16, 2016 at 7:07 am

Andy Ko’s sabbatical research pivot into Computing Education

Great blog post from Andy Ko on why he’s shifting into computing education research.  I hope lots of researchers come to a similar realization — that computing education is valuable, hard, and interesting.

After I stepped down as AnswerDash CTO and began my post-tenure sabbatical, it became clear I had to pivot my research focus. No more developer tools. No more studies of productivity. I'm now much less interested in accelerating developers' work, and much more interested in shaping how developers (and developers-in-training) learn and shape their behavior.

Source: My sabbatical research pivot | Bits and Behavior

June 1, 2016 at 7:55 am

Call for Participants: ICER Doctoral Consortium, Sept 8th, Melbourne, Australia

The ICER 2016 Doctoral Consortium provides an opportunity for doctoral students studying computing education to explore and develop their research interests in a supportive workshop environment with a panel of established researchers. We invite students to apply for this opportunity to share their work with students in a similar situation as well as senior researchers in the field.

Applicants to the Doctoral Consortium should have begun their research, but should not have completed it.  We want people who have questions to raise with their peers and the more senior mentors, and who still have time to respond to and use the feedback in their research.

DC Co-Chairs for 2016:

Anthony Robins, University of Otago, New Zealand

Ben Shapiro, University of Colorado, USA

Contact us at: icerdc2016@gmail.com

The DC has the following objectives:

  • Provide participants a supportive setting for feedback on their research
  • Offer participants comments and fresh perspectives from outside their own institution
  • Promote the development of a supportive community of scholars
  • Support a new generation of researchers with information and advice on research and academic career paths
  • Contribute to the conference goals through interaction with other researchers and conference events

The DC will be held on Thursday, September 8, 2016 (prior to the main ICER conference in Melbourne, Australia). Students at any stage of their doctoral studies are welcome to apply and attend. The number of participants is limited to 15. Selected applicants will receive a partial reimbursement of $600 (USD) toward travel, accommodation, and subsistence (i.e., food) expenses. An extra $200 may be available for participants whose travel expenses greatly exceed the standard support.

Process Timeline:

  • Friday 20th May – initial submission
  • Friday 3rd June – notification of acceptance
  • Friday 17th June – camera ready copy due

You can find more information on applying at https://icer.hosting.acm.org/icer-2016/doctoral-consortium/

April 27, 2016 at 7:42 am

Call for Participation: International Computing Education Research 2016 in Melbourne, Australia

The twelfth annual ACM International Computing Education Research (ICER) Conference aims to gather high-quality contributions to the computing education research discipline. We invite submissions across a variety of categories for research investigating how people of all ages come to understand computational processes and devices, and empirical evaluation of approaches to improve that understanding in formal and informal learning environments.

Research areas of particular interest include:

  • discipline based education research (DBER) in computer science (CS), information sciences (IS), and related disciplines
  • learnability/usability of programming languages and the psychology of programming
  • pedagogical environments fostering computational thinking
  • design-based research, learner-centered design, and evaluation of educational technology supporting computing knowledge development
  • learning sciences work in the computing content domain
  • learning analytics and educational data mining in CS/IS content areas
  • informal learning experiences related to programming and software development (all ages), ranging from after-school programs for children, to end-user development communities, to workplace training of computing professionals
  • measurement instrument development and validation (e.g., concept inventories, attitudes scales, etc) for use in computing disciplines
  • research on CS/computing teacher thinking and professional development models at all levels

In addition to standard research paper contributions, we continue our longstanding commitment to fostering discussion and exploring new research areas by offering several ways to engage. These include a doctoral consortium for graduate students just prior to the conference, a work-in-progress workshop for researchers following the conference, and poster and lightning talks. This is in addition to the format of conference sessions, where all research paper presentations include time for discussion among the attendees followed by feedback to the paper presenters.

Submission Categories

ICER provides multiple options for participation, with various levels of discussion and interaction between the presenter and audience. These sessions also support work at various levels, ranging from formative work to polished, complete research results.

Research Papers

8-page limit (plus up to 2 additional pages for references), double-blind peer reviewed and published in the ACM Digital Library as part of the conference proceedings. Accepted papers are allotted 30 minutes for presentation and discussion at the conference.

Doctoral Consortium

2-page extended abstract submission required; published in the ACM Digital Library as part of the conference proceedings. Students will present their work to distinguished faculty mentors during an all-day workshop and during the conference in a dedicated poster session.

Lightning Talks and Posters

Abstract (300 words) submission required and made available on conference website, but not published in proceedings. Accepted abstracts for lightning talks will be given a 3-minute time slot for rapid presentation at the conference followed by a discussion period for all attendees. Posters may either accompany a lightning talk or may be proposed separately using the same abstract submission process.

Work in Progress Workshop

This one-day workshop is a venue to get sustained engagement with and feedback about early work in computing education. White paper submission required but not included in proceedings.

Co-located Workshops

Proposals for pre/post-conference workshops of interest to the ICER community (i.e., those that aim to advance computer science education research) are welcomed and encouraged. ICER local arrangements personnel will be available to assist with workshop logistics where possible. If interested, contact the conference chairs for more details by April 22nd, 2016: judy.sheard@monash.edu

For more information about preparation and submission, please visit the page corresponding to the submission type of interest.

Important Deadlines and Dates

Research Papers

  • Abstract submission (mandatory): Friday, April 15, 2016 at 11:59pm US Pacific Time
  • Full paper submission: Friday, April 22, 2016 at 11:59pm US Pacific Time
  • Notification of acceptance: Friday, June 3, 2016
  • Final camera-ready deadline: Friday, June 17, 2016

Other Submission Types

  • Doctoral consortium submissions: Friday, May 20, 2016
  • Lightning talk and poster proposals: Friday, June 17, 2016
  • Work in progress workshop applications: Friday, June 17, 2016

Conference Schedule

  • Doctoral Consortium: Thursday, September 8, 2016
  • ICER Conference: Friday, September 9 – midday Sunday, September 11, 2016
  • Work in Progress Workshop: Sunday, September 11 – midday Monday, September 12, 2016

More details can be found at the specific pages, linked above.

April 7, 2016 at 12:09 pm

A CS Education Research Class Syllabus

I’m teaching a graduate special-topics course on Computer Science Education Research this semester. Several folks have asked me about what goes into a class like that. Here’s the syllabus (from our “T-Square” Sakai site). The references to “Guzdial” below are to my new book, Learner-Centered Design of Computing Education, which I just turned in to Morgan & Claypool on Nov. 15. It should be available by the end of the year.

This class would look different if it were in Education rather than in Computer Science. For example, there might be less on tools. The sessions where we consider how CS Ed Research appears at CHI and IDC might no longer be relevant. Instead, I could imagine work contextualizing CS Education Research in mathematics education or science education. I would expect to see sessions on equity, on teacher development, and on computing in schools.

 

CS8803: Computer Science Education Research

College of Computing Building Room 52, 9:35-10:55 T/Th

Teacher: Mark Guzdial, guzdial@cc.gatech.edu, TSRB 324/329

Office Hours: By appointment

Course Overview: Introduction to computing education research (CER). History and influential early work. Learning goals for different populations, with particular attention to broadening participation in computing. Connections to research in learning sciences, educational psychology, science education. Design of research studies in CER, including Multi-Institutional Multi-National, laboratory, and classroom studies.

Textbook: We’ll be using readings from the ACM Digital Library (freely available on campus) and Guzdial’s new monograph Learner-Centered Design of Computing Education (draft available here in Resources, and eventually at the Morgan & Claypool site http://www.morganclaypool.com/toc/hci/1/1). We’ll use other readings that are available on the Web or via the Resources folder on T-Square.

Grades

  • 30%: Do 5 Reading Reflections. There are 6 opportunities for reading assignments; students can skip one. Reading reflections are marked check or minus (a minus means something needs to be fixed). All reading reflections should be typed, with font >= 11 pt. No reading reflection should be longer than 3 pages, typed and single-spaced.
  • 15%: Class participation. Class time will be interactive, with little lecture. It’s a significant part of the learning in the class to participate. (The programming assignment is part of class participation.)
  • 10%: Research Study Re-Design. Redesign a research study from a published paper (referenced in Guzdial or published in ICER, SIGCSE, RESPECT, or ITiCSE) to improve on the scope and findings. Due Oct 20.
  • 10%: Where would you use this? Try out any of Scratch, Alice, App Inventor, Snap, StarLogo, NetLogo, Blockly, or Pencil Code. Knowing what you know from class, would you recommend this environment? When? For whom? To learn what? Write a short (2-3 page) paper. Due Nov. 19.
  • 10%: Research Question White Paper. Write a short (3-4 pages) white paper defining a research question that’s worth exploring in CER. Explain why it’s an important, interesting, and answerable question. Identify the research community that you are speaking to with this research question. Think first section of an NSF proposal. Due Nov 12.
  • 25%: Research Study Design. Propose a study to explore your unique research question. Think NSF proposal. Plan on 6-10 pages. 15% for the paper, due Nov 24; 10% for a 10-minute presentation (with 5-minute Q&A) during the last week of class.

Syllabus

Week 1

Aug 18: Introduction to class

  • Who are you and what is your experience with computing education?
  • Small Group Discussion: What do you want to know about computing education research? What do you think is unknown and worth exploring?

Aug 20: Computing for Everyone. Read Chapter 1 of Guzdial.

  • Come in with a quote that’s “interesting”
  • Pro/Con Debate: “We should teach computing to everyone.”

Week 2

Aug 25: Learning Sciences

Aug 27: The Challenges of Learning Programming. Read Chapter 2 of Guzdial.

  • Come in with a quote that’s “interesting”
  • Small group activity: What’s your hypothesis for why programming is hard? How would you test your hypothesis?
  • Reading Reflection: Use ideas and quotes from Chapters 1 and 2 of “How People Learn” to explain what’s hard about learning to program.

Week 3

Sep 1: Read Multi-institutional, multi-national studies in CSEd Research: some design considerations and trade-offs (ACM DL link)

  • Come in with a quote that’s “interesting”
  • Compare and contrast: Randomized controlled trials (see definition) vs. longitudinal studies (see definition) vs. MIMN studies.
    • What is each good for?
    • Why not use more RCTs and longitudinal studies in computing education?

Sep 3: Read Computational Thinking and Using Programming to Learn in Guzdial

  • Generate a list: What are examples of computational thinking?
  • Small group activity: Have you ever used programming to help you learn something else? What are the characteristics of when programming helps and when it gets in the way?

Week 4

Sep 8: Read the first Chapter of Changing Minds at this link and Weintrop and Wilensky from ICER 2015 (ACM DL link)

  • Generate a list: What are characteristics of programming environments that support learning?
  • Small group activity: How do characteristics of programming for software development and for learning differ?
  • Reading Reflection: Identify some testable claims about Boxer in diSessa’s chapter. How would you test those claims?

Sep 10: Read Media Computation and Contextualized Computing Education in Guzdial

  • Come in with a quote that’s “interesting”
  • A mini-lecture with peer instruction and prediction using Media Computation.
  • Reading Reflection: When might contextualized computing help, and where might it not?

Week 5

Sep 15: Write a program to create something of interest or answer a question of interest before coming to class.

  1. Either download JES (from Github link) and create a picture or sound that you find interesting (see the example sketch after this list).
  2. Or download Python (we recommend using the Enthought install) and use the Computational Freakonomics website and course notes to answer a question of interest.
  3. Or use the CSPrinciples Ebook Data Chapters to answer a question about pollution in states.
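For option 1, here is a minimal sketch (mine, just for illustration) of the kind of picture program a student might bring in, assuming the standard JES Media Computation functions (makeEmptyPicture, getPixels, getX, getY, setColor, makeColor, show). It is meant to be run inside JES.

    # Run inside JES (Jython): the Media Computation functions used below are built in there.
    def gradientPicture(width, height):
        pic = makeEmptyPicture(width, height)
        for px in getPixels(pic):
            # Shade red left-to-right and blue top-to-bottom.
            red = int(255.0 * getX(px) / width)
            blue = int(255.0 * getY(px) / height)
            setColor(px, makeColor(red, 0, blue))
        return pic

    show(gradientPicture(200, 100))

Swap in your own color rule, sizes, or sounds; the point is to bring something you can show and talk about.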

Be prepared to show what you made or what you learned in class.

Come to class ready to answer two questions:

  • Did this motivate you to learn more about CS or the context?
  • Where did programming get in the way, and where did it help?

Sep 17: Read Adults as Computing Learners in Guzdial.

  • Come in with a quote that’s “interesting”
  • Small group activity: What’s similar and dissimilar between the teachers and the graphic designers? Identify another class of adults who might need to learn computing. Which group are they more like?

Week 6

Sep 22: Read The state of the art in end-user software engineering (ACM DL link)

  • Come in with a quote that’s “interesting”
  • Build two lists: Features of a programming environment that support end-user programming and those that support learning about computing by end-user programmers.

Sep 24: Read Learner-Centered Computing Education for CS Majors by Guzdial

  • Come in with a quote that’s “interesting”
  • Small group activity: Come up with examples from your own experience of (a) CS education that you see as learner-centered and (b) CS education that was not learner-centered.
  • Reading Reflection: Contrast the adults in Chapter 5 and the non-majors in Chapter 6 with the CS majors in Chapter 7. What’s similar and what’s different about their learning and the support that they need?

Week 7

Sep 29: Read one of:

  • Spatial Skills Training in Introductory Computing (see ACM DL link)
  • Subgoals, Context, and Worked Examples in Learning Computing Problem Solving (see ACM DL link)
  • Boys’ Needlework: Understanding Gendered and Indigenous Perspectives on Computing and Crafting with Electronic Textiles (see ACM DL link)

Come to class ready (a) to summarize your paper and (b) to support/refute these three hypotheses:

  • We ought to add spatial skills training in all introductory CS courses.
  • We ought to use subgoal-labeled worked examples in all introductory CS courses.
  • We have to consider gender and cultural relevance in designing all introductory CS courses.
  • Reading Reflection: You are the Director of Georgia Tech’s Division of Computing Instruction. You may implement one change across all of your introductory courses, and you have very little budget. What will you change?

Oct 1: Read Towards Computing for All in Guzdial.

  • Come in with a quote that’s “interesting”
  • BIG list: What do we most need to know to advance computing for all? Where are the research gaps?
  • Everyone leave with a personal list of the top three research gaps that you find most interesting.
  • Reading Reflection: Pick any paper referenced in Guzdial that we did not read separately in this class. Read it and summarize it for me.

Week 8

Oct 6: Read Margulieux and Madden’s “Educational Research Primer” (in class Resources)

  • Small group activity: For your favorite research gaps, what research methods would you use to fill some of that gap?
  • Group activity list: What are the research methods that we need to learn more about?

Oct 8: RESEARCH METHODS: Based on the Oct 6 discussion, we’ll pick a paper or two to read here to inform our knowledge of research methods.

Newer Research

Week 9

Oct 13: No class! Fall Break.

Oct 15: RESEARCH METHODS: Based on the Oct 6 discussion, we’ll pick a paper or two to read here to inform our knowledge of research methods.

  • Discussion of Research Project: You don’t have to do it. You do have to design it.
    • First step: Define your question (due Nov 10), and make it answerable.
    • Second step: Tell us how you’d answer it.

Older Research

Week 10

Oct 20: Research Re-Design Due Here By 5 pm.

Oct 22: Read the CE21 and IUSE proposals in Resources. (Note: Neither was funded in this form.)

  • Group Dissection:
    • What are the research questions?
    • What are the hypotheses?
    • What are the research methods?
  • Small group: Is this do-able? Would you give it a thumbs-up or a thumbs-down?

Week 11

Oct 27: What’s involved in reaching and studying populations at large scale? Read 37 Million Compilations: Investigating Novice Programming Mistakes in Large-Scale Student Data (ACM DL link) and Programming in the wild: trends in youth computational participation in the online scratch community (ACM DL link)

  • Come in with a quote that’s “interesting”
  • Two lists: What can we know from looking at these kinds of data, and what can’t we know?

Oct 29: What’s involved in reaching and studying populations at small scale (interviews/phenomenography)? Read Graduating students’ designs: through a phenomenographic lens (ACM DL link)

  • Come in with a quote that’s “interesting”
  • Small group discussion: What can we answer with a phenomenographic approach that we can’t learn (easily) in other ways?

Week 12

Nov 3: What’s involved in reaching and studying populations in high school? Read A Crafts-Oriented Approach to Computing in High School: Introducing Computational Concepts, Practices, and Perspectives with Electronic Textiles (ACM DL link)

  • Come in with a quote that’s “interesting”
  • Storytime: Sharing stories about getting into K-12 schools.

Nov 5: CS Education Research in CHI. Read Learning on the job: characterizing the programming knowledge and learning strategies of web designers (ACM DL link) and Programming in the pond: a tabletop computer programming exhibit (ACM DL link)

  • Come in with a quote that’s “interesting”
  • Group list: What makes a CHI paper different from an ICER paper?

Week 13

Nov 10: CS Education Research in IDC. Read Strawbies: explorations in tangible programming (ACM DL link) and “Let’s dive into it!”: Learning electricity with multiple representations (ACM DL link)

  • Come in with a quote that’s “interesting”
  • Group list: What makes an IDC paper different?

Nov 12: Research White Paper Due Here

CS Ed Research at Georgia Tech. Read one of Betsy DiSalvo’s papers — your choice.

  • Come in with a quote that’s “interesting”
  • Small group: Contrast Betsy’s research questions and methods with those of Mark and his students.

Week 14

Nov 17: CS Ed Research at Georgia Tech. Read Engaging underrepresented groups in high school introductory computing through computational remixing with EarSketch (ACM DL link) and EarSketch: A Web-based Environment for Teaching Introductory Computer Science Through Music Remixing (ACM DL link)

  • Group list:
    • What are the research questions for EarSketch?
    • What are the research hypotheses?
    • What are the research methods?

Nov 19: Try it out! Hand in your Where would you use this? papers before class. Come to class prepared to demo the environment you picked.

  • Debate: For a set of audiences and learning goals that we define in class, argue that your environment best meets that need.

Week 15

Nov 24: Research Design Paper Due Here.

Nov 26: No Class! Eat Turkey.

Week 16

Dec 1: Present Research Designs

Dec 3: Present Research Designs

November 18, 2015 at 8:22 am
