## What students get wrong when building computational physics models in Python: Caballero thesis part 2

Danny’s first study found that students studying Matter and Interactions didn’t do better on the FCI (Force Concept Inventory).  That’s not a condemnation of M&I: the FCI is an older, narrow measure of physics learning, and the other things that M&I covers are very important.  In fact, computational modeling is a critical new learning outcome that science students need.

So, the next thing that Danny studied in his thesis was what problems students face when they build physics models in VPython.  He studied one particular homework assignment, in which students were given a piece of VPython code that modeled a projectile.

The grading system gave students the program with the variables filled in with randomly generated values; the force-calculation portion was blank. The system also gave them the correct answer for the given program, assuming the force calculation was implemented correctly. Finally, the students were given a description of a second situation.  They had to complete the force calculation (and could use the given, correct answer to check it), and then change the constants to model the new situation.  They submitted the final program.
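To make the setup concrete, here is a minimal sketch of what such an assignment might look like, in plain Python rather than VPython. All the constants and names here are hypothetical, not Danny’s actual assignment; the marked line is the kind of force calculation students had to supply:

```python
# Sketch of a projectile-motion assignment in the style described above.
# Plain Python stands in for VPython; all values are hypothetical.

m = 0.5              # mass of projectile (kg) -- randomly generated per student
g = 9.8              # gravitational field strength (N/kg)
pos = [0.0, 10.0]    # initial position (m)
vel = [4.0, 0.0]     # initial velocity (m/s)
dt = 0.01            # time step (s)

while pos[1] > 0:
    # --- Force calculation: the blank the students had to fill in ---
    force = [0.0, -m * g]      # net force: gravity only
    # ----------------------------------------------------------------
    # Update form used in Matter & Interactions courses: v += (F/m) dt
    vel = [vel[i] + (force[i] / m) * dt for i in range(2)]
    pos = [pos[i] + vel[i] * dt for i in range(2)]

print(pos[0])   # horizontal distance covered when the projectile lands
```

Changing the constants at the top to match a newly described situation, without touching the loop, is exactly the “mapping” step the assignment then asked for.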

Danny studied about 1400 of these submitted programs.  Only about 60% of them were correct.

He and his advisor coded the errors.  All of them.  And they had 91% inter-rater reliability, which is amazing!  Danny then used cluster analysis to group the errors.

Here’s what he found (image below taken from his slides, not his thesis):

23.8% of the students couldn’t get the test case to work, and 19.8% of the students got the mapping to the new test condition wrong.  That last one is a common CS error: something that had to be inside the loop was moved before the loop, so it worked once but never got updated.
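That bug class is easy to illustrate. In this sketch (plain Python, hypothetical values, with linear air drag added just to make the force depend on the loop state), the drag force is computed once before the loop instead of inside it:

```python
# A quantity that depends on the loop state is computed before the loop,
# so it "works once" but never gets updated.  Hypothetical values:
# a ball thrown upward at 20 m/s with linear air drag.

m, g, b, dt = 0.5, 9.8, 0.1, 0.01

# BUGGY: drag computed from the initial velocity, outside the loop
v_buggy = 20.0
drag = -b * v_buggy          # frozen at its t = 0 value
for _ in range(100):         # simulate 1 second
    v_buggy += ((-m * g + drag) / m) * dt

# CORRECT: drag recomputed each iteration from the current velocity
v_ok = 20.0
for _ in range(100):
    drag = -b * v_ok         # updated every time step
    v_ok += ((-m * g + drag) / m) * dt

print(v_buggy, v_ok)         # the two simulations diverge
```

The buggy version over-decelerates because it keeps applying the large initial drag even as the ball slows down.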

Notice that a lot of the students got an “Error in Force Calculation.”  Some of these were sign errors, which are as much physics errors as computation errors.  But many of the students tried to raise a value to a vector power.  VPython caught that as a type error, and the students couldn’t understand the error message.  Some of these students plugged in something that got past the error but wasn’t physically correct.  That’s a pretty common student strategy (one that Matt Jadud has documented in detail): focus on getting rid of the error message without making sure the program still makes sense.
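You can reproduce the flavor of that error without VPython at all. In this sketch, a plain tuple stands in for a VPython vector (hypothetical values):

```python
# A plain tuple standing in for a VPython vector (hypothetical values).
r = (3.0, 4.0, 0.0)

try:
    denom = 2 ** r          # raising a scalar to a vector power
except TypeError as e:
    msg = str(e)

# The message is phrased in terms of operand types, not vectors or
# physics -- exactly the mismatch that left students stuck.
print(msg)   # unsupported operand type(s) for ** or pow(): 'int' and 'tuple'
```

Nothing in that message mentions vectors, magnitudes, or forces, so a student without a model of Python’s type system has little to go on.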

Danny suggests that these were physics mistakes.  I disagree.  I think that these are computation errors, or at best, computational modeling errors.  Many students don’t understand how to map from a situation to a set of constants in a program.  (Given how much difficulty we know students have understanding variables in programs, I wouldn’t be surprised if they don’t really understand what the constants mean or what they do in the program.)  They don’t understand Python’s error messages, which are about types, not about physics.

Danny’s results help us in figuring out how to teach computational modeling better.

• These results can inform our development of new computational modeling environments for students. Python is a language designed for developers, not for physics students creating computational models.  Python’s error messages weren’t designed to be understood by that audience — they explain errors in terms of computational ideas, not in terms of modeling ideas.
• These results can also inform how we teach computational modeling.  I asked, and the lectures never included live coding, which has been identified as a best practice for teaching computer science.  This means that students never saw someone map from a problem to a program, nor saw anyone interpret error messages.

If we want to see computation used in teaching across STEM, we have to know about these kinds of problems, and figure out how to address them.

• 1. Alan Kay  |  August 1, 2011 at 10:30 am

Based on absolutely nothing but “a feeling” …

It feels to me as though a lot of these kinds of results are partially artifacts of “teaching to largish classrooms of students”.

I also agree that Python is likely not the best form of language for this kind of thinking and building … but the nagging feeling remains that — compared to some of the best teaching I’ve witnessed over the years e.g. from Betty Edwards and Tim Gallwey, neither of whom would ever let a learner even form the misconceptions — a lot of the problems stem from the poor teacher to student ratio and a kind of factory model of learning, which really doesn’t apply here.

Cheers,

Alan

• 2. Mark Guzdial  |  August 1, 2011 at 10:58 am

Hi Alan,

I completely agree that these results arise due to large classes, but that’s exactly what Danny is interested in. He considers in his thesis other ways of embedding computational modeling in physics, and other ways of teaching introductory physics, but dismisses them: “Fundamental alterations of the introductory course are rarely implemented at large scale, hence, the impact of such changes on student learning is not yet well understood.” I have this sentence marked in my copy of the thesis as being confusing and troubling. Don’t we better understand changes to curricula at small scale?

As I went on in the thesis, I realized that Danny is a realist, and I sympathize with his perspective. The dominant model of teaching is large classes in most of STEM in state universities, and student learning in that situation is what we are challenged to address. The funding situation in the United States is unlikely to change in the next couple of decades such that we can move away from enormous classes. We have to understand how to teach large numbers of students. Physics I will always be huge at state institutions. At Georgia Tech, we teach over 1600 students in introductory computer science. If an intervention works with few students per teacher, but doesn’t work at these scales, then it’s simply not useful for most US universities.

Danny’s results aren’t about fundamental notions of learning. They’re about education at-scale.

Cheers,
Mark

• 3. Alan Kay  |  August 1, 2011 at 11:16 am

Hi Mark

Again, my feeling is that more will be accomplished with good 1 on 1 computer tutors than trying to find a way to make huge student to human-teacher ratios work.

Trying to fix a fundamentally flawed idea that has little to no chance of being fixable seems a poor allocation of resources.

Cheers,

Alan

• 4. Mark Guzdial  |  August 1, 2011 at 12:04 pm

Hi Alan,

I’ve started thinking about these huge classes as being 1-on-1 sessions between educational technology (including tutors and also new kinds of ebooks) and students, with the added advantage of occasionally getting face-to-face time, where one can do student-engagement activities like peer instruction.

Cheers,
Mark

• 5. Alan Kay  |  August 1, 2011 at 12:21 pm

Hi Mark

This is more in accord with my intuition …

Cheers,

Alan

• 6. Bill Mill  |  August 2, 2011 at 8:54 am

Mark,

In the future, people will want to read and execute any source you post! Your audience is nerds 🙂 Since it’s kind of hard to read in the screenshot, I gisted it: https://gist.github.com/1120117

• 8. Alfredo Louro  |  August 2, 2011 at 10:33 am

I would disagree that raising a vector to a power is a computational error and not a physics error. In my view, it is very much a physics error, not understanding what a vector quantity is.

Beyond this, I think the errors are great! They are very revealing about what fails in the thought process and what needs to be addressed.

• 9. adawes  |  August 4, 2011 at 2:23 am

Very interesting results. Python can handle exceptions in very customizable ways, so it should be possible to specify error messages for these basic cases. Perhaps the common ones could be replaced with something that informs the students on a physics level rather than throwing a basic type error.

It would be interesting to see if something like VPython could be extended by rephrasing common errors to make them more informative and apply these results to future modeling environments.
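A minimal sketch of that rephrasing idea (the wrapper name and the message wording below are hypothetical, not part of VPython):

```python
# Catch the raw TypeError and re-raise it with a physics-level message.
# The wrapper name and message wording are hypothetical illustrations.

def physics_power(base, exponent):
    """Like base ** exponent, but with a modeling-oriented error message."""
    try:
        return base ** exponent
    except TypeError:
        raise TypeError(
            "An exponent must be a scalar, not a vector. "
            "Did you mean to use the vector's magnitude, e.g. mag(r)**2?"
        ) from None

# A student's mistaken force calculation now yields a physics-level hint:
try:
    physics_power(2, (1.0, 2.0, 3.0))
except TypeError as e:
    rephrased = str(e)

print(rephrased)
```

A modeling environment could apply the same wrapping to the handful of operations (powers, division, vector arithmetic) where these confusions cluster.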

• 10. Mark Guzdial  |  August 4, 2011 at 9:55 am

I do hope that you explore the CS education literature on how to create error messages that students can use and understand, like the SIGCSE 2011 Best Paper awardee: Guillaume Marceau, Kathi Fisler, and Shriram Krishnamurthi, “Measuring the Effectiveness of Error Messages Designed for Novice Programmers.” There’s a literature on how to do this right (which Java, unfortunately, ignored), dating back to PL/C.

• 15. The holy grail of conference talks | TOPLAP  |  February 25, 2013 at 9:59 am

[…] and interacting live with running computer programs. Live coding is also now recognised as best practice in Computer Science lectures by such scholars as the highly influential Prof. Mark Guzdial, so we […]
