Do we know how to teach secure programming to K-12 students and end-user programmers?

January 14, 2019 at 7:00 am 16 comments

I wrote my CACM Blog post this month on the terrific discussion that Shriram started in my recent post inspired by Annette Vee’s book (see original post here), “The ethical responsibilities of the student or end-user programmer.” I asked several others, besides the participants in the comment thread, about what responsibility they thought students and end-user programmers bore for their code.

One more issue to consider, which is more computing education-specific than the general issue in the CACM Blog. If we decided that K-12 students and end-user programmers need to know how to write secure programs, could we? Do we know how? We could tell students, “You’re responsible,” but that alone doesn’t do any good.

Simply teaching about security is unlikely to do much good. I wrote a blog post back in 2013 about the failings of financial literacy education (see post here) which is still useful to me when thinking about computing education. We can teach people not to make mistakes, or we can try to make it impossible to make mistakes. The latter tends to be more effective and cheaper than the former.

What would it take to get students to use best practices for writing secure programs and to test their programs for security vulnerabilities? In other words, how could you change the practice of K-12 student programmers and end-user programmers? This is a much harder problem than setting a learning objective like “Students should be able to sum all the elements in an array.” Security is a meta-learning objective. It’s about changing practice in all aspects of other learning objectives.

What would it take to get CS teachers to teach practices that improve security? Consider for example an idea generally accepted to be good practice: We could teach students to write and use unit tests. Will they when not required to? Will they write good unit tests and understand why they’re good? In most introductory courses for CS majors, students don’t write unit tests. That’s not because it’s not a good idea. It’s because we can’t convince all the CS teachers that it’s a good idea, so they don’t require it. How much harder will it be to teach K-12 CS teachers (or even science or mathematics teachers who might be integrating CS) to use unit tests — or to teach secure programming practices?
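For concreteness, here is the kind of unit test I mean, sketched in Python. The function and the tests are illustrative only, not taken from any particular curriculum — just the "sum the elements" learning objective from above, with the edge-case probing we'd hope students would learn to do:

```python
import unittest

def sum_elements(numbers):
    """Sum all the elements in a list -- the kind of learning objective
    mentioned above."""
    total = 0
    for n in numbers:
        total += n
    return total

class TestSumElements(unittest.TestCase):
    def test_typical_list(self):
        self.assertEqual(sum_elements([1, 2, 3]), 6)

    def test_empty_list(self):
        # A good suite probes edge cases, not just the happy path.
        self.assertEqual(sum_elements([]), 0)

    def test_negatives(self):
        self.assertEqual(sum_elements([-1, 1]), 0)
```

Run with `python -m unittest`. The hard part isn’t the mechanics; it’s getting students (and teachers) to write the empty-list test unprompted.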

I have often wondered: Why don’t introductory students use debuggers, or use visualization tools effectively (see Juha Sorva’s excellent dissertation for a description of how students use visualizers)? My hypothesis is that debuggers and visualizers presume that the user has an adequate mental model of the notional machine. The debugging options Step In or Step Over only make sense if you have some understanding of what a function or method call does. If you don’t, then those options are completely foreign to you. You don’t use something that you don’t understand, at least, not when your goal is to develop your understanding.
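The Step In / Step Over distinction can be made concrete with a tiny example (the functions here are mine, purely illustrative). Both options only mean something if the learner’s notional machine already includes the idea that a call transfers control into another body of code:

```python
def discount(price):
    # Applies a hypothetical 10% discount (illustrative only).
    return price * 0.9

def total(prices):
    result = 0.0
    for p in prices:
        result += discount(p)  # a debugger paused here offers a choice:
                               #   Step Over runs discount() as one opaque step
                               #   Step In descends into discount() line by line
    return result

print(total([10, 20]))  # → 27.0
```

A student with no model of what `discount(p)` does can’t tell those two choices apart, so the debugger’s interface is noise to them.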

Secure programming is similar. You can only write secure programs when you can envision alternative worlds where users type the wrong input, or are explicitly trying to break your program, or worse, are trying to do harm to your users (what security people sometimes call adversarial thinking). Most K-12 and end-user programmers are just trying to get their programs to work in a perfect world. They simply don’t have a model of the world where any of those other things can happen. Writing secure programs is a meta-objective, and I don’t think we know how to achieve it for programmers other than professional software developers.
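The gap between those two worlds is easy to show in code. Here is a small Python sketch (both functions are mine, purely illustrative): the first is the perfect-world program most novices write; the second is what the same task looks like once you imagine users who mistype, probe, or attack:

```python
# The "perfect world" version: assumes the user always types a number.
def read_age_naive(raw):
    return int(raw)  # raises ValueError on "ten", "", or hostile input

# The version written while imagining adversarial or mistaken users.
def read_age_defensive(raw):
    try:
        age = int(raw)
    except ValueError:
        return None  # non-numeric input: "ten", "", "42; DROP TABLE users"
    if not 0 <= age <= 150:
        return None  # numeric but implausible
    return age
```

Nothing in the second version is technically hard. What’s hard is getting a novice to believe the inputs in those comments will ever happen.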



16 Comments

  • 1. Bonnie  |  January 14, 2019 at 7:32 am

“Most K-12 and end-user programmers are just trying to get their programs to work in a perfect world.”
    One of our jobs as educators is to lead students away from this mindset. I don’t think you can teach security explicitly to small kids, but they should be taught to think about the ways in which their program might not work. This is a core concept of computing and if they aren’t learning it, they aren’t learning programming.
As for end user programmers, the problem there is simply terrifying. So much of their code ends up being used throughout companies. Not only does this present a big security risk, but it also presents risks in terms of correctness. I know people in the financial industry who worry about all of the spreadsheets set up by end users, with complex calculations and modeling. These spreadsheets propagate through companies and see real use, but have never been tested in any meaningful way.

    • 2. Mark Guzdial  |  January 14, 2019 at 9:22 am

      The premise of this blog post is to assume that we want to “lead students away from this mindset,” as you say. How? I’m not sure that we know how to do it successfully for CS major undergraduates. I’m pretty sure that we don’t know how to do it for K-12 and non-CS major undergraduates, and maybe not for adult professional end-user programmers either.

      • 3. orcmid  |  January 14, 2019 at 10:19 am

I’m concerned that we are using “secure programming” in an over-broad manner. There are many -ilities packed in here, and it may be better to unpack them, just as you have suggested for communities of practice and perhaps under the umbrella of suitability for purpose.

        I share your observation that we don’t know how to instill it in the practice of programming, not least of all at novice levels. Having attention on failure modes does not come easily.

        More thoughts.

One aspect is “defensive programming,” not unlike defensive driving. I think we have all noticed that it takes a while to learn how to reason about where boundaries should be enforced, what to do when there are failures, and how not to defend against the wrong things or in the wrong places. It takes considerable rationality, and experience seems to be the best teacher. It is also insufficient to see these as local actions — understanding context and setting and purposive use enter in.

Extending to the creation of dependable software, and being a trustworthy developer/producer, covers many factors, and should not depend on the notion of perfect programming. I hazard that some amount of disciplined “engineering” is an element here as well. Comfort with the idea that substantial software development is not a solo activity, and that working in the open matters, is also important to master.

    • 4. orcmid  |  January 17, 2019 at 12:03 pm

      1/2: “Software has been democratized along with all the other digital artefacts. As such we live in a world drowning in software of an unknown provenance and quality, and we need to solve for the problems of experience and education on the creation side so as not to continually re-learn software development and production lessons that have been realized over the past four or five decades, while also putting in place different curation systems in different communities as signalling mechanisms for end consumers. Teaching project users to wash their hands would be a good idea.”

A great article that is about far more than open-source development; it is about systemic and policy problems:

  • 5. Cody Henrichsen  |  January 14, 2019 at 10:34 am

At the high school and college level I think we can explicitly address this at a basic level by introducing robustness and design. Students already know/experience the “joy” of a program crashing due to an incorrect input or parameter, and if we extend how that concept leads to more secure development we can begin the process. This will not correct all errors for sure, but it can at least start the process. This will work for sure in at least AppInventor, Scratch, Java, Swift, and C++. I do not know about javascript and python land, because of the lack of types and because of my own lack of experience there.

    • 6. Mark Guzdial  |  January 14, 2019 at 10:37 am

      I’m setting a higher bar than “we can explicitly address this.” I’m asking can we do it successfully? It’s like Kathi Fisler’s work with the Rainfall Problem. Lots of people thought their students could beat the Rainfall Problem, but the students couldn’t do it. Can we show K-12 and non-CS students learning robustness and design?

      • 7. Bonnie  |  January 14, 2019 at 11:00 am

Since I see programming as essentially an engineering activity, yes, we should be teaching students to think about robustness and design. These are core concepts of computing. And I get the sense you are positing that non-CS majors are incapable of thinking about robustness and design. Surely you don’t mean that?

        • 8. Mark Guzdial  |  January 14, 2019 at 12:11 pm

          Nope — everybody’s capable of thinking about robustness and design. But everybody avoids thinking hard.

          I think about education like public health. Everyone is capable of stopping smoking, for example. But getting people to actually stop smoking is very hard.

          Everybody is capable of thinking about security. What gets them to actually do it? It’s hard enough to get CS majors to do it. Not clear how to get these other students to do it. “Just think about robustness and design” is as effective as saying “Just say no.”

  • 9. zamanskym  |  January 14, 2019 at 12:08 pm

    I remember many years ago when I had an epiphany. It was my last semester of college, late one night. I thought “oh crap, pretty soon I’m going to have to write code that actually works for real and doesn’t just have to be good enough to get past the grader!!!!”

I don’t see students buying in until / unless it’s real to them.

I introduce the kids to version control pretty early but truth be told, until not using the tool correctly bites them in the rear, or using it saves them, they don’t get religion.

Same with testing – probably more so, since testing can only confirm the presence of bugs, not their absence.

  • 10. David Young  |  January 14, 2019 at 1:12 pm

    orcmid is right, it’s necessary to unpack the term “security.” In the present discussion it seems that people use “security” to mean correctness, freedom from unintended consequences, or “safety” for some definition of that word. In other discussions, people write “security” meaning protection of privacy, or protection against impersonation or unauthorized access. What is really meant by the term?

    Security measures taken under one definition of security may work against security under another definition. Security measures may lead to unintended consequences including catastrophes. Attestations to the “correctness” of a program may lull us into a false sense of security. Getting definitions right is important.

    It may be a mistake to speak about security as an absolute property of a machine or a computer program. Rather, security may only exist in context. Isn’t it more sensible to speak about the security of persons or companies or communities than machines or programs?

  • 11. Raul Miller  |  January 14, 2019 at 3:13 pm

    I am just going to repeat or go over the points that everyone else has made, I guess:

    “Security” is several things – it has to do with problem prevention, but it also has to do with recovering from bad situations.

    (That said, “security” can also bring to mind stuff people see in entertainment media. But it’s maybe worth keeping in mind, there, that successful entertainment typically exaggerates some things while neglecting many other things?)

    Current computing security mechanisms include: passwords and encryption and variations like two factor authentication, but also include mechanisms like checksums and backups.
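The checksum idea in particular is easy to demonstrate at this level. A minimal Python sketch (the data and function name are mine, purely illustrative) shows how a stored digest detects that something changed, without saying anything about what changed:

```python
import hashlib

def checksum(data: bytes) -> str:
    """SHA-256 digest that can be compared against a stored value to
    detect corruption or tampering."""
    return hashlib.sha256(data).hexdigest()

original = b"lab report, final version"
stored_digest = checksum(original)

# A single changed byte produces a completely different digest.
tampered = b"lab report, final versioN"
assert checksum(tampered) != stored_digest
assert checksum(original) == stored_digest
```

That a one-character change scrambles the whole digest is exactly the kind of concrete surprise that might make the mechanism memorable to a student.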

    (At the k-12 level, I think some security awareness might “rub off on some students”. For example, they’ll have passwords which they might share and then maybe be embarrassed by the consequences of having shared them. So I guess, also, some part of security has to do with the quality of your friends, and your experiences with them. And that’s not all going to be good.

But educators should want a more reliable approach — if it’s your job to teach children, you’re not going to want to have as your only legacy the “lucky few” who happen to survive bad experiences and come out of that with a useful mindset.)

So security education at the k-12 level probably means games or similar activities, where students can try putting what they were taught into practice. (And so have some sort of practical background to help them understand the words they are hearing.)

But this very simple model of security education activities already gets into a different issue: While real security issues are relatively common, it’s relatively rare that any one person experiences those issues. Most (but not all) of the real problems that a student experiences are going to be problems they created themselves – coding mistakes but also glib understanding of the materials they’re studying. (This “self-created problems are most of the problems you see” issue also holds true for professionals.) Restated: there’s some double checking and recovery work that everyone working with computers deals with.

    With that in mind, my intuition says that focusing on how to isolate and recover from problems is probably where we need to be, including for k-12 people. If the “security approaches” we hand students are also useful for solving problems they encounter frequently, that should prepare them for similar issues in the future.

    Which also leads me back to a focus on debuggers and debugging, it seems…

  • 12. gasstationwithoutpumps  |  January 14, 2019 at 4:26 pm

    Many of the comments I wanted to make seem to have been made already: the confusion between “security” and “robust programming”, our inability to teach professional programmers these skills (as evidenced by the crap on the market) much less children, and the difficulty of getting children to learn even easy programming with perfect inputs.

    Perhaps, Mark, you should do another post on this topic with a less ambiguous and less ambitious goal, so that people can make constructive suggestions rather than just grumbling about impossible dreams.

  • 13. Kathi Fisler  |  January 15, 2019 at 6:43 am

    Let’s build on what others have said about secure/safe/robust programming being a mindset, and potential failure being a key (if not essential) motivator.

    K-12 students and novice end-user programmers likely aren’t creating programs large enough to have serious flaws (unless they are using languages or platforms that make such flaws particularly easy to create). So perhaps the idea of developing the mindset through their own programs is part of the problem.

    In part, thinking about secure/safe/robust through programs requires students to understand the notional machine and how a program might go wrong within that machine. But many of these errors arise from bad or subtle input data. Understanding input data doesn’t need a notional machine — it needs thinking about a problem space.

    I have long said that students can create testing samples for far more sophisticated problems than those they can program. You don’t have to know how to implement a sorting algorithm to come up with orders of numbers that might be easier or harder to sort, for example.
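This point can be made concrete with a small Python sketch (the function and the inputs are mine, not Kathi’s). A student can state what any correct sort must produce, and design probing inputs, without writing a sorting algorithm at all:

```python
from collections import Counter

def is_valid_sort(inp, out):
    """Two properties any correct sort must satisfy, stated without
    implementing a sorting algorithm."""
    ordered = all(a <= b for a, b in zip(out, out[1:]))
    permutation = Counter(inp) == Counter(out)  # same elements, same counts
    return ordered and permutation

# Inputs a student might design, each chosen to probe a different risk:
tricky_inputs = [
    [],              # empty
    [7],             # single element
    [1, 2, 3],       # already sorted
    [3, 2, 1],       # reverse sorted
    [2, 2, 2],       # all duplicates
    [5, -1, 5, 0],   # duplicates mixed with negatives
]

for xs in tricky_inputs:
    assert is_valid_sort(xs, sorted(xs))
```

Designing `tricky_inputs` is exactly the problem-space thinking described above, and none of it requires knowing how a sort works inside.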

    How often do we give students problems where the only goal is to come up with a set of tests that might trap broken code, or exercise various things that might go wrong? I do give these exercises, and it is clear that students need help to learn to think through a problem space. Rather than conflate this skill with learning to program, why not start by focusing on teaching students to think through problem spaces?

    Shriram’s group (and the broader How to Design Programs team) have long tried methods and created tools for assessing students’ test suites (most of my thinking on this derives from them). At Brown, Shriram’s student Jack Wrenn has been developing a tool called Examplar that runs student examples and tests against a suite of (instructor-provided) good and bad solutions, then grades the tests on how well they classify the two kinds of implementations. The Brown team is actively researching the impact of this approach (stay tuned).
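As a reader’s sketch of that grading idea (this is not Examplar’s actual code or API; the median functions and the scoring are illustrative): a test suite is judged good when every known-correct implementation passes it and every deliberately buggy one fails it.

```python
# Sketch of the Examplar idea: grade a student's test suite by how well it
# separates correct implementations ("wheats") from buggy ones ("chaffs").

def correct_median(xs):   # a "wheat"
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

def buggy_median(xs):     # a "chaff": forgets to sort first
    n = len(xs)
    return xs[n // 2] if n % 2 else (xs[n // 2 - 1] + xs[n // 2]) / 2

def grade_tests(tests, wheats, chaffs):
    """Return (wheats accepted, chaffs caught) for a suite of
    (input, expected) pairs."""
    def passes(impl):
        return all(impl(inp) == expected for inp, expected in tests)
    accepted_wheats = sum(passes(w) for w in wheats)
    caught_chaffs = sum(not passes(c) for c in chaffs)
    return accepted_wheats, caught_chaffs

student_tests = [([1, 3, 2], 2), ([4, 1], 2.5)]
print(grade_tests(student_tests, [correct_median], [buggy_median]))  # → (1, 1)
```

Here the unsorted input `[1, 3, 2]` is what catches the chaff; a suite of only already-sorted inputs would let it slip through, and would score lower.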

    Let’s think harder about developing appreciation of these issues independently from writing code. The topic might get more interesting (and relevant) that way.

    • 14. Mark Guzdial  |  January 15, 2019 at 9:17 am

      Hi Kathi,

I like the idea of getting students to think through problem spaces. You, Shriram, and Jack are doing this with undergrads studying CS at Brown, right? I know that you have test suites in other parts of Bootstrap and HtDP, e.g., I see them in Bootstrap Algebra and Data Science. But that’s different than thinking about an adversary, right?

Do you see Examplar as leading to an understanding of adversarial thinking? If so, do you see Examplar (or something along the same lines) working for K-12 students? For end-user programmers?


  • 15. orcmid  |  January 17, 2019 at 12:41 pm

I favor the comments by @Bonnie and @Kathi, especially about engineering. I fear that there is too much about programming practice as mastery of code, though. Frederick Brooks, in observing that there are no software-development silver bullets, remarked that we know how to write programs; the problem is in knowing what programs to write. This can be taken as an observation that one must consider the opportunity space and what it takes to be a trustworthy provider of solutions into that space.

    This just happened.

On January 5, I made a new mail-order prescription request at the BigRxCo web site. This is for a regular prescription, but using BigRxCo is favored by this year’s insurer, so I went there, set up an account, and requested the prescription, one I have regularly.

    On Wednesday, January 9, I received an email reporting that BigRxCo could not process my prescription order because my physician has not responded to their authorization request. I called the physician and learned that they process all of those on Fridays and the authorization request had been received.

On Monday, January 14, I went on-line to find my status. I was informed, on the BigRxCo web site, that my order had been cancelled because authorization had not been received. I did not act on that immediately — the prescription was not urgent. I also wanted to ensure that my credit card had not been charged.

On Wednesday, January 16, I received a BigRxCo email notification that my prescription was being filled, and was given an estimate of when the first 90-day supply would arrive.

Now, the fact that “cancelled” did not appear to mean “cancelled” is not a coding defect. Far from it. That it appears to be an erroneous statement is a complicated situation. It could even be a use-case and synchronization error. That is, receipt of the authorization might have been treated as a new event that spawned an order. I have no idea. There is no apparent way for me, as an end-user, to determine the state of affairs.

There are other aspects of my interactions with BigRxCo on-line that lead me to believe that there are further defects in the coordination between them and my BlueColor health-insurance organization. I think this is enough to reveal the state of affairs and, I trust, reveal that programming isn’t the problem. And I feel that my reliance on BigRxCo is not all that secure (i.e., trustworthy).

    Finally, here’s a time-worn case that not many may have relied upon and that is trickier to automate in a reliable manner (although there is software that comes close). And correct automation is not fool-proof, as the article points out. There is much more to think about than programming, including encouragement of a robust conceptual model for users.

    • 16. orcmid  |  January 18, 2019 at 5:24 pm

      More on trustworthy system development, because now this just happened.

      Two days ago, or thereabouts, the Octit fitness application updated automatically on my Windows 10 PC and my Windows 10 Phone. These presumably are UWP applications and have the same source code.

      I don’t think I had to log in to the update on the PC. If I did, it worked. On the Phone, my password was no good :).

      I re-installed the Phone app as part of replacing a 200GB micro-SD card, in case that was the problem. No, still no joy.

      I also did a blind password reset (using the Octit web site) to the existing password. Worked on the PC, not on the Phone.

Now, that password is 17 characters of alphanumerics. Having seen a similar problem before, I did another password reset to a newly-generated 15-character one. VOILA! Works on both PC and Phone.

      I have no idea what the difference is, and whose code it is in. As the consumer end-user, I don’t care. It does have me feel a bit insecure around how passwords are handled by the Octit folk, and that is why I use a password generator and have different passwords for all places where passwords are required.

      I don’t think this is about writing secure code, but it is a kind of chronic situation in the industry I love.

