Seatbelts & IT Security

Intrusion detection systems, penetration tests, dynamic firewall configuration, email screening, malware detectors, item-level access controls, antivirus software, NIST CSF, ISO 27000, &c: All of these are important, but none actually yield IT security unless users behave appropriately. Yet most users don't.

Unless we understand the roots of users' reluctance, we will be unable to change their behavior. IT will remain insecure. The challenge, I believe, is fundamentally organizational and educational rather than technological.

My college friend Andy refused to wear a seat belt. "In a crash," he'd say, "I want to be thrown clear." Andy was training to be an engineer, and usually respected data as a basis for decisions. Later, when I was teaching decision analysis, I realized that Andy believed as he did based on incomplete data: he was focusing on outcomes—and only some of those—without taking probabilities into account.

Being thrown clear in an accident while not wearing a seat belt is (a) unlikely, in that one will probably be thrown about inside the vehicle instead (one outcome Andy was omitting), and (b) dangerous, in that "thrown clear," when it happens, generally ends with impact against an unforgiving hard surface (another). Indeed, as Aaron Martin summarizes it, "...you actually have a 25 percent greater chance of being killed if thrown from the car." So Andy's preference for riding unbuckled was wrong.
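Andy's mistake, weighing outcomes without their probabilities, is the basic error that decision analysis guards against. Here is a minimal sketch of the expected-value calculation he skipped; the probabilities and harm scores are purely illustrative assumptions chosen to show the structure of the reasoning, not crash statistics:

```python
# Expected-harm comparison for a decision with uncertain outcomes.
# All probabilities and harm scores are illustrative assumptions,
# not real crash data; only the structure of the calculation matters.

def expected_harm(outcomes):
    """outcomes: list of (probability, harm) pairs; probabilities sum to 1."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    return sum(p * harm for p, harm in outcomes)

# Unbuckled: Andy considered only "thrown clear and unhurt",
# ignoring the other, likelier and worse, outcomes.
unbuckled = [
    (0.10, 2.0),   # thrown clear with minor harm (the case Andy imagined)
    (0.60, 6.0),   # thrown about inside the vehicle
    (0.30, 9.0),   # thrown clear into an unforgiving hard surface
]

buckled = [
    (0.90, 1.0),   # restrained, minor harm
    (0.10, 4.0),   # restrained, serious harm
]

print(expected_harm(unbuckled))  # weighs ALL outcomes by probability
print(expected_harm(buckled))
```

The point is not the particular numbers but the method: once every outcome is weighted by its probability, the "thrown clear" scenario Andy fixated on no longer dominates the comparison.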

Likewise, many users, given the option, turn off filtering—or, worse still, redirect their email to avoid it altogether. I'm confining myself here to "malware" filtering, which catches attachments or other code that intentionally causes compromise, corruption, or other damage, not to the simpler and less controversial "spam" filtering that traps harmless messages users simply prefer not to see.

Users worry that malware filters will catch legitimate messages. To avoid having the occasional message caught, they bypass those filters. Those users thereby expose themselves to malware. (I have in mind several administrators on different campuses who did precisely that, and caused no end of problems for themselves and others.)

It's perfectly true that malware filters can catch legitimate messages. I've had that happen to me. But eliminating or bypassing malware filters—unbuckling them to throw messages clear, if you will—is the wrong solution. Filters only rarely catch legitimate messages. Bypassing filters usually lets malware in. Sequestering suspect messages in quarantine, where users can review them and retrieve any caught incorrectly, is the right solution.
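The same expected-cost logic applies to the filter-versus-bypass choice: a rare, recoverable false positive against a likely, expensive compromise. The rates and costs below are illustrative assumptions I've invented for the sketch, not measured data:

```python
# Expected cost per flagged message under two policies.
# All rates and cost figures are illustrative assumptions, not measurements.

P_LEGIT = 0.02           # assumed fraction of flagged messages that are legitimate
COST_DELAY = 1.0         # cost of reviewing/retrieving a quarantined legitimate message
COST_COMPROMISE = 500.0  # cost of letting real malware through

def cost_quarantine():
    # Quarantine everything flagged: legitimate mail is briefly delayed
    # (and retrievable); malware is stopped, so it costs nothing.
    return P_LEGIT * COST_DELAY

def cost_bypass():
    # Bypass the filter: legitimate mail flows freely (no delay cost),
    # but every real piece of malware gets in.
    return (1 - P_LEGIT) * COST_COMPROMISE

print(cost_quarantine())
print(cost_bypass())
```

Under any plausible numbers, occasionally retrieving a quarantined legitimate message costs far less than the compromises that bypassing invites, which is why quarantine-with-review wins.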

But sequestration adds work, complexity, and delay. That is especially true for mobile users, since viewing quarantine websites on small screens is difficult at best, and more typically impossible.

Thus the Andy-like choice to turn off malware filtering. To prevent users from making such choices and risking problems, many institutions make malware filtering and suspect-message quarantine mandatory. Unfortunately, that can simply exacerbate the problem. To avoid having to look at their quarantined messages, users may switch from institutional to personal accounts, even for institutional business. That merely transfers the security problem. And it introduces new business problems and exposures.

My more recent friend Jon also refused to wear seat belts, but for a different reason. The choice whether to protect oneself, in Jon's essentially libertarian view, should concern no one else, let alone society, since the negative consequences of choosing wrong fall on the chooser.

But the negative consequences of unbuckled accidents—be they injury or death—do affect others. They impose emotional and (in the case of incapacitating injury) physical burdens. And the injured only pay part of their treatment cost: insurance and government spread much of the cost to others, via premiums or taxes. So although he may be right that society should not constrain choices whose consequences are purely individual, Jon's belief that this applies to seat belts is, like Andy's incomplete analysis, wrong.

Like Jon, though, many users assert that IT security is a matter for individual autonomy, not collective responsibility. They believe that when they ignore security recommendations, or bypass security measures, any security problems that arise only affect them. (Even so, they expect the central IT organization to fix the problems. But that's a topic for another day.)

Except for totally disconnected devices—now almost extinct, and therefore irrelevant—modern IT is by its very nature collective: its value depends on interconnection and society. Likewise, modern IT security is by its very nature collective: security risks arise because users and devices are interconnected. When users act autonomously rather than collectively, they multiply risks, foster their propagation, and create problems for the organizations and society in which they operate.

So we arrive at a dilemma familiar from myriad circumstances. It arises whenever

  • the general welfare depends on most individuals (either within an organization, or more generally) making certain choices,
  • those choices entail individual inconvenience or other localized costs, and
  • those choices produce little benefit—or at least little direct, immediate, or perceivable benefit—to the individuals making them.

That certainly describes most IT security measures: we complicate email, require frequent changes to complex passwords that are hard to remember, disallow using the same login credentials across different services, demand that users have cellphones or remember their favorite foods for two-factor authentication, make sharing difficult, and otherwise throw obstacles into the paths of users—most of whom, even without these security measures, will never have a serious security problem, and so perceive IT security to entail costs without benefits. Moreover, the benefit of IT security measures is that they prevent problems, something that, being an absence rather than a presence, is by definition a benefit users can never see.

We promulgate IT security rules and implement filtering and other technologies seeking to control a social problem, which is that malware and other security violations propagate rapidly across networks. Even though big problems from such violations are relatively rare, and these problems often are contained or resolved rapidly, users hear about them. Users who did as we require wonder why it was worthwhile, since they still hear about problems—and may even experience those problems, but not realize the problems were caused by others who didn't do as we required. This all leads users to distrust IT, to become skeptical of our efforts to control IT security, and to act like Andy and Jon.

As I've written elsewhere, the era of control is, in any case, over. My focus in that earlier essay was on IT leadership generally, and CIOship in particular, but the points apply equally well to IT security. "...In order to manage actions in areas they cannot control, IT leaders need to learn to use—in radically different ways—three important, known methods for exerting influence: conspiracy, bribery, and propaganda":

By conspiracy, I do not mean that IT leaders should become traitors skulking around dark hallways. Rather, they should look to management approaches based on teamwork and collaboration ...The key point here is that a centralized IT organization can work well in a collaborative environment only if all of those involved share a sense of goals, possibilities, and strategies and if they work together to maximize success....
By bribery, I do not mean that IT leaders should solicit or slip money under the table. Rather, they should configure technologies and services so that members of the campus community make choices that serve the general good. Among other things, IT leaders need to organize and price central information technology so that the free, voluntary choices of individuals align with the needs of the institution. This is partly a matter of how the central IT organization deploys technology, but it is equally a matter of full disclosure....
Finally, by propaganda I do not mean that IT leaders should focus on producing videos and posters extolling their good works. Rather, they must learn to inform and persuade rather than defend and dictate. As control gives way to persuasion, it is very important to be honest and clear about mistakes, about bets that didn’t pay off, about unsatisfactory service, and about outages.

To that last point, we need to be honest and clear about exactly why individual IT security choices are so important: because they maximize the general good, rather than because they benefit choosers individually. "Protect yourself" isn't the right argument. "Protect us" is.

I have no idea whether Andy and Jon still eschew seat belts. But I observe that most everyone buckles up these days, as a matter of reflex. They may not think about individual choice and social benefit when they do that, but neither (mostly) do they think of seat belts as a burden or an infringement on individual liberty.

That's what we need to achieve for our investments in IT security technology and mechanisms to work: users making the right choices reflexively, because we have both provided effective civic education to users and made the right choices easy.
