The data breach at the Office of Personnel Management that saw millions of sensitive personnel records stolen is a teaching moment for information assurance, but policymakers are cutting class.

The key lessons of the breach aren’t about people or technology but process: the U.S. government’s persistent unwillingness to prepare for failure and to integrate risk into its information security investments. These problems are not confined to one agency or department; breaches in the past six months at the White House and State Department alone speak to that. The disaster at OPM is symptomatic of a much more serious disease that has plagued the federal government through multiple administrations: failure to prepare for the inevitable breach.

Instead of securing data and building risk into how we design and maintain information systems, multiple government agencies and oversight bodies have dismissed both in favor of a one-size-fits-all checklist approach. Rather than looking for simple narratives of incompetence and personal failure, we should be pushing deeper to understand how an organization charged with administering the most sensitive personal information of federal employees could prioritize the availability of its services over the confidentiality of that data.

Organizations respond to standards: contractors are penalized (though not yet frequently enough) where they fail to meet well-defined contractual milestones, and civil servants have performance metrics integrated into annual evaluations. A sprint on weakened legs is not the solution to a poorly conceived and badly executed information security strategy. It’s easy to be hard, expending all of our energy vilifying a few leaders or speechifying about how we’re going to harden systems. More important and valuable would be improvements to the process: reforming federal information assurance standards to bring the assessment and use of risk into security decision-making, and preparing for failure by securing the data itself.

Perfect security is impossible, but significant improvements can be achieved and have already been demonstrated in the private sector. Around the same time as the OPM attack, a prominent password management service called LastPass was also breached. The company holds passwords and login credentials for websites and applications, allowing users to forgo memorizing an intimidating array of passwords in favor of one master key. A version of this master password is stored in the cloud after going through a hashing and salting process, in which the plain letters and numbers are converted into a pseudo-random series of bits via one-way mathematical operations. In mid-June, attackers made off with the hashed master passwords and email addresses of every LastPass user in a massive breach for the service.

LastPass was prepared to fail, however: the attack didn’t compromise individual website credentials, and customers were notified shortly after the breach’s discovery. The master passwords will be difficult to crack quickly because of the extended hashing process they undergo, meaning attackers must expend tremendous computational resources to open even a fraction of their loot. The service’s security architecture was designed to expect a breach and prepare for the worst by protecting data and segmenting systems.
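The slow, salted hashing described above can be sketched in a few lines. This is an illustrative example using Python’s standard-library PBKDF2, not LastPass’s actual implementation; the function name, salt size, and iteration count are assumptions chosen for the sketch.

```python
import hashlib
import os

def hash_master_password(password, salt=None, iterations=100_000):
    """Derive a slow, salted hash of a master password.

    PBKDF2 repeats the underlying hash `iterations` times, so every
    guess an attacker makes against a stolen hash costs the same
    repeated work -- this is what makes bulk cracking expensive.
    """
    if salt is None:
        salt = os.urandom(16)  # unique per user, stored alongside the hash
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

# Verification recomputes the digest with the stored salt and compares.
salt, stored = hash_master_password("correct horse battery staple")
_, attempt = hash_master_password("correct horse battery staple", salt)
assert attempt == stored
```

Because only the salt and digest are stored, a breach exposes neither the master password nor the site credentials it unlocks; the attacker is left brute-forcing guesses at full iteration cost.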

Federal organizations aren’t prepared for a breach, nor do they build with resilient engineering like LastPass’s. The existing information assurance process cannot countenance failure and doesn’t weigh all three elements of risk (threat, vulnerability, and consequence) in security decision-making. Beginning with the design and passage of the Federal Information Security Management Act (FISMA) in 2002, government information assurance has focused on a checklist of security controls to be applied to all information systems. Operating against the acquired wisdom of information security professionals, to “act as if” you’ve been breached and protect the data itself against theft, these standards presuppose that perfect security is possible so long as the proper controls are applied.

Even the successor to FISMA, a new information assurance program from the National Institute of Standards and Technology (NIST) called the Risk Management Framework, promises a “risk-oriented” approach while delivering much of the same checklist mindset. Little time and few specific standards are given to assessing the risk to an information system; most of the framework is instead devoted to selecting, implementing, and then monitoring new controls. Dr. Andy Ozment, Assistant Secretary for Cybersecurity and Communications at DHS, made the point in testimony to the House Homeland Security Committee that there are always more tools and technologies for organizations to purchase than constrained budgets allow. Using risk lets organizations prioritize investment in the systems under greatest threat of, or vulnerability to, attack.
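The prioritization logic behind that argument can be made concrete with a toy example. The systems, scores, and the multiplicative composite below are all invented for illustration; real risk assessments use far richer inputs than a 1–5 scale.

```python
# Hypothetical systems scored 1-5 on each of the three elements of risk.
systems = {
    "personnel-records-db": {"threat": 5, "vulnerability": 4, "consequence": 5},
    "public-web-portal":    {"threat": 4, "vulnerability": 2, "consequence": 2},
    "internal-wiki":        {"threat": 2, "vulnerability": 3, "consequence": 1},
}

def risk_score(s):
    # One common composite: risk = threat x vulnerability x consequence.
    return s["threat"] * s["vulnerability"] * s["consequence"]

# Rank systems so a constrained budget goes to the riskiest ones first.
ranked = sorted(systems, key=lambda name: risk_score(systems[name]), reverse=True)
```

A checklist treats all three systems identically; a risk ranking like this one directs scarce security dollars to the personnel database before the wiki.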

Emphasizing checklists also creates a posturing process within organizations. Where a control is missing or implemented improperly, there is a shortfall. An inspector general’s report can highlight any of these discrepancies, large or small, and will have done its job calling out “incompetence” in implementation without ever assessing the actual security of a system. A chief information officer (CIO), or any other office charged with implementing these checklist controls, can do so without ever really assuring the security of the organization’s information.

OPM’s failings are not just those of personality, and to focus on individuals while avoiding the larger organizational picture is to miss the breach’s most valuable lesson: information assurance in the federal government is on the wrong path, and the problems are not confined to OPM. The best outcome from the swirl of attention and rhetoric surrounding this breach would be a renewed focus on the information assurance process of every federal organization, especially those responsible for one of the sixteen critical infrastructure sectors, like power transmission or air-traffic control.

How we make these changes, by reforming the IT acquisition system, realigning responsibilities within organizations to treat security and uptime as co-equal missions, and improving standards to include risk and encourage more than checklist behavior, will determine whether we learn the lesson of this breach and are better prepared in the future.

Herr is a senior research associate at the Cyber Security and Policy Research Institute and PhD student in the political science department at George Washington University. His research focuses on national security policy, Internet governance, and the market for malware and cyber weapons.