In its initial response to the FTC, Wyndham rightly highlights the irrationality of being accused of improper custodianship of consumer data without ever having received specific instructions on the methodology, tools or best practices it should have followed to protect that data.
It's absurd for the FTC, a federal agency, to accuse public and private firms of improperly handling consumer data when a recent report from the Pentagon concludes that even our military is not prepared to defend against cyberattacks from abroad.
President Obama and former Secretary of State Hillary Clinton have openly acknowledged our worsening cyberattack vulnerabilities, going so far as to chastise China for using its military units to conduct sustained, undeniable probes and intrusions into networks within our national borders. How can a federal agency like the FTC rightfully blame non-governmental firms for non-compliance when our own federal computing infrastructure is the target of thousands of successful probes and attacks every day?
Still, Wyndham should have known better. Such basic vulnerabilities practically invite outside attack, making it easy for hackers to get inside the company's network. The fact that Wyndham apparently promised consumers PII protection further strengthens the FTC's case.
Basic information security auditing techniques applied at the policy, procedure and system levels would have identified the very vulnerabilities that were cited by the FTC; moreover, fundamental scanning of its network would have provided Wyndham management with a clearer picture of its overall vulnerability to attack.
Either preventative measure probably would have saved the firm significant litigation costs, costs that, sadly, many more firms are likely to incur as the oft-cited governance failure of ignoring information security at the highest levels of management continues unabated.
Why is the FTC targeting Wyndham and not the dozens of banks and financial institutions with sloppy cybersecurity defenses? After all, since September 2012, approximately 50 U.S. financial firms have been subjected to coordinated, timed distributed denial-of-service attacks, and we recently witnessed a first-of-its-kind global hacking of U.S. and Indian banks that resulted in a cash loss of more than $45 million. Where is the FTC on all this cybersecurity malfeasance?
In reality, there is no one 'best way' to defend against cyberattacks, because the path toward safe computing practices that could thwart attacks has been lengthy, evolving and unclear. In 1979, the National Bureau of Standards published a Federal Information Processing Standard providing guidelines for performing automatic data processing risk analysis. That standard has evolved since and is one of many used by non-military government agencies and by government contractors involved with computing technology.
In the mid-1980s, the National Bureau of Standards and the National Computer Security Center began offering their own respective approaches to safe computing; in the early 1990s, still more information security approaches were developed, yielding additional ways to protect digital data and a proliferation of divergent methodologies and standards layered atop one another.
There are now so many risk management frameworks competing on overall effectiveness and efficiency that no single standard dominates, though all have some degree of usefulness. On top of that, the Internet's communications protocols were designed without security in mind, affording an open transmission environment. All of this helps explain why the FTC cannot prescribe a common methodology for firms to follow.
It is premature to suggest that Congress add regulatory oversight of cybersecurity beyond the scope already covered by Gramm-Leach-Bliley, Dodd-Frank and the Cybersecurity Act of 2012, for example. Even meticulous compliance with these regulations would not mitigate firms' susceptibility to network intrusions, because vulnerabilities in the most commonly used software are discovered daily, outpaced only by the number of new attack vectors used to exploit them.
The recent announcement by the Department of Homeland Security of its intent to begin sharing the government's collection of zero-day vulnerabilities with selected private firms is a first, but tiny, step toward leveraging critically useful data. Congress ought to focus on how to get this highly relevant information disseminated to the very public and private institutions, and to the consumers, it is trying to protect.
Gabberty is a professor of information systems and management at Pace University in New York City. An alumnus of the Massachusetts Institute of Technology and New York University Polytechnic Institute, he has served as an expert witness in telecommunication and information security at the federal and state levels.