WannaCry ransomware attack illustrates need for evolution in cybersecurity norms

Individuals and institutions affected by the WannaCry ransomware attack face a stark choice: the malicious software (malware) encrypts a user's documents while the decryption keys remain in the hands of the cybercriminals. Victims of the attack can either pay the hackers for the release of their files, feeding the profit motive that generates attacks like this in the first place, or refuse to do so and permanently say goodbye to their data.

In this instance, the WannaCry attackers generally have not decrypted files even after victims sent payment, making it an easy choice not to pay. But as a public policy matter, the WannaCry attack raises an equally thorny set of challenges, implicating technical questions about how best to curtail the spread of this kind of malware, national security and intelligence policy, and corporate incentives to implement software updates. All of these challenges play out in a global context. In many ways, the WannaCry attack embodies the challenges and paradoxes of cybersecurity policy today and illustrates why sustainable solutions are difficult to achieve.

The WannaCry ransomware exploited a vulnerability in the file-sharing (SMB) service of older versions of Windows that allowed an infected computer to remotely compromise other network-connected computers. This worm-like propagation allowed the malware to infect hundreds of thousands of internet-connected computers in less than an hour after the initial infection. Worms of this kind were fairly common from 2008 to 2011 and often infected millions of computers.
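To see why self-propagating malware spreads so quickly, consider a minimal simulation, which models the dynamic rather than WannaCry itself: each infected host probes a handful of reachable peers per time step and compromises any that are unpatched. The host count, scan rate, and patched fraction below are illustrative assumptions, not measurements from the actual outbreak.

```python
import random

# Illustrative parameters only -- not measured WannaCry figures.
NUM_HOSTS = 50_000       # hosts on the simulated network
PATCHED_FRACTION = 0.3   # share of hosts already patched and therefore immune
SCANS_PER_STEP = 10      # peers each infected host probes per time step
STEPS = 15

random.seed(1)
patched = set(random.sample(range(NUM_HOSTS), int(NUM_HOSTS * PATCHED_FRACTION)))
start = next(h for h in range(NUM_HOSTS) if h not in patched)
infected = {start}       # a single unpatched host starts the outbreak

for step in range(1, STEPS + 1):
    newly_infected = set()
    for _ in infected:
        # Each infected host probes a few random peers; any unpatched,
        # not-yet-infected peer it reaches becomes infected.
        for target in random.sample(range(NUM_HOSTS), SCANS_PER_STEP):
            if target not in patched and target not in infected:
                newly_infected.add(target)
    if not newly_infected:
        break
    infected |= newly_infected
    print(f"step {step:2d}: {len(infected):6d} hosts infected")
```

Even with modest scan rates, the number of compromised hosts in a simulation like this grows geometrically until the pool of unpatched machines is exhausted, which is why a missed patch on a handful of systems can translate into an organization-wide infection within hours.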


The broad propagation of the malware was accelerated by the fact that many of the affected companies had not implemented patches that were widely available. This dynamic reflects two intertwined challenges. The first is the widespread failure to implement the fix that Microsoft released in March for this particular vulnerability. The company appears to have been notified of the vulnerability in January, presumably by the U.S. intelligence community, shortly after the exploits were obtained by a mysterious group called the Shadow Brokers, widely suspected to be linked to Russia. Microsoft issued a patch two months later, but for a variety of reasons a number of companies whose systems were affected did not implement it.

There could be any number of reasons that companies fail to take a basic measure like installing patches for known vulnerabilities. On one end of the spectrum they could have made a reasoned determination that implementing the patch would interfere with their ongoing operations. On the other end lies simple organizational laziness. But the broader point is that a significant number of vulnerabilities for which there is a fix go unpatched, leading to systemic weaknesses in the cybersecurity ecosystem. Getting individuals and companies around the world to fix vulnerabilities when there is a reasonably easy way to do so ought to be a priority.
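One modest, concrete step is for organizations to audit whether known fixes are actually installed. The sketch below, for example, shells out to PowerShell's Get-HotFix on a Windows machine and looks for update identifiers associated with Microsoft's March 2017 fix (MS17-010) on Windows 7 / Server 2008 R2. The specific KB numbers and the idea of running such a check across a fleet are illustrative assumptions; later cumulative rollups also contain the fix, so a real patch-management program would rely on dedicated tooling rather than a script like this.

```python
import subprocess

# KB identifiers commonly associated with the MS17-010 fix on
# Windows 7 / Server 2008 R2 (illustrative; confirm against Microsoft's
# security bulletin for the OS version actually being audited).
MS17_010_KBS = {"KB4012212", "KB4012215"}

def installed_hotfixes() -> set:
    """Return the set of hotfix IDs reported by PowerShell's Get-HotFix."""
    out = subprocess.run(
        ["powershell", "-Command",
         "Get-HotFix | Select-Object -ExpandProperty HotFixID"],
        capture_output=True, text=True, check=True,
    ).stdout
    return {line.strip() for line in out.splitlines() if line.strip()}

if __name__ == "__main__":
    if installed_hotfixes() & MS17_010_KBS:
        print("MS17-010 fix appears to be installed.")
    else:
        print("No MS17-010-related update found; the system may be vulnerable.")
```

Even a simple check like this, run against an inventory of hosts, can surface the gap between patches that have been released and patches that have actually been deployed.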

But determining a way for governments or industry leaders to make it so is a vexing challenge. So too is the second challenge: what to do about vulnerabilities in software that is widely used but no longer supported, whether because it is too old, the original manufacturer has gone out of business, or continued support has become unprofitable. Whatever the reason, the large amount of unsupported, vulnerable software still in use poses a very significant challenge for internet security.

Finally, the WannaCry episode poses a significant challenge for national security law and policy. It has been reported that the underlying exploit used in the ransomware attacks was developed by the U.S. intelligence community to access the networks and systems of adversaries and collect foreign intelligence. The exploit was then stolen from the intelligence community and later released publicly by the Shadow Brokers.

In order to accomplish their objectives as foreign intelligence services, agencies like the CIA and NSA require such technical capabilities. But in keeping secret the vulnerabilities on which their exploits rely, they deny software and hardware manufacturers the opportunity to develop patches for programs and systems that are used by the general public in addition to foreign adversaries. The government has established a process, known as the Vulnerabilities Equities Process, by which it evaluates when to retain important vulnerabilities and when to disclose them so they can be fixed. But episodes like this may force a reevaluation of that process, particularly in an era in which devastating leaks from the intelligence community are becoming increasingly prominent.

There are no simple solutions to any of these challenges. Only coordinated approaches to the different components of the cybersecurity ecosystem can achieve meaningful progress.


Zachary Goldman is the executive director of New York University's Center on Law and Security and an adjunct professor of law. Damon McCoy is an assistant professor of computer science and engineering at New York University's Tandon School of Engineering.


The views expressed by contributors are their own and are not the views of The Hill.