Digital security is one of the most pressing issues on the minds of both the private sector and the public sector. But many technology experts find the potential limits that stringent policies on data and encryption might place on innovation even more concerning than the data protection and privacy issues those policies mean to address. Simple features that feel familiar to us all, like searching for something on the web or in an app, are threatened not by good security but by oversecurity.

Security architecture should be freeing, allowing each of us to confidently create and use tools that make our lives simpler, better, and, in business, more accountable. But thoughtless, fearful overregulation, imposed without understanding the repercussions or the real needs at hand, makes this architecture the lock instead of the key.

In my own work in healthcare, I've had countless conversations with health plans whose policies are set to surpass even the toughest business rules because they operate under the old better-safe-than-sorry adage. But as policy itself starts to look this way, we need to consider what we sacrifice by playing it too safe.

Situated in the Medicaid sphere, where the public and private sectors collide over private health information, I'm intimately familiar with how fear can lead companies to make decisions that tie technologists' hands and keep them from offering the best tools possible. While the effects can be as simple (and awful) as keeping an app from reaching its full usability and usefulness, I worry we're headed down a path where government regulations and business rules will keep technologies from reaching their large-scale potential.

When trying to put this into concrete terms, I oftentimes imagine someone sending me a searchable, usable PDF of a patient profile or policy brief I might need. I'd scan for relevant terms with the handy search bar and make pertinent notes or highlights. Presumably, if I marked it up and shared it with anyone, the document would keep a record of my comments, noting that it was I who left them.

If I left that PDF somewhere my team could also access it, then we'd have the potential to collaborate towards a better outcome. Does that require a safe place to store it? Absolutely. But my team already has access to the safe place (they've been vetted, too), so the PDF should live freely inside our ecosystem, safely accessible only to us so that we might search it and use it as we wish.

Instead, information policy is moving toward regulation that would roughly equate to printing out that PDF (or saving it as an image) so that it is no longer searchable but instead a manual artifact.

In this scenario, out of security fears, my team has lost the right to interact with this document freely. It's been taken and stored in a locked filing cabinet. And, though they've each been issued a key, a guard stands in front of the filing cabinet and only allows access to documents and data requested by specific inquiry.

My team can't make holistic requests like "show me all patient profiles referencing diabetes," "I want all notes relating to behavioral health care coordination," or even "show me all of Amanda's notes."

They'd have to say, "show me Mr. Johnson," whom they know to have diabetes, and then repeat the request for every other person in those files known to have the disease, the equivalent of combing through physical forms in a physical cabinet simply to accumulate information. There is no way to tell what else in my notes or anyone else's is related to their inquiry, because the information is no longer freely searchable once it's in the locked cabinet.
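The gap between the two access models can be sketched in a few lines of Python. The records, names, and notes below are entirely hypothetical, invented only to illustrate the contrast between searching across everything at once and asking the guard for one known record at a time:

```python
# Hypothetical records for illustration only; no real patient data.
records = [
    {"name": "Mr. Johnson", "notes": "Type 2 diabetes; quarterly A1C checks."},
    {"name": "Ms. Rivera",  "notes": "Behavioral health care coordination."},
    {"name": "Mr. Okafor",  "notes": "Diabetes management and follow-up."},
]

def full_text_search(term):
    """The 'searchable PDF' model: one query surfaces every relevant record."""
    return [r["name"] for r in records if term.lower() in r["notes"].lower()]

def locked_cabinet_lookup(name):
    """The 'locked cabinet' model: you must already know exactly whom to ask."""
    return next((r for r in records if r["name"] == name), None)

# One holistic query finds both diabetes-related records in a single pass.
print(full_text_search("diabetes"))

# The guarded version returns one record per specific, named inquiry,
# and reveals nothing about which other records might be relevant.
print(locked_cabinet_lookup("Mr. Johnson"))
```

In the first model, one question answers "who else is relevant?"; in the second, that question simply cannot be asked.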

But it could have been.

If this example feels laborious, it should. Security is needlessly increased, and usability suffers. The search becomes inefficient; the technology no longer feels familiar; and the data has gone from usable metric to latent artifact. And for what?

This is what overregulating will do to the tech we are designing. We are finally starting to collect more data. With MITA, Meaningful Use, interoperability standards, and all the other signs of progress, we might finally be headed toward putting all that information to work to achieve something good. If we aren't careful, however, overreaching policies might keep us from innovating.

Technologists should be the ones leading these conversations, helping both the public and private sectors find an intelligent balance between security and usability. We must start a dialogue on the dangerous implications broadly devised policies can present before they turn into roadblocks that threaten innovation, before they become the lock instead of the key.

Havard is co-founder and chief innovations officer of Health: ELT.