Pretty Pricey Encryption: Why We Need to Stay Ahead of Hackers

“Apple cannot bypass your passcode and therefore cannot access this data”

— Apple (2015, Dujardin)

Hooray, Apple lovers, Tim Cook did it! Your data is safe, secure, beyond reproach.

“We are drifting to a place in this country where there will be zones that are beyond the reach of the law” — James Comey (2015, Dujardin)

Boo, James Comey! How dare you demand access to my personal data?

Agree? Disagree? Offended? Scared? You’re in for a treat.

Background

99 percent of new smartphones run either Google’s Android or Apple’s iOS (2017, The Verge). Both mobile operating systems can be configured to delete (erase, wipe) their data after too many incorrect passcode attempts. Enter the wrong passcode too many times and your stuff is gone, forever.

This auto-erase feature became an issue in the FBI v. Apple dispute following the 2015 San Bernardino attack. Should Apple be compelled to assist the FBI in breaking into the attacker’s locked iPhone? Should Apple value privacy above all else? Or should the FBI simply get its ten guesses, after which the data is gone for good?

Apple did not budge. The FBI found an alternative way of cracking the device. All good? Not really.

Since 2015, Apple has strengthened its security, so whichever access method the FBI used in the San Bernardino case has likely been patched. There is no guarantee that the FBI can access newer devices with auto-erase enabled.

That may be a problem. Law enforcement, even with a warrant in hand, may be unable to access information vital to solving crimes. What about crime victims? What about exonerating suspects? At the same time, granting law enforcement “backdoors” or “master keys”, or straight-up banning auto-erase, is not feasible either. What if the really bad guys (whoever they may be) get their hands on the master key? Build the backdoor, and they will come pouring through it.

There is no easy answer. There may not even be any answer. This issue ultimately boils down to a policy decision that involves a trade-off between opening Pandora’s box and building a box without keys. But is it really a binary decision, between privacy/streets-with-anarchy-and-crimes and surveillance/universal-access-no-freedom-no-privacy-no-nudies?

I refuse to believe so.

Pretty Pricey Encryption

In economic theory, signaling is a way to get around information asymmetry (1973, Spence). A sends a signal to B; B interprets the signal to mean something profound about A (e.g. her education level or her willingness to pay). A and B may then choose to transact, or not, based on the signal and other information. A’s signal is effective if it is:

  • somewhat costly for A to generate,
  • hard for anyone but A to fake, and
  • meaningful enough for B to gain additional information.

This explanation would probably never pass an introductory economics course (there are more requirements and nuances), but you get the basic gist of it. Signaling is awesome: it can help explain everything from animal colors to Harvard grads’ salaries to people buying Porsches. Do read the original paper.
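To make those three conditions concrete, here is a toy version of Spence’s job-market model in Python. All the numbers (the wage gap, the per-year cost of education for each worker type) are invented for illustration; they are not from the paper.

```python
# Toy version of Spence's (1973) job-market signaling model.
# A "signal" (years of education) separates worker types only when the
# high-productivity type finds it worth acquiring and the
# low-productivity type does not.

def signal_separates(wage_gap: float, cost_high: float,
                     cost_low: float, years: int) -> bool:
    high_signals = wage_gap >= cost_high * years  # worth it for the high type
    low_mimics = wage_gap >= cost_low * years     # also worth faking?
    return high_signals and not low_mimics

# Education costs the low-productivity type more effort per year,
# so the signal is hard to fake:
print(signal_separates(wage_gap=40_000, cost_high=8_000,
                       cost_low=15_000, years=4))  # True: signal works
# If faking is nearly as cheap as the real thing, the signal fails:
print(signal_separates(wage_gap=40_000, cost_high=8_000,
                       cost_low=9_000, years=4))   # False: everyone signals
```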

You lost me. What does any of this have to do with Apple and the FBI?

I propose a similar model for encryption, something that could be called “Pretty Pricey Encryption” (the acronym for “Pretty Expensive Encryption” did not test too well…)

Specifically, imagine a system with the following requirements:

a) Decrypting a device running the latest OS should take an estimated [n] days on some of the world’s fastest machines.

Want to break into a device? Great, bring along some capital, compute power, and time. Why? This makes breaking into devices costly, so law enforcement will be more deliberate in their requests. More importantly, it does not scale to mass surveillance, or to your average bad thug roaming the interwebs.

I do not know how to set [n], but if anything, it should be weeks rather than months, and days rather than hours.
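One way to read this requirement is as a work-factor calibration problem, the same knob that key-stretching functions like PBKDF2 already expose. Below is a minimal sketch using Python’s standard library, assuming a 6-digit passcode, a single machine, and 14 days purely as a stand-in for [n]; a real attacker’s hardware would be faster, so a real calibration target would have to account for that.

```python
import hashlib
import time

def calibrate_iterations(target_seconds_per_guess: float) -> int:
    """Estimate how many PBKDF2 iterations make one passcode guess
    take roughly `target_seconds_per_guess` on this machine."""
    probe = 200_000
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", b"000000", b"fixed-salt", probe)
    elapsed = time.perf_counter() - start
    return int(probe * target_seconds_per_guess / elapsed)

# A 6-digit passcode space is 10**6 guesses. To make a full sweep cost
# about 14 days of compute (an illustrative stand-in for [n]), each
# guess must take roughly 1.2 seconds:
per_guess = 14 * 86_400 / 10**6
print(calibrate_iterations(per_guess))
```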

b) As compute power advances, OS updates keep decryption times constant.

This ensures that mass surveillance will not scale in the future, either.
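Concretely, this could mean the vendor re-benchmarks current hardware at every release and scales the work factor by the measured speed-up. A sketch, with made-up throughput numbers:

```python
def rescale_iterations(old_iterations: int,
                       old_hashes_per_sec: float,
                       new_hashes_per_sec: float) -> int:
    """Hold the wall-clock cost of one guess constant: if hardware got
    k times faster since the last release, grow the work factor by k."""
    k = new_hashes_per_sec / old_hashes_per_sec
    return int(old_iterations * k)

# Benchmark throughput doubled since the last OS release, so the next
# release doubles the iteration count (illustrative numbers):
print(rescale_iterations(1_000_000,
                         old_hashes_per_sec=2.0e6,
                         new_hashes_per_sec=4.0e6))  # -> 2000000
```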

c) Decryption requires physical device access.

No remote decryption should be possible. This locks out hackers, botnets, and all those other malicious actors.
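Modern phones already do something in this spirit by entangling the passcode with a device-unique secret fused into the hardware (Apple’s Secure Enclave calls this the UID key). A minimal sketch of the idea, with the hardware secret simulated in software purely for illustration:

```python
import hashlib
import os

# Stand-in for a per-device secret fused into silicon at manufacture,
# readable only by the on-chip crypto engine and never exported.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str, iterations: int = 1_000_000) -> bytes:
    """Entangle the passcode with the device-unique secret. Without the
    physical chip that holds DEVICE_UID, even the correct passcode
    derives nothing, so guessing has to happen on the device itself."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               DEVICE_UID, iterations)

key = derive_key("123456")
```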

d) Decryption attempts are logged (and maybe even verified) to a distributed database.

Blockchain, anyone? Truth be told, I am unsure about this particular requirement; the fundamental motivation is to increase transparency and accountability of law enforcement agencies.
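The minimal version of the blockchain idea is a hash chain: every logged attempt commits to the hash of the previous entry, so tampering with any record breaks everything after it, and replicating the chain across independent parties is what would make it a distributed database. A sketch, with all field names and the example entry invented for illustration:

```python
import hashlib
import json
import time

class AttemptLog:
    """Tamper-evident log of decryption attempts. Each entry commits to
    the hash of the previous one, so deleting or altering a record
    invalidates every hash that follows."""

    def __init__(self):
        self.entries = []
        self.last_hash = "0" * 64  # genesis value

    def record(self, device_id: str, agency: str, warrant_id: str) -> str:
        entry = {"device": device_id, "agency": agency,
                 "warrant": warrant_id, "time": time.time(),
                 "prev": self.last_hash}
        self.last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)
        return self.last_hash

log = AttemptLog()
receipt = log.record("device-001", "FBI", "warrant-2016-0001")
```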


What’s next?

I am no expert in encryption. Or in hardware manufacturing. Or hacking. I am probably not an expert in any of the areas that should be involved.

But I do believe that such important decisions should not be made by a select few in the boardrooms of corporate America; they require intensive public debate. Technology companies have mastered marketing to consumers. Now it’s time to harness those same capabilities for an honest public discussion about the limits, if any, to privacy in the 21st century.

Or, in the words of James Comey:

“[…] I want to make sure we have a conversation in this country where we really … understand the public safety trade-off associated with that kind of privacy.”

What do you think?
