On December 2, 2015,
Syed Farook and his wife Tashfeen Malik killed 14 people and wounded 22 others
in a mass shooting in San Bernardino, California. Farook and Malik were killed after the
attack. The FBI is investigating the
case, and in the course of that investigation it recovered Farook’s iPhone.
They got a warrant to search the phone, but the problem is that the phone is
locked down so securely that the Feds can’t get in to see what’s inside. Presumably even the cyberwarriors at the NSA
can’t break into it. The FBI can’t get the
encryption key from Farook because he’s dead.
Apple doesn’t have it either.
That’s one of their selling points for the iPhone, that they as a
company take the security and privacy of their customers very seriously. They also have plausible deniability in that
they can’t surrender to the Feds that which they do not have. The FBI’s
solution to their problem is to force Apple to create a security backdoor to
compromise the phone’s security.
On February 16, 2016, Apple CEO Tim
Cook published an open letter, “A Message to Our Customers.”
He writes that smartphones (including the iPhone) are such an essential
part of our lives that compromising the security of an iPhone can ultimately
put its owner’s personal safety at risk. He
reaffirms his company’s dedication to protect the personal data of Apple
customers. He also states that Apple has
done everything within their power within the law to help the FBI in this
case. He trusts the FBI’s intentions are
good. But he draws the line at creating
something that Apple considers “too dangerous to create.” He disagrees with the government’s assertion
that the tool the FBI wants them to create would be used “only once.” He equates a tool to break iPhone encryption
to a “master key” capable of “opening hundreds of millions of locks from
restaurants and banks to stores and homes.” Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge. Tim Cook doesn’t want to be a party to enabling the government to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
I side with Apple on
this score, and here’s why. I have a bit
of a libertarian streak – “the government that governs best governs least.” While the government is essential to provide necessary
things [promote the general welfare, protect the public health, etc.], I find it
hard to believe that with all the resources at the government’s disposal that
it cannot solve this problem on its own.
I opposed Obamacare because I had a problem with the government telling
me that I had to buy health insurance. I
can and do buy health insurance without the government compelling me to do so
because it is the prudent thing to do.
Where am I going with this? If I
don’t like the government telling me I have to buy something, I also don’t like
the government telling a private business that they must build something, especially since this “something” is for government
use. Now, I
don’t usually watch Fox News, but when I do I watch the panel discussion in the
latter half of Bret Baier’s show. Noted
conservatives George Will and Charles Krauthammer have both gone on record on
this program to say they support the FBI’s position. Neither of these guys hesitates to state,
whenever it suits them, that government needs to stay out of people’s
business. But when it comes to
compelling a company to build something, how do they square that with their
conservative beliefs? At least
Krauthammer had a unique idea – Apple should welcome the free advertising that
their product is so secure even the government can’t hack into it.
While I support Apple
in this matter, I do have a problem with one part of Tim Cook’s argument. In his open letter he states the following:
“The government is
asking Apple to hack our own users and undermine decades of security
advancements that protect our customers — including tens of millions of
American citizens — from sophisticated hackers and cybercriminals. The same
engineers who built strong encryption into the iPhone to protect our users
would, ironically, be ordered to weaken those protections and make our users
less safe.”
The government wants
access to a single phone, not many. If
Apple doesn’t want this capability to fall into the hands of “sophisticated
hackers and cybercriminals,” then be responsible and don’t make it available to
them. Keep the capability as “proprietary”
and don’t share it. If you can’t do
that, destroy the capability after it’s used once. My feeble mind tells me this is possible, but
I could be wrong. Regardless, Apple should not be compelled to do something that I believe the government can do on its own. I find it hard to believe that the collective ten-pound brains at DARPA and the NSA can’t solve this problem without Apple’s help.
While it is true that
corporations are not people, it is also true that corporations cannot function
without people. And people are a funny
thing. Tim Cook is right to be wary of
people. People have feelings – they can
get pissed off for whatever reason and decide to share trade secrets to get
back at their employers. People have
beliefs – one man’s whistleblower is another man’s traitor [think Edward
Snowden]. All it takes is one guy with a
certain belief system [rightly or wrongly] who believes that too much power is
concentrated in too few hands, and who takes exception to being
party to creating a tool that will enable the government to snoop virtually
anywhere. Having taken that exception,
that individual might have no qualms about making such a capability known to
the general public. If that happens, we
all lose.