Mimesis Law
7 December 2019

The People v. Apple: Smartphone Data Must Be Subjected To Lawful Searches

Feb. 19, 2016 (Mimesis Law) — As it turned out, the 30 million people watching Geraldo Rivera open Al Capone’s vault were deeply disappointed; Geraldo found nothing of value or interest inside. But the reason it got so many eyeballs was the possibility that inside could be evidence of any number of Capone’s notorious crimes. The show encouraged viewers to let their imagination run wild. Maybe there was money inside, weapons, or even a body. The only way to find out was to open it and look inside, on live television.

If Apple had been in charge of the vault, no one—not even Geraldo—would have been allowed to look inside. Apple is currently resisting a court order related to an encrypted iPhone used by one of the San Bernardino killers. If you don’t remember, this was the couple that killed 14 people, mostly co-workers, at a Christmas party.

There’s really no doubt that these two were indeed the killers, and there is additional evidence that they were inspired by a terror group to carry out the attack. You would want neither one as spokesperson for the wrongfully accused. So, it might appear curious to some that Apple would be uncooperative in this case at all. The answer here is the same as to many of life’s other questions—money.

Thanks to the revelations of Edward Snowden, we learned that the NSA was pulling data off the servers of nine technology companies. Certainly, the revelation was embarrassing for tech companies, but it was losing in excess of $35 billion in revenue that really mattered to them. If cooperating with the U.S. government affects your bottom line that much, you can see why some companies like Apple have suddenly found a renewed concern for their customers:

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

So, you see, it’s all about the hurt that “well-meaning and law-abiding” citizens would suffer. Though maybe Apple deserves some credit now, if only a little. On the other hand, the federal agencies, whose conduct soured governmental relationships with tech companies, and the politicians who carelessly enabled those agencies, would do well to remember the proverb, ‘sow the wind, reap the whirlwind.’

While it’s ok to partake in schadenfreude over Apple’s refusal, it raises a host of questions. As Scott Greenfield points out, cheering for Apple is something like voting for Kang over Kodos:

But if Apple, Google, Facebook prevail, and prove themselves mightier than our government, any government, then their CEOs become our new Overlords, omnipotent kings who cannot be stopped or controlled.  At the moment, they seem like benevolent kings, standing up for something with which we agree. But did you get to vote on Tim Cook ascending to the throne?

Of course, even if Apple beats back the feds, there is nothing to stop the initiation of yet another secret program with colorable legality to avoid the problem of non-cooperation in the future. Federal law enforcement has the financial means to solve most of its problems by throwing money at them. The King is dead; long live the King. But thwarting search warrants at the local level is something potentially more troubling.

In theory, there is no activity so private, so secret that a judge is prohibited from authorizing law enforcement to observe it, listen to it, or seize items related to it. Our personal privacy is not absolute, but neither is the bad guy’s. In particular, wrongdoers usually aim to carry on in secret, making law enforcement intrusion often necessary to solve cases and stop future crimes. The hope is that we lose a little privacy at the margins but gain much greater security in return.

Consider the hypothetical case that, upon a showing of probable cause, law enforcement is issued a search warrant for the contents of Al Capone’s safe. But Apple had built the strongest lock and safe in the world, which is the same one Capone used. Indeed, this safe is so secure that if someone unsuccessfully tries to open the safe too many times, the contents will be destroyed. To execute the search warrant, law enforcement needs Apple to turn off the Mission Impossible-style self-destruct; they will do the actual safecracking.

That’s basically what Apple is being ordered to do regarding the phone:

In other words, the order does not tell Apple to crack the encryption when Apple does not have the key. Rather, it is asking Apple to turn off a specific feature so that the FBI can try to brute force the key — and we can still argue over whether or not it’s appropriate to force Apple to disable a key feature that is designed to protect someone’s privacy. It also raises questions about whether or not Apple can just turn off that feature or if it will have to do development work to obey the court’s order.

It is worth noting that it is possible that even with Apple’s assistance, the FBI may not be able to break the encryption without the key. And computer security legend John McAfee is pessimistic about the FBI’s chances, though sanguine about his own. After all, the point of strong encryption is to resist such code-breaking efforts. Although the debate about whether the feds should be granted an encryption backdoor is ongoing, that consideration is beside the point here.
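It is worth pausing on the arithmetic here. A back-of-the-envelope sketch (the guess rates below are illustrative assumptions, not measured figures) shows why disabling the retry limit makes a short PIN trivially searchable while the underlying 256-bit key remains out of reach:

```python
# Rough comparison: a 4-digit PIN space vs. a 256-bit AES key space.
# The guesses-per-second figures are illustrative assumptions.

pin_space = 10 ** 4            # every possible 4-digit passcode
key_space = 2 ** 256           # every possible 256-bit key

pin_guesses_per_sec = 12.5     # assume roughly 80 ms of key derivation per try
pin_hours = pin_space / pin_guesses_per_sec / 3600
print(f"Exhausting the PIN space: about {pin_hours:.2f} hours")

# Even granting an attacker a fanciful 10^12 guesses per second,
# the raw key space is hopeless without the key:
key_years = key_space / 1e12 / (3600 * 24 * 365)
print(f"Exhausting the key space: about {key_years:.1e} years")
```

That asymmetry is the whole fight: the PIN itself is weak, so the wipe-after-ten-failures feature is what actually protects the data.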

Local law enforcement does not have the financial and technological resources of federal law enforcement agencies. Unlike the feds, local law enforcement would often be thwarted without the ability to fully execute a search warrant for electronic data stored on smartphones or computer-type devices. Generally, your local law enforcement is running down cases like gang-affiliated drug operations or credit card skimming rings. With limited resources, things like 24-hour surveillance, wiretaps, and undercover investigators cannot be used as work-arounds for search warrants. They would break budgets before they break cases.

Apple’s concern for the slippery slope and its bottom line is understandable, and weakening security could lead to less privacy and more crimes like identity theft. But uncooperative technology companies will have real-world impacts on local communities. It’s not just terrorists who use cell phones. The ability to lawfully get data off smartphones is an important tool for law enforcement in more and more cases.

While companies like Apple stridently refusing to assist with search warrants will not, by itself, cause a great spike in the crime rate, it will make it more difficult to arrest and then prosecute moderately sophisticated criminals. This means their criminal enterprises will go on longer and cause more damage before they slip up, much as Capone was finally nabbed only on tax evasion.

People certainly have privacy interests worth protecting, but people also have an interest in search warrants being executed against criminals.

Main image via Flickr/GillyBerlin.

14 Comments on this post.


  • rabidliberal
    19 February 2016 at 9:32 am - Reply

    couldn’t they (the FBI) make a clone of the drive/memory on the iPhone, then work off the clone and try to brute force it open?

    • Christopher Best
      19 February 2016 at 1:56 pm - Reply

      No, the encryption key is necessary to retrieve a clone of the data. The encryption is implemented at the hardware level.

      • sirnephilim
        20 February 2016 at 12:01 am - Reply

        To clarify: The PIN is used to create a key that unlocks a more complex key that unlocks the actual device storage. The first key is derived from a combination of the PIN, a code unique to the device’s hardware, and a one-time randomly generated seed that is created along with the PIN. The upshot is, it’s damned near impossible to crack the device storage encryption without the PIN key.

        Fail 10 times and the device’s storage chips start wiping themselves very quickly. Solid state chips are not as easy to recover as old school magnetic hard drives, even if not encrypted.

        Depending on how the wipe process is implemented, it might be possible to remove the storage chips from the phone and attempt a brute-force crack on the hardware level encryption, but this would be many orders of magnitude more difficult than guessing a PIN. (It would take the fastest computer system in the world a few centuries or more to break the code.) And since the PIN is tied to hardware and a random seed, putting the chips in another iPhone would not work to help guess the PIN.

        Any attack meant to circumvent the PIN would be very difficult, to the point that Apple may not actually be able to do it even if compelled. If I were to try it, I’d suggest something that would intercept and bypass calls to system memory, forcing the device to ignore wipe commands or forget how many attempts have been made. This could also be implemented at the software level if you could somehow inject code into the phone before the device is unlocked.

        Honestly, I’d be talking to the people who make the jailbreaks for iPhones; they have the most experience bypassing the device security at any level.
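The derivation described in the comment above can be sketched in ordinary code. Everything here, including the KDF choice, the iteration count, and the variable names, is an illustrative assumption; Apple’s real scheme runs inside dedicated hardware and differs in detail:

```python
import hashlib
import os

# Stand-ins for secrets that, on a real iPhone, never leave the hardware.
DEVICE_UID = os.urandom(32)   # per-device unique hardware value
SALT = os.urandom(16)         # one-time random seed created alongside the PIN

def pin_to_unlock_key(pin: str) -> bytes:
    """Derive the key that unwraps the actual storage key from the PIN,
    the device-unique value, and the random seed."""
    material = pin.encode() + DEVICE_UID
    # The deliberate cost of the KDF is part of what rate-limits guessing.
    return hashlib.pbkdf2_hmac("sha256", material, SALT, 100_000)

# Deterministic on one device; but because DEVICE_UID never leaves the
# phone, the same PIN on different hardware yields a different key,
# which is why moving the storage chips to another iPhone doesn't help.
k1 = pin_to_unlock_key("1234")
assert k1 == pin_to_unlock_key("1234")
assert pin_to_unlock_key("0000") != k1
```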

  • Laura
    19 February 2016 at 10:07 am - Reply

    You seem to have a fundamental misunderstanding of what Apple is being asked to do. Let’s use your Geraldo analogy, shall we? Geraldo used brute force to get into that safe. That is, he used a big saw (some sort of torch, if I remember correctly) to break into the safe. He didn’t have a combination or a key or whatever locking mechanism there was, so he broke in. Remember that for a minute.

    People put a lot of stuff on their iPhones: pictures of their kids and dogs, their music, apps that connect to social media, apps that connect to their banks, apps that automatically charge their credit cards, maybe even some pictures of themselves in states of undress. These are things that none of us want criminals to get their hands on. So Apple has built safeties into the iPhone. That is their job: build a product that will keep our data safe from criminals. One of those safeties is a security feature wherein the data is cleared after 10 incorrect password guesses, and there is no way around it. This keeps my data safe if I ever leave my phone on a bus or have it stolen outright.

    Let’s go back to Geraldo now. Suppose he was not just a reporter with a sensationalist story, but rather an officer of the law with a valid warrant. He would therefore have the right to saw into that safe. The safe manufacturer would probably even help him do it. We know Apple has done things like that in the past.

    This case is different, though. The saw capable of breaking into the safe doesn’t exist. The FBI’s request is akin to asking the safe manufacturer to develop a saw capable of cutting into their safes. Once that saw exists, every single safe that used to be impenetrable no longer is. The technology exists, and it *will* fall into the hands of criminals. That is NOT good for national security. Every soldier on active duty, Secretaries of State, Presidents, CIA operatives… all of their data has now been compromised as well. If that “saw” gets into the hands of a terrorist, they can do a LOT of damage; and whatever those two jerks in San Bernardino had on their phones will not make up for that.

    Apple is doing the right thing but our government is not technologically knowledgeable enough to see it.

    • Wrongway
      21 February 2016 at 11:00 am - Reply

      I never thought using the “Geraldo vault escapade” as an analogy would ever help me understand anything quite so clearly.. 😛

      thanx for that..

      • shg
        22 February 2016 at 7:58 am - Reply

        But for the fact that the premise was wrong, as the difference of opinion isn’t due to a misunderstanding, but a difference of priorities that clearly eluded Laura, this rather simple explanation might have been more useful.

        It’s not that her analogy to Geraldo was bad, but that she was fighting the wrong battle with regard to this post. People, like Andrew, are allowed to have different priorities. It doesn’t make them stupid.

  • Raccoon Strait
    19 February 2016 at 11:36 am - Reply

    “In theory, there is no activity so private, so secret that a judge is prohibited from authorizing law enforcement from observing it, listening to it, or seizing items related to it.”

    That theory leaves out one’s thoughts. Or does it include them, allowing the government to pursue ‘any means’ to extract those thoughts from someone’s head? Where is that enumerated as a government power? What about our dreams?

    If I were a fiction writer who thinks up the most dastardly scenarios in order to sell my fiction, is that to be considered training material for bad guys and subject to government approval, instead of just fiction? If that same fiction writer were to put some of those thoughts onto an encrypted device, does that suddenly make those thoughts rightfully available to the government, at the government’s will, without some illegal behavior to instigate that search? Is jaywalking sufficient illegal behavior to instigate that search?

    Who knew? Thought crimes are now illegal, and Congress never had to pass a bill or amend the Constitution.

    Or, there are activities so private, so secret that no judge may authorize law enforcement (or anybody else) to access them in any way without the thinker’s or actor’s permission. What other private activities exist? Should the government have audio and video capability in your bedroom?

    I still cannot figure out what the FBI hopes to learn from gaining access to this phone. They have all the metadata; they know who called whom, when and for how long, and from and to where. The owner is dead, so there is no court case. Is there any actual benefit to law enforcement other than getting that backdoor made? Is that the only reason it is so important?

    Do they think there is information about another such attack? Why not go after some of those contacts they already know about? They had methodologies prior to the invention of cell phones. Are those forgotten? Do they need to access some retired agents for some institutional knowledge? Encryption has been around for several centuries. Why is it suddenly the main thing standing in law enforcement’s way?

  • Christopher Best
    19 February 2016 at 1:55 pm - Reply

    “It’s not just terrorists who use cell phones.”

    Yeah, I do, and I don’t want a reasonably competent pickpocket getting my financial details because the FBI forced someone to break the encryption.

    Once encryption is broken, it’s broken forever. That’s how it works.

  • Jack
    19 February 2016 at 2:48 pm - Reply

    Let’s say I keep a hand-written ledger book of my worldwide criminal operation, but I encrypt it manually – just like people have been doing for centuries. I use a key long enough to put the time to brute-force the decryption on par with how long it would take to brute-force the full combined hardware/software key on an iPhone without Apple’s help.

    The police come to arrest me, but I am killed in the process and the police recover my ledger. They know it may lead to solving dozens of murders and wrapping up hundreds of drug cases. This is fundamentally the same issue as an encrypted phone, computer, etc. with a deceased owner that can’t be held in contempt for not divulging the key.

    Should law enforcement be allowed to compel Moleskine or Bic – the companies who facilitated my encryption – to form some special tool to recover the data? No – they would have to put cryptologists on it and break the encryption themselves, if possible. Apple’s phones are the exact same thing…

    This is only ever going to come down to “is encryption legal” – and in order to make it illegal, it’ll have to get past the first amendment. Forcing a company to put a backdoor on their encryption is literally telling a company “no, you aren’t allowed to use that math!”.
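Jack’s hand-cipher hypothetical is, taken to its limit, a one-time pad: with a truly random key as long as the text, brute force is not merely slow but uninformative, since every candidate key produces some plaintext. A minimal sketch (the ledger line is invented purely for illustration):

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad style XOR; encrypting and decrypting are the same op."""
    return bytes(d ^ k for d, k in zip(data, key))

ledger = b"paid $500 to the dockmaster on Tuesday"
key = os.urandom(len(ledger))        # truly random, as long as the message

ciphertext = xor_cipher(ledger, key)
assert xor_cipher(ciphertext, key) == ledger   # the right key round-trips

# A wrong key still "decrypts" to something; nothing about the
# ciphertext reveals which candidate plaintext was the real one.
wrong_key = bytes([key[0] ^ 1]) + key[1:]
assert xor_cipher(ciphertext, wrong_key) != ledger
```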

  • Tommy Gilley
    19 February 2016 at 4:30 pm - Reply

    Apple has given power back to the people, not to Apple. Apple is fighting for the principle that some data is mine and the government can f off.

  • Peter
    20 February 2016 at 3:58 pm - Reply

    The FBI’s request is akin to asking paper shredder makers to provide a way to reconstruct shredded documents. While doing so would be technically possible–micro printing a pattern on each strip, say–it is probably not something most manufacturers would have the R&D money to develop.

    I suspect the real goal here is to eliminate whole-device encryption, something law enforcement has been pining for. Only a company the size of Apple is going to be able to both provide a backdoor and still maintain reasonable security. Even they might not be able to – any workaround to needing the encryption key is going to create a major weakness*. Do you think a company like Blu will? Other makers will either not bother or, worse, provide fake security that doesn’t actually work.

    * In this case, as I understand it, they just want to be able to try the PIN an infinite number of times, not have a backdoor. But what if this were a fingerprint-protected device? Then presumably a backdoor would be the only option.

  • Dawn
    23 February 2016 at 9:53 am - Reply

    I wonder what would happen if Apple relented and agreed to the FBI’s demands but the actual workers, the actual people attached to the human hands necessary to accomplish the goal, refused. I assume Mr. Cook is incapable of producing the desired “key” on his own. It would be necessary for Apple employees to do the actual manufacturing. What if they refused? As in, NO-I’m not loading this board into the machine to solder the pieces on; NO-I’m not writing a code to run this; NO NO and NO. What then? Can an individual be compelled to perform actual labor?

  • Uncle Sam Can Compel Apple — Whether You Like It Or Not
    24 February 2016 at 9:06 am - Reply

    […] 24, 2016 (Mimesis Law) — In response to the claim that Apple ought to help the FBI, many commenters argued that government should not be allowed to compel a private citizen into […]

  • 2d Circuit Uses The Pimp Hand To Uphold Searching Cellphone GPS Data
    6 December 2016 at 9:08 am - Reply

    […] about cellphones. In a nutshell, they are pretty close to being court reporters that can provide overwhelming evidence of criminal conduct, if you’re in to doing those sorts of things. Or possibly even if you think you’re not […]