Tech And Terrorism Ethics: Apple Is Right. The Government Is Wrong.


If, in some future nightmare scenario come true, the FBI needs to break the encryption on a private iPhone to find the secret code to defuse the Doomsday Machine President Donald Trump set up after his mind finally snapped and he became convinced he was the Stay Puft Marshmallow Man, I assume that Apple won’t stand on principle and will do what needs to be done to save the world. The current dilemma, however, is not that dire.

Although President Obama announced last year that he had decided not to pursue legislation requiring tech companies to give law enforcement access to users’ encrypted data, he proved once again that if you don’t like Obama’s promises, just wait a minute. For last week, the FBI persuaded a judge to order Apple to create software that would help federal investigators crack into the iPhone 5C that terrorist Syed Rizwan Farook was using before he and his wife slaughtered guests at his company Christmas party in San Bernardino last December. Apple has vowed to defy the order.

Good.

The government doesn’t have the right to force private companies to do its bidding, creating new vulnerabilities in their own products and harming both the customers who depend on them and the companies’ reputations for data security. As usual, this administration can’t be bothered to do things the way the Constitution directs, and have laws passed that acknowledge new needs in an era of sophisticated technology and terrorist threats—or not. No, it follows the model championed by this President, and prefers to govern by edict and strong-arm tactics. (Do you think the judge who issued that warrant had any idea what the real security issues are, much less the technology involved?) Fortunately, Apple has the resources and motivation to stop its rights and ours from being steamrolled, and a terrible precedent from being established.

It would be a terrible precedent. This is not the existential exception to the rule, where the government can trample rights because the alternative is mushroom clouds everywhere as Vera Lynn sings “We’ll meet again.” As Techdirt has been explaining repeatedly,

The FBI absolutely does not need to know what’s on that phone. It might not even care very much about what’s on that phone…. there’s almost certainly nothing of interest on the phone…. Farook destroyed his and his wife’s personal phones, indicating that if there were anything truly important, he would have destroyed the last phone too. Also: the FBI already has massive amounts of data, all of which indicates that Farook and Malik were not in contact with a foreign terrorist organization, nor were they in contact with any other unknown terrorists. Even if, despite all evidence to the contrary, Farook and Malik were somehow in invisible, traceless contact with an ISIS handler, that handler would not have revealed information about other cells, because that would violate the most basic tenet of security — need to know. Other information, including things like who they were in contact with, could be obtained from other sources — either service providers for metadata or the phones of those they were in contact with.

I’m sure the FBI would like to know what’s on that phone, but liking and needing are materially different. This is a utilitarian ethics problem, and the issue is balancing. Is doing permanent damage to the rights of corporations and individuals ethically justified by the circumstances? No. Indeed, of course no. Our government thinks it is, because our current government has insufficient respect for the Constitution, the limits of government, and individual rights.

Techdirt, which has been wonderfully clear on this issue, also did an excellent job eviscerating the responses of every one of the Presidential candidates, pointing out that this is not a situation where there is some ideal middle ground: either the government can order a company to destroy its own product’s security, or it can’t.

“If you think there’s a ‘middle ground,’ you don’t understand the issue,” Techdirt notes. “The thing that they don’t get is that the ‘nerd problem’ here is: how can you make a security vulnerability that can only be used by the good guys? That’s impossible. Creating a security vulnerability opens things up to the bad guys. Period. And, of course, neither of [the candidates’] answers tackles the actual issue at stake, which is to what level the US government can force a company to hack its own customers and undermine its own systems’ security. They’re really answering a different question, because either they don’t understand the issue or they don’t actually want to be pinned down on it.”

Oh, I think it’s fair to say that they don’t understand the question. Especially those tech whizzes, Hillary “Like with a cloth?” Clinton and Bernie Sanders. You should read the whole Techdirt piece, but here’s some of its analysis of various candidates’ responses when asked about the controversy. It’s depressing, but educational:

“Donald Trump is getting the most attention. Starting earlier this week he kept saying that Apple should just do what the FBI wants, and then he kicked it up a notch this afternoon saying that everyone should boycott Apple until it gives in to the FBI. Apparently, Trump doesn’t even have the first clue about the actual issue at stake, in terms of what a court can compel a company to do, and what it means for our overall security….

Bernie Sanders did the “on the one hand/on the other hand/I won’t actually take a stand” thing: “I am very fearful in America about Big Brother. And that means not only the federal government getting into your emails or knowing what books you’re taking out of the library, or private corporations knowing everything there is to know about you in terms of your health records, your banking records, your consumer practices…On the other hand, what I also worry about is the possibility of another terrorist attack against our country. And frankly, I think there is a middle ground that can be reached.”…

Hillary Clinton did the same thing, trying to straddle the line by admitting a backdoor sounds problematic, but really, if the nerds just nerd harder, can’t they figure something out: ‘But she concluded with a favorite law enforcement talking point: that the smart people in America can surely solve this problem and find a way to help the FBI access encrypted communications with a little brainstorming and teamwork, [saying] “As smart as we are, there’s got to be some way on a very specific basis we could try to help get information around crimes and terrorism”…

The rest of the Republican field basically did the same thing as Sanders and Clinton. On the one hand this, on the other that. It’s classic “don’t pin me down so I don’t piss off one constituency” politicking.

Cruz: “They have a binding search order…I think we can walk and chew gum at the same time. We can protect ourselves from terrorists and protect our civil rights.”

Yeah, again, that’s not the issue. Yes, they have a court order. And that would be fine if Apple had full access to the content and just needed to turn it over. Everyone agrees with that. But that’s not the issue here. It’s whether or not Apple can be compelled to go much, much further and build a way to hack its own customers, removing security features so that the FBI can more easily access encrypted content.

Marco Rubio? Same on the one hand/on the other hand bullshit: “If you create a backdoor, there is a very reasonable possibility that a criminal gang could figure out what the backdoor is… We’re going to have to work with the tech industry to figure out a way forward on encryption that allows us some capability to access information – especially in emergency circumstances.”

So, we need to work together to allow some capability… that Rubio himself admits will lead to “a very real possibility that a criminal gang” will exploit. Guess what the larger risk is: a criminal gang targeting your data, or being caught in a terrorist attack? It’s the former, not the latter, and yet Rubio is pretending they’re the same.

Next up to bat, John Kasich. He’s even worse. Not only does he not understand the issue, he doesn’t even give one of those on the one hand/on the other hand answers, suggesting he doesn’t even know the key part of all of this: “I don’t think it’s an example of government overreach to say that, you know, we had terrorists here on our soil and we’ve got to understand more detail about who they may have been communicating with.”

…But that’s not the debate. The debate is whether, in trying to collect every possible bit of content, they have the power to commandeer a tech company and have it build tools to undermine that company’s own security systems.

Ben Carson shows his usual level of confusion, suggesting Apple is only doing this because it doesn’t trust the government and then giving another wishy-washy answer: “I think that Apple, and probably a lot of other people, don’t necessarily trust the government these days,” Carson said. “And there’s probably very good reason for people not to trust the government. But we’re going to have to get over that because right now we’re faced with tremendous threats, and individuals, radical jihadists, that want to destroy us. And we’re going to have to weigh these things, one against the other. I believe that what we need is a public-private partnership when it comes to all of these technical things and cyber security because we’re all at risk in a very significant way,” Carson said. “So it’s going to be a matter of people learning to trust each other, which means Apple needs to sit down with trustworthy members of the government, and that may have to wait until the next election, I don’t know, but we’ll see.”

This response makes absolutely no sense, and is almost self-contradictory. He’s basically admitting that the government might misuse such powers, and even suggests the Obama government in particular would do so. But his government, of course, would be fine. If you’re a Presidential candidate and your argument for a powerful surveillance tool is “well I don’t trust the other guy to use it, but you can trust me…” you’ve already lost….

Fortunately, Techdirt and Apple do understand the issue, even if our pathetic Presidential candidates don’t have a clue. They are right. The government is wrong.

_____________________

Sources: Washington Post, Techdirt

60 thoughts on “Tech And Terrorism Ethics: Apple Is Right. The Government Is Wrong.”

  1. I think you’re exactly right – but for the wrong reasons.

    I suspect it’s perfectly possible for Apple to hack into this phone, and “throw away” the “key” having done so. There’s no reason to assume that any solution would automatically be available to anyone else. Hence that whole line of argument seems irrelevant.

    However – there’s a huge precedent to be concerned about. If Apple gives in and does the bidding of the US government, what’s to prevent China from asking it to do the same thing? Or France? Or Finland? And on what grounds would Apple be able to refuse? And under what international auspices could it make an exception?

    I agree, Apple is in the right here, but because of the international legal precedent – not because of some mythical technological Pandora’s box.

    • How exactly does Apple let the FBI have this key and trust it to successfully “throw it away”? I’ve read a lot of stuff on this, and literally no tech expert I’ve read says that once the system is broken open, that will be the end of it—and Charles, even if it is, why should Apple’s customers believe or assume that the government hasn’t kept “the key”? Security is both a matter of belief and reality. Forget China—why would I think that this key won’t be used on my phone? Because the FBI says it won’t? You seriously think anyone should be satisfied with that?

      • Here we go:

        “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.”

        Apple doesn’t have the means to simply disable the auto-erase function, which was introduced as part of the tougher security measures in iOS 8 following Snowden’s NSA leaks. Cook immediately responded to the judge’s order on Feb. 17 with a letter to Apple customers rejecting the court order and urging consumers to take a stand for encryption and privacy. While Apple cannot disable the function as the phone stands, Cook says the U.S. government’s demand that Apple create a backdoor to the iPhone is something the company considers “too dangerous.”

        “The FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation,” Cook said in the letter. “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.”

        Court records state that Apple needs to write code to create a Software Image File the FBI can use on subject devices it requires access to. The worry is that this new method would be duplicated in some way, or get into the wrong hands, risking user privacy. “The SIF will be coded by Apple with a unique identifier of the phone, so that the SIF would only load and execute on the subject device,” the court order states.

        It appears the reason for this battle is the fact that the Apple ID password on the iPhone in question was changed less than 24 hours after the government took possession of it. A senior Apple executive told reporters during a conference call on Friday, Feb 19 that if that hadn’t happened, the company wouldn’t need to create a backdoor into the device, as the information would have been easily accessible via iCloud.

        http://www.digitaltrends.com/mobile/apple-encryption-court-order-news/2/

        Got that? The FBI wants the software for devices…meaning they intend to keep it. Meanwhile, it already screwed up!!!

        • No. The FBI gives Apple the phone. They open it. They give the phone back and destroy the key.
          I do not see any necessity to hand over the key to the FBI; all they want is the phone’s data. This is just obfuscation.

            • What evidence? For what trial? They’re dead. If there’s any rationale, it’s to find accomplices or uncover plots.
              And then there’s the precedent issue, which seems to me far greater.

            • No. 1: Your analysis is exactly spot on Jack. For once, we agree on every point.

              No. 2: Actually, government entities hire private companies all the time for this sort of thing — including forensic investigations. Thank goodness, or we’d be out of business! Most companies, including mine, are well trained in chain of custody procedures.

              • I don’t know about the physical searches themselves…but yes, the government does hire 3rd parties for analysis of data gathered. The emphasis, though, is on 3rd parties. In this case, Apple cannot, in context, be viewed as a 3rd party.

                • For me, the emphasis is on “forced.” Apple can’t be forced to do anything, but it could, on its own volition, create the tool in question. Then, the government OR a third party investigator could search the phone’s data. Apple probably has the skill to do the investigation as well, but it would be best to hire a forensic investigator.

              • Like I’ve said, we don’t agree often, but when we do it’s usually profound. This is so basic, so simple, it beggars belief that rational people who take more than a reflex tick to think about it wouldn’t come to the same conclusion.

                Sure the FBI could try to hire Apple, but Apple has the right (and in this case, probably duty) to say no, and once they do, the FBI should be left playing pocket pool. That should be the end of the story until legislators try to step in.

      • They don’t. Apple keeps the key and gives up the info, as requested. Problem solved. This is not about case evidence, this is about tracking down a plot.

    • Your entire premise is flawed. It’s impossible for Apple to throw away the key because of the way the order is worded. The FBI isn’t handing them the phone and telling them to unlock it; they’re telling Apple to engineer a backdoor to their security features and give that product in its entirety to the FBI.

      And as Techdirt pointed out: The FBI does not need the information on that phone. And so the question becomes motive: I don’t trust the FBI and if you do, I think you’re an idiot. I think that the FBI is using this as an excuse to ask for the backdoor, specifically because they get to keep it after they unlock the absolutely useless phone in this case.

      • they’re telling Apple to engineer a backdoor to their security features and give that product in its entirety to the FBI.
        If that’s true then I totally agree, but my impression was the FBI just wants the data, not The Key to All Phones.

        Seems that would be the obvious counter offer, no?

        Meanwhile, no one is addressing the really scary issue: the precedent for doing the government’s bidding.

        • “If that’s true then I totally agree, but my impression was the FBI just wants the data, not The Key to All Phones.”

          I don’t see how you can read the motion in any other way… Which leads me to believe that you haven’t read it.

          “Meanwhile, no one is addressing the really scary issue, the precedent for doing the government’s bidding”

          Oh that’s being addressed, perhaps not here and now, but it is. The government simply does not have the authority to compel a company to design anything, period. They could, I suppose, legislate that no new devices be sold without a backdoor, but that would require legislation, and unpopular legislation at that.

                • More—it can, however, be executed by a citizen acting as a duly authorized agent of the government.

                  Tell me: what would happen to any tech company that agreed to act as a government agent, even once?
                  You can’t get where you want to go from here, Charles. In practical terms, it’s impossible.

              • The government, we presume, acts on behalf of the people AS A WHOLE. A private citizen, solo, can be biased and corrupt. Compromised by anti-whatever bias or compromised by pro-whatever bias. We assume a private citizen acting “on behalf of” and with “commission” of the government has the inhibitions associated with serious censure (or even severe punishment) of violating public trust. No one gets a warrant unless they understand the seriousness of the undertaking. And by definition, “just asking” a private entity to do so does not impute that defining seriousness.

                Government and prosecution cannot be casual.

          • Be patient…we have a Leftist in unknown territory…he wants to look like he supports individual liberty…but he really is comfortable with the Governmental “oversight”…

    • Wow…way to be completely wrong.

      It’s right to resist this…because it opens the door for the possibility of invasion of privacy of MILLIONS MORE iPhone users. That’s the issue and the ethical value at stake.

      Not the possibility that foreign governments might figure it out…or even criminal gangs.

  2. This pretty well spells it out. “Apple had another possible solution: If the FBI placed Farook’s phone near a known Wi-Fi network (like the one at his home or his workplace), it might automatically create a new iCloud backup with the missing information. That idea was foiled when the county, acting at the direction of the FBI, reset Farook’s iCloud password.” Enough said. They screwed up. Drop it.

  3. Apple is correct on this one. It should fight the order. It is overly broad and compels Apple to do something without due process. Moreover, there is the underlying complication that Farook and his wife are dead and cannot engage in future terrorist activities. And the phone belongs to the City of San Bernardino and most likely does not contain any intelligence anyway.

    In addition to the overriding issue (as nicely laid out by TechDirt) of government compelling a private corporation/business/enterprise/individual to do something against its own interests without proper due process, the FBI appears to have made the problem bigger than it had to be. If I understand the phone at issue, there is a backdoor patch to open this phone, but the FBI tried to crack it too many times and the phone locked itself up. The only way to reopen it without Apple writing new code is to wipe it out and start over again – mostly from the last point data were backed up. I believe this is an iPhone 5, and a patch must be created to reopen it without wiping out the data. The other issue is that the phone actually belongs to the City of San Bernardino, which purchased software to recover data. However, while the City purchased and authorized the key or software, the patch the City paid for was never installed on this particular phone (or any of its other iPhone 5s) before this phone shut itself down. Therefore, the only way to access the phone is by Apple writing code to do so; however, there is no way (as I understand the technology) to write code specific to this one device. Writing that code would expose all such devices to security breaches. Consequently, Apple is within its rights to fight the Court’s order. Thankfully, Apple has the resources and legal expertise to take the correct stance.

    TechDirt’s takedowns of the various candidates are also spot on – from both the Republican and the Democratic sides. Republicans argue that good corporate citizenry requires compliance with lawful court orders, and assume that this Court’s order is lawful. The Democrats are not too dissimilar, and probably less disingenuous (“Trust us”). Yet, the IRS targeting of conservative groups should give sufficient pause that power in the hands of incompetent government agencies will be abused; it is just a matter of to what degree.

    jvb

  4. ” As usual, this administration can’t be bothered to do things the way the Constitution directs, and have laws passed that acknowledge new needs in an era of sophisticated technology and terrorist threats—or not. No, it follows the model championed by this President, and prefers to govern by edict and strong-arm tactics. ”

    The criminally stupid part of that is that there’s probably enough bipartisan support on this issue to actually force this into legislation… Even if only to be slammed by constitutional challenges. This is one of the few topics Rand Paul was absolutely correct on, and I would have LOVED for him to still be in the race for exactly this situation.

  5. I see this as a simple legal issue. Give the phone to Apple and ask Apple to provide the government all the information on the phone. Then Apple keeps the phone. It is like a file cabinet: the government does not need the “cabinet,” it needs the information inside.

    This is no different than sending a subpoena to someone seeking non-privileged documents. All one need provide is the information. I’ve never seen a subpoena which says, “Give us the info on Mr. Jones, plus the keys to your office building, keys for all the other tenants in the building, and the keys and lock combinations to all file cabinets.”

    The government knows that fearful people do dumb things, so if they scare people, there will be a demand that Apple destroy everyone’s privacy. Now we know how come so many witches were burned at the stake.

  6. There’s no better security expert than Bruce Schneier. Read what he says:

    https://www.schneier.com/blog/archives/2016/02/decrypting_an_i.html

    Paraphrase: despite what you all are claiming, the request is not for a code-cracking piece of software to be given to the government, and the critical issue raised is not one of being forced to create backdoor software. The request is more like a landlord being asked to open the door, not to make and turn over the skeleton key.

    The issue is one of precedent: if the request is allowed now, even if it’s only about one phone–and it is–then what’s to prevent the next request, and the next, and the next?

    • Judge Sheri Pym did not order Apple to break the encryption on the iPhone. Instead, she asked the company to develop a new version of the iPhone’s iOS operating system that would allow the FBI to use its computers to guess the passcode quickly, without getting locked out for making too many guesses. This approach, sometimes referred to as a “brute force attack,” circumvents the iPhone’s encryption without actually breaking it.
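
      To make concrete what “brute force” means here: with the retry limits out of the way, it is nothing more than exhaustive guessing. A minimal sketch in Python, where `check_passcode` is a hypothetical stand-in for the phone’s verification routine (everything here is illustrative, not Apple’s actual code):

```python
# Exhaustive search over every 4-digit passcode.
# `check_passcode` is a hypothetical stand-in for the device's
# verification routine; on a real iPhone, escalating retry delays
# and the optional 10-try auto-erase defeat this simple loop.
SECRET = "7391"  # the unknown passcode, hard-coded here for demonstration

def check_passcode(guess: str) -> bool:
    return guess == SECRET

def brute_force_4_digits():
    for n in range(10_000):        # 0000 through 9999
        guess = f"{n:04d}"
        if check_passcode(guess):
            return guess
    return None

print(brute_force_4_digits())
```

      With the software the order demands installed, the FBI’s computers could run the equivalent of this loop electronically, which is why this approach circumvents the encryption without actually breaking it.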

      “Apple may maintain custody of the software, destroy it after its purpose under the order has been served, refuse to disseminate it outside of Apple and make clear to the world that it does not apply to other devices or users without lawful court orders,” the Justice Department told Judge Sheri Pym. “No one outside Apple would have access to the software required by the order unless Apple itself chose to share it.”

      You can’t deny that the order doesn’t require Apple to MAKE a skeleton key.

      http://technology.inquirer.net/46677/us-would-let-apple-keep-software-to-help-fbi-hack-iphone#ixzz40xuLl41A

          • Well, I think we’re finally in violent agreement. The issue is whether they’re required to do the government’s bidding – or not. And the precedent that creates.

            The issue is NOT, I suggest, the particular form of bidding that requires Apple to create and turn over a piece of software that the government can freely use going forward.

            Most of the discussion in this thread has been assuming that Apple is being forced to turn over a backdoor key of some sort, and that just isn’t the case. It confuses matters to claim it does.

            The real issue – which, I agree with you, is real and significant – is that if they do the government’s bidding, which is to render up the info that exists on the phone, then what’s to stop the government – any government – from demanding anyone’s personal data?

            The precedent is the issue – not the hacking software.

            • I want to see what “violent agreement” looks like. Does it involve knocking over chairs, food fights, and grappling in the halls? If so, I’m all in! Yes, I am! I am overdue for a good violent agreement.

              On a serious note, I think that you and Jack are both correct on the underlying issues. Perhaps they are subtle distinctions without a difference, but those distinctions are vital to a functioning representative democracy and a free market. They are liberty issues. Compelling a private enterprise to take actions against its own interests violates fundamental notions of fair play and substantial justice.

              jvb

  7. I am shocked, shocked I tell you to find out you do not trust the government to do what they say? The next thing you will tell me is that they are using the Patriot Act to spy on us and not just the terrorists.

  8. I’ve been staunchly in Apple’s corner on this issue since before it was even an issue, back when it was just some crying by the FBI about some future situation where they wanted to read iMessages. However – given that this is firmly what I support, I feel it’s my duty to consider all arguments and loopholes to see if there is any sort of compromise. ScottzWartz hit the nail on the head with his comment above, and in this particular situation, given that the owner of the phone is San Bernardino County….

    My main objective in this arena is to avoid mass surveillance and warrantless searches. The court order as written is absolutely appalling because that is precisely its goal: it has no real interest in the data on this phone, but seeks an end-around for all future investigations. If they were seriously interested in just the data on this one phone in this one situation, I might acquiesce under the following stipulations:

    1) The Owner of the phone is San Bernardino County and they make the request.
    2) Apple would hack the phone and export the data directly to the FBI, as directed by San Bernardino County.
    3) Apple would keep custody of the hack it employs, then destroy it.

    My supporting reasons would be:

    A) It’s a physical hack with physical possession. It wouldn’t be available for anonymous, warrant-less, or mass surveillance uses.

    B) This phone, in particular, is susceptible because he used a simple passCODE which is 4 or 6 numbers. A security minded person like myself uses a complex passWORD. A passcode can be broken with brute force in days if not hours. A password such as I employ can be broken with brute force in about 100 years.
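
    The arithmetic behind that passcode-versus-password gap is easy to sketch. A back-of-the-envelope estimate, assuming a guess rate of roughly 12.5 attempts per second (a commonly cited figure for hardware-limited iPhone passcode attempts; the real rate varies, so treat every number below as illustrative):

```python
# Worst-case exhaustive-search times under an assumed guess rate.
GUESSES_PER_SECOND = 12.5  # assumed hardware-limited rate; varies in practice

def worst_case_seconds(keyspace: int) -> float:
    """Time to try every key in the keyspace at the assumed rate."""
    return keyspace / GUESSES_PER_SECOND

four_digit = 10 ** 4   # numeric passcode: 10,000 combinations
six_digit = 10 ** 6    # numeric passcode: 1,000,000 combinations
password = 95 ** 10    # 10 characters from ~95 printable ASCII symbols

print(f"4-digit passcode: {worst_case_seconds(four_digit) / 60:.0f} minutes")
print(f"6-digit passcode: {worst_case_seconds(six_digit) / 3600:.0f} hours")
print(f"10-char password: {worst_case_seconds(password) / (3600 * 24 * 365):.1e} years")
```

    Even if the exact rate differs from this assumption, the conclusion holds: removing the retry limits matters enormously for short numeric passcodes, which is exactly what this phone used, and barely at all for long complex passwords.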

    To wrap this up – I agree with Jack’s original post and the majority of comments here. Apple is in the right because the order is so wrong, but I do think there is a properly worded order that could be workable – middle ground that doesn’t set a precedent.

  9. I have news for everyone who’s swallowed the pro-privacy media’s Big Lie of “setting precedent”. What the FBI wants to do has nothing to do with “setting precedent,” and everything to do with the fact that they’re not competent enough to hack the iPhone themselves.

    Don’t believe me? Just read this interview with John McAfee that I don’t think the pro-privacy media will dare to talk about: http://www.maximumpc.com/john-mcafee-we-are-20-years-behind-china-and-russia-fbi-apple-iphone-san-bernardino/
