Should Apple have to help the FBI unlock a terrorist's iPhone?

First, understand that the FBI has the right to get into the phone of one of the San Bernardino terrorists. It is a county-owned phone, and the county has no objection.

Second, the phone's data is encrypted and protected by a passcode that could take thousands of years to defeat by so-called "brute force" (trying every possible passcode).
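A back-of-the-envelope calculation shows where those timescales come from. Apple has cited a delay of roughly 80 ms per passcode attempt because the key derivation is deliberately slow; treat the exact figure here as an assumption for illustration:

```python
# Back-of-the-envelope brute-force time estimate. The ~80 ms per
# attempt is a figure Apple has cited for its deliberately slow key
# derivation; it is an assumption here, not a measured value.
SECONDS_PER_GUESS = 0.08
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def worst_case_years(alphabet_size, length):
    """Years to try every possible passcode, one guess per 80 ms."""
    keyspace = alphabet_size ** length
    return keyspace * SECONDS_PER_GUESS / SECONDS_PER_YEAR

# A 4-digit PIN falls in minutes:
print(worst_case_years(10, 4) * 365 * 24 * 60)   # ~13 minutes
# A 6-character lowercase+digit passcode already takes years:
print(worst_case_years(36, 6))                   # ~5.5 years
# A 10-character one reaches into the millions of years:
print(worst_case_years(36, 10))                  # ~9.3 million years
```

The keyspace grows exponentially with length, which is why the difference between a short PIN and a long passcode is the difference between minutes and millennia.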

Third, the FBI wants Apple to create a custom operating system that could be loaded onto the phone without destroying the data and would disable its safeguards (such as the auto-erase after ten failed attempts and the escalating delays between guesses), making it easy to get into the phone and decrypt the data.

Fourth, Apple claims that creating such a "back door" would inevitably make all of its users' data vulnerable to the bad guys because, I guess, the technology, once created, would eventually be "out there" and available to criminals.

Fifth, the FBI wants all phones sold in the US to have a law-enforcement back door. The problem? Refer to the fourth item above.

Sixth, Apple says that complying with such an order is the beginning of a slippery slope of government spying on the public. The counterargument is that orders like this are nothing new, such as when a landlord is ordered to allow the FBI into a suspect's apartment; here it is simply a phone rather than an apartment.

Full discussion in this article.


Replies to This Discussion

It sounds like Apple is saying it can't get into it at the moment, because they made it secure...and the question is whether they can CREATE a back door?

I think Apple purposefully made the phones so THEY cannot get the info out of them, so, if you lose your password, you lose your data on the phone...it's wiped.

So, for THIS phone, they'd need to hack into it TO create a backdoor...something they spent time and effort to make as difficult as possible.

Perhaps Apple could make a one-off OS that "expires": hack this one phone, get the data, and reload the normal OS.

This would be akin to Bill Gates losing his iPhone's PW, calling Apple, and saying, look, I lost my PW, and I need the data off it...can you do it?

I.e.: a search-warrant-ONLY program that can't be used as the normal OS on other phones.

I cannot possibly imagine containing the research and the knowledge gained by working on breaking the safety measures. Even if only a few people work on it, you now have people able to use part or all of those skills on other projects, a government hell-bent on making this technique standard on the phones of everyone they suspect of whatever, and you now have a safe system which isn't safe, as it has been cracked. If Apple can crack it, young genius hackers online will be able to as well.

I've discussed this topic with some students...and one had a pretty interesting comment:

Apple and Facebook have become experts at collecting, slicing and dicing, and selling information about you. They don't just make it available; they analyse it and sell it to others. Now Facebook and Apple are fighting to make every last piece of data on an individual's phone immune to the investigative eyes of anti-terror agents.

I agree that it's hypocritical, though I don't think that is any reason to create a back door. The answer shouldn't necessarily be to circumvent encryption...but perhaps to regulate the big-data industry to protect consumers more and allow them to control the data learnt about them online (they can choose to have it revealed and receive targeted ads...or not). Isn't the government recording every single phone call, text message, and WhatsApp conversation enough? Isn't that already a gross breach of privacy? The government didn't in the past open up and read every piece of post we sent one another. Why is digital any different?

How is ordering Apple to devise a way to break into this phone basically different from ordering a safecracker to break into a safe where important criminal data may be contained?

If there is no back door made in this case (and installed in future phones), former federal prosecutor Jeffrey Toobin asked, "Why not just tell terrorists and criminals, 'If you want to keep your data safe, keep it on your iPhone'?"

It's utterly different. The government isn't ordering a hacker to get into the phone...they are ordering the company itself to undermine its own product. The equivalent with a safe is not ordering a safe-cracker to open it...but ordering the company to start selling safes that aren't safe.

The government would have to order a company to produce safes that can be tampered with just because one safe might hold data that would help one investigation (if the government can do this, then it sucks to be an American with valuables in their bedroom...better order your next safe from England or Canada). Who the hell would buy a safe from a company whose safes had known flaws that could be exploited? What could possibly go wrong?

Sixth, Apple says that complying with such an order is the beginning of a slippery slope of government spying on the public. The counterargument is that orders like this are nothing new, such as when a landlord is ordered to allow the FBI into a suspect's apartment; here it is simply a phone rather than an apartment.

Again...totally wrong analogy. The government wouldn't be asking the owners to somehow open the doors of their tenants' apartments...they would be ordering the owners to build apartments with flaws in the doors and windows that can be exploited so other agents (good or bad) could get inside. Then, once done, build all new apartments with this flaw. Sounds like an apartment I'd love to live in.

But, let's forget the FBI's desire to have back doors into every smartphone in the future and think about just this case. Then the analogies are closer to what law enforcement does all the time. 

If the FBI is having a hard time getting into an Ajax Company safe, they can ask for assistance and if the company refuses to help, they can get a judge to order the company to help. 

Also, were someone to invent an apartment that absolutely couldn't be broken into by any means, it might be reasonable for the FBI to ask the manufacturer to provide a way to get in, provided they had a warrant.

If the phone could be cracked without risking the privacy and encryption of millions of other users (and the ability to spy on them), then yes...I would think just about anything goes regarding the privacy or private data of any clear terrorist. Bash open everything they own if it means finding important evidence. I cannot see how that is possible in this case if (as Apple claims) the encryption is designed to be almost impossible to crack...and, once cracked, compromised.

I can easily imagine intelligence officers and police having their very own phones compromised in the process. I cannot see how opening a back door is an intelligent move (at least based on what Apple has told us).

Also, were someone to invent an apartment that absolutely couldn't be broken into by any means, it might be reasonable for the FBI to ask the manufacturer to provide a way to get in, provided they had a warrant.

Absolutely impenetrable is impossible. Certainly you can get an axe or a sledgehammer and bash your way in (or take more extreme measures, depending on the door). I'm not sure this would be useful for most thieves, as the sound would certainly attract a lot of attention...however, some quiet, mechanical, sneaky way to compromise the lock on my door would be the last thing I would want in the hands of someone in the know...especially if thousands of officers, former officers, and whoever they share it with know how to do it.

A couple of things. We have to decide what's more important: sheltering everyone, including terrorists and other baddies, from the prying eyes of government, or surrendering to government a way to investigate misdeeds.

The other thing is that the black hats have proven very skillful in the past at attacking security not just from a social engineering approach but technologically as well.

Even if Apple becomes legally sheltered from helping the FBI, we can be sure the black hats are hard at work finding a way through whatever security features are installed now and in the future.

I can even imagine, on the dark web, a super brute-force attack distributed along the lines of the way the human genome was sequenced or the way SETI@home looks for intelligent life in the universe.
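The work-splitting idea behind that kind of distributed attack can be sketched in a few lines. (In practice, an iPhone's key derivation is entangled with a device-unique hardware key, so the guessing can't actually be farmed out to strangers' machines; this is only a toy keyspace-partitioning sketch, and all names in it are illustrative.)

```python
# Toy sketch of SETI@home-style work splitting for a brute-force
# search: carve the passcode keyspace into contiguous ranges, one
# per worker. Purely illustrative; not any real project's API.

def work_units(keyspace_size, n_workers):
    """Yield (start, end) index ranges covering the whole keyspace."""
    chunk = -(-keyspace_size // n_workers)  # ceiling division
    for w in range(n_workers):
        start = w * chunk
        end = min(start + chunk, keyspace_size)
        if start < end:
            yield (start, end)

# The 10^6 six-digit PINs split across 4 workers:
print(list(work_units(10**6, 4)))
# [(0, 250000), (250000, 500000), (500000, 750000), (750000, 1000000)]
```

Each worker would map its index range back to candidate passcodes and test them independently, which is what makes this style of search embarrassingly parallel.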

John McAfee says he can get into the phone in 3 weeks using social engineering (da fuq?) (source)

Although I know social engineering is a very powerful method, it depends on manipulating someone who has access to what you need.  Presumably, the password went to the shitpile with the owner of the phone, who's a ways beyond being manipulatable.

Or maybe there's just something I don't see here.

I would like McAfee to explain what role social engineering could possibly play. Doesn't make a lot of initial sense.

And also, as I think more about it: if it truly is securely encrypted and there is no back door already in existence (like, say, a place where the phone saves your passwords as you type them in), I fail to see how even Apple could install something that could recover the text without a brute-force attack.

If they can, the algorithm isn't secure. (An algorithm is not secure, by definition, if knowing it without the key is enough to let you crack the code.)

So I am finding the whole premise of the hoo-rah (that the phone is secure, yet Apple could crack it post facto if they wanted to) very puzzling.

That doesn't negate the hazard of back doors being installed for future use.
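The parenthetical point a few paragraphs up is essentially Kerckhoffs's principle: a cipher must remain secure even when everything about it except the key is public. A toy sketch (all values made up for the demo) shows that what breaks such a system isn't the algorithm being known, but the keyspace being small:

```python
import hashlib
import itertools
import string

# Kerckhoffs's principle in miniature: the key-derivation algorithm
# below (PBKDF2-HMAC-SHA256) is completely public, and knowing it
# alone breaks nothing. A tiny keyspace is what breaks it. The salt,
# PIN, and iteration count are made up for this demo.

def derive(pin, salt):
    """Public, well-known key derivation; only the PIN is secret."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100)

salt = b"fixed-demo-salt"
target = derive("4821", salt)  # pretend this derived key leaked

def crack(target, salt):
    """Knowing the algorithm lets us enumerate all 10,000 4-digit PINs."""
    for candidate in itertools.product(string.digits, repeat=4):
        pin = "".join(candidate)
        if derive(pin, salt) == target:
            return pin
    return None

print(crack(target, salt))  # recovers "4821": weak key, not weak algorithm
```

Swap the 4-digit PIN for a long random key and the same public algorithm becomes computationally infeasible to search, which is the puzzle the comment above is pointing at: if the design is sound, not even the vendor should have a shortcut past the key.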

© 2018   Created by Rebel.