
Court orders Apple to help unlock iPhone used by San Bernardino shooter


The Real Abed

Perma-Junior
I hope my iPhone doesn't do that. My friend tried to unlock mine to take a photo of me while I was playing soccer once, but she forgot the code. To think it could have wiped it!
It's off by default.

Also, I could have sworn the iPhone lets you use the camera when the phone is locked. It just doesn't let you view already taken photos. I'll have to check for myself...

Edit: Yep. You can use the camera without unlocking. Swipe up on the camera icon in the bottom right corner. It just hides all previous photos. Once again, another safety feature. Imagine asking a stranger to take a photo of you and your girlfriend in Italy: you unlock the phone and hand it to them, they run off and keep the phone awake until they get to a hiding place. This way you can keep it locked but still let a stranger use it for that purpose.
 

mf.luder

Member
Put me on the side of privacy at all costs, even if the crime is deemed terrorism. It's a slippery slope to force companies to break encryption, and in my opinion, the terrorist boogeyman isn't a good enough reason. You start moving into future-crime territory if every private detail of someone's life is accessible to the government (even with a warrant).

Yep. No matter what the cause is, I'll take privacy over all.
 

Syriel

Member
No, in this case the company has the GPS location of the boat to within a mile and the boat is full of children who may starve if not found.

That right there is the real question.

If Apple keeps its past stance and claims it has no special access to the device, then ordering it to work on cracking it is essentially the government ordering custom programming work to be done for free.

If Apple can easily do what the government asks, then its past public stance is shown to be false.

At this point, only Apple knows if it "has the GPS location," so to speak, or if it would be akin to telling MasterCraft, "hey, a boat you made sank in the Pacific. Tell us where it is."
 
No, in this case the company has the GPS location of the boat to within a mile and the boat is full of children who may starve if not found.
GPS is licensed and regulated by the government. Completely different!

Edit: I mean, what if that company is out of business? Track down former employees and force them to develop new code, to research a new way to circumvent something that may or may not be circumventable?
 

Fat4all

Banned
Have they tried 1234? Just sayin'.

 

CLEEK

Member
The only reason Apple have gone down the path of hardware encryption is the US government playing fast and loose with privacy and strong-arming US tech companies to provide access.

Apple needs to hold steadfast on this, and hopefully they have been smart enough in their implementation that the 10-wrong-attempts safeguard can't be disabled. The court order is effectively demanding a security backdoor, which is the very reason Apple implemented this encryption in the first place.

Encryption is an all-or-nothing thing. As soon as anyone introduces backdoors or methods to break it, it's pointless. It's not Apple's place to assist the FBI.
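
To make that concrete, here is a minimal sketch in Python (using the third-party cryptography package) of a toy "lawful access" design, one escrowed master key alongside per-user keys. This is an illustration of the principle, not Apple's actual scheme:

# Toy model: per-user keys plus one escrowed "lawful access" master key.
# Requires the third-party cryptography package (pip install cryptography).
from cryptography.fernet import Fernet

# Each user encrypts under their own key; compromising one user
# reveals nothing about the others.
alice_key, bob_key = Fernet.generate_key(), Fernet.generate_key()
data = {
    "alice": Fernet(alice_key).encrypt(b"alice's private data"),
    "bob": Fernet(bob_key).encrypt(b"bob's private data"),
}

# The backdoor: every user key is also wrapped under one master key.
master_key = Fernet.generate_key()
escrow = {
    "alice": Fernet(master_key).encrypt(alice_key),
    "bob": Fernet(master_key).encrypt(bob_key),
}

# Whoever holds master_key -- a court, a thief, a foreign government --
# can unwrap every user's key, not just the one named in a warrant.
for user, wrapped in escrow.items():
    user_key = Fernet(master_key).decrypt(wrapped)
    print(user, "->", Fernet(user_key).decrypt(data[user]))

The sketch is the whole problem in miniature: the backdoor isn't scoped to one phone or one warrant, and it is only as safe as the custody of the master secret.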

I'm also dubious about what value an unlocked phone would be to the FBI. They would have other ways to know what calls were made, the geolocation of the phone, even what websites and data had been used, none of which requires access to the phone itself.
 

Alavard

Member
I guess the NSA isn't that powerful if the government couldn't figure out how to do this without forcing Apple to do it.

I'm personally of the belief that they can't, but this doesn't prove it. Remember, if they can break encryption like this and they do so in a criminal case, they reveal that fact; it becomes a matter of public record in the trial.

If they could break the encryption but relied on that for foreign intelligence or homeland security matters, which never get exposed to the public, they wouldn't necessarily want to give that fact away and have their targets all change their security measures.
 
How many of those phones are key pieces of evidence in murder investigations? It'd be one thing if they asked Apple to do this remotely to people being monitored; it's another thing entirely when the phone has been entered into evidence and the accused is in custody. Again, the encryption is not being cracked here; what's being sought is, potentially, the ability to guess the passcode without limit.

It's a slippery slope, and is exactly what Tim Cook has been talking about publicly. You provide a backdoor to get the bad guys, and that backdoor can also be used against everyone else. Apple is on the side of privacy first, of its customers first. I believe Apple is on the right side here. Making some criminal investigations harder is a fair tradeoff to protect individual privacy of all its customers, whether that protection be from criminals or from a government run amok.

Unlocking by printing a cast of the fingerprint from photos has been proven to work:
http://www.ibtimes.com/hacker-demon...int-sensors-using-regular-photographs-1769408

Yes, but which finger is registered? Get that wrong four times and the iPhone requires a passcode to unlock.
 

Mistake

Member
The best security is to not have the information in your hands at all. Hopefully that's the case with Apple here, since you can't have things both ways and ask the impossible. Isn't that what Dotcom did with Mega?
 
Oh, so now the federal courts need something from Apple?

Yeah, I'm sure after the way the federal court handled the Samsung case and shamelessly railroaded them on the iBooks case with a prejudiced judge, Apple just can't wait to help them out...

LOL.

This would be a very petty hill for Apple to die on. If Apple decided not to help the government with a terrorism case because that same government found them guilty of price fixing, that'd be a very shitty thing to do.

In all likelihood, there isn't a backdoor for this, although I figure an Apple engineer could develop one... This is a company that released an update that bricks phones with screens installed by third parties, so I suspect they could break this if need be.

But I don't know if they should...
 
It's a slippery slope, and is exactly what Tim Cook has been talking about publicly. You provide a backdoor to get the bad guys, and that backdoor can also be used against everyone else. Apple is on the side of privacy first, of its customers first. I believe Apple is on the right side here. Making some criminal investigations harder is a fair tradeoff to protect individual privacy of all its customers, whether that protection be from criminals or from a government run amok.

This is the real issue: it can be justified right up until it stops working in favour of the greater good. It's not like crimes couldn't be solved before iPhones existed, anyway.
 

numble

Member
People sometimes kinda fetishize information freedom. If you can't drop the indignation for a murder and terrorism investigation that would STILL require a ridiculous brute-force tech effort, then god forbid you ever meet the relative of a victim.


Slippery slope is a logical fallacy, btw. (Jack Random, post above)

It wouldn't be considered a slippery slope. The next time it will just be the equivalent scope: another court order, and one court order is legally equivalent to another. And then a Chinese or Russian court order asks Apple to backdoor another iPhone, citing the precedent of Apple complying with court orders in the past, with the threat of curtailed market access if Apple complies only with US court orders.

The best move is to tell the courts there is nothing Apple can do about it: it was designed this way.
 
Oh, so now the federal courts need something from Apple?

Yeah, I'm sure after the way the federal court handled the Samsung case and shamelessly railroaded them on the iBooks case with a prejudiced judge, Apple just can't wait to help them out...

LOL.

So Apple shouldn't cooperate in a federal investigation into a brutal mass murder because lol petty bullshit? Really?
 

Briarios

Member
Let's look at this as if it were not data, but a physical thing.

Apple builds a secure safe with a boobytrap mechanism that destroys the materials inside. The police confiscate this safe from criminals and believe there may be documents inside that may or may not have information regarding other crimes. They get a court order for Apple to disarm the boobytrap to preserve the documents so that the police can force their way into the safe -- not for the key or a way to crack the safe, just to disarm the destruction mechanism.

I don't think most reasonable people would have an issue with this.

It's the same thing here. The slippery slope argument is overused -- that's why court oversight was required. It prevents the slide down that slope.
 
Poor terrorist boogeymen and their beleaguered rights and freedoms.
But hey, Apple users all over the world could be next in the indiscriminate hunt of government prosecution. Then, Allah forbid, all Android users could be next, and Armageddon would ensue after the government or some hacker exposes our naked pictures and raunchy texts all over the place for all to see online.

But really, it is a legitimate concern. Apple promised privacy for their customers ($$$), the customers expect a certain quality of product ($$$), and Apple can't go back on their sacred corporate word regarding customer service and marketing ($$$); they'll stand up for what is right and proper ($$$).
That is why, after the FBI waits two months to crack the phone, then goes to court for more warrants and an unlikely-to-exist backdoor program from Apple, and who knows what other legal wranglings, you'd think killing 14 people and injuring 22 would be enough to justify the whole spectacle and the "assault" on basic civil liberties. Except it isn't, because fuck-you-I-got-mine, privacy over all, no matter the individual case. It would be oh, such a devastating and perilous snowball from here on out.
Court oversight probably won't prevent other related mass murders anyway, so why even give it a shot.
 

numble

Member
Let's look at this as if it were not data, but a physical thing.

Apple builds a secure safe with a boobytrap mechanism that destroys the materials inside. The police confiscate this safe from criminals and believe there may be documents inside that may or may not have information regarding other crimes. They get a court order for Apple to disarm the boobytrap to preserve the documents so that the police can force their way into the safe -- not for the key or a way to crack the safe, just to disarm the destruction mechanism.

I don't think most reasonable people would have an issue with this.

It's the same thing here. The slippery slope argument is overused -- that's why court oversight was required. It prevents the slide down that slope.

Courts throughout the world operate differently, but their orders are considered equal in the eyes of the law.

This isn't really about disarming a boobytrap -- if the iPhone is as secure as Apple says it is, then the tool to "disarm the boobytrap" does the same job as the key or a mechanism to crack the safe.
 

Matt

Member
I don't really get the argument that a warrant for the data on a phone is less valid than a warrant to search someone's house.
 
Let's look at this as if it were not data, but a physical thing.

Apple builds a secure safe with a boobytrap mechanism that destroys the materials inside. The police confiscate this safe from criminals and believe there may be documents inside that may or may not have information regarding other crimes. They get a court order for Apple to disarm the boobytrap to preserve the documents so that the police can force their way into the safe -- not for the key or a way to crack the safe, just to disarm the destruction mechanism.

I don't think most reasonable people would have an issue with this.

It's the same thing here. The slippery slope argument is overused -- that's why court oversight was required. It prevents the slide down that slope.

I agree that this is a reasonable request from the judge, but I don't know if Apple's engineers can actually remove the "boobytrap".

It seems like a pretty difficult problem if the device really is end-to-end encrypted. It also calls into question what Apple does if they were able to find a way to meet the judge's request: whatever they found would be a huge security hole, and they would be obligated to fix it in a software update. I doubt the FBI would be particularly pleased with that.
 
I don't really get the argument that a warrant for the data on a phone is less valid than a warrant to search someone's house.

I don't think that's the issue. A legitimate warrant is fine for anything, a house or a phone. The issue is forcing an uninvolved third party to participate in, or develop methods for, getting where law enforcement has failed to. And if Apple is right that they have no method of recovering a locked phone, there is nothing they can do anyway.
 

Dr.Acula

Banned
The problem is that Apple wants governments, law firms, doctors' offices, banks, law enforcement, etc. to use their phones. All of these sectors have very demanding security requirements, and Apple wants to be able to meet them.

Now the government is saying: okay, now break this thing for us.

Apple is saying: hey, we've got billions in contracts from around the world precisely so that Japanese nuclear scientists can't get hacked by North Korea; the whole point is locking this down.
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
Wouldn't it be an option to obtain all saved data (including encrypted data) from the phone, try it ten times, restore the data to the phone and try again? If there are only 10,000 combinations, this seems like a viable option, and I don't see why you wouldn't be able to do that. After all, all the data must be stored on the phone somewhere, and given a few hardware buffs it shouldn't be a problem to copy it all (though accessing it in a meaningful way would of course mean having to break the encryption, which would not be worth the effort if guessing the code 10,000 times is sufficient).
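
For a sense of scale, here is a rough back-of-the-envelope sketch in Python. The ~80 ms per attempt is the passcode key-derivation time Apple has cited in its iOS security documentation; the lockout schedule is an approximation of iOS's escalating delays, which (along with the ten-try wipe) is exactly the safeguard the order asks Apple to disable:

# Brute-forcing a 4-digit passcode: 10,000 codes at ~80 ms per try
# (the key-derivation time Apple has cited). The delay schedule below
# approximates iOS's escalating lockouts and is an assumption.
ATTEMPT_COST = 0.08  # seconds of key derivation per passcode attempt

def brute_force_seconds(keyspace=10_000, with_delays=False):
    # Approximate lockout delays: none for tries 1-4, then 1, 5, 15,
    # 15 and 60 minutes, and 60 minutes for every try after that.
    delays = [0, 0, 0, 0, 60, 300, 900, 900, 3600]
    total = 0.0
    for attempt in range(1, keyspace + 1):
        total += ATTEMPT_COST
        if with_delays:
            total += delays[attempt - 1] if attempt <= len(delays) else 3600
    return total

print(f"delays removed: {brute_force_seconds() / 60:.0f} minutes")                 # ~13
print(f"delays intact:  {brute_force_seconds(with_delays=True) / 86400:.0f} days")  # ~416

With the safeguards off, a 4-digit code falls in minutes; with the delays on it takes over a year, and with the ten-wrong-tries wipe enabled brute force is off the table entirely. That asymmetry is what the court order is really about.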
 

The Real Abed

Perma-Junior
Wouldn't it be an option to obtain all saved data (including encrypted data) from the phone, try it ten times, restore the data to the phone and try again? If there are only 10,000 combinations, this seems like a viable option, and I don't see why you wouldn't be able to do that. After all, all the data must be stored on the phone somewhere, and given a few hardware buffs it shouldn't be a problem to copy it all (though accessing it in a meaningful way would of course mean having to break the encryption, which would not be worth the effort if guessing the code 10,000 times is sufficient).
You'll still need the Apple ID password.
 

Montresor

Member
I don't really get the argument that a warrant for the data on a phone is less valid than a warrant to search someone's house.

The great thing about the locking mechanism on an iPhone is that it is absolutely humanly impossible to break through the door. Not so much with the locking mechanism on a house.

I need Apple to stand its ground. It doesn't matter if this is for a terrorism investigation. They've built a beautiful platform that protects your privacy against any intrusion, government or otherwise. It should stay that way.
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
You'll still need the Apple ID password.
They are only asking to be given the leeway to try the code as often as they want, though, no?

That being said, Apple can probably help with that by giving them the hash they must have stored somewhere; then they "only" need to find a collision, which should make it easier for them.
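
Strictly speaking that would be recovering a preimage rather than finding a collision, but the idea is right: with the stored hash in hand, the guessing moves offline, away from any lockout. A minimal sketch using only Python's standard library (the salt, iteration count and wordlist are invented for illustration; Apple's real server-side scheme isn't public):

# Offline guessing against a salted, iterated password hash.
# Salt, iteration count and candidates are hypothetical examples.
import hashlib

SALT = b"example-salt"
ITERATIONS = 100_000

def hash_password(password: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), SALT, ITERATIONS)

# What the attacker holds: the stored hash of the unknown password.
stored = hash_password("hunter2")

# What the attacker does: hash candidates until one matches.
for candidate in ["123456", "password", "letmein", "hunter2"]:
    if hash_password(candidate) == stored:
        print("recovered:", candidate)
        break

The only brake on that loop is how expensive each hash is, which is why slow, iterated schemes like PBKDF2 exist in the first place.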
 

Josh7289

Member
Apple should hold steady. Security is only as strong as its weakest link. If the company behind iOS's security itself helps to circumvent it, then the basic kernel of trust at the center of all of Apple's encryption is gone, and with it the security of the entire platform is compromised.

This terrorist misused iOS's security for evil, but that does not suddenly make it okay to put the other billion iOS users at risk, the vast majority of whom are normal people who rely on the platform's security for legitimate purposes.
 

iLLmAtlc

Member
Let's dispel with the fiction that Apple doesn't know exactly what we're doing on our phones. They know exactly what we're doing.

The NSA does too. This is a sad piece of propaganda.
 

nynt9

Member
Let's dispel with the fiction that Apple doesn't know exactly what we're doing on our phones. They know exactly what we're doing.

The NSA does too. This is a sad piece of propaganda.

Let's dispel with the fiction that you know anything about how encryption works.
 

jabuseika

Member
The government can't force them, and Apple won't do anything.

Apple probably already did the math on the damage to the brand if they comply.
 