
Should Apple unlock killer's phone?

You can disable iCloud syncing, which was done in this case. For the couple of months leading up to the attack, nothing was sent up to iCloud, and it's whatever is on the phone from that period that the FBI wants to get at.

Correct. It's what's on the phone that they want. Apple gave them the cloud data up to the point syncing was turned off.
 
Having literally just come back from a digital forensics course that left me with the skills to get into phones and to seize and exhibit relevant data for criminal cases, we did, of course, discuss this case. For about five minutes. Every other police officer in the room was of the clear opinion that public safety outweighed any notion of collateral intrusion and proportionality.

As the sole dissenter in the class, it appeared quite obvious to me (even with my very limited knowledge of software "back doors") that the level of collateral intrusion in this case could be so large, and the ramifications so unpredictable, that I was boggled they couldn't see it. They appear to have fallen into the same trap as the FBI and Donald Trump: they see the collateral intrusion as a "one time" issue. Yes, by cracking this phone we might impinge on other people's privacy by gaining access to phone numbers and data sent or shared with the suspects, but (the thinking goes) this is more than outweighed by the necessity for public safety.

An honestly held and non-malicious mindset, but one clearly directed by tunnel vision and with no foresight that forcing Apple to create a back door compromises everyone, exposing millions of people to the risk of data theft, corruption, or intrusion. This would leave the long-standing threshold test of collateral intrusion as nothing more than laughable nomenclature. It would also set a precedent that officials could use to force other companies to provide back doors into encrypted apps.

The fight that the UK police have with apps and companies such as WhatsApp and BBM has already caused massive friction over their encryption policies, and our inability to intercept or retrieve relevant deleted data does stymie investigations. But I/we have a right to privacy under Article 8 of the Human Rights Act (currently) and, even as a detective, I still believe that breaching or circumventing that right has to be held to a higher standard than many investigators would apply. If Apple are forced into this, the ripples will push out over here and similar strong-arm tactics will no doubt be tried.

Hugo - If you want to keep a secret, you must also hide it from yourself
 
Since this is an instance where they physically have the device, there should be a way to hack directly into the storage and read off an image that they could in turn crack. Most forensic investigations are done with an image of the hard drive, not the drive itself, to preserve the original evidence.

Having the means to access the data physically is different from using a software backdoor into a system remotely.
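
As a rough sketch of that imaging practice (hypothetical paths; real acquisitions use write-blockers and dedicated tooling), the integrity check is just a hash comparison between the original and the copy:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file or raw device and return its SHA-256 hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

# If the two digests match, the image is a bit-for-bit copy and all further
# examination can happen on the image while the original stays sealed:
# e.g. sha256_of("/dev/sdb") == sha256_of("evidence.img")
```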
 
The phone is encrypted. They're not going to be able to brute-force that.
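
For a sense of scale, here's some back-of-the-envelope arithmetic (not specific to Apple's implementation, but iOS file encryption is AES-based with 256-bit keys):

```python
# Rough numbers only: what brute-forcing a 256-bit key would actually cost.
keyspace = 2 ** 256                    # possible AES-256 keys
guesses_per_second = 1e12              # a wildly generous trillion per second
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / guesses_per_second / seconds_per_year
print(f"{years:.2e} years to exhaust the keyspace")  # ~3.67e+57 years
```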
 
A very interesting thread, and, of course, my answer is:

Absolutely Not, for the reasons that have been correctly stated by many of you. But mostly because of the ever-so-slippery slope it would create.

However

There is simply No Way On Melekon's Groovy Planet that I believe that Apple does not have a way into its own technology, nor do I believe that the NSA could not get into it.

Remotely or Physically

No. Way.
 
I don't think for one moment that Apple, or any other company, doesn't have its own backdoors for testing software on the fly as it's written. You don't spend 20+ hours coding something only to dump it over one problem when you could back-door into your own code to troubleshoot it, especially in an environment where you're running the code you're testing. You can't just dump all that hard work.
 
That's not how software development works. You have test environments, and you don't deploy anything to a live environment without testing it thoroughly in at least the development and Quality Assurance environments. No matter how urgent the change.
 
But can't you back-door into the code you're writing in that environment?
 
I think you're using the term "backdoor" in a different way than what I think of when I hear the term.

To me, "backdoor" means something like allowing certain credentials to access a website or an environment without needing to provide a password, or to allow any account to be accessed with the same, specific password. Alternatively, to allow an admin user to access a different user's account by simulating the login process. (I've had to work with websites that used two of those three methods to allow access to user accounts by admin users.)

So yes, a developer can deploy code which would allow them to access any user's account - but you would first test it in a non-live environment, like any other software change, to make sure that you hadn't introduced some unwanted effect. And a copy of the current live code would stay in that test environment, until the next time you were working on modifications.
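
A toy sketch of that third method (all names hypothetical, not any real site's code): the admin is issued a valid session for another account, so no password ever changes hands:

```python
import secrets
from dataclasses import dataclass

@dataclass
class User:
    user_id: int
    is_admin: bool = False

sessions: dict[str, int] = {}  # session token -> user id

def login_as(admin: User, target_user_id: int) -> str:
    """Simulate the login flow for another user on an admin's behalf."""
    if not admin.is_admin:
        raise PermissionError("impersonation is restricted to admins")
    token = secrets.token_hex(16)      # same token format a normal login mints
    sessions[token] = target_user_id   # note: no password was ever checked
    return token

# Usage: an admin obtains a working session for user 42, password unknown.
token = login_as(User(user_id=1, is_admin=True), target_user_id=42)
```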
 
That's what I was getting at. If you wrote something, shouldn't you be able to get into your own system, being the creator?
 
If you coded it that way, sure. But it doesn't necessarily follow that Apple coded the software in a way that would make it easily accessible. For example - and it wouldn't surprise me if this were the case - every user's data in the cloud may be encrypted using a key that incorporates the device's passcode in some fashion. So if you don't have that passcode, you can't decrypt the user's data. And the passcode may not be stored anywhere other than on the device itself.

For example, suppose my phone number is 416-555-1234, and my passcode to unlock my phone is 4321. They might have combined those and used "12344321" as a key to encrypt all of my data in the cloud. So without my passcode, you can't decrypt anything. (Note: I do not, and never have, worked for Apple. I have no idea if this is anywhere close to how they encrypt data. I'm just putting out a simple scenario to illustrate the problem here.)
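
Rendering that toy scheme as actual code (again: purely illustrative, nothing like Apple's real design):

```python
import hashlib

def derive_key(phone_number: str, passcode: str) -> bytes:
    """Toy key derivation: last four digits of the number plus the passcode."""
    material = (phone_number[-4:] + passcode).encode()  # "1234" + "4321"
    return hashlib.sha256(material).digest()            # stretched to 256 bits

key = derive_key("416-555-1234", "4321")
# The server never stores the passcode, so without it there is nothing to
# derive the key from, and the ciphertext in the cloud stays opaque.
```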

Apple has said that they would have to completely rewrite the OS, and that such a rewrite would open up a backdoor into any user's account. If the scenario I've described above is anywhere close to how the OS actually works, then I can see how this would be true. And that's the problem here, the one that the people supporting the FBI aren't seeing: changing the OS would essentially allow anybody's data to be decrypted.

Now, I suppose it might be possible to install an OS rewrite on one device only, but again, doing that opens up a whole can of worms. What's to stop the FBI, or local law enforcement somewhere, from going to Apple and saying, "You did it that time - why not this time too? And the time after this time? In fact, why don't you just push that OS change out to all users so that we can decrypt everyone's data?"
 
I've read of a few companies that, when using encryption, create their keys by combining a random number with the MAC address of the device. Every piece of hardware is going to be different, which allows for millions of possible combinations. This is why people need to understand that there's really no way for the government, or even Apple, to know how to decrypt the phone without a big piece of the puzzle.
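
Something like this, presumably (an illustrative sketch of the scheme as described, not any particular vendor's code):

```python
import hashlib
import os
import uuid

def device_key() -> bytes:
    """Per-device key: a random value mixed with this machine's MAC address."""
    random_part = os.urandom(16)             # in practice generated once and
                                             # kept on the device, not recomputed
    mac = uuid.getnode().to_bytes(6, "big")  # the 48-bit MAC address
    return hashlib.sha256(random_part + mac).digest()

# No two devices share a key, and nobody (vendor or government) can
# recompute it without holding both ingredients.
print(device_key().hex())
```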
 
I heard it reported on the radio the other day that law enforcement agencies have already requested that Apple unlock nine other iPhones, none of them related to terrorism. However, I'm having trouble finding that story anywhere, because the one case is swamping the news and the others are buried in the mountain of stories about it.
 
The passcode is combined with a hardware key which only exists in that specific device.

The only way to access those keys physically is through a difficult process called decapping, in which the chips holding the data are carefully removed and then scanned with an electron microscope that reads the actual 1s and 0s of the keys.

Suffice it to say, this is both impractical and not terribly reliable (the risk of damaging the data during the removal process is very high).
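
That entanglement is the whole point: the passcode alone is weak, but it is stretched together with a hardware key that never leaves the chip. A conceptual sketch (the construction and iteration count are illustrative, not Apple's actual parameters):

```python
import hashlib
import os

HARDWARE_UID = os.urandom(32)  # stands in for the key fused into the chip;
                               # readable only by decapping, as noted above

def unlock_key(passcode: str) -> bytes:
    """Derive the unlock key by stretching the passcode with the hardware UID."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               HARDWARE_UID, iterations=100_000)

# A 4-digit passcode is only 10,000 guesses, trivial *if* you hold the
# hardware key. Off the device, you're guessing a 256-bit value instead.
key = unlock_key("4321")
```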
 