
Apple v. FBI


iacas


  • Administrator

Nicholas Weaver:

Quote

 

The same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn't yet in law enforcement's hand.  So the precedent the FBI seeks doesn't represent just "create and install malcode for this device in Law Enforcement possession" but rather "create and install malcode for this device".

Let us assume that the FBI wins in court and gains this precedent.  This does indeed solve the "going dark" problem as now the FBI can go to Apple, Cisco, Microsoft, or Google with a warrant and say "push out an update to this target".  Once the target's device starts running the FBI's update then encryption no longer matters, even the much stronger security present in the latest Apple devices.  So as long as the FBI identifies the target's devices before arrest there is no problem with any encryption.

But at what cost?

Currently, hacking a target has a substantial cost: it takes effort and resources.  This is one reason why I don't worry (much) about the FBI's Network Investigative Technique (NIT) malcode, they can only use it on suitably high value targets.  But what happens in a world where "hacking" by law enforcement is as simple as filling out some paperwork?

Almost immediately, the NSA is going to secretly request the same authority through the Foreign Intelligence Surveillance Court using a combination of 702 to justify targeting and the All Writs Act to mandate the necessary assistance.  How many honestly believe the FISC wouldn't rule in the NSA's favor after the FBI succeeds in getting the authority?

The NSA's admittedly restrictive definition of "foreign intelligence" target is not actually all that restrictive due to the "diplomatic" catch-all, a now unfortunately public cataloging of targets, and a close association with the GCHQ.  So already foreign universities, energy companies, financial firms, computer system vendors, governments, and even high net worth individuals could not trust US technology products as they would be susceptible to malicious updates demanded by the NSA.

 

The ACLU:

Quote

First, the government’s legal theory is unbounded and dangerous. The government believes it has the legal authority to force Apple into government service, even though the company does not actually possess the information the government is after. Of course, historically, the government has sought and obtained assistance from tech companies and others in criminal investigations—but only in obtaining information or evidence the companies already have access to.

The difference between those cases and Apple’s is a radical one. If Apple and other tech companies—whose devices we all rely upon to store incredibly private information—can be forced to hack into their customers’ devices, then it’s hard to imagine how any company could actually offer its consumers a secure product. And once a company has been forced to build a backdoor into its products, there’s no way to ensure that it’s only used by our government, as opposed to repressive regimes, cybercriminals or industrial spies.

Second, this debate is not about one phone—it’s about every phone. And it’s about every device manufactured by a U.S. company. If the government gets its way, then every device—your mobile phone, tablet or laptop—will carry with it an implicit warning from its manufacturer: “Sorry, but we might be forced to hack you.”

Some might accept that risk if it were possible to limit access to legitimate governmental purposes, overseen by a judge. But as Apple’s Cook points out, backdoors are uniquely dangerous: “Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.”

 

Erik J. Barzeski —  I knock a ball. It goes in a gopher hole. 🏌🏼‍♂️
Director of Instruction Golf Evolution • Owner, The Sand Trap .com • AuthorLowest Score Wins
Golf Digest "Best Young Teachers in America" 2016-17 & "Best in State" 2017-20 • WNY Section PGA Teacher of the Year 2019 :edel: :true_linkswear:

Check Out: New Topics | TST Blog | Golf Terms | Instructional Content | Analyzr | LSW | Instructional Droplets


  • Moderator

Here's a more detailed technical explanation of why Apple can't crack its own phones and what's being asked for.

Quote

The encryption chip on the iPhone uses a powerful algorithm called AES to protect customer data. Each iPhone has a unique number called an encryption key that is needed to scramble or unscramble the data on the iPhone. This key is 256 bits long — that is, a string of 256 1s and 0s — which means there are a trillion trillion trillion trillion trillion trillion possible values for an iPhone's encryption key. So if you wanted to crack the iPhone's encryption by "brute force" — guessing each possible encryption key in sequence until you find the right one — it would take many lifetimes even if every computer on the planet were working on the problem.

Apple has chosen not to keep copies of iPhone keys after the smartphones leave its factories. So if law enforcement made a copy of the data stored on an iPhone and brought it to Apple for help unscrambling it, it would be literally impossible for Apple to help.

...

And this is where the FBI has sought Apple's help. The FBI isn't asking Apple to directly unscramble the data on the iPhone — something Apple couldn't do if it wanted to. Rather, the FBI is demanding that Apple modify the software on Farook's iPhone to make it easier for the FBI to guess his passcode. The FBI wants Apple to disable delays between passcode guesses, disable the self-destruct feature, and allow passcodes to be entered electronically over a Wi-Fi network or using the iPhone's Lightning port. Taken together, these measures will allow the FBI to guess Farook's passcode much more quickly than it could have otherwise — and without worrying about triggering the phone's auto-wipe function.
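To put some rough numbers on that, here's a back-of-the-envelope sketch in Python. The guess rates below are assumptions for illustration, not Apple's published figures: brute-forcing the 256-bit AES key is hopeless, while brute-forcing a short passcode becomes practical once the retry delays and the 10-guess auto-wipe are out of the way.

```python
# Rough estimates only -- the guess rates are assumed for illustration.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# Brute-forcing the 256-bit AES key directly: 2^256 possible keys.
aes_keyspace = 2 ** 256
key_guesses_per_second = 10 ** 9 * 10 ** 12  # a billion machines, a trillion guesses/sec each (very generous)
years_for_aes = aes_keyspace / key_guesses_per_second / SECONDS_PER_YEAR
print(f"AES-256 key by brute force: ~{years_for_aes:.1e} years")  # astronomically large

# Brute-forcing the passcode instead, if the delays and auto-wipe were disabled
# and guesses could be fed in electronically (what the FBI is asking for).
passcode_guesses_per_second = 12.5  # assumed rate, limited by the phone's key-derivation hardware
for digits in (4, 6):
    worst_case_seconds = 10 ** digits / passcode_guesses_per_second
    print(f"{digits}-digit passcode: ~{worst_case_seconds / 3600:.1f} hours worst case")
```

In other words, the security of the data rests on the passcode rate-limiting and auto-wipe the FBI wants removed, not on anyone ever guessing the AES key itself.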

Here's the big picture:

Quote

If this were simply about Farook's phone and the hassle involved in helping the FBI pry it open, it's unlikely Apple would be taking such a big public stand. The concern is that the government is trying to take advantage of a particularly odious defendant to set a precedent that could have much broader implications.

For starters, although the hassle involved in complying with the FBI's request is considerable, once Apple engineers have done the necessary work of creating the custom software it will be much easier to comply with other law enforcement requests for the same service. Today's extraordinary request for an extraordinary suspect, in other words, could be tomorrow's routine request.

...

Apple has tried to tie the debate over the San Bernardino request to this larger debate over back doors, arguing that it shouldn't be forced to provide law enforcement with a back door into its products. But a crucial difference here is that Apple isn't being asked to proactively introduce a security vulnerability into every iPhone. Rather, it's being asked to help hack into the phone of a dead terrorism suspect.

The FBI, having heard the concerns of technology companies, has asked Apple to make custom software that is tied specifically to the device ID of Farook's iPhone.

...

In the San Bernardino case, the FBI is seeking access to data on an iPhone that's already in its possession. But Julian Sanchez, a civil liberties expert at the Cato Institute, argues that we shouldn't expect the government's requests to stop there. It would be even more useful for law enforcement if they could get Apple to use its software update functionality to install software on phones not in its possession. For example, the Drug Enforcement Administration could ask Apple to install software on a suspected drug kingpin's phone to record all his conversations and send the audio back to DEA headquarters.

...

Another worry, Sanchez says, is that if Apple develops software to help the US government bypass iPhone encryption, foreign governments are sure to take notice. And countries like China and Russia have a much more expansive idea of what constitutes a crime.

So if the FBI gets its way, the arms race between law enforcement and technology companies will only continue. Apple is clearly betting that whatever its stance might be, standing up for its own customers' privacy will be a hit in the marketplace.

http://www.vox.com/2016/2/17/11037748/fbi-apple-san-bernardino
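On the point above about the custom software being "tied specifically to the device ID of Farook's iPhone": here's a hedged sketch of what such a restriction might look like. The identifiers and function names are hypothetical, not Apple's actual firmware code, and critics point out that any such check could itself be patched out once the weakened build exists.

```python
# Hypothetical illustration only; identifiers and checks are invented for the example.
import hashlib

# Unique hardware identifier of the one device this build is authorized for,
# stored as a hash so the build doesn't trivially reveal the target.
AUTHORIZED_DEVICE_HASH = hashlib.sha256(b"F2L-EXAMPLE-DEVICE-ID").hexdigest()

def unlock_assist_allowed(device_id: bytes) -> bool:
    """Enable the weakened passcode path (no delays, no auto-wipe)
    only on the single device named in the court order."""
    return hashlib.sha256(device_id).hexdigest() == AUTHORIZED_DEVICE_HASH

if __name__ == "__main__":
    print(unlock_assist_allowed(b"F2L-EXAMPLE-DEVICE-ID"))  # True: the target phone
    print(unlock_assist_allowed(b"ANY-OTHER-DEVICE"))       # False: everywhere else
```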

The San Bernardino terrorist disabled iCloud backup a month and a half before the attack.

Quote

The FBI wants more information about who Farook and Malik communicated with and where they traveled before the attack, and they got a search warrant to obtain that information.

But the FBI isn't able to get into Farook's phone. Only the person with the passcode for an iPhone can access the data inside. After 10 attempts to break into the phone, the data is permanently erased. And in the San Bernardino case, even though Farook's phone was owned by his employer, who granted the FBI permission to search it, the only person with a passcode (that we know of) was Farook.

The FBI was able to get data from Farook's backups to iCloud, which isn't encrypted. According to a court filing, that data showed that Farook was in touch with some of the victims of the shooting in the months preceding it. But he disabled iCloud backup about six weeks before the shooting, which federal law enforcement officials argue shows he might have been trying to hide evidence from his phone.

http://www.vox.com/2016/2/17/11031902/apple-encryption-fbi-san-bernardino-backdoor


Steve

Kill slow play. Allow walking. Reduce ineffective golf instruction. Use environmentally friendly course maintenance.


Couldn't Apple simply open the phone for FBI use?

-Matt-

"does it still count as a hit fairway if it is the next one over"

DRIVER-Callaway FTiz__3 WOOD-Nike SQ Dymo 15__HYBRIDS-3,4,5 Adams__IRONS-6-PW Adams__WEDGES-50,55,60 Wilson Harmonized__PUTTER-Odyssey Dual Force Rossie II


  • Administrator
11 minutes ago, 14ledo81 said:

Couldn't Apple simply open the phone for FBI use?

No.

They don't know the guy's passphrase.

Erik J. Barzeski —  I knock a ball. It goes in a gopher hole. 🏌🏼‍♂️
Director of Instruction Golf Evolution • Owner, The Sand Trap .com • AuthorLowest Score Wins
Golf Digest "Best Young Teachers in America" 2016-17 & "Best in State" 2017-20 • WNY Section PGA Teacher of the Year 2019 :edel: :true_linkswear:

Check Out: New Topics | TST Blog | Golf Terms | Instructional Content | Analyzr | LSW | Instructional Droplets


34 minutes ago, 14ledo81 said:

Couldn't Apple simply open the phone for FBI use?

That's the main issue: they just don't have any way of unlocking the phones. If they already had the means to do it, I imagine there wouldn't be the controversy over doing it that there is now.

KICK THE FLIP!!

In the bag:
:srixon: Z355

:callaway: XR16 3 Wood
:tmade: Aeroburner 19* 3 hybrid
:ping: I e1 irons 4-PW
:vokey: SM5 50, 60
:wilsonstaff: Harmonized Sole Grind 56 and Windy City Putter


  • Administrator
14 minutes ago, Jeremie Boop said:

That's the main issue: they just don't have any way of unlocking the phones. If they already had the means to do it, I imagine there wouldn't be the controversy over doing it that there is now.

Right.

They also consider it incredibly "bad" (dangerous, a violation of privacy, a slippery slope, whatever you want to call it) to create that tool.

Erik J. Barzeski —  I knock a ball. It goes in a gopher hole. 🏌🏼‍♂️
Director of Instruction Golf Evolution • Owner, The Sand Trap .com • AuthorLowest Score Wins
Golf Digest "Best Young Teachers in America" 2016-17 & "Best in State" 2017-20 • WNY Section PGA Teacher of the Year 2019 :edel: :true_linkswear:

Check Out: New Topics | TST Blog | Golf Terms | Instructional Content | Analyzr | LSW | Instructional Droplets


Count me firmly with Obama on this (1st time I've said that, and last).  Apple has done it before, although not to this degree.  I have a hard time believing they could not secure the code required to get this done; something else is in play that none of us know about.

In my Bag: Driver: Titelist 913 D3 9.5 deg. 3W: TaylorMade RBZ 14.5 3H: TaylorMade RBZ 18.5 4I - SW: TaylorMade R7 TP LW: Titelist Vokey 60 Putter: Odyssey 2-Ball


  • Administrator
48 minutes ago, Gunther said:

Count me firmly with Obama on this (1st time I've said that, and last).  Apple has done it before, although not to this degree.  I have a hard time believing they could not secure the code required to get this done; something else is in play that none of us know about.

They haven't "done it before" and of course they could write the code. That's not the point.

Erik J. Barzeski —  I knock a ball. It goes in a gopher hole. 🏌🏼‍♂️
Director of Instruction Golf Evolution • Owner, The Sand Trap .com • AuthorLowest Score Wins
Golf Digest "Best Young Teachers in America" 2016-17 & "Best in State" 2017-20 • WNY Section PGA Teacher of the Year 2019 :edel: :true_linkswear:

Check Out: New Topics | TST Blog | Golf Terms | Instructional Content | Analyzr | LSW | Instructional Droplets


1 hour ago, iacas said:

They also consider it incredibly "bad" (dangerous, a violation of privacy, a slippery slope, whatever you want to call it) to create that tool.

We talk of this type of privacy as if it's something we once had. This is the first time, at least publicly, that we have privacy that the government can't access, even when it has a legitimate reason to do so.

I think the issue here is the decision of whether or not a company or industry should create encryption that isn't accessible by the government. More importantly, who should make that decision? Apple?


Maybe the government should get better hacking skills. ;)

Matt Dougherty, P.E.
 fasdfa dfdsaf 

What's in My Bag
Driver; :pxg: 0311 Gen 5,  3-Wood: 
:titleist: 917h3 ,  Hybrid:  :titleist: 915 2-Hybrid,  Irons: Sub 70 TAIII Fordged
Wedges: :edel: (52, 56, 60),  Putter: :edel:,  Ball: :snell: MTB,  Shoe: :true_linkswear:,  Rangfinder: :leupold:
Bag: :ping:


  • Administrator
5 minutes ago, chspeed said:

We talk of this type of privacy as if it's something we once had. This is the first time, at least publicly, that we have privacy that the government can't access, even when it has a legitimate reason to do so.

I think the issue here is the decision of whether or not a company or industry should create encryption that isn't accessible by the government. More importantly, who should make that decision? Apple?

It's not "encryption that's not accessible by the government."

It's encryption that's not accessible by anyone. Including Apple.

An important distinction.

Erik J. Barzeski —  I knock a ball. It goes in a gopher hole. 🏌🏼‍♂️
Director of Instruction Golf Evolution • Owner, The Sand Trap .com • AuthorLowest Score Wins
Golf Digest "Best Young Teachers in America" 2016-17 & "Best in State" 2017-20 • WNY Section PGA Teacher of the Year 2019 :edel: :true_linkswear:

Check Out: New Topics | TST Blog | Golf Terms | Instructional Content | Analyzr | LSW | Instructional Droplets


13 minutes ago, iacas said:

They haven't "done it before" and of course they could write the code. That's not the point.

Can they write the code for a "one time" use?

-Matt-

"does it still count as a hit fairway if it is the next one over"

DRIVER-Callaway FTiz__3 WOOD-Nike SQ Dymo 15__HYBRIDS-3,4,5 Adams__IRONS-6-PW Adams__WEDGES-50,55,60 Wilson Harmonized__PUTTER-Odyssey Dual Force Rossie II


6 minutes ago, iacas said:

It's not "encryption that's not accessible by the government."

It's encryption that's not accessible by anyone. Including Apple.

An important distinction.

I guess I don't really understand the distinction.

If I were hiding a terrorist in my garage, with a hypothetical lock for which a key doesn't yet exist but I know how to create one, couldn't the government force me to create the key (assuming they have a court order)? Even if that key could open up someone else's door too?

Is it really my decision just because I built the lock?


25 minutes ago, iacas said:

They haven't "done it before" and of course they could write the code. That's not the point.

They have helped law enforcement unlock phones 70 times, although it didn't require this code.

Of course they can write the code.  Apple's "concern" is they cannot secure it, i.e., it will leak.  I call BS on that.

In my Bag: Driver: Titelist 913 D3 9.5 deg. 3W: TaylorMade RBZ 14.5 3H: TaylorMade RBZ 18.5 4I - SW: TaylorMade R7 TP LW: Titelist Vokey 60 Putter: Odyssey 2-Ball


37 minutes ago, iacas said:

It's not "encryption that's not accessible by the government."

It's encryption that's not accessible by anyone. Including Apple.

An important distinction.

Big time. This is a wet dream for the NSA and once that door (a door that doesn't currently even exist) is open, even a crack, it's never closing again. Decisions like this change the world. 

Yours in earnest, Jason.
Call me Ernest, or EJ or Ernie.

PSA - "If you find yourself in a hole, STOP DIGGING!"

My Whackin' Sticks: :cleveland: 330cc 2003 Launcher 10.5*  :tmade: RBZ HL 3w  :nickent: 3DX DC 3H, 3DX RC 4H  :callaway: X-22 5-AW  :nike:SV tour 56* SW :mizuno: MP-T11 60* LW :bridgestone: customized TD-03 putter :tmade:Penta TP3   :aimpoint:


It's impossible to do what the FBI wants to do.  Encryption only works if it works for everybody.

45 minutes ago, chspeed said:

I guess I don't really understand the distinction.

If I were hiding a terrorist in my garage, with a hypothetical lock for which a key doesn't yet exist but I know how to create one, couldn't the government force me to create the key (assuming they have a court order)? Even if that key could open up someone else's door too?

Is it really my decision just because I built the lock?

 

You couldn't open the lock if you wanted to...

Tony  


:titleist:    |   :tmade:   |     :cleveland: 



53 minutes ago, Ernest Jones said:

 Decisions like this change the world. 

Agreed. Which is why we have a constitution and a government ostensibly elected by us, with checks and balances, to guide us.

I'm imagining a similar scenario, one in which instead of Tim Cook, we've got Jamie Dimon (CEO of JP Morgan), telling the FBI that he'd really like to hand over info on those money transfers to Syria, but he really can't, because the bank is now using encryption technology to hide the identity of the recipients. And sure, they could change that encryption technology, but then all their customers' transactions would be in danger.

I don't think we'd have the same outpouring of support we see for Apple. At least not on the left side of the political spectrum.

 

 


1 hour ago, Gunther said:

Of course they can write the code.  Apple's "concern" is they cannot secure it, i.e., it will leak.  I call BS on that.

Then I have to question your knowledge of encryption technology, specifically as it's implemented in recent iPhones. All of the tech people, everyone involved with the actual technology, are saying that if you open a backdoor for someone, it's inevitably going to be accessed by someone else.

In my bag:

Driver: Titleist TSi3 | 15º 3-Wood: Ping G410 | 17º 2-Hybrid: Ping G410 | 19º 3-Iron: TaylorMade GAPR Lo |4-PW Irons: Nike VR Pro Combo | 54º SW, 60º LW: Titleist Vokey SM8 | Putter: Odyssey Toulon Las Vegas H7

