iacas

Apple v. FBI

Note: This thread is 1674 days old. We appreciate that you found this thread instead of starting a new one, but if you plan to post here please make sure it's still relevant. If not, please start a new topic. Thank you!

275 posts / 15,540 views


Nicholas Weaver:

Quote

 

The same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn't yet in law enforcement's hands.  So the precedent the FBI seeks doesn't represent just "create and install malcode for this device in law enforcement's possession" but rather "create and install malcode for this device".

Let us assume that the FBI wins in court and gains this precedent.  This does indeed solve the "going dark" problem as now the FBI can go to Apple, Cisco, Microsoft, or Google with a warrant and say "push out an update to this target".  Once the target's device starts running the FBI's update then encryption no longer matters, even the much stronger security present in the latest Apple devices.  So as long as the FBI identifies the target's devices before arrest there is no problem with any encryption.

But at what cost?

Currently, hacking a target has a substantial cost: it takes effort and resources.  This is one reason why I don't worry (much) about the FBI's Network Investigative Technique (NIT) malcode: they can only use it on suitably high-value targets.  But what happens in a world where "hacking" by law enforcement is as simple as filling out some paperwork?

Almost immediately, the NSA is going to secretly request the same authority through the Foreign Intelligence Surveillance Court, using a combination of Section 702 to justify targeting and the All Writs Act to mandate the necessary assistance.  How many honestly believe the FISC wouldn't rule in the NSA's favor after the FBI succeeds in getting the authority?

The NSA's admittedly restrictive definition of "foreign intelligence" target is not actually all that restrictive, due to the "diplomatic" catch-all, a now unfortunately public cataloging of targets, and a close association with GCHQ.  So already foreign universities, energy companies, financial firms, computer system vendors, governments, and even high-net-worth individuals could not trust US technology products, as they would be susceptible to malicious updates demanded by the NSA.

 

The ACLU:

Quote

First, the government’s legal theory is unbounded and dangerous. The government believes it has the legal authority to force Apple into government service, even though the company does not actually possess the information the government is after. Of course, historically, the government has sought and obtained assistance from tech companies and others in criminal investigations—but only in obtaining information or evidence the companies already have access to.

The difference between those cases and Apple’s is a radical one. If Apple and other tech companies—whose devices we all rely upon to store incredibly private information—can be forced to hack into their customers’ devices, then it’s hard to imagine how any company could actually offer its consumers a secure product. And once a company has been forced to build a backdoor into its products, there’s no way to ensure that it’s only used by our government, as opposed to repressive regimes, cybercriminals or industrial spies.

Second, this debate is not about one phone—it’s about every phone. And it’s about every device manufactured by a U.S. company. If the government gets its way, then every device—your mobile phone, tablet or laptop—will carry with it an implicit warning from its manufacturer: “Sorry, but we might be forced to hack you.”

Some might accept that risk if it were possible to limit access to legitimate governmental purposes, overseen by a judge. But as Apple’s Cook points out, backdoors are uniquely dangerous: “Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.”

 


Here's a more detailed technical explanation of why Apple can't crack its own phones and what's being asked for.

Quote

The encryption chip on the iPhone uses a powerful algorithm called AES to protect customer data. Each iPhone has a unique number called an encryption key that is needed to scramble or unscramble the data on the iPhone. This key is 256 bits long — that is, a string of 256 1s and 0s — which means there are more than a trillion trillion trillion trillion trillion trillion possible values for an iPhone's encryption key. So if you wanted to crack the iPhone's encryption by "brute force" — guessing each possible encryption key in sequence until you find the right one — it would take many lifetimes even if every computer on the planet were working on the problem.

Apple has chosen not to keep copies of iPhone keys after the smartphones leave its factories. So if law enforcement made a copy of the data stored on an iPhone and brought it to Apple for help unscrambling it, it would be literally impossible for Apple to help.

...

And this is where the FBI has sought Apple's help. The FBI isn't asking Apple to directly unscramble the data on the iPhone — something Apple couldn't do if it wanted to. Rather, the FBI is demanding that Apple modify the software on Farook's iPhone to make it easier for the FBI to guess his passcode. The FBI wants Apple to disable delays between passcode guesses, disable the self-destruct feature, and allow passcodes to be entered electronically over a Wi-Fi network or using the iPhone's Lightning port. Taken together, these measures would allow the FBI to guess Farook's passcode much more quickly than it could have otherwise — and without worrying about triggering the phone's auto-wipe function.
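To put the quoted numbers in perspective, here's a back-of-the-envelope sketch. The guess rate for the AES key and the ~80 ms per-passcode-attempt figure are illustrative assumptions for this sketch, not Apple's published values:

```python
# 1) Brute-forcing the 256-bit AES key directly is hopeless,
#    even at an absurdly generous guess rate.
key_space = 2 ** 256            # possible AES-256 keys (~1.2e77)
guesses_per_second = 1e18       # assume a quintillion guesses per second
seconds_per_year = 60 * 60 * 24 * 365
years = key_space / guesses_per_second / seconds_per_year
print(f"{years:.2e} years to exhaust the AES key space")   # ~3.67e+51 years

# 2) Brute-forcing a short passcode is trivial *if* the guess delays
#    and the 10-try auto-wipe are removed, as the FBI requested.
passcode_space = 10 ** 4        # a 4-digit numeric passcode
seconds_per_guess = 0.08        # assumed hardware-enforced per-try cost
minutes = passcode_space * seconds_per_guess / 60
print(f"~{minutes:.0f} minutes to try every 4-digit passcode")  # ~13 minutes
```

The asymmetry is the whole point: the key itself is unguessable, so the passcode and the protections wrapped around it are the only viable attack surface — which is exactly what the FBI's request targets.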

Here's the big picture:

Quote

If this were simply about Farook's phone and the hassle involved in helping the FBI pry it open, it's unlikely Apple would be taking such a big public stand. The concern is that the government is trying to take advantage of a particularly odious defendant to set a precedent that could have much broader implications.

For starters, although the hassle involved in complying with the FBI's request is considerable, once Apple engineers have done the necessary work of creating the custom software it will be much easier to comply with other law enforcement requests for the same service. Today's extraordinary request for an extraordinary suspect, in other words, could be tomorrow's routine request.

...

Apple has tried to tie the debate over the San Bernardino request to this larger debate over back doors, arguing that it shouldn't be forced to provide law enforcement with a back door into its products. But a crucial difference here is that Apple isn't being asked to proactively introduce a security vulnerability into every iPhone. Rather, it's being asked to help hack into the phone of a dead terrorism suspect.

The FBI, having heard the concerns of technology companies, has asked Apple to make custom software that is tied specifically to the device ID of Farook's iPhone.

...

In the San Bernardino case, the FBI is seeking access to data on an iPhone that's already in its possession. But Julian Sanchez, a civil liberties expert at the Cato Institute, argues that we shouldn't expect the government's requests to stop there. It would be even more useful for law enforcement if they could get Apple to use its software update functionality to install software on phones not in its possession. For example, the Drug Enforcement Administration could ask Apple to install software on a suspected drug kingpin's phone to record all his conversations and send the audio back to DEA headquarters.

...

Another worry, Sanchez says, is that if Apple develops software to help the US government bypass iPhone encryption, foreign governments are sure to take notice. And countries like China and Russia have a much more expansive idea of what constitutes a crime.

So if the FBI gets its way, the arms race between law enforcement and technology companies will only continue. Apple is clearly betting that whatever its stance might be, standing up for its own customers' privacy will be a hit in the marketplace.

http://www.vox.com/2016/2/17/11037748/fbi-apple-san-bernardino

The San Bernardino terrorist disabled iCloud backup a month and a half before the attack.

Quote

The FBI wants more information about who Farook and Malik communicated with and where they traveled before the attack, and they got a search warrant to obtain that information.

But the FBI isn't able to get into Farook's phone. Only the person with the passcode for an iPhone can access the data inside. After 10 attempts to break into the phone, the data is permanently erased. And in the San Bernardino case, even though Farook's phone was owned by his employer, who granted the FBI permission to search it, the only person with a passcode (that we know of) was Farook.

The FBI was able to get data from Farook's backups to iCloud, which isn't encrypted. According to a court filing, that data showed that Farook was in touch with some of the victims of the shooting in the months preceding it. But he disabled iCloud backup about six weeks before the shooting, which federal law enforcement officials argue shows he might have been trying to hide evidence from his phone.

http://www.vox.com/2016/2/17/11031902/apple-encryption-fbi-san-bernardino-backdoor
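The arithmetic behind that 10-attempt limit is straightforward: with the auto-wipe setting on, an attacker gets at most ten tries before the data is gone. A quick sketch (the passcode lengths are illustrative):

```python
# Chance of guessing a numeric passcode within the 10 tries allowed
# before the phone's auto-wipe triggers (lengths are illustrative).
for digits in (4, 6):
    space = 10 ** digits
    p = 10 / space
    print(f"{digits}-digit passcode: {p:.4%} chance in 10 tries")
# 4-digit passcode: 0.1000% chance in 10 tries
# 6-digit passcode: 0.0010% chance in 10 tries
```

That's why the FBI's request centers on disabling the wipe and the guess delays: without those protections gone, guessing is effectively useless.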


34 minutes ago, 14ledo81 said:

Couldn't Apple simply open the phone for FBI use?

That's the main issue: they just don't have any way of unlocking the phones. If they already had the means to do it, I imagine there wouldn't be the controversy there is now.


14 minutes ago, Jeremie Boop said:

That's the main issue: they just don't have any way of unlocking the phones. If they already had the means to do it, I imagine there wouldn't be the controversy there is now.

Right.

They also consider it incredibly "bad" (dangerous, a violation of privacy, a slippery slope, whatever you want to call it) to create that tool.


Count me firmly with Obama on this (the first time I've said that, and the last).  Apple has done it before, although not to this degree.  I have a hard time believing they could not secure the code required to get this done; something else is in play that none of us know about.


48 minutes ago, Gunther said:

Count me firmly with Obama on this (the first time I've said that, and the last).  Apple has done it before, although not to this degree.  I have a hard time believing they could not secure the code required to get this done; something else is in play that none of us know about.

They haven't "done it before" and of course they could write the code. That's not the point.


1 hour ago, iacas said:

They also consider it incredibly "bad" (dangerous, a violation of privacy, a slippery slope, whatever you want to call it) to create that tool.

We talk of this type of privacy as if it's something we once had. This is the first time, at least publicly, that we have privacy that the government can't access, even if it has a reasonable reason to do so.

I think the issue here is the decision of whether or not a company or industry should create encryption that isn't accessible by the government. More importantly, who should make that decision? Apple?

Edited by chspeed


5 minutes ago, chspeed said:

We talk of this type of privacy as if it's something we once had. This is the first time, at least publicly, that we have privacy that the government can't access, even if it has a reasonable reason to do so.

I think the issue here is the decision of whether or not a company or industry should create encryption that isn't accessible by the government. More importantly, who should make that decision? Apple?

It's not "encryption that's not accessible by the government."

It's encryption that's not accessible by anyone. Including Apple.

An important distinction.


13 minutes ago, iacas said:

They haven't "done it before" and of course they could write the code. That's not the point.

Can they write the code for a "one time" use?


6 minutes ago, iacas said:

It's not "encryption that's not accessible by the government."

It's encryption that's not accessible by anyone. Including Apple.

An important distinction.

I guess I don't really understand the distinction.

If I was hiding a terrorist in my garage, with a hypothetical lock for which a key doesn't yet exist, but I know how to create the key, can't the government force me to create the key (assuming they have a court order)? Even if that key can open up someone else's door too?

Is it really my decision just because I built the lock?

Edited by chspeed


25 minutes ago, iacas said:

They haven't "done it before" and of course they could write the code. That's not the point.

They have helped law enforcement unlock phones 70 times, although it didn't require this code.

Of course they can write the code.  Apple's "concern" is they cannot secure it, i.e., it will leak.  I call BS on that.


37 minutes ago, iacas said:

It's not "encryption that's not accessible by the government."

It's encryption that's not accessible by anyone. Including Apple.

An important distinction.

Big time. This is a wet dream for the NSA and once that door (a door that doesn't currently even exist) is open, even a crack, it's never closing again. Decisions like this change the world. 


It's impossible to do what the FBI wants to do safely.  Encryption only works if it works for everybody.

45 minutes ago, chspeed said:

I guess I don't really understand the distinction.

If I was hiding a terrorist in my garage, with a hypothetical lock for which a key doesn't yet exist, but I know how to create the key, can't the government force me to create the key (assuming they have a court order)? Even if that key can open up someone else's door too?

Is it really my decision just because I built the lock?

 

You couldn't open the lock if you wanted to...


53 minutes ago, Ernest Jones said:

 Decisions like this change the world. 

Agreed. Which is why we have a constitution and a government ostensibly elected by us, with checks and balances, to guide us.

I'm imagining a similar scenario, one in which instead of Tim Cook, we've got Jamie Dimon (CEO of JPMorgan Chase) telling the FBI that he'd really like to hand over info on those money transfers to Syria, but he really can't, because the bank is now using encryption technology to hide the identity of the recipients. And sure, they could change that encryption technology, but then all their customers' transactions would be in danger.

I don't think we'd have the same outpouring of support we see for Apple. At least not on the left side of the political spectrum.

 

 

Edited by chspeed


1 hour ago, Gunther said:

Of course they can write the code.  Apple's "concern" is they cannot secure it, i.e., it will leak.  I call BS on that.

Then I have to question your knowledge of encryption technology, specifically as it's implemented in recent iPhones. All of the tech people, everyone involved with the actual technology, are saying that if you open a backdoor for someone, it's inevitably going to be accessed by someone else.

