Tales from the Orchard: Apple’s iOS 11.4 update with ‘USB Restricted Mode’ may defeat tools like GrayKey

The iOS 11.4 beta contains a new feature called USB Restricted Mode, designed to defeat physical data access by third parties — possibly with forensic firms like Grayshift and Cellebrite in mind.


“To improve security, for a locked iOS device to communicate with USB accessories you must connect an accessory via Lightning connector to the device while unlocked — or enter your device passcode while connected — at least once a week,” reads Apple documentation highlighted by security firm ElcomSoft. The feature actually made an appearance in iOS 11.3 betas, but like AirPlay 2 was removed from the finished code.

The change blocks use of the Lightning port for anything but charging if a device is left untouched for seven days. An iPhone or iPad will even refuse to sync with a computer running iTunes until iOS is unlocked with a passcode.

USB Restricted Mode may be intended to impose a seven-day window on when digital forensics specialists like Grayshift can break into a device, at least using any simple techniques. Those firms will often employ a “lockdown” record from a suspect’s computer to create a local backup of iPhone data, skipping passcode entry.

iOS 11 already has some restrictions on lockdown records, namely automatic expiration, and full-disk encryption that renders them useless if a device is rebooted. The 11.3 update shrank the life of iTunes pairing records to seven days.

ElcomSoft suggested that connecting a device to a paired accessory or computer could extend the Restricted Mode window, and centrally-managed hardware may already have that mode disabled.

“If the phone was seized while it was still powered on, and kept powered on in the meanwhile, then the chance of successfully connecting the phone to a computer for the purpose of making a local backup will depend on whether or not the expert has access to a non-expired lockdown file (pairing record),” ElcomSoft elaborated. “If, however, the phone is delivered in a powered-off state, and the passcode is not known, the chance of successful extraction is slim at best.”

The exact details of the hacking techniques used by Cellebrite and Grayshift’s GrayKey have been kept secret, so it’s possible they may still work after iOS 11.4 is released. The companies could however resort to more extreme methods to get at data, such as removing the flash memory from the devices, copying them, and using the copies to attack the password.


What do you think of Apple’s move to thwart hackers and the FBI? Sound off in the comments below!

Tales from the Orchard: Apple addresses iOS source code leak, says it appears to be tied to three-year-old software.


By Brian Heater of TechCrunch

Earlier this week, iOS source code showed up on GitHub, raising concerns that hackers could find a way to comb the material for vulnerabilities. Apple has confirmed with TechCrunch that the code appears to be real, but adds that it’s tied to old software. 

The material is gone now, courtesy of a DMCA notice Apple sent to GitHub, but the occurrence was certainly notable, given the tight grip the company traditionally has on such material. So, if the code was, indeed, what it purported to be, has the damage already been done?

Motherboard, which was among the first to note the code labeled “iBoot,” reached out to author Jonathan Levin, who confirmed that the code certainly looks real and called it “a huge deal.” While the available code appears to be pretty small, it could certainly offer some unique insight into how Apple works its magic.

“Old source code from three years ago appears to have been leaked,” the company said in a statement provided to TechCrunch, “but by design the security of our products doesn’t depend on the secrecy of our source code. There are many layers of hardware and software protections built into our products, and we always encourage customers to update to the newest software releases to benefit from the latest protections.”

Much of the security concern is mitigated by the fact that the code appears to be tied to iOS 9, a version of the operating system released three-and-a-half years ago. Apple’s almost certainly tweaked significant portions of the available code since then, and the company’s own numbers show that a large majority of users (93 percent) are running iOS 10 or later. But could the commonalities offer enough insight to pose a serious potential threat to iPhone users?

Security researcher Will Strafach told TechCrunch that the code is compelling for the information it gives hackers into the inner workings of the boot loader. He added that Apple’s probably not thrilled with the leak due to intellectual property concerns (see: the DMCA request referenced above), but this information ultimately won’t have much if any impact on iPhone owners.

“In terms of end users, this doesn’t really mean anything positive or negative,” Strafach said in an email. “Apple does not use security through obscurity, so this does not contain anything risky, just an easier to read format for the boot loader code. It’s all cryptographically signed on end user devices, there is no way to really use any of the contents here maliciously or otherwise.”

In other words, Apple’s multi-layered approach to keeping iOS secure involves a lot more safeguards than what you’d see in a leak like this, however it may have made its way to GitHub. Of course, as Strafach correctly points out, the company’s still probably not thrilled about the optics around having had this information in the wild — if only for a short while.

Do you think this code leak warrants concern? Sound off in the comments below!

Tales from the Orchard: FBI Hacker Says Apple Are ‘Jerks’ and ‘Evil Geniuses’ for Encrypting iPhones

An FBI forensic expert lambasted Apple for making iPhones hard to hack into.


By Lorenzo Franceschi-Bicchierai of Motherboard at Vice.com


Ever since Apple made encryption default on the iPhone, the FBI has been waging a war against encryption, complaining that cryptography so strong the company itself can’t break it makes it harder to catch criminals and terrorists.
On Wednesday, at the International Conference on Cyber Security in Manhattan, FBI forensic expert Stephen Flatley lashed out at Apple, calling the company “jerks” and “evil geniuses” for making his and his colleagues’ investigative work harder. For example, Flatley complained that Apple recently made password guesses slower, changing the hash iterations from 10,000 to 10,000,000.

That means, he explained, that “password attempts speed went from 45 passwords a second to one every 18 seconds,” referring to the difficulty of cracking a password using a “brute force” method in which every possible permutation is tried. There are tools that can input thousands of passwords in a very short period of time—if the attempts per minute are limited, it becomes much harder and slower to crack.

“Your crack time just went from two days to two months,” Flatley said.
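Flatley’s figures can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below (the 4- and 6-digit passcode spaces are our own illustrative assumption, not something from his remarks) shows how a 1,000-fold increase in hash iterations stretches a worst-case exhaustive search:

```python
# Back-of-the-envelope brute-force timing, using the attempt rates quoted above.
# Passcode-space sizes (4- and 6-digit, all numeric) are assumptions for illustration.

def crack_time_seconds(space_size: int, attempts_per_second: float) -> float:
    """Worst-case time to try every passcode in the space."""
    return space_size / attempts_per_second

FAST = 45.0        # ~45 attempts/sec at 10,000 hash iterations (per the quote)
SLOW = 1.0 / 18.0  # ~1 attempt every 18 sec at 10,000,000 iterations

for digits in (4, 6):
    space = 10 ** digits  # number of possible all-numeric passcodes
    fast_days = crack_time_seconds(space, FAST) / 86_400
    slow_days = crack_time_seconds(space, SLOW) / 86_400
    print(f"{digits}-digit passcodes: {fast_days:.2f} days -> {slow_days:.1f} days worst case")
```

By these rates, exhausting a six-digit numeric passcode goes from a matter of hours to the better part of a year, which is the effect Flatley is describing, even if his “two days to two months” soundbite doesn’t scale exactly with the 1,000x iteration increase.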

“At what point is it just trying to one up things and at what point is it to thwart law enforcement?” he added. “Apple is pretty good at evil genius stuff.”

On the other hand, Flatley repeatedly praised the Israeli company Cellebrite, which sells hacking devices and technologies to law enforcement agencies around the world. Flatley said that they are the ones who can counter Apple’s security technology.

“If you have another evil genius, Cellebrite, then maybe we can get into that front,” he said, facetiously coughing as he said “Cellebrite.”

Flatley’s statements come a day after FBI director Christopher Wray renewed former director James Comey’s rhetorical war against encryption, calling it an “urgent public safety issue.”

Cybersecurity experts and civil liberties organizations, meanwhile, have long made the case that iPhone encryption keeps the average consumer’s data safe from hackers and authoritarian surveillance, a net benefit for society.
