Apple’s Huge iPhone Mistake—New Warning For 1 Billion Users

Apple has a very serious problem that has suddenly become a headline issue, undermining its claims about the iPhone’s security and privacy credentials. It turns out that what happens on your iPhone doesn’t always stay on your iPhone after all.

I have warned before about the dangerous flaw in Apple’s iPhone security when it comes to the private messages sent between its billion-plus users. “Privacy is built in from the beginning,” Apple says. “Powerful security features help prevent anyone except you from being able to access your information.” If only it were that clear-cut. Now a new warning from a very surprising source has hit the news.

iMessage is Apple’s stock end-to-end encrypted messenger. Designed to compete with WhatsApp, it offers much the same security—albeit only when communicating within Apple’s ecosystem. Message an Android user and you fall back to SMS, which is unacceptable in 2021—more on that later. But even when you think you’re secure, you’re probably wrong. iMessage has an alarming catch.

The issue is iCloud and the general backups you make from your iPhone. If you use Apple’s default, recommended settings, then you run Messages in iCloud—meaning you sync your messages across all your devices, and you also run a generic iCloud backup, meaning you save a copy of your phone’s data and settings to Apple’s cloud.


Here’s where it gets complicated. iMessage is secured by end-to-end encryption, the idea being that the keys to decrypt messages between you and those you message are only shared between you. That stops anyone intercepting your content. But in a bizarre twist, Apple stores a copy of those encryption keys in that iCloud backup, which it can access. That means the end-to-end encryption is actually fairly pointless.
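To make that risk concrete, here is a deliberately simplified toy model in Python. The homemade XOR stream cipher below is purely illustrative and nothing like Apple’s actual protocol; the point it demonstrates is structural: once a copy of the conversation key sits in a backup the provider can open, anyone with access to that backup can decrypt the messages, no interception required.

```python
import hashlib
from itertools import count

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 of key||counter. Illustrative only, not a real cipher.
    out = b""
    for i in count():
        if len(out) >= length:
            return out[:length]
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Symmetric: the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# End-to-end encryption means only the two endpoints hold conversation_key.
conversation_key = b"shared-between-the-two-endpoints-only"
ciphertext = xor_cipher(conversation_key, b"See you at 8")

# But if a copy of that key lands in a cloud backup the provider can open,
# anyone with access to the backup decrypts the same ciphertext trivially.
escrowed_key = conversation_key  # what an accessible backup amounts to
recovered = xor_cipher(escrowed_key, ciphertext)
```

The encryption itself is never broken in this scenario; the key simply leaks out the side door, which is why key storage matters as much as the cipher.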

This issue came to the fore this week, with the publication of a sensitive FBI document that advises on which messaging platforms its agents can most easily access. The iMessage issue was front and center: “if target uses iCloud backup, the encryption keys should also be provided with [lawful access] content return; can also acquire iMessages from iCloud returns if target has enabled Messages in iCloud.”

When it comes to security, WhatsApp’s assurances are clear: “We use end-to-end encryption so no one can read or listen to your personal conversations.” Apple’s wording is different—and the nuance is critical. “End-to-end encryption protects your iMessage conversations across all your devices,” it says, “so that there’s no way for Apple to read your messages when they’re in transit between devices.”

That “in transit” wording is doing a lot of work. It’s absolutely correct that messages traveling between phones are highly secure, but on the device and backed up to the cloud, that changes. This issue is fast becoming the most pressing one for secure messaging.

Despite its clearer assurances, WhatsApp didn’t do that well, with the FBI saying it can request contact lists and metadata “sent every 15 minutes” on who is messaging who. It can also access message content “if target is using an iPhone and iCloud backups enabled, iCloud returns may contain WhatsApp data to include message content.”

WhatsApp’s new encrypted backup feature is designed to resolve this exact issue, and WhatsApp even advises during the setup process that users should exclude its data from their iCloud backups. It turns out that little iCloud backup setting is bad news for private messaging all round—a major vulnerability Apple needs to address urgently.

Unsurprisingly, Signal comes out best in the document: “No message content, [just] date and time a user registered [and] last date of a user’s connectivity to the service.” Signal avoids cloud backups entirely and famously captures almost no metadata. Essentially, if the data doesn’t exist, it can’t be provided.

“It’s impossible to turn over data that we never had access to in the first place,” Signal says. “Signal doesn’t have access to your messages; your chat list; your groups; your contacts; your stickers; your profile name or avatar; or even the GIFs you search for.”

If you want to keep your messaging private, my advice is to use WhatsApp as your daily messenger, given its scale, but make sure you enable encrypted backups and disable the iCloud backup option. You should definitely use Signal where your contacts also have the app. If you’re an Android user, you can set Signal as your default messenger, such that it handles SMS as well—that’s a great option to have.

This has not been a good year for Apple and its iMessage platform. The storage of encryption keys in accessible backups is one mistake, but it has made two others as well, both of which significantly reduce the security and privacy of iMessage.

First, Apple’s decision to steer clear of Google’s now coordinated RCS rollout across Android is a bad move for users. It means that Apple users messaging Android contacts, or vice versa, have to resort to a third-party platform like WhatsApp or will default to non-secure SMS, a crazy situation in 2021.

Google has been gently pressing Apple to get on board with this SMS v2 upgrade, and we saw more of that recently with a Google Messages update that translates iMessage’s emoji reactions into something similar on an Android device. As token gestures for cross-platform interoperability go, it’s fairly pitiful, but it makes the point.

Second, Apple has also misstepped with its decision to add an on-device AI classifier to iMessage that warns minors sending or receiving explicit imagery. While this doesn’t technically breach end-to-end encryption, it does open a backdoor for outside interference within the otherwise secure messaging enclave, and it has been heavily criticized by security and privacy advocates as a result.

And let’s not forget Pegasusgate, where an iMessage compromise was implicated in zero-click attacks on Apple users, reportedly perpetrated using NSO technology. Apple says it has patched those vulnerabilities in iOS 15, but the episode was damaging nonetheless.

This is an interesting time for secure messaging. There has never been more awareness of the value in preserving privacy using end-to-end security, but at the same time there’s never been more pressure on tech providers to open those platforms to law enforcement. The publication of this FBI document shows that despite protests from more hawkish lawmakers, there’s a fair amount of access to data even now.

Pegasus and this lawful access revelation are both about endpoint compromise. Whether data sits on the device and is tapped by malware, or is backed up to the cloud, once end-to-end encrypted content reaches one of those ends, you need to take care it’s secure. And that means protecting your device and being mindful of what you back up, and where.

Meanwhile, for Apple and its security and privacy USP, this is another awkward set of headlines it could really do without. I have approached Apple for comment on the FBI document and its serious implications for the billion-plus iPhone users.

The stark truth is that Apple needs to change its iCloud approach as a matter of urgency: cease storing encryption keys in accessible backups, and avoid backing up end-to-end encrypted data unless its protection carries over or users have been specifically warned that their privacy is being compromised. This update is now critical.

Why You Should Delete Your Facebook App

A stark new warning for almost all iPhone users, as Facebook is suddenly caught “secretly” harvesting sensitive data without anyone realizing. And worse, there’s no way to stop this especially invasive tracking other than by deleting the app.

A week ago, I warned iPhone users that Facebook still captures location data using the metadata from your photos and your IP address, even if you set its location tracking to “never.” Facebook admits to this harvesting, refusing to be drawn on why it continues when users have specifically disabled location tracking.

Now security researchers have suddenly warned that Facebook goes even further, using the accelerometer on your iPhone to track a constant stream of your movements, which can easily be used to monitor your activities or behaviors at times of day, in particular places, or when interacting with its apps and services. Alarmingly, this data can even match you with people near you—whether you know them or not.

Just like the photo location data, the most serious issue here is that there is absolutely no transparency. You are not warned that this data is being tracked, there is no setting to enable or disable the tracking; in fact, there doesn’t seem to be any way to turn off the feature and stop Facebook (literally) in its tracks.


Researchers Talal Haj Bakry and Tommy Mysk warn that “Facebook reads accelerometer data all the time. If you don’t allow Facebook access to your location, the app can still infer your exact location only by grouping you with users matching the same vibration pattern that your phone accelerometer records.”
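The matching technique the researchers describe can be sketched with a toy example. The Python below is a hypothetical illustration, not their code or Facebook’s: it simulates accelerometer traces from two phones exposed to the same road vibration and one phone elsewhere, then uses a plain Pearson correlation to show how co-located devices stand out.

```python
import math

def correlation(a, b):
    # Pearson correlation between two equal-length sensor traces.
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    sd_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (sd_a * sd_b)

# Simulated vertical-axis accelerometer readings. Two phones on the same
# bus feel the same road vibration (plus small per-device noise); a third
# phone somewhere else records an unrelated pattern.
bus = [math.sin(0.7 * t) + 0.3 * math.sin(2.1 * t) for t in range(200)]
phone_a = [v + 0.05 * math.sin(5.3 * t) for t, v in enumerate(bus)]
phone_b = [v + 0.05 * math.cos(4.1 * t) for t, v in enumerate(bus)]
phone_elsewhere = [math.sin(1.9 * t + 1.0) for t in range(200)]

same_bus_score = correlation(phone_a, phone_b)           # close to 1.0
different_score = correlation(phone_a, phone_elsewhere)  # near 0.0
```

A real-world system would need far more robust signal processing, but the principle holds: shared physical vibration leaves a shared statistical fingerprint, no GPS required.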

The researchers say the issue impacts Facebook, Instagram and WhatsApp, albeit with WhatsApp, it’s possible to disable the feature and the platform assured me that no data ever leaves a user’s device. “In Facebook and Instagram,” Mysk told me, “it is not clear why the app is reading the accelerometer—I couldn’t find a way to disable it.” That means you need to delete the app and access Facebook via your browser instead.

Facebook is awkwardly exposed here, with Mysk telling me: “I tested TikTok, WeChat, iMessage, Telegram and Signal. They don’t do it.”

Given Facebook dominates iPhone social media installs, this will impact almost all the billion-plus iPhone users around the world. Facebook confirmed to me that “we use accelerometer data for features like shake-to-report, and to ensure certain kinds of camera functionality such as panning around for a 360-degree photo or for camera.”

“Although the accelerometer data seems to be innocuous,” Mysk says, “it’s jaw-dropping what apps can make up of these measurements. Apps can figure out the user’s heart rate, movements, and even precise location. Worse, all iOS apps can read the measurements of this sensor without permission. In other words, the user wouldn’t know if an app is measuring their heart rate while using the app.”


While there may be valid benefits in using the camera, this does not explain why your movements are tracked constantly, rather than only when those camera features are in use. It would be simple for Facebook to tap the accelerometer only when needed. As for the shake-to-report function, Facebook could use Apple’s built-in shake-gesture detection to limit how much data it pulls—but that’s not how Facebook operates. Worse, even when users toggle off this reporting feature in the Facebook app, Mysk told me, “nothing happens when you shake the phone, but the app continues to read the accelerometer.”

The researchers cite the example of a bus journey to show how such data might be used. “If you are on the bus and a passenger is sharing their precise location with Facebook,” they explain, “Facebook can easily tell that you are in the same location as the passenger. Both vibration patterns are going to be identical.”

If you think this is spurious, Facebook actually has a patent application to use wireless phone signals to connect strangers, and it even cites the example of just such a bus ride: “it can be advantageous to provide an approach for users, who have met or have likely met, to connect with one another if they so choose.” Remember, none of this information exists in isolation; Facebook’s trillion-dollar magic is joining the data dots. Put more simply, you know all those mysterious new friend suggestions…

“We tested several apps,” Mysk explains, “and Facebook and Instagram stood out. While Facebook reads the accelerometer all the time, Instagram only reads it when the user is texting in the DM. In addition, WhatsApp also reads the accelerometer by default to animate chat wallpapers. So, this puts these three apps together, and you wonder if they are matching vibration patterns among users. This can get nasty, and the way to end it is by protecting this valuable sensor with a permission.”

You need to remember that Facebook is a trillion-dollar empire built on data, and only data—with Facebook, it’s not so much a metaverse as a dataverse. If the company can use this data, combined with everything else it holds on you and those around you, then it will. Why would it suddenly decide to exercise restraint?

Just look at the staggering privacy labels behind Facebook’s iPhone app—while much of the data Facebook gathers comes from its platform and services, the data it can pull from the app simply adds more third-party information into its mix. All this is linked to your identity, nothing is wasted or thrown away.

As ESET’s Jake Moore warns, “this is, in clear terms, another violation which seems to have gone under the radar when scooping up yet more personal data from iPhones. Many people may not even think twice what sensors an iPhone has, let alone fully understand what this information can offer companies.”

This is another app permission issue. If you use the Facebook app on your iPhone, then you essentially give Facebook permission to access data and information on and about your phone. And while you can restrict some of this, there is other data—just as here with the accelerometer—that you will not know about.

Mysk and Haj Bakry have form for just such privacy exposures. They discovered the iOS clipboard issue that ultimately prompted Apple to change its settings and provide a clipboard warning, which has now led to Android 12 doing the same.

Just as then, Apple needs to act here. The accelerometer should not be a free-for-all, not when data giants such as Facebook can use this as yet another data point to feed into their algorithms, plotting social graphs and tracking locations and behaviors.


“All data which is personal and unique should be viewed as sensitive and must be protected,” Moore says. “This permission needs to be restricted along with other obtrusive data tracking especially if users were previously unaware this information was being analyzed.” And it’s that lack of awareness that is most critical here.

Apple has done a great job this year in preventing data abuses by the likes of Facebook and Google. App Tracking Transparency has already had a drastic impact on data-fueled revenues. In iOS 15, we have seen new privacy innovations around mail tracking, web anonymity and privacy reports. Now we have another simple update that Apple needs to develop, to clamp down on this clear-cut data abuse.