Here we go again: there’s yet another example of government surveillance targeting smartphones from Apple and Google, one that shows how sophisticated government-backed attacks have become and bolsters the argument for keeping mobile platforms locked down.
I don’t want to dwell too much on the news itself, but in brief it goes like this:
- Google’s Threat Analysis Group has released information about the hack.
- Italian surveillance company RCS Labs created the attack.
- The attack has been used in Italy and Kazakhstan, and possibly elsewhere.
- Some versions of the attack were carried out with the help of ISPs.
- On iOS, attackers abused Apple’s enterprise certification tools, which enable in-house app deployment.
- About nine different exploits were used.
The attack works like this: the target is sent a unique link that aims to trick them into downloading and installing a malicious app. In some cases, the attackers worked with an ISP to disable the target’s data connection, then tricked the target into downloading the app in order to restore that connection.
The zero-day exploits used in these attacks have been fixed by Apple, which had previously warned that malicious parties were abusing its systems that allow companies to distribute apps in-house. The disclosures tie in with recent Lookout Labs news about an enterprise-grade Android spyware called Hermit.
What is at risk?
The problem here is that surveillance technologies such as these have been commercialized. Capabilities that in the past were available only to governments are now also wielded by private contractors. That poses a risk, because highly confidential tools can be exposed, exploited, reverse engineered, and misused.
As Google said, “Our findings underscore the extent to which commercial surveillance vendors have proliferated capabilities historically only used by governments with the technical expertise to develop and operationalize exploits. This makes the Internet less safe and threatens the trust on which users depend.”
Not only do these private surveillance companies allow dangerous hacking tools to proliferate, they also put high-tech snooping facilities in the hands of governments — some of which seem to enjoy spying on dissidents, journalists, political opponents, and human rights workers.
An even greater danger is that Google is already tracking at least 30 spyware makers, which suggests the commercial surveillance-as-a-service industry is robust. It also means that it is now theoretically possible for even the least credible government to acquire tools for such purposes.
What are the risks?
The problem is that these close ties between privatized surveillance providers and cybercrime won’t always run in one direction. Those exploits — at least some of which seem difficult enough to discover that only governments would have the resources to do so — will eventually leak.
And while Apple, Google, and everyone else remain committed to a cat-and-mouse game of preventing such crime and shutting down exploits where they can, there is a risk that any government-mandated backdoor or device security flaw will eventually slip into commercial markets, from which it will reach criminals.
The European Data Protection Supervisor warned: “Revelations made about the Pegasus spyware raised very serious questions about the possible impact of modern spyware tools on fundamental rights, and particularly on the rights to privacy and data protection.”
That’s not to say there are no legitimate grounds for security research. Flaws exist in every system, and we need people who are motivated to identify them; security updates would not exist at all without the efforts of security researchers of various kinds. Apple pays up to six figures to researchers who identify vulnerabilities in its systems.
What happens now?
Earlier this year, the EU’s data protection regulator called for a ban on the use of NSO Group’s infamous Pegasus software. The call went further still, seeking an outright “ban on the development and deployment of spyware with the capability of Pegasus.”
NSO Group is now reportedly up for sale.
The EU also said that if such exploits were used in exceptional situations, companies such as NSO should be subject to regulatory oversight. As part of this, they would have to respect EU law, judicial review, and criminal procedural rights, and agree not to import illegal intelligence, not to abuse national security for political ends, and to support civil society.
In other words, these companies need to be brought under control.
What you can do
Following the revelations about NSO Group last year, Apple published the following best-practice recommendations to help mitigate such risks.
- Update devices to the latest software, which includes the latest security fixes.
- Protect devices with a passcode.
- Use two-factor authentication and a strong Apple ID password.
- Only install apps from the App Store.
- Use strong and unique passwords online.
- Do not click on links or attachments from unknown senders.
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
Copyright © 2022 IDG Communications, Inc.