How Apple's Emergency Spyware Snafu Sets a Dangerous Precedent for Average Users
With Analyst Robert Bateman
Today on That Tech Pod, Laura and Gabi talk to analyst and research director for privacy, data protection, and security at GRC World Forums, Robert Bateman, about the recent emergency update from Apple.
Listen here!
Preview today’s episode below!
Do you prefer to read our conversation? No problem! See full transcript below (please forgive typos):
Photo by Szabo Viktor on Unsplash
That Tech Pod
Hey, everybody, we have a super amazing bonus episode today. Laura, what do we have going on? Today on That Tech Pod, we do have a special bonus episode where we will dive into everything Apple. On this episode, we will go over Apple's most recent emergency update and past controversy around scanning iCloud photo uploads for child sexual abuse material, and break down whether Apple is the best option or whether Google outshines Apple when it comes to privacy. To help us do this, we have Rob Bateman. Rob is an analyst and research director at GRC World Forums. So let's start with this most recent Apple vulnerability. So just this week, this is what happened, and I'm reading from Computerworld: on Monday, Apple issued emergency security updates for iOS, macOS, and other operating systems to plug a hole that some researchers claimed had been used to plant spyware on a Saudi political activist's device by NSO Group, an Israeli seller of spyware and surveillance software to governments and their security agencies. So there's a lot going on there. Can you break that down for us? And what's the latest on all of that?
Robert Bateman
Sure. And hi, Gabi and Laura, thanks for having me on. It's great to be here. So this discovery, it's a vulnerability discovered by a group called Citizen Lab, and it actually came up last month. Citizen Lab is a research group from the University of Toronto. Now, this group discovered that the iPhone is vulnerable to being hacked by a piece of software known as Pegasus. This is spyware developed by an Israeli company called NSO Group, as you mentioned. Last month, the Guardian ran an explosive report about NSO Group and how governments around the world have been using Pegasus to spy on their citizens. So just for background, NSO Group says that this software is intended to be used for fighting suspected terrorists and pedophiles and so on. But the Guardian discovered that it was being used against innocent citizens, journalists, political activists, civil society groups and so on. This software is really very intrusive. If it's on your phone, you're liable to have your texts read and your calls recorded, and they can even turn on your camera and record video without you knowing. And it was thought that iPhones were resistant to having this software planted on them, thanks to a protection that Apple had in place known as BlastDoor. Citizen Lab found that NSO Group could get around this BlastDoor protection and install Pegasus on iPhones. They're calling this exploit ForcedEntry for that reason. So this is a zero-click exploit. What that means is that users don't have to download a file or follow a link. There's no phishing involved to get this spyware installed on your phone. You wouldn't even know it was being installed. You wouldn't know anything had gone wrong. So Citizen Lab discovered Pegasus last month, initially on the phone of a Bahraini activist, and then more recently they found it on a Saudi activist's phone. And on September 7, they sent more details about this vulnerability to Apple, and they think people's phones have been vulnerable to it since at least March. So then Apple sent out this update on Monday the 13th and told users to patch their iPhones urgently. And that's where we are now.
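If you're wondering whether a given device has picked up the fix, public reporting says the patched release was iOS 14.8, shipped on September 13. Here is a minimal Swift sketch that checks whether the running OS is at or above that version; the 14.8 figure is an assumption drawn from that reporting, not something stated in the episode.

```swift
import Foundation

// Minimal sketch, assuming the fix shipped in iOS/iPadOS 14.8 (per public reporting):
// check whether the running OS is at or above that release.
let patchedRelease = OperatingSystemVersion(majorVersion: 14, minorVersion: 8, patchVersion: 0)

if ProcessInfo.processInfo.isOperatingSystemAtLeast(patchedRelease) {
    print("This OS is at or above the release that patched the exploit.")
} else {
    print("This OS predates the patch. Update via Settings > General > Software Update.")
}
```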
That Tech Pod
Yeah, and how are people reacting to this news? I know that the EU Commissioner has come out and said that we need to tighten up privacy laws and is calling for urgent action in response to this spyware. So can you just break down for us some of the reactions that we're seeing from this, in the US and globally?
Robert Bateman
Well, there's been a lot of panic about this exploit, and the cybersecurity community has been urging people to update very urgently. Apple has actually been criticized for not having pushed this update loudly enough. There have been some people saying that Apple has been a little bit slow on this. I mean, they patched the vulnerability fairly quickly, but some say they could have been more urgent in pushing the update to their users. On the other hand, the software is very much targeted. It has been used against thousands of people, but the chances of most people having it installed on their iPhone are very low. So some people, a BBC reporter named Joe Tidy, for example, mentioned the other day that while it's very serious, the average person doesn't necessarily have to worry about having Pegasus on their phone. I think it's very important to keep your phone as secure as possible. And we have to remember that some very vulnerable people in risky situations are targeted by this software, so it's extremely important to act on it as soon as you can.
That Tech Pod
Rob, being a specialist in this area, is this something that you've seen before? Have you seen any kind of forced-entry exploit specific to Apple or to any of the other providers in this area?
Robert Bateman
So Apple didn't come out very well from the Pegasus revelations. In some of the investigations, basically, there was a list of targets or potential targets of Pegasus software, and when researchers asked to see some of these people's phones for research purposes, they found that many of them were indeed infected with Pegasus. Now, it's always a cat-and-mouse game with this sort of thing. Someone will discover a vulnerability in the iPhone, Apple will patch it, and then someone will find a new vulnerability and Apple will patch that, too. But this BlastDoor protection was thought to be effective and has now proved not to be. So I think that's why this is a particularly big deal for Apple.
That Tech Pod
So I know you mentioned before that right now average users don't really have to worry about this, that there's very little chance it's impacting them personally. But cyber and privacy experts are pretty alarmed by it. Do you think this sets a dangerous precedent? Maybe right now average citizens don't really have to worry about it, but down the line, is this actually a red flag or a red alarm?
Photo by Lianhao Qu on Unsplash
Robert Bateman
Absolutely. I mean, it's always a question of the individual risk and also the societal risk. While I might not have much chance of having my phone infected with Pegasus, because this government at this time probably is not interested in what I'm doing with it, that's not to say that some future government won't be interested in what I'm doing on my phone. And indeed, these vulnerabilities are open to anyone smart enough to exploit them. So it's really not enough to say that the average person doesn't have to worry about these problems. There are non-average people out there who are at risk of being caught by these exploits. And once the infrastructure is in place for these sorts of hacking and privacy invasions by governments, then there's potential for some very... well, let's put it this way. Citizen Lab describes NSO Group as a company that conducts despotism as a service, so they can act on behalf of governments to tap all sorts of people's phones. And that is a very worrying thing.
That Tech Pod
Yeah, you heard it here. All you average people, be scared as well. Or don't. It may not affect any of us, but it does kind of make us go back to some of the other things that have been going on when it comes to Apple and some of their other big privacy and security headaches. To our knowledge, there was a plan to scan people's iCloud photo uploads for child sexual abuse material. This is a very controversial issue. But Rob, can you go through that situation in a little more detail? Because it kind of makes you question things: yes, you had the most recent vulnerability, but it's a reminder that this isn't the first of Apple's privacy problems to come to light.
Robert Bateman
So what you're referring to there was a plan by Apple to introduce scanning on iPhones for child sexual abuse material, or CSAM, as it's known. And this has been very controversial among privacy and security experts. It's been a bit of a PR disaster for Apple, too, because they've really had a big push in recent years to portray themselves as the best option for privacy. So let me explain a bit about how this CSAM scanning idea would work. Now, most tech platforms already scan for CSAM on their cloud storage services. Many use a program called Microsoft PhotoDNA to scan uploads to their own servers. Facebook, for example, makes hundreds of thousands of reports every year to the National Center for Missing and Exploited Children, and other platforms like Google, Microsoft, and Twitter make many reports, too. But the controversial aspect of Apple's proposal was that the company planned to scan users' photos on their devices, so before they were uploaded to the cloud server. Now, Apple said, and apparently genuinely believed people would go along with this, that this was the more private option: it's better to have stuff scanned on your phone than in the cloud, as it were. But the response from the privacy community was very negative, and it caused a big problem for Apple. So as mentioned, Apple's scanning is quite different from other companies', and this is how it works. They use a tool called NeuralHash to turn pictures into hashes, strings of numbers that represent the characteristics of the picture. These hashes can't be reverse engineered to let people see the image, and they're not affected by the sort of crude techniques that can be used to disguise a photo, like resizing it or changing the color profile. They also don't generally lend themselves to false positives, at least not in the same way that looking directly at an image would. So a picture of your kid in the bath, for example, shouldn't generate a hash that looks like a known child sexual abuse material photo. Apple's plan, and this is the controversial part, was to upload a list of hashes that are already known to be associated with CSAM, provided by the National Center for Missing and Exploited Children, keep that list on your phone, update it regularly, and then compare the hashes of the pictures that you upload to iCloud with the hashes on that list. So if Apple finds a match, they wouldn't immediately report you to the police or to the National Center for Missing and Exploited Children. What they do is add what they call a safety voucher to your iCloud profile. This allows for a few false positives. Once you get past a predetermined number of matching safety vouchers, Apple then refers the account to the authorities.
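To make the matching step a little more concrete, here is a minimal Swift sketch of the general idea: comparing perceptual hashes of uploads against an on-device list of known hashes and only escalating once a threshold of matches is reached. The hash values, the Hamming-distance comparison, and the threshold are illustrative assumptions for this sketch, not Apple's actual NeuralHash pipeline, which also involves cryptographic blinding and private set intersection that are omitted here.

```swift
import Foundation

// Illustrative sketch only: perceptual hashes modeled as 64-bit values and matched
// by Hamming distance against an on-device list of known hashes. All values are made up.
let knownHashes: Set<UInt64> = [0x1F3A_5C77_90DE_AB12, 0x8842_0C19_F7E3_55A0]  // hypothetical
let matchThreshold = 30        // hypothetical number of matches before escalation
let maxHammingDistance = 4     // how many differing bits still count as a match

func isMatch(_ hash: UInt64, against known: Set<UInt64>) -> Bool {
    // Two perceptual hashes "match" if they differ in only a few bits.
    known.contains { ($0 ^ hash).nonzeroBitCount <= maxHammingDistance }
}

func shouldEscalate(uploadHashes: [UInt64]) -> Bool {
    // Count matching uploads; escalate only once the threshold is crossed,
    // which is roughly the role the "safety vouchers" play in the description above.
    let matches = uploadHashes.filter { isMatch($0, against: knownHashes) }.count
    return matches >= matchThreshold
}

print(shouldEscalate(uploadHashes: [0x1234_5678_9ABC_DEF0]))  // false for this sample
```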
That Tech Pod
Do you think Apple is currently looking at our phones, Gabi? Do you think right now Apple has all of our great photos? Because I know we have some really good ones, and they're probably sifting through in there, like, is this a child? Well, 100%. I'm cynical. I think all of our data is just out there and being looked at by everyone. Yeah, but Rob, kind of going back to what you were saying. So Apple, in light of the pushback on this policy, on this new sort of system, and you mentioned this, Apple put a few more things in place to reduce the amount of false flagging. Apple said this new system will ensure there's less than a one in one trillion chance of incorrectly flagging an account per year. What do you think about their response? Does that appease critics of this policy? What do you think?
Robert Bateman
Yeah. I mean, one in a trillion, that is true. That really is quite an impressive technique they've got. I think one of the problems that people have with this is not so much the looking for CSAM images, not so much the chance of innocent people being caught out by it, but just, again, putting this infrastructure in place that could be abused by governments. So the EFF, for example, the Electronic Frontier Foundation... by the way, I'll mention this now: I'm running a session next week on this, through my company, about the CSAM scanning. We've got someone from the EFF coming to talk about it. I'll mention at the end how to register for that. So the criticism is that this is turning your phone into a sort of black box with a back door that could be exploited by governments again. First they're looking for child sexual abuse material, then what if they start looking, for example, for terrorist-related content? And then what if they start looking for pirated videos, for example, copyrighted material? And moving on from that, what if they then start looking for images associated with political activism? This is the slippery slope idea: once you have this infrastructure in place, you have a backdoor on millions of people's phones, and there's no way back from that. The government could turn around and ask them to start scanning for other stuff. And it's quite a dangerous situation. That's the argument against this idea.
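For a rough sense of where a figure like one in a trillion could come from, the sketch below does the binomial arithmetic: if each photo has some tiny, independent chance of a false match, the probability of any account crossing a multi-match threshold collapses very quickly. The per-photo rate, photo count, and threshold are made-up numbers for illustration, not Apple's published parameters.

```swift
import Foundation

// Illustrative arithmetic only: probability that an account with `photos` uploads
// accumulates at least `threshold` false matches, assuming each photo has an
// independent false-match probability `p`. All three numbers are hypothetical.
let p = 1e-6          // assumed per-photo false-match probability
let photos = 10_000   // assumed uploads per account per year
let threshold = 30    // assumed number of matches required before escalation

// Binomial tail P(X >= threshold), built up from the P(X = k) recurrence.
var pmf = pow(1 - p, Double(photos))   // P(X = 0)
var tail = 0.0
for k in 0...photos {
    if k >= threshold { tail += pmf }
    pmf *= Double(photos - k) / Double(k + 1) * (p / (1 - p))  // advance to P(X = k + 1)
}
print("P(at least \(threshold) false matches) ≈ \(tail)")  // astronomically small
```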
That Tech Pod
A follow-up question to that, and this is obviously not my expertise, so this is what I'm going to ask you. Tell me how off I am on this. But kind of like my other question, it could set a precedent, and like you're just saying, future governments may use this to search for other things. But do big tech companies or governments have to announce these different sorts of intentions? So, for example, say they have the capacity because they're trying to root out child abuse, but then they decide they want to target Black Lives Matter activists, and they start using the tools that are in place to search for that. Does that have to be notified? Is there any way we would know that was happening, or is that completely able to happen under the table? And just to jump on that further, I think when you were describing that, Gabi, it made me really think about all of the body cam footage stuff and the face scanning, where they're saying, hey, we can detect if that was the killer by doing this. And then as an everyday person, I'm like, figure out other ways to do that. I don't want my face scanned. And so it is always that fine line.
Robert Bateman
Yeah. So it's a great question, and there are two kind of elements to it. So first of all, whether or not governments can abuse this facility without us knowing depends on the strength of the rule of law in the relevant country. Now, this feature was only going to be rolled out in the US initially. And I don't say that necessarily in a positive sense, because the US, as we know from Edward Snowden and so on, has had a lot of secret programs for surveilling its citizens and people abroad. I think it would be difficult to order Apple to start scanning for other types of material without people finding out, but that's not to say that it's impossible. Certainly there have been secret surveillance programs before, and those have evaded the legislature and the press and so on for some time. Regarding other ways to do this, I mean, there are other legitimate ways that governments can spot suspicious activity among phone users and Internet users, by looking at metadata, for example. So not necessarily the content of people's photos or messages, but the information about them: who they were sent to, when they were sent. For example, someone who's sending lots of images to lots of teenage girls, that's suspicious behavior. You can already tell that people are doing that despite end-to-end encryption and so on, because metadata is more or less fair game, so it's much easier for governments to detect those suspicious behaviors using metadata. So like you say, Laura, it might be worth just having the government find more creative and less intrusive ways of detecting this sort of behavior. Perhaps a measure this extreme is not even that helpful. I mean, I personally have not really made up my mind on this emphatically just yet.
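As a toy illustration of that metadata point, the sketch below flags a sender whose image messages reach an unusually large number of distinct recipients, using only metadata (sender, recipient, timestamp, attachment flag) and never the content. The data model and the threshold are invented for the example.

```swift
import Foundation

// Toy illustration of metadata-based detection: flag senders whose image messages
// reach an unusually large number of distinct recipients. Only metadata is inspected,
// never the message content. All field names and thresholds here are hypothetical.
struct MessageMetadata {
    let sender: String
    let recipient: String
    let sentAt: Date
    let hasImageAttachment: Bool
}

func suspiciousSenders(in messages: [MessageMetadata], recipientThreshold: Int = 50) -> [String] {
    // Group image messages by sender and count distinct recipients per sender.
    var recipientsBySender: [String: Set<String>] = [:]
    for message in messages where message.hasImageAttachment {
        recipientsBySender[message.sender, default: []].insert(message.recipient)
    }
    return recipientsBySender
        .filter { $0.value.count >= recipientThreshold }
        .map { $0.key }
}

// Hypothetical usage with a single sample record:
let sample = [MessageMetadata(sender: "user-a", recipient: "user-b", sentAt: Date(), hasImageAttachment: true)]
print(suspiciousSenders(in: sample))  // [] - one recipient is well below the threshold
```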
That Tech Pod
Yeah, you bring up a lot of good points and really help us break that down. So lastly, today, before we let you go, and this may be the most important question that we dive into today, for myself personally and obviously all our listeners, because I care about you, and hopefully if you're like me, this one hits home for you as well. I have an iPhone. I tried to get an Android, and to be honest, I do have an Android as well, but I don't really claim it. Outside of today's episode, I'll try not to bring it up again because I'm ashamed. But mostly I use my iPhone because I like the blue bubble thing, I know my message went through, and stupid little things like that that I do think make a difference to me. I find my Android a little bit more complicated, a little bit more annoying, and I don't know if they received my message. And it's probably a silly reason, because I do think a lot of times I look at their photos and they just look better, better quality and a lot of things, and I always kind of have this back-and-forth question. But when we're looking at Apple's privacy and security issues, it does beg the question: is Apple more secure than Google? So can you help us break down the difference, which is better? And did I make the right choice by being more interested in my Apple phone than the Android?
Photo by Hardik Sharma on Unsplash
Robert Bateman
So I think there are two elements: privacy, or "privacy" as we call it over here in the UK, and security. And Apple basically wins on both, still. Now, the company has made a lot of changes in recent years. There's been this push by them to really represent themselves as the privacy option. The last update, well, maybe the one before last, iOS 14.5, introduced this consent mechanism where they require third-party developers to get consent from users before tracking them, meaning compiling their activity across apps and on the web to target advertising. Basically, this is not something that Google does. Google is very much a data-driven company, so any opportunity for apps to collect data about their users is going to benefit Google, and Apple's policy really does make a difference in terms of privacy. Now, Apple has been accused of hypocrisy here, because it doesn't, well, it didn't get opt-in consent for its own tracking activities. In fact, there was a legal case about this in France, or a complaint at least, accusing the company of applying different standards to itself. But in the most recent update, iOS 15 I think, Apple has stopped even doing that. So if you're looking for a company that asks your consent before collecting your data, then Apple is definitely the one to go for. In terms of security, again, because the iPhone environment is a lot more closed, it's more difficult to download... well, you can't, in fact, unless you've jailbroken your iPhone, as they call it, download apps or use apps that are not on the App Store, and they are very rigorous in their enforcement of the App Store guidelines. Android is a bit freer. It's a lot more open, and it's open source as well, so you can download apps from the Internet and use those even when they haven't come from the Google Play Store. This means you're more liable to download malware and so on, which creates a bit of a security vulnerability. So despite these problems, I think Apple still wins hands down for both privacy and security.
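The consent mechanism Rob is describing is Apple's App Tracking Transparency framework, introduced with iOS 14.5. Below is a minimal Swift sketch of how an app requests that opt-in; the handling of each response is illustrative rather than a prescription.

```swift
import AppTrackingTransparency
import AdSupport

// Minimal sketch of the App Tracking Transparency prompt (iOS 14.5+).
// The app also needs an NSUserTrackingUsageDescription entry in its Info.plist.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Only after explicit opt-in may the app read the advertising identifier (IDFA).
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa)")
        case .denied, .restricted:
            print("Tracking not allowed; the app must not track this user.")
        case .notDetermined:
            print("The user has not been asked yet.")
        @unknown default:
            print("Unhandled authorization status.")
        }
    }
}
```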
That Tech Pod
Whoa, this is great news for me. I feel safer, I feel reassured, and I still feel a little scared, because you said it's the better option, which doesn't mean it's great, just the best out of what we have. So I don't know how great I feel about it, but I'll still take it as a win. So thank you, Apple, for the blue marks I get sending a text message; you psychologically won me over, so I'll cheers to that. And Rob, you have been a great guest today, so we will make sure to put up the information on your event. But before we go, why don't you tell everyone about that, because it is sort of tied into this, and we always encourage people to get more knowledge, attend more events, and learn what's out there.
Robert Bateman
That's right. So this event is part of PrivSec Global next Thursday, so the 22nd, sorry, that's the 23rd of September. We're having a panel on Apple's child safety features, which have been delayed for the time being, by the way. The host is Albert Fox Cahn from the Surveillance Technology Oversight Project. Great guy, Albert. And the panelists include Jillian C. York from the EFF and Elena Avanti from Custer. If you're interested in Apple's plans and why they might be problematic, that should be a really great discussion. There are other events going on as well throughout the two days, Wednesday and Thursday next week, and it's free to attend. I really do recommend people go to privsecglobal.com, that's P-R-I-V-S-E-C global dot com, and register to attend any events that interest you.
That Tech Pod
So, Rob, thank you so much for joining us and helping us understand all of these privacy and security concerns with Apple and big tech in general. Thank you so much.
Robert Bateman
Great. Thanks for having me on.
Robert Bateman is an analyst and research director for privacy, data protection, and security at GRC World Forums. To follow Robert Bateman, click here.
email us at contact@thattechpod.com