
Thanks for joining us! Let me know if there are any topics you'd like us to cover by sending an email to me@craigpeterson.com!

Aug 21, 2021

Apple is Adding Tech to Look At Your Photos For Child Abuse

This is a tough one. Apple has decided that it will build technology into the next release of the iPhone and iPad operating systems that monitors for child abuse imagery.

[Automated transcript]

Apple has now explained that they will be looking for specific, known child abuse images. I am uncomfortable talking about this, but the whole idea behind it is something we need to discuss. Apple said they're going to start scanning for these images and confirmed the plan when people asked, are you sure you're going to be doing that?

[00:00:44] Here's what's happening. iOS 15, the next major release of Apple's operating system for iPhones and iPads, is going to tie in to something called the National Center for Missing & Exploited Children. And the idea behind this is to help stop some of this child abuse. Some people traffic in children; it's just unimaginable.

[00:01:14] What happens out there really is just such evil. I just don't get it. Here's what they're going to be doing. There are ways of taking checksums of pictures and videos so that a minor change, the kind that might occur because a file was copied, does not mess up the match.

[00:01:40] It still gives a valid checksum. Now, the technology behind that is more detailed, but basically, just think of it as a checksum. If you have a credit card number, there is a checksum digit on it; bank accounts have check digits too. If you mess the number up a little bit, the checksum is invalid, so that number is obviously wrong.
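To make that check-digit idea concrete, here's a minimal Python sketch of the Luhn algorithm, which is the checksum credit card numbers actually use. The sample numbers below are standard test values, not real cards.

```python
def luhn_is_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn check used by credit cards."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    # Walk right-to-left, doubling every second digit and folding results over 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

# A classic test number; change a single digit and the check fails.
print(luhn_is_valid("4111111111111111"))  # True
print(luhn_is_valid("4111111111111112"))  # False
```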

[00:02:04] What we're talking about here is a checksum of a picture or a video. These various child safety organizations have pictures of children who have been abused or who are being exploited, and they have these checksums, which are also called hashes. That list of hashes is now going to be stored on your iOS device.

[00:02:34] And yes, it's going to take some space on the device. I don't think it's going to take an enormous amount of space, considering how much space is on most of our iPhones and iPads that are out there. Apple gave this detection system, which is called CSAM detection, a thorough technical summary. It is available online, and I've got a link to this article in this week's newsletter; they released it just this month, August of 2021.
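As a rough illustration of what matching a file against a list of known hashes looks like, here is a minimal Python sketch. Note the simplifications: this uses a plain SHA-256 cryptographic hash and an ordinary set lookup, whereas Apple describes a perceptual hashing scheme (so minor copies and re-encodings still match) combined with cryptographic protocols rather than a simple local comparison.

```python
import hashlib
from pathlib import Path

# Stand-in for the known-hash list provided by a child-safety organization.
KNOWN_HASHES = {
    # placeholder entry; a real database holds many thousands of hashes
    "0" * 64,
}

def file_hash(path: Path) -> str:
    """Compute a SHA-256 digest of a file's bytes (a stand-in for a perceptual hash)."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_image(path: Path) -> bool:
    """Match a file only if its hash appears in the known-hash list."""
    return file_hash(path) in KNOWN_HASHES
```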

[00:03:07] And they're saying that they're using a threshold that is, quote, "set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account." Having had a basic look through some of the CSAM detection documentation, I can say with some certainty that they're probably right about that; the odds are extremely

[00:03:40] small that someone who has a picture of their kids in a bathtub will get flagged. The odds are so close to zero as to be zero that it will be flagged as some sort of child abuse, because the system is not looking at the content of the picture. It's not judging whether a picture may be a picture of child exploitation or a video of a child being exploited.

[00:04:02] If an image has not been seen before by the National Center for Missing & Exploited Children, it will not be flagged. So I don't want you to get worried that a picture at the beach of your little boy running around in just boxer trunks, with a lot of skin showing, is going to get flagged. It's not going to happen.
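To see why a threshold drives the account-level error rate so low, here is a hedged back-of-the-envelope calculation. The numbers are illustrative assumptions of mine, not Apple's published figures, and the math assumes matches fail independently.

```python
# Illustrative numbers only: even if a single image comparison had a
# one-in-a-million false-match rate, requiring several independent matches
# before an account is flagged drives the combined error rate down sharply.
single_match_false_positive = 1e-6   # assumed per-image error rate (made up)
threshold = 3                        # assumed number of matches required (made up)

account_false_positive = single_match_false_positive ** threshold
print(f"~{account_false_positive:.0e} chance all {threshold} matches are errors")
# ~1e-18 under these assumed numbers, i.e. far below one in a trillion
```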

[00:04:25] However, a picture that is known to the National Center for Missing & Exploited Children is, in fact, going to be flagged, and your account will be flagged. Now, it's hard to say precisely what they're going to do about that; I haven't seen anything on it. Apple will only say that they're going to deploy software

[00:04:51] that will analyze images in the Messages application as part of a new system that will warn children and their parents when receiving or sending sexually explicit photos. So that's different. That is where, if you've put parental settings on a child's iPhone and they're taking these pictures, selfies, et cetera,

[00:05:14] a girl sending one to a boyfriend, a boy sending one to his girlfriend, whatever it might be, the parents will be warned, as are the children, when the system spots things that might be sexual content. Okay. It really is concerning. Now let's move on to the part that I'm concerned about. I think everyone can agree that both of those features will ultimately do good, but here's a quote.

[00:05:41] "Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship." Now, I should say the person saying this is Greg Nojeim, co-director of the Center for Democracy & Technology's Security and Surveillance Project. He said Apple should abandon these changes and restore its users' faith in the security and integrity of Apple devices and services.

[00:06:15] And this is from an article over at a tech news site. So this is where things get concerning. What are they doing? How far are they going? Are they going to break the end-to-end encryption in something like iMessage? I don't think they are going to break it there, so they're not necessarily setting up an infrastructure for surveillance and censorship. But still, Apple has been called on, as has every other maker of this kind of software.

[00:06:45] I remember during the Clinton administration, this whole thing with the Clipper chip. The federal government was going to require anyone who had any sort of security to use this chip developed by the federal government. And it turns out, of course, the NSA had a huge backdoor in it, and it was a real problem.

[00:07:04] Look at Jupiter, another encryption chip, one that was being used by Saddam Hussein and his family to communicate. It turns out, yeah, there was a back door there too; that was a British project and chip. So Apple has resisted pressure from the US government to break into phones.

[00:07:28] But some of these other governments worldwide, very nasty ones that spy on their citizens and torture people who don't do what the government wants, have been trying to pressure Apple into opening things up as well. Now I have to say, I have been very disappointed in all of these major companies, including Apple. When it comes to China, they're just drooling at the opportunity to be there.

[00:07:57] Apple does sell stuff there; all of these companies do. Google moved their artificial intelligence lab to China, which, I just cannot believe they would do something like that. AI and machine learning are technologies that will give the United States a real leg up, technology-wise, over our competitors worldwide.

[00:08:18] They moved to China, and they have complied with this Great Firewall of China thing where the Chinese people are being censored and monitored. So what's going to happen now? Because Apple has had pressure from these governments worldwide to install back doors in their encryption systems.

[00:08:39] And Apple said, no, we can't do that, because that's going to undermine security for all users, which is absolutely true. If there is a door with a lock, eventually that lock will get picked. And in this case, if there's a key, if there's a backdoor of some sort, the bad guys are going to find it. So Apple has been praised by security experts for saying, hey, listen, we don't want to undermine security for everybody. But this plan to deploy software that uses the capabilities of your iPhone to scan

[00:09:16] your pictures, your photos, the videos that you're sharing with other people, and then share selected results with the authorities, brings Apple really close to that line, maybe across it. Apple is dangerously close to acting as a tool for government surveillance; that's what Johns Hopkins University cryptography professor Matthew Green said.

[00:09:47] This is really a key ingredient in adding surveillance to encrypted messages. This is, again, according to Professor Green over at Johns Hopkins. He's saying it would be a key ingredient in adding surveillance to encrypted messaging; the ability to add scanning systems like this to end-to-end encrypted messaging systems has been a major ask by law enforcement around the world.

[00:10:15] So they have it for detecting material tied to missing and exploited children. That's wonderful, and I'm okay with that, no problem. But it now means that Apple's platform can add other types of scanning. All right, we'll see what ends up happening. The next piece, warning children and their parents about sexually explicit photos, is also a bit of a problem here.

[00:10:47] Apple's take on this is that Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit; the feature is designed so that Apple does not get access to the messages, it's saying. If it detects something, they're going to blur the photo. The child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view it.

[00:11:17] And the system will let parents get a message if children view a flagged photo. Similar protections are available if a child attempts to send sexually explicit images. Interesting, isn't it? Interesting world. So I think what they're doing now, okay, they're really close to that line, maybe going over it.
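To show the flow being described, here is a hedged Python sketch. This is not Apple's code; the real feature runs inside Messages with an on-device model, and every function below is a stand-in for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Photo:
    data: bytes
    blurred: bool = False

def looks_sexually_explicit(photo: Photo) -> bool:
    """Stand-in for the on-device classifier; the check stays local, nothing is uploaded."""
    return False  # placeholder decision

def warn_child_and_ask(photo: Photo) -> bool:
    """Stand-in for the warning screen; returns True if the child chooses to view anyway."""
    return False

def notify_parent(message: str) -> None:
    """Stand-in for the parental notification described above."""
    print("Parent notification:", message)

def handle_incoming_photo(photo: Photo, parental_notifications: bool) -> Photo:
    """Mirror the described flow: analyze on device, blur, warn, then notify parents."""
    if not looks_sexually_explicit(photo):
        return photo
    photo.blurred = True
    if warn_child_and_ask(photo) and parental_notifications:
        notify_parent("A flagged photo was viewed on your child's device.")
    return photo
```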

[00:11:39] Going over that line could mean the loss of lives in many countries that totally abuse their citizens, or subjects, depending on how they look at them. Hey, make sure you check me out online: CraigPeterson.com.