Apr 10, 2021
For all of my listeners who purchased my course on Improving Windows Security - THANK YOU!
We have a whopper of a warning this week about what the Department of Homeland Security is planning under the Biden Administration: they are going to let Big Tech and private companies create the No-Fly and Terrorist Watch Lists on their behalf. Scary beyond measure. Then, Apple is doing more to protect your privacy. We have another hack of a commercial VPN provider, and there is more, so be sure to listen in.
For more tech tips, news, and updates, visit - CraigPeterson.com.
Tech Articles Craig Thinks You Should Read:
DHS Preparing to Use Private Contractors to “Scour Public Data and Social Media” To Compile Dissident Citizens for Watch List
Another Reason to hate VPNs -- Feds say hackers are likely exploiting critical Fortinet VPN vulnerabilities
Mark Zuckerberg's cell phone number is among leaked personal data from 533 MILLION Facebook users, including two other founders that have been released for FREE by hackers
How scammers siphoned $36B in fraudulent unemployment payments from the US
Are self-driving cars safe? Will they ever be? Fender bender in Arizona illustrates Waymo’s commercialization challenge
Apple is enforcing its new privacy standards and rejecting apps - New wave of App Store rejections suggests iOS 14.5, new iPad may be imminent
My biggest complaint about Android? The lack of security updates. Google is trying to solve it -- What we’re expecting from Google’s custom “Whitechapel” SoC in the Pixel 6
NFTs Weren’t Supposed to End Like This
Embracing a Zero Trust Security Model
Turns out Most Manufacturing, Water Supply, and Power Companies Use Controllers with a Security Severity Score of 10 out of 10
Chromebooks outsold Macs worldwide in 2020, cutting into Windows market share
Clubhouse is the New Up-and-Comer but Security and Privacy Lag Behind Its Explosive Growth
New York sues to shut down 'fraudulent' Coinseed crypto platform
Former SolarWinds CEO blames intern for 'solarwinds123' password leak
TikTok breaching users’ rights “on a massive scale”, says European Consumer Group
Automated Machine-Generated Transcript:
Craig Peterson: [00:00:00] We're going to be talking about a fender bender in Arizona and when will these autonomous cars be safe, at least measured safe.
We've got a new wave of app store rejections from Apple. That means a couple of things, including better privacy for all of us.
Hello, everybody. Craig Peterson here. Thanks for joining us today.
This is an interesting question, because we are looking at a future that we assume, anyway, is going to be full of autonomous vehicles. What does autonomous mean? There are various levels of autonomy, degrees if you will. Everything from what we have today in a lot of cars, which is adaptive cruise control that'll keep you a certain distance from the car in front of you.
We've got assisted braking control, where the car notices: oh, wait a minute, someone just hit the brakes right in front of you, I should apply the brakes. And it hits the brakes even before your foot is pushing down.
Another way to do this is if you slam your foot on the brake, the car assumes you know something that it doesn't, and it increases the force that you're pushing down with. So even though you might just hit the brake fast, and not necessarily hard, the car will make it hard.
If you think about these types of braking, for instance, you can start to realize where we're running into a problem when it comes to defining whether or not autonomous vehicles are safe.
The bottom line is autonomous vehicles, which are all the way at the other end of this scale. It started with the brakes and will hopefully end with a car that just drives itself. That's everybody's goal: Ford and GM and Fiat Chrysler (whatever they're called nowadays), and of course the autonomous vehicle companies such as Tesla. We're going to see a way of measuring them that's different than anything we've seen before.
Right now, if you have a motor vehicle, you most likely have a driver's license. And you have insurance, again most likely, because stuff happens. You don't really mean to hit something. You don't mean to wander out of your lane and end up in the woods. Right?
There are a lot of different things that can happen to you, including having another driver get in your way. My wife has been rear-ended. I was rear-ended. She had a beautiful little car, a little MG, and I can tell you to this day that she absolutely loved that little car. She used to drive it around and go down to work, I think it was at Baxter Travenol, just having a great time in Southern California. Then, while she was at a stoplight, somebody rear-ended her and totaled her car. Which is just an absolute shame. It wasn't her fault, was it?
I got rear-ended, I think, two or three times, never to the point that she was at, where the vehicle had major damage, let alone had to be written off, but it happened, right?
People aren't attentive. They misjudge the distance. They might be following too close for the conditions: rain or snow or fog or ice. There are lots of reasons. So we have insurance, and we have a driver's license to prove that we at least understand the basics of driving. We passed a test, right? What is it, a 70% pass rate? Which, frankly, isn't such a great rate if you get right down to it.
Anyway, how do we measure these cars? I mentioned the rear-end collisions for a very specific purpose. These autonomous cars are racking up millions of miles on roads out West, mainly California and Arizona, which are very popular places for them to be tested because they don't have a whole lot of weather conditions to worry about. The roads are there and they're not changing very much, particularly in Southern California. They've all been built, and there's not another square inch that isn't paved, including people's front lawns, which just absolutely boggles my mind.
Why would you have a cement slab for a front lawn anyway? That's California for you. These cars driving millions of miles in California are having accidents, but they're not having the types of accidents you and I have.
There is a police report, obtained by the Phoenix New Times this last week, that revealed a minor Waymo-related crash. Now, this crash occurred last October and it isn't the only one. This is kind of a pattern, but these have not been publicly reported until now. I'm going to read here just a quick paragraph from what the New Times in Phoenix had to say: "A white Waymo minivan" (Waymo, of course, is Google's little spinoff to make these autonomous vehicles) "was traveling westbound in the middle of three westbound lanes on Chandler Boulevard in autonomous mode when it unexpectedly braked for no reason. A Waymo backup driver behind the wheel at the time told Chandler police that 'all of a sudden the vehicle began to stop and gave a code to the effect of stop recommended' and came to a sudden stop without warning. A red Chevrolet Silverado pickup behind the vehicle swerved to the right but clipped its back panel, causing minor damage."
No one was hurt. Overall, Waymo has a pretty strong safety record. By the way, that was from an article over at Ars Technica. Waymo has more than 20 million testing miles in the Southwest United States. Think about it. I was adding these numbers up: 20 million miles. My wife and I have put well more than a million miles on cars. That's what happens when you have eight kids, right? Over the years you rack it up: 250,000 on this car, 300,000 on that car. Yeah, it adds up. That's a lot of miles.
If you start looking at how many miles the average person drives a year and do some comparisons with the accident numbers, you'll see that the autonomous vehicles are having far fewer accidents, and fewer accidents involving a death, which is actually very good. But the accidents they are having, even though they tend to be minor, are usually the fault of the other driver. According to Waymo, a large majority of the crashes involving these Waymo vehicles have been the fault of the other driver.
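That comparison is just rate arithmetic: normalize crash counts by miles driven so a small fleet and the whole driving population can be compared fairly. Here's a minimal sketch in Python, using round made-up numbers purely for illustration, not actual Waymo or national crash statistics:

```python
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a crash count by exposure (miles driven)."""
    return crashes / (miles / 1_000_000)

# Illustrative numbers only -- not real crash statistics.
human_rate = crashes_per_million_miles(crashes=2_000, miles=500_000_000)
waymo_rate = crashes_per_million_miles(crashes=50, miles=20_000_000)
print(human_rate, waymo_rate)  # 4.0 2.5
```

The point of normalizing by exposure is that raw crash counts mean nothing: 20 million test miles is a lot for one company but a rounding error next to what human drivers cover every day.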
So what does "the fault of the other driver" mean? Who was at fault here? If that red Chevy Silverado pickup truck hit that Waymo autonomous car, it's the Chevy's fault.
Why did the Chevy do it? It isn't just because he's driving a Chevy, or because it's red, or a pickup. He hit that car most likely (I don't know, I haven't talked to the guy) because the car did something unexpected.
If you read that police report again, it says that even the "driver" in the Waymo car, this white minivan, who's sitting there to make sure the minivan doesn't run somebody over, said it all of a sudden began to stop, gave this code about a stop being recommended, and stopped without warning. Put all of those things in a pot, stir it up, and what do you have? You now have a different way of driving.
See, that Chevy Silverado driver, if he's a good driver, he's looking ahead, right down the road. If you look too close in front of you, you're going to be over-correcting, steering all over the place; you're not going to go in a straight line. So with experience, you're looking down the road, a minimum of two, three, four car lengths ahead, depending on how fast you're going, and that's where you're aiming.
He doesn't see an obstruction in front of that Waymo minivan, so he's not starting to slow down. It's just like when I come up to a traffic light and there are cars in front of me and that light is red. I'm not going to accelerate and then lean on the brake like so many people do. I see there's a red light ahead and cars stopped at the light, so I'm just going to coast to a stop. Right? Save some energy, save some brake pads. Stop global warming by not heating up those brake pads.
It's not something most people expect. I've never been rear-ended for doing that, but I've certainly been given the finger for doing it, even though I tend to get to the cars in front of me right about the time the light turns green.
It's fascinating to look at, but what's going to happen? What is ultimately the way to determine how safe these cars are?
We cannot use the types of assessments that our insurance companies are using. Rear-end collisions like this rarely get anyone killed, and fatalities are where the really high expenses come in. The driver in the back is usually considered to be at fault.
But what happens when a self-driving car suddenly comes to a stop in the middle of the road? It's interesting to think about, isn't it?
Waymo's vehicles sometimes hesitate longer than a human would, because they have to do all kinds of computations and consider complex situations that they're not used to.
If you've ever written code, say a hundred lines of code (in the case of cars it's going to be millions of lines), about 90% of it is for the edge conditions. In other words, things that are unlikely to happen.
So when something weird happens, that car's going to hesitate, and that, frankly, is a problem: the idiosyncrasies of self-driving cars.
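To make that 90/10 split concrete, here's a toy speed-planner sketch in Python. It is entirely hypothetical, not Waymo's actual code: the "happy path" is one line at the bottom, and everything above it handles unlikely edge conditions, any one of which can trigger the kind of sudden "stop recommended" behavior described in the police report.

```python
def plan_speed(obstacle_distance_m, current_speed_mps):
    """Toy speed planner: most of the code handles edge conditions."""
    if obstacle_distance_m is None:   # sensor dropout: nothing we can trust
        return 0.0                    # "stop recommended"
    if obstacle_distance_m < 0:       # corrupt or impossible reading
        return 0.0
    if current_speed_mps < 0:         # invalid speed input
        return 0.0
    if obstacle_distance_m < 5:       # obstacle dangerously close: hard stop
        return 0.0
    # Happy path: slow proportionally as the obstacle gets closer.
    return min(current_speed_mps, obstacle_distance_m / 2)

print(plan_speed(100, 10))   # clear road, keep speed
print(plan_speed(None, 10))  # a weird input triggers a sudden stop
```

Notice that the safe default for every weird input is "stop," which is exactly why an autonomous car can brake for no reason a human driver can see.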
We're going to talk about a wave of app store rejections by Apple iOS for your iPhone, iPad, et cetera. We'll tell you why right here.
You're listening to Craig Peterson, online at CraigPeterson.com.
Apple is making another major change in order to give us more privacy. I just started this Improving Windows Privacy and Security course. If you're using an Apple iOS device, you're halfway there.
Hello, everybody. Craig Peterson here. Thanks for tuning in. You can always hear me at CraigPeterson.com/podcast.
Apple has been really the only major vendor out there in the smartphone industry to have security as their prime motivation. Okay, you could argue money is their prime motivation, right?
Apple has always tried to be secure. The hardware is quite secure. They haven't licensed their operating system to third parties, and that gives them control like you can't have anywhere else.
Think about all of the different Android-based smartphones that are out there. There are thousands of different models, and within each model there are sometimes dozens of different hardware configurations. So Google comes out with a security patch and sends it out, well, actually makes it available for the vendors to pick up. Then the vendors go and grab it, and they have to test it, work in their own code, work in all of the device driver stuff, and then package it up.
They have to test it on all of the different models. Just think about how many models Samsung has by itself: a whole lot of models. It is almost impossible for Android phones to get security patches. Any Android phone that's more than two years old is practically guaranteed not to get security patches.
I talked last weekend about what Samsung is doing to try and solve this. Finally, they must be listening to the show. Samsung had been more or less supporting its top-of-the-line models for about two years. If you bought a top-of-the-line Galaxy phone from Samsung, or another real top hot model, you might get security updates for a couple of years, and that's kind of it. Forget about it beyond that. Which is why I said, if you absolutely must use Android, there's only one vendor you can use, and that's Samsung.
There's only one model of phone that you should buy, Samsung's top-of-the-line phone, and you have to replace it every two years. So Samsung has now come out and said: we're going to provide security support for our phones for five years.
So they're trying to compete with Apple here. Apple has long provided support for five years, and as we saw just a couple of weeks ago with this big active zero-day attack against Apple iOS devices, they will actually provide security updates for much longer than five years. But it's way easier to provide security updates for 30 models of phones than it is for a few hundred models, which is what Samsung has. Expect Samsung to narrow down their product line and also to really only provide support for the top models within their product lines.
Now, here's what Apple is doing right now. Apple is starting to reject some of these apps that have been in the App Store for a long time, as well as new apps. They're rejecting them for a couple of reasons. The biggest reason is that as of iOS 14.5, Apple is requiring all of the vendors to tell you, when you go to the App Store, what information of yours they're storing, using, and selling. Okay, pretty big deal, isn't it? It's a pretty bad deal, frankly, when you get right down to it, for Facebook and others. Facebook took out full-page ads in major newspapers in the US saying, oh, Apple can't do this, this is terrible, it's going to destroy small business. They said it's going to destroy small business because Facebook can't pry into our lives as much. You know how it is. People say all the time, hey, why am I getting these ads? I've never even searched for this, and somehow it's coming up.
There are a number of reasons why, but the bottom line is called big data. These apps like Facebook use all kinds of big data to figure out what we might like, and part of that is based on what our friends are searching for. So it puts together this massive mesh and figures it all out. Something that the Obama campaign really pioneered when Facebook gave them all of the data that they had on everyone and anyone.
I'm sitting here shaking my head, because somehow that's okay, but having this Cambridge Analytica company do some of it from a paid standpoint, and not get wholesale data, somehow that was the most evil thing that ever happened.
They forgot about Obama, but you know, I guess that's political. I criticize both sides of the aisle. I am an equal opportunity criticizer. They deserve it.
We've got Apple now telling Facebook and every other app developer: you have to tell the users. In fact, if you go right now to your iPad or your iPhone or your iPod touch, open an app's listing in the App Store and scroll up. You can open a little tab, and that tab will all of a sudden become a very big part of the screen, because it tells you what the app is doing with your data. If you don't tell Apple, Apple's going to block you from the store.
Google has, of course, a bunch of apps. You've probably used things like Google Maps, which I try not to use. Use Apple Maps; it's gotten much, much better than it was, and they're not tracking you and selling your data like Google does.
Google has its own little app for doing searches, and of course you've got Google Chrome, all these different things from Google. Google stopped updating their apps on the Apple App Store because Apple was telling Google: you have to tell people what you're doing with their data. Google didn't want to do it. "We just won't update the apps" was a kind of loophole in this whole thing, but they can't not update them forever. Now we're seeing rejections of these developers' apps.
Here are a few lines, again from Ars Technica, from a rejection letter that some developers received: "We found in our review that your app collects user and device information to create a unique identifier for the user's device. Apps that fingerprint the user's device in this way are in violation of the Apple Developer Program License Agreement and are not appropriate for the App Store."
Now, we're not talking about a fingerprint as in the fingerprint reader. We're saying that they are looking for unique information about the phone, so they know it's you and can put it all together. That letter goes on specifically: "Your app uses algorithmically converted device and usage data to create a unique identifier in order to track the user."
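To illustrate what "algorithmically converted device and usage data" means in practice, here is a hypothetical sketch (made-up attribute names, not any real app's code): a handful of stable device attributes are canonicalized and hashed into one identifier that follows the device around, even with no login and no advertising ID.

```python
import hashlib

def device_fingerprint(attrs: dict) -> str:
    """Hash a stable set of device attributes into a single identifier."""
    # Sort keys so the same attributes always produce the same string.
    canonical = "|".join(f"{key}={attrs[key]}" for key in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical attribute set -- the kind of data the rejection letter describes.
fp = device_fingerprint({
    "model": "iPhone12,1",
    "os_version": "14.4",
    "timezone": "America/New_York",
    "screen": "828x1792",
})
print(fp)  # same device attributes -> same identifier, every time
```

Because the inputs rarely change, the output is effectively a permanent ID, which is exactly what Apple's App Tracking Transparency rules are designed to stop.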
Apple is really making it clear to developers now, to the ire of Facebook and Google and other companies who rely on that type of tracking to maximize their advertising revenue. I can understand that, right? I really can. It's also clear that this App Tracking Transparency means apps that try to track you by any means without your consent are going to face rejection.
Bravo to Apple, yet again.
Now I'm not so happy about the statement they made this week. Yeah, Georgia. That's another thing entirely.
Stick around, everybody. We will be right back talking more about technology. We're going to talk a little bit about what Google's planning to do in order to help all of these Android developers and the people selling the phones, carriers, et cetera: how Google is going to help them with their security updates. It's an interesting way to do it. It's exactly what Apple's been doing.
You're listening to Craig Peterson.
Apple's really gotten into the chip business, and it isn't just because they wanted a chip for their iPhone that they could control. In fact, Apple has gone even further, and it looks like Google's going to do the same.
Hello, everybody. Craig Peterson here.
Google has been an interesting beast over the years. Remember, they used to say their motto was "don't be evil." Then a few years ago they removed it from the website, and evil seems to have become their middle name, a little bit.
One of the things Google has been doing is offering an operating system that can be used, and is being used, to run almost anything. We're talking mostly, however, about smartphones, certainly by number. That's called Android. Android was a little operating system, of sorts, developed by a kid; actually, Google bought it from him. They have continued to develop it. It's not a bad little platform. The biggest problems with it really have to do with what I talked about a little earlier: security, right? Getting the updates.
I mentioned how Apple really has a walled garden. They have their own environment where everything is contained so they can control it all. Google cannot control anything other than the Google Pixel phone.
It cannot control what Samsung is doing with the operating system. Android can run on pretty much any chip that's manufactured, from Intel chips through all of these ARM-based chips, these Snapdragons and many others that have been used over the years. There are a lot of them.
One of the biggest problems, of course, is the chip set. I've mentioned that Google can come out with an operating system release to fix some security problems, and those are pushed out, but nothing's done by the carrier or maybe the developer of the handset. So what Google's decided to do is make their own walled garden.
If you buy an Apple iPhone, an Apple iPad, or a new Apple Mac, they're all using the same basic chip set that's designed by Apple. They have fab partners making these components. Apple has done that so, again, they can control it even better. They don't have to pay that exorbitant Intel tax.
Apple is also trying to figure out how to avoid the Qualcomm tax. It isn't just Qualcomm. I say "tax" as in you pay way more for Intel than you would for an equivalent or better chip.
In fact, I have an Apple right in front of me here, an M1-based Mac mini. It is way faster and cheaper than the Intel version.
You can still get the Intel version of the mini for $200 more. There's your Intel tax. And it's about half the speed for some of these things.
For instance, Adobe said that this Mac with the Apple chip set in it can be twice as fast as the same Mac with an Intel processor.
Apple is moving away not just from Intel now, but from Qualcomm, and Google wants to move away from Qualcomm too. In many of these smartphones, including the Pixels, they're using a Qualcomm Snapdragon chip.
Qualcomm makes a lot of different types of chips. They also tend to make the radio chips that are in our smartphones: the chips used to talk to the cell towers, to send our data and our voice. That's what they're used for.
Apple is hiring developers right now to develop their own radio chip set. It might not be there for 5G; it might be 6G. In fact, the advertisements for those jobs mentioned 6G. But they're going to move away from all of these standard devices that are very expensive and hard to control.
Google is saying the biggest problem we have with making sure that users of the Android operating system get updates is Qualcomm. Interesting, isn't it?
Google is coming out with what's known as a system on a chip, an SoC. To understand what that is, think of the motherboards of years past. One of my first computers was an IBM 360/30 mainframe, and this thing was huge and not much power. It's just amazing to think about, but it really could sling data around even way back then. It was a nice little computer, if you will. Think about how big that motherboard was. It had the main processor, the memory controllers, the bus controllers, everything that needed to be there to support it, all of your I/O stuff. It might have had serial UARTs built into it, et cetera, et cetera.
With a system on a chip, you've got one chip and that's pretty much all you need. Obviously you've got to have memory, and you're going to have some sort of more permanent storage devices, but that's the basics of what a system on a chip is. Reportedly, the Pixel 6 is expected to ship with Google's custom system on a chip, called "Whitechapel" internally.
It's referred to as the GS101, and that GS could be for Google silicon. There are all kinds of people speculating; that seems to be the big one. There is a Pixel 6 in the works, we do know that. 9to5Google is a website out there, and they've done a lot of spying on what's going on. I'm not going to get into all of the details, but basically it's going to have a cluster of CPU cores, and it's going to be really quite nice: a large ARM core for single-threaded workloads and three medium cores for multithreaded work.
We've had a problem over the years: how do you make your computer faster? You can use Intel's approach, which is let's just throw more processors at it. That's great if the software you're using can handle multithreaded environments where you have multiple processors. Okay, you've got multiple processors, but how about the access to the memory? What if the processors all want access to the same area of memory at the same time? Then you have to start blocking. It gets very complicated very fast, and the performance scaling fades very, very fast. You don't have to get to too many CPUs before, all of a sudden, the addition of one more CPU cuts the performance of that new CPU by 50%. It really doesn't take much.
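You can see that blocking effect in miniature with a toy Python example (a sketch of the general shared-memory problem, not anything Intel-specific): four threads all updating one shared counter must take turns on the same lock, so the increments happen one at a time no matter how many threads, or cores, you add.

```python
import threading

counter = 0
lock = threading.Lock()

def add(n):
    global counter
    for _ in range(n):
        # Every thread must acquire the same lock to touch the same memory,
        # so the increments happen one at a time, regardless of core count.
        with lock:
            counter += 1

threads = [threading.Thread(target=add, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000 -- correct, but the lock serialized all the work
```

The answer is correct, but adding more threads buys nothing, because all of them queue on the same memory location. That, scaled up to real hardware, is why piling on CPUs hits diminishing returns so quickly.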
They're all trying to get away from Intel, and many of them have, right? Obviously Android phones, outside of Google as well, have been based on non-Intel hardware for a while, but they're also now trying to get rid of Qualcomm, and I think that's a good thing. Ultimately, it's going to help out a lot. We're going to see more of this in the future, and we're all going to benefit from it. With Google having control over their system on a chip, at least in their Pixel, it's going to make their life easier, which means if you buy a Pixel, you're probably going to be able to get the upgrades better.
I'm thinking in the back of my mind that maybe Samsung is looking to do the same thing. Maybe Samsung's looking to move away from the Qualcomm chips and move to Google's new system on a chip. I have no idea, I have no inside information, but that would seem to make sense to me, particularly if they want to provide support for years.
By the way, Google is in the embarrassing position of offering less support for Android devices than Samsung, which is now up to three years of major updates, which by the way, is Qualcomm's maximum. Samsung has four years of security updates for some of their devices as well.
You're listening to Craig Peterson.
You can find me online at CraigPeterson.com. Don't go anywhere.
You've heard about the no-fly list, right? Yeah. How about the terrorist and other watch lists? These are lists that people have found it impossible to get their names off of, even when there was no reason for them to be there in the first place.
Well, I got some news.
Hi, everybody. Craig Peterson here.
The Department of Homeland Security has been criticized for many things over the years. One of the things it's been criticized for quite a bit is the watch lists it maintains. They have a no-fly watch list, and people get put on that watch list. It was originally intended to be: we know this guy's a terrorist, so we're going to put him on, right?
It's not always the way it goes. It starts out almost innocuous, and before you know it, there are all kinds of people getting caught in this big, big net.
That's what's been happening lately, and it's going to get worse, because the Department of Homeland Security has decided that they are going to hire regular old companies to help develop this no-fly list and also this terrorist watch list.
Apparently these companies are going to be looking through all kinds of public data, maybe some private data, and social media in order to provide information for this new domestic terror watch list. So you look at that and say, okay, I can see that.
We've talked about this problem before. Twenty years ago, I think, I was talking about these data aggregators and the problems they create. They're taking public records and putting them all together, figuring out how it all meshes, and they come up with a pretty accurate picture of who you are.
Now, I've got to say, when I've had these data aggregators on my show before, I talked to them and said, okay, I want to look up my own records. So I looked myself up on their platforms. I did not see a single one that was more than about 30% correct about me.
Now, this was again, some years ago. I think it's been probably almost a decade since I last spoke with the data aggregators. They really are trying to blend into the background, nowadays. This data that's put together by these artificial intelligence systems is not necessarily that accurate and that gets to be a real problem.
So who is DHS going to hire? Well, from the description that has been reported on here by the Conservative Treehouse, it is going to be Big Tech: specifically Google, Facebook, YouTube, Instagram, Snapchat, Twitter, and more.
DHS is going to put them under contract to hire and organize internal monitoring teams to assist the government by sending it information on citizens they deem dangerous. Again, what could go wrong?
Our government is not allowed to spy on us. How many times have we talked about this? You have, of course, the Five Eyes, and then they added more and more. These are governments that spy on each other's citizens for each other.
So, for instance, the US cannot spy on US citizens. So we have an arrangement with the United Kingdom, New Zealand, Australia, and Canada to spy on US citizens for us. Does that make sense to you? Can you believe that?
We spy on their citizens for them and they spy on our citizens for us and all is good.
What's happening here is the Department of Homeland Security realizes it cannot spy on us directly. So this is what they've been doing for a very long time: they go to the data aggregators and they pull up the data that they want.
They want to see if this guy is maybe selling illicit drugs, so they pull up public records. What cars does he have? How many homes does he own? Who's he dating? Has she all of a sudden been buying diamonds and mink coats? What's going on here?
So now we're seeing the US intelligence apparatus really going live, quickly, to put together lists of Americans who could be potential threats to the government and need to be watched.
Now, it's all well and good. It's just like President Biden saying this week: oh, we're going to have these red flag laws, we're going to stop the sale of certain types of firearms, and so on. It all sounds good. The reality is we have known about some of these people before, right? This is all just a red herring from the federal government, because the real problem is that these terrorists, domestic and otherwise, who have shot up schools have almost always been reported to law enforcement as dangerous people. Some of them have even been on lists that say they cannot buy firearms, and yet they get firearms. Bad guys.
It's like here in the US: where does our fentanyl come from? We're not making it domestically. Our fentanyl is coming from China, often through Mexico, and it is killing people here in the US. Take the whole George Floyd incident and the fentanyl in his system. The question is, did the police operate properly? What killed him? According to the coroner's report, it was the fentanyl that killed him.
One way or the other, that fentanyl got here from China and is being used on the streets, and people are dying from it. Fentanyl's illegal. How could they possibly get it?
It's illegal for a felon to be in possession of a firearm. How did a felon get the firearm? The police were warned about the people in San Bernardino, California. They were warned. The people in that business told the police: we keep calling, we're really worried about this guy. And nothing happened.
So now what are we going to do? We're going to cast an even wider net, when we cannot take care of the reports that come in right now. We're going to get even more reports, and they're going to be coming from these AI systems. Again, what could possibly go wrong here? It's absolutely incredible.
They look at these reports, they try and determine are these actionable, the FBI or other law enforcement agencies. They've been deciding no, it's not actionable. They've been right sometimes and they've been wrong other times. This is a real problem.
What shocked me is NBC News, with Andrea Mitchell. NBC News is not a centrist news organization; it's very far left. Yet even NBC News is reporting on this. They're realizing the consequences. Here's a quote from NBC: "DHS planning to expand relationships with companies that scour public data for intelligence and to better harness the vast trove of data it already collects on Americans."
"The department is also contemplating changes to its terrorist watch-listing process." Absolutely amazing. "Two senior Biden administration officials told NBC News that Homeland Security, whose intelligence division did not publish a warning of potential violence before the January sixth Capitol riots, is seeking to improve its ability to collect and analyze data about domestic terrorism, including the sorts of public social media posts that threatened a potential attack on the Capitol."
"DHS is expanding its relationships with other companies that scour public data for intelligence, one of the senior officials said, and also to better harness the vast trove of data it already collects on Americans, including travel and commercial data, through Customs and Border Protection, Immigration and Customs Enforcement, the Coast Guard, Secret Service, and other DHS components." There you go, from NBC News. So remind yourself what the FBI contractors with access to the NSA database already did in their quest for political opposition research and surveillance, and then consider everything we were just talking about.
The Director of National Intelligence declassified a FISA judge's ruling, Judge James Boasberg's 2018 ruling, in which the FBI was found to have conducted tens of thousands of unauthorized NSA database queries. Do you remember that story? A very big deal. This judge had obviously been passing these things out like candy, and the FBI was misusing its power and authority. Again, what could possibly go wrong?
By the way, President Obama apparently has been telling us that we should use the no fly list to keep people from owning guns.
There's already a database maintained by the FBI. This whole thing is, as I said, a red herring. Things are going to get really bad if law enforcement does this, and frankly, they're going to do it. There's no two ways about it.
We have to be more careful about keeping our information, our data, private. That's what this whole course that started last week was all about: improving your Windows privacy and security, locking it down, because the way Microsoft ships Windows, and the way it installs and configures itself by default, does not keep your data private. That's a problem. So that's what we're going through. Hopefully you were able to get into it before we closed it Friday night.
Remind yourself of this and just keep chanting: nothing bad could happen here, right? Ah, the joys of all of these computers and databases and the way they work nowadays.
By the way, even if you use fake names and numbers and addresses like I do when they're not required, there's a good chance they still know who you are and where you are. I don't lie to the bank. I don't lie to the IRS. Nobody else needs to know the truth. Even if you have been keeping it private, there's a good chance they know who you are and where you are. Crazy.
Hey, visit me online at CraigPeterson.com.
Make sure you subscribe to my weekly newsletter.
Hi everybody. Of course, Craig Peterson here. We're going to talk today about these drone swarms, about your personal privacy risk tolerance, and about the breaches hitting organizations and individuals. What's going on? Ransomware is way up.
As usual, a lot to talk about. Hey, if you miss part of my show, you can always go online to CraigPeterson.com. You'll find it there. If you're a YouTube fan, CraigPeterson.com/youtube.
This is really an interesting time to be alive. Is that a good way to put it? There used to be a curse, "May you live in interesting times." At least that was the rumor.
One of the listeners pointed this out, there was a TV show that was on about five years ago, apparently, and it used this as a premise. I also saw a great movie that used this as a premise and it was where the President was under attack. He was under attack by drones.
The Biden administration has a policy now where they're calling for research into artificial intelligence, think the Terminator, where you can have these fighting machines.
These things should be outlawed, but I also understand the other side: if we don't have that tech and our enemies end up having it, we are left at a major disadvantage.
Don't get me wrong here. I just don't like the idea of anybody doing Terminators, Skynet type of technology. They have called for it to be investigated.
What we're talking about right now is the drone swarms. Have you seen some of these really cool drones that these people called influencers? Man, the term always bothers me. So many people don't know what they're doing. They just make these silly videos that people watch and then they make millions, tens of millions, I guess it's not silly after all.
These influencers make these videos, and there are drones they can use when they're out hiking, mountain biking, or climbing. They have drones now that will follow them around automatically. They are on camera; the drone follows them, focuses in on their face, and they can make it get a little closer or further away. As long as the sky is clear and there are no tree branches or anything in the way, that drone is going to be able to follow them, see what they're doing, and get some really amazing shots. I've been stunned by how good they are.
Those drones are using a form of artificial intelligence. I'm not going to really get into it right now; there are differences between machine learning and artificial intelligence. But at the very least here, the drone is able to track their faces.
Now this is where I start getting really concerned. That's one thing. But right now they, and when I say they, I mean the Chinese and probably us too, are designing drones that not only have cameras on them but are military drones. Without a central computer system controlling them or picking out targets, they're able to figure out where there's a human and take them out.
These small drones aren't going to take anyone out by firing a 50-caliber round. These drones can't carry that kind of firepower; the barrel and everything else that's part of that type of firearm is just too heavy. We're talking about small drones, so obviously they're not going to have a missile on them either.
What they do is they put a small amount, just a fraction of an ounce, of high explosives on the drone. The idea is if that drone crashes into you and sets off its explosives, you're dead, particularly if it crashes into and sets off explosives right there by your head. Now that's pretty bad when you get down to it.
I don't like the whole Skynet Terminator part of this, which is that the drones are able to find that human and then kill them.
Think of a simple scenario. Let's use the worst case: there's a war going on, and enemy troops are located approximately here. You send the drones out, and each drone of course has GPS built into it, or some other inertial guidance system in case GPS gets jammed.
That drone then goes to that area.
It can recognize humans, and it says, oh, there's a human, and it goes and kills the human. Now that human might be an innocent person. Look at all of the problems we've had with our aerial drones, the manually controlled ones we've been using over the last 10 years, where we say, okay, there's a terrorist here. Somebody controls the drone from Nevada or wherever it might be; they get their strike orders and their kill orders, and they go in and take the target out. There is collateral damage. That's always been true.
Look at Jimmy Stewart, for example. Younger kids probably don't know who he is; Mr. Smith Goes to Washington was one of his movies, and he had some great Christmas movies too. Anyhow, Jimmy Stewart was a bomber pilot in World War II. He flew combat missions over Germany. Think of what we did in Germany and in Japan, where we killed tens of thousands, probably hundreds of thousands, of civilians.
We now think, Oh we're much better than that. We don't do that anymore. We're careful about civilian casualties. Sometimes to the point where some of our people end up getting in harm's way and killed. For the most part, we try and keep it down.
A drone like this that goes into an area, even if it's a confined area, and we say, kill any humans in this area, there are going to be innocent casualties. It might even be friendly fire. You might even be taking out some of your own people.
They've said, okay, we've got a way around this: we're going to use artificial intelligence. The drone doesn't just pick out, oh, this is a human, I'm going to attack that person. It looks at the uniform, it looks at the helmet, and it determines which side they're on, whether they're wearing an American or a Chinese uniform or whatever it might be programmed for. Again, it goes into the area, finds a human, identifies them as the enemy, and then it hits them and blows up, killing that person. That's one way they are looking to use drones.
The other way is pretty scary. See, you can defend yourself against a single drone like that. You've got a drone coming; you're probably going to be able to hear it, though obviously it depends. When that drone gets close, well, if you've ever had kids flying drones around you, or done the same thing yourself, you know you can often just knock it out of the air, can't you?
If you're military and you have a rifle in your arms, you can just use the rifle and play a little baseball with that drone. There are some interesting stories of people who've been doing that already.
But what happens if we're not talking about a drone; we're talking about a drone swarm? I don't know that you could defend against something like that. There have been studies done on this. You think nobody's really working on this stuff? They sure are.
What's going to happen? Well, the Indian army is one that has admitted to doing tests and they had a swarm of 75 drones. If you have 75 drones coming after you, let's say you're a high value target. There is no way you're going to be able to defend yourself against them, unless you can duck and cover and they can't get anywhere near you with their high explosives. The Indian army had these Kamikaze-attack drones. They don't necessarily have to even have high explosives on them.
This is a new interpretation, under President Joseph Biden, of the Pentagon's rules on the use of autonomous weapons. We've always had to have "meaningful human control" over any lethal system; that's the wording the Pentagon uses. Now that could be a supervisory role rather than direct control, so they call it "human on the loop" rather than "human in the loop." But this is very difficult to fight against.
The US Army is now spending billions of dollars on new air defense vehicles. These air defense vehicles have cannons, two types of missiles, and jammers. They're also looking at lasers and interceptor drones, so they can use the right weapon against the right target at the right time.
That's going to be absolutely vital here, because it's so cheap to use a drone. Look at what happened a year or more ago now in Venezuela, where el presidente for life was up giving a speech. I'm sorry, I didn't mean that to be insulting, but that often is what ends up happening. A drone comes up and everybody's thinking, oh, it's a camera drone, wave to the camera. It got very close to the president and then blew up, on purpose. They were trying to murder the president. That's a bad thing. He was okay, though I guess some of the people nearby got relatively minor injuries.
When we're looking at having large numbers of incoming threats, not just one drone, but many drones, many of those drones may be decoys.
How cheap is it to buy one of these drones? Just like the ones that were used in China over the Olympic stadium, where they were all controlled by a computer, you can have most of them act as decoys. All you need is a few that can blow up and kill the people you want to kill. Very concerning, if you ask me.
We're paying attention to this as are other countries as they're going forward.
We're going to talk about building your privacy risk tolerance profile, because if you're going to defend yourself, you have to know what you're going to defend against and how much defense do you need?
Hey, we take risks every day. We take risks when we're going online. But we're still getting out of bed. We're still going into the bathroom. We're still driving cars. How about your online privacy risk tolerance? What is it?
Hi everybody. Thanks for joining me.
We all take risks, and it's just part of life. You breathe in air, which you need. You're taking the risk of catching a cold or the flu, or maybe of having some toxic material inhaled. We just don't know do we?
Well, on any given day when we go online, we're also facing risks. And the biggest question I have with clients, when I'm bringing businesses on board or working with high-value individuals who need to protect themselves and their information, is: okay, what information do you have that you want to try and protect? And what is your personal privacy risk tolerance? We build a bit of a profile from that, and you guys are going to get the advantage of doing that right now without having to pay me or my team. How's that for simple?
First of all, we got to understand that nothing is ever completely safe. When you're going online, you are facing real risks and no matter what people tell you, there is no way to be a hundred percent sure that your data is going to be safe online or that your individual personal, private information is going to be safe while you're online. And there's a few reasons for this.
The most obvious one, and I think the one we think about the most, has to do with advertising. There are a lot of marketers out there who want to send us exactly the right message at exactly the right time. How can they do that? They do it by tracking you, via Google. Google's whole business model is to know everything they can about you and then sell that information.
Facebook, same thing. Both of those companies are trying to gather your information, and they're doing it not just when you are on their sites, but when you are on other people's sites. Third-party sites are tracking you. In fact, if you go to my website at CraigPeterson.com, you'll see that I do set a Facebook cookie, so I know that you're on Facebook and you visited my site and might be interested in this or that.
Now, I'm not a good marketer, because I'm not using that information for anything, at least not right now; hopefully in the future we'll start to do some stuff. But that's what they're doing, and I don't think it's a terrible thing. I don't know about you.
I don't think it's bad that they know I'm trying to buy a car right now, because if I'm trying to buy a car, I want advertisements about cars. I don't want advertisements about the latest Bugatti or Ferrari; I want a Ford truck. Just something simple I can haul stuff around with. You already know I have a small farm, and you need a truck on a farm. I'd love to have a front loader and everything too, but those cost money and I ain't got it. So that makes sense to me.
And now there's the other side, which is the criminal side. And then there's really a third side, which is the government side.
So let's go with the government side here. In the United States our government is not supposed to track us. Now I say "supposed to," because we have found out through Edward Snowden and many other means that they have been tracking us against the law. And then they put in some laws to let them do some of it, but our government has been tracking us.
And one of the ways it tracks us is through the "Five Eyes" program, which has now been expanded and then expanded again. The Five Eyes program is where the United States asks the United Kingdom: hey, listen, we can't track our own citizens, we're not allowed to, but you're not us. How about you track Trump and his team for us? Yeah, that's what we'll do.
So there's an example of what the evidence shows has happened. They go to a third-party country that's part of this agreement, where all of these countries have gotten together, signed papers, and said, yeah, we'll track each other's citizens for each other.
That way the United States can say, hey, we're not tracking you, and yet you are being tracked, because they're going to a third-party country. And if you are going out of the country, or any of your communications are going out of the country, then again the United States can track you directly. So that's the government side.
And then of course, there's governments that track everything. You look at China and how they control all of the media. They control all of the social networking sites. They basically control everything out there.
We have to be careful with all of that, because it can and will be used. And we've seen it used not just to harass people, but to do things like throw them in prison or disappear them. Look at what just happened in China with the head of one of China's biggest companies, basically the Amazon competitor over there. He disappeared for months and then came back just praising the Chinese Communist government and how great it is to have all of those people telling everyone what to do and how to do it.
We obviously don't live in China. But I think we do have oligarchs nowadays. We have people who are rich who are running the country. They're giving money to campaigns, and they get the ear of politicians. You've seen all of the bribery allegations against the Biden family: his brother, his son, other members, himself as well, based on Hunter Biden's laptop.
So I don't trust government for those very reasons.
The hackers let's get into the hackers here. When it comes to hackers, there are, again, a few different types. You've got hackers that are working for governments. And what they're doing is in the case of a small government, like North Korea, they're trying to get their hands on foreign currencies so that they can use those currencies to buy grain, to buy oil, coal, whatever it is they might need to buy.
You have governments like China and Russia that are trying to basically run World War three. And they're out there with their hacking teams and groups and trying to figure out how do we get into the critical infrastructure in the United States? Okay. So this is how we get in. Okay. We're in over there. So if we ever want to shut down all of the power to New York City, this is what we do.
Now remember what happened back in, when was that, 2003 I guess. I remember I was actually heading to New York City, and all of a sudden all of the power went out.
That apparently was an accident, but it didn't need to be an accident; there are all kinds of allegations about what actually happened there. That's why China and Russia are trying to get into our systems: they obviously want to be able to wreak havoc. Look at the havoc caused in the US economy by this virus that came out of Wuhan, China. If they wanted to shut down our economy, they now have proof that's all it takes. And they are working on the genetics of some of these viruses over there in China, trying to modify the genes, and they are running experiments on their troops to enhance them, to make super soldiers that maybe need less sleep or less food, or are stronger, et cetera. They are doing that.
So China is a real threat from just a number of different ways. What would it be like if they could shut down our banking system or make it so we don't trust it anymore?
Okay. That's part one of your Personal Privacy Risk Tolerance Profile. Stick around because we're going to talk more about this and what you can do to help you have privacy.
What is your online, personal privacy risk tolerance? It's going to vary, I help high value individuals. I help businesses with this, and now I'm helping you as well. So let's get into part two.
Craig Peterson here.
When people ask me what they should do, that is a very nuanced question. At least, it requires a very nuanced answer. You could say something like: if you want to be private, use Signal for messaging and use Tor for web browsing. That's fine, and it works in some ways and not in others. For instance, Tor is a web browser that works like a super VPN. It is set up so that you're not just coming from one exit point; you're coming through a whole bunch of different points on the internet, so it's hard to track you down. The problem, however, with Tor is the same problem that you have with VPN services, and I talk about this all the time.
VPN services do not make your data secure, and they do not keep it private. That's the case with VPN services you might get for free or even pay for, and it's also the case with Tor: using those VPN services can actually make you less secure. Why did Willie Sutton rob banks? Because that's where the money was. Where is a bad guy going to go if they want easy, quick access to lots of people's private information?
They're going to hack a VPN server, aren't they? Yeah. And if they can't hack the VPN server, why not just rent server space in the same data center the VPN provider rents from, and try to get in from there? Or maybe get into the data center's logs or the VPN server's logs. Because even when they say they don't log, they all log. They have to log; they have to have your information, otherwise how can they bill you? The ones that say "we don't log" are lying, by the way. And even the providers that genuinely try not to log where you're going get fooled all the time, because their servers have logs, even if those logs are later deleted.
So I just wanted to make it clear: if you have a low risk tolerance when it comes to your privacy, Tor is not going to do it for you, and VPN services are not going to do it for you. You have to look at all of the individual things you're doing online and then decide, based on those, what is the most beneficial approach for you in each particular case.
So, Signal. I brought it up, so let's talk about it for a minute. Signal is the messaging app to use, bar none. Signal is end-to-end encrypted, and it is known to be highly secure, which again doesn't mean it's a hundred percent. But with Signal, you can talk to people on other platforms: you can have a Mac and talk to somebody on an Android or a Windows device.
But another consideration is who are you talking to? If you're talking to other people that have Macs and you don't want your information to get out, but you're not horrifically worried about it, right? You want it to be private. You want end to end encryption. You're better off using iMessage on your Mac.
If you're on Windows or Android, there aren't any great built-in messaging apps. WhatsApp, which I talked about last week and have covered on my website, is not great. It's not horrible either, but why would you use it? If there's any question, use Signal instead.
All right. There's just a lot to consider here, but here is the biggest bang-for-the-buck thing you can do: use a password manager. Now, we talked about how Google's Chrome and Microsoft Edge have a password manager built in; actually, Microsoft Edge came up with it first, and now Google is adding it. That's all well and good, but I don't trust those. I use a third-party password manager that is designed for password management, and that's all the company does; they're focused on the security behind it. That's why I recommend 1Password and LastPass, 1Password being my absolute favorite. Use those password managers. That's the biggest bang for your buck if you have a low tolerance for your information getting out. All right?
Both of those will help enforce good password habits. They will generate good passwords for you and keep them for you, which is really great.
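Under the hood, a good generated password is just high-entropy randomness from a secure source. Here's a minimal Python sketch of the kind of generator a password manager uses; this is my own illustration, not 1Password's or LastPass's actual code, and the character set and length are assumptions you'd tune per site.

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Generate a random password using the cryptographically
    secure `secrets` module (never `random` for passwords)."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Retry until the password contains at least one lowercase
        # letter, one uppercase letter, and one digit, since many
        # sites require a mix of character classes.
        if (any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)):
            return pw

print(generate_password())
```

The point of letting the manager do this is that you never have to remember, or even see, the password; it's stored in the vault and filled in for you.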
If you don't want to be tracked while you're browsing online, you can use an ad blocker. I have a couple of webinars I've done on that. If you want a video of one of those webinars, which goes through the different ad blockers and other blockers, I'd be glad to send you a link, but you're going to have to email me at me@craigpeterson.com.
And I will send you a link to one of those webinars, no problem. But some websites are going to break when you use an ad blocker, so sometimes you have to turn it off and then turn it back on. The blockers I tell you how to use and configure, and I actually show you step by step as we walk through it, allow you to turn off the blocker just on the individual site that broke. So it's pretty straightforward: you don't have to remember to turn the whole thing on and off.
Now, studies are showing that people are concerned about their privacy. In fact, the last figure I saw was that about 70% of Americans believe their smartphones are being tracked by advertisers, with the tech companies providing the information. A May 2020 Pew Research report talked about this: about 85% of consumers worry that they can't trust corporations with their data. So what do you do? Most people don't have the support or the tools. They don't have the money; they didn't get a big inheritance. They're not high-value individuals who need my help and can afford it, where we go through everything they do and make sure they have the best solution for each thing, including banking, going online, trading stocks, all of that.
You gotta be very careful with all of that stuff. I'm really sad that I have to say this here, but there are no online privacy solutions that will work for everybody. And there are no solutions that work in every situation either.
So what you need to do is understand what you care the most about. And I think for all of us, what we should care the most about is our financial situation and anything associated with that: our intellectual property, if we're businesses, our bank accounts, all of that sort of stuff is stuff we really should be concerned about. And that means you need to watch it. Make sure you're not sharing stuff that you really don't want to share.
So even privacy experts like myself don't lock everything down. We lock most of it down; particularly since we have Department of Defense clients, we have to maintain a very high standard.
All right. Stick around and visit me online. CraigPeterson.com. Make sure you sign up for my newsletter.
You'll get all of the latest news and the tips I send out every week.
I don't want to leave you hanging. We're going to get into a few more things to consider here, because obviously we are going to share some of our personal information. So I'm going to tell you how I share my personal information and it might be a bit of a surprise.
Hello everybody. Thanks for listening.
We all enjoy products and services, and that's what I'm saying when I talk about security experts: we don't lock everything down. I've used 23andMe. I did that thing, of course, sending in my DNA. That's been an issue in some cases, but that's what I did.
I use these online map programs. I use Google Maps; I use Waze more than Google Maps; I use Apple Maps, because I'm trying to figure out how to get where I want to go in a reasonable amount of time. But what I do is lie about the answers to the security questions. I don't want them to know my dad's name, my mother's maiden name, the street I grew up on, my first school, my first car. None of their business.
A lot of that information is actually publicly available. How many of us have it right there in our LinkedIn profile? Yeah, I went to McGill University; here's where I grew up; here are pictures of my childhood home, and that picture has GPS coordinates embedded in it.
So if we use the real information, we are giving away way too much. I use a little phrase I coined, which is "lie to your bank," and you might remember I did a show on that some time ago. The idea here isn't lying to the bank about your financial situation, nothing like that. You're lying to your bank about this personal information.
They don't need to know the answers to the personal questions they give you as security questions. It's really important to understand this. For instance, Jennifer Granick, she's at the ACLU, said her dad died recently, and the accountant said it's really important to report the death to the credit companies, because the answers to many of the security questions are on the public death certificate.
So answers to security questions really can be a nightmare, but that doesn't mean you have to give them the right answers. For instance, I found a site online, I should try and dig it up again, that generated fake identities. I had it generate like 5,000 of them for me, thinking, okay, this might go away at some point. It even generated fake social security numbers, fake phone numbers, names, addresses, everything you'd need for a fake identity. The idea here isn't to cheat anybody out of anything. The idea is: hey, Mr. Website, you don't need to know who I really am. So on some websites I'm female; on some websites I'm only 30 years old; on other websites I'm 80 years old. It doesn't matter.
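A toy version of that kind of fake-identity generator is easy to sketch. The name pools, street names, and the `fake_identity` helper below are all my own placeholders, not the real site's data; the point is just that none of it maps to a real person.

```python
import secrets

# Tiny sample pools; a real generator would draw from much larger lists.
FIRST = ["Alex", "Jordan", "Sam", "Casey", "Morgan", "Riley"]
LAST = ["Smith", "Nguyen", "Garcia", "Patel", "Kowalski", "Brown"]
STREETS = ["Oak St", "Maple Ave", "Birch Rd", "Cedar Ln"]

def fake_identity() -> dict:
    """Build one throwaway identity for sites that have no business
    knowing the truth. It's for privacy, not fraud: never use a fake
    identity with your bank, the IRS, or anyone legally entitled to it."""
    return {
        "name": f"{secrets.choice(FIRST)} {secrets.choice(LAST)}",
        "age": 18 + secrets.randbelow(60),            # 18 through 77
        "address": f"{1 + secrets.randbelow(999)} {secrets.choice(STREETS)}",
        "phone": f"555-{secrets.randbelow(10000):04d}",  # 555 prefix: never a real number
    }

for _ in range(3):
    print(fake_identity())
```

Save whichever identity you used for a given site in that site's entry in your password manager, so you can keep the story straight later.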
You can call it a lie if you want, but in reality you're just trying to keep your information private. And here's another advantage of these password managers: you have to keep your story straight, right? It's hard to remember a lie, and you have to tell a lie to enforce a lie. All that stuff your mother told you.
And she's right about that too, by the way. But if you're using a password manager, here's what I do: I create a unique email address. In fact, my email addresses are extremely unique. I'll use a plus sign as part of my email address, and my mail server knows: oh, okay, that's just Craig trying to track who is using that email address.
So I'll have, for instance, craig+youtube at one of my domains; I actually have a whole bunch of domains that I use. And if you want a secure email service, have a look at ProtonMail. They're actually very good from a security standpoint. So there's nothing illegal about giving them this information.
Yeah, you're lying to them, but you've got to keep your lies straight. That's another reason to use a password manager: I have the password manager generate my password, I put in the email address, which is unique for every website I go to, and I never use the same email address twice if I can avoid it. I use aliases in my email server too.
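That plus-sign trick, often called plus addressing or subaddressing, can be sketched in a few lines. The `craig` username and `example.com` domain below are placeholders, and whether the tag survives delivery depends on your mail provider supporting plus addressing (Gmail, Fastmail, and most Postfix setups do).

```python
def alias_for(site: str, user: str = "craig", domain: str = "example.com") -> str:
    """Return a unique plus-addressed email for one site, e.g.
    craig+youtube@example.com. Mail servers that support plus
    addressing deliver it to the base mailbox while keeping the
    tag visible, so you can see who leaked or sold your address."""
    tag = "".join(c for c in site.lower() if c.isalnum())  # strip spaces/punctuation
    return f"{user}+{tag}@{domain}"

print(alias_for("YouTube"))   # craig+youtube@example.com
print(alias_for("My Bank"))   # craig+mybank@example.com
```

If spam ever arrives at craig+youtube, you know exactly which site's database was compromised or shared, and you can filter or kill just that alias.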
And then I go into the notes section for that website in my password manager, and I put in the answers to the security questions, and I just make stuff up, nonsensical stuff. So if it's asking what my first car was, it might be a "transformational snooze." There you go, I just made something up. So I'll put those answers into my notes in my password manager and save them.
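If you'd rather have truly random nonsense answers instead of ones you dream up, here's one way to sketch it; the word list is arbitrary, and a real version could pull from any dictionary file:

```python
# Generate nonsense security-question answers ("transformational
# snooze" style): random words nobody can research or guess, meant to
# be pasted into the notes field of a password manager.
import secrets

WORDS = ["transformational", "snooze", "cobalt", "walrus",
         "pamphlet", "orbit", "velvet", "chisel", "mango", "quartz"]

def fake_answer(num_words=2):
    """Pick cryptographically random words for a 'first car' answer."""
    return " ".join(secrets.choice(WORDS) for _ in range(num_words))

print(fake_answer())  # different nonsense every run
```

The point is that the answer has no connection to your real life, so it can't be looked up or socially engineered; it only lives in your password manager.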
So if I ever have to do some sort of a recovery with those guys, it's going to be simple, because I just look in my password manager. I've got to go in there anyway to get my password, right, and my email address or username to log in. And there it is, there are my security questions. And then there's the password manager itself. I'm using 1Password.
It has a little database it keeps, and everything in there is encrypted. And the only way to decrypt it is with my master password, that's it. You only have to remember one password, and that's the password to 1Password, so that you can decrypt that little vault of a database of all of your information.
So I use, I think it's a 30-plus-character password, for 1Password, because yeah, I'm a little bit paranoid about all that sort of stuff. So that's a really good way to keep your information safe. I talked last week about a friend of mine whose wife went on Facebook to get some help, some tips on selling her investments, and about the disaster that followed. Okay, so a lot of people have regrets about what they've posted on Facebook, and there's a really cool thing out of CMU, Carnegie Mellon University, where, how many is it, six guys and gals put together this special report: "I regretted the minute I pressed share: a qualitative study of regrets on Facebook."
Very interesting. So they looked at all of this stuff as best they possibly could. And what did they find? Some examples. Just think for yourself what regrets you might have; I know friends of mine and the regrets that they have had. But there are a lot, so they go through the privacy risks. I can send you a copy of this article if you're interested.
It talks about their methodology. They analyzed comments on the New York Times website and others, and used Craigslist to recruit people. So they've got all of this stuff. Here you go, sensitive content. Number one: alcohol and illegal drug use. Think about that. Think about your employer, your next employer, or the police.
They get a report on you: oh my, this is a bad person. So they go onto your Facebook page and they find, oh, photos posted from a party, with some very unflattering photos in it, and maybe even a mention of illegal drug use. What do you think is going to happen? How about if you get stopped at the border coming back from Canada, Mexico, or Europe,
and they decide to take a little deeper look into you, and they find this stuff online? The next one is sexual content; you can imagine what that is. Think of a certain Congressman from New York. In fact, religion and politics are apparently among the things people have regretted posting online, along with profanity and obscenities, and personal and family issues.
Then there are work and company matters, negative or offensive comments, arguments, lies and secrets, venting frustration. And the reasons behind the regrets: good intentions and intended purposes that backfired, "I didn't think about it," posting in a "hot" emotional state. Yeah, oh my, this thing just goes on and on. But keep all of this in mind when you are trying to keep your information private. Whether you are a business or an individual, you have to keep an eternally vigilant watch, especially when your emotions are high, right?
It's like drunk dialing: don't do it. If your emotions are high, if something's been going on, don't put it online. So that's, I think, a real good bottom line about your personal privacy risk tolerance profile. Okay, be very careful. Don't put up stuff that you don't want other people to see. It's not true that once it's out on the internet, it's there forever.
It's not true that once you've posted it, it's there for anyone to discover. None of that's true, not at all. Okay? But be very careful. Cover up your laptop cameras. In fact, in the Improving Windows Security course, I go into this in quite a bit of detail: what you can do, what kind of cameras you can and should use, what sort of microphones you can or should use.
Many people just cover up the laptop camera with a sticky note when they're not using it. Disable automatic image loading in your mail program. That's important, and I do that as well, because that image that's in the email is usually being used to track you. It's really that simple. You've got new privacy laws in many states and in Europe, but they are really not going to work or help you with your privacy, except with the really big companies out there.
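To see why automatic image loading matters, here's a rough sketch of how an email tracking pixel is put together; the tracker domain and the ID scheme are invented for illustration:

```python
# A tracking pixel is just a tiny remote image whose URL carries a
# per-recipient ID. When your mail client fetches it, the sender's
# server learns who opened the message, when, and from what IP.
import uuid

def tracking_pixel_html(recipient_id):
    """Build the 1x1 <img> tag a marketer might embed in an email."""
    return ('<img src="https://tracker.example.com/open.gif'
            '?r=' + recipient_id + '" width="1" height="1">')

print(tracking_pixel_html(str(uuid.uuid4())))
```

Blocking automatic image loading means that request is never made, so the open goes unreported.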
So keep all of that in mind. All right, everybody, I want to encourage you to go to CraigPeterson.com. You'll see all kinds of great information there. You're going to be able to listen to my whole show, pick up all the little training tips, and even find out about the courses that I'm offering. CraigPeterson.com.
More stories and tech updates at:
Don't miss an episode from Craig. Subscribe and give us a rating:
Follow me on Twitter for the latest in tech at:
For questions, call or text: