
Artificial Intelligence is Supercharging How You (and Your Home) Are Surveilled

Lidia Davis

Home Security Writer

7 min. read

All products and services mentioned on Reviews.com are chosen by our editorial staff. If you click on a link, we may earn a commission. Learn more.

You’ve surely walked past a security camera at some point. The encounter probably didn’t faze you, and you might not have even noticed the camera at all. With cameras increasingly accessible for use in the home (even on our doorbells), people are living in a new age of surveillance. At the same time, artificial intelligence (AI) is unlocking a whole new world of potential for the technology, as documented in a recent report by the American Civil Liberties Union.

Jay Stanley, a senior policy analyst with the ACLU and the author of the report, “The Dawn of Robot Surveillance,” analyzed the $3.2 billion video analytics industry, which, he says, is supercharging surveillance cameras, awakening them from passive recorders into active watchers.

“Institutions, computer scientists, and government agencies are turning to AI to do the grunt work of watching hours upon hours of boring video for anything that the camera and owners want to look for,” Stanley said. “There can be good uses for that technology, but there can also be very frightening ones, because it does have the potential to supercharge a police state — to turn video cameras from something that passively record us to active watchers that are constantly judging us.”

Where We Could Be Headed

The AI we see used today, whether in Google Nest cameras that detect familiar faces or in police systems that aggregate biometrics from hundreds of cameras across a city to catch criminals, has the potential to unlock innovation that hasn’t been possible relying on cameras and their operators alone.

“What we are seeing today is that AI-enabled public surveillance allows law enforcement to act much faster when calamities arise,” said Arjan Wijnveen, CEO of CVEDIA, in a statement to Reviews.com. “While in the past a manual review of surveillance footage using hundreds of agents to find a suspected terrorist would’ve taken days to weeks, this can now be achieved within hours due to major breakthroughs in AI-based people re-identification.”

In another context, that same technology could one day be used to pick up on people’s physical and emotional state. For example, the ACLU report points to a 2016 study from the Massachusetts Institute of Technology (MIT) that explores how cameras could be used to “monitor health and stress levels for people during daily life” and “intervene in high stress situations.”

One of the report’s more Orwellian scenarios reads: “Hi there, we saw you jogging by at an 11:31/mile pace earlier today. We have a special life insurance policy just for people like you!”

While the technology isn’t quite there yet, according to Stanley and other AI experts we’ve talked to, there was also a time when people couldn’t have imagined technology like voice recognition or a personal assistant named Siri living in their pockets.

“Maybe the most unlikely thing about that scenario is that they let you know that they’re doing it,” Stanley said. “You know, personal information about people is worth a lot of money … so I think it’s highly likely we’re going to see a lot of these uses.”

What We’re Already Seeing

The Department of Homeland Security (DHS) plans to use facial recognition technology on 97% of departing air passengers from the U.S. by 2023. DHS reported earlier this month that over 2 million passengers on 15,000 flights have been scanned by U.S. Customs and Border Protection (CBP) biometric facial recognition technology (the Biometric Exit Program) since the technology’s 2017 inception. The program draws on photographs from an existing “gallery” of information, including passports, visas, and prior interactions with CBP at border encounters, and uses AI facial recognition to cross-reference that gallery data with live photos taken before departure.
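For readers curious what that cross-referencing looks like under the hood, here is a minimal, purely illustrative sketch. It assumes a face-embedding model has already turned each gallery photo and the live departure photo into numeric vectors, and it matches them by cosine similarity; the function names, threshold, and toy data are hypothetical and are not drawn from CBP’s actual system.

# Illustrative sketch only: matching a live photo against a gallery of face
# embeddings. The threshold and data below are hypothetical placeholders,
# not CBP's actual system or any vendor's real pipeline.
from typing import Dict, Optional
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_gallery(live_embedding: np.ndarray,
                          gallery: Dict[str, np.ndarray],
                          threshold: float = 0.85) -> Optional[str]:
    """Return the gallery ID whose embedding best matches the live photo,
    or None if no candidate clears the (assumed) similarity threshold."""
    best_id, best_score = None, -1.0
    for traveler_id, gallery_embedding in gallery.items():
        score = cosine_similarity(live_embedding, gallery_embedding)
        if score > best_score:
            best_id, best_score = traveler_id, score
    return best_id if best_score >= threshold else None

# Toy usage: random vectors stand in for embeddings a real model would produce.
rng = np.random.default_rng(0)
gallery = {"passport-123": rng.normal(size=128), "visa-456": rng.normal(size=128)}
live_photo = gallery["passport-123"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
print(match_against_gallery(live_photo, gallery))  # -> passport-123

In a production system the gallery would be far larger and the linear scan would likely be replaced by an approximate nearest-neighbor index, but the core idea, comparing one captured face against many stored ones, is the same.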

According to the DHS report, the Biometric Exit Program helped CBP discover 7,000 people who had overstayed their visas, and six people who had attempted to enter the country with travel documents that didn’t belong to them.

In the residential and home security spaces, the AI used to differentiate between humans and animals is gaining traction. Even some of the most inexpensive cameras on the market, like those from Wyze, let users cut down on false alarms by opting to receive alerts only when a person triggers the sensors, rather than an animal or a gust of wind. Google Nest, on the other hand, takes the technology a step further with its “familiar face detection.” So not only can Google Nest’s more advanced security cameras distinguish between humans and animals, but you can also teach them to recognize individual family members, friends, and other guests. The feature doesn’t work automatically, however, and it might take your Google Nest cameras several weeks to recognize everyone accurately.
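The alert-filtering logic itself is conceptually simple once a detection model has labeled what is in the frame. The sketch below is a hypothetical illustration of that idea, not any vendor’s actual API; it assumes an upstream model has already produced labels and confidence scores for each detected object.

# Illustrative sketch only: suppressing motion alerts unless a person is detected.
# The Detection structure, labels, and thresholds are hypothetical, not a real
# camera vendor's API.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str         # e.g. "person", "animal", "vehicle"
    confidence: float  # model confidence between 0 and 1

def should_alert(detections: List[Detection],
                 alert_labels=frozenset({"person"}),
                 min_confidence: float = 0.6) -> bool:
    """Alert only if something in the watched label set clears the confidence bar."""
    return any(d.label in alert_labels and d.confidence >= min_confidence
               for d in detections)

# A confident cat stays quiet; a person in the same frame triggers a notification.
print(should_alert([Detection("animal", 0.9)]))                            # False
print(should_alert([Detection("animal", 0.9), Detection("person", 0.8)]))  # True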

In the rush to find new and efficient ways to harness AI, facial recognition technology also opens the door to inaccuracies and unfair discrimination. Amazon began pushing its facial recognition feature, Rekognition, to law enforcement not long after its release, according to the ACLU. But the technology received backlash from researchers who found the feature’s algorithms weren’t able to distinguish different skin tones and genders as well as AI from IBM and Microsoft. Another MIT study shows how algorithms trained with biased data cause “algorithmic discrimination.” The study found that, in general, big names in AI like Microsoft and IBM are better at detecting male faces and “worst” at “detecting darker female subjects.”
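Findings like those come from comparing a model’s error rates across demographic subgroups. Here is a minimal, hypothetical sketch of that kind of audit; the group names and toy records are made up for illustration and are not the MIT study’s actual data or code.

# Illustrative sketch only: measuring a classifier's error rate per demographic
# subgroup, the kind of comparison behind findings of "algorithmic discrimination."
# The records below are invented for demonstration.
from collections import defaultdict
from typing import Dict, List

def error_rate_by_group(records: List[dict]) -> Dict[str, float]:
    """Each record has 'group', 'true_label', and 'predicted_label' keys.
    Returns the fraction of misclassified examples for each group."""
    totals, errors = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        if r["predicted_label"] != r["true_label"]:
            errors[r["group"]] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Toy audit: a much higher error rate for one subgroup is a red flag that the
# training data or the model deserves scrutiny.
sample = [
    {"group": "lighter_male", "true_label": "male", "predicted_label": "male"},
    {"group": "lighter_male", "true_label": "male", "predicted_label": "male"},
    {"group": "darker_female", "true_label": "female", "predicted_label": "male"},
    {"group": "darker_female", "true_label": "female", "predicted_label": "female"},
]
print(error_rate_by_group(sample))  # {'lighter_male': 0.0, 'darker_female': 0.5}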

The Electronic Privacy Information Center, a nonprofit research organization, shared documents with BuzzFeed News, which reported there were originally “no limitations” on how partnering airlines could use the facial recognition data gathered at airports, and that CBP skirted a mandatory system of checks and balances designed to seek public feedback prior to implementation.

Now, CBP states partners aren’t allowed to use these photos for their own business purposes, but because digital privacy laws — let alone facial recognition laws — are scant to nonexistent in the U.S., privacy experts say the expectations are slightly different.

“You know, in many ways, it’s a wild west when it comes to privacy, and that throws it on consumers way more than it should to look out for themselves,” Stanley said.

Management at the Consumer Level

Although it might be difficult, you have the right to opt out of biometric facial scanning at the airport (a right that applies only to U.S. citizens and exempt aliens). You also have the right to ask your neighbor with a doorbell camera to stop recording you. In Google’s own words, for its Nest users:

“Use familiar face detection in compliance with the law. Depending on where you live, you might need to get consent to have your camera help identify people visiting your home.”

Although privacy policy in the U.S. has largely been left to a self-regulatory structure, certain cities, like San Francisco, and even whole states are taking an interest in biometric harvesting. For example, under the Illinois Biometric Information Privacy Act, not only do companies have to seek explicit consent before collecting and storing biometric information, but users also no longer have to prove an “injury” to bring a claim to court, meaning you don’t have to prove the data gathering negatively affected your life. Washington and Texas also have biometric privacy laws in place, and lawmakers in Massachusetts, Florida, and Arizona are considering similar legislation.

If anything, we could see explicit consent protocols become more commonplace (as seen with Google Nest).

Such state regulation “has a lot of companies in the industry concerned about continuing to use biometrics without employing proper consent protocols,” David O. Klein, managing partner at Klein Moynihan Turco, told us earlier this year.

In fact, Google has decided not to even allow familiar face detection in Illinois in an effort to avoid conflict with the law.

“I think that there will be battles over these technologies, and I’m optimistic that the American people won’t let their life be completely transformed in ugly ways by these surveillance technologies,” Stanley said.

You’ve just read the latest edition of Magnified, our newsletter for those who want to dig a bit deeper into the services we review. At Reviews.com, we’re obsessed with finding the best services out there. If you like what you see on our site, you probably are too. The mission of Reviews.com, finding the best services to spend your hard-earned money on, is just the end of a long journey. If you’re curious about the more interesting parts of that journey, not just what’s good but why it’s good, then you should probably sign up for our newsletter.

Every month our expert writers tell the story behind the story of one of our reviews. When’s the best day of the week to book travel? Why do inkjet photos fade so quickly? What’s the worst thing about air fryers? If you’re like us, the world starts to look a lot more interesting when you look beyond “What should I choose?”

Here’s a look at what we’ve done in the past:

Silicon Valley vs. Wall Street: The Battle for Your Money

Venmo Is the Gateway Drug to Digital Currency

The Common Cold Is Still Incurable, But Science Isn’t Giving Up

