What happens when your smart home data is used against you?


By now, you’ve probably heard about China’s Social Credit scoring system. It combines facial recognition technology with all sorts of personal data to determine if you’re a good citizen. Jaywalk, for example, and you’ll be fined in under a minute, plus your social credit score could get dinged. If you’re seen hanging out with the wrong crowd, as deemed by China’s government, you could end up on a blacklist. And you don’t want to be on that list, unless you want your internet speeds slowed or to be denied access to public transportation.

Apparently, China isn’t the only country looking to gamify social behaviors. There are hints of similar strategies right here in the US, ranging from being kicked off a communications platform to being banned from bars that share patron data electronically. This recent article at Fast Company shed a little light on such concerning behavior monitoring.

But after some deeper thought, I became even more concerned. The systems used for this social surveillance are all online or out in public, and often optional. What happens in a worst-case scenario, when social data from inside our smart homes is used against us?

To some degree, this has already happened, although not in the widespread fashion that worries me.

For example, you can opt-in with an auto insurance company and allow them to monitor your driving for several weeks: A small connected device plugs into your car, gathering and then sending your driving data back to the company. In turn, if your data shows that you’re a responsible driver, you earn a reduced insurance rate. Some consumers have already taken to sharing such data on a full-time basis as this recent State Farm ad explains in a fun way.

And we’ve seen a few cases where data from either a wearable health tracker or a digital assistant that might have heard something incriminating has been used in criminal lawsuits. Even so, these are mostly one-off situations.

What if these go from the exception to the norm, though? I’m thinking worst-case here, but the potential is worrisome.

That insurance company that gave you a safe-driver discount thanks to the limited time you willingly provided driving data would likely be very interested if your smart home cameras noticed you sitting around smoking or drinking all the time. What you saved on auto insurance goes out the window with an automatic life insurance premium bump, for example. I’m sure potential employers would want to see that same camera feed — or at least the data from it — in this future world. So much for that job interview.

It doesn’t even have to be that scary of a privacy invasion or have that big of an impact. Think about all of the connected thermostats we have installed. What if you were a “bad citizen” because you didn’t conserve energy as much as the average person, simply because you like your house as cool or as warm as you want it? In today’s world, it’s your house, and as long as you pay your utility bill, it’s all good, right? In a future world, the all-knowing utility company could be allowed to hit you with higher rates than what others pay, just because you’re a drain on the system.

And what about the company you keep in your home? Between GPS and cellular location services, not to mention video doorbells and cameras with facial recognition, it’s not exactly difficult to see who is coming and going from your house. If that visitor is on a questionable “not good for society” list, you could be guilty by association. 

In a conversation with me earlier this week, Stacey noted a personal example of how our smart home devices and the data they capture can raise such questions. Although he was quite confident he knew the answer, Stacey’s husband asked her, “Who was sleeping in the bed with you?” It sounds like an odd question, but she was testing a sleep sensor, and the data suggested to her husband that there were two people in the bed. There were: Stacey and her daughter.

The takeaway here isn’t “Don’t have affairs in a smart home”; instead it’s that we don’t always realize what smart home data can be used for. And I suppose users could turn the tables on Big Brother as well: I could put a step tracker on my dog to fool my health insurance company, for example. Or perhaps we’ll see GPS spoofing apps (some of which already exist) that show our employer that we were working while out of the office when we’re really playing hooky at a ball game.

Do I think we’ll follow in China’s footsteps with a full-blown social credit score? I hope not. And our situation is different in that we have a mix of public and private companies to diminish the chance of a single entity watching our every behavior. Still, the possibilities and untapped uses of our data, particularly in the smart home, have me more concerned than ever.

The post What happens when your smart home data is used against you? appeared first on Stacey on IoT | Internet of Things news and analysis.


