The Edge of Risk
New thinking on corporate risk and resilience in the global economy.
Technology

How Much Should Your Smart Product Know About You?

Thanks to AI and better bandwidth, smart products are set to become increasingly interwoven with our lives. But there are risks involved in integrating such technology into our homes and workplaces. Carla Diana is head of design for Diligent Robotics and the author of the newly released My Robot Gets Me: How Social Design Can Make New Products More Human. BRINK spoke to her about what the next generation of smart products is likely to look like and how we can ensure our privacy.

DIANA: At the moment, in terms of things that we interact with, like microwave ovens and stereo systems, we’re using what I might call pre-baked messages that are set up to anticipate interaction. 

For example, if you’re interacting with your microwave, you might press a button that says “popcorn,” and then it emits a certain number of beeps or shows a screen display that says, “Okay, I’m cooking popcorn now.” And then, “The popcorn is done!” These are pre-baked messages, if you will.

Predicting Your Needs

In the next phase of development, we will be able to give products the capability to be entities unto themselves. These devices will be able to interact with us more or less on the fly, through embedded electronic systems that allow them to process much more information and essentially be socially responsive.

For example, you might press the “popcorn” button, and it might know that you’re in the middle of a party because it’s read your calendar, and it might know that you like your popcorn very well done. So using AI, it might read this along with other patterns and be able to respond to you with a suggestion: “Should I make four bowls for you and your guests, and do you want it all to be well done the way you usually like it?” 

There is a confluence of factors, including AI, cloud computing and more ubiquitous broadband, enabling this to happen. It is now possible to have multiple devices connected and accessing information on the cloud, backed by far greater computing power.

BRINK: What are the ethical risks of where this is heading, because these products will have more and more knowledge of our lives? 

DIANA: My book is mostly exuberant because, as a product designer, I enjoy the potential and the possibility for increased usability. But there are concerns.

The Risk of Being Persuaded by a Smart Product

Firstly, the product could start to use social engineering, for want of a better term, to persuade us to buy more of a certain brand of product — popcorn, say — or to start engaging in habits that are not to our benefit, but instead benefit the entity that has programmed this product. I think that’s an enormous red flag.

It is particularly concerning when you have vulnerable populations, such as children or people with cognitive disabilities, who could be easily swayed. 

To have greater interaction, these new AI tools need to take in a lot of information. You might have a camera that understands people’s facial expressions or gestures and is embedded deep inside a person’s home, in their bathroom, bedroom or baby room. As designers, we need to be aware of what information is being taken in by these products. 


The Need for Privacy Health Warnings

It is important for the designer to think about ways of making the information flow transparent. That is a big open opportunity for businesses, because trust is going to become increasingly important as people become more educated about how smart products work.

For example, if someone walks into my home and they’re not aware that I have a smart speaker that is listening, it would help for that device to give some indication of what it’s doing, such as a little light glowing in our peripheral vision.

That way the person can ask, “What is that?” And I can say, “Oh, it started glowing to let you know that there is a microphone on. It’s not going to be recording our conversations, but it is listening to our conversation in order to understand certain instructions that we might give it, so we can enjoy our time together, listen to music, get the weather,” etc.

It’s about giving our products the ability to communicate with us in this shorthand of light, sound and movement. Essentially, “privacy health” warnings. 

Finding the Balance Between Privacy and Convenience

BRINK: In the workplace, do you see these products increasingly taking on the role of the manager, issuing requests and orders to employees?

DIANA: I already see that starting. With cameras and microphones, your workplace could be a set of devices that sees where you are, what you’re doing, how well you’ve completed a task, and even measures the quality of your work.

You might sign a contract when you start a new position and need to be aware of what’s stipulated in it. There might be protocols that you’re accustomed to for home devices, in terms of respecting privacy, but your employer might require a contract with different privacy expectations, where you agree to a camera that sees what apps you’re using, how many hours you’re using them and whether or not you’re at your desk.

Such an arrangement gives you the freedom to do desktop work at 3:00 in the morning, as long as you’re getting it done well, but it may mean that you have a camera trained on your home workspace 24 hours a day.

Different Social Skills Needed

BRINK: Do you think that these products will require different social skills in the way that we interact with them?

DIANA: Oh yes — it’s almost like a new language, if you will, between a person and the product. And it’s very much a shorthand, because we’re going to lean toward less verbose, not more verbose. But that doesn’t mean it is any less social. 

I might have a certain way that I wave my hand and that’s how I want the drapes to be open, or a way that I snap my fingers, let’s say, and it means one set of lights goes on, as opposed to another set of lights.

The most successful products will make these electronic behaviors feel like part of the material. I realize that’s a little esoteric, but the best products I’ve seen build on existing product archetypes: a kitchen table that still feels like a kitchen table, but has smart functionality built into it.

It does not need to feel like an overwhelming shift in technology, but an enhancement of existing material. We’re going to see this happening more and more, because the technology itself is getting smaller, more flexible, more resistant to water, and so forth.

Carla Diana

Head of Design for Diligent Robotics @carladiana_

Carla Diana is the head of design for Diligent Robotics and cohosts the Robopsych Podcast. She also runs the tech-focused 4D Design two-year MFA Program at Cranbrook Academy of Art and is the author of My Robot Gets Me: How Social Design Can Make New Products More Human.
