The Subtle Ways Your Digital Assistant Might Manipulate You

TODAY WE GOOGLE for information, but in the future we might not need to. Instead we may rely on our butler, namely the intelligent, voice-activated digital assistant on our smartphones, smartwatches, or devices like Amazon’s Echo and Alphabet’s Home. Rather than searching the web, we’ll be able to ask our digital assistant how to remove a stain from our shirt. It will also handle routine tasks, like adding groceries to our shopping list, checking the weather, sending a text, or ordering an Uber.

Besides providing us with information, digital assistants can anticipate and fulfill our needs and requests based on what they know about us. Instead of fussing over life’s small details, we’ll trust our digital assistant to dim the lights and lower the thermostat when we leave the house. As our butler learns our preferences, we can rely on it for dinner or entertainment suggestions. And as it surfs the web to seamlessly deliver more of what interests us and less of what doesn’t, we will grow to like and trust it.

And yet, despite their promise, digital assistants also raise significant social, political, and economic concerns. The leading platforms’ plans, the Guardian reports, are clear: They envision “a future where humans do less thinking when it comes to the small decisions that make up daily life.” To work well, the digital butler will likely operate from an existing platform and tap into the vast personal data and services that platform offers. Four super-platforms—Apple, Amazon, Facebook, and Alphabet—dominate today’s online world. Not surprisingly, each is aiming for its digital assistant (Apple’s Siri, Amazon’s Alexa and Echo, Facebook’s M, and Google’s Assistant and Home) to become our head butler.

Why is each super-platform scrambling to be first? Because the more we rely on our butler, the more data it collects about us, the more its algorithms learn, and the better it can predict our needs and surface relevant services. The more we use the butler, the more power it will have.

Amazon’s Echo and Alphabet’s Home cost less than $200 today, and that price will likely drop. So who will pay our butler’s salary, especially as it takes on additional services? Advertisers, most likely. Our butler may recommend services and products that further the super-platform’s financial interests rather than our own. By serving its true masters, the platforms, it may distort our view of the market and steer us toward services and products its masters wish to promote.

But the potential harm goes beyond search bias, the charge Google is currently defending itself against in Europe. The super-platform’s growing economic power can translate into political power. As we come to rely on one or two head butlers, the super-platform will learn our political beliefs and gain the power to shape our views and the public debate.

If you’re one of the world’s 1.8 billion Facebook users, the service collects data on the things you and your friends do, the information you provide, your devices, your connections, and much more. It shares some of this information with your friends and some of it with third parties, and it makes deductions about your political leanings based on your activity. 

In 2012 Facebook conducted a study in which it manipulated some users’ News Feeds to examine how people transmit positive and negative emotions to others. When Facebook surreptitiously reduced positive content in a user’s News Feed, that user’s own status updates became less positive; when it surreptitiously reduced friends’ negative content, the user’s updates became less negative.

If Facebook can affect users’ moods and engagement simply by promoting some content in their News Feeds, imagine the power of digital butlers to affect our feelings and behavior. By complimenting and cajoling us, encouraging us to communicate with others, and sending personalized notes on our behalf, a butler can potentially affect our moods and those of our friends. Further, as many have reported recently, Facebook’s personalization can shape our views and opinions through a selective News Feed.

As we welcome digital assistants into our homes, we may appreciate the free service. But we won’t know its true cost. As the digital butler expands its role in our daily lives, it can alter our worldview. By crafting notes for us and suggesting “likes” for posts it drafted for other people, our personal assistant can effectively manipulate us through this stimulation. “With two billion ‘likes’ a day and one billion comments,” psychiatrist Dr. Eva Ritvo wrote in Psychology Today, “Facebook stimulates the release of loads of dopamine as well as offering an effective cure to loneliness.” Imagine the dopamine spike when your butler sets a personal record for “likes” on a political message it suggested. Your friends won’t know that your butler drafted the post. And none of us will know how that post might sway the public discourse in ways that benefit the super-platform.

Digital assistants have much to offer, but the next technological frontier may not be entirely rosy. As our digital butler takes over more of our mundane tasks, it will become harder to turn off. It will be tempting to rely on it ever more for the news we read, the shows we watch, and the things we buy and even say. We may feel that we are roaming freely through fields of ideas. And yet we are increasingly ushered along by the super-platform’s digital hand, not recognizing its toll on our well-being.