By Annie Pettit
We’ve been hearing for years that if you get something for free, you are the product. But for many people, that adage never really sank in until the recent Cambridge Analytica scandal, in which the company was accused of misusing and failing to secure Facebook data from more than 71 million people.
Before the scandal erupted, Canadians were perhaps complacent about privacy. But a Google Trends chart shows a stark reversal beginning in March 2018. Now, though interest in Cambridge Analytica has quickly dropped off, searches related to privacy continue to rise.
Privacy and personalization form a double-edged sword. For many people, personalization is what you get when emails and newsletters address you by your first name. Our names have been public information since the day we were named, so we don’t normally feel a huge loss of privacy when someone we don’t know uses that information. And for the 2 BILLION people who use Facebook, the personal data we share on that website, from friends and family to favourite musicians and politicians, is shared under the assumption that it will stay safe and secure within the website.
But for early adopters who have plunged head first into all that technology has to offer, the broader application of personalization is the magic that happens with a voice-activated home assistant such as Amazon’s Echo (with Alexa), Apple’s HomePod, or Google Home. When you literally tell a small electronic device to order more slow-cooked beef pot roast, personalization means that the device recognizes YOUR voice. It knows that you usually buy pot roast from M&M Food Market. It uses your saved credit card number and places the order to be delivered to your home after 6pm that day. That instant gratification is the ultimate goal of personalization. And the consequence is the ultimate loss of privacy.
Many of us willingly give up our most personal and sensitive details to companies and brands because we love them and believe that the relationship improves our lives. We give those companies our kids’ names and our credit card numbers because it makes things easier and lets us spend our time doing the things we want to do, in the way we want to do them.
On the other hand, personalization can sometimes be a less than wonderful thing. Social media quizzes that ask for personal information such as pets’ names, favourite activities, authors, and books are probably used to tell you which celebrity you’re most similar to. But in some cases, these data are also used to profile your shopping personality and determine which products and services you could be persuaded to buy. That isn’t necessarily bad. In other cases, though, these data could be used to serve you deliberately slanted or misleading information, as we are discovering from the Cambridge Analytica fiasco.
We need to find a happy medium.
We know that privacy standards, even when very strict and well enforced, are not always sufficient to safeguard data. We know that we share too much information with websites we don’t completely trust. We know that laptops get forgotten, lost, or stolen, allowing access to highly confidential files and software. We know that hackers around the world are actively trying to access private information, whether for fun, status, or malice. Perfect privacy with technology is impossible.
The happy medium lies in giving consumers good options. Companies that are willing to put in the work to earn consumer trust will enjoy long-lasting success. Consumers will reward companies that have a track record of good behaviour and quick, friendly customer service. Consumers will even reward companies that make the occasional privacy or security mistake, as long as the necessary apologies are quick and genuine, and the resolutions are purposeful.
It might cost more to create winning customer service experiences and to strike the right compromise between personalization and privacy, but the reward is loyal consumers. And nothing is more valuable than that.