
Industry Watch: Beware the dark patterns of privacy


You know how sometimes, when an app requires you to sign in, it will give you the option of signing in via Facebook or Google? And you think, ‘Wow, this is nice of them to make it so convenient for me. I’ll have to send Facebook a card or something to thank them.’ (Well, maybe you don’t think that last part.)


You sign into the app with Facebook, only to learn that by signing in that way, Facebook has gained access not only to your personal data — the things you buy, where you stand politically, what kind of food you like to eat when dining out, where you went to elementary school — but to your list of friends as well, and it can sell those names and their data to third parties hungry for both.

So what appears to be a benign service actually represents what is known in the user experience design world as a ‘dark pattern.’

And that, said John Biondi, vice president of experience design at Nerdery, “is a UX not crafted to respect privacy.” Instead, he thinks companies — and the people who design their websites and applications — will gain more trust and have more success if they offer their users both agency to control their data and transparency into how their data is being used. “When I both tell you what I’m going to take from you and what my intentions are with that, and I give you the ability to opt in or out of it — the agency to make your own decisions — then that’s a respect of privacy,” Biondi said. “But I think there are many situations where you have one of those things but not both. A lot of times, like with social sign-ins, you do have agency not to use it, but you don’t have transparency. You have almost no information about what they’re going to do with it.”
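
To make that pairing concrete, here is a minimal sketch, in TypeScript, of what a sign-in consent step built around transparency and agency might look like. The interfaces and field names are illustrative assumptions rather than any real platform’s API; the idea is simply that every piece of data requested comes with a stated purpose and sharing disclosure, and that declining is a first-class outcome.

```typescript
// Hypothetical consent model: each data request names what is collected,
// why, and who receives it (transparency), and records an explicit
// opt-in or opt-out decision (agency). Names are illustrative only.
interface DataRequest {
  field: string;          // e.g. "friends list"
  purpose: string;        // why the app wants it
  sharedWith: string[];   // third parties, if any
}

interface ConsentDecision {
  field: string;
  granted: boolean;
  decidedAt: Date;
}

function describeRequest(req: DataRequest): string {
  const sharing = req.sharedWith.length
    ? `shared with ${req.sharedWith.join(", ")}`
    : "not shared with third parties";
  return `We will collect your ${req.field} to ${req.purpose}; it is ${sharing}.`;
}

// Declining is recorded the same way as granting, so the sign-in flow
// can continue either way instead of blocking users who opt out.
function recordDecision(req: DataRequest, granted: boolean): ConsentDecision {
  return { field: req.field, granted, decidedAt: new Date() };
}

const request: DataRequest = {
  field: "friends list",
  purpose: "suggest people you may know",
  sharedWith: [],
};

console.log(describeRequest(request));
console.log(recordDecision(request, false));
```

The design point in the sketch is that a refusal is handled exactly like a grant, so opting out does not punish the user by derailing sign-in.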

Biondi speaks in terms of a privacy user experience. “When something says, ‘This ad wants to use your location,’ should you let it? That’s a privacy user experience, but those are pretty tactical and pretty rare. I think when you talk about increasing privacy through user experience, you’re adding friction into a process, which is exactly the opposite of what user experience design usually does.”
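
That tactical location prompt is one privacy user experience developers can directly control. As a rough illustration, assuming a browser context and a hypothetical “share location” button, the standard Geolocation and Permissions APIs can be wired so the request only happens after an explicit user action:

```typescript
// Minimal sketch: location is requested only in response to a user click,
// using the standard browser Geolocation and Permissions APIs. The button
// id and the logging are assumptions for illustration.
const button = document.querySelector<HTMLButtonElement>("#share-location");

button?.addEventListener("click", async () => {
  // Check the current permission state first, so the browser prompt only
  // appears after the user has chosen to act.
  const status = await navigator.permissions.query({ name: "geolocation" });
  if (status.state === "denied") {
    console.log("Location access was previously denied; nothing is requested.");
    return;
  }

  navigator.geolocation.getCurrentPosition(
    (pos) => {
      // In this sketch the coordinates stay on the client; sending them
      // anywhere else would be another disclosure the user should see.
      console.log(`Using location ${pos.coords.latitude}, ${pos.coords.longitude}`);
    },
    (err) => console.log(`Location not shared: ${err.message}`)
  );
});
```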

But many companies are loath to do that, because selling data for profit, or using data to fine-tune marketing and sales, leads to greater revenue. Mostly, though, it’s because users aren’t demanding that their data remain private and under their control.

“To add friction into a process, you would have to get to a point where people wanted privacy so much that they were comfortable with the additional friction, and it doesn’t seem like we’re at that point,” Biondi said.

So how did we get here? Since the beginning of the internet, the world has wanted software to be free. So, if search is free, and sharing stories and photos with friends is free, these companies had to come up with ways to monetize their software. What they came up with was to collect data on all these users and then offer it to advertisers so they could sell those users their products and services. “And we call that advertising, but I don’t think that’s advertising,” Biondi said. “It’s an indirect, sneaky way of advertising, I guess. If we hadn’t gone down that path, if we’d said Google will cost $50 a year, I don’t think we’d be having this conversation. I don’t think the Googles and Facebooks of the world would have been forced to be this sort of dishonest about the way they make money.”

Biondi, like so many, particularly dislikes Facebook’s business methods. “I think Facebook is a company to be concerned about, frankly,” he said. “Facebook has done things with our privacy that haven’t been directly monetized, like the way Facebook has used our behavior on its platform to learn that the more polarized it can make me in my political beliefs, the more I will engage. The angrier a liberal I am or the angrier a conservative I am, the more I engage with their product. Facebook’s job is to push me further down that continuum little by little, and suddenly we find we’ve destroyed relationships with family members and friends we once considered part of our inner circle, because we’ve been, I think, victimized by Facebook’s privacy violations. They used what they knew about me to radicalize me. By radicalize, I don’t mean we’re terrorists; I just mean I now identify as a complete snowflake or as an alt-right Nazi, whereas before I might have been somewhere in the middle of the spectrum.

“I literally went through a phase where I did not speak to my father-in-law shortly after the 2016 election mostly because of Facebook, until it sort of dawned on me that some of this is being done to me. I’m part of someone else’s intention here in a way that I’m not comfortable with.”

But does any of this actually hurt these companies? “It doesn’t seem to,” Biondi said. “We haven’t reached a tipping point where users are demanding it. There comes a point where people realize they’re being treated unfairly, and they demand to be treated fairly. And we’re not at that tipping point, I don’t think. Although we know our privacy is not being respected, the convenience we get from these products and services outweighs the concern.”

Biondi said Nerdery has come up with a set of principles for designers to follow. First is to understand that design is an expression of intent, and designers need to understand when the intention is dark, or immoral.

“Is it the job of a UX designer at a huge corporate giant to take the principled stance? I don’t know,” he said. “Somebody has to take a principled stance. I honestly believe that companies and designers need to think about these sorts of dark patterns and make it everyone’s job to take a principled stance. And if we’re not giving transparency and agency, then we’re probably on the wrong side of the privacy issue.”

One company that Biondi thinks is on the right side of the issue is Apple, which will now enable users to sign into websites and apps with their Apple ID. “They’re just doing it for convenience; they’re not making a business out of it, which is a very intentional decision,” he said. “They could make a business out of it; that path is well-worn. But Apple decided privacy is going to be a differentiator for it, and it hopes to recoup the money it could have made off violating people’s privacy through customers flocking to it because it protects their privacy.”

And Amazon recently said it will allow Alexa owners to delete the historical record of their conversations. But that, to Biondi, does not go far enough. “It does give you a form of agency, but we don’t know why they’re keeping them, and we don’t know what they’re doing with them,” he said. “So there’s no transparency. If you think my grandma, or my mom or dad, are going to go in and delete their Alexa files, you are mistaken. That’s not happening.”
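
As a purely hypothetical sketch of what pairing that agency with transparency could look like (this models the idea in Biondi’s critique, not any real Alexa or Amazon API), a voice-history store might expose deletion alongside a plain-language retention policy:

```typescript
// Hypothetical model: deletion (agency) sits next to a stated retention
// policy (transparency). All names and values here are illustrative.
interface RetentionPolicy {
  purpose: string;         // why recordings are retained
  retainedForDays: number; // how long they are kept
  sharedWith: string[];    // who else receives them, if anyone
}

interface VoiceHistoryStore {
  policy(): RetentionPolicy;                  // transparency
  deleteAll(userId: string): Promise<number>; // agency: returns count deleted
}

class InMemoryVoiceHistory implements VoiceHistoryStore {
  // Recordings are kept in memory only for this sketch.
  private recordings = new Map<string, string[]>();

  policy(): RetentionPolicy {
    return {
      purpose: "improve speech recognition accuracy",
      retainedForDays: 90,
      sharedWith: [],
    };
  }

  async deleteAll(userId: string): Promise<number> {
    const count = this.recordings.get(userId)?.length ?? 0;
    this.recordings.delete(userId);
    return count;
  }
}

const store = new InMemoryVoiceHistory();
console.log(store.policy());
store.deleteAll("user-123").then((n) => console.log(`${n} recordings deleted`));
```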

So far, the ways data is being used appear somewhat benign. You search for a pair of shoes, and ads for those shoes pop up over and over on every website you visit. Annoying, but we can deal with it. But what else can be mined from your data, and what could potential bad actors do with that information?

Until people demand to know why data such as interactions with voice assistants or chats with friends is being saved, and what these companies are doing with it, their privacy will remain compromised. Is that worth the convenience?
