Abu Dhabi, UAE, Saturday 25 November 2017

Social media regrets: Minds behind addictive apps are having second thoughts

As end users, we may occasionally ponder the effects of technology on our own lives, but it’s telling when the people instrumental in creating that technology start to develop misgivings.

Sean Parker. Bloomberg

Most of us will have a few regrets about decisions we’ve made in the workplace, but few of those decisions will have had the same impact as ones made by executives and product designers working in Silicon Valley.

With over 2 billion of us using smartphones and 3 billion connected to the internet, our daily choices are disproportionately affected by a small number of technology companies – perhaps more than the people working for those companies realise.

But a growing number of former employees have been going public with their unease.

The latest is former Facebook president Sean Parker, played by Justin Timberlake in the 2010 film The Social Network.

At an event last week run by American media company Axios, Parker was frank about his disillusionment with the service he helped to create.

“I don’t know if I really understood the consequences... of a network when it grows to a billion or 2 billion people,” he said. “It literally changes your relationship with society, with each other. God only knows what it’s doing to our children’s brains.”

As end users, we may occasionally ponder the effects of technology on our own lives, but it’s telling when the people instrumental in creating that technology start to develop misgivings.

As Silicon Valley severance packages tend to come with non-disclosure agreements, few former employees feel able to speak out. Some, however, like former Twitter engineer Leslie Miley, have specifically declined a severance package in order to do so.

In an interview last week with Bloomberg, Miley recounted how, in his role as product safety and security manager, he raised fears with management as long ago as 2015 over the huge proliferation of dormant Twitter accounts based in Russia and Ukraine, but his concerns weren’t addressed. “They were more concerned with growth numbers than fake and compromised accounts,” he said. Those accounts have since been used to spread pro-Russian propaganda.

The voices of discontent share a common theme: that growth in numbers is paramount, and any collateral damage suffered as a consequence is of little interest. “Any improvement not based on a hard metric was not a respected use of time,” former Google software engineer Katy Levinson told Business Insider late last year.

“Usability? Nobody cared. If you couldn’t measure it, nobody was interested in it.”

When former Facebook employees came forward last month for an article in Vanity Fair entitled “What Have I Done”, it was a similar story.

“The half-trillion dollar public company,” concluded the writer, Nick Bilton, “is first and foremost a machine that turns users into revenue. Facebook’s prime directive is to maximise the number of people advertising on its platform.”

It’s the methods by which services have bred a dependence, or even an addiction, that seem to provoke the most guilt in their creators. We’re talking about the pull-to-refresh downward swipe that updates our feeds, the hectoring pop-ups and reminders, the bright red notification icons and the auto-playing videos.

Justin Rosenstein, the technical lead for the implementation of the Facebook “Like” button, was candid about his feelings in an interview with The Guardian last month. “It is very common for humans to develop things with the best of intentions,” he said, “and for them to have unintended, negative consequences.”

Justin Rosenstein. Bloomberg

Those consequences were restated by Sean Parker as he considered the motivations behind Facebook’s rapid development. “How do we consume as much of your time and conscious attention as possible?” he said. “It’s a social-validation feedback loop… you’re exploiting a vulnerability in human psychology. The inventors understood this consciously. And we did it anyway.”

Sara Wachter-Boettcher, whose new book, Technically Wrong, details many of the ways that technology has failed us, is fascinated by the emergence of people finally reckoning with the things that they helped to create. “[People working in tech] have been given too much of a free pass for unintended consequences,” she says. “There’s a lot of stuff that could have been anticipated had they paid more attention to people who were telling them, years ago, but they’ve been very insular and comfortable with not thinking about people who are unlike themselves.”

Occasionally we glimpse the nature of that Silicon Valley bubble. In 2016, Brandon Carpenter, a senior software engineer at Twitter, came under fire on the platform over changes that were being implemented. His response (“Wow people on Twitter are mean”) was baffling to those who had spent years complaining about the way Twitter’s policies permitted and sustained abuse. For many, it summed up the disconnect between the people making the decisions and the people using the products.

Those decisions are seemingly taken, by and large, without wide consultation; Facebook’s founder, Mark Zuckerberg, has frequently been criticised for surrounding himself with yes-men, while other companies – notably Snapchat – are said to be deliberately structured to conceal long-term goals from their employees.

“I want you to imagine walking into a room,” says former Google employee Tristan Harris at the opening of a TED talk he gave back in April.

“A control room with a bunch of people, a hundred people, hunched over a desk with little dials, and that that control room will shape the thoughts and feelings of a billion people. This might sound like science fiction, but this actually exists right now, today.”


There’s no doubt that we see benefits from some of the promises and guarantees made by the likes of Google, Facebook and Twitter. But as behavioural designer Nir Eyal writes in his book Hooked: How to Build Habit-Forming Products, we now have infinite distractions competing for our attention. “Companies are learning to master novel tactics to stay relevant in users’ minds,” he says, “[and] their economic value is a function of the strength of habits they create.” Are those habits healthy?

The people who instilled those habits in more than a billion people have their doubts. For his part, Eyal has installed a timer to cut off his access to the internet for a certain number of hours a day. Go figure.