Facebook, as addictive as nicotine

Tim Kendall, Facebook's former Director of Monetization, describes the subtle mechanisms used to make us addicted to the social network. Remarks delivered during a hearing before the House Committee on Energy and Commerce.

Thank you, Chairpersons Pallone and Schakowsky, for inviting me to speak today.

When I started working in technology, my hope was to build products that brought people together in new and productive ways. I wanted to improve the world we all lived in.

Instead, the social media services that I and others have built over the past 15 years have served to tear people apart with alarming speed and intensity. At the very least, we have eroded our collective understanding—at worst, I fear we are pushing ourselves to the brink of a civil war. I feel ashamed by this outcome. And I am deeply concerned. And to that end, I am compelled to talk to you about what we can do to limit further damage—and maybe even undo some of it.

My path in technology started at Facebook where I was the first Director of Monetization. I thought my job was to figure out the business model for the company, and presumably one that sought to balance the needs of its stakeholders — its advertisers, its users and its employees. Instead, we sought to mine as much attention as humanly possible and turn it into historically unprecedented profits.

To do this, we didn’t simply create something useful and fun. We took a page from Big Tobacco’s playbook, working to make our offering addictive at the outset.

Tobacco companies initially just sought to make nicotine more potent. But eventually that wasn’t enough to grow the business as fast as they wanted. And so they added sugar and menthol to cigarettes so you could hold the smoke in your lungs for longer periods. At Facebook, we added status updates, photo tagging, and likes, which made status and reputation primary and laid the groundwork for a teenage mental health crisis.

Allowing misinformation, conspiracy theories, and fake news to flourish was like Big Tobacco’s bronchodilators, which allowed the cigarette smoke to cover more surface area of the lungs. But that incendiary content alone wasn’t enough. To continue to grow the user base and in particular, the amount of time and attention users would surrender to Facebook, they needed more.

Tobacco companies added ammonia to cigarettes to increase the speed with which nicotine traveled to the brain. Extreme, incendiary content—think shocking images, graphic videos, and headlines that incite outrage—sowed tribalism and division. The result has been unprecedented engagement, and profits. Facebook’s ability to deliver this incendiary content to the right person, at the right time, in the exact right way… that is their ammonia.

Social media preys on the most primal parts of your brain. The algorithm maximizes your attention by hitting you repeatedly with content that triggers your strongest emotions— it aims to provoke, shock, and enrage.

When you see something you agree with, you feel compelled to defend it. When you see something you don’t agree with, you feel compelled to attack it. People on the other side of the issue have the same impulses. The cycle continues with the algorithm in the middle happily dealing arms to both sides in an endless war of words and misinformation. All the while, the technology is getting smarter and better at provoking a response from you.

These algorithms have brought out the worst in us. They’ve literally rewired our brains so that we’re detached from reality and immersed in tribalism.

This is not by accident. It’s an algorithmically optimized playbook to maximize user attention — and profits.

And there are limited checks and balances.

In 2016, internal analysis at Facebook found 64% of all extremist group joins were due to their own recommendation tools. Yet repeated attempts to counteract this problem were ignored or shut down.

As you know, Section 230 of the 1996 Communications Decency Act shields Internet companies from liability for third-party content. I can think of few industries that enjoy such broad immunity and none that have profited so greatly from this lack of basic regulation. I’m not a lawyer or legislator but I can’t imagine where we’d be if we hadn’t held tobacco companies accountable for making so many people sick. And yet, that is what we have allowed these companies to do. It has to change.

When it comes to misinformation, these companies hide behind the First Amendment and say they stand for free speech. At the same time, their algorithms continually choose whose voice is actually heard. In truth, it is not free speech they revere. Instead, Facebook and their cohorts worship at the altar of engagement and cast all other concerns aside, raising the voices of division, anger, hate and misinformation to drown out the voices of truth, justice, morality, and peace. When it comes to spreading extremist views and propaganda, these services have content moderation tools—but the tools can never seem to keep up, despite these companies having billions of dollars at their disposal.

Without accountability for the platforms, we can only expect the problem to continue and get worse.

As these platforms continue to addict us and make us more vulnerable, we are going to get more depressed and anxious. Our judgment will continue to weaken and our susceptibility to radicalization will only increase. A Princeton study found that watching 2 minutes of a conspiracy theory video can make people less willing to help others and can reduce their belief in established scientific facts.

On a personal level, I’m aware that I’ve benefited from these addictive business models and this deepens my sense of responsibility for where we are and my sense of obligation to help us improve things.

I am not sure I could have ever known at the time where the work that I contributed to would lead. But for my role, I do bear some responsibility. And I regret that. One thing I can do now, however, is what I am doing: dedicate all my time and resources to undo as much damage as I can.

To be clear, social media is not the root cause of every problem we’re facing. But I believe it may be the most powerful accelerant in history. These services are making us sick.

These services are dividing us. It’s time we take account of the damage. It’s time we put in place the necessary measures to protect ourselves—and our country.
