Tim Kendall, Facebook's first Director of Monetization, describes the insidious mechanisms used to make us addicted to social media. Remarks delivered during a hearing before the House Committee on Energy and Commerce.
Thank you, Chairpersons Pallone and Schakowsky, for inviting me to speak today.
When I started working in technology, my hope was to build products that brought people together in new and productive ways. I wanted to improve the world we all lived in.
Instead, the social media services that I and others have built over the past 15 years have served to tear people apart with alarming speed and intensity. At the very least, we have eroded our collective understanding—at worst, I fear we are pushing ourselves to the brink of a civil war. I feel ashamed by this outcome. And I am deeply concerned. And to that end, I am compelled to talk to you about what we can do to limit further damage—and maybe even undo some of it.
My path in technology started at Facebook, where I was the first Director of Monetization. I thought my job was to figure out the business model for the company, and presumably one that sought to balance the needs of its stakeholders—its advertisers, its users, and its employees. Instead, we sought to mine as much attention as humanly possible and turn it into historically unprecedented profits.
To do this, we didn’t simply create something useful and fun. We took a page from Big Tobacco’s playbook, working to make our offering addictive at the outset.
Tobacco companies initially just sought to make nicotine more potent. But eventually that wasn’t enough to grow the business as fast as they wanted. And so they added sugar and menthol to cigarettes so you could hold the smoke in your lungs for longer periods. At Facebook, we added status updates, photo tagging, and likes, which made status and reputation primary and laid the groundwork for a teenage mental health crisis.
Allowing misinformation, conspiracy theories, and fake news to flourish was like Big Tobacco’s bronchodilators, which allowed the cigarette smoke to cover more surface area of the lungs. But that incendiary content alone wasn’t enough. To continue to grow the user base and, in particular, the amount of time and attention users would surrender to Facebook, they needed more.
Tobacco companies added ammonia to cigarettes to increase the speed with which nicotine traveled to the brain. Extreme, incendiary content—think shocking images, graphic videos, and headlines that incite outrage—sowed tribalism and division. The result has been unprecedented engagement—and profits. Facebook’s ability to deliver this incendiary content to the right person, at the right time, in the exact right way… that is their ammonia.
Social media preys on the most primal parts of your brain. The algorithm maximizes your attention by hitting you repeatedly with content that triggers your strongest emotions— it aims to provoke, shock, and enrage.
When you see something you agree with, you feel compelled to defend it. When you see something you don’t agree with, you feel compelled to attack it. People on the other side of the issue have the same impulses. The cycle continues with the algorithm in the middle happily dealing arms to both sides in an endless war of words and misinformation. All the while, the technology is getting smarter and better at provoking a response from you.
These algorithms have brought out the worst in us. They’ve literally rewired our brains so that we’re detached from reality and immersed in tribalism.
This is not by accident. It’s an algorithmically optimized playbook to maximize user attention — and profits.
And there are limited checks and balances.
In 2016, internal analysis at Facebook found that 64% of all extremist group joins were due to Facebook’s own recommendation tools. Yet repeated attempts to counteract this problem were ignored or shut down.
As you know, Section 230 of the 1996 Communications Decency Act shields Internet companies from liability for third-party content. I can think of few industries that enjoy such broad immunity and none that have profited so greatly from this lack of basic regulation. I’m not a lawyer or legislator but I can’t imagine where we’d be if we hadn’t held tobacco companies accountable for making so many people sick. And yet, that is what we have allowed these companies to do. It has to change.
When it comes to misinformation, these companies hide behind the First Amendment and say they stand for free speech. At the same time, their algorithms continually choose whose voice is actually heard. In truth, it is not free speech they revere. Instead, Facebook and their cohorts worship at the altar of engagement and cast all other concerns aside, raising the voices of division, anger, hate and misinformation to drown out the voices of truth, justice, morality, and peace. When it comes to spreading extremist views and propaganda, these services have content moderation tools—but the tools can never seem to keep up, despite these companies having billions of dollars at their disposal.
Without accountability for the platforms, we can only expect the problem to continue and get worse.
As these platforms continue to addict us and make us more vulnerable, we are going to get more depressed and anxious. Our judgment will continue to weaken, and our susceptibility to radicalization will only increase. A Princeton study found that watching two minutes of a conspiracy theory video can make people less willing to help others and reduces their belief in established scientific facts.
On a personal level, I’m aware that I’ve benefited from these addictive business models and this deepens my sense of responsibility for where we are and my sense of obligation to help us improve things.
I am not sure I could have ever known at the time where the work that I contributed to would lead. But for my role, I do bear some responsibility. And I regret that. One thing I can do now, however, is what I am doing: dedicate all my time and resources to undo as much damage as I can.
To be clear, social media is not the root cause of every problem we’re facing. But I believe it may be the most powerful accelerant in history. These services are making us sick.
These services are dividing us. It’s time we take account of the damage. It’s time we put in place the necessary measures to protect ourselves—and our country.