Durbin: Why is the U.S. Slow to Respond to the Power of Social Media Algorithms?

WASHINGTON – U.S. Senate Majority Whip Dick Durbin (D-IL), Chair of the Senate Judiciary Committee, today questioned Tristan Harris, Co-Founder and President of the Center for Humane Technology, during a Senate Judiciary Subcommittee on Privacy, Technology, and the Law hearing entitled, “Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds.” Durbin asked Harris why the United States has been slow to respond to the issue of highly targeted algorithms that can captivate and persuade users in every aspect of their lives.

“I’ve been reading and trying to understand why the European Union is taking such an apparently bold and innovative approach to this subject, and why we are so slow to respond?” Durbin asked.

Harris answered that the United States places a high value on free speech, which allows companies to push content virally to different people on a personalized basis in ways that can outrage them. Harris said, “I don’t think that we have had a framework, and to your earlier point, one of the quotes we reference often, from E.O. Wilson, is that the fundamental problem with humanity is that we have Paleolithic emotions, medieval institutions, and then accelerating god-like technology. The rate of acceleration of new kinds of threats and new kinds of issues is growing far faster than our capacity to respond to those threats.” As an example, Harris noted that hundreds of billions of messages are sent across Facebook and WhatsApp each day, but the companies perform only about 100 fact checks per day.

Durbin then asked Harris to explain how the companies are failing to rein in technology that manipulates behavior, even as they claim to be doing so.

“I’ve heard people from Facebook talk about making your Facebook experience more meaningful, and folks from YouTube and Twitter talking about healthy dialogue. But, the bottom line is…there is a factor here where our human behavior is being affected by what we are seeing, what we are reading, and what we are experiencing. And that seems to violate the basic premise of the EU regulation,” Durbin continued.

Without hesitation, Harris responded that “on the manipulation front, that would disqualify just about all of the three companies that are sitting in front of you, including TikTok.” He said the problem begins with the fact that the technologies used by social media companies are designed to be persuasive, and that even as the companies try to design “healthy” or “meaningful” experiences, they remain beholden to a business model based on the manipulation of behavior.

Video of Durbin’s questions in Committee is available here.

Audio of Durbin’s questions in Committee is available here.

Footage of Durbin’s questions in Committee is available here for TV Stations.

-30-