Extremism, Terrorism & Bigotry

Jonathan Greenblatt's Remarks to the World Federation of Advertisers

  • April 20, 2021

Good morning and thank you so much for your kind introduction. It is an honor to have an opportunity to speak with all of you today and I look forward to answering your questions after my prepared remarks.

At the ADL, we have been tracking extremist movements and the spread of hatred literally for generations. We have seen all kinds of hateful groups over the decades. But we have never seen a moment like this, not just when tensions seem to be at an all-time high, but at a time when we have seen white supremacist ideologies, conspiracy theories, racism, antisemitism, disinformation, and extremism moving from the margins into the mainstream.

In a moment I’m going to talk to you directly about the important role that I think the world’s leading corporations, businesses and advertisers can play in helping to push back against this rising tide of hate that threatens all of society. But before I do that, I would like to step back and look at some of the broader trends we are seeing across society.

This has been a year unlike any other. Aside from the many challenges we’ve faced as a result of the global pandemic, we have witnessed another sharp escalation in hatred and extremism, rising hate crimes, and increasing inequities across American society.

We also experienced one of the most divisive elections in our nation’s history, complicated not only by the COVID pandemic but also by the spread of rampant disinformation on social media, as well as false allegations of voter fraud, which culminated with the violent and deadly insurrection at our nation’s Capitol on Jan. 6, the most predictable terror attack in American history because it literally was planned and promoted out in the open on social media.

The horrifying murders of George Floyd and Breonna Taylor, and the outrage renewed in recent weeks by the senseless killings of Daunte Wright and Adam Toledo, have spurred a long overdue reckoning on issues of racial justice. Literally, the Black Lives Matter movement has galvanized scores of millions of people across America. It has been described as the largest social movement in US history.

We’ve seen Asian Americans blamed, scapegoated, assaulted and murdered in record numbers – thanks in part to the anti-Chinese and anti-Asian rhetoric spouted by some of the highest leaders in the land and parroted by pundits on cable news and, again, spread across social media.

And, then there is antisemitism.

It should be no surprise that anti-Jewish invective has metastasized and multiplied in this new media environment. Scholars call it the Oldest Hatred, but it might be better described as the Deadliest Virus because it seemingly has no cure and it has infected so many societies throughout history. And today it spreads from Facebook to Twitter, YouTube to Zoom, TikTok to Clubhouse. We have seen a torrent of Holocaust Denialism, a tsunami of depraved lies about the Jewish people and the Jewish state. The antisemitism comes from all sides of the spectrum, from all corners of the world.

Indeed, we’ve seen some of the largest and most successful social media companies on the planet continue to fall far short of their responsibilities as their lax management created loopholes and their algorithms amplified the rampant spread of conspiracy theories, disinformation and hate.

These are some of the larger societal trends my team has been focused on over the past year.

For more than a century, ADL has worked to stop the defamation of the Jewish people and to secure justice and fair treatment to all. As part of that we’ve assembled the best team in the world, individuals with unmatched expertise who relentlessly investigate extremist threats, an evil that has intensified and expanded in recent years with devastating consequences.

And during my tenure we have broadened the scope of our work to also look at how those threats are spreading in the digital world and how extremists are using social media and other online platforms to spread propaganda, disinformation and hate. Our Center for Technology and Society in Silicon Valley is working directly with the companies, developing new tools and sharing best practices to identify and expose those who use their platforms for harm -- and hold the platforms accountable when they fail to place people over profits.

Many of you are familiar with one of our biggest initiatives on that front: The Stop Hate for Profit campaign. Last July, when the George Floyd protests were reaching their zenith, many of your brands joined with our coalition of civil rights and advocacy groups in a coordinated effort to convince Facebook that it needed to take action. Our message was clear: it was no longer acceptable to host hatred and extremism on their platform; no longer acceptable to fail to address the rampant spread of disinformation and conspiracy theories; and, no longer acceptable to host extremist content subsidized by your advertising dollars.

And thanks to you, and the more than 1,200 businesses and nonprofits and countless consumers who helped us stage a one-month “ad pause,” we were able to force Facebook’s leadership to finally acknowledge the full extent of the problem and to take a series of steps aimed at making real reforms. As a result of our campaign, some of the most iconic brands in the world pulled millions in ad dollars from the platform.

The success of this campaign was unmistakable. We forced an unprecedented public examination of Facebook’s deep harms to marginalized communities and the health of our democracy. For the first time in the history of the platform, it introduced a series of new policies – taking down armed militias, removing racist content, classifying Holocaust Denialism as antisemitism, and far more. It sent a clear and direct message to Facebook’s leadership: This is incompatible with our common values, and society is no longer willing to tolerate your inaction and profiting from hate.

Here’s a powerful video we issued during the campaign to educate the public about our concerns.

Stop Hate for Profit did more than just send a clear message to Mark Zuckerberg and other Facebook executives. Other social media companies heard our message and started to step up as well. For example, during the Stop Hate for Profit campaign Twitter took strong action against promoters of dangerous conspiracy theories, and Reddit took action by updating its content policies to better address hate and removed over 2,000 hate-filled subreddits. YouTube likewise took overdue action against white supremacists who were using their platform.

Does this progress mean our work is done?  Not at all. There are still a number of changes we need to see on Facebook and other platforms. And we still need you, the major advertisers, to keep pressure on those platforms to ensure that they are continuing on the path to reform.

For example, we’ve called on Facebook to adopt common sense changes to their policies to stop radicalization and hate from spreading. This includes more transparency on how they are dealing with groups focused on white supremacy, violent conspiracies, vaccine misinformation and climate denialism.

We’ve also called on Facebook to change its recommendation algorithms, so it will stop recommending or amplifying groups or content associated with hate, misinformation or conspiracies.

We’ve called on Facebook to provide an audit of those advertisers whose ads were shown next to extremist content that was later removed for terms of service violations. And we believe that any of you whose ads appeared alongside hate deserve a full refund.

None of you want your brands to be associated with extremist groups or conspiracy theories. That’s why the marketing community has a vested interest in holding social media platforms to account. As CMOs, you have the ability to ask the right questions of the platforms you choose to advertise on, and to press their leadership on whether they are adequately following through on the public’s demands to rid their networks of hate and extremism.

I admire and respect the important work of WFA’s Global Alliance for Responsible Media, and particularly the work you have done in providing a shared framework for social media companies to step up their game. This is an admirable and important effort. I hope it continues. For there is still much work to be done.

Earlier this year, we released our annual Online Hate and Harassment survey and found 41 percent of Americans had been exposed to online harassment. The poll found that American adults who were harassed experienced the most harassment far and away on Facebook, followed by Twitter, Instagram and YouTube.

Before I pause for Q&A, let me share with you another vivid example of how hate speech and white supremacy are moving from the margins into the mainstream.

Just two weeks ago on his Fox News program, Tucker Carlson openly endorsed the white supremacist “Great Replacement Theory.” If you haven’t heard of it, this is a virulently racist and antisemitic conspiracy theory that holds that a secret group of Jewish people is plotting to flood the United States with non-whites and immigrants in order to commit “white genocide.”

This is not a new idea. In fact, it is a longstanding, toxic staple of white supremacist ideology. And yet, in front of the largest primetime audience in America, Mr. Carlson asserted this was happening because the Democratic Party intentionally was engineering a huge influx of immigration to bolster its voter base.

Now, I don’t need to tell you how ugly this rhetoric is – especially when it is repeated by someone with millions of viewers across the country.  But it's also dangerous, literally a prompt to violence.

Replacement Theory led to the chants of “Jews will not replace us” in Charlottesville in 2017, the night before a white supremacist mowed down and murdered Heather Heyer. Extremists invoked Replacement Theory when they massacred 11 Jews in a synagogue in Pittsburgh in 2018; 51 Muslim worshippers at two mosques in Christchurch in 2019; and 22 Latinos in the parking lot of a Walmart in El Paso that same year. So, make no mistake: This is a deeply dangerous message.

Now, immediately after this segment aired, we reached out to the network with a simple appeal: It’s time for Carlson to go. The response we received back from Fox Chairman Lachlan Murdoch could be summarized as follows: We respect you and the ADL, but Tucker didn’t really mean it. He was talking about voting rights, not embracing a white supremacist trope.

But I can assure you that Carlson’s message was not lost on those white supremacists and bigots who are now applauding him for embracing their talking points.

Why do I share this story?

Again, it is an example of how hatred is being mainstreamed in America in 2021. And it is another example where you, advertisers, have a potential role to play.

Through your advertising muscle, you have a distinct and powerful voice in what information networks like Fox and Facebook will and will not allow on their platforms.

You can hold them accountable like few other actors in society because your dollars are the fuel that enables their business model.

And don’t believe that their size insulates them. It’s not just about the revenue pressure you can apply, it’s the reputational pressure that you can bring to bear.

And so we need your help in sending a message when our media networks fail us and allow hatred to spread.

And when I say we need your help, I mean it.

Look around you. Politics are polarized. Congress is gridlocked. Courts move at an excruciating pace. Special interests seem to dominate think tanks and public debates. And, in this moment, CEOs have stepped forward and filled the leadership vacuum in important ways.

Take the recent pushback against laws in Georgia that would restrict the right to vote. Or the leadership on climate shown by so many companies. Now, I’m not taking regulators off the hook - ultimately public problems demand policy solutions and systemic change.  

But you indisputably have a kind of power – and, if you will forgive the cliché, with great power comes great responsibility.

And, as you think about your responsibility, I would submit that the fight against hate is the fight of our time – the future of democracy, the nature of society – hangs in the balance.

As an industry, you are uniquely positioned to push these networks – whether mass media or social media – to do their utmost to ensure that hatred and conspiracy theories are not amplified. Yes, the First Amendment allows for people to espouse fringe theories – but we need to keep them on the fringe.

So, I’ll close with a challenge to you: Commit to this fight. Choose a side.

Choose to pause or even pull your ads, not just from problematic programs, but altogether from networks that don’t respect all people or that repeat baseless conspiracies that endanger all of us.

Choose to use your conversations with the leadership of these companies to push them to do more -- and use your public platforms to demand the same.

Choose one of the unfulfilled Stop Hate for Profit demands at www.stophateforprofit.org and demand that Facebook show real change.

Let me close by noting that I appreciate that your CEOs and many of you have made public statements and internal commitments to diversity, equity and inclusion. Making good on those commitments means doing it all the time – not just in whom you hire or whom you promote, but in what values you embed across your value chains.

And I know that you feel accountable to your shareholders...

… to your customers

… to your employees

… to your children.

And so – right here, right now – this is your opportunity.

Through your help and leadership, we can not only Stop Hate for Profit. We can commit – together – to “Fight Hate for Good.”

Thank you.