On June 17th, we asked companies to act against hate and disinformation being spread by Facebook in our campaign Stop Hate for Profit. We asked advertisers to temporarily pause advertising on Facebook and Instagram in order to force Mark Zuckerberg to address the catastrophic effect that Facebook has had on our society.
Companies have responded in waves: Unilever, Verizon, Patagonia, Ben and Jerry’s, North Face, Eileen Fisher, Viber, and the list goes on. These are companies that do not want to associate their brands and their employees with what Facebook has created. And the list is growing every day, every hour, every minute.
Faced with this tidal wave, Mark Zuckerberg responded today with a small number of small changes. He stated that Facebook would apply its hate policy to ads as if it were some new revelation, while not addressing hate more broadly in groups and posts. Voter misinformation may be a bit harder to spread on the day of the election (but will still run rampant the rest of the time). And posts that call for violence will still be allowed if they come from someone “newsworthy,” though they will now be labeled. None of this will be vetted or verified - or make a dent in the problem on the largest social media platform on the planet.
We have been down this road before with Facebook. They have made apologies in the past. They have taken meager steps after each catastrophe where their platform played a part. But this has to end now.
If Facebook were serious, Mark would have finally announced the following obvious but long-rejected steps. These ten steps will not be enough to address all of Facebook’s problems, but they would be a start:
1. Establish and empower permanent civil rights infrastructure, including a C-suite-level executive with civil rights expertise, to evaluate products and policies for discrimination, bias, and hate. This person would make sure that the platform’s design and decisions consider the impact on all communities and the potential for radicalization and hate.
2. Submit to regular, third-party, independent audits of identity-based hate and misinformation, with summary results published on a publicly accessible website. We simply can no longer trust Facebook’s own claims about what they are or are not doing. A “transparency report” is only as good as its author is independent.
3. Provide an audit of, and refunds to, advertisers whose ads were shown next to content that was later removed for violating the terms of service. We have documented many examples of companies’ advertisements running alongside the horrible content that Facebook permits. That is not what most advertisers pay for, and they shouldn’t have to.
4. Find and remove public and private groups focused on white supremacy, militia, antisemitism, violent conspiracies, Holocaust denialism, vaccine misinformation, and climate denialism.
5. Adopt common-sense changes to their policies that will help stem radicalization and hate on the platform.
6. Stop recommending or otherwise amplifying groups or content from groups associated with hate, misinformation or conspiracies to users.
7. Create an internal mechanism to automatically flag hateful content in private groups for human review. Private groups are not small gatherings of friends - they can have hundreds of thousands of members, as many hateful groups do.
8. Ensure accuracy in political and voting matters by eliminating the politician exemption; removing misinformation related to voting; and prohibiting calls to violence by politicians in any format. Given the importance of political and voting matters for society, Facebook’s carving out an exception in this area is especially dangerous.
9. Create expert teams to review submissions of identity-based hate and harassment. Forty-two percent of Facebook’s daily users have experienced harassment on the platform, and much of this harassment is based on the individual’s identity. Facebook needs to ensure that its teams understand the different types of harassment faced by different groups in order to adjudicate claims.
10. Enable individuals facing severe hate and harassment to connect with a live Facebook employee. In no other sector does a company lack a way for victims of its product to seek help.
The above are not sufficient, but they are a start. They are all things Mark could have committed to today. Facebook is a company of incredible resources. We hope that they finally understand that society wants them to put more of those resources into doing the hard work of transforming the potential of the largest communication platform in human history into a force for good.