Facebook: Content Governance
WHEREAS: News of Cambridge Analytica’s misappropriation of millions of Facebook users’ data preceded a decline in Facebook’s stock market capitalization of over 100 billion dollars in March 2018. Another decline of more than 100 billion dollars in market value, a record-setting drop, came in July after Facebook’s quarterly earnings report reflected rising costs and slowing revenue growth.
These abrupt market reactions likely reflect investors’ deep concern over the Company’s inadequate approach to governing content appearing on its platforms. Shareholders are concerned that Facebook’s approach to content governance has proven ad hoc and ineffectual, and that it poses continued risk to shareholder value.
In September 2018 testimony, COO Sheryl Sandberg noted, “Trust is the cornerstone of our business.” Yet, trust appears seriously eroded. Pew Research found 44 percent of young Americans have deleted the Facebook app from their phones in the past year, and 74 percent of users have either deleted the app, taken a break from checking the platform, or adjusted privacy settings.
Despite Facebook’s recent efforts to increase disclosures and enhance internal compliance and enforcement strategies, abuse and misinformation campaigns continue, implicating issues such as democracy, human rights, and freedom of expression.
Facebook has been called repeatedly to testify before Congress. One Congressman noted, "Facebook can be a weapon for those, like Russia and Cambridge Analytica, that seek to harm us and hack our democracy." In August 2018, Facebook found 652 fake accounts spreading misinformation globally. Facebook’s former head of security said misinformation on Facebook shows “America’s adversaries believe that it is still both safe and effective to attack U.S. democracy using American technologies.”
The United Nations says social media played a “determining role” propagating hate speech in Myanmar, where violence against the Rohingya “bears the hallmarks of genocide.” Yet, Facebook “will not reveal exactly how many Burmese speakers are evaluating content.” In Germany, researchers found correlation between right-wing anti-refugee sentiment on Facebook and anti-refugee violence. In Libya, armed groups have used Facebook to find opponents and traffic weapons.
Facebook’s content governance challenges are complex. ProPublica reported inconsistent enforcement of hate speech, and that “racist or sexist language may survive scrutiny because it is not sufficiently derogatory or violent to meet Facebook’s definition of hate speech.” In August, Facebook censored legitimate users organizing against white supremacy.
BE IT RESOLVED: The Company publish a report (at reasonable cost, omitting proprietary or legally privileged information) evaluating its strategies and policies on content governance, including the extent to which they address human rights abuses and threats to democracy and freedom of expression, and the reputational, regulatory, and financial risks posed by content governance controversies.
SUPPORTING STATEMENT: Proponents recommend that, in the Company’s discretion, the report should consider the relevance of the Universal Declaration of Human Rights, the United Nations' Special Rapporteur reports on Freedom of Expression, and the Santa Clara Principles, which ask companies to disclose the impact of content policies according to:
Numbers (posts removed, accounts suspended)
Notices (of content removals, account suspensions)
Appeals (for users impacted by removals, suspensions)
Resolution Details
Company: Facebook, Inc.
Lead Filers: Arjuna Capital
Year: 2019
Filing Date: December 2018
Initiative(s): Truth in Media
Status: 5.70%