SEC complaint: Facebook was aware that the platform was being used to promote human trafficking

For the first time, 60 Minutes is publishing whistleblower complaints filed with the Securities and Exchange Commission against Facebook by former employee Frances Haugen. The filings, made by Haugen’s attorneys, state that “our anonymous client” has submitted original evidence showing that Facebook, Inc. (NASDAQ: FB) violated U.S. securities laws through statements to investors and prospective investors, statements to Congress, SEC filings, and testimony.

Haugen’s attorneys have filed at least eight whistleblower complaints with the SEC based on tens of thousands of internal Facebook documents secretly copied by Haugen before she left the social media company in May. 60 Minutes obtained the SEC letters from a Congressional source.

Haugen revealed her identity on Sunday in an interview with 60 Minutes correspondent Scott Pelley. It was her first recorded interview.

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen told Pelley. “And Facebook chose, over and again, to maximize for its own interest, such as making more money.”

Among the allegations in the SEC filings are claims that Facebook and Instagram were aware in 2019 that the platforms were being used to “promote human trafficking and domestic servitude.” The filings also allege Facebook “failed to deploy internally-recommended or lasting counter-measures” to combat misinformation and violent extremism related to the 2020 election and January 6 insurrection.

Following the 60 Minutes report on Sunday, Facebook Vice President of Integrity Guy Rosen said on Twitter, “We have the most comprehensive and transparent effort to fight hate speech of any major tech company.”

Facebook declined an on-camera interview with 60 Minutes before the report ran. The company instead issued a written statement.

Haugen’s lawyer John Tye told 60 Minutes that “as a publicly traded company, Facebook is required to not lie to its investors or even withhold material information.”

Tye said his client is afforded legal whistleblower protection from lawsuits under the Dodd-Frank Act, which became federal law in 2010.

Haugen, a 37-year-old data scientist with a degree in computer engineering and a master’s degree from Harvard Business School, told Pelley that Facebook “picks metrics that are in its own benefit” when it comes to publishing data about hateful content and misinformation.

“The prevalence of hate speech on Facebook is now 0.05%, and is down by about half over the last three quarters,” Facebook’s Rosen tweeted Sunday night. “We can credit a large portion of the decline in prevalence over the last three quarters to our efforts.”

Haugen’s whistleblower complaints, which you can read in full below, make allegations against the $1 trillion social media company and cite some of the internal Facebook documents Haugen copied and provided to federal law enforcement.

“The SEC filings lay out the scope of the internal research that Haugen brought forward,” said 60 Minutes producer Maria Gavrilovic. “It helped 60 Minutes understand the severity of the allegations brought by the whistleblower.”

60 Minutes contacted the SEC regarding Haugen’s allegations and was told it “does not comment on the existence or nonexistence of a possible investigation.”

—–

Facebook’s role in the 2020 election and January 6 insurrection

A whistleblower complaint filed on behalf of former Facebook employee Frances Haugen cites internal documents that reference what she claims was the company’s role in stoking political division and polarization.

The complaint, titled “Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection,” highlights internal Facebook experiments that she says found the company’s algorithm “can veer people interested in conservative topics into radical or polarizing ideas and groups/pages.”

The filing quotes an internal Facebook study in which new test accounts, created by Facebook, followed “verified/high quality conservative pages,” including the official pages of Fox News, Donald Trump, and Melania Trump. Within one day, the platform’s recommendations for those accounts began to devolve toward polarizing content; according to the same study, the page recommendations began to include conspiracy content within two days.

In a statement to 60 Minutes, Facebook said:

“We banned hundreds of militarized social movements, took down tens of thousands of QAnon pages, groups and accounts from our apps, and removed the original #StopTheSteal Group. This is in addition to our removal and repeated disruption of various hate groups, including the Proud Boys, which we banned in 2018. The responsibility lies with the people who violated the law and those who incited them. Facebook has made extraordinary efforts to remove harmful content, and will continue to do so. We also worked aggressively with law enforcement in the days and weeks that followed January 6, with the aim of making sure that prosecutors have evidence linking those responsible for January 6 to the crimes.”

Facebook’s removal of hate speech

A whistleblower complaint filed on behalf of former Facebook employee Frances Haugen claims the social media company doesn’t take sufficient action against hateful content posted to its platform. The filing refers to an internal Facebook study which estimated that the company takes action against only about 2% of the hate speech on the platform: “Recent estimates suggest that unless there is a major change in strategy, it will be very difficult to improve this beyond 10-20% in the short-medium term.”

Another internal Facebook document cited in the whistleblower filing said, “We’re deleting less than 5% of all of the hate speech posted to Facebook. It is an optimistic estimate.”

In a statement issued to 60 Minutes on Friday, Lena Pietsch, Facebook’s director of policy communications, said, “We’ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. If research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them long ago. We have a strong track record of using both our internal research and outside research, as well as close collaboration with experts and organizations, to make improvements to our apps.”

Teens and mental health

One of the whistleblower complaints says that Facebook CEO Mark Zuckerberg misled members of Congress in March when he testified about Facebook and Instagram’s effect on the health of young girls. Responding to a question, Zuckerberg said he did not believe the platform harms children.

The SEC filing cites internal Facebook research that found:

13.5% of teen girls on Instagram say the platform makes thoughts of “Suicide and Self Injury” worse

17% of teen girl Instagram users say the platform makes “Eating Issues” (e.g. anorexia and bulimia) worse

“We make body image issues worse for 1 in 3 teen girls.”

In a statement issued to 60 Minutes, a spokesperson for Instagram said, “Contrary to [the] characterization, Instagram’s research shows that on 11 of 12 well-being issues, teenage girls who said they struggled with those difficult issues also said that Instagram made them better rather than worse.”

Human Trafficking

An October 2019 internal Facebook document quoted in Haugen’s whistleblower complaint cited the company’s knowledge that Facebook, Instagram, and WhatsApp were being used for what it called “domestic servitude.”

The filing cited an internal Facebook document that said:

“Our investigative findings demonstrate that … our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks … The traffickers, recruiters and facilitators from these ‘agencies’ used FB profiles, IG profiles, Pages, Messenger and WhatsApp …”

In an exchange last week with Tennessee Senator Marsha Blackburn (R) on allegations of human trafficking on the social media platform, Facebook’s global head of safety Antigone Davis stated “…in fact, we have policies against sex trafficking on our platform.”

Facebook’s algorithms and the promotion of misinformation and hate speech

Another whistleblower complaint filed on behalf of Haugen alleges that Facebook misled investors and the public when it said it prioritizes “meaningful social interactions” (MSI) through its algorithms. According to the complaint, those algorithms in fact promote hate speech and polarizing misinformation.

The filing notes that in 2018, Mark Zuckerberg announced a shift from prioritizing time spent on Facebook to focusing on MSI, emphasizing content from friends and family in users’ newsfeeds. Internal Facebook research cited in the complaint found that prioritizing MSI instead encouraged misinformation and divisive, low-quality content. According to one report cited in the filing, the more negative comments a piece of content elicited, the more likely the link was to receive more traffic.

In a statement to 60 Minutes, Pietsch said, “Research also shows that polarization has been growing in the United States for decades, long before platforms like Facebook even existed, and that it is decreasing in other countries where Internet and Facebook use has increased. While we have our part to play, and will make further changes to improve people’s experiences, blaming Facebook for these problems ignores the root causes and research.”

Facebook’s “XCheck” program and the whitelisting of VIPs

A whistleblower complaint filed on behalf of Haugen alleges that Facebook misled investors and the public about equal enforcement of its terms since high-profile users are “whitelisted” under its “XCheck” program.

An internal Facebook report cited by the complaint says that in 2020, “XCheck” (pronounced cross-check) entities were shielded from the majority of integrity actions on the site. The report stated that this means a select few people in the community are effectively exempt from its standards and policies: “Unlike the rest of our community, these people can violate our standards without any consequences… since we currently review less than 10% of XChecked content.”

In a previous statement to the Wall Street Journal, Facebook spokesperson Andy Stone stated that this system “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.”

Global division and ethnic violence

Another whistleblower complaint filed on behalf of Haugen claims Facebook misled investors and the public about bringing “the world closer together.” According to the filing, internal Facebook documents show that the company’s language capabilities were inadequate, contributing to ethnic violence and misinformation.

According to the complaint, the documents show that Facebook’s foreign-language moderation capabilities are inadequate; one study cited in the filing found the rate at which hate speech was detected and acted on in Afghanistan to be “alarmingly low.”

The complaint goes on to say that Facebook’s reliance on written translations does not account for regions where significant numbers of users cannot read, and that its safety systems are not adapted to different dialects.

Internal records cited in the complaint show how these linguistic shortcomings can allow violent and incendiary content to go unchecked: “Anti-Muslim narratives targeted pro-Hindu populations with [violent and incendiary] intent… There were a number of dehumanizing posts comparing Muslims to ‘pigs’ and ‘dogs’ and misinformation claiming the Quran calls for men to rape their female family members. This content has not been flagged or actioned due to a lack of Bengali and Hindi classifiers.”

Facebook responded in a statement to 60 Minutes, stating, “We’ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. If research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them long ago.”

In a 2018 blog post titled “An Independent Assessment of the Human Rights Impact of Facebook in Myanmar,” Alex Warofka, a Facebook product policy manager, wrote, “…we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We all agree that more can be done.”

Facebook’s reach

A whistleblower complaint filed on behalf of Haugen claims Facebook misled investors and advertisers about shrinking user bases in important demographics, declining content production, and the true number of recipients of “reach and frequency” advertising. The complaint refers to internal records showing that teens and young adults in developed countries are using Facebook less. It also claims that for years Facebook misrepresented to advertisers the true number of users it has by failing to account for single users holding multiple accounts. An internal report cited in the complaint estimates that if single users with multiple accounts were properly handled, reach and frequency campaigns would see an “audience size reduction”: “…18% of current R&F revenue using broad targeting… will see a decrease in audience size [greater] than 10%. Broad targeting is the most common method of R&F campaign. The audience size shrinkage will be in the 5-8% area.”

(c) 2021 CBS Interactive Inc. All Rights Reserved.
