New York, Oct 26 (IANS) Whistleblower Frances Haugen began her 158-minute testimony before British parliamentarians on Monday by stating that Facebook's transparency report inflates, multiple times over, the share of hate speech the platform actually takes down.
The committee chair began by drawing out Haugen, a former Facebook product manager and Harvard Business School alumna, on the behemoth's transparency claim that 97 per cent of the hate speech it can find is taken down.
Haugen simplified the math, pointing out that "what they (Facebook) can find" is the trick phrase here.
Hate speech actually taken down, as a percentage of the total hate speech on the platform, is 3-5 per cent, she said.
"The (97 per cent) stuff they (Facebook) talk about is (hate) stuff that robots got divided by the stuff that robots got plus the stuff that humans reported! That's not the number we want. The fraction we expect to hear is total hate speech caught divided by total hate speech."
Though most of Haugen's testimony repeated what she had said before US lawmakers earlier, British lawmakers are all ears because they are toying with the idea of an online regulator that can cut through the chaff and ask exactly these kinds of 'right' questions.
A global barrage of scrutiny, based on some 10,000 pieces of internal documentation, is being called The Facebook Papers. A collective of 17 competing news organisations has been reporting in sync.
This comes on the day Facebook's Q3 earnings are expected after market close.
"Facebook is expected to report earnings on 10/25/2021 after market close. The report will be for the fiscal Quarter ending Sep 2021. According to Zacks Investment Research, based on 13 analysts' forecasts, the consensus EPS forecast for the quarter is $3.2. The reported EPS for the same quarter last year was $2.71," a Nasdaq notice said.
As analysts wait for the investor call, the trickiest bits may not be about the financials but about the whistleblower, who warned that Facebook, despite having thousands of decent folks on its staff, will fuel more episodes of violent unrest around the world because of the way its algorithms are designed to promote divisive content.
The former product manager on the civic misinformation team claimed that the social network saw safety as a cost centre, lionised a start-up culture in which cutting corners was a good thing, and was "unquestionably" making hate worse.
"The events we're seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters because engagement-based ranking does two things: one, it prioritizes and amplifies divisive and polarizing extreme content and two it concentrates it," Haugen said.
Haugen argued that the algorithms pushed users towards the extremes. "So someone centre left, they'll be pushed to radical left, someone centre right will be pushed to radical right."
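For readers unfamiliar with the term, "engagement-based ranking" simply means ordering a feed by how much interaction each post is predicted to generate. The toy sketch below is purely illustrative, with made-up posts and scores rather than anything drawn from Facebook's actual systems or the leaked documents; it only shows why such an ordering tends to float the most provocative content to the top.

```python
# Purely illustrative sketch of engagement-based ranking.
# Posts, topics and scores are hypothetical, not from Facebook's systems.

posts = [
    {"id": 1, "topic": "neutral news",      "predicted_engagement": 0.12},
    {"id": 2, "topic": "divisive politics", "predicted_engagement": 0.48},
    {"id": 3, "topic": "family photos",     "predicted_engagement": 0.09},
    {"id": 4, "topic": "outrage bait",      "predicted_engagement": 0.55},
]

# Sorting purely by predicted engagement (clicks, comments, reshares)
# puts the most provocative items first -- the amplification effect
# Haugen attributes to this style of ranking.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in feed:
    print(post["id"], post["topic"], post["predicted_engagement"])
```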
Likewise, she said, children are being pushed into the same pit via accounts that authority figures such as parents cannot detect.
Facebook CEO Mark Zuckerberg contested such accusations earlier this month. "The argument that we deliberately push content that makes people angry for profit is deeply illogical," Zuckerberg - ranked 7th on the Forbes real-time list of billionaires at $116 billion as of Monday - had said.
The documents cited indicate that Facebook knew it had not hired enough workers with both the language skills and the knowledge of local events needed to identify objectionable posts from users in a number of developing countries.
The Facebook Papers project is a unique collaboration among 17 American news organizations, including The Associated Press and Reuters.
A separate consortium of European news outlets had access to the same set of documents, and members of both groups began publishing content at 7 a.m. EDT on Monday, Oct. 25.
That date and time were set by the partner news organizations to give everyone in the consortium an opportunity to fully analyze the documents and report out relevant details, and to give Facebook's public relations staff time to respond to questions raised by that reporting.