Facebook’s Algorithm Comes Under Scrutiny

Whistle-blower Frances Haugen’s testimony about an algorithm’s wide-ranging implications for public safety drives home the need for Facebook to open its processes to researchers.

October 8, 2021
Facebook’s CEO, Mark Zuckerberg. Recently released internal documents describe serious social effects resulting from changes the platform made to its News Feed algorithm in 2018. (Reuters)

Scott Simms, who was the Liberal member of Parliament for a riding in central Newfoundland until he was defeated in last month’s Canadian election, believes that Facebook’s algorithm adjustment of 2018 increased the amount of anger he faced when knocking on doors.

Until 2017, the area he represented was served by the Grand Falls Advertiser, a community newspaper. The Advertiser, which began publication in 1936, closed after losing its advertising revenue to social media. Now central Newfoundland is a news desert, which has increased the importance of Facebook in the lives of the people there.

Simms, an affable former weatherman, quick to make a joke at his own expense, was popular on Parliament Hill across party lines. Over his years in politics, he says, he has observed people getting angrier.

As a case in point, Simms says, online misinformation has led many voters to believe “horrible, outlandish” stories about Justin Trudeau, his party’s leader, and those voters are difficult to deal with.

“It becomes personal because they froth at the mouth at this individual,” Simms said in a recent interview, as he was still coming to grips with his election loss. “There’s a whole litany of reasons why they don’t like [politicians], whether they’re true or not. You become absolutely villainized as human beings.”

Simms believes that Facebook’s 2018 algorithm adjustment, made to increase engagement, has left people intolerant of opposing views because, unlike when they were reading the Advertiser, they no longer encounter them.

“The feed that you get gives you every argument on one side, and never even suggests that this could be wrong. It’s pure acceptance based on the feed that you get.”

Facebook whistle-blower Frances Haugen, who testified before the US Senate on October 5, has released internal documents that show Facebook was facing a decline in engagement in 2017, which threatened its revenue stream and market dominance.

To avoid that, executives, in a decision ultimately made by Mark Zuckerberg, changed the algorithm that controls the “News Feed” of its two billion customers, boosting content that attracts user actions such as reshares and comments over content from close friends and relatives.

Facebook referred to the metric behind this boost as “downstream MSI,” or “meaningful social interactions.”
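
The leaked documents do not spell out the exact formula, and Facebook’s actual ranking model is not public. Purely as an illustration of the kind of weighting described, the sketch below ranks posts by predicted comments and reshares rather than by closeness to the author; every weight and field name here is invented for the example.

```python
# Hypothetical illustration only: not Facebook's real ranking code.
# All weights and field names are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Post:
    author_is_close_friend: bool
    predicted_likes: float      # model's estimate of likes the post will attract
    predicted_comments: float   # estimate of comments it will provoke
    predicted_reshares: float   # estimate of reshares, including "downstream" ones

def engagement_score(post: Post) -> float:
    """Score a post by predicted engagement, favouring comment- and
    reshare-provoking content over posts from close ties."""
    score = (
        1.0 * post.predicted_likes
        + 15.0 * post.predicted_comments   # hypothetical: comments weighted heavily
        + 30.0 * post.predicted_reshares   # hypothetical: reshares weighted heaviest
    )
    if post.author_is_close_friend:
        score += 5.0  # small bonus for close ties, easily swamped by engagement terms
    return score

# Posts with the highest scores appear first in a user's feed.
posts = [
    Post(author_is_close_friend=True, predicted_likes=8,
         predicted_comments=1, predicted_reshares=0),
    Post(author_is_close_friend=False, predicted_likes=3,
         predicted_comments=6, predicted_reshares=4),
]
feed = sorted(posts, key=engagement_score, reverse=True)
```

Under a weighting like this, a provocative post from a stranger that draws comments and reshares can outrank a mundane update from a close friend, which is the dynamic the internal complaints describe.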

Internal documents released by Haugen show that after the network made the change, boosting downstream MSI, engagement did increase, but at a social cost. Media outlets and political parties complained to the company that the algorithm, by promoting high-engagement content, was amplifying negative posts that provoked an emotional response from users. The documents record such complaints of increased political strife from Poland, Spain, Taiwan and India.

In 2019, one Facebook data scientist wrote: “While the FB platform offers people the opportunity to connect, share and engage, an unfortunate side effect is that harmful and misinformative content can go viral, often before we can catch it and mitigate its effects. Political operatives and publishers tell us that they rely more on negativity and sensationalism for distribution due to recent algorithmic changes that favor reshares.”

Haugen told senators that Facebook knows it is doing this, and other harmful things, but chooses not to act to mitigate the damage, for instance by properly staffing the teams that evaluate and reduce those harms.

Haugen, who previously worked for Google, Pinterest and Yelp, testified that she was disturbed by what she saw at Facebook.

“I saw Facebook repeatedly encounter conflicts between its own profits and our safety,” she said in her testimony. “Facebook consistently resolved these conflicts in favor of its own profits. The result has been more division, more harm, more lies, more threats and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people.”

Haugen first shared the Facebook documents with The Wall Street Journal, which published a revealing, in-depth series showing that, on a number of fronts, Facebook did not take steps that would have reduced serious harms caused by content on the company’s platforms. It did not act, for instance, to shut down human trafficking or to minimize the psychological harm that its photo- and video-sharing platform Instagram causes teen girls.

Zuckerberg, who isn’t denying the provenance of the documents, has responded by pointing to the efforts the company is making.

“If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?” Zuckerberg wrote on his Facebook page. “If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we’re doing?”

Haugen argues, though, that the company has been anything but transparent. The fact that she leaked documents with wide-ranging implications for public safety would seem to illustrate her point.

US senators, across partisan lines, treated her testimony respectfully and appeared skeptical of the company’s protestations. But it is not clear that they will do what she wants them to do, which is not to break up Facebook or to immediately change its regulatory environment, but to force it to be transparent and to open its processes to researchers.

Currently, only Facebook understands what it is doing to society, she said; in earlier eras, by contrast, independent researchers could study automobile safety or the health risks of cigarettes.

“The public cannot do the same with Facebook,” she said. “We are given no other option than to take their marketing message on blind faith.”

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Stephen Maher is a Harvard Nieman Fellow and a contributing editor at Maclean’s.