A Blog by Jonathan Low

 

Sep 22, 2021

How Facebook's Algorithm Thwarted Its CEO's Desire To Promote Covid Vaccination

The algorithm promotes engagement and connectivity. Anti-vaccine activists took advantage of that - and of the CEO's unwillingness to intervene even when the content being posted was demonstrably harmful. 

In order to control misinformation, Facebook would have to fundamentally change the way it operates its site. It seems prudent to assume that, given the trade-off between user engagement and tamping down harmful content, this CEO will continue to promote engagement. JL

Sam Schechner and colleagues report in the Wall Street Journal:

The effort demonstrated the gulf between Zuckerberg's aspirations and reality, where the company’s aims bring it into conflict with its own users. Antivaccine activists took advantage of his (unwillingness) to embrace a more interventionist stance against users. Even when he set a goal, the chief executive couldn’t steer the platform as he wanted. Of randomly sampled English-language comments containing Covid-19-related and vaccine-related phrases, two-thirds “were anti-vax.” The prevalence of antivaccine sentiment in the U.S. (is) 40 points lower. “We created the machine and we can’t control the machine.”

In mid-March, Mark Zuckerberg used his Facebook page to announce a goal that was both ambitious and personal. He wanted his company to use its formidable resources to push 50 million people toward Covid-19 vaccines.

In a post and a press release, the chief executive discussed Facebook Inc.’s initiatives to promote vaccines. He unveiled collaborations with global health organizations. And he touted that his company had “already connected more than 2 billion people to authoritative Covid-19 information.”

Inside Facebook, staffers were warning that Mr. Zuckerberg’s own platform, the globe-spanning powerhouse built on code he wrote 17 years ago, was compromising his effort.

For more than a month, Facebook researchers warned that comments on vaccine-related posts—often factual posts of the sort Facebook sought to promote—were filled with antivaccine rhetoric aimed at undermining their message, internal documents reviewed by The Wall Street Journal show. The comments ranged from personal objections all the way to debunked falsehoods and conspiracy theories.

The wave of negative comments worried global health institutions, including the World Health Organization and Unicef, the documents say. One internal Facebook memo cited “anti-vaccine commenters that swarm their Pages.”


In the weeks before Mr. Zuckerberg made his announcement, another memo said initial testing concluded that roughly 41% of comments on English-language vaccine-related posts risked discouraging vaccinations. Users were seeing comments on vaccine-related posts 775 million times a day, the memo said, and Facebook researchers worried the large proportion of negative comments could influence perceptions of the vaccines’ safety.

Even authoritative sources of vaccine information were becoming “cesspools of anti-vaccine comments,” the authors wrote. “That’s a huge problem and we need to fix it,” they said.

Facebook’s goal of protecting the rollout of the Covid vaccines, described in one memo as “a top company priority,” was a demonstration of Mr. Zuckerberg’s faith that his creation is a force for social good in the world. But the effort ended up demonstrating the gulf between his aspirations and the practical reality of the world’s largest social platform—where the company’s aims can bring it into conflict with its own users.

Despite Mr. Zuckerberg’s effort, a cadre of antivaccine activists flooded the network with what Facebook calls “barrier to vaccination” content, the memos show. They used Facebook’s own tools to sow doubt about the severity of the pandemic’s threat and the safety of authorities’ main weapon to combat it.

By this summer, the prevalence of false and misleading vaccine information on Facebook prompted a public scolding from President Biden, who said the falsehoods were “killing people.”

The vaccine documents are part of a collection of internal communications reviewed by the Journal that offer an unparalleled picture of how Facebook is acutely aware that the products and systems central to its business success routinely fail and cause harm.

Facebook’s own research lays out in detail how its rules favor elites, how its platforms have negative effects on teen mental health, how its algorithm fosters discord, and how drug cartels and human traffickers use its services openly.

The documents show that Facebook has often made minimal or ineffectual efforts to address the issues and plays them down in public.


Since the Journal began publishing articles based on the documents, several lawmakers have expressed outrage at the revelations, and two senators have announced an investigation into Facebook’s internal research on how its Instagram service affects young users.

Some Facebook officials have become concerned that Mr. Zuckerberg or Chief Operating Officer Sheryl Sandberg may face questions from lawmakers about how their past public statements on these issues square with the company’s internal assessments, according to people familiar with the matter. The company is also tightening the reins on how information is shared internally, the people said.

The Covid-19 mess in particular strikes at the heart of Facebook’s problem: its users create the content, and their comments, posts and videos are hard to control because Facebook built and runs its platform in ways fundamentally different from a company shaping its product or a publisher curating stories. Even when he set a goal, the chief executive couldn’t steer the platform as he wanted.

“We’re focused on outcomes, and the data shows that for people in the U.S. on Facebook, vaccine hesitancy has declined by about 50% since January, and acceptance is high,” Facebook spokesman Aaron Simpson said in a statement. The documents show Facebook’s “routine process for dealing with difficult challenges,” he said. “Narrowly characterizing leaked documents doesn’t accurately represent the problem, and it also ignores the work that’s been underway to make comments on posts about COVID-19 and vaccines safer and more reliable.”

Optimistic view

Mr. Zuckerberg has long espoused the belief that Facebook’s role connecting people makes it a tool to help solve the world’s problems. Former executives say that optimism left him and his company repeatedly ill-prepared when people used the platform in ways it didn’t anticipate.

“The internal narrative is that the platform is by and large good,” said Brian Boland, a former Facebook vice president who managed business relationships and left late last year in part because he said the company wasn’t forthcoming enough about its problems. He credits Mr. Zuckerberg with getting Facebook to work quickly on health initiatives during the pandemic but said his focus on connecting people created a blind spot for company leaders. “There was not a lot of discussion in our circles of, ‘Hey, are people propagating harmful messages on the platform?’ ” he said.

Facebook has similarly struggled with how to handle the spread of inaccuracies on other issues, from QAnon conspiracy theories and other election falsehoods to hoax cancer cures and Holocaust denial. Mr. Zuckerberg initially permitted Holocaust denial on the platform on free-speech grounds but changed his position last year, citing rising anti-Semitic violence.

Fringe political activists used Facebook Groups, user-run communities devoted to topics and interests, to stir violence, the Journal has reported. The company had heavily promoted the product for years, though it clamped down in the wake of the 2020 U.S. election.

Facebook had plenty of warnings that a campaign to roll out a new vaccine might provoke a backlash. Antivaccine groups had already leveraged social media to gain followers and spread false vaccine claims amid measles outbreaks in parts of the U.S. In 2019, after the issue became the subject of public outcry, Facebook promised a crackdown. Months later, the company struggled to make progress.

Renée DiResta, a leading researcher of online information at Stanford Internet Observatory who has advised Congress and the State Department, said she regularly warned Facebook about the tactics of antivaccine activists long before the pandemic. “People in the company recognized it as a problem,” Ms. DiResta said. “Where is the disconnect?”

Facebook employees had previously flagged comments made on posts as a largely unaddressed problem, according to a former employee and the documents reviewed by the Journal. Research in 2018 and 2019 found that comments were what one memo described as “an important source of misinformation, even on seemingly innocuous articles.”

Mr. Zuckerberg has often stepped in to limit Facebook’s intervention on contentious content, saying it doesn’t take sides in controversial areas like politics and doesn’t want to be the arbiter of truth. On the Covid vaccine, though, Mr. Zuckerberg was clear in his support and in his desire for Facebook to assist public-health authorities in the vaccination effort.

Long interested in public health, the CEO and his wife, Priscilla Chan, a pediatrician, founded their Chan Zuckerberg Initiative philanthropy in 2015 with that as a central focus. The same year, he posted to Facebook recommending a book that he said explained why vaccine doubts were unfounded.

Mr. Zuckerberg has often described his company as a powerful engine to improve the world. In a 5,700-word essay in 2017, when Facebook was under fire after the 2016 election, he wrote that Facebook’s next mission was building “social infrastructure” in part to make the world more resilient in crises. “Our greatest challenges also need global responses—like ending terrorism, fighting climate change, and preventing pandemics,” Mr. Zuckerberg wrote.

In February 2020, as the coronavirus spread, Facebook opened its Menlo Park, Calif., headquarters to the WHO for a meeting with tech companies including Alphabet Inc.’s Google and Twitter Inc., where a WHO official discussed the companies’ role in spreading “lifesaving health information,” according to the WHO.

Mr. Zuckerberg also emailed Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, to ask how he could personally help fund vaccine trials, according to correspondence provided to the Journal as part of a Freedom of Information Act request.

In subsequent emails to Dr. Fauci, Mr. Zuckerberg offered the government Facebook advertising credits for public-service announcements, as well as aggregated user data to help with decision-making. He also asked whether Dr. Fauci would appear with him in a live Facebook Q&A about the pandemic. The two appeared in one of multiple live Facebook videos four days later.

Mr. Zuckerberg also announced that Facebook would coordinate its approach with global health authorities like the WHO, to whom it would direct users searching for coronavirus information. He said Facebook was removing false claims and conspiracy theories those authorities flagged. The company’s standard at the time was to remove false Covid claims if they could cause imminent harm, such as by promising false protection from the disease.

That April, Guy Rosen, Facebook’s vice president of integrity, said Facebook would prompt users who had reacted to or commented on Covid-related posts that the company later removed, steering them instead toward information from the WHO.

“Through this crisis, one of my top priorities is making sure that you see accurate and authoritative information across all of our apps,” Mr. Zuckerberg wrote in an accompanying post that said Facebook had removed hundreds of thousands of false claims around Covid.

Mass postings by ‘big whales’

Facebook went further, and discussed what more it could do to tamp down borderline posts that came just short of violating its rules, many of which it labeled as false but didn’t remove, the documents show.

As part of these discussions, Facebook changed its ranking for health-related content in a way that reduced views of what Facebook called “health misinfo” in posts between 6.7% and 9.9%, according to a June 2020 memo.

Still, false and misleading coronavirus information was rampant on the site. In May 2020, a video titled “Plandemic” advanced false claims such as the notion that masks worsened the coronavirus. It was highly popular on Facebook and promoted via Facebook ads before the company removed it.

In August 2020, a report by advocacy group Avaaz concluded that the top 10 producers of what the group called “health misinformation” were garnering almost four times as many estimated views on Facebook as the top 10 sources of authoritative information. Facebook needed to take harsher measures to beat back “prolific” networks of Covid misinformation purveyors, Avaaz warned.

Mr. Zuckerberg wasn’t ready to embrace a more interventionist approach against the platform’s users. While he disagreed with antivaccine activists, his company was committed to removing only content that health officials said posed an imminent threat.

“I think that if someone is pointing out a case where a vaccine caused harm, or that they’re worried about it, that’s a difficult thing to say, from my perspective, that you shouldn’t be allowed to express at all,” Mr. Zuckerberg said in a September interview with Axios on HBO.

As the rollout of the vaccine began early this year, antivaccine activists took advantage of that stance. A later analysis found that a small number of “big whales” were behind many antivaccine posts and groups on the platform. Out of nearly 150,000 posters in Facebook Groups disabled for Covid misinformation, 5% were producing half of all posts, and around 1,400 users were responsible for inviting half the groups’ new members, according to one document.

“We found, like many problems at FB, this is a head-heavy problem with a relatively few number of actors creating a large percentage of the content and growth,” Facebook researchers would write in May, likening the movement to QAnon and efforts to undermine elections.
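The arithmetic behind that finding is straightforward to reproduce on any posting dataset: rank accounts by volume and measure the share of content the top sliver produces. Below is a minimal sketch of that concentration check; it is not Facebook's code, and the toy data is invented for illustration.

```python
# Minimal sketch (not Facebook's code) of the "head-heavy" concentration
# check the memo describes: what share of posts comes from the most
# active fraction of posters?
from collections import Counter

def head_heaviness(post_authors, top_fraction=0.05):
    """Share of all posts produced by the top `top_fraction` of posters."""
    counts = Counter(post_authors)                  # posts per author
    ranked = sorted(counts.values(), reverse=True)  # most active first
    top_n = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:top_n]) / sum(ranked)

# Toy data: two prolific accounts plus many one-off posters.
authors = ["whale_1"] * 40 + ["whale_2"] * 35 + [f"user_{i}" for i in range(75)]
print(f"Top 5% of posters produced {head_heaviness(authors):.0%} of posts")
```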

A Facebook employee also warned that antivaccine forces might be dominating comments on posts, possibly giving users a false impression that such views were widespread.

“I randomly sampled all English-language comments from the past two weeks containing Covid-19-related and vaccine-related phrases,” the researcher wrote early this year, adding that based on his assessment of 110 comments, about two-thirds “were anti-vax.” The memo compared that figure to a poll showing the prevalence of antivaccine sentiment in the U.S. to be 40 points lower.

Two Democratic congressmen wrote to Mr. Zuckerberg complaining that activists had used a Facebook Group to organize a protest that temporarily closed a vaccination center at Dodger Stadium in Los Angeles. “The conspiracy theories exchanged on Facebook on a daily basis are cultivating a perilous environment for our constituents during this public health emergency,” one of them said.

In February, Facebook made a big change to catch up with the wave of antivaccine content. The company said it would now remove a much longer list of false vaccine claims than before—including that vaccines aren’t effective, or that it is safer to get the disease than to be vaccinated—rather than simply labeling them as false.

One memo early in the year found that over a fifth of all vaccine-related posts in English were hostile to inoculations.

The company’s efforts suffered from technical limitations. An integrity staffer circulated a memo about a post that had 53,000 reshares and three million views. It said vaccines “are all experimental & you are in the experiment.” The staffer called it “a bad miss for misinfo”—noting that Facebook’s systems mistakenly thought it was written in Romanian, which is why it wasn’t demoted.
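That miss reflects a common pipeline design in which an English-only classifier runs just on posts first identified as English, so a language misdetection silently bypasses demotion. Here is a minimal sketch of that gating, using the open-source langdetect library purely as a stand-in; Facebook's actual language-identification system is not public.

```python
# Sketch of a language gate in front of an English-only misinformation
# classifier. If the detector mislabels English text (e.g., as Romanian,
# "ro"), the post never reaches the demotion step, as in the memo's
# example. langdetect is a stand-in, not Facebook's system.
from langdetect import detect, LangDetectException

def should_run_english_misinfo_check(text):
    """Run the English-only classifier only on text detected as English."""
    try:
        return detect(text) == "en"
    except LangDetectException:
        return False  # detection failed entirely; classifier never runs

post = "These vaccines are all experimental & you are in the experiment."
print("runs English misinfo check:", should_run_english_misinfo_check(post))
```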

Even when they worked as intended, the systems used to detect vaccine posts for removal or demotion weren’t built to work on comments, the documents show.

Employees improvised, and by late February, two Facebook data scientists came up with a rough way to scan for what they called “vaccine hesitant” comments. They wrote in memos that “vaccine hesitancy in comments is rampant,” twice as prevalent as in posts. One of the scientists pointed out that the company’s ability to detect such content in comments was “bad in English, and basically non-existent elsewhere.”

from the files

• Vaccine hesitancy in comments is rampant.

• Our ability to detect vaccine-hesitant comments is bad in English, and basically non-existent elsewhere.

Source: Internal report titled ‘Vaccine Hesitancy in Comments: C19D Lockdown Update’
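The documents don't describe how the “rough” scan worked. One plausible shape, sketched below under stated assumptions, is a phrase-list match over a random sample of comments; the phrase patterns here are invented, and the 110-comment sample size mirrors the researcher's manual review described above.

```python
# Hypothetical sketch of a rough "vaccine hesitant" comment scan.
# The phrase list is invented for illustration; the real internal
# signals are not public.
import random
import re

HESITANT_PATTERNS = [
    r"\bexperimental\b",
    r"\bnot (been )?tested\b",
    r"\bdon'?t trust\b",
    r"\bside effects?\b",
]

def looks_hesitant(comment):
    text = comment.lower()
    return any(re.search(p, text) for p in HESITANT_PATTERNS)

def sample_hesitancy_rate(comments, n=110, seed=0):
    """Randomly sample n comments and return the flagged share,
    mirroring the memo's 110-comment sample."""
    random.seed(seed)
    sample = random.sample(comments, min(n, len(comments)))
    return sum(looks_hesitant(c) for c in sample) / len(sample)

comments = ["It's experimental, no thanks", "Got my shot today!"] * 60
print(f"{sample_hesitancy_rate(comments):.0%} of sampled comments flagged")
```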

Mr. Simpson, the Facebook spokesman, said the research cited in the memos was preliminary and “over-states the amount of misleading vaccine content.”

Unicef was among multiple global health groups that expressed their worries to Facebook about antivaccine comments on their posts, the documents show. Two memos included a screenshot of a post from Unicef that promised, “Expert answers to common questions about COVID-19 vaccines.” Below it, a comment with 795 reactions reads “…Posts like this make me very wary. No thanks.” The United Nations agency was described as being “really worried.”

A Unicef staffer said in an interview the group noticed its pro-vaccine posts faced “a huge deluge of antivax sentiment” when they reached a wider-than-normal audience, such as when they featured a famous spokesperson. Facebook’s main advice to Unicef, the staffer said, was to “keep posting information that we know cuts through and targets our key audience.”

“Who knows how much more successful those campaigns might be if they weren’t swarmed by anti-vax comments?” the staffer said.

Unicef said it continued to promote its vaccine posts using Facebook’s ad credits because its surveys showed the campaigns were working to boost vaccine confidence.

Reining in comments

In late March, Facebook rolled out a change to help users address hostile responses to their public posts by turning off comments. Facebook didn’t mention antivaccine content when announcing the change, but Mr. Rosen, the vice president of integrity, in an internal memo included the change when touting recent policies “to combat vaccine discouragement.”

from the files

We know that COVID vaccine hesitancy has the potential to cause severe societal harm. We believe that compounds the necessity of deploying temporary BTG measures and even risking over-enforcement.

Note: BTG = break the glass

Source: Internal report titled ‘“Harmful Non-Violating Narratives” Is a Problem Archetype in Need of Novel Solutions’

Given the research showing a small number of posters and commenters were responsible for a large amount of antivaccine content, Facebook slashed the number of comments a person could make on posts from authoritative health sources to 13 per hour from 300, according to an April 2 internal memo.
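The memo gives the limits (13 comments per hour, down from 300) but not the mechanism. A sliding-window counter is one common way to enforce such a cap; the sketch below is an assumption, with only the numbers taken from the memo.

```python
# Sliding-window rate limiter sketch. Only the limits come from the
# April 2 memo; the data structures and API are assumptions.
import time
from collections import defaultdict, deque

HOURLY_LIMIT = 13      # per the memo; previously 300
WINDOW_SECONDS = 3600

_recent = defaultdict(deque)  # user_id -> timestamps of recent comments

def allow_comment(user_id, now=None):
    """Return True if the user is under the hourly comment cap."""
    now = time.time() if now is None else now
    window = _recent[user_id]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()           # drop comments older than one hour
    if len(window) >= HOURLY_LIMIT:
        return False               # cap reached; block this comment
    window.append(now)
    return True
```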

In early May, the company activated more emergency responses it called “break the glass” measures to further demote the news feed ranking of content it described as sensationalist, alarmist or even indirectly discouraging vaccines. After a manual review, the content would be demoted in the ranking by 50%, according to an internal memo.
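The demotion itself, as the memo describes it, is a simple multiplier applied after human review. A minimal sketch follows, with the score scale and names invented; only the 50% factor comes from the memo.

```python
# "Break the glass" demotion sketch: halve a post's news-feed ranking
# score once manual review flags it. Score scale and names are invented.
DEMOTION_FACTOR = 0.5  # per the internal memo

def apply_break_the_glass_demotion(base_score, manually_flagged):
    """Halve a post's ranking score if human review flagged it."""
    return base_score * DEMOTION_FACTOR if manually_flagged else base_score

# A flagged post competes in ranking at half its original score.
print(apply_break_the_glass_demotion(base_score=0.82, manually_flagged=True))
```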

Facebook’s Mr. Simpson said the company also took other steps in the spring to try to contain problems with comments, including lowering the ranking of all vaccine-related comments it deemed sensationalist or discouraging. Facebook also offered commenters on Covid-related posts a choice of pre-written vaccine facts they could append, something users did to about 70,000 comments seen over 10 million times in August, he said.

Company researchers were considering developing other tools, too, according to the documents. “It might be worth creating some classifiers to predict entities that are violating” Facebook policies, such as one for what Facebook calls Dedicated Vaccine Discouragement Entities, they wrote in another internal memo.
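The memo doesn't say how such an entity-level classifier would work. One plausible approach, sketched here with invented thresholds, is to aggregate per-post scores across an account's history and flag accounts whose output is predominantly discouraging; “Dedicated Vaccine Discouragement Entities” is Facebook's term, and everything else below is an assumption.

```python
# Hypothetical entity-level aggregation: flag an account if most of a
# sizable post history scores high on a (hypothetical) per-post
# discouragement classifier. All thresholds are invented.
def is_dedicated_discouragement_entity(post_scores, min_posts=20,
                                       score_threshold=0.8,
                                       share_threshold=0.5):
    if len(post_scores) < min_posts:
        return False  # too little history to judge the account
    high = sum(s >= score_threshold for s in post_scores)
    return high / len(post_scores) >= share_threshold

# 60% of this account's 25 scored posts rate as discouraging -> True
print(is_dedicated_discouragement_entity([0.9] * 15 + [0.1] * 10))
```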

Biden administration officials around that time were asking Facebook for information about its handling of false vaccine claims, such as the impact of its downranking changes, a U.S. official said. But with such content still common, the administration became increasingly dissatisfied with Facebook’s responses to requests for information, officials said. In July, Surgeon General Vivek Murthy warned that social-media companies “have enabled misinformation to poison our information environment, with little accountability to their users.”

President Biden said he hoped Facebook “would do something about the misinformation” on the vaccine.

The U.S. official said this month that administration officials remain deeply frustrated about the social-media company’s level of information sharing.

Facebook’s Mr. Rosen said in a public post in July that the company wasn’t responsible for vaccine hesitancy in the U.S., and that it was helping promote vaccines. He cited a survey that showed vaccine acceptance by Facebook users in the U.S. had risen 10 to 15 percentage points since January, and said it had removed or reduced the visibility of more than 185 million pieces of debunked or false Covid content.

Internally, Facebook documents show it had been playing catch-up for months, trying to manage the flood of misleading and false information aimed at undermining the vaccine effort.

At a gathering of Facebook’s leadership in and around Menlo Park early this month, some officials discussed whether Facebook has gotten too big, with too much data flowing to manage all of its content, said people familiar with the gathering. The tone from some participants was, “We created the machine and we can’t control the machine,” one of the people said.

A Facebook spokesman disputed that characterization of the gathering and what was said at it.

In August, the company said it had removed 20 million items that violated its Covid policies.

“If we see harmful misinformation on the platform, then we take it down. It’s against our policy,” Mr. Zuckerberg said in an interview on “CBS This Morning.” “But do we catch everything? Of course, there are mistakes that we make or areas where we need to improve.”
